-
The notion of actual runtime complexity is far too complex to fathom. Think of unification and the "union-find" operation, which in Scryer, as in many current systems, is approximated with the more costly refchains. Counting inferences is more abstract and a bit easier, but you will have to consider the various modes separately.
It is not clear which two lists you are referring to. In any case, the second argument has no influence on the number of inferences performed; the first and third arguments are solely responsible. And even their size does not necessarily matter. Think of
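(The example that followed here did not survive extraction; the goal below is my reconstruction of one that fits the argument, using the standard `append/3`.)

```prolog
% A goal whose first argument is a list of fixed length, such as
%
% ?- append([A,B,C], Ys, Zs).
%    Zs = [A,B,C|Ys].
%
% costs exactly four inferences: one per element of the first list,
% plus one for the base case.  And so does every instance of it,
% however large the terms bound to A, B, C, and Ys may be.
```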
where, moreover, any instance thereof has constant cost, regardless of the sizes.
It is not clear which n you refer to. If the second argument is fixed, that also bounds the operation. But before you go into such details, better consider an even more abstract measure first: termination and non-termination. It's like counting inferences with the "numbers" finite and infinite. With the help of failure-slices you can approach these notions rapidly.
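As an illustration of the failure-slice idea (my own sketch, not from the original post, with the predicate renamed to `app/3`): insert `false` goals into a program; whatever remains visible is alone responsible for termination, so if the fragment admits an infinite derivation, the original query cannot terminate either.

```prolog
% Failure-slice of append/3: everything after a `false` is
% irrelevant to termination.
app([], Ys, Ys) :- false.
app([X|Xs], Ys, [X|Zs]) :-
    app(Xs, Ys, Zs), false.

% The visible fragment shows that only the first and third arguments
% drive the recursion.  If both are uninstantiated, as in
%
% ?- app(Xs, [a,b], Zs), false.
%
% the fragment recurses forever, so the query does not terminate.
```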
-
Maybe this definition:

> "The power of the logic variable is best illustrated by difference lists and the ability in Prolog to concatenate lists in constant time and declaratively. `append(H1-T1,T1-T2,H1-T2).` This is one of gems of Prolog that will stay with everyone for a lifetime."
> — Pascal Van Hentenryck: Programmation Par Contraintes, 2023. https://bpb-us-w2.wpmucdn.com/sites.gatech.edu/dist/3/865/files/2023/10/AlainColmerauer.pdf

Also, my second source:
-
For most programming languages I can typically understand complexity guarantees on most primitive operations. For example, adding to the head of a linked list is `O(1)`, while inserting at the beginning of an array is at least `O(n)`. I find it significantly more challenging to reason about time complexity in Prolog.

For example, in most languages a non-lazy mapping operation such as `maplist/2` or `findall/3` would be `O(n)` in the number of elements. However, since Prolog uses a backtracking search, wouldn't the complexity actually be `O(V+E)`, a.k.a. `O(choicepoints)`? Or can you look at a logically linear operation and conclude linear time complexity?

I know some things are surprising, for instance that `append/3` is apparently constant time in list concatenation? That's pretty unusual; normally that's at least a linear scan of both lists. On the other hand, `length/2` is probably an `O(n)` operation, as it is with linked lists, which is not bad but not ideal (but better than dealing with a complex object).

I know in general the idea is "just write good Prolog and over time the implementation speed will get better", but I'd like to learn to understand it a little more deeply than that!
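On the `append/3` point, a sketch for clarity (the standard textbook definition, not anything from this thread): plain `append/3` walks its entire first argument, so concatenating proper lists with it is linear in the first list's length; the constant-time concatenation attributed to Prolog refers to difference lists instead.

```prolog
% Standard append/3: one inference per element of the first list,
% plus one for the base case.
append([], Ys, Ys).
append([X|Xs], Ys, [X|Zs]) :-
    append(Xs, Ys, Zs).

% ?- append([a,b,c], [d,e], Zs).
%    Zs = [a,b,c,d,e].
%
% The second list is never traversed, so the cost is independent of
% its length -- linear in the first argument, not constant.
```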