Recursion is one of the core mechanisms behind computation, whether it runs on your computer, smart TV, or smartphone.
For example, here's a definition of the addition function represented in first-order logic:

$$\forall x\, (\mathrm{add}(x, 0) = x) \qquad \forall x \forall y\, (\mathrm{add}(x, S(y)) = S(\mathrm{add}(x, y)))$$

Or, the more commonly known variant:

$$x + 0 = x \qquad x + (y + 1) = (x + y) + 1$$
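This recursive definition can be transcribed almost directly into Haskell. A small sketch using ordinary integers (the formal construction from axioms comes later in the post):

```haskell
-- Addition by recursion on the second argument:
-- add x 0 = x, add x (S y) = S (add x y)
add :: Integer -> Integer -> Integer
add x 0 = x
add x y = succ (add x (y - 1))
```

For instance, `add 2 3` unfolds to `succ (succ (succ 2))`, which is `5`.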
In this blog post, we will generalize recursive functions.
Primitive recursion
Note that primitive recursion can be defined within first-order logic using Peano axioms. All Peano axioms except the induction axiom are statements in first-order logic. This relation to computation is one of the things that makes number theory interesting to study.
In order to represent primitive recursive functions, we need the following axioms for basic primitive functions:
- The constant function $C_n^k(x_1, \ldots, x_k) = n$ of type $\mathbb{N}^k \to \mathbb{N}$
- The successor function $S(x) = x + 1$ of type $\mathbb{N} \to \mathbb{N}$
- The projection function $P_i^k(x_1, \ldots, x_k) = x_i$ of type $\mathbb{N}^k \to \mathbb{N}$
The first and the third functions are parameterized. A few examples of each: the constant function $C_5^1(x) = 5$ returns $5$ for any $x$; the successor function returns the next number, e.g. $S(5) = 6$; the projection function applied to $(x, y, z)$ can return one of $x$, $y$, or $z$, e.g. $P_2^3(x, y, z) = y$.
The constant function represents the base case of the recursion, the successor function increments the loop index, and the projection function extracts a variable from a sequence.
In addition, to generate more complex primitive functions, we can use the following two axioms:
- Composition: $f(x_1, \ldots, x_n) = h(g_1(x_1, \ldots, x_n), \ldots, g_m(x_1, \ldots, x_n))$, where $h$ is $m$-ary and the functions $h$ and $g_1, \ldots, g_m$ are constructed by using the previous rules (constant, successor, projection)
- The recursive operator $\rho(g, h)$, defined as $f = \rho(g, h)$, where $f(0, x_1, \ldots, x_k) = g(x_1, \ldots, x_k)$ and $f(S(n), x_1, \ldots, x_k) = h(n, f(n, x_1, \ldots, x_k), x_1, \ldots, x_k)$, with $g$ of arity $k$ and $h$ of arity $k + 2$

$\rho$ represents a for-loop, starting from 0 and proceeding upwards to its first argument. The remaining arguments represent the state of the computation. $g$ describes the initial calculations, and $h$ describes the iterative calculations.
Applying these axioms a finite number of times allows us to generate all primitive recursive functions.
Going back to the addition example, we can now represent it as $\mathrm{add} = \rho(g, h)$; we choose $g = P_1^1$ to capture $\mathrm{add}(0, y) = y$, and $h = S \circ P_2^3$ to capture $\mathrm{add}(S(n), y) = S(\mathrm{add}(n, y))$.

Example evaluation:

$$\begin{aligned}
\mathrm{add}(2, 3) &= h(1, \mathrm{add}(1, 3), 3) = S(\mathrm{add}(1, 3)) \\
&= S(h(0, \mathrm{add}(0, 3), 3)) = S(S(\mathrm{add}(0, 3))) \\
&= S(S(g(3))) = S(S(3)) = 5
\end{aligned}$$
A direct implementation of the rules in Haskell:
```haskell
-- Basic functions: each takes and returns a list of naturals.
constant n _ = [n]
successor [x] = [x + 1]
projection i xs = [xs !! (i - 1)]

-- Composition: apply each g to the arguments, then feed all results to h.
compose h gs xs = h $ concatMap (\g -> g xs) gs

-- The recursive operator: recurse on the first argument.
rho g h = f
  where
    f (0:xs) = g xs
    f (n:xs) = h ([n - 1] ++ f ((n - 1) : xs) ++ xs)
```
Example evaluations:
```haskell
> add = rho (projection 1) (compose successor [projection 2])
> add [5,5]
[10]

> double = compose add [projection 1, projection 1]
> double [20]
[40]

> mult = rho (constant 0) (compose add [projection 2, projection 3])
> mult [2,5]
[10]
```
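As a further sketch (my own example, not one from the post's list above), exponentiation can be built from the same operators by iterating `mult`, just as `mult` iterates `add`. The definitions are repeated so the snippet runs standalone:

```haskell
constant n _ = [n]
successor [x] = [x + 1]
projection i xs = [xs !! (i - 1)]

compose h gs xs = h (concatMap (\g -> g xs) gs)

rho g h = f
  where
    f (0:xs) = g xs
    f (n:xs) = h ([n - 1] ++ f ((n - 1) : xs) ++ xs)

add, mult, power :: [Integer] -> [Integer]
add  = rho (projection 1) (compose successor [projection 2])
mult = rho (constant 0)   (compose add [projection 2, projection 3])

-- power [n, x] computes x^n:
-- power(0, x) = 1, power(S n, x) = mult(power(n, x), x)
power = rho (constant 1) (compose mult [projection 2, projection 3])
```

For example, `power [3,2]` evaluates to `[8]`, i.e. $2^3$.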
General recursion
In primitive recursive functions the loops are always bounded; general recursive functions generalize them by allowing unbounded loops.
One additional axiom is required for that:
- The minimization operator $\mu(f)$, defined as $\mu(f)(x_1, \ldots, x_k) = z$ for the smallest $z$ such that $f(z, x_1, \ldots, x_k) = 0$, and undefined otherwise, for some function $f$ of arity $k + 1$

Minimization starts searching from 0 and proceeds upwards to find the minimal argument that causes the function to return zero. If there is no such argument, or if $f$ is undefined somewhere along the search, the search never terminates.
With the minimization operator, the system becomes powerful enough to simulate a Turing machine.
A direct implementation in Haskell:
```haskell
-- Search upwards from 0 for the first n where f returns zero.
-- Note: f returns a singleton list, consistent with the functions above.
mu f xs = go 0
  where
    go n = if f (n : xs) == [0] then [n] else go (n + 1)
```
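To see the unbounded search in action, here is a small usage sketch. The checker function is written directly in Haskell for brevity (it is not itself built up from the axioms), and `mu` is repeated so the snippet runs standalone. Searching for the least $z$ with $z^2 \ge x$ yields a ceiling square root:

```haskell
-- Search upwards from 0 for the first n where f returns zero.
mu f xs = go 0
  where
    go n = if f (n : xs) == [0] then [n] else go (n + 1)

-- Least z such that z * z >= x; the checker returns [0] exactly when satisfied.
ceilSqrt :: [Integer] -> [Integer]
ceilSqrt = mu (\(z:x:_) -> [if z * z >= x then 0 else 1])
```

`ceilSqrt [10]` returns `[4]` and `ceilSqrt [16]` returns `[4]`. If the predicate is never satisfiable, e.g. searching for $z$ with $z + 1 = 0$, the search loops forever, which is exactly the partiality that minimization introduces.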
Conclusion
There are many ways to represent computation, and recursive functions are just one way to do it. Other popular ways are Turing machines* and lambda calculus**. These systems have been shown to be equivalent in terms of what they can compute. In fact, we represented recursive functions within a system based on lambda calculus: Haskell. The Church-Turing thesis states that essentially all computation can be calculated by a Turing machine; note that this is not a theorem but rather a conjecture.
We showed how “all” computation can be captured by just a few axioms: the constant function, the successor function, the projection function, function composition, the recursion operator, and the minimization operator.
Recently, I've been busy working on Budge, an esoteric programming language. If we look at the paper, we can notice that the function for evaluating a Budge program is similar to the recursive operator $\rho$ (in that code is allowed to be repeated based on some condition) and the minimization operator $\mu$ (code can be repeated an infinite number of times).
*: A Turing machine consists of a finite set of states $Q$, a finite set of symbols $\Gamma$, an initial state $q_0 \in Q$, and a transition function $\delta : Q \times \Gamma \to Q \times \Gamma \times \{L, R\}$ that defines the next state.
**: Lambda calculus expressions can be $x$ (vars), $\lambda x.M$ (abstraction), and $M\ N$ (application). The evaluation rule is $(\lambda x.M)\ N \to M[x := N]$, where $M[x := N]$ replaces all free vars (vars in the abstraction body not bound by its params) $x$ with $N$ in $M$.