In this section the authors provide the standard introduction to orders of growth (big-Oh notation) and give a few simple examples of formulas and their growth orders.
The tree is pretty big, so I’ll refrain from drawing it; it’s quite clear how it goes, though. The deepest path in the tree occurs when the whole amount is counted out in pennies, so the space is O(n), where n is the amount we want to change.
Figuring out the time complexity here is much more challenging, because the recurrence is hard to solve analytically. Initially I thought it was exponential, but Thomas Morgan convinced me that it is polynomial, somewhere in the order of O(n^5). Intuitively this makes sense: with a fixed first denomination d, the recurrence T(n, k) = T(n, k-1) + T(n-d, k) unrolls into at most n/d copies of T(., k-1), which suggests O(n^k) with k = 5 kinds of coins. If you have a proof, please let me know.
Note that, similarly to the recursive Fibonacci, there is a lot of repetition – we’ll be solving the same subproblem over and over again in different sub-branches of the tree. Memoization can improve matters here a lot.
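To make this concrete, here is a sketch of a memoized count-change in CL. It assumes SICP’s tree-recursive procedure from section 1.2.2 (the `first-denomination` table and the `cc` recursion); the cache keyed on (amount . kinds-of-coins) is my addition and one of several reasonable ways to memoize it:

```lisp
;; Denominations as in SICP section 1.2.2: penny, nickel, dime,
;; quarter, half-dollar.
(defun first-denomination (kinds-of-coins)
  (case kinds-of-coins
    (1 1) (2 5) (3 10) (4 25) (5 50)))

;; Cache for already-computed subproblems, keyed on the pair
;; (amount . kinds-of-coins). EQUAL is needed to compare conses.
(defparameter *cc-cache* (make-hash-table :test #'equal))

(defun cc (amount kinds-of-coins)
  (cond ((= amount 0) 1)
        ((or (< amount 0) (= kinds-of-coins 0)) 0)
        (t (let ((key (cons amount kinds-of-coins)))
             ;; Return the cached answer if we've seen this
             ;; subproblem; otherwise compute and cache it.
             (or (gethash key *cc-cache*)
                 (setf (gethash key *cc-cache*)
                       (+ (cc amount (- kinds-of-coins 1))
                          (cc (- amount
                                 (first-denomination kinds-of-coins))
                              kinds-of-coins))))))))

(defun count-change (amount)
  (cc amount 5))
```

With the cache there are only 5n distinct subproblems, each computed once, so the time drops to O(n) for a fixed set of denominations. `(count-change 100)` still gives 292, the book’s answer for changing a dollar.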
Here is the code rewritten in CL, along with some printing to find out how many times
p is called:
(defun cube (x)
  (* x x x))

(defun p (x)
  (format t "p ~f~%" x)
  (- (* 3 x) (* 4 (cube x))))

(defun sine (angle)
  (if (not (> (abs angle) 0.1))
      angle
      (p (sine (/ angle 3.0)))))
Running (sine 12.15), we can see that p is called 5 times. Here is how the call sequence looks:
(sine 12.15)
(p (sine 4.05))
(p (p (sine 1.35)))
(p (p (p (sine 0.45))))
(p (p (p (p (sine 0.15)))))
(p (p (p (p (p (sine 0.05))))))
(p (p (p (p (p 0.05)))))
...
p is applied as long as successive divisions of the angle by 3 stay above 0.1; therefore, the runtime of this function is logarithmic in the angle – more specifically, O(log3(n)). The space is logarithmic as well, because this is a linear recursion in which each step makes a single recursive call (as opposed to tree recursion, where each step splits into 2 or more calls).
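We can sanity-check the log3 claim with a small helper. `sine-steps` is a hypothetical function (not from the book): it computes the number of times p is applied as the smallest k such that angle / 3^k <= 0.1, i.e. the ceiling of log3(angle / 0.1). Floating-point rounding could make it off by one right at a boundary, but it matches the traced run above:

```lisp
;; Number of p applications for a given angle: smallest k with
;; |angle| / 3^k <= 0.1. CL's LOG takes an optional base argument.
;; Hypothetical helper for illustration; may be off by one at exact
;; boundary angles due to floating-point rounding.
(defun sine-steps (angle)
  (if (> (abs angle) 0.1)
      (ceiling (log (/ (abs angle) 0.1) 3))
      0))
```

For the example above, (sine-steps 12.15) gives 5, agreeing with the five p calls in the trace.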