Consider a function $f(n)$ whose definition requires one to compute $f(1), f(2), \dots, f(n-1)$ in order to evaluate $f(n)$. Suppose that some algorithm to compute $f(n)$ has time complexity given by the recurrence $T(n) = n + T(1) + T(2) + \cdots + T(n-1)$. This is $O(2^{n})$.
One can use dynamic programming to store the values of $f(1), f(2), \dots, f(n-1)$ in an array, so that computing $f(n)$ only requires $O(n)$ time once those values are already stored. The total time complexity of such a dynamic programming algorithm would be $O(n^{2})$, which makes sense if you consider the actual pseudocode of such an algorithm. But what would the recurrence be? Would it be $T(n) = n + \max(T(1), \dots, T(n-1))$?
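For concreteness, here is a minimal Python sketch of such a bottom-up DP algorithm. Since the question leaves $f$ abstract, the combining rule used here ($f(1) = 1$ and $f(k) = k + f(1) + \cdots + f(k-1)$) is an illustrative assumption chosen only so that computing $f(k)$ genuinely touches all $k-1$ stored values:

```python
def f_dp(n):
    """Bottom-up DP: fill table[k] = f(k) for k = 1..n."""
    # Hypothetical recurrence assumed for illustration:
    # f(1) = 1, and f(k) = k + f(1) + f(2) + ... + f(k-1) for k > 1.
    table = [0] * (n + 1)
    table[1] = 1
    for k in range(2, n + 1):
        # Computing f(k) reads the k - 1 stored values: O(k) work.
        table[k] = k + sum(table[1:k])
    return table[n]
```

Filling entry $k$ costs $O(k)$, and there are $n$ entries, so the whole loop runs in $O(n^2)$ time.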
1 Answer
If all of your values start out unstored, then $T(n) = n + T(1) + T(2) + \cdots + T(n - 1)$ still holds. However, by the time you compute $T(n - 1)$, the values for $T(1)$ through $T(n - 2)$ have already been found and stored, so $T(n - 1) = (n - 1) + O(1)$.
The same applies to $T(1)$ through $T(n - 2)$ ($T(1)$ is just $O(1)$).
This gives an overall runtime of $$T(n) = n + O(1) + (1 + O(1)) + \cdots + (n - 1 + O(1)) = \frac{n(n-1)}{2}+n(1+O(1))=O(n^2)$$
This is really just the sum of an arithmetic sequence.
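As a sanity check on that arithmetic-series sum, one can count the table reads a bottom-up fill performs (a sketch, assuming entry $k$ reads the $k-1$ previously stored entries): the total is $\sum_{k=2}^{n}(k-1) = \frac{n(n-1)}{2}$.

```python
def dp_reads(n):
    # Total table reads when filling entries 2..n bottom-up:
    # entry k reads the k - 1 previously stored values.
    return sum(k - 1 for k in range(2, n + 1))

# dp_reads(n) equals n*(n-1)//2, matching the arithmetic-series sum.
```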