I mean, why should we use such an epic buzzword when you could just say: "cache your results!"?
Comments:

- Shawn Mehan (Nov 19, 2015 at 18:08): I think that you will find that caching one's results is necessary but not sufficient to implement dynamic programming.
- AProgrammer (Nov 19, 2015 at 19:23): A buzzword way would be: dynamic programming is induction, memoizing is recursion.
- David Richerby (Nov 19, 2015 at 19:46): @AProgrammer What's the difference between induction and recursion?
- AProgrammer (Nov 19, 2015 at 20:43): @DavidRicherby It's the same journey, just going in opposite directions.
- D.W. ♦ (Nov 20, 2015 at 9:18): Possible duplicate: cs.stackexchange.com/q/47216/755. (See also cs.stackexchange.com/q/17871/755 and cstheory.stackexchange.com/q/5635/5038 for more about the history of the name.)
1 Answer
There are other key aspects of dynamic programming besides caching. It is about writing the solution of an optimization problem recursively, exploiting the fact that optimal solutions are built out of optimal solutions to subproblems; caching then limits the number of recursive calls to the number of distinct subproblems that need to be solved.
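As a rough illustration of both points, here is a minimal memoized-recursion sketch for a toy coin-change problem (the problem, coin set, and function names are my own illustration, not something from the question): the recursion states that an optimal solution extends an optimal solution to a smaller subproblem, and the cache ensures each of the finitely many distinct subproblems is solved only once.

```python
from functools import lru_cache

def min_coins(amount, coins=(1, 3, 4)):
    """Fewest coins summing to `amount` (illustrative coin-change example)."""

    @lru_cache(maxsize=None)   # caching: each distinct subproblem is solved once
    def best(n):
        if n == 0:
            return 0
        # Optimal substructure: an optimal solution for n is one coin c
        # plus an optimal solution for the subproblem n - c.
        candidates = [best(n - c) for c in coins if c <= n]
        return 1 + min(candidates) if candidates else float("inf")

    return best(amount)

print(min_coins(10))  # -> 3  (e.g. 3 + 3 + 4)
```

Without the cache, the recursion would revisit the same subproblems exponentially often; with it, the work is bounded by the number of distinct subproblems (here, at most `amount + 1` of them).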
Also, dynamic programming is not so much a formalism as an intuitive framework for solving many optimization problems recursively.
Further, the use of such an "epic buzzword" is largely the result of brilliant marketing by Richard Bellman...