Chapter notes

R. Bellman began the systematic study of dynamic programming in 1955. The word "programming," both here and in linear programming, refers to the use of a tabular solution method. Although optimization techniques incorporating elements of dynamic programming were known earlier, Bellman provided the area with a solid mathematical basis [34].

Hu and Shing [159, 160] give an O(n lg n)-time algorithm for the matrix-chain multiplication problem.
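
Hu and Shing's method is considerably more involved than the Θ(n^3)-time dynamic program presented in Section 15.2. As a point of reference only, here is a minimal Python sketch of that standard cubic algorithm (the function name is illustrative, not from the text):

    def matrix_chain_order(p):
        """Standard O(n^3) dynamic program for matrix-chain multiplication.

        p[0..n] holds the matrix dimensions: matrix A_i is p[i-1] x p[i].
        Returns a table m where m[i][j] is the minimum number of scalar
        multiplications needed to compute the product A_i ... A_j
        (1-indexed, as in the chapter).
        """
        n = len(p) - 1
        m = [[0] * (n + 1) for _ in range(n + 1)]
        for length in range(2, n + 1):           # length of the subchain
            for i in range(1, n - length + 2):
                j = i + length - 1
                m[i][j] = float("inf")
                for k in range(i, j):            # split point A_i..A_k, A_{k+1}..A_j
                    cost = m[i][k] + m[k + 1][j] + p[i - 1] * p[k] * p[j]
                    if cost < m[i][j]:
                        m[i][j] = cost
        return m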

The O(mn)-time algorithm for the longest-common-subsequence problem appears to be a folk algorithm. Knuth [63] posed the question of whether subquadratic algorithms for the LCS problem exist. Masek and Paterson [212] answered this question in the affirmative by giving an algorithm that runs in O(mn/lg n) time, where n ≤ m and the sequences are drawn from a set of bounded size. For the special case in which no element appears more than once in an input sequence, Szymanski [288] shows that the problem can be solved in O((n + m) lg(n + m)) time. Many of these results extend to the problem of computing string edit distances (Problem 15-3).
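
For reference, a minimal Python sketch of the folk O(mn) dynamic program of Section 15.4 (the function name is illustrative) is:

    def lcs_length(x, y):
        """Classic O(mn) dynamic program for the length of a longest
        common subsequence of the sequences x and y."""
        m, n = len(x), len(y)
        # c[i][j] = LCS length of the prefixes x[:i] and y[:j]
        c = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if x[i - 1] == y[j - 1]:
                    c[i][j] = c[i - 1][j - 1] + 1
                else:
                    c[i][j] = max(c[i - 1][j], c[i][j - 1])
        return c[m][n]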

An early paper on variable-length binary encodings by Gilbert and Moore [114] had applications to constructing optimal binary search trees for the case in which all probabilities p_i are 0; this paper contains an O(n^3)-time algorithm. Aho, Hopcroft, and Ullman [5] present the algorithm from Section 15.5. Exercise 15.5-4 is due to Knuth [184]. Hu and Tucker [161] devised an algorithm for the case in which all probabilities p_i are 0 that uses O(n^2) time and O(n) space; subsequently, Knuth [185] reduced the time to O(n lg n).
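
As a companion to these notes, here is a minimal Python sketch of the O(n^3)-time dynamic program of Section 15.5 for general probabilities; the table names e and w follow the chapter, but the code itself is an illustrative rendering, not the book's pseudocode:

    def optimal_bst(p, q, n):
        """O(n^3) dynamic program of Section 15.5 for an optimal binary
        search tree.

        p[1..n] are the key probabilities (p[0] is unused) and q[0..n]
        are the dummy-key probabilities.  Returns the table e, where
        e[i][j] is the expected search cost for keys k_i..k_j.
        """
        e = [[0.0] * (n + 1) for _ in range(n + 2)]
        w = [[0.0] * (n + 1) for _ in range(n + 2)]
        for i in range(1, n + 2):
            e[i][i - 1] = q[i - 1]    # empty subtree: only dummy key d_{i-1}
            w[i][i - 1] = q[i - 1]
        for length in range(1, n + 1):
            for i in range(1, n - length + 2):
                j = i + length - 1
                e[i][j] = float("inf")
                w[i][j] = w[i][j - 1] + p[j] + q[j]
                for r in range(i, j + 1):     # candidate root k_r
                    t = e[i][r - 1] + e[r + 1][j] + w[i][j]
                    if t < e[i][j]:
                        e[i][j] = t
        return e

Exercise 15.5-4's observation that the optimal roots satisfy root[i, j-1] ≤ root[i, j] ≤ root[i+1, j] restricts the inner loop over r and is what brings the running time down to O(n^2).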


