Dynamic Programming



The dynamic programming approach is similar to divide and conquer in that it breaks the problem down into smaller and smaller sub-problems. But unlike divide and conquer, these sub-problems are not solved independently. Instead, the results of the smaller sub-problems are remembered and reused for similar or overlapping sub-problems.

Dynamic programming algorithms are mostly used for solving optimization problems. Before solving the sub-problem at hand, a dynamic programming algorithm examines the results of previously solved sub-problems. The solutions of the sub-problems are then combined to achieve the best overall solution. This paradigm is therefore said to use a bottom-up approach.

So we can conclude that −

  • The problem can be divided into smaller, overlapping sub-problems.

  • The final optimal solution can be achieved by using the optimal solutions of the smaller sub-problems.

  • Dynamic programming algorithms use memoization.

[Figure: Dynamic Programming Approach]

Two main properties of a problem suggest that it can be solved using Dynamic Programming. They are −

Overlapping Sub-Problems

Similar to the Divide-and-Conquer approach, Dynamic Programming also combines solutions to sub-problems. It is mainly used where the solution of one sub-problem is needed repeatedly. The computed solutions are stored in a table so that they don’t have to be re-computed. Hence, this technique is needed where overlapping sub-problems exist.

For example, Binary Search does not have overlapping sub-problems, whereas a recursive program for the Fibonacci numbers has many overlapping sub-problems.
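A minimal Python sketch of this difference (the function names are chosen only for illustration): the plain recursive Fibonacci recomputes the same values over and over, while a memoized version solves each sub-problem once and then looks it up.

```python
from functools import lru_cache

def fib_naive(n):
    """Plain recursion: fib(n-2), fib(n-3), ... are recomputed many times."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Memoized recursion: each fib(k) is computed once, then looked up."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(40))   # 102334155, returned almost instantly
# fib_naive(40) returns the same value but makes an exponential number of calls
```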

Optimal Sub-Structure

A given problem has the Optimal Substructure Property if an optimal solution to the problem can be obtained by using optimal solutions of its sub-problems.

For example, the Shortest Path problem has the following optimal substructure property −

If a node x lies on the shortest path from a source node u to a destination node v, then the shortest path from u to v is the combination of the shortest path from u to x and the shortest path from x to v.

The standard shortest-path algorithms Floyd-Warshall (all pairs) and Bellman-Ford (single source) are typical examples of Dynamic Programming.
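A compact Floyd-Warshall sketch in Python (the adjacency-matrix encoding and the example weights are assumptions made for illustration). After processing intermediate vertex k, dist[i][j] holds the shortest distance from i to j using only intermediate vertices 0..k, which is exactly the optimal substructure stated above.

```python
INF = float("inf")

def floyd_warshall(weights):
    """weights[i][j]: edge weight from i to j, INF if absent, 0 on the diagonal.
    Returns the matrix of all-pairs shortest-path distances."""
    n = len(weights)
    dist = [row[:] for row in weights]      # start with the direct edges
    for k in range(n):                      # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                # optimal substructure: a best i -> j path through k
                # splits into a best i -> k path and a best k -> j path
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

w = [[0,   3,   INF, 7],
     [8,   0,   2,   INF],
     [5,   INF, 0,   1],
     [2,   INF, INF, 0]]
print(floyd_warshall(w))   # e.g. the shortest 1 -> 0 distance drops from 8 to 5 (via 1 -> 2 -> 3 -> 0)
```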

Steps of Dynamic Programming Approach

A dynamic programming algorithm is designed using the following four steps (a small worked sketch follows the list) −

  • Characterize the structure of an optimal solution.

  • Recursively define the value of an optimal solution.

  • Compute the value of an optimal solution, typically in a bottom-up fashion.

  • Construct an optimal solution from the computed information.
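As one way to illustrate these four steps, the Longest Common Subsequence problem is used here purely as an example: the structure of an optimal solution is a table of prefix results, the recursive definition compares the last characters, the table is filled bottom-up, and an optimal subsequence is finally reconstructed from the table.

```python
def lcs(x, y):
    """Return a longest common subsequence of the strings x and y."""
    m, n = len(x), len(y)
    # Steps 1-3: c[i][j] = length of an LCS of x[:i] and y[:j], filled bottom-up.
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1      # last characters match
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    # Step 4: walk back through the table to construct an optimal solution.
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return "".join(reversed(out))

print(lcs("ABCBDAB", "BDCABA"))   # prints "BCBA", a common subsequence of length 4
```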

Dynamic Programming vs. Greedy vs. Divide and Conquer

In contrast to greedy algorithms, which optimize locally at each step, dynamic programming algorithms aim at an overall (global) optimization of the problem.

In contrast to divide and conquer algorithms, where the sub-problem solutions are simply combined to obtain the overall solution, dynamic programming algorithms use the output of smaller sub-problems to optimize progressively larger sub-problems. Dynamic programming algorithms use memoization to remember the outputs of already solved sub-problems.

Examples of Dynamic Programming

The following computational problems, all covered later in this tutorial, can be solved using the dynamic programming approach −

  • Matrix chain multiplication

  • Floyd-Warshall all-pairs shortest paths

  • 0-1 knapsack problem

  • Longest common subsequence

  • Travelling salesman problem (dynamic approach)

Dynamic programming can be applied in either a top-down (memoized recursion) or a bottom-up (tabulation) manner. In most cases, referring to a previously computed result is far cheaper, in terms of CPU cycles, than re-computing it.
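A rough sketch of both styles on the minimum-coin-change problem (the denominations are arbitrary example values): the top-down version memoizes recursive calls, while the bottom-up version tabulates results from smaller amounts to larger ones.

```python
from functools import lru_cache

COINS = (1, 3, 4)   # example denominations; coin 1 guarantees a solution exists

@lru_cache(maxsize=None)
def min_coins_top_down(amount):
    """Top-down: recurse on smaller amounts and memoize each result."""
    if amount == 0:
        return 0
    return 1 + min(min_coins_top_down(amount - c) for c in COINS if c <= amount)

def min_coins_bottom_up(amount):
    """Bottom-up: dp[a] = fewest coins summing to a, filled in increasing order of a."""
    dp = [0] * (amount + 1)
    for a in range(1, amount + 1):
        dp[a] = 1 + min(dp[a - c] for c in COINS if c <= a)
    return dp[amount]

print(min_coins_top_down(6), min_coins_bottom_up(6))   # both print 2 (use 3 + 3)
```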