If I had to pick a favorite among the many dynamic programming algorithms, it would be the Floyd-Warshall all-pairs shortest path algorithm.

```
procedure FloydWarshall()
    for k := 1 to n
        for i := 1 to n
            for j := 1 to n
                path[i][j] = min(path[i][j], path[i][k] + path[k][j])
```

Here's why it's cool: when you first learn about the shortest-path problem in your graph theory course, you probably start with Dijkstra's algorithm, which solves the *single*-source shortest path problem. It's quite complicated at first, but then you get over it, and you fully understand it.

Then the teacher says, "Now we want to solve the same problem, but for *all* sources." You think to yourself, "Oh god, this is going to be a much harder problem! *It's going to be at least N times more complicated than Dijkstra's algorithm!*"

Then the teacher gives you Floyd-Warshall. And your mind explodes. Then you start to tear up at how beautifully simple the algorithm is. It's just a triply-nested loop. It only uses a simple array for its data structure.
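That triply-nested loop translates almost line for line into runnable code. Here's a minimal sketch in Python; the names `dist` and `INF` are my own choices (the pseudocode above calls the matrix `path`):

```python
INF = float("inf")  # sentinel for "no edge"

def floyd_warshall(dist):
    """All-pairs shortest paths, computed in place on an n x n matrix.

    dist[i][j] is the weight of the edge from i to j (INF if absent),
    and dist[i][i] is 0. After the call, dist[i][j] is the length of
    the shortest path from i to j.
    """
    n = len(dist)
    for k in range(n):          # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Usage: a small directed graph with 4 vertices.
g = [
    [0,   5,   INF, 10],
    [INF, 0,   3,   INF],
    [INF, INF, 0,   1],
    [INF, INF, INF, 0],
]
floyd_warshall(g)
# g[0][3] is now 9, via the path 0 -> 1 -> 2 -> 3 (5 + 3 + 1),
# which beats the direct edge of weight 10.
```

Note the loop order: `k` must be the outermost loop, because the invariant is that after iteration `k`, `dist[i][j]` is the shortest path using only intermediate vertices from `{1, ..., k}`.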

The most eye-opening part for me is the following realization: say you have a solution for problem A. Then you have a bigger "superproblem" B which contains problem A. The solution to problem B may in fact be simpler than the solution to problem A.
