Podcast: Play in new window | Download
Subscribe: Apple Podcasts | Google Podcasts | Spotify | Stitcher | TuneIn | RSS
We continue digging into Rob Conery’s The Imposter’s Handbook as Joe explains Florida time, Allen likes greedy algorithms, and Michael shares his geography knowledge.
- Airbrake.io/CodingBlocks – Try Airbrake *free* for 30 days _and_ get 50% off the next 3 months!
- Thank you to everyone who left us a new review!
- iTunes: Friedlman, justanemployee, Orbitermonkey, RowedRage, Budlee, AhmadAnApp, nemonymous, jones_dad, Matt Kotwicki
- Stitcher: TroyandAbedintheModem, Rodolfo
- Come see Joe speak about Deliberate Practice for Programmers at the St Pete .NET Meetup
Finding the Shortest Route from A to Z
Dynamic Programming
- Optimization method for algorithms, developed in the 1950s by Richard Bellman.
- You simplify a problem by breaking it down into subproblems, then iterate over those sub-solutions to build up the answer.
- Works nicely with memoization – simply storing the results of common calculations so they only need to be computed once.
Directed Graph
- A set of vertices, connected by edges that have a specific direction.
- For example: A points to B, B points to C, C points to A.
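The memoization point above can be sketched with the classic Fibonacci example – a minimal Python illustration (not from the episode or book), using the standard library's `lru_cache`:

```python
from functools import lru_cache

# Naive recursion recomputes the same subproblems exponentially many times.
def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Memoization stores each subproblem's answer the first time it is computed,
# so every fib(k) is calculated exactly once.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(50))  # 12586269025, returned almost instantly
```

The naive version takes minutes for n = 50; the memoized one is effectively linear because each sub-solution is stored and reused.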
Bellman-Ford Algorithm
How it works
- It uses a mathematical approach called a relaxation method, meaning an iterative approach that starts with approximations and gets more exact as it iterates.
- Source node initialized to 0 and all other nodes initialized to infinity.
- Start at the home node, and then get the weights for each connected node – keep track of those.
- Then move to the next node on the list, and get the weights for each of their connected nodes and add those weights to the previous totals.
- If a node has already been given a weight and the new weight is lower, replace that node’s weight with the new one (you’re always trying to get smaller) … i.e. you _relax_ the weight.
- You repeat this for |V| – 1 iterations (total number of vertices minus 1).
- Worst case, the time complexity is O(number of vertices * number of edges) … i.e. O(|V||E|).
- Best case, the time complexity is O(number of edges) … i.e. O(|E|).
- And worst case, the space complexity is O(number of vertices) … i.e. O(|V|).
- If each vertex has at least one outgoing edge, we can approximate the complexity to O(n^2).
- Works well for simple graphs, but not the most efficient algorithm for more complex or dense graphs.
- Since Bellman-Ford allows negative edge costs, what happens if the edges along a cycle sum to a negative total? That would be a “negative cycle”.
- If there is a negative cycle that is reachable from the starting point, then there is no “cheapest path”.
- This is because any point along this negative cycle path can be made cheaper by another “walk” around the negative cycle.
- This is both a pro and a con for the Bellman-Ford algorithm.
- On the one hand, because of the existence of the negative cycle, a correct answer cannot be found. Con.
- However, if you are looking _for_ negative cycles, then, once found you can stop processing and report the find. Pro.
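The steps above – initialize the source to 0 and everything else to infinity, relax every edge |V| − 1 times, then make one extra pass to detect negative cycles – can be sketched in Python (an illustration written for these notes, not code from the book):

```python
INF = float("inf")

def bellman_ford(edges, num_vertices, source):
    """edges are (u, v, weight) triples; vertices are numbered 0..num_vertices-1.
    Returns the shortest distances from source, or None if a negative
    cycle is reachable from the source."""
    # Source node initialized to 0, all other nodes initialized to infinity.
    dist = [INF] * num_vertices
    dist[source] = 0

    # Relax every edge |V| - 1 times.
    for _ in range(num_vertices - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w

    # One more pass: if anything still improves, a negative cycle exists.
    for u, v, w in edges:
        if dist[u] + w < dist[v]:
            return None

    return dist

# A -> B (4), A -> C (1), C -> B (-2), B -> D (3); note the negative edge.
edges = [(0, 1, 4), (0, 2, 1), (2, 1, -2), (1, 3, 3)]
print(bellman_ford(edges, 4, 0))  # [0, -1, 1, 2]
```

The final pass is the “pro” mentioned above: returning `None` is exactly the negative-cycle report.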
Dijkstra’s Algorithm
About the algorithm
- Similar to the Bellman-Ford algorithm, you end up with the shortest paths from one node to every other node.
- Also, similar to the Bellman-Ford algorithm, you’ll use a weighted, directed graph again.
- Except … Dijkstra’s algorithm does not handle negative weights.
- Easy to adapt this algorithm for various other purposes, notably A*.
How it works
- Keep a table of shortest paths from your start node to every other node and keep track of which nodes have been visited.
- The start node to itself is initialized to 0, every other one is initialized to infinity.
- Start with the unvisited node with the lowest weight (first run, this will be the starting node because it has 0).
- Iterate through its connected neighbors: if the distance from the starting node through this node to the neighbor is smaller than the recorded distance from the starting node to the neighbor, record the new lowest distance (and the path, if you like).
- Repeat until there are no more unvisited nodes.
- The secret here is that we’re eliminating duplicate calculations – we only ever add the known distance from the starting node to our current node to the distances to its immediate neighbors.
- The time complexity? Well, it depends on the implementation …
- Adjacency List and Priority queue: O((|V|+|E|) log |V|).
- Matrix and Priority queue: O((|V|^2) log |V|).
- Fibonacci Heap: O(|V| + |E| log |V|).
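Here’s how those steps might look in Python using a priority queue (the heap-based variant from the complexity list above; the code is a sketch written for these notes):

```python
import heapq

def dijkstra(graph, source):
    """graph maps each node to a list of (neighbor, weight) pairs.
    Returns the shortest distances from source; assumes no negative weights."""
    # Start node to itself is 0, every other node is initialized to infinity.
    dist = {node: float("inf") for node in graph}
    dist[source] = 0
    # The priority queue always hands us the unvisited node with the lowest weight.
    queue = [(0, source)]
    visited = set()

    while queue:
        d, node = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        # Relax each neighbor: known distance to this node plus the edge weight.
        for neighbor, weight in graph[node]:
            new_dist = d + weight
            if new_dist < dist[neighbor]:
                dist[neighbor] = new_dist
                heapq.heappush(queue, (new_dist, neighbor))

    return dist

graph = {
    "A": [("B", 4), ("C", 1)],
    "B": [("D", 3)],
    "C": [("B", 2), ("D", 6)],
    "D": [],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 6}
```

Note the greedy step: the heap pop always picks the cheapest unvisited node, which is why negative weights would break the result.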
Bellman-Ford vs Dijkstra
- You _can_ get the same result, assuming no negative weights.
- Dijkstra’s algorithm is greedy (it makes locally optimal decisions), and the Bellman-Ford algorithm is not.
- Bellman-Ford works with negative edge weights (both fail with negative cycles).
Resources We Like
- The Imposter’s Handbook (bigmachine.io)
- How to Solve Any Dynamic Programming Problem (Pramp)
- Directed graph (Wikipedia)
- Bellman-Ford algorithm (Wikipedia)
- Dijkstra’s algorithm (Wikipedia)
- Bellman-Ford in 5 minutes — Step by step example (YouTube)
- Dijkstra’s algorithm in 3 minutes — Review and example (YouTube)
- A* search algorithm (Wikipedia)
Tip of the Week
- Switch focus between editor and integrated terminal in Visual Studio Code (Stack Overflow)
- Are you a student? You can use Azure on the cheap. (Microsoft Azure)
- All of the engineering blogs in one place. (GitHub)
- Always start the right project with this simple secret in Visual Studio (tabs over spaces)