Relative Content

Tag Archive for dijkstra

Is Dijkstra’s algorithm first proposed by Leyzorek et al.?

According to Wikipedia's Shortest_path_problem entry, Dijkstra’s algorithm was first proposed by Leyzorek et al. in 1957, in a paper called “Investigation of Model Techniques — First Annual Report — 6 June 1956 — 1 July 1957 — A Study of Model Techniques for Communication Systems”. My questions are:

Shortest Path Between Two Nodes in a +10 Million Nodes Graph

I have my own knowledge graph representation, read from ConceptNet and NELL, containing tens of millions of nodes, in which I want to calculate the shortest distance (if any) between two concept nodes. The application is to find out how two concepts are related in the simplest explainable way. Typical node degrees lie in the range 100-1000. Do I need some variant of Dijkstra's in this case? I want the solution to require at most around 10 GB of RAM. My current memory usage is around 2 GB.
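If the edges carry no meaningful weights and "nearest" means fewest hops, a Dijkstra variant is overkill: breadth-first search already gives shortest hop counts, and searching from both endpoints at once keeps the explored frontier far smaller than a one-sided search on a graph with degrees in the hundreds. A minimal sketch (assuming, hypothetically, that the graph fits in memory as a dict of adjacency lists, undirected):

```python
from collections import deque
import math

def shortest_hop_distance(adj, source, target):
    """Fewest-hop distance between source and target via bidirectional BFS,
    or None if no path exists.

    adj: dict mapping each node to a list of neighbours
    (undirected: if u lists v, then v also lists u).
    """
    if source == target:
        return 0
    dist_s, dist_t = {source: 0}, {target: 0}
    q_s, q_t = deque([source]), deque([target])
    while q_s and q_t:
        # Expand one full BFS level on the side with the smaller frontier.
        if len(q_s) <= len(q_t):
            q, dist, other = q_s, dist_s, dist_t
        else:
            q, dist, other = q_t, dist_t, dist_s
        best = math.inf
        for _ in range(len(q)):
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    if v in other:            # the two searches met
                        best = min(best, dist[v] + other[v])
                    q.append(v)
        # After a complete level, the minimum meeting point is optimal.
        if best < math.inf:
            return best
    return None  # one side was exhausted without meeting the other
```

Keeping predecessor pointers alongside the `dist` dicts would recover the actual connecting path, which is what the "simplest explainable way" use case needs; memory stays proportional to the nodes actually visited, not the whole graph.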

Why does this implementation of Dijkstra’s algorithm work in O(n^2)?

Here is the code I use for implementing Dijkstra’s algorithm. Consider a graph with n vertices and m edges. Shouldn’t it run in O(n^2 m)? Someone may say that there are n vertices and each edge gets processed once, therefore it is O(n m). But the while loop can run at most n times, the first for loop at most n times and the second for loop at most m times. Therefore, this should be an O(n^2 m) implementation.
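The asker's code is not reproduced here, but the analysis typically applies to the classic array-based implementation, where the resolution is that the loop bounds are not independent: the edge-relaxation loop touches only the edges leaving the vertex just finalized, so summed over all n iterations it does m work in total, not m work per iteration, giving O(n^2 + m) = O(n^2). A hedged sketch of such an implementation:

```python
import math

def dijkstra(n, adj, source):
    """Array-based Dijkstra. adj[u] is a list of (v, weight) pairs with
    non-negative weights. Returns shortest distances from source
    (math.inf for unreachable vertices)."""
    dist = [math.inf] * n
    dist[source] = 0
    visited = [False] * n
    for _ in range(n):                       # outer loop: at most n iterations
        # First inner loop: O(n) linear scan for the closest unvisited vertex.
        u, best = -1, math.inf
        for v in range(n):
            if not visited[v] and dist[v] < best:
                u, best = v, dist[v]
        if u == -1:                          # remaining vertices are unreachable
            break
        visited[u] = True
        # Second inner loop: relaxes only the edges leaving u, so across the
        # whole run every edge is examined once -- m total, not m per iteration.
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```

The n scans of cost n dominate, hence O(n^2); replacing the linear scan with a binary heap would give the familiar O((n + m) log n) instead.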

Historical context of Dijkstra’s remarks on ellipses to denote ranges

Some programming languages, like Ada or Ruby, use a series of dots to denote inclusive/exclusive ranges. In “Why numbering should start at zero” (EWD831), Dijkstra discusses alternative conventions, dismissing the dots as “pernicious” without going into further detail. Is anybody aware of any other writings that do elaborate on this sentiment of his, or on its historical context?