How I think history books circa 2100 will read:
When widespread adoption of classical computers and the Internet arrived in the late 20th century, human society suddenly became capable of quickly storing, accessing, and analyzing huge data sets. The stage was set for a band of young technologists called Google to harness Probability and Statistics to analyze these huge data sets. PageRank was their algorithm. Simple math, just finding the stationary distribution of a Markov chain, was key to how Google changed the world.
A second wave of technology followed soon after, and this one towered over the first in importance. Quantum Computer mainframes became available from 2020 onwards. These were harnessed by the visionary <insert your name here> to analyze, once again, huge data sets, again using the tools of Probability and Statistics (such as Bayesian Networks, of which Markov Chains are an example). But this time the data sets came fast and furious, not just from what people wrote and posted on the Internet, but also from AI and biotech and nanotech. Quantum computers could perform such analyses much faster than classical computers. Subtle AI finally became possible. Quantum thinkers that could ponder petabytes of data and make sense of it in a heartbeat; machines, called johnnies, with IQs thousands of times higher than that of Johnny von Neumann, had finally arrived.
Okay, enough geek daydreaming for today. Here are some excellent links explaining Google's PageRank. It's clever but simple math (understandable by most undergraduate students in engineering or science).
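To make the "simple math" concrete, here is a minimal sketch of PageRank as the stationary distribution of a Markov chain, computed by power iteration. The four-page link graph and the variable names are made up purely for illustration; real implementations must also handle pages with no outgoing links.

```python
damping = 0.85  # the usual damping factor from the PageRank papers

# Hypothetical toy web: links[i] = list of pages that page i links to
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = len(links)

# Start from the uniform distribution and repeatedly apply the chain:
# rank'(p) = (1 - d)/n + d * sum(rank(q)/outdegree(q) for all q linking to p)
rank = [1.0 / n] * n
for _ in range(100):
    new = [(1.0 - damping) / n] * n
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)  # each outlink gets an equal share
        for target in outlinks:
            new[target] += damping * share
    rank = new

# rank is now (approximately) the stationary distribution: it sums to 1,
# and pages with more (and better) inlinks score higher.
print(rank)
```

In this toy graph, page 3 has no inlinks at all, so it ends up with the minimum possible score, (1 - d)/n; that floor is exactly the "random surfer teleports" part of the model.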
BigTable and MapReduce are also very interesting Google technologies that go hand in hand with PageRank. Michael Nielsen (co-author of a popular textbook on Quantum Computing) explains those nicely in his Google Technology Stack series.