Quantum Bayesian Networks

June 24, 2010

ExaFLOPS Computing, Is it a Foolhardy Pursuit Headed for a Painful Belly-FLOPS?

Filed under: Uncategorized — rrtucci @ 7:48 pm

HPC industry

Previously (here and here), I discussed the present state of supercomputing (a.k.a. High Performance Computing, or HPC) and its relevance to quantum computing. Let me now say a few words about the future plans of the HPC community. They are attempting to build, hopefully before the end of this decade, machines that use multicore processors to achieve exaFLOPS speeds (1e18 floating point operations per second). Current technology achieves a mere ~1e15 FLOPS.

IBM, Cray, Intel, and HP are each pouring millions of dollars into R&D of exascale computing. Unfortunately, those same companies have invested approximately zero in quantum computing. They haven’t even funded a no-brainer like an X-prize for quantum computing. The sad truth is that about 99.9% of current research into quantum computing in the USA is being funded by military and spy agencies.

It’s fine and natural for those companies to try to extend what they already know—multicores. What I find troubling is that they are not investing in quantum computing at the same time. This despite the fact that commercially viable exascale computing will be extremely difficult to achieve—as difficult or even more so than quantum computing, IMHO.

Failure in achieving commercially viable exascale computing is a very real possibility. Dreams of parallel computing have flopped monumentally before; for example, with Japan’s Fifth Generation computer project. Even Linus Torvalds, whose common sense is legendary, has expressed some reservations about exascale computing using multicore processors. Even more damning is the 2008 Kogge report, a report commissioned by DARPA and authored by a slew of supercomputing luminaries, which concluded that power consumption, especially the power used to “move data around”, will make multicore exascale computers impractical. Is “deep computing” in deep trouble?

Some people might say: Oh well, even if we fail to achieve commercially viable exascale computing, our efforts in that direction will still produce numerous spinoffs. Well, I think quantum computing has the potential to produce even more spinoffs than exascale computing. Why? Let me put it this way. Exascale computing is like reaching for some extremely high fruit in the tree of multicore processors, because all the lowest-lying fruit have already been picked from that tree. Quantum computing, on the other hand, is like picking low-lying fruit from a tree that has never been picked before.

Here we have quantum computing, most likely a greener (more energy efficient) technology than multicore processors, and one which, if successful, could produce thousands of new jobs for the USA. But are those companies funding it? Nah. Instead, those geriatric companies are trying to build a horse buggy 10 times the usual length, even when everyone tells them that maybe they should also look into internal combustion engines.

It appears that QCs can perform only certain special tasks better than classical computers. Hence, QCs will probably never totally replace classical computers, just complement them. Therefore, I believe in the need for research into exascale computing. And I sincerely hope that the HPC community will succeed in their quest for exascale computing. But let us be perfectly frank. According to many supercomputer experts, there are many, many reasons why (commercially viable) exascale computing may fail. Oh, let me count the ways.

A Toaster by any other name is still a …
Today’s fastest supercomputers have about 2e5 cores. At 10 watts/core, that means they consume 2e6 Watts, as much power as a whole town. Using the same technology with 1e8 cores would consume 1e9 Watts, the power produced by a medium-sized nuclear power plant. This is clearly indefensible. Hence, to achieve commercially viable exascale computing, the current technology must be made dramatically more energy efficient. The exascale computer advocates have set themselves a limit of 1e7 Watts per installation. I personally believe that 1e7 Watts is still too high. Even the current 2e6 Watts per installation is already too high, in my opinion.
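The arithmetic above is worth making explicit. Here is a back-of-the-envelope sketch in Python, using only the numbers quoted in this post (the 10 watts/core figure and the 1e7 Watt budget; the implied joules-per-FLOP target follows from them):

```python
# Back-of-the-envelope power math for multicore supercomputers.
# Assumption (from the post): ~10 W per core, with per-core power unchanged
# as the core count scales up.
WATTS_PER_CORE = 10.0

def installation_power(num_cores, watts_per_core=WATTS_PER_CORE):
    """Total power draw if per-core power stays fixed."""
    return num_cores * watts_per_core

today = installation_power(2e5)      # 2e6 W: a whole town
exascale = installation_power(1e8)   # 1e9 W: a medium-sized nuclear plant

# The self-imposed 1e7 W budget implies a required energy efficiency:
budget_watts = 1e7
flops_target = 1e18
joules_per_flop = budget_watts / flops_target   # 1e-11 J, i.e. 10 picojoules per FLOP

print(today, exascale, joules_per_flop)
```

In other words, hitting the 1e7 Watt budget requires roughly a 100x improvement in energy per operation over the naive scale-up.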

Making supercomputers more energy efficient will require revolutions in chip design, interconnects, and cooling. Furthermore, "the currently available main memories (DRAM) and disk drives (HDD)… consume way too much power. New technologies are needed" there too.

And forget about obtaining any dramatic boosts in performance by further miniaturizing chip components. We’ve pretty much reached the end of that road, unless you want to reduce transistors to a few atoms, and then you might as well be doing quantum computing.

Programming Nightmare
Current top500 supercomputers have ~1e5 cores. The planned exascale computers will have ~1e8 cores. With that many cores, one faces the prospect of cores failing constantly. The software running exascale computers will have to detect and correct hardware failures on the fly, or else the computer will only run for short stretches of random length between crashes. So radically new "autonomous" software will be required. (Deja vu? QCs may also require error correction! But note that with QCs, one doesn't correct for hardware failures; one corrects instead for noise-induced errors, which is preferable if you want your hardware to last longer.)
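
To make the "autonomous software" point concrete, here is a toy checkpoint-and-restart loop in Python. This is only a sketch of the general idea, not any real HPC framework; the function names and the failure model are made up for illustration:

```python
# Toy model of fault-tolerant execution: checkpoint periodically, and on a
# (simulated) core failure, roll back to the last good state instead of dying.
import random

def run_with_checkpoints(total_steps, checkpoint_every, fault_prob, seed=0):
    """Complete `total_steps` units of work despite random simulated faults."""
    rng = random.Random(seed)
    state, checkpoint, step = 0, 0, 0
    restarts = 0
    while step < total_steps:
        if rng.random() < fault_prob:
            # Simulated hardware failure: rewind to the last checkpoint.
            step = (step // checkpoint_every) * checkpoint_every
            state = checkpoint
            restarts += 1
            continue
        state += 1                            # one unit of real work
        step += 1
        if step % checkpoint_every == 0:
            checkpoint = state                # save known-good state
    return state, restarts
```

With 1e8 cores the `fault_prob` per unit of work stops being negligible, and all the work done since the last checkpoint is repeatedly thrown away; that wasted recomputation is part of the software challenge.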

Writing and debugging parallel programs is already notoriously difficult. Typically, one has to break the program into as many semi-autonomous tasks as possible, and design by hand how all those tasks are going to communicate with each other. Then one has to worry about the possibility that the computer will hang due to network collisions. These problems will be exacerbated by increasing the number of cores from 1e5 to 1e8. (Deja vu? QCs are also hard to program)
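
For readers who haven't tried it, here is the flavor of that hand decomposition in miniature, using Python threads as stand-ins for cores (a real supercomputer would use something like MPI ranks; the strided chunking here is just one arbitrary choice the programmer has to make):

```python
# Sketch: the programmer, not the machine, carves the work into
# semi-autonomous tasks and decides how results flow back together.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    return sum(chunk)          # one "task", runnable independently

def parallel_sum(data, num_workers=4):
    # Hand-chosen decomposition: strided chunks, one per worker.
    chunks = [data[i::num_workers] for i in range(num_workers)]
    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

Even this trivial example required choosing a decomposition and a reduction step by hand; real scientific codes have irregular data dependencies, and the choices multiply.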

A Prestige Race: Price Tag Puts It Out of Reach For Almost Everyone
Just one of today’s fastest supercomputers (Jaguar, Nebulae, Roadrunner) costs more than $100 million (and that doesn’t include the additional costs of infrastructure, maintenance, staff and electric bills). One can expect exaFLOPS computers to cost at least that much. Who is going to be able to afford them, except maybe China? European countries and Japan are currently instituting austerity measures. The USA is reeling after 2 wars, the financial disaster of 2008 with the ensuing bailouts and stimulus packages, record unemployment, 48 out of 50 states that can’t balance their budgets, and the Gulf of Mexico oil spill. Few private companies or universities can afford to spend $100 million on just one computer, especially considering that most computers quickly become obsolete. Volunteer distributed computing like BOINC shields the server managers from this huge price tag. But with current technology, an exaFLOPS BOINC would require its volunteers to collectively consume 1e9 Watts, which is kind of unconscionable. As I already pointed out here, even now, high performance computing is a race in which only rich countries can participate. India, South America, and Africa are too poor to compete, and even mighty Japan has been falling behind in this race for at least a decade.

Can Only Solve Narrow Class of Problems
Exascale computer software will have to distribute workload approximately evenly among 1e8 cores, or else the computer’s advantage will be nullified. There is only a narrow class of practical problems for which this is feasible. (Deja vu? Quantum computers also excel at solving only a narrow class of problems).
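
The load-balancing constraint can be captured in one line: the machine is only as fast as its busiest core. A tiny illustration (the work distributions below are invented for the example):

```python
# If work isn't spread evenly, the whole machine waits on the busiest core.
# Effective speedup = total_work / max(per-core work),
# not total_work / (total_work / num_cores).
def speedup(per_core_work):
    total = sum(per_core_work)
    return total / max(per_core_work)

balanced   = speedup([25, 25, 25, 25])   # 4.0: perfect use of 4 cores
imbalanced = speedup([70, 10, 10, 10])   # ~1.43: 4 cores, barely faster than 1
```

Scale the same effect to 1e8 cores and even a small imbalance leaves most of the machine idle, which is why only problems that decompose very evenly benefit.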

Wladawsky-Berger says in Ref.1,
“One of their most compelling conclusions is that with exascale computing, we are reaching a tipping point in predictive science, an area with potentially major implications across a whole range of new, massively complex problems.”

Hey! The same is true of quantum computing. Quantum computerists also plan to do MCMC (Markov Chain Monte Carlo) to make predictions based on probabilities. Using statistical techniques when faced with large data sets is a no-brainer. But note that computational complexity theory tells us that QCs can do MCMC much faster (in a time-complexity sense) than classical computers (and that means faster than a zillion cores, or a biological computer, which is a classical computer too).
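
For concreteness, here is a bare-bones classical Metropolis sampler, the simplest instance of the MCMC mentioned above. The quantum speedup referred to concerns the chain's mixing time, not this outer loop; the target weights below are arbitrary:

```python
# Minimal Metropolis sampler for a discrete distribution proportional to
# `weights`, using a uniform (symmetric) proposal.
import random

def metropolis(weights, steps, seed=0):
    """Return visit counts; counts[i]/steps approaches weights[i]/sum(weights)."""
    rng = random.Random(seed)
    n = len(weights)
    x = 0                                      # current state of the chain
    counts = [0] * n
    for _ in range(steps):
        y = rng.randrange(n)                   # propose a state uniformly
        if rng.random() < min(1.0, weights[y] / weights[x]):
            x = y                              # accept the move
        counts[x] += 1
    return counts

counts = metropolis([1, 2, 3, 4], 100_000)
```

The chain spends time in each state in proportion to its weight, which is exactly the "prediction from probabilities" workload; the classical cost is dominated by how long the chain takes to mix.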


  1. Opinion – Challenges to exascale computing, by Irving Wladawsky-Berger (retired from IBM), ISGTW (International Science Grid This Week), April 7, 2010
  2. Future exaflop supercomputers will consume too much power without new software, by Anne-Marie Corley, IEEE Spectrum, June 2010
  3. Exascale supercomputers: Can’t get there from here?, by Sam Moore, Tech Talk blog, IEEE Spectrum, October 7, 2008
  4. Why the computer is doomed, by Omar El Akkad, The Globe and Mail, Jan. 29, 2010
  5. Exascale Computing Requires Chips, Power and Money, by Alexis Madrigal, Wired, Feb. 22, 2008


  1. how do you know that biological computers are classical. and what counts as a biological computer?

    Comment by Joshua vogelstein — June 26, 2010 @ 1:45 pm

  2. Joshua, I was thinking mainly of DNA computers, which are the only ones that we are actively trying to build. Those are expected to perform only classical logic operations like AND, NOT, and OR.

    Comment by rrtucci — June 26, 2010 @ 5:45 pm

  3. I think that it is entirely reasonable to build computing systems capable of executing 10^18 FLOPS, using existing semiconductor process technologies. All that’s needed is a look into alternate architectures, as we’ve clearly panned out the von Neumann model.

    Comment by Mike Warot — July 2, 2010 @ 2:13 am

  4. A sales pitch for your own turf, nothing wrong with that, but…
    What if in the end both exascale and quantum computing are infeasible?
    I mean, what if “mother nature” herself only walks haphazardly into tiny subsets of a HUUUGE search space without any intent whatsoever?
    Our obsession with “goals and purposes” might be delusional in the sense that arbitrary goal seeking could be intractable in principle.

    Comment by Kevembuangga — August 4, 2010 @ 6:10 am

  5. Kevin: I might be wrong, but I believe quantum computing is a technology that will work well and be commercially viable (like engine powered human flight, or transistors or lasers) and soon too.

    Comment by rrtucci — August 4, 2010 @ 3:13 pm

  6. […] to detail about the fact that quantum computing is a better way to work on super huge problems. ExaFLOPS Computing, Is it a Foolhardy Pursuit Headed for a Painful Belly-FLOPS? Quantum Bayesian Net… __________________ Jacare on EA Sports MMA "If you do not like the game tell him he will […]

    Pingback by Fujitsu delivers a 10 petaflops supercomputer - Sherdog Mixed Martial Arts Forums — September 28, 2010 @ 5:01 pm

  7. […] read over here. Outlined a few interesting points for me, specifically that QC might be more of a […]

    Pingback by Good time to be getting into Quantum Computing I think « Interesting items — November 22, 2010 @ 6:03 am

  8. […] Interesting American article posted a few days ago that goes directly to concerns Tucci posted earlier in the year. Specifically, it is about whether the search for faster and faster (classical) computers is […]

    Pingback by It’s time to invest in algorithms « Interesting items — December 23, 2010 @ 8:28 am

  9. Beautiful article. The companies should spend heavily on quantum computing, which is very promising.

    Comment by KALLOL ROY — October 1, 2012 @ 5:04 pm
