Quantum Bayesian Networks

March 30, 2017

Betting the farm on Rigetti Spaghetti

Filed under: Uncategorized — rrtucci @ 7:40 pm

#ElvisSighting #SteveJobsSighting

Steve Jobs and/or his Reality Distortion Field is still alive!!!

Rigetti is a quantum computer company that was founded about 3 years ago with initial seed funding of $2.5M. Two days ago, the press revealed that an additional, whopping $64M has been invested in the company. The company is named Rigetti after its founder, Chad Rigetti. Sorry, but rigetti is not a new kind of macaroni. “il rigetti” actually means “he discards” in Italian.

Unlike our company artiste-qb.net, which does only qc software (and also quantum-ready classical software), Rigetti does mostly qc hardware (gate model, superconducting qubits). They intend to offer access to their machine via the cloud, something that D-Wave and IBM already do, but which slow-poke Google doesn’t do yet, although they are probably months away from doing it, to keep up with the Joneses.

Although a $64M investment in qc is good news for artiste-qb.net, because a rising tide lifts all boats, I advise you not to invest your pension in this one. I consider it a very risky investment, for the following reasons.

As far as I know, team Rigetti has yet to publish performance results for their hardware, and thus to subject their machine to scientific scrutiny the way Martinis’s group at Google has amply done. (Team Rigetti did publish one paper, https://arxiv.org/abs/1703.04613, on Mar 15, two weeks ago, but that paper covers only a single qubit, and getting 2 qubits to interact in an ideal way is much more difficult.)

Alibaba, Google, IBM, Intel, and Microsoft (in alphabetical order), not to mention the governments of Australia, Canada, China, the EU, Holland, the UK and the USA, are each spending tens of millions of dollars per year to achieve the same goal as Rigetti: to build an elusive gate-model quantum computer.

Silicon Valley is like a Republican echo chamber: a small, outlier perturbation can snowball in a hurry. Furthermore, the masterminds of Silicon Valley often try to grow their companies in a hurry, hoping that they quickly become near monopolies that are too big to fail. That might work for a service company like Uber, but it might not work for a company racing against some of the biggest nations and companies in the world to build/invent a very NOVEL device. In this case, the company is strictly accountable to Mother Nature, and she might not favor it.

March 20, 2017

BNT and PNL, two masterpieces of Bayesian Networks retro-art

Filed under: Uncategorized — rrtucci @ 8:58 pm

An update on the latest adventures of our company artiste-qb.net.

In previous blog posts, I waxed poetic about Marco Scutari’s open source software bnlearn, for learning the structure of bnets (Bayesian networks) from data. bnlearn is written in the language R, whereas Quantum Fog is written in Python. But by using Jupyter notebooks with Rmagic installed, we have been able to write notebooks that run QFog and bnlearn side by side on the same bnets, and to compare the outputs of the two programs. That is a good bench-marking exercise for the bnet-learning side of QFog, but what about its bnet-inference side?

Two open source programs that are very good at doing bnet inference (and many other things too) are BNT (Bayes Net Toolbox, by Kevin Murphy et al.) and OpenPNL (PNL = Probabilistic Networks Library, written by Intel; I like to call it PaNeL to remember the letters quickly).

So our next adventure is to learn how to use BNT and PNL and to compare them to QFog.

BNT is written in Matlab. PNL is written in C++ but it includes Matlab wrappers for most of its functions. Both BNT and PNL are very mature and comprehensive programs. Since its core is written in C++ rather than Matlab, we expect PNL to be much faster than BNT for large networks.
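To illustrate the kind of problem bnet-inference engines like BNT and PNL solve, here is a toy sketch in Python. It uses a hypothetical 2-node network (Rain → WetGrass) with made-up probabilities, solved by brute-force enumeration rather than the junction tree algorithm those libraries actually use; it is not QFog, BNT or PNL code.

```python
# Toy bnet: Rain -> WetGrass, with hypothetical probabilities.
P_rain = {True: 0.2, False: 0.8}                      # prior P(Rain)
P_wet_given_rain = {True:  {True: 0.9, False: 0.1},   # P(Wet | Rain=True)
                    False: {True: 0.3, False: 0.7}}   # P(Wet | Rain=False)

def p_rain_given_wet(wet):
    """P(Rain=True | Wet=wet) via Bayes' rule, summing Rain out of the denominator."""
    joint = {r: P_rain[r] * P_wet_given_rain[r][wet] for r in (True, False)}
    return joint[True] / sum(joint.values())

print(round(p_rain_given_wet(True), 4))   # -> 0.4286: rain is more likely once we see wet grass
```

Real inference engines do the same conditioning, but exploit the network's graph structure (e.g. via junction trees) so the summation doesn't blow up exponentially with the number of nodes.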

As you already know if you’ve ever checked Matlab’s pricing, the software is very costly for everyone except students. However, this is one case when the open source gods have listened to our prayers. Octave is a free, open source program that can run 99% of a Matlab program and the few differences between Matlab and Octave (mostly in the plotting packages) are well documented. Furthermore, one can run Octave in a Jupyter notebook, either on an Octave kernel or on a Python kernel with octavemagic (oct2py) installed.

So in order to compare QFog to bnlearn, we’ve had to start using Jupyter notebooks on an R kernel or on a Python kernel with Rmagic. And in order to compare QFog with BNT & PNL, we’ve had to start using Jupyter notebooks on an Octave kernel or on a Python kernel with octavemagic. We have seen the light, and we are now believers in a holy trinity of computer languages (diversity and open source are our strength, Mr Trump):

Python, R, Octave
(our polyglot notebooks)
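To make this concrete, here is a sketch of what such a polyglot notebook can look like on a Python kernel. The magics are real (rpy2 provides %%R, oct2py provides %%octave), but they require R, Octave and those two Python packages to be installed, and the cell contents below are just hypothetical examples, not our actual benchmarking code:

```
# cell 1 (Python kernel): load the R and Octave cell magics
%load_ext rpy2.ipython
%load_ext oct2py.ipython

# cell 2: run bnlearn in R on one of its bundled datasets
%%R
library(bnlearn)
dag <- hc(learning.test)   # hill-climbing structure learning
print(dag)

# cell 3: hand a Python matrix A to Octave and pull B back
%%octave -i A -o B
B = inv(A);
```

The -i/-o flags on %%octave push and pull named variables between the Python and Octave workspaces, and %%R has analogous flags, which is what lets QFog's Python output and bnlearn's or BNT's output sit side by side in one notebook.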

Curiously, Duke Univ. offers a course called “Computational Statistics in Python” that also advocates using Jupyter notebooks, with Python, R and Matlab intermixed, to do statistics. So when two cultures independently come up with the same idea, it’s probably a good one, don’t you think?

Since BNT is written in Matlab, running it does not require any compilation. PNL, on the other hand, is written in C++, so it does. Compiling PNL has proven a difficult task because the software is ten years old, but, after a lot of sweat and tears, our wiz Henning Dekant has managed to compile it (a few issues remain).

BNT was last changed in a bigly way circa 2007 (or even earlier), and PNL in 2006. (bnlearn, by comparison, is still very active.) BNT and PNL belong to what I like to call the first bnet revolution (inference by junction tree), whereas bnlearn belongs to the second revolution (structure learning from Markov blankets). Even though PNL belongs to the first revolution rather than the second, it is a major mystery to me why Intel abandoned it. PNL is a very impressive, large and mature piece of software. A lot of work, love and passion seems to have gone into it. Then, sometime in mid 2006, it seems to have been abandoned in a hurry, and now it looks like a ghost town, or the deserted island in the video game Myst. (I already know how the game Myst ends.) If anyone knows why Intel stopped PNL development circa 2006, I would appreciate it if you would tell me, either in public on this blog or in private by email. Luckily, Intel had the wisdom to make PNL open source. PNL will go on, because 💎OPEN SOURCE IS FOREVER💎.

Sorry for the length of this blog post (almost as long as a Scott Aaronson blog post or a never ending New Yorker article).

[Images: myst.jpg, mystmeme.jpg]
