Quantum Bayesian Networks

March 20, 2017

BNT and PNL, two masterpieces of Bayesian Networks retro-art

Filed under: Uncategorized — rrtucci @ 8:58 pm

An update on the latest adventures of our company, artiste-qb.net.

In previous blog posts, I waxed poetic about Marco Scutari’s open source software bnlearn for learning the structure of bnets (Bayesian networks) from data. bnlearn is written in the language R, whereas Quantum Fog is written in Python. But by using Jupyter notebooks with Rmagic installed, we have been able to write notebooks that run QFog and bnlearn side by side on the same bnets and compare the outputs of the two programs. That is a good benchmarking exercise for the bnet-learning side of QFog, but what about its bnet-inference side?
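For the curious, here is a minimal sketch of what such a side-by-side cell looks like. It assumes a notebook on a Python kernel with rpy2 (which provides Rmagic) and the R package bnlearn installed; learning.test is a toy data set that ships with bnlearn. This is an illustration, not a snippet from our actual notebooks.

    # Python-kernel cell: load rpy2's Rmagic so R cells can live in the notebook
    %load_ext rpy2.ipython

    %%R
    # R cell: structure learning with bnlearn on its built-in toy data set
    library(bnlearn)
    data(learning.test)
    dag <- hc(learning.test)   # hill-climbing structure learning
    print(modelstring(dag))    # learned structure, ready to compare with QFog's

A following Python cell can then learn the same structure with QFog and compare the two answers in one document.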

Two open source programs that are very good at doing bnet inference (and many other things too) are BNT (Bayes Net Toolbox, by Kevin Murphy et al.) and OpenPNL (PNL = Probabilistic Networks Library, written by Intel; I like to call it PaNeL to remember the letters quickly).

So our next adventure is to learn how to use BNT and PNL and to compare them to QFog.

BNT is written in Matlab. PNL is written in C++ but it includes Matlab wrappers for most of its functions. Both BNT and PNL are very mature and comprehensive programs. Since its core is written in C++ rather than Matlab, we expect PNL to be much faster than BNT for large networks.

As you already know if you’ve ever checked Matlab’s pricing, the software is very costly for everyone except students. However, this is one case when the open source gods have listened to our prayers. Octave is a free, open source program that can run roughly 99% of Matlab code, and the few differences between Matlab and Octave (mostly in the plotting packages) are well documented. Furthermore, one can run Octave in a Jupyter notebook, either on an Octave kernel or on a Python kernel with octavemagic (oct2py) installed.
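Here is a minimal sketch of octavemagic in practice, assuming a Python kernel with oct2py installed and Octave on the system path (again, just an illustration):

    # Python-kernel cell: load oct2py's octavemagic
    %load_ext oct2py.ipython

    %%octave -o y
    % Octave cell: compute in Octave, export y back to the Python namespace
    x = linspace(0, 2*pi, 100);
    y = sin(x);

    # Back in Python, y arrives as a NumPy array
    print(y.shape)   # (1, 100): an Octave row vector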

So in order to compare QFog to bnlearn, we’ve had to start using Jupyter notebooks on an R kernel or on a Python kernel with Rmagic. And in order to compare QFog with BNT and PNL, we’ve had to start using Jupyter notebooks on an Octave kernel or on a Python kernel with octavemagic. We have seen the light, and we are now believers in a holy trinity of computer languages (diversity and open source are our strength, Mr Trump):

Python, R, Octave
(our polyglot notebooks)
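As a taste of what the BNT side of the comparison will look like, here is a hedged sketch of junction-tree inference on the classic sprinkler bnet from the BNT documentation, written as an %%octave cell. It assumes BNT’s directories are already on the Octave path (via addpath) and that BNT’s core runs under Octave, per the 99% claim above.

    %%octave
    % Sprinkler bnet from the BNT docs: Cloudy -> {Sprinkler, Rain} -> WetGrass
    N = 4; C = 1; S = 2; R = 3; W = 4;
    dag = zeros(N, N);
    dag(C, [S R]) = 1;
    dag(S, W) = 1;
    dag(R, W) = 1;
    bnet = mk_bnet(dag, 2*ones(1, N));  % four binary nodes
    bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
    bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
    bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
    bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
    engine = jtree_inf_engine(bnet);    % the junction-tree workhorse
    evidence = cell(1, N);
    evidence{W} = 2;                    % observe: the grass is wet
    engine = enter_evidence(engine, evidence);
    marg = marginal_nodes(engine, S);
    disp(marg.T)                        % P(Sprinkler | WetGrass = true)

The plan is then to run the same query through QFog on the same bnet and diff the numbers.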

Curiously, Duke University offers a course called “Computational Statistics in Python” that also advocates the use of Jupyter notebooks, with the languages Python, R and Matlab intermixed, to do statistics. So when two cultures independently come up with the same idea, it’s probably a good one, don’t you think?

Since BNT is written in Matlab, running it does not require any compilation. PNL, on the other hand, is written in C++, so it does. Compiling PNL has proven a difficult task because the software is ten years old, but, after a lot of sweat and tears, our wiz Henning Dekant has managed to compile it (a few issues remain).

BNT was last changed in a bigly way circa 2007 (or even earlier), and PNL in 2006. (bnlearn, by comparison, is still very active.) BNT and PNL belong to what I like to call the first bnet revolution (inference by junction tree), whereas bnlearn belongs to the second revolution (structure learning from Markov blankets). Even though PNL belongs to the first revolution, not the second, it is a major mystery to me why Intel abandoned it. PNL is a very impressive, large and mature piece of software. A lot of work, love and passion seems to have gone into it. Then, sometime in mid-2006, it seems to have been abandoned in a hurry, and now it looks like a ghost town, or the deserted island in the video game Myst. I already know how the game Myst ends. If anyone knows why Intel stopped PNL development circa 2006, I would appreciate it if you would tell me, either in public in this blog or in private by email. Luckily, Intel had the wisdom to make PNL open source. PNL will go on, because 💍OPEN SOURCE IS FOREVER💍.

Sorry for the length of this blog post (almost as long as a Scott Aaronson blog post or a never-ending New Yorker article).

[Images: myst.jpg, mystmeme.jpg]


1 Comment »

  1. Beware of the ‘frogs’ of ‘masterpiece’ for they shall end in the pit of ‘forever’…

    Comment by teras — March 20, 2017 @ 11:31 pm

