Quantum Bayesian Networks

April 29, 2017

Miss Quantum Computing, may I introduce to you Miss Bayesian Hierarchical Models and Miss MCMC?

Filed under: Uncategorized — rrtucci @ 5:49 pm

Warning: Intense talk about computer software ahead. If you are a theoretical computer scientist, you better stop reading this now. Your weak constitution probably can’t take it.

When you enter the nerd paradise and secret garden that is Bayesforge.com (a free service on the Amazon cloud), you will see one folder named “Classical” and another named “Quantum”. Here is a screenshot of this, taken from Henning Dekant’s excellent post on LinkedIn.

The “Quantum” folder contains some major open source quantum computing programs: Quantum Fog, Qubiter, IBM QisKit (aka kiss-kit), QuTiP, D-Wave, ProjectQ, Rigetti.

The “Classical” folder contains some major open source Bayesian analysis programs: Marco Scutari’s bnlearn (R), Kevin Murphy’s BNT (Octave/Matlab), OpenPNL (C++/Matlab), PyMC, PyStan.

The idea is to promote cross-fertilization between “Quantum” and “Classical” Bayesian statisticians.

Today I want to talk mostly about PyMC and PyStan. PyMC and PyStan deal with “Hierarchical Models” (Hmods). The other programs in the “Classical” folder deal with “Bayesian Networks” (Bnets).

Bnets and Hmods are almost the same thing. The community of people working on Bnets has Judea Pearl as one of its distinguished leaders. The community of people working on Hmods has Andrew Gelman as one of its distinguished leaders. You might know Gelman (Prof. at Columbia U.) from his great blog “Statistical Modeling, Causal Inference, and Social Science” or from one of his many books.

Both PyStan and PyMC do MCMC (Markov Chain Monte Carlo) for Hmods. They are sort of competitors but also complementary.

PyStan (its GitHub repo here) is a Python wrapper of a computer program written in C++ called Stan. According to Wikipedia, “Stan is named in honour of Stanislaw Ulam, pioneer of the Monte Carlo method.” Prof. Gelman is one of the fathers of Stan (I mean the program, of course).

PyMC comes in 2 incompatible versions: 2.X and 3.X. Version 3 is more advanced and is intended to replace Version 2. PyMC2’s best sampler is a Metropolis-Hastings (MH) sampler. PyMC3 contains an MH sampler too, but it also contains the “No-U-Turn Sampler” (NUTS), which is supposed to be much faster than MH for large networks. Currently, Bayesforge contains only PyMC2, but the next version will contain both PyMC2 and PyMC3. As an added bonus, PyMC3 comes with Theano, one of the leading deep neural network frameworks.

Check out this really cool course:

Sta-663 “Statistical Programming”, available on GitHub, taught at Duke U. by Prof. Chi Wei Cliburn Chan.

This wonderful course has some nice Jupyter notebooks illustrating the use of PyMC2, PyMC3 and PyStan. Plus, it contains discussions of many other statistical programming topics. I love it. It has a similar philosophy to BayesForge, namely to do statistical programming with Jupyter notebooks, because they are great for communicating your ideas to others and allow you to seamlessly combine various languages like Python, R, Octave, etc.
