Quantum Bayesian Networks

November 11, 2015

Google Open-sources TensorFlow (A Fakesian Networks Software Library). Microsoft, Tear Down This Infer.NET Wall

Filed under: Uncategorized — rrtucci @ 5:21 pm

On Nov. 10, Google announced to much fanfare that it was open-sourcing TensorFlow, its software library for “Large-Scale Machine Learning on Heterogeneous Distributed Systems”. At this point I know very little about TensorFlow, but I can already see that it is not very Bayesian Networky. Being such an avid B net fan, I can’t deny that I was a little disappointed by how little B net stuff it contains. To me, TensorFlow is Fakesian Networks instead of Bayesian Networks 🙂

In my opinion, TensorFlow has VERY LITTLE in common, except for the name, with what quantum information theorists call “quantum tensor networks”, although I’m sure that some sleazy, opportunistic physicists and science journalists will claim that the two are conjoined twins. Unlike the classical, mostly deterministic, highly distributed calculations that TensorFlow performs, quantum computers must deal mostly with probabilistic rather than deterministic calculations, and distributed computing for QCs would be very different from its classical counterpart. I think that when dealing with probabilistic calculations, on either a classical or a quantum computer, classical B nets and quantum B nets are the most natural and productive framework/strategy, as I’ve explained before.

Despite TensorFlow being Fakesian Networks, I welcome Google’s move to open-source it, because it certainly raises the level of visibility, cooperation, competition, and tension/suspense in the AI arena. For example, I’m sure that right about now Microsoft is facing a lot of pressure to respond in kind to the news about TensorFlow. So what does Microsoft use instead of TensorFlow for its Bing AI? Is it Infer.NET? Whatever it is, will MS have to open-source part of its Bing AI to keep up with the Joneses and the Kardashians?

I like Infer.NET. It looks much more Bayesian Networky to me than TensorFlow does. Unfortunately, so far MS has only released Infer.NET’s binaries and API to the public, and it has forbidden non-MS people from using Infer.NET for commercial purposes.

Told you so UPDATE1: Ta-tan, Nostradamucci’s predictions have turned into reality again. On Nov 12, just 2 days after Google released TensorFlow, Microsoft announced the open-sourcing of DMTK (Distributed Machine Learning ToolKit). And of course, Facebook was there first, with its open-sourcing of Torch enhancements on Jan 16 of this year.

UPDATE2: Related news. After UPDATE1, I learned that IBM has also been busy open-sourcing distributed AI software. It recently open-sourced SystemML (ML = Machine Learning), a small part of its Watson software. The GitHub repository of SystemML was started on Aug 17, 2015. According to this press article, circa Nov 23, 2015, SystemML was accepted into the Apache incubator (a preliminary step to being declared a saint, where sainthood means you have performed 2 miracles and are declared officially integrated with, and compatible with, the Apache libraries, especially Spark, which SystemML will rely heavily upon).

4 Comments

  1. Hello Dr. Tucci: Here is some off-topic breaking news about a new sale of a D-Wave 2X to Los Alamos National Lab. Another feather in the cap of D-Wave, which won’t please some people such as Dr. S. Aaronson! Oh well! I hope you are well. Have a good day and thanks.
    http://insidehpc.com/2015/11/los-alamos-orders-d-wave-2x-quantum-computer/

    Sol Warda

    Comment by Sol Warda — November 12, 2015 @ 2:53 pm

  2. Thanks Sol.

    Comment by rrtucci — November 12, 2015 @ 4:39 pm

  3. It would be nice if this were a sign of more to come. After all, Infer.NET could also benefit from open-source collaboration; to quote from the FAQ: “Note that the model compiler is not itself particularly efficient. We have focused our efforts so far on making the generated code efficient, rather than the generation process itself”.

    Comment by Henning Dekant — November 14, 2015 @ 4:09 pm

  4. Given that Gibbs sampling can be used for approximate inference in a Bayesian network (see the sketch after the comments), I think the following bit from the DMTK announcement may be of special interest:

    “LightLDA is a new, highly-efficient O(1) Metropolis-Hastings sampling algorithm, whose running cost is (surprisingly) agnostic of model size, and empirically converges nearly an order of magnitude more quickly than current state-of-the-art Gibbs samplers”.

    Comment by Henning Dekant — November 14, 2015 @ 4:17 pm
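
For readers who want to see concretely what Henning’s comment is about, here is a minimal Python sketch of Gibbs sampling for approximate inference in a B net. It uses the textbook “sprinkler” network (Rain → Sprinkler, and both → WetGrass); all CPT numbers, function names, and parameters are illustrative assumptions of mine, not taken from DMTK, LightLDA, or any other library mentioned above.

```python
# Illustrative Gibbs sampler for the textbook "sprinkler" Bayesian network.
# Query: P(Rain = 1 | WetGrass = 1). All CPT values below are made-up examples.
import random

P_RAIN = 0.2                                    # P(Rain = 1)
P_SPRINKLER = {0: 0.4, 1: 0.01}                 # P(Sprinkler = 1 | Rain)
P_WET = {(0, 0): 0.0, (0, 1): 0.8,              # P(WetGrass = 1 | Sprinkler, Rain)
         (1, 0): 0.9, (1, 1): 0.99}

def sample_rain(sprinkler):
    # Draw Rain from its full conditional P(Rain | Sprinkler, Wet=1),
    # computed up to normalization from the network's CPTs.
    w = []
    for r in (0, 1):
        prior = P_RAIN if r else 1 - P_RAIN
        like = P_SPRINKLER[r] if sprinkler else 1 - P_SPRINKLER[r]
        w.append(prior * like * P_WET[(sprinkler, r)])
    return 1 if random.random() < w[1] / (w[0] + w[1]) else 0

def sample_sprinkler(rain):
    # Draw Sprinkler from its full conditional P(Sprinkler | Rain, Wet=1).
    w = []
    for s in (0, 1):
        prior = P_SPRINKLER[rain] if s else 1 - P_SPRINKLER[rain]
        w.append(prior * P_WET[(s, rain)])
    return 1 if random.random() < w[1] / (w[0] + w[1]) else 0

def gibbs(n_samples=50_000, burn_in=1_000):
    # Resample each hidden variable in turn, holding the evidence
    # (WetGrass = 1) fixed; average Rain over the post-burn-in samples.
    rain, sprinkler, hits = 1, 1, 0
    for i in range(n_samples + burn_in):
        rain = sample_rain(sprinkler)
        sprinkler = sample_sprinkler(rain)
        if i >= burn_in:
            hits += rain
    return hits / n_samples

print("P(Rain=1 | WetGrass=1) ~", gibbs())
```

With these made-up CPTs, the exact posterior is P(Rain=1 | WetGrass=1) ≈ 0.36, and the sampler’s estimate converges to that value as the number of samples grows. LightLDA’s claimed advance is precisely in making this kind of repeated conditional sampling cheap (O(1) per draw via Metropolis-Hastings) when the model is enormous.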

