Quantum Bayesian Networks

February 6, 2016

Quantum Open Source 2016

Filed under: Uncategorized — rrtucci @ 7:31 am


Big Brother is saying: “Liqui|> is closed source for your own good. Quantum open source is EVIL. Using it will harm you!”

Quantum Carly Fiorina

Big Brother is saying: Vote for Quantum Carly Fiorina, my handpicked leader.



In 2016, the quantum open-source community will be born, and you will see why quantum computer software won’t be like “1984”.

February 2, 2016

First Major Commit to Quantum Fog in Python

Filed under: Uncategorized — rrtucci @ 4:56 pm

Today is Groundhog Day, Feb. 2, 2016. This holiday holds special significance in Quantum Computing History. Indeed, on another Groundhog Day in the distant past, in the year 2015, a watershed historic quantum event occurred, namely, Jimmy the Groundhog bit Seth Lloyd’s ear.

Other quantum watershed events have occurred since then; for example, Caltech held its “One Entangled Evening” and Alex Winter screened his NSF-funded 10-minute mini-movie “Anyone Can Quantum”. But those events are California/Hollywood glitzy, whereas Groundhog Day is a more folksy affair (old white men with funny top hats bugging a chubby rodent who just wants to go back to sleep).

Since we here at artiste-qb.net are more folksy than glitzy, we decided to celebrate this holiday by releasing today our first version of Quantum Fog in Python. (QFog is open-sourced under the BSD license, and it is available at GitHub.)

Basically, what I did was refurbish an old open-source program called PBNT by Elliot Cohen. PBNT does classical Bayesian networks using 3 inference algorithms: Enumeration (brute force), MCMC, and Join Tree (aka Junction Tree). Our new QFog release does BOTH classical and quantum Bnets using the same 3 algos. One of our eventual goals is to write a quantum computer programming language based on quantum Bnets.
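To give a flavor of the simplest of those 3 algos, here is a minimal sketch of Enumeration (brute force) inference on a toy two-node Bnet A → B. The code and names are mine, for illustration only; this is not PBNT’s or QFog’s actual API:

```python
# Toy two-node classical Bayesian network A -> B, binary nodes.
# Node probability matrices (illustrative numbers):
p_a = {0: 0.6, 1: 0.4}                       # P(A)
p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,     # P(B=b | A=a), keyed by (b, a)
               (0, 1): 0.2, (1, 1): 0.8}

def joint(a, b):
    """Chain rule for the DAG A -> B: P(a, b) = P(a) * P(b | a)."""
    return p_a[a] * p_b_given_a[(b, a)]

def posterior_a_given_b(b):
    """Brute-force enumeration: sum over all states of A, then normalize."""
    unnorm = {a: joint(a, b) for a in (0, 1)}
    z = sum(unnorm.values())                 # this is P(B = b)
    return {a: p / z for a, p in unnorm.items()}

print(posterior_a_given_b(1))  # P(A=0|B=1) ≈ 0.158, P(A=1|B=1) ≈ 0.842
```

MCMC and Join Tree compute the same posteriors; enumeration just makes the sum over hidden states explicit, which is why it scales exponentially with the number of nodes.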

January 1, 2016

Happy New Year 2016

Filed under: Uncategorized — rrtucci @ 3:16 am

A new quantum computing star is born a few minutes shy of 2016.


(Image comes from the Google search page for New Year’s Eve 2016, from frame 203 of an animated GIF with 272 frames. What Google’s punchline is going to be tomorrow, I do not know, but I beat them to the punch by 2 hours, East Coast time.) Happy quantum computing New Year 2016!

December 22, 2015

The Battle of the Century For Quantum Supremacy: Alibaba Versus Google

Filed under: Uncategorized — rrtucci @ 6:33 pm

Have you read some of the recent quantum computing headlines in the Chinese news media? No? Well, it so happens that I am a fluent Mandarin Chinese speaker*, so let me translate a few of those headlines for you. (* with the help of Google Translate)

  • China Planning to Unveil Soon Their Own Quantum Computer. Pundits Expect It To Be Much More Popular Than Google’s.

  • The Computer Fight of the Century: Alibaba versus Google
  • Alibaba Says: “Damn You Google. We Will Beat You At Quantum Computing”

Keep in mind that China is about to finalize its next 5-year plan at the end of March 2016, and that in that plan it might choose either a $10B, 100 TeV particle accelerator OR 50 quantum computers (at $200M apiece), BUT NOT BOTH.

Previous blog posts of mine about this topic:

December 15, 2015

Artiste-qb.net Makes Dec 8 Watershed Quantum Computing Announcement on Dec 15

Filed under: Uncategorized — rrtucci @ 10:11 pm

There is no denying it. Google and NASA are publicity geniuses. Geniuses, I tell you!

On Nov 20, Google and NASA announced, in dramatic and coy Apple fashion, that they would hold a press conference on Dec 8 to announce a major QC breakthrough. At the Dec 8 conference and in a subsequent blog post and arXiv paper, Hartmut Neven announced that Google is now “faster than the universe” (Steve Jurvetson’s expression), having achieved a 10^8 speedup compared to classical computers. According to TMZ, Hartmut Neven was wearing a black turtleneck and jeans at the event (citation needed).

Quite predictably, on Dec 9, just one day later, Scott Aaronson posted on his blog a voluminous refutation of Google’s claims, but most of the general public showed itself quite uninterested in his opinion on the matter. If only his blog were more succinct. Someone who wishes to remain anonymous once tweeted: “Thank God Scott Aaronson is too wordy for Twitter”.

Since Dec.8, our company artiste-qb.net’s publicity department, 50 persons strong, has been struggling to come up with our own “watershed announcement” to match Google’s. We decided to send Mark Zuckerberg a book for his baby girl and Jack Ma a baseball cap. We also sent a TMZ spy to capture in a photo the precise moment when Mark and Jack were trying these items on for fit, just before they decided to discard them. Here is what we came up with.



December 6, 2015

The Canadian Starship and the Dutch Campervan

Filed under: Uncategorized — rrtucci @ 9:19 pm

On Dec. 3, 2015, three major Dutch universities opened a new institute called QuSoft that, according to the press release, will be dedicated exclusively to writing quantum computing software. QuSoft is being billed as the first academic institute dedicated in that manner, and I tend to agree. As far as I know, no other nation has a similar institute.

It seems clear that QuSoft and its partner Dutch institute, QuTech, which does hardware instead of software, have intentions of eventually converting into private corporations after government funding has made them profitable. I reported earlier on QuTech in my previous blog post entitled “The Dutch Stallion and the Canadian Donkey”. Thus, it appears that the QuSoft-QuTech-Intel-Microsoft juggernaut will become more industrial and less academic as time goes on.

Meanwhile, IQC at the Univ. of Waterloo, Canada, founded 13 years ago (2002), has spent more than $150M to become the world’s leading powerhouse in NMR quantum computing and quantum cryptography, both known to be dead-end streets since day one. They have also produced a computer game called QuantumCats, a knockoff of Angry Birds. Playing QuantumCats for 100 hours teaches less quantum mechanics than reading Wikipedia articles about QM for 15 minutes. Also, a Canadian Univ. (Dalhousie) used American defense funding (from IARPA) to write a QC programming environment called Quipper. The only problem is that Quipper is written in Haskell, a programming language so cryptic that almost nobody in industry uses it.

But despair not, Justin Bieber compatriots. I live in the US, but the company that I work for, artiste-qb.net, is incorporated in Canada. Oh Canada, we salute thee!

artiste-qb.net is a software company led by scientist-programmers with close ties to industry, not by academics who program in Haskell. Our “Quantum Fog on GitHub” project is open source under the BSD license and coded in Python. We also have 12 QC software patents (6 granted and 6 pending). QuSoft starts off with zero patents.

Time will tell, but right now, QuSoft is looking pretty frumpy compared to my company. QuSoft reminds me of a Dutch campervan, of the type that plagues French, German and Spanish camping sites every summer. If you are not familiar with this natural phenomenon, comparable to a Monarch butterfly migration, here is a description from the National Geographic (sort of). On the other hand, artiste-qb.net reminds me of a starship, an analogy which I made before in my previous blog post entitled: “Houston, Tranquility Base here, the Dragonfly has landed”. artiste-qb.net will reach for the stars, and QuSoft will reach for the cows.

The TonkePR4, deluxe Dutch campervan. Now all my Dutch readers will want to own one.

Houston, Tranquility Base here, the Dragonfly has landed. (image based on NASA photo from here)

November 30, 2015

Quantum Fog Comes Out of the Closet

Filed under: Uncategorized — rrtucci @ 4:06 am

Quantum Fog, my old Mac application for doing calculations with quantum Bayesian networks, is now open-source (it’s finally out of the closet). Check it out at GitHub.

Quantum Fog was originally written in C++, but we plan to rewrite most of it in Python.

By the way, I highly recommend the GitHub website if you need to collaborate with a group of people on the writing of a computer app or a website or an arXiv paper, or many other possibilities. I use GitHub in conjunction with a Windows app called TortoiseGit, which I also love and highly recommend.

November 16, 2015

I just want to MOOC them All

Filed under: Uncategorized — rrtucci @ 2:50 am

(I conceived this poster in response to a friendly argument with CapitalistImperialistPig (that’s his nom de plume). You are a better pig than I am, Gunga Din.)

November 15, 2015

A Quantum Computer is the Ultimate Group Theory Box

Filed under: Uncategorized — rrtucci @ 4:31 pm

In previous blog posts, I mentioned my recent interest in using group theory in quantum computing algorithms. See for example,

Group Theory and quantum mechanics are like conjoined twins. It’s hard to tell where one begins and the other ends. So I was somewhat ashamed that I had used so little Group Theory in my quantum computing programming. I’ve been trying to overcome that weakness of mine over the last year, and I’m beginning to see the results. I am all stoked up today because this week I filed my second US patent applying Group Theory (GT) to quantum computing (QC). This means our company http://www.artiste-qb.net now has 12 US patents on QC programming (6 granted, 6 pending).

I can’t say much about my 2 QC-GT patents for now because loose lips sink ships, but I did prepare some pictures to convey my enthusiasm for QC-GT. In a previous blog post, I waxed poetic about the connections between the movie “2001: A Space Odyssey” and quantum computing. This time, I adapted a “2001: A Space Odyssey” T-shirt that I think is really cool, so that the black monolith has some hieroglyphics on it dealing with GT. Ta-tan, here are my pictures:
Very little of the artwork in these two images was originally created by me. My images were mostly based on the following two previous images:

  1. T-shirt image from this page of the website of tshirtbordello.com.
  2. Image from Wikipedia article on Point Groups in Three Dimensions. There are 7 infinite sequences of point groups in 3 dimensions with cylindrical (uniaxial) symmetry. The groups in each sequence are indexed by n. The n-th group has n-fold rotational symmetry about the axis of symmetry. This figure from Wikipedia shows those 7 sequences for n=6.

November 11, 2015

Google Open-sources TensorFlow (A Fakesian Networks Software Library). Microsoft, Tear Down This Infer.NET Wall

Filed under: Uncategorized — rrtucci @ 5:21 pm

Check out

On Nov. 10, Google announced to much fanfare that it was open-sourcing TensorFlow, its software library for “Large-Scale Machine Learning on Heterogeneous Distributed Systems”. At this point in time, I know very little about TensorFlow, but I can already see that it is not very Bayesian Networky. Being such an avid B net fan, I can’t deny that I was a little disappointed by how little B net stuff it contains. To me, TensorFlow is Fakesian Networks instead of Bayesian Networks :)

In my opinion, TensorFlow has VERY LITTLE in common, except for the name, with what quantum information theorists call “quantum tensor networks”, although I’m sure that some sleazy, opportunistic physicists and science journalists will claim that the two are conjoined twins. Unlike the classical, mostly deterministic, highly distributed calculations that TensorFlow performs, quantum computers have to deal mostly with probabilistic rather than deterministic calculations, and distributed computing for QCs would be very different from its classical counterpart. I think that when dealing with probabilistic calculations, on either a classical or a quantum computer, classical Bnets and quantum Bnets are the most natural and productive framework/strategy, as I’ve explained before.

Despite TensorFlow being Fakesian Networks, I welcome Google’s move to open-source TensorFlow, because it certainly raises the level of visibility, cooperation, competition, and tension/suspense in the AI arena. For example, I’m sure that right about now Microsoft is facing a lot of pressure to respond in kind to the news about TensorFlow. So what does Microsoft use instead of TensorFlow to do its Bing AI? Is it Infer.NET? Whatever it is, will MS have to open-source part of its Bing AI to keep up with the Joneses and the Kardashians?

I like Infer.NET. It looks much more Bayesian Networky to me than TensorFlow does. Unfortunately, so far MS has only released Infer.NET’s binary and API to the public, and it has forbidden non-MS people from using Infer.NET for commercial purposes.

Told you so UPDATE1: Ta-tan, Nostradamucci’s predictions have turned into reality again. On Nov 12, just 2 days after Google released TensorFlow, Microsoft announced the open-sourcing of DMTK (Distributed Machine Learning ToolKit). And of course, Facebook was the first, with its open-sourcing of Torch enhancements on Jan 16 of this year.

UPDATE2: Related news. After UPDATE1, I learned that IBM has also been busy open-sourcing distributed AI software. It recently open-sourced SystemML (ML = Machine Learning), a small part of its Watson software. The GitHub repository of SystemML was started on Aug 17, 2015. According to this press article, circa Nov 23, 2015, SystemML was accepted into the Apache incubator (a preliminary step to being declared a saint; sainthood here means you have performed 2 miracles and are declared officially integrated and compatible with the Apache libraries, especially Spark, which SystemML will rely heavily upon).

October 30, 2015

Watch Out For the Academics on Halloween Night

Filed under: Uncategorized — rrtucci @ 6:40 pm
You are now entering the IQIM_Caltech Twilight Zone @iqim_caltech @preskill #NSF

John Preskill: Hmm…so Princeton is competing against IQIM for that quantum computing grant…Saddle up my horse, Tonto the postdoc. I’ve got some physics business to take care of.

Stalin: Good job with arXiv, comrade Paul Ginsparg.

Donald Trump scaring Captain Kirk in an episode of the Twilight Zone.

Max Tegmark: Give me research funding or I will haunt you in Many Worlds of 3 types.

How To Make A Red Velvet Brain Cake For Halloween

October 25, 2015

Quantum Gravity Photo

Filed under: Uncategorized — rrtucci @ 4:45 pm

This POV (Point of View) photo reminds me of quantum gravity (in a poetic sense). I found it in the Blog of Francesco Mugnai.

It’s beginning to look like Quantum Computing will elucidate Quantum Gravity, both by allowing us to perform simulations of theories of it (as Feynman predicted) and by enriching the theory itself (for instance, quantum information, error correction and complexity theory inspired elucidations of the black hole information paradox and Maldacena’s gauge/gravity duality).

Our company http://www.artiste-qb.net has a unique POV regarding quantum computing: that of quantum Bayesian networks.

October 21, 2015

Snow White Bayesian Network For a Collection of Sets

Filed under: Uncategorized — rrtucci @ 5:21 pm

CB net = classical Bayesian network
DAG = directed acyclic graph

In supervised learning, you are given the graph (aka structure) of a CB net, and some data, and you evaluate the node probability matrices from the data. In unsupervised learning, you are given only data, and you are expected to come up with both the structure and the node probability matrices of the CB net from that data. Nowadays there are computer programs that do both supervised and unsupervised learning of a CB net on classical computers. I believe a quantum computer can do unsupervised learning of a CB net at least quadratically faster (due to Grover’s algo) than a classical computer. In fact, I have a patent for doing unsupervised learning of a CB net on a gate model QC. (The QuAIL group at NASA has proposed doing this also with a D-Wave annealer QC.)
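As a concrete toy example of the supervised case (structure given, node probability matrices learned from data), maximum-likelihood counting for the two-node DAG A → B might look like this. The names and data below are made up for illustration; this is not Quantum Fog’s actual API:

```python
from collections import Counter

# Observed (a, b) samples for the known DAG A -> B (made-up data).
samples = [(0, 0), (0, 0), (0, 1), (1, 1), (1, 1), (1, 0)]

pair_counts = Counter(samples)               # counts of (a, b) pairs
a_counts = Counter(a for a, _ in samples)    # counts of each A state

def estimate_p_b_given_a(a, b):
    """Maximum-likelihood estimate of the node matrix entry P(B=b | A=a)."""
    return pair_counts[(a, b)] / a_counts[a]

print(estimate_p_b_given_a(0, 1))  # 1 of the 3 samples with a=0 has b=1
```

The unsupervised case must additionally search over DAG structures, and it is that exponentially large structure search that a Grover-style quadratic speedup would attack.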

And yet, many of the examples of CB nets that show up in the literature (see, for example, the wonderful work of Andrew Gelman) were obtained “by hand”: their structure was derived without the help of a computer, arrived at partly by logic and partly by hunch. The quality and value of these CB net models depend on how well they fit the data.

So can I provide some guidance on how to find the structure of a CB net by hand? I don’t know how the experts do it, but I’ll tell you how I think about it.

There is one situation that I like to call the Snow White CB net (I call it Snow White because it’s the fairest CB net of them all). It concerns finding a CB net that connects a collection of sets.

Snow White DAG
Suppose you have n sets A_1, A_2, \ldots, A_n which are not necessarily mutually disjoint.

  1. Merge all sets that are equal into a single set.
  2. Draw an undirected line connecting each pair of sets that overlap.
  3. If A_i \subset A_j, then replace the undirected line joining them by an arrow pointing from A_i to A_j. Thus

    A_i \subset A_j
    A_i \rightarrow A_j
    x\in A_i \implies x\in A_j

    all mean the same thing.

  4. If A_i and A_j overlap, but neither is a proper subset of the other, then replace the undirected line between A_i and A_j by

    A_i \leftarrow A_i\cap A_j \rightarrow A_j

  5. Go back to step 1. Exit the loop when the last two repetitions yield the same DAG.

At the end, you will have a DAG in which the arrows all indicate a subset relationship. Also, by the end, all the root nodes (the ones with no incoming, only outgoing, arrows) will be mutually disjoint sets. This is all very trivial, and I’m sure a lot of people have come up with the Snow White DAG on their own. I just mention it in case you haven’t yet.
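The 5 steps above can be sketched in a few lines of Python, using frozensets as nodes and a pair (X, Y) to mean an arrow X → Y (i.e., X ⊂ Y). This is my own minimal, illustrative implementation, not part of Quantum Fog:

```python
def snow_white_dag(sets):
    """Build the Snow White DAG for a collection of (possibly overlapping) sets."""
    nodes = {frozenset(s) for s in sets}        # step 1: equal sets merge
    while True:
        new_nodes = set(nodes)
        edges = set()
        for x in nodes:
            for y in nodes:
                if x == y or not (x & y):       # step 2: skip disjoint pairs
                    continue
                if x < y:                       # step 3: arrow for proper subset
                    edges.add((x, y))
                elif not (y < x):               # step 4: neither contains the other:
                    new_nodes.add(x & y)        #   add the intersection as a node
        if new_nodes == nodes:                  # step 5: stop when the DAG is stable
            return nodes, edges
        nodes = new_nodes

nodes, edges = snow_white_dag([{1, 2}, {2, 3}, {1, 2, 3}])
# The intersection {2} gets added as a new node; it ends up as the lone root.
```

Running it on {1,2}, {2,3}, {1,2,3} adds the node {2}, and every arrow in the result indicates a subset relationship, with the roots mutually disjoint as claimed.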

PS. In the above convention, a typical operating system’s folder hierarchy is represented by a tree, with the arrows pointing away from the multiple innermost folders toward the single outermost folder. The outermost folder is often called the root directory in operating-system parlance, but here I am calling the innermost folders the root nodes.

October 12, 2015

Caltech Abhors MOOCs

Filed under: Uncategorized — rrtucci @ 4:12 am

I’ve expressed my very favorable opinion of MOOCs many times before in this blog.

Today I visited the Coursera and EdX websites and learned that they are currently offering 1,465 courses and 233 courses, respectively. So MOOCs are alive and well, at least today.

I was curious to see how many MOOCs Caltech is currently offering so I went to a website called “MOOC List”. According to that site, the grand total of Caltech MOOCs since the beginning of time is 5. 😸 Let me copy and paste the full list here:

  1. Machine Learning (Caltech) Self Paced
  2. The Science of the Solar System (Coursera) Mar 30th 2015
  3. Galaxies and Cosmology (Coursera) Jan 6th 2015
  4. Drugs and the Brain (Coursera) Jan 4th 2014
  5. Principles of Economics for Scientists (Early 2013) Jan 7th 2013

Looks like the Caltech Evil Empire abhors MOOCs…


In a small galaxy called Caltech far, far away from Stanford University, Darth Vader Preskill is informed by Emperor Palpatine that the two co-founders of Coursera, Daphne Koller (aka Princess Leia) and Andrew Ng (aka Luke Skywalker), are his offspring, and that they are threatening the Empire of traditional universities that Lord Vader has sworn to defend.

The following is a quote from the movie “The Empire Strikes Back”, with some minor modifications. My omissions from the quote are crossed out. My additions to the quote are placed in parentheses.

Darth Vader Preskill: [kneeling before Emperor Palpatine’s hologram] What is thy bidding, my master?

Emperor Palpatine: There is a great disturbance in the (Educational) Force.

Darth Vader Preskill: I have felt it.

Palpatine: We have a new enemy. The young (Coursera) Rebel(s) who destroyed the Death Star. I have no doubt this boy (and girl) is (are) the offspring of Anakin Skywalker.

Darth Vader Preskill: How is that possible?

Palpatine: Search your feelings, Lord Vader. You will know it to be true. He(They) could destroy us.

Darth Vader Preskill: He’s just a boy. (They are just children). Obi-Wan can no longer help him (them).

Palpatine: The (Educational) Force is strong with him (them). The son (and daughter) of Skywalker must not become a Jedi (MOOC Jedis).

Darth Vader Preskill: If he (they) could be turned, he (they) would become a powerful ally (powerful allies).

Palpatine: [intrigued] Yes… He (They) would be a great asset. Can it be done?

Darth Vader Preskill: He (They) will join us or die, master.

October 6, 2015

Teaching Quantum Mechanics to Children, the Caltech and Waterloo Univ. Method

Filed under: Uncategorized — rrtucci @ 7:29 pm

I’ve heard some uncouth people, naysayers and sour grapes most of them, voice the extreme, malicious opinion that video games are junk food for the mind.

And what about the recent video games qCraft (by Caltech) and QuantumCats (by University of Waterloo in Canada) which promise to teach quantum mechanics to children? To the naysayers, those video games are poison too, quantum junk food, a way of wasting, piddling away the precious, jam-packed, fleeting years of youth.

It occurred to me that such opinions could be put to the test scientifically. So I was very happy when, while poring over The Lancet, a journal which I read faithfully every Sunday, I came across the following article about a study conducted by the Mayo Clinic on this very subject.

Clinical Study of the Effects on Children of Playing Video Games qCraft and QuantumCats
by Mayo Clinic, Oct 1, 2015



We conducted a 1 year study on a group of 20 school children, ages 10 to 18, who showed an early interest in math and science.

10 of the children were our control group A, and 10 were our video gamers group X.

The children from group A were given classroom courses by really good high school teachers in Algebra, Geometry, Calculus, Biology, Physics, and Chemistry. They were encouraged to consult Khan Academy and Wikipedia articles, take MOOC courses, and read books on science and math. Those yearning for hands-on experience were encouraged to join a Ham radio club or local hackers club and build their own electrical devices, or else join an open-source programmers’ group and start writing computer code at an early age.

Group A children were also encouraged to do some physical activity by going to a court or gym and practicing a sport, and joining a youth sport team if possible. Bicycling, swimming, jogging, dancing, etc. were all encouraged.

The children in Group X were told that before taking a math or science course, it would be better if they first learned the basics of science by playing some video games. By practicing how to build a quantum computer out of imaginary Lego blocks or throwing quantum cats with a catapult, they could learn the basics of quantum mechanics first, and then, if after a year or two of that they showed any promise, they would be permitted to take courses in science and math, and consult Khan Academy, Wikipedia, and all those other old-fashioned, boring resources.

If the children from Group X wanted to do some physical activity by playing a particular sport, they were told that it would be better if they first learned the basics by playing a video game about that sport. If after a year or two of that they showed any promise, they would be taken to a court or gym to learn the physical part of the sport.

We found that 95% of the children from group A went to good colleges. 5% never made it to college because they had already started their own high-tech businesses in high school and saw no need to go to college.

We found that 95% of the children from Group X never went to college, because they were recruited by the Army right out of high school as drone plane operators. 5% did make it to college, mostly MIT, where they eventually became professors.

The figure above shows a typical child from Group X after 0, 1, 2, 3, and 4 months into the clinical study. His cranial capacity diminished by 15 cc after 4 months, but we were told by Caltech and Waterloo that if we had run the study for a longer period of time, we would have seen that cranial capacity reaches a minimum after 4 months and then begins to increase. They also pointed out that our study was flawed in methodology and involved too few children to be statistically significant.
