Quantum Bayesian Networks

February 6, 2016

Quantum Open Source 2016

Filed under: Uncategorized — rrtucci @ 7:31 am


Big Brother is saying: “Microsoft’s Liqui|> is closed source for your own good. Quantum open source is EVIL. Using it will harm you!”


Quantum Carly Fiorina

Big Brother is saying: Vote for Quantum Carly Fiorina, my handpicked leader

In 2016, the quantum open source community will finally prevail, and you will see why future quantum computer software won’t be like “1984”.


May 28, 2016

Don’t tell Microsoft. Qubiter, an open source platform, can now do quantum chemistry too

Filed under: Uncategorized — rrtucci @ 8:10 pm

I’ve uploaded to

https://github.com/rrtucci/my-chemistry

a small collection of classes called “my-chemistry” that allows one to construct a gate model quantum circuit for calculating the ground state energy of molecules using Kitaev’s Phase Estimation Algorithm (PEA).

Notice that I am releasing my-chemistry under my name only. Although it is an add-on to the artiste-qb.net Qubiter project, the Artiste company is free of any legal liability, as I have not merged it into the main code branch, and the company is not making any use of it at this time.

The docstring of each class describes in detail how the class works. All my classes have a main method at the end with examples and tests of the class. In addition, I have written a pdf document describing the more technical details of the quantum circuit involved. The pdf document is part of the my-chemistry distribution.
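As a purely illustrative aside (this is not my-chemistry code, just the arithmetic that the PEA circuit ultimately performs): PEA estimates the phases of the eigenvalues of U = e^{-iHt}, and the ground state energy is read off from those phases. Here is a minimal numpy sketch for a toy 2-level Hamiltonian:

    import numpy as np
    from scipy.linalg import expm

    # Toy 2-level "molecular" Hamiltonian, for illustration only.
    H = np.array([[-1.1, 0.2],
                  [ 0.2, 0.4]])
    t = 0.5  # evolution time, chosen small enough that |E*t| < pi (no phase wrap-around)

    # PEA estimates the phases of the eigenvalues of U = exp(-i H t).
    U = expm(-1j * H * t)
    phases = np.angle(np.linalg.eigvals(U))  # each eigenvalue of U is exp(-i E t)

    # Recover the energies from the phases: E = -phase / t.
    print("recovered energies:", np.sort(-phases / t))
    print("exact eigenvalues: ", np.sort(np.linalg.eigvalsh(H)))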

Here is an excerpt from the Introduction section of that pdf document:

This paper describes the particular circuit used by a Python software package called “my-chemistry”, written by R.R.Tucci, and available at GitHub, Ref.[1]. The software can be used in conjunction with “Qubiter”, another Python software package available at GitHub, Ref.[2].

A quantum circuit very similar to the one presented in this paper has been implemented before, first in Ref.[4] and more recently and more exhaustively in the closed source software package Liqui|>, produced by Microsoft with Dave Wecker as main author.

Here is a super brief, by no means exhaustive review of some of the highlights in the history of this quantum computing approach to chemistry.

The person deserving the lion’s share of the credit for this method is A. Kitaev, who in 1995, Ref.[3], was the first to propose the PEA. Also very deserving are Trotter for his expansion, and Jordan and Wigner for their transformation.

The first paper to present an actual computer program for calculating the ground state energy of an $H_2$ molecule using PEA appears to be Ref.[4], by Whitfield, Biamonte and Aspuru-Guzik.

Researchers working for Microsoft applied the method to more complicated molecules and found some very clever optimization methods, such as using the identity (CNOT)^2 = 1. Here is their epiphany paper Ref.[5], and here is their most recent paper Ref.[6]. The latter is recommended for what appears to be a very fair and exhaustive list of references of this approach.

Finally, one should mention that Microsoft has several patents on this method, so it is possible that Microsoft will claim in the future that the software described in this paper infringes on one of their patents. Going to the USPTO website and using the query IN/wecker AND AN/Microsoft, I located 4 patents, Refs.[7][8][9][10], on Liqui|>. There might be more pending.

Patents alluded to

Quantum gate optimizations
https://patents.google.com/patent/US9064067B2/en

Optimizing quantum simulations by intelligent permutation
https://patents.google.com/patent/US8972237B2/en

Language integration via function redirection
https://patents.google.com/patent/US9292304B2/en

Quantum annealing simulator
https://patents.google.com/patent/US9152746B2/en

August 15, 2017

Resistance is Futile: Jupyter Borg Threatens to Assimilate Quantum Computing Universe

Filed under: Uncategorized — rrtucci @ 5:00 pm

A week ago, IBM announced at its Quantum Experience user group that it had uploaded to GitHub a large collection of Jupyter notebooks exhibiting the use of their gate model quantum computer (previously 5 qubits, currently 16 qubits). I consider this an excellent addition to the quantum open source and free Jupyter notebook universe and ecosystem. I’ve advocated for quantum open source and Jupyter notebooks many times before in this blog, so it’s a pleasure for me to echo their announcement.

Pow! Right in the kisser of Microsoft’s Liqui|> software. Liqui|> is closed source software.

Google has announced that it will deliver by year’s end a 49 qubit gate model qc with accompanying open source software and cloud service. The Jupyter ball is now in your court, Google.

Artiste-qb.net, the company that I work for, already provides a large and ever growing library of jupyter notebooks for both of its quantum open source offerings, Qubiter and Quantum Fog.

Rigetti’s PyQuil and ProjectQ are two other gate model qc simulators analogous to the IBM Quantum Experience. So far these two have very few Jupyter notebooks. Wimps! Laggards! Let them eat cake!


Borg Cake


Jupyter Cake

April 29, 2017

Miss Quantum Computing, may I introduce to you Miss Bayesian Hierarchical Models and Miss MCMC?

Filed under: Uncategorized — rrtucci @ 5:49 pm

Warning: Intense talk about computer software ahead. If you are a theoretical computer scientist, you better stop reading this now. Your weak constitution probably can’t take it.

When you enter the nerd paradise and secret garden that is Bayesforge.com (a free service on the Amazon cloud), you will see one folder named “Classical” and another named “Quantum”. Here is a screenshot of this taken from Henning Dekant’s excellent post on LinkedIn.

The “Quantum” folder contains some major open source quantum computing programs: Quantum Fog, Qubiter, IBM-QisKit (aka kiss-kit), QuTip, DWave, ProjectQ, Rigetti

The “Classical” folder contains some major Bayesian analysis open source programs: Marco Scutari’s bnlearn (R), Kevin Murphy’s BNT (Octave/matlab), OpenPNL (C++/matlab), PyMC, PyStan.

The idea is to promote cross fertilization between “Quantum” and “Classical” Bayesian statisticians.

Today I want to talk mostly about PyMC and PyStan. PyMC and PyStan deal with “Hierarchical Models” (Hmods). The other programs in the “Classical” folder deal with “Bayesian Networks”(Bnets).

Bnets and Hmods are almost the same thing. The community of people working on Bnets has Judea Pearl as one of its distinguished leaders. The community of people working on Hmods has Andrew Gelman as one of its distinguished leaders. You might know Gelman (Prof. at Columbia U.) from his great blog “Statistical Modeling, Causal Inference, and Social Science” or from one of his many books.

Both PyStan and PyMC do MCMC (Markov Chain Monte Carlo) for Hmods. They are sort of competitors but also complementary.

PyStan (its GitHub repo here) is a Python wrapper of a computer program written in C++ called Stan. According to Wikipedia, “Stan is named in honour of Stanislaw Ulam, pioneer of the Monte Carlo method.” Prof. Gelman is one of the fathers of Stan (I mean the program, of course).

PyMC comes in 2 incompatible versions: 2.X and 3.X. Version 3 is more advanced and intends to replace Ver 2. PyMC2’s best sampler is a Metropolis-Hastings (MH) sampler. PyMC3 contains an MH sampler, but it also contains the “No-U-Turn” or “NUTS” sampler, which is supposed to be much faster than MH for large networks. Currently, Bayesforge contains only PyMC2, but the next version will contain both PyMC2 and PyMC3. As an added bonus, PyMC3 comes with Theano, one of the leading deep neural network frameworks.
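To give a flavor of what these Hmod packages do, here is a minimal PyMC3 sketch that infers the mean of some noisy data with the NUTS sampler (argument names vary a little across PyMC3 releases, so treat this as a sketch rather than gospel):

    import numpy as np
    import pymc3 as pm

    # Fake data: 100 draws from a Normal(2.5, 1) distribution.
    np.random.seed(0)
    data = np.random.normal(2.5, 1.0, size=100)

    with pm.Model():
        # Prior on the unknown mean.
        mu = pm.Normal("mu", mu=0.0, sd=10.0)
        # Likelihood of the observed data.
        pm.Normal("obs", mu=mu, sd=1.0, observed=data)
        # pm.sample uses the NUTS sampler by default for continuous variables.
        trace = pm.sample(1000, tune=1000, chains=2)

    print(pm.summary(trace))  # the posterior mean of mu should be close to 2.5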

Check out this really cool course:

Sta-663 “Statistical Programming”, available at GitHub, taught at Duke U. by Prof. Chi Wei Cliburn Chan.

This wonderful course has some nice Jupyter notebooks illustrating the use of PyMC2, PyMC3 and PyStan. Plus it contains discussions of many other statistical programming topics. I love it. It has a similar philosophy to BayesForge, namely to do statistical programming with Jupyter notebooks because they are great for communicating your ideas to others and allow you to combine seamlessly various languages like Python, R, Octave, etc.

February 22, 2017

Quantum Fog’s weight in bnlearn units

Filed under: Uncategorized — rrtucci @ 2:42 am

In a recent blog post entitled “R are Us. We are all R now”, I expressed my great admiration for the R statistical computer language, and I announced the addition to the Quantum Fog (QFog) GitHub repository of a Jupyter notebook called “Rmagic for dummies” which explains how something called Rmagic allows one to run both Python and R in the same Jupyter notebook.
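For readers who have never seen Rmagic, here is the generic rpy2 cell-magic pattern that the notebook is built around (a minimal sketch, not the actual contents of “Rmagic for dummies”):

    # --- Jupyter cell 1 (Python): load the R magics that ship with the rpy2 package ---
    %load_ext rpy2.ipython

    import pandas as pd
    df = pd.DataFrame({"x": [1, 2, 3, 4], "y": [2.1, 3.9, 6.2, 8.1]})

    # --- Jupyter cell 2 (R): %%R must be the first line of its own cell;
    #     the -i flag pushes the Python dataframe df into the R session ---
    %%R -i df
    fit <- lm(y ~ x, data = df)
    print(summary(fit))

The -o flag does the reverse, pulling an R object back into the Python session.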

In 2 other earlier blog posts, I also expressed great admiration for something else, bnlearn, an open source computer program written in R by Marco Scutari for learning classical Bayesian networks (cbnets) from data. I consider bnlearn the gold standard of bnet learning software.

The main purpose of this blog post is to announce that the QFog GitHub repo now has a folder of Jupyter notebooks comparing QFog to bnlearn. This is a perfect application of Rmagic to comparing two applications that can do some of the same things but one app is written in R while the other is written in Python. Pitting QFog against bnlearn is highly beneficial to us developers of QFog because it shows us what needs to be improved and suggests new features that would be worthwhile to add.

QFog can do certain things that bnlearn can’t (most notably, QFog can do both classical and quantum bnets, whereas bnlearn can only do classical bnets), and vice versa (for instance, bnlearn can do bnets with continuous (Gaussian) node probability distributions, whereas QFog can only handle discrete PDs), but there is much overlap between the 2 software packages in the area of structure and parameter learning of classical bnets from data.

A cool feature of the folder of Jupyter notebooks comparing bnlearn and QFog is that most notebooks in that folder can be spawned and run from a single “master” notebook. This amazing ability of the “master” notebook to create and direct a zombie horde of other notebooks is achieved thanks to an open source Python module called “nbrun” (notebook run).
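Here is roughly what a “master” notebook cell looks like (a hedged sketch: the template notebook name and its parameters below are made up for illustration, and I am quoting nbrun’s run_notebook interface from memory, so check its README for the exact keyword names):

    # Hypothetical "master" cell that spawns parameterized copies of a template
    # notebook via nbrun. The template name and parameters are made up.
    from nbrun import run_notebook

    for learner in ["naive_bayes", "chow_liu"]:
        # nb_kwargs passes parameters to the child notebook (nbrun injects them
        # near the top of the child notebook before executing and saving it).
        run_notebook("compare_qfog_bnlearn_template.ipynb",
                     nb_kwargs={"learner": learner, "num_samples": 500})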


January 6, 2017

Pythonic Qubiter now has a quantum compiler. The Death Star doesn’t have one yet.

Filed under: Uncategorized — rrtucci @ 12:33 am

As I’ve mentioned many times before in this blog, Henning Dekant (in Toronto) and I (in Boston) founded about a year ago a quantum computer software company called Artiste-qb.net. Our two main products so far are the open source Python programs Quantum Fog and Qubiter. Both programs were originally written in C++, but now they have been rewritten in Python and improved in many ways. This blog post is to announce that the Pythonic version of Qubiter now has a quantum compiler. Hurray for the Rebel Alliance!

Quantum compilers have long been an interest of mine. As far as I know, I was the first one to use the term “quantum compiler” in a paper in the field of quantum computing. I did so in the following 1999 paper

https://arxiv.org/abs/quant-ph/9902062

By a quantum compiler I mean a software program that takes an ARBITRARY unitary matrix as input and decomposes it into a sequence of qubit rotations and CNOTs. In my opinion, quantum compilers that fit this description are useful, even necessary, if one wants to use quantum computers to do artificial intelligence. Because, whereas for most physics applications one can assume unitaries of the form U = e^{-itH} where H, the Hamiltonian, has a very special structure, in AI, the unitaries can have an ARBITRARY structure (not known a priori) that doesn’t come from a Hamiltonian with an a priori known structure.

That 1999 paper was the first one to propose using the CS decomposition of Linear Algebra to do quantum compiling. The C++ program Qubiter, which was first released open source simultaneously with the 1999 paper, was the first computer program that used the CS decomposition to do quantum compiling.
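For readers who want to see what the CS decomposition does, here is a small illustrative numpy/scipy sketch (not Qubiter code) that splits a random 4x4 unitary, i.e. an arbitrary 2-qubit gate, into the block structure that a CS-decomposition-based quantum compiler recurses on. It uses scipy.linalg.cossin, which, as far as I know, is only available in fairly recent SciPy releases:

    import numpy as np
    from scipy.linalg import cossin
    from scipy.stats import unitary_group

    # A random 4 x 4 unitary = an arbitrary 2-qubit gate.
    np.random.seed(0)
    U = unitary_group.rvs(4)

    # CS decomposition with a 2 + 2 partition: U = L @ CS @ R, where L and R are
    # block-diagonal (each block a 2 x 2 unitary) and CS is a cosine-sine matrix,
    # i.e. essentially a multiplexed y-rotation.
    L, CS, R = cossin(U, p=2, q=2)

    print(np.allclose(L @ CS @ R, U))  # True: the factorization is exact
    print(np.round(CS, 3))             # shows the cos/sin block structure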

Since that 1999 paper, many papers have been written using the same CS decomposition algorithm as my paper, but not citing my work, because, truth be told, dishonesty is rampant among academic researchers.

Other papers have copied the term “quantum compiling” to refer to a task that is related but much less general than what I call quantum compiling, namely, the task of decomposing a single qubit rotation into a sequence of operations from a finite set of gates. The latter task is necessary for fault tolerant quantum computing and was first tackled by Solovay and Kitaev, but they did not refer to it as “quantum compiling”. Nobody did prior to my 1999 paper.

This blog, which is now 8-9 years old, has featured several posts about quantum compilers. These are my favorites:

And now, the latest gossip about quantum compilers.

Up to now, Microsoft’s main and only quantum computer program has been Liqui|>, a heavily patented, closed source computer program written in an unpopular language called F#. For many years now, the main writers of Liqui|>, Dave Wecker and Krysta Svore, have been promising to be THE FIRST to provide a quantum compiler, and of course, never mentioning my work. In some papers, KS uses the term quantum compiler to mean the same thing as I do; in other papers she uses it to mean software that does the Solovay-Kitaev decomposition. But I can state unequivocally that Liqui|> has no CS decomposition quantum compiler at the present time. So Qubiter is way ahead of them there.

But the plot thickens. Matthias Troyer is a professor at ETH Zurich who works part time, for big bucks, for Microsoft. He has written papers about Liqui|> with Dave Wecker and Krysta Svore. Hence many people were extremely surprised when, on Dec. 23, 2016, Matthias and two of his students at ETH Zurich, Minion A and Minion B, released a computer program called ProjectQ that duplicates most of what Liqui|> does and is therefore in direct competition with it. However, ProjectQ is open source, written in Python, and under the Apache License. The suspense is unbearable. Does this mean that Liqui|> will be laid to rest, R.I.P., meaning Dave Wecker’s and Krysta Svore’s work for the last 5 years will be completely ditched? Stay tuned. Microsoft will be holding QIP 2017 on Jan 14-20. That QIP is going to be Microsoft’s 1936 Nazi Olympics, where they intend to dazzle the world with their superiority in quantum computing. Maybe then we will find out, by watching carefully the placement of the people on the main dais during the weapons parade, who is up and who is down.

Henning and I welcome all open source code, even if it is in competition with Qubiter.

IMHO, Qubiter is currently much better than ProjectQ. For one thing, they claim they have a quantum compiler, but they don’t, at least not currently. Not in the sense that I defined it at the beginning of this post. What they do seem to have and call a quantum compiler are some subroutines that expand a single qubit gate with multiple controls attached into a sequence of single qubit rotations and CNOTs. But Qubiter has that already too. Look at its CktExpander.py file.
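For concreteness, here is the kind of expansion I mean, checked numerically with numpy (an illustration of the standard textbook identity, not an excerpt from CktExpander.py): a controlled Rz(theta) equals two CNOTs interleaved with two single-qubit z rotations.

    import numpy as np

    def rz(theta):
        # Single-qubit z rotation: Rz(theta) = diag(exp(-i theta/2), exp(+i theta/2)).
        return np.diag([np.exp(-1j * theta / 2), np.exp(1j * theta / 2)])

    I2 = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])  # control = qubit 0, target = qubit 1

    theta = 0.7
    # Controlled-Rz(theta): apply Rz(theta) to the target iff the control is 1.
    c_rz = np.block([[I2, np.zeros((2, 2))],
                     [np.zeros((2, 2)), rz(theta)]])

    # Expansion into 2 CNOTs and 2 single-qubit rotations (read right to left).
    expansion = CNOT @ np.kron(I2, rz(-theta / 2)) @ CNOT @ np.kron(I2, rz(theta / 2))

    print(np.allclose(expansion, c_rz))  # True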

Duel Between Microsoft’s ProjectQ and Liqui|> software


July 14, 2016

Quantum Fog, a quantum computer simulator based on quantum Bayesian networks, can now Think (at least better than a rock)

Filed under: Uncategorized — rrtucci @ 6:51 pm

Today, I added a folder called “learning” to Quantum Fog. QFog is a quantum computer simulator based on Bayesian Networks (bnets). Classical Bayesian networks are what earned Judea Pearl a Turing Award. Quantum Fog implements seamlessly both classical Bayesian networks and their quantum generalization, quantum Bayesian networks.

The way I see it, the field of classical Bayesian networks has had 2 Springs.

The first Spring was about 20 years ago, and it was motivated by the discovery of the join tree message passing algo, which significantly decreased the complexity of doing inference with bnets. That complexity is exponential regardless, but the join tree algo makes it exponential in the size of the largest clique of the graph.

The second Spring is occurring right now, and it is motivated by the discovery of various algorithms for learning bnets by computer from the data. Immediately after the first Spring, bnet inference could be done fairly quickly, but the bnet had to be divined manually by the user, a formidable task for bnets with more than a handful of nodes. But nowadays, that situation has improved considerably, as you can see by looking at my 2 favorite open source libraries for learning bnets from data:

  1. bnlearn. Very polished, R language. Written by Marco Scutari
  2. neuroBN, less polished but pedagogically very helpful to me. Python language. Written by Nicholas Cullen

Its new “learning” folder gives Quantum Fog a rudimentary capability for learning both classical bnets and quantum bnets from data. (So far QFog can only do the Naive Bayes and Chow Liu Tree algos; soon we will add the Hill Climbing, Tabu, GrowShrink, IAMB and PC algos.) Previous workers like Scutari and Cullen only consider cbnets. Quantum Fog aims to cover both cbnets and qbnets seamlessly. We hope QFog can eventually generate “programs” (instruction sequences) that can be run on real quantum computer hardware.
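For the curious, the Chow Liu Tree algo is simple enough to fit in a few lines. Here is a minimal, illustrative implementation using numpy, pandas and networkx (not QFog’s actual code): compute the empirical mutual information between every pair of nodes, then keep a maximum-weight spanning tree.

    import itertools
    import numpy as np
    import pandas as pd
    import networkx as nx

    def mutual_info(df, a, b):
        # Empirical mutual information (in nats) between two discrete columns.
        joint = pd.crosstab(df[a], df[b], normalize=True).values
        pa = joint.sum(axis=1, keepdims=True)
        pb = joint.sum(axis=0, keepdims=True)
        nz = joint > 0
        return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

    def chow_liu_tree(df):
        # Maximum-weight spanning tree over pairwise mutual information.
        g = nx.Graph()
        for a, b in itertools.combinations(df.columns, 2):
            g.add_edge(a, b, weight=mutual_info(df, a, b))
        return nx.maximum_spanning_tree(g)

    # Toy data: C is a noisy copy of A, and B is independent noise.
    rng = np.random.default_rng(0)
    a = rng.integers(0, 2, 1000)
    df = pd.DataFrame({"A": a,
                       "B": rng.integers(0, 2, 1000),
                       "C": (a ^ (rng.random(1000) < 0.1)).astype(int)})
    print(sorted(chow_liu_tree(df).edges()))  # the A-C edge should be present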

Quantum Fog goes to pet school


March 29, 2016

Qubiter on the brink of doing Quantum Chemistry

Filed under: Uncategorized — rrtucci @ 6:35 pm

On behalf of the artiste-qb.net company, I am pleased to announce that Qubiter now has a new class called PhaseEstSEO_writer.py that endows it with the superpower of being able to generate and simulate quantum circuits for quantum phase estimation.

The quantum Phase Estimation circuit (PEC) was invented in 1995 by Kitaev. Since then, it has found many applications in quantum computing. Microsoft Liqui|>, the main competitor of Qubiter, uses PEC to find the ground state of molecules.

In fact, Liqui|> can generate many other circuits besides PEC, but most of them are not very commercially viable. For example, Liqui|> can generate quantum error correction circuits, but error correction will most likely be done by the hardware manufacturers, so QC programmers won’t need to implement it via Liqui|> in their circuits. Liqui|> also does Shor’s algo, but I don’t think Shor’s algo will be a big money maker, because it requires thousands of qubits, so it will be one of the last applications to be implemented on a QC. By the time it is implemented, post quantum crypto, which is impervious to it, will have been in general use for many years.

So most of the circuits that Liqui|> can generate & simulate will never be commercially viable, but those for doing quantum chemistry probably will be.

The Liqui|> team claims to have used experimentation with Liqui|> to reduce by a factor of a thousand the size of their QC circuit for finding the ground state of molecules with PEC. They claim that their new, optimized circuit will give an answer in an hour of running on a gate model QC, once those beasts arrive. Such calculations would take billions of years for a classical computer to perform, they claim.

The Liqui|> authors like Krysta Svore seem well aware that Quantum Chemistry is one of the most commercially viable and potentially lucrative applications of Liqui|>. You can almost see the dollar signs in their beady eyes when they speak about quantum chemistry.

But Qubiter is throwing a spanner into their nefarious plans.

Microsoft Liqui|> is closed source and heavily patented. Also, it is written in a very unpopular language, F#. Qubiter, on the other hand, is open source and written in the super popular language Python.

And now Qubiter allows everyone, not just Microsoft egg heads, to do QC quantum chemistry too.

March 22, 2016

First version of Qubiter (a quantum computer simulator) is out. It says: “Welcome, my navigator. Where do you want me to take you next?”

Filed under: Uncategorized — rrtucci @ 4:44 pm

artiste-qb.net has just uploaded to GitHub its first version of Qubiter, a quantum computer simulator, under the BSD license and written in Python.

It’s saying to you: “Welcome, Navigator. Where do you want me to take you next?”


Over the last 20 years, dozens of quantum computer simulators (for gate model, aka quantum circuit QC) have been released. Here is a partial list. So what makes this one special?

Let me compare Qubiter with Microsoft’s very famous quantum simulator called Liqui|>.

  • Liqui|> is closed source (and heavily patented), Qubiter is open source under BSD license
  • Liqui|> is written in F#, Qubiter is written in Python. The user base of Python is ~100 times bigger than that of F#. And believe me, programmers are very savvy consumers of programming languages.
  • Dave Wecker (rhymes with wrecker), chief architect of Liqui|>, has been quoted as saying that he estimates Liqui|> has about 30,000 lines of code. Qubiter currently has less than 1,000, and does all the basics and much more. So tell me, what would you prefer to have to grok, 30,000 or 1,000 lines of code?

I hope you enjoy it. Or as my grandma used to say,

Stai zitto e mangia!! (Shut up and eat!!)

February 17, 2016

Our baby (Quantum Fog) can now read, write and draw

Filed under: Uncategorized — rrtucci @ 12:06 am

In a special Feb. 2, 2016, Groundhog Day blog post, I announced the first large release by our company, artiste-qb.net, of Pythonic open-source Quantum Fog. As you may or may not know, Quantum Fog was originally (almost 20 years ago) a Mac-only application written in C++. The GitHub page for Quantum Fog contains the legacy C++ code of ye olde Quantum Fog, plus the shining new Python code which will attempt to reproduce all the functionality of the old application plus much more. So are we there yet, you ask. Not quite, but we are making steady progress.

These are some of the things the new QFog can do so far:

  • It can do inference with evidence using 3 methods: Join Tree, MCMC (Monte Carlo) and brute force (enumeration of all Feynman paths).
  • It can do inference by those 3 methods for BOTH quantum Bayesian networks (QBnets) and classical Bayesian networks (CBnets).
  • It can read and write CBnets and QBnets in two formats, .dot and .bif (bif stands for Bayesian Interchange Format).
  • It can draw the CBnet and QBnet graphs using only matplotlib and networkx, included in the usual Python installation. This is fine for most purposes, but if you want a super high quality plot of your graph, you should use the .dot file that Quantum Fog generates and fine-tune that with GraphViz.

QFog integrates CBnets and QBnets seamlessly. You can use all subroutines for either classical or quantum analysis simply by changing a Boolean input parameter called is_quantum.

Footnotes

  • .dot and .bif files are both just .txt files. The .dot format is used by a really wonderful, free software called GraphViz. There are several very helpful Bayesian network repositories that store Bnets in the .bif format. The .dot and .bif formats are complementary. The .dot format is good for storing visual layout info, not good for storing the numerical tables associated with each node. The .bif format is good in the opposite way.

  • The “join tree” (or junction tree, or clique tree, or belief propagation) method is an exact method (other methods, like MCMC, are approximate). The Join Tree method caused a mini revolution in the Bayesian networks field starting from 1990 or so. Before then, people had been discouraged by a proof that calculating probabilities from a Bnet by brute force is NP hard. But the Join Tree method takes advantage of the graph structure. If I understand it correctly, its complexity is exp(k) whereas brute force is exp(n), where n is the total number of nodes and k is the number of nodes in the fattest clique (k is called the width of the join tree). (A tiny brute-force example follows these footnotes.) The Join Tree algorithm still has exponential complexity, but is much better than the brute force algo. The join tree algo used by the new QFog is the one described in the following very detailed and clear, cookbook paper:

    Inference in Belief Networks, A Procedural Guide, by C. Huang and A. Darwiche (1996)
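To make the brute-force side of that comparison concrete, here is a tiny, self-contained Python example (illustrative only, not QFog code) that computes P(A | C = 1) on a 3-node chain bnet A -> B -> C by summing over all 2^3 joint states; that sum over every joint state is exactly what blows up exponentially as nodes are added.

    import itertools
    import numpy as np

    # Chain bnet A -> B -> C with binary nodes.
    p_a = {0: 0.6, 1: 0.4}
    p_b_given_a = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.3, (1, 1): 0.7}    # key = (b, a)
    p_c_given_b = {(0, 0): 0.8, (1, 0): 0.2, (0, 1): 0.25, (1, 1): 0.75}  # key = (c, b)

    def joint(a, b, c):
        return p_a[a] * p_b_given_a[(b, a)] * p_c_given_b[(c, b)]

    # P(A | C = 1) by brute force: enumerate all 2^3 joint states.
    unnorm = np.zeros(2)
    for a, b, c in itertools.product([0, 1], repeat=3):
        if c == 1:
            unnorm[a] += joint(a, b, c)
    print(unnorm / unnorm.sum())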

I end by waxing poetic in a nerdy way. Here are 3 things that remind me of quantum fog:


Fog Machines, very cool. An essential prop in the shooting of moody films, in rock concerts, in serious Halloween home decorations and in nerd experiments. They work by either (1) pushing a mixture of water and (glycol or glycerin or mineral oil) over a heated surface, or (2) dropping dry ice, i.e., solid CO2, into water heated near boiling point. Some use solid N2 or O2 instead to get a different kind of fog.


Aerogels, very cool too.

They are very good thermal insulators. You can put your hand on one side and a Bunsen Burner flame on the other side of a ½ inch thick layer of aerogel and not feel the heat.

The ones that are almost transparent and ethereal looking are the silica aerogels. They are kind of expensive though, like $50 for a 1 x 1 x 0.5 in. specimen.

Wikipedia quotes:
“Aerogel was first created by Samuel Stephens Kistler in 1931, as a result of a bet with Charles Learned over who could replace the liquid in “jellies” with gas without causing shrinkage.”

“The first aerogels were produced from silica gels. Kistler’s later work involved aerogels based on alumina, chromia and tin dioxide. Carbon aerogels were first developed in the late 1980s.”


And of course, the Golden Gate Bridge shrouded in fog.

February 2, 2016

First Major Commit to Quantum Fog in Python

Filed under: Uncategorized — rrtucci @ 4:56 pm

Today is Groundhog day, Feb. 2, 2016. This holiday holds special significance in Quantum Computing History. Indeed, on another Groundhog day in the distant past, in the year 2015, a watershed historic quantum event occurred, namely Jimmy the Groundhog bit Seth Lloyd’s ear.

Other quantum watershed events have occurred since then, like, for example, Caltech held its “One Entangled Evening” and Alex Winter screened his NSF funded 10 minute mini-movie “Anyone can Quantum”. But those events are California/Hollywood glitzy, whereas Groundhog Day is a more folksy affair (old white men with funny top hats bugging a chubby rodent who just wants to go back to sleep).

Since we here at artiste-qb.net are more folksy than glitzy, we decided to celebrate this holiday by releasing today our first version of Quantum Fog in Python (QFog is open sourced under the BSD license, and it is available at GitHub).

Basically, what I did was to refurbish an old open-source program called PBNT by Elliot Cohen. PBNT does classical Bayesian Networks using 3 inference algorithms: Enumeration (brute force), MCMC and Join Tree (aka Junction Tree).

The join tree algo used by PBNT and QFog is the one described in the following very detailed and clear, cookbook paper: _Inference in Belief Networks, A Procedural Guide_, by C. Huang and A. Darwiche (1996)
http://www.ar-tiste.com/Huang-Darwiche1996.pdf

Our new QFog release does BOTH classical and quantum Bnets using the same 3 algos. One of our eventual goals is to write a quantum computer programming language based on quantum Bnets.

November 30, 2015

Quantum Fog Comes Out of the Closet

Filed under: Uncategorized — rrtucci @ 4:06 am

Quantum Fog, my old Mac application for doing calculations with quantum Bayesian networks, is now open-source (it’s finally out of the closet). Check it out at GitHub.

Quantum Fog was originally written in C++, but we plan to rewrite most of it in Python.

By the way, I highly recommend the GitHub website if you need to collaborate with a group of people on the writing of a computer app or a website or an arXiv paper, or many other possibilities. I use GitHub in conjunction with a Windows app called TortoiseGit, which I also love and highly recommend.

November 11, 2015

Google Open-sources TensorFlow (A Fakesian Networks Software Library). Microsoft, Tear Down This Infer.NET Wall

Filed under: Uncategorized — rrtucci @ 5:21 pm


On Nov. 10, Google announced to much fanfare that it was open-sourcing TensorFlow, their software library for “Large-Scale Machine Learning on Heterogeneous Distributed Systems”. At this point in time, I know very little about TensorFlow, but I can already see that it is not very Bayesian Networky. Me being such an avid B Net fan, I can’t deny that I was a little disappointed by how little B net stuff it contains. To me, TensorFlow is Fakesian Networks instead of Bayesian Networks 🙂

In my opinion, TensorFlow has VERY LITTLE in common, except for the name, with what quantum information theorists call “quantum tensor networks”, although I’m sure that some sleazy, opportunistic physicists and science journalists will claim that the two are conjoined twins. Unlike the classical, mostly deterministic, highly distributed calculations that TensorFlow performs, quantum computers have to deal mostly with quantum probabilistic instead of deterministic calculations, and distributed computing for QCs would be very different from its classical counterpart. I think when dealing with probabilistic calculations, either on a classical or quantum computer, Classical Bnets and Quantum Bnets are the most natural and productive framework/strategy, as I’ve explained before.

Despite TensorFlow being Fakesian Networks, I welcome Google’s move to open TensorFlow, because it certainly raises the level of visibility, cooperation, competition and tension/suspense in the AI arena. For example, I’m sure that right about now Microsoft is facing a lot of pressure to respond in kind to the news about TensorFlow. So what does Microsoft use instead of TensorFlow to do its Bing AI? Is it Infer.net? Whatever it is, will MS have to open source part of its Bing AI, to keep up with the Joneses and the Kardashians?

I like Infer.net. It looks much more Bayesian Networky to me than TensorFlow does. Unfortunately, so far MS has only released to the public infer.net’s binary and API, and it has forbidden non-MS people from using infer.net for commercial purposes.

Told you so UPDATE1: Ta-tan, Nostradamucci’s predictions have turned into reality again. On Nov 12, just 2 days after Google released TensorFlow, Microsoft announced the open-sourcing of DMTK (Distributed Machine Learning ToolKit). And of course, Facebook was the first, with its open-sourcing of Torch enhancements on Jan 16 of this year.

UPDATE2: Related news. After UPDATE1, I learned that IBM has also been busy open-sourcing distributed AI software. It has recently open-sourced SystemML (ML=Machine Learning), a small part of its Watson software. The GitHub repository of SystemML was started on Aug 17, 2015. According to this press article, circa Nov 23, 2015, SystemML was accepted into the Apache incubator (a preliminary step to being declared a saint, where sainthood means you have performed 2 miracles and you are declared officially integrated and compatible with the Apache libraries, especially Spark, which SystemML will rely heavily upon.)

October 6, 2015

Teaching Quantum Mechanics to Children, the Caltech and Waterloo Univ. Method

Filed under: Uncategorized — rrtucci @ 7:29 pm

I’ve heard some uncouth people, naysayers and sour grapes most of them, voice the extreme, malicious opinion that video games are junk food for the mind.

And what about the recent video games qCraft (by Caltech) and QuantumCats (by University of Waterloo in Canada) which promise to teach quantum mechanics to children? To the naysayers, those video games are poison too, quantum junk food, a way of wasting, piddling away the precious, jam-packed, fleeting years of youth.

It occurred to me that such opinions could be put to the test scientifically. So I was very happy when, while poring over the Lancet, a journal which I read faithfully every Sunday, I came across the following article about a study conducted by the Mayo Clinic on this very subject.

Clinical Study of the Effects on Children of Playing Video Games qCraft and QuantumCats
by Mayo Clinic, Oct 1, 2015

Synopsis

(Figure: quantumcats-evol)

We conducted a 1 year study on a group of 20 school children, ages 10 to 18, who showed an early interest in math and science.

10 of the children were our control group A, and 10 were our video gamers group X.

The children from group A were given classroom courses by really good high school teachers in Algebra, Geometry, Calculus, Biology, Physics, Chemistry. They were encouraged to consult Khan Academy, Wikipedia articles, take MOOC courses and read books on science and math. Those yearning for hands-on experience were encouraged to join a Ham radio club or local Hackers club and build their own electrical devices, or else join an open source programmer’s group and start writing computer code at an early age.

Group A children were also encouraged to do some physical activity by going to a court or gym and practicing a sport, and joining a youth sport team if possible. Bicycling, swimming, jogging, dancing, etc. were all encouraged.

The children in Group X were told that before taking a math or science course, it would be better if they first learned the basics of Science by playing some video games. By practicing how to build a quantum computer out of imaginary Lego blocks or throwing quantum cats with a catapult, they could learn the basics of quantum mechanics first, and then, if after a year or two of that they showed any promise, they would be permitted to take courses in science and math, and consult Khan Academy, Wikipedia and all those other old-fashioned, boring resources.

If the children from Group X wanted to do some physical activity by playing a particular sport, they were told that it would be better if they first learned the basics by playing a video game about that sport. If after a year or two of that they showed any promise, they would be taken to a court or gym to learn the physical part of the sport.

We found that 95% of the children from group A went to good colleges. 5% never made it to college because they had already started their own high-tech businesses in high school and saw no need to go to college.

We found that 95% of the children from Group X never went to college, because they were recruited by the Army right out of high school as drone plane operators. 5% did make it to college, mostly MIT, where they eventually became professors.

The figure above shows a typical child from Group X, after 0, 1, 2, 3, 4 months into the clinical study. His cranial capacity diminished by 15cc after 4 months, but we were told by Caltech and Waterloo that if we had run the study for a longer period of time, we would have seen that cranial capacity reaches a minimum after 4 months and then begins to increase. They also pointed out that our study was flawed in methodology and too low in number of children to be statistically significant.

March 30, 2014

The Inflationary Quantum Computer

Filed under: Uncategorized — rrtucci @ 9:53 am

Wow-wee! Check out the following article in The Guardian:

MIT Boffins Conjecture that the Inflationary Universe is a Quantum Computer (April 1, 2014)

an excerpt:

In the past, Seth Lloyd, MIT professor of Mechanical Engineering, has conjectured that the Universe is a quantum computer.

More recently, an experiment called BICEP2, being conducted at the South Pole, has detected evidence of gravitational waves in the cosmic microwave background. Boffins worldwide believe this to be a smoking gun for a universe with a Big Bang beginning, followed by a brief period of extremely rapid expansion called Inflation, followed by our current, slow as molasses, rate of expansion.

So now Seth Lloyd and his two collaborators, the identical twins A. Minion and B. Minion, are adding a new wrinkle to their theory. They now claim that the inflationary universe (or, alternatively, the inflationary Everett-Linde multiverse) is a quantum computer.

According to Seth Lloyd, his inflationary theory not only solves the 3 problems which the old, fuddy-duddy inflationary models solve (viz., the horizon problem, the flatness problem, and the magnetic-monopole scarcity problem), but it also solves the conundrum of why P!=NP in our present universe.

According to Lloyd, a field of Dark Pion particles exploded 10^(-36) seconds after the Big Bang. Pions are the quanta of the field of problems that are solvable in Polynomial time. Some scientists call them polynomions (you won’t find any pictures of them under Google Images because they are so difficult to visualize, neither waves nor particles). The Dark Pion explosion ended 10^(-32) seconds after the Big Bang. The duration of this explosion is called the Lloyd inflationary period. During the Lloyd inflationary period, many of the most sacred laws of physics were violated with impunity: information traveled faster than the speed of light, energy was not conserved, even P!=NP did not hold.

Let R be the radius of the bubble of incompressible quantum information and quantum entanglement that is our universe. Lloyd believes that during the Lloyd inflationary period, R grew at a rate far higher than the speed of light. This was possible because during that period, P=NP, so the universe was able to perform calculations at a prolific, promiscuous rate, called the “Twitter Rate Upper Bound” (named after Oxford U. Prof. Robin Twitter).

Yakov B. Zeldovich once said that the Universe is the poor man’s accelerator. Lloyd likes to add that the Universe is the poor man’s inflationary quantum computer.

Alan Guth often says that cosmic inflation is the ultimate free lunch. Lloyd reflects that cosmic inflation is the ultimate free, open-source, life-giving-app of the cosmic quantum computer.
