Quantum Bayesian Networks

May 17, 2018

Help us program the NYC Matrix

Filed under: Uncategorized — rrtucci @ 1:59 am

Our company Artiste-qb.net does bleeding edge research into quantum computing. We are so advanced that our researchers think of Palantir and the NSA as a bunch of dotards and little Linux rocketmen.

One of our main projects is to program The NYC Matrix. We are almost done, thank you. In this picture, one of our elite programmers, Simón Bermúdez, is experimenting with quantum cloning of himself. Just like in The Matrix movies, but the real thing. Simón gets quite a lot of work done for us by slipping into this multiverse mode.

Google, IBM, Rigetti, DWave and Alibaba, you have been checkmated. artiste-qb.net is the only quantum computing firm that has mastered quantum cloning of employees.

Simón was born and raised in Venezuela, but now he lives in Toronto, Canada. He used to work at the illustrious Creative Destruction Lab (part of the U of Toronto) (hence the shirt in the photo). Now he works for artiste-qb.net. Thanks to Simón for letting me use this TOP SECRET photo. I lowered the resolution of the photo, the original one is even better.


May 9, 2018

BBVI in quantum computing, classical vs quantum supervised learning (classical vs quantum ELBO)

Filed under: Uncategorized — rrtucci @ 2:18 am

Version 1 of the computer program “Quantum Edward” that I released a few days ago uses the BBVI method (Black Box Variational Inference, see Ref. 1 below) to train a qc by maximizing, with respect to a parameter lambda, a “classical ELBO” (an ELBO defined in terms of classical probability distributions). I call that “classical supervised learning” by a qc (quantum computer).

But one can easily come up with a BBVI that trains a qc by maximizing, with respect to a parameter lambda, a “quantum ELBO” (one defined by replacing the classical probability distributions of the classical ELBO with density matrices, and sums with traces). I call this second strategy “quantum supervised learning” by a qc.
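To make the two objectives concrete, here is one plausible transcription (my notation; the actual definitions used in Quantum Edward may differ in details). With observed data x, hidden variables z, model p(x,z) and variational distribution q_lambda(z):

```latex
% classical ELBO, maximized over lambda
\mathrm{ELBO}(\lambda) \;=\; \sum_z q_\lambda(z)\,
  \big[\log p(x,z) - \log q_\lambda(z)\big]

% one natural quantum analogue: density matrices replace
% probability distributions, traces replace sums
\mathrm{ELBO}_Q(\lambda) \;=\; \mathrm{Tr}\,
  \big[\rho_\lambda\,\big(\log \sigma_x - \log \rho_\lambda\big)\big]
```

Note that the quantum expression is minus a quantum relative entropy, −S(ρ_λ ‖ σ_x), so maximizing it drives the variational density matrix ρ_λ toward σ_x, just as the classical ELBO drives q_λ toward the posterior.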

One more distinction. In Version 1 of Quantum Edward, we do C. Supervised Learning by a qc that is simulated analytically on a classical computer. More generally, one could do (C. or Q.) Supervised Learning by a (real or simulated) qc.

C. or Q. Supervised Learning by a simulated qc is immune to the quantum noise that plagues current qc’s, which have almost no quantum error correction. So we should definitely explore that type of learning today.
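For readers who have not seen BBVI before: the core of Ref. 1 is a score-function gradient estimate of the ELBO, usable for any model whose log-probability you can evaluate. Here is a minimal toy sketch (my own example, not code from Quantum Edward), fitting the mean of a Gaussian variational distribution by stochastic ascent:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p(z):
    # toy "model": unnormalized log p(x, z), with posterior N(3, 1)
    return -0.5 * (z - 3.0) ** 2

def log_q(z, lam):
    # variational family q_lambda = N(lam, 1)
    return -0.5 * (z - lam) ** 2

def score(z, lam):
    # d/d(lam) of log q_lambda(z)
    return z - lam

lam = 0.0
for t in range(2000):
    z = rng.normal(lam, 1.0, size=500)  # sample from q_lambda
    # BBVI score-function estimator of d(ELBO)/d(lam)
    grad = np.mean(score(z, lam) * (log_p(z) - log_q(z, lam)))
    lam += 0.05 * grad                  # stochastic ascent on the ELBO

print(round(lam, 2))  # converges close to the true posterior mean, 3.0
```

In practice Ref. 1 also uses Rao-Blackwellization and control variates to tame the variance of this estimator, and Robbins-Monro step sizes (Ref. 2 in the Quantum Edward README) to guarantee convergence; this sketch omits both.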

It will be interesting to compare classification performance for various models (for either layered or DAG models with varying amounts of entanglement) for

  1. C. supervised learning by a classical computer (e.g., for Classical Neural Net layered models or for Bayesian network DAG models)
  2. (C. or Q.) supervised learning by (simulated or real) qc (e.g., for Quantum Neural Network models or for Quantum Bayesian Network models)

Nerd Nirvana will only be achieved once we can do Q. Supervised Learning by an error corrected real qc. 🙂

1. R. Ranganath, S. Gerrish, D. M. Blei, “Black Box Variational
Inference”, https://arxiv.org/abs/1401.0118

May 5, 2018

Quantum Edward, First Commit

Filed under: Uncategorized — rrtucci @ 5:43 am

Today, I uploaded to GitHub the first commit of my “Quantum Edward” software. This blog is among other things a scrapbook of my quantum computing adventures. In this blog post, I want to save a copy of the first README of Quantum Edward. The software is exploratory and therefore will change a lot in the future and its README will change to mirror the changes in the software. So this first README will have a sentimental and comic value for me in years to come. Here it goes:

# Quantum Edward

Quantum Edward at this point is just a small library of Python tools for
doing classical supervised learning on Quantum Neural Networks (QNNs).

An analytical model of the QNN is entered as input into QEdward and the training
is done on a classical computer, using training data already available (e.g.,
MNIST), and using the famous BBVI (Black Box Variational Inference) method
described in Reference 1 below.

The input analytical model of the QNN is given as a sequence of gate
operations for a gate model quantum computer. The hidden variables are
angles by which the qubits are rotated. The observed variables are the input
and output of the quantum circuit. Since it is already expressed in the qc’s
native language, once the QNN has been trained using QEdward, it can be
run immediately on a physical gate model qc such as the ones that IBM and
Google have already built. By running the QNN on a qc and doing
classification with it, we can compare the performance in classification
tasks of QNNs and classical artificial neural nets (ANNs).
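As a toy illustration of “the hidden variables are angles by which the qubits are rotated” (my own one-qubit example, not the NbTrols model): a single Ry(theta) rotation acting on |0> gives output probabilities that depend smoothly on theta, which is what makes the angles trainable parameters.

```python
import numpy as np

def ry(theta):
    # single-qubit rotation about the y axis
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def prob_one(theta):
    # probability of measuring |1> after Ry(theta) acts on |0>
    state = ry(theta) @ np.array([1.0, 0.0])
    return abs(state[1]) ** 2  # equals sin^2(theta/2)

print(prob_one(0.0), prob_one(np.pi))  # 0 and (numerically) 1
```

The observed variables (circuit input and output bits) are distributed according to such angle-dependent probabilities, and BBVI adjusts the distribution over the angles to fit the training data.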

Other workers have proposed training a QNN on an actual physical qc. But
current qc's are still fairly quantum noisy. Training an analytical QNN on a
classical computer might yield better results than training it on a qc
because in the first strategy, the qc's quantum noise does not degrade the
training.

The BBVI method is a mainstay of the “Edward” software library. Edward uses
Google’s TensorFlow lib to implement various inference methods (Monte Carlo
and Variational ones) for Classical Bayesian Networks and for Hierarchical
Models. H.M.s (pioneered by Andrew Gelman) are a subset of C.B. nets
(pioneered by Judea Pearl). Edward is now officially a part of TensorFlow,
and the original author of Edward, Dustin Tran, now works for Google. Before
Edward came along, TensorFlow could only do networks with deterministic
nodes. With the addition of Edward, TensorFlow now can do nets with both
deterministic and non-deterministic (probabilistic) nodes.

This first baby-step lib does not do distributed computing. The hope is that
it can be used as a kindergarten to learn about these techniques, and that
then the lessons learned can be used to write a library that does the same
thing, classical supervised learning on QNNs, but in a distributed fashion
using Edward/TensorFlow on the cloud.

The first version of Quantum Edward analyzes two QNN models called NbTrols
and NoNbTrols. These two models were chosen because they are interesting to
the author, but the author attempted to make the library general enough so
that it can accommodate other akin models in the future. The allowable
models are referred to as QNNs because they consist of ‘layers’,
as do classical ANNs (Artificial Neural Nets). TensorFlow can analyze
layered models (e.g., ANN) or more general DAG (directed acyclic graph)
models (e.g., Bayesian networks).


1. R. Ranganath, S. Gerrish, D. M. Blei, “Black Box Variational
Inference”, https://arxiv.org/abs/1401.0118

2. https://en.wikipedia.org/wiki/Stochastic_approximation
discusses Robbins-Monro conditions

3. https://github.com/keyonvafa/logistic-reg-bbvi-blog/blob/master/log_reg_bbvi.py

4. http://edwardlib.org/

5. https://discourse.edwardlib.org/

April 13, 2018

Toronto Quantum Computing Meetup, Next Event Featuring 3 Local Stars

Filed under: Uncategorized — rrtucci @ 4:39 pm

The Toronto Quantum Computing Meetup is the largest meetup in the world dedicated to quantum computing, so we claim the title of Quantum Meetup Supremacy, at least for now. (Currently we have 1101 Supremos as members. The second biggest club is in London with 951 Brexiter members. The Quitters Brexiters have been growing fast lately. We see what you are doing: trying to sneak up on us and steal our crown. You guys are pathetic! It will never happen! Never! Grow up, you bunch of Peter Paners!)

We cordially invite you to our next meeting on Friday, April 20, 2018. The Event will feature 3 local stars, “Tres Amigos”, speaking about 3 different quantum computing related topics:

  1. Colin Lupton (from Black Brane Systems Inc.)
  2. Turner Silverthorne (from Zero Gravity Labs, part of Loyalty One)
  3. Hassan Bhatti (from CDL- Creative Destruction Lab, part of U of Toronto’s Rotman School of Management)

There will be FREE PIZZA, courtesy of ZGL

We are eternally grateful to ZGL (Zero Gravity Lab) for providing the venue for the event. ZGL is the super cool research lab of Loyalty One.

Loyalty One is one of the largest loyalty marketers in Canada.

April 7, 2018

TensorFlow Versus TensorLayers (for both classical and quantum cases)

Filed under: Uncategorized — rrtucci @ 4:39 am


IBM announces partnership with 8 startups and Zapata announces $5.4M seed investment

Filed under: Uncategorized — rrtucci @ 3:22 am

Check out this IBM Press Release. It announces that the following 8 startups will be joining the “IBM Q Network”:

  1. Zapata (U of Toronto, Aspuru-Guzik)
  2. Strangeworks (Austin TX, Whurley)
  3. QXbranch (Australia, M. Brett)
  4. Quantum Benchmark (Waterloo Canada, Lazaridis)
  5. QCWare (Ames-NASA)
  6. Q-CTRL (Univ. of Sydney, Biercuk)
  7. Cambridge Quantum Computing (London, Ilyas Khan)
  8. 1QBit (Vancouver)

It seems that the main thing these startups are getting is free access, which is not granted to everyone, to the 50 qubit IBM quantum computer. Not exactly like winning the lottery though. I suspect that in a few weeks, Google will grant everyone free access to their 72 qubit quantum computer.

The first company to be mentioned is Zapata, which starts with the letter Z. Huh?? Inverse alphabetical order? You’ve got to be kidding me. Isn’t Zapata soon going to be one of IBM’s main competitors in the quantum chemistry arena? Isn’t IBM betting their qc farm on quantum chemistry? I hope, for IBM’s sake, that the mastermind at IBM who conceived this program knows what he is doing.

Zapata has been much in the news lately.

Aspuru-Guzik, prof at Harvard for almost a decade and famous for his work using quantum computers to do chemistry, is moving to the U of Toronto in July. (I suspect that Matthias Troyer, another quantum chemistry eminence, will be offered, and will accept, the position at Harvard being vacated by Aspuru. If that happens, this will totally, completely deplete Microsoft’s quantum chemistry brain trust. He he).

Aspuru started Zapata a few months ago. Yesterday, Zapata announced that it obtained $5.4M in seed funding!!

All this Zapata news is very good news for us, artiste-qb.net. Our business plan has nothing to do with quantum chemistry, so there is very little overlap between Zapata and us. Zapata will attract qc talent to Toronto, artiste-qb.net’s home town. Such a high valuation for Zapata makes Artiste a real bargain.

3 of the 8 companies in the above list got their original funding less than 5 years ago by promising investors that they would write software to run on DWave’s annealer quantum computer. But now they claim they were IBM’s best buddies and gate model experts all along. Gate model and annealer software look nothing alike. Judases!

All this IBM press coverage must have Microsoft stewing with envy. MS won’t have an anyon quantum computer for at least 5 years, if ever. But MS could easily compete with Google and IBM in the transmon quantum computer arena by buying Rigetti (or some similar, alternative startup, like Yale QC). The price of Rigetti is pocket change to MS. I’m sure such a sale is being actively discussed behind closed doors. After all, very few Silicon Valley startups ever reach IPO; instead, they either go bankrupt or are bought out by one of the giant Valley companies.

April 6, 2018

PyMC and Edward/TensorFlow Merging?

Filed under: Uncategorized — rrtucci @ 9:09 pm

News bulletin: Edward is now officially a part of TensorFlow and PyMC is probably going to merge with Edward.

The python software library Edward enhances TensorFlow so that it can harness both Artificial Neural Nets and Bayesian Networks. The main architect of Edward, Dustin Tran, wrote its initial versions as part of his PhD Thesis at Columbia Univ. (Columbia is the home of the illustrious Andrew Gelman, one of the fathers of hierarchical models, which are a special case of Bayesian networks). Dustin now works at Google as part of a team merging Edward with TensorFlow.

One of the near term goals of artiste-qb.net is to produce a quantum generalization of Edward. This would not run on a quantum computer but would simulate on a distributed classical computer possible experiments that could in the future be conducted on a qc.

I highly recommend the following two Discourses for Edward and PyMC:

It looks like the python software libs PyMC3 and Edward may soon merge:


This is very good news, in my opinion, because I am in love with both programs. It’s interesting to note that the current Numpy is also the result of the fortuitous marriage of two separate complementary software libs.

One can now call PyMC3 and Edward from Quantum Fog, although not very smoothly yet. See here.

April 3, 2018

The Wicked Witch of the West, Krysta Svore

Filed under: Uncategorized — rrtucci @ 10:29 pm

2 years ago, I pointed out that Krysta Svore often behaves unethically.



What devious misdeeds has she been up to lately? Well, as of today, April 3, 2018, she has:

  • a whopping 24 software patent applications (14 of those are on quantum computing). See USPTO search results here. Check it out. Pretty soon, you won’t be able to do some pretty trivial operations in quantum AI without paying Microsoft.
  • a whopping 15 granted software patents (8 of those are on quantum computing). See USPTO search results here.

In a previous blog post, I compared in detail artiste-qb.net’s quantum computing language Qubiter to Microsoft’s quantum computing language Q#. Q# is a Rube Goldberg machine language that doesn’t even look like what everyone else calls quantum computing. It looks very different and is much more convoluted than its major competitors (IBM qasm, Rigetti PyQuil, ProjectQ, and the best of its class, Qubiter). Why would anyone want to learn another crappy Microsoft language when, instead, they can use Python, like all the competitors just mentioned are doing?

So why does Q# suck so badly? Is this just a consequence of Microsoft’s penchant to monopolize markets and “bind for life” its customers to its products, or is there an aggravating factor? Yes, there is an aggravating factor. It’s that one of the main architects and promoters of Q# is the Wicked Witch of the West, Krysta Svore.

In a scary nightmare that I keep having, Krysta yells the following curse to all qc fans:

“From the pits of Microsoft Hell, I curse thee, may you, and your children, and your children’s children, and their little dogs too, be forced to use Q# for the rest of their careers. ha, ha, ha, I made it extra convoluted to enjoy your pain.”

I predict that Q# will end up being as popular as the Zune. Will Microsoft rightly blame Krysta for the abysmal failure of this product, or will they blame Dave Wecker instead? Whoever is blamed will probably end up in Microsoft-Siberia, vanished from quantum computing land, bug testing new features for Microsoft-Office.


Krysta Svore, the Wicked Witch of the West


The Wicked Witch and her loyal servant

Dave Wecker

discuss Q#


Krysta’s Flying Monkey and coffee boy, little Nate Wieber

February 19, 2018

Elon Musk Predicts Quantum TensorFlow

Filed under: Uncategorized — rrtucci @ 6:22 pm

Just kidding. But it could very well happen, now that I have suggested it on the Internet, whispering it into the ears of a thousand bots.

In a 1926 article in Colliers magazine, Nikola Tesla, pioneer of alternating current (AC) power, predicted the smartphone.

“When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole. We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket.”

I personally find this prediction and the conviction and clarity with which he uttered it quite impressive.

Elon Musk must be a big fan of Tesla, as evinced by the fact that he named his car brand Tesla. (By the way, when watermelon rots in your fridge, it smells godawful. I call the smell Melon Musk. So the guy even has his own fragrance.) Even though Musk is not very good at predicting (after all, he hasn’t predicted Quantum TensorFlow…yet), he has conquered many a science nerd’s heart, including my own, by building the Tesla automobile, in a “Gigafactory” that looks like a giant spaceship, and by putting, on February 6, 2018, his own Tesla roadster into orbit for billions of years using his SpaceX company’s Falcon Heavy rocket. That scene of the 2 returning side rockets from Falcon Heavy landing side by side, and nearly simultaneously, at Cape Canaveral… is a scene that will be emblazoned forever in the hearts and minds of every science fiction and science fact lover that has seen it or ever will see it, for the rest of human history.

Let me channel for a moment the spirit of Tesla, who wants to communicate with Mr. Musk across the great divide, to tell him that quantum Tensorflow is an inevitability, and that it will need a logo. The ghost of Tesla left a picture of a suitable logo, in my computer, last night, while I slept. It’s the logo at the beginning of this blog post. The quote in the logo is of course due to Richard Feynman.

February 9, 2018

Today We Enrolled our Baby (Quantum Fog) in Gambling School Taught by Famous Monte Carlo Gamblers (PyMC3, Edward, Zhusuan)

Filed under: Uncategorized — rrtucci @ 4:57 am

In the beginning, there was Matlab, which grew out of the Fortran lib LAPACK (Linear Algebra Package, still one of the main software libs used to benchmark supercomputers). Matlab’s tensor stuff was copied and improved by the Python borgs to produce numpy, which handles tensors really nicely but doesn’t do it in a distributed “parallel” fashion.

Then, starting about 10 years ago, some guys from the University of Montréal had the brilliant idea of writing the Theano Python library (Theano was a Greek mathematician thought to have been the wife of Pythagoras). Theano replaces most numpy functions with namesake Theano functions that do the same thing in a distributed fashion. Then Google came out with the TensorFlow Python lib, which copied Theano and improved on it. TensorFlow can do most numpy operations using multiple CPUs, GPUs and TPUs. But TensorFlow and Theano are much more than tools for doing tensor operations in a distributed fashion. They also do differentiation in a distributed fashion (such differentiation is often used to train neural nets), and they are designed to help you do fast prototyping and distributed running of artificial neural nets.

In the last year, some new Python libraries built on top of TensorFlow and Theano have appeared that allow you to do fast prototyping and distributed running of Bayesian networks. B nets are near and dear to my heart, and I consider them even more powerful than artificial neural networks. And I’m far from being alone in my love of b nets: Judea Pearl won the prestigious Turing Award for his pioneering work on them. The new Python libs that I alluded to are PyMC3 (built on top of Theano), Edward (on top of TensorFlow) and Zhusuan (on top of TensorFlow).

Added later: Forgot to mention that Facebook has its own Theano equivalent called PyTorch, and Uber has an Edward equivalent called Pyro. But I haven’t used them yet.

The main architect of Edward is Dustin Tran, who wrote Edward as part of his PhD thesis at Columbia Univ. Dustin now works at Google, and the TensorFlow team is working with Dustin to integrate Edward with TensorFlow.

Zhusuan is the art of using an abacus; the word literally means “bead counting” in Chinese. The Zhusuan lib is a fine open-source (MIT license) product of Tsinghua University in Beijing, China. It demonstrates that China is already very advanced in AI.

According to Google Trends, “TensorFlow” is at least 10 times more popular than “quantum computing” as a search term, even though TensorFlow has many competitors that started before it did and it was open sourced for the first time only 2 years ago.



One of the aims of artiste-qb.net is to participate in the revolution of extending Edward & Tensorflow so that it can do both classical and quantum Bayesian Networks. Today we took a small, initial step in that direction. We added a folder


which contains a file called ModelMaker.py and two jupyter notebooks. Both notebooks do MCMC for the classical Bayesian network WetGrass. One notebook does this by invoking the external software PyMC (a.k.a. PyMC2, the precursor of PyMC3), whereas the other does it via PyMC3. Both notebooks start by loading a .bif file for the WetGrass bnet. From that alone, they construct a native X model and analyze that model using X, where X = PyMC2 or PyMC3. In the near future, we will also add a notebook that does the same thing for X = Edward or Zhusuan.
Addendum (Feb. 16, 2018): Added support for Edward.
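For readers unfamiliar with the WetGrass example: it is the classic four-node bnet (Cloudy → Sprinkler, Cloudy → Rain, {Sprinkler, Rain} → WetGrass). Here is a dependency-free sketch of the kind of quantity the notebooks estimate, using textbook CPT values (my assumed numbers, not necessarily those in our .bif file) and plain forward sampling instead of MCMC:

```python
import random

random.seed(0)

# textbook CPTs for the WetGrass bnet (assumed values, for illustration)
P_W = {(True, True): 0.99, (True, False): 0.9,
       (False, True): 0.9, (False, False): 0.0}

def sample_wet_grass():
    # forward-sample the bnet in topological order
    c = random.random() < 0.5
    s = random.random() < (0.1 if c else 0.5)
    r = random.random() < (0.8 if c else 0.2)
    return random.random() < P_W[(s, r)]

n = 200_000
est = sum(sample_wet_grass() for _ in range(n)) / n

# exact P(WetGrass=True) by brute-force enumeration, for comparison
exact = 0.0
for c in (True, False):
    for s in (True, False):
        for r in (True, False):
            ps = (0.1 if c else 0.5)
            pr = (0.8 if c else 0.2)
            exact += 0.5 * (ps if s else 1 - ps) * (pr if r else 1 - pr) \
                     * P_W[(s, r)]

print(round(est, 3), round(exact, 3))  # the two should agree to ~2 decimals
```

The PyMC2/PyMC3 notebooks do the harder job: posterior inference (e.g., P(Rain | WetGrass=True)) via MCMC, which forward sampling alone cannot give you efficiently.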

[Image: climbing Mt. QC, by Henning Dekant]

February 6, 2018

The Toronto Quantum Meetup Supremos invite you to our next Meetup entitled “Blockchain & Quantum Computing”

Filed under: Uncategorized — rrtucci @ 6:56 pm

The Toronto Quantum Computing Meetup is the largest meetup in the world dedicated to quantum computing, so we claim the title of Quantum Meetup Supremacy, at least for now. (Currently we have 997 Supremos as members. The second biggest club is in London with a paltry 706 Brexiter members.)

We cordially invite you to our next meeting on Monday, February 26, 2018 at the “Bergeron Centre for Engineering Excellence”, a beautiful building that belongs to York Univ.

This meeting is being held jointly with the “Blockchain Hub”, another Toronto Meetup (1604 members currently).

The speaker will be Henning Dekant, the CEO of artiste-qb.net (the company I work for) and 3 other distinguished panelists.

Depressed Bitcoin investors are especially welcome. Let us cheer you up! We think we are funny. And we have good taste in music too! Supremos is Spanish for Supremes. Ah, The Supremes, the better angels of our (American) nature.

Try singing that when the Bitcoin price is going down!

Our company artiste-qb.net is cultivating relationships with standard VC firms via the CDL (Creative Destruction Lab, part of the Rotman School of Management of the Univ. of Toronto), which is an incubator that we are currently participating in, and also with Chinese investors via networking done by our intrepid CTO and part owner, Dr. Tao Yin, who lives in ShenZhen, China.

However, as a side project, we are also contemplating a less traditional source of funding, an ICO. To quote Henning, “There are pitfalls with ICOs as well, the SEC has been clamping down on some, arguing that the tokens represent securities. China outlawed them completely etc. But this company aims at bringing them into regulatory compliance within all of North America, and they are the first to launch a regulator-approved ICO. I know the founders,..” The company Henning was referring to is TokenFunders, another proud Canadian company like us.

January 21, 2018

Life in the time of the Bayesian Wars: Clippy Strikes Back

Filed under: Uncategorized — rrtucci @ 7:42 pm

You can now add Clippy to any website, with his full panoply of adorable responses.


(Thanks to Gavin Dekant for pointing this website to us.) First lines of website:

Add Clippy or his friends to any website for instant nostalgia. Our research shows that people love two things: failed Microsoft technologies and obscure Javascript libraries. Naturally, we decided to combine the two.

So what does this have to do with Bayesian Networks?

In a previous blog post, I linked to a 1996 newspaper article that quotes Bill Gates revealing his strong interest in Bayesian Networks. But I didn’t mention in that blog post that this was not just a platonic love affair with Bnets which never went any further. By 1996, Gates was already actively channeling his love of B nets into backing real MS products, such as the Lumiere project, which brought to the world the first office assistant, in Office 97. Nowadays, Alexa, Siri, Google Assistant and Cortana are well known, commonplace office assistants, but MS was there first. Sadly, but characteristically, MS has fumbled the OA ball since then, and today Cortana ranks last in usage among the big 4 OA’s. In fact, Cortana is almost dead today, now that MS has all but pulled out of the mobile phone OS and hardware businesses.

Next, I want to say more about the Lumiere project, a project remarkable for its great foresight, technical prowess and creativity, and for its instructive, amusing history with a dramatic, poignant ending.

Here is a nice article which backs up much of the history that I will recount next:

The Lumiere project: The origins and science behind Microsoft’s Office Assistant By Awesome-o | August 31, 2009

The office assistant created by the Lumiere project and first offered by MS in Office 97 was called Clippy. Clippy predicted its user’s next moves and needs so poorly that Office users soon started to hate and dread it, so much so that MS decided to turn it off starting with Office 2007. So poor Clippy was born about 20 years ago and he was killed when he was 10 years old.

Microsoft’s Lumiere project, headed by Eric Horvitz, produced the first in-house versions of Clippy, based on Bayesian Networks. This original Clippy learned from the user; it was trainable, so it was truly Bayesian. By all accounts, it worked really well. However, for the commercial version that appeared in Office 97 and thereafter, upper management insisted that the Bayesian heart of Clippy be replaced by a rule-based system that could not learn from the user. The reason Clippy was crippled was not palace intrigue or corporate malice but simply that Office 97 already occupied too much space: the Office designers had to choose between including a full-fledged Clippy or including some new, mundane word processing features, but not both, and they chose the latter. Hence the original, by many accounts brilliant, Clippy was lobotomized before it was first released to the public.

But it gets better. Because of the Clippy fiasco, many people in 2007 considered the idea of an office assistant that could be trained by the user to be an impossible dream. How wrong they were! Look at the present. OA’s have not gone away. They keep getting better and more ubiquitous.

And, Clippy is back!

Bayesianism will never die. Author Sharon McGrayne has the same opinion and has written a wonderful popular science book explaining why:

“The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy”, by Sharon Bertsch McGrayne

January 10, 2018

Trend Graphs for quantum computer programming related topics

Filed under: Uncategorized — rrtucci @ 3:45 am

Programmers like me often fret over whether they are learning the programming language du jour, because employers usually only want to hire those who already know a specific language. And the more popular a language is, the more jobs are available using it. It’s an annoying fact of life for programmers that there are numerous languages available, and that they tend to peak in popularity and then fade over time. In the old days, I used to go to a Barnes & Noble bookstore and gauge the popularity of each language by the relative number of books for it on their shelves. Nowadays, to gauge such popularity, I go instead to the webpages at Google and StackOverflow that generate trend graphs.

The purpose of this blog post is to gather in one spot links to trend graphs comparing topics that I personally think are important and related to quantum computer programming.

StackOverflow trend graph for various computer languages used in quantum computing, big data, ai, statistics (javascript > python > java > C#… Haskell used by Quipper is insignificant. C# getting less popular over time)

Google trend graph comparing: Cortana, Alexa, Siri, Google Assistant (Alexa peaking, Microsoft Cortana lowest)

Google trend graph comparing: TensorFlow, PyTorch, quantum computing, quantum computer, bayesian (TensorFlow superpopular, especially in China)

January 1, 2018

New tools for dealing with limited couplings of a quantum computer chip

Filed under: Uncategorized — rrtucci @ 11:37 pm

The main purpose of this blog post is to announce that Qubiter now has two simple, but hopefully useful, new classes that facilitate writing qc programs for the new crop of qc’s with 20-50 qubits whose qubits are not fully connected (coupled):

  1. Class ForbiddenCNotExpander reads an English file and writes a new English file. Any CNot in the old file that is not physically allowed by a pre-specified qc chip is replaced in the new file by an expansion (sequence) of allowed CNots.

  2. Class ChipCouplingsFitter reads an English file and writes a new English file. The new file permutes the qubits of the old file so that the old CNots are mapped into new ones that are allowed for a pre-specified qc chip. Of course, this is not always possible, so the class only promises to look for such a permutation and to tell you what it is if it finds one.
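The idea behind the first class can be sketched generically. The following is my own minimal illustration of the standard CNot-chaining identity, not Qubiter's actual code; it assumes an undirected, connected coupling graph and ignores CNot direction (which, on a real chip, can be flipped by conjugating with Hadamards):

```python
from collections import deque

def shortest_path(edges, a, c):
    # BFS over the chip's coupling graph; assumes a != c and a path exists
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    prev, seen, q = {}, {a}, deque([a])
    while q:
        u = q.popleft()
        if u == c:
            path = [c]
            while path[-1] != a:
                path.append(prev[path[-1]])
            return path[::-1]
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v); prev[v] = u; q.append(v)
    return None

def expand_cnot(edges, a, c):
    """Expand CNOT(a, c) into allowed CNots, using the identity
    CNOT(a, c) = CNOT(a, b); CNOT(b, c); CNOT(a, b); CNOT(b, c)
    applied recursively along a shortest path from a to c."""
    path = shortest_path(edges, a, c)
    if len(path) == 2:          # a and c are directly coupled
        return [(a, c)]
    b = path[1]
    inner = expand_cnot(edges, b, c)
    return [(a, b)] + inner + [(a, b)] + inner

def apply_cnots(seq, bits):
    # computational-basis simulation: CNots permute basis states,
    # so checking all basis inputs verifies the unitary identity
    bits = list(bits)
    for ctl, trg in seq:
        bits[trg] ^= bits[ctl]
    return bits
```

For a linear chip 0-1-2, `expand_cnot([(0, 1), (1, 2)], 0, 2)` returns `[(0, 1), (1, 2), (0, 1), (1, 2)]`, which acts on every computational basis state exactly like CNOT(0, 2): the middle qubit is borrowed and then restored.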

I hope these 2 new classes will make it easier for Qubiter users to interact with real qc hardware of any brand.

We are striving to make Qubiter interact with as many qc hardware brands as possible. So far, we have targeted the IBM chips with a class called Qubiter_to_IBMqasm2 that translates a Qubiter English file (i.e., Qubiter’s version of qasm) to IBM qasm2. As soon as Google comes out with its 49 qubit qc and accompanying cloud service, which Martinis has promised for Jan 2018, we will target Google devices too, by writing a translator from Qubiter English files to Google’s version of qasm. This IBM or Google qasm can be sent by a Jupyter notebook via the internet to the IBM or Google quantum computers sitting in the cloud, which will then run the program and send back the results to the original Jupyter notebook.

We are trying to design Qubiter so that it facilitates communication not only between a user and multiple hardware devices, but also among multiple users. Indeed, Qubiter already makes it easy for users of different qc devices, for example Alice who uses an IBM chip and Bob who uses a Google chip, to be on friendly terms with each other, maybe even go out on a date. Qubiter facilitates such dalliances because it saves quantum circuits as simple yet clear text files. This makes it as easy as pie to modify those text files, save them, improve them at a later date, and share them with other qc engineers. If you want to collect quantum circuits the way you would collect baseball cards, comic books, stamps, butterflies or whatever, Qubiter’s English and Picture files found in Qubiter’s io_folder are a great way of doing it.

December 17, 2017

I Left My Heart in Nanjing, at the Quantum AI Conference, AIQP-2017

Filed under: Uncategorized — rrtucci @ 8:31 pm

Nanjing University will be holding a Quantum AI conference AIQP-2017 on Dec 20-22, 2017.

Our company artiste-qb.net has its headquarters in Toronto-Canada and is currently a participant in the CDL (Creative Destruction Lab of the Univ. of Toronto) quantum AI incubator.

Dr. Tao Yin, currently living in ShenZhen-China, is the CTO of our company artiste-qb.net. Our man Tao will not be speaking at the conference, but he will be attending and meeting people. Also attending, and speaking, will be 2 other persons associated with CDL: Dr. Jacob Biamonte and Dr. Peter Wittek. Not speaking (whew!) are any IBM, Google or Microsoft representatives.

Quote from conference website:

The workshop is sponsored by the Nanjing University AIQ fund. The AIQ fund is donated by the Founder and President of ECOVACS Robotics, Mr. Qian Dongqi, who is also an alumni of NJU (Bachelor in Physics 1981, and Master in Philosophy 1987).

The Ecovacs company sells excellent robotics vacuum cleaners all over the world.

I end with a “Make China Great Again” Tweet that I have been using recently to advertise our company:
