Quantum Bayesian Networks

May 29, 2019

UFOs and quantum computers

Filed under: Uncategorized — rrtucci @ 5:50 pm

Apple Computer is building a quantum computer. Yes-siree. As usual, they are extremely secretive about their new products, but our carefully planted mole was able to get a picture of the iQC before its official unveiling.

Recently, the U.S. Air Force divulged that it believes in UFOs. It stands to reason that if UFOs exist, they must run the most advanced quantum computer software and hardware.

May 28, 2019

Xanadu.ai, a quantum computing Theranos, will never make a profit

Filed under: Uncategorized — rrtucci @ 8:44 pm

Last night, Xanadu.ai, a quantum computing company, posted the following paper on arXiv:

“Quantum-inspired algorithms in practice”, by Juan Miguel Arrazola, Alain Delgado, Bhaskar Roy Bardhan, and Seth Lloyd.
https://arxiv.org/abs/1905.10415

All four authors give Xanadu as their affiliation. The last one, Seth Lloyd, also gives MIT as an affiliation.

The first author, Arrazola, has also published a popular description of the work on his Medium blog:

https://medium.com/xanaduai/everything-you-always-wanted-to-know-about-quantum-inspired-algorithms-38ee1a0e30ef

This company, which shows every symptom of Theranos syndrome, claims that its main goal is to build hardware, namely a CV (continuous-variable, i.e., squeezed-light) quantum computer. I am told that they just closed their second round of funding, and yet… they have never presented any hardware that can calculate anything more efficiently than a cell phone can. Even if they had, it is doubtful that such hardware could be error-corrected in a fault-tolerant way.

And today we learn that they have spent a great deal of time and manpower on a very abstract subject that has almost zero overlap with their CV hardware. Is this company ever going to make a profit? I am extremely doubtful that it will.

In my opinion, last night’s paper is a vanity paper for Seth Lloyd, whose obvious narcissism has been hurt by Ewin Tang’s brilliant work, which I described in a previous blog post:

https://qbnets.wordpress.com/2018/11/17/18-year-old-girl-shows-that-seth-lloyds-quantum-machine-learning-algorithm-is-junk/

Actually, the paper backfired to some extent, because Tang’s algorithm seems to work better than her complexity-theory upper bounds predicted. I bet that Tang, or others, will improve her algorithm so that it works better for high-rank matrices. Xanadu bozos, I hate to tell you this, but mathematical algorithms are a moving target; they can almost always be improved, especially algorithms as new and novel as this one.

Despite this silly paper, the fact remains that Seth Lloyd’s machine learning algorithm is as dead as a doornail. It always has been, even before Ewin Tang’s brilliant work. I say this because Lloyd’s ML algorithm uses something called QRAM, and QRAM is a very distant pipe dream, even more distant than the far easier goal of fault-tolerant quantum computing, which is itself a distant dream.

To summarize, for Xanadu to be spending so much time and manpower studying a distant pipe-dream algorithm that is irrelevant to their business, instead of working on their under-performing or non-existent hardware, does not bode well for the company. I am extremely doubtful that Xanadu will ever make a profit, or ever make a device that calculates anything better than a cell phone can. Xanadu investors: you are being fleeced by your VCs and by Très Void.

May 26, 2019

Tucci’s Work Going “Almost” Viral (LOL)

Filed under: Uncategorized — rrtucci @ 3:36 am

Sometimes you find yourself losing faith in yourself and in your hard work of 20 years. But then some very kind people rise out of damn Twitter, of all God-forsaken places, to reassure you. At such times, you thank God for Twitter. (LOL, I never thought I would say that. 99% of the time, I hate Twitter with a passion. I used to belong to Twitter, but no longer do. On those painful occasions when I peek into it, I access it via a prophylactic incognito tab of my web browser.)

My recent blog post entitled “Quantum simulator Qubiter now has a native TensorFlow Backend” has been shared on Twitter by a small, select group of super kind people. I want to store in this blog post the IDs of those tweets before Twitter archives them and the Twitter search engine stops listing them. As of today, upon querying the Twitter search engine with the keywords (“quantum” or “quantumcomputing”) and “tensorflow”, I count 11 tweets. In my book, even one retweet is like going “almost” viral!🙃

May 14, 2019

Quantum simulator Qubiter now has a native TensorFlow Backend

Filed under: Uncategorized — rrtucci @ 2:05 am


I am pleased to announce that my quantum simulator Qubiter (available on GitHub, BSD license) now has a native TensorFlow backend-simulator (see its class `SEO_simulator_tf`; the `tf` stands for TensorFlow). This complements Qubiter’s original numpy simulator (contained in its class `SEO_simulator`). A small step for Mankind, a giant leap for me! Hip Hip Hurray!

This means that Qubiter can now calculate the evolution of a state vector using a CPU, GPU or TPU. Plus, it can do back-propagation on a quantum circuit. Here is a Jupyter notebook that I wrote that uses Qubiter’s TF backend to do VQE (Variational Quantum Eigensolving). (I like to call VQE “mean Hamiltonian minimization”.)

https://github.com/artiste-qb-net/qubiter/blob/master/qubiter/jupyter_notebooks/MeanHamilMinimizer_native_with_tf.ipynb
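
For readers who want the flavor of mean Hamiltonian minimization without opening the notebook, here is a minimal toy sketch (my own illustration, not code taken from the notebook or from Qubiter). A one-qubit circuit Ry(theta)|0> has mean energy <Z> = cos(theta), and TF gradient descent drives theta to the minimum at theta = pi:

```python
import tensorflow as tf  # TF 2.x, Eager mode

theta = tf.Variable(0.1)  # circuit parameter, started near the energy maximum
opt = tf.optimizers.SGD(learning_rate=0.4)

for step in range(50):
    with tf.GradientTape() as tape:
        # state vector of Ry(theta)|0> is (cos(theta/2), sin(theta/2))
        amp0, amp1 = tf.cos(theta / 2), tf.sin(theta / 2)
        # mean energy <psi|Z|psi> = amp0^2 - amp1^2 = cos(theta)
        mean_energy = amp0 ** 2 - amp1 ** 2
    # replay the tape in reverse to get d(mean_energy)/d(theta)
    grad = tape.gradient(mean_energy, theta)
    opt.apply_gradients([(grad, theta)])

print(float(theta))  # approaches pi, where the mean energy is -1
```

Qubiter’s TF backend does this same kind of minimization, except that the state vector and its mean energy come from simulating a full quantum circuit instead of from a two-line formula.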

Numpy is a tensor library in Python, and TensorFlow (produced by Google) is another. TensorFlow matches, almost one to one in name and functionality, every function in numpy. But the TF namesake functions are much more powerful than their numpy counterparts. Besides using the usual CPU, they can use distributed computing resources like GPUs and TPUs. Furthermore, they can be asked to “tape” a history of their use, and then to replay that history in reverse, back-propagation mode, so as to calculate the derivatives of a list of user-designated variables. Those derivatives can then be used to minimize a cost (aka loss or objective) function. Such minimizations are the bread and butter of classical and quantum neural nets.
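
To make the “taping” business concrete, here is a tiny standalone example of mine (not Qubiter code). It shows a numpy function next to its TF namesake, then records a calculation on a tape and replays it in reverse to get a derivative:

```python
import numpy as np
import tensorflow as tf  # TF 2.x

# namesake functions, almost 1 to 1
x_np = np.sin(np.linspace(0.0, 1.0, 5))  # plain numpy array, CPU only
x_tf = tf.sin(tf.linspace(0.0, 1.0, 5))  # TF tensor, can live on a GPU or TPU

# "tape" a history of use, then replay it in back-propagation mode
theta = tf.Variable(0.3)
with tf.GradientTape() as tape:
    cost = tf.sin(theta) ** 2          # a toy cost function
grad = tape.gradient(cost, theta)      # equals 2*sin(theta)*cos(theta)
theta.assign_sub(0.1 * grad)           # one gradient-descent step on the cost
```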

These are exciting times for TF:

  • Just last March, Google officially released the alpha version of TF 2.0.

  • In TF 2.0, Eager mode (the mode that Qubiter’s TF backend uses) has been elevated to the default. TF Eager uses dynamic graphs, just as TF’s main competitor, PyTorch (produced by Facebook), does.

  • TF 2.0 also incorporates “Edward”, a software lib by Dustin Tran for doing calculations with Bayesian networks; in its TF incarnation, Edward is called TF Probability. TF 2.0 is also incorporating Keras (a library for doing layered neural nets), and the developers of PyMC (a lib for doing Markov Chain Monte Carlo calculations) have announced that its next version will be built on top of TF Probability.

  • Anyone can run TF 2.0 in the cloud via Google’s Colab. That is, in fact, how I developed and tested Qubiter’s TF backend.

In theory, all you have to do to convert a numpy simulator to a TensorFlow one is to replace every function that starts with the prefix `np.` by its namesake that starts with the prefix `tf.`. However, the devil is in the details. I, like most good programmers, hate repeating code. I try to avoid doing so at all costs, because I don’t want to have to correct the same bug in multiple places. So I built the class `SEO_simulator_tf` by sub-classing the original numpy simulator class `SEO_simulator`. That is one of the beauties of sub-classing in OOP (Object-Oriented Programming), of which I am a great fan: it helps you avoid repeating a lot of code. As a result, `SEO_simulator_tf` ended up being a deceptively short and simple class: basically, all it does is turn on a bunch of flags that it passes to its parent class `SEO_simulator`. Most of the new code is in the parent class `SEO_simulator` and in other classes, like `OneBitGates`, which have been modified to respond to all those flags that `SEO_simulator_tf` sets.
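
Here is a bare-bones sketch of that sub-classing pattern. This is not Qubiter’s real code: I am reusing the names `SEO_simulator` and `SEO_simulator_tf` only for orientation, and the `use_tf` flag and `evolve` method below are made up for illustration:

```python
import numpy as np
import tensorflow as tf

class SEO_simulator:
    """Parent simulator: a flag picks the tensor library."""
    def __init__(self, use_tf=False):
        # every tensor call goes through self.lib, so the numpy
        # and TF code paths share a single implementation
        self.lib = tf if use_tf else np

    def evolve(self, st_vec, gate_matrix):
        # one gate application = gate matrix times state vector,
        # computed in whichever library is switched on
        return self.lib.tensordot(gate_matrix, st_vec, axes=1)

class SEO_simulator_tf(SEO_simulator):
    """Child simulator: basically all it does is turn on a flag
    that it passes to its parent class."""
    def __init__(self):
        super().__init__(use_tf=True)

hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
ket0 = np.array([1.0, 0.0])
print(SEO_simulator().evolve(ket0, hadamard))     # numpy backend
print(SEO_simulator_tf().evolve(ket0, hadamard))  # TF backend, identical call
```

The payoff is exactly the one mentioned above: a bug fixed once in the parent class is fixed for both backends at once.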

May 5, 2019

Quantum Matrix Rain

Filed under: Uncategorized — rrtucci @ 9:11 pm

In a previous blog post, I compared the evaluation of gradients in quantum computing via multi-threading to the “digital rain” of The Matrix movie series. This Sunday, I was bored, so I decided to make my own artistic rendition of quantum multi-threading. It turns out that writing a rudimentary animation is a trivial task, requiring almost zero JavaScript knowledge, if one uses the power of the browser and the HTML `marquee` tag.

http://www.ar-tiste.com/quantum_matrix_rain.html

May 1, 2019

Popular Talk on Multi-Threading, Gradients, AI, Quantum Computing

Filed under: Uncategorized — rrtucci @ 3:33 am

An old friend asked me to prepare a short talk, at the popular-science level, about my latest work on multi-threading, gradients, AI, and quantum computing. Voilà:
http://www.ar-tiste.com/threaded_grads_popular_talk.pdf
