Quantum Bayesian Networks

April 14, 2014

Can Quantum Computers Do Machine Learning in General in Polynomial Time?!!!

Filed under: Uncategorized — rrtucci @ 2:50 pm

In 2 recent papers (here and here), Seth Lloyd, Masoud Mohseni and Patrick Rebentrost have proposed a method for using a quantum computer to do “supervised machine learning”. Their papers have been trumpeted by Lloyd and the press as a significant breakthrough. On Jan 29, 2014, Lloyd gave a lecture about his algorithm at Google.

A nagging question that you might have upon browsing the 2 Lloyd papers is: how can this be? His algorithm has a polynomial time complexity. Don’t the meatier machine learning problems have exponential time complexity, even on a quantum computer? After all, as Scott Aaronson states in the masthead to his blog, “quantum computers are not known to be able to solve NP-complete problems in polynomial time”.

An answer to this nagging question about Lloyd’s 2 papers was given in the following paper by a group of Microsoft researchers:

“Quantum Nearest-Neighbor Algorithms for Machine Learning”, by Nathan Wiebe, Ashish Kapoor, Krysta Svore (abstract)

The Microsoft paper points out in its introduction that the machine learning algorithm that Lloyd et al implement on a QC is known to perform poorly in many cases on a classical computer, so its quantum version will perform just as poorly; it will merely produce the same crummy answer more quickly than a classical computer would.

Let me explain briefly what the Microsoft paper says about the Lloyd et al algorithm.

Suppose you are given a point \vec{x}\in \mathbb{R}^N and 2 disjoint sets A and B of points in \mathbb{R}^N. The task solved by Lloyd et al is to assign \vec{x} to whichever of A or B is “closest” to it.
Two simple methods of doing this are:

  • Nearest centroid (NC) method: Calculate the centroid (i.e. mean) of A and of B. Then assign \vec{x} to the set whose centroid is closest to \vec{x}.
  • Nearest neighbor (NN) method: Find the point (called the nearest neighbor of \vec{x}) in A \cup  B which is closest to \vec{x}. Assign \vec{x} to the set which contains that nearest neighbor.

Lloyd et al use the NC method, which often performs poorly, whereas the Microsoft group uses the NN method, which performs much better. The catch is that whereas the Lloyd NC method has a polynomial time complexity on a QC, the Microsoft NN method has an exponential time complexity on a QC. (In fact, the Microsoft NN method uses Grover’s algorithm. My Lisbeth software for doing AI on a QC also uses Grover’s algorithm.)
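For readers who want to see the two rules side by side, here is a minimal classical sketch (toy data and plain Euclidean distance, chosen by me for illustration; this is not the quantum algorithm of either paper):

```java
public class CentroidVsNearestNeighbor {

    // Euclidean distance between two points in R^N.
    static double dist(double[] a, double[] b) {
        double s = 0;
        for (int i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(s);
    }

    // Centroid (component-wise mean) of a set of points.
    static double[] centroid(double[][] set) {
        double[] c = new double[set[0].length];
        for (double[] p : set)
            for (int i = 0; i < c.length; i++) c[i] += p[i] / set.length;
        return c;
    }

    // Nearest-centroid (NC) rule: label x by the closer class centroid.
    static String classifyNC(double[] x, double[][] A, double[][] B) {
        return dist(x, centroid(A)) <= dist(x, centroid(B)) ? "A" : "B";
    }

    // Nearest-neighbor (NN) rule: label x by the class of the single closest point.
    static String classifyNN(double[] x, double[][] A, double[][] B) {
        double bestA = Double.MAX_VALUE, bestB = Double.MAX_VALUE;
        for (double[] p : A) bestA = Math.min(bestA, dist(x, p));
        for (double[] p : B) bestB = Math.min(bestB, dist(x, p));
        return bestA <= bestB ? "A" : "B";
    }

    public static void main(String[] args) {
        // Toy case where the two rules disagree: A is an elongated class whose
        // centroid (the origin) sits far from both of its members.
        double[][] A = {{0, 10}, {0, -10}};
        double[][] B = {{1, 0}};
        double[] x = {1, 9};
        System.out.println("NC says: " + classifyNC(x, A, B)); // B (B's centroid is slightly closer than A's)
        System.out.println("NN says: " + classifyNN(x, A, B)); // A (the point (0,10) is only ~1.4 away)
    }
}
```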

Assume that red data points are uniformly distributed over red areas and blue data points are uniformly distributed over blue areas. In these 3 examples, the Lloyd method would classify 50% of the red data points as blue whereas the Microsoft method would classify all red data points as red.

April 3, 2014

Life in the Time of the Bayesian Wars, Operation Lisbeth Unveiled

Filed under: Uncategorized — rrtucci @ 1:11 am

In previous blog posts, I described the race for Bayesian supremacy, the Bayesian Wars, that is currently rocking the computer world, especially in the areas that go by the buzz words of artificial intelligence, machine learning, big data, analytics, deep learning, etc. Since quantum computers can perform certain special but important tasks with a better time complexity than classical computers, it is inevitable that quantum computers will enter this fray as soon as they become available. Any empire that fails to acquire such quantum weaponry will surely succumb to its rivals. This blog post is a bird’s-eye view of Operation Lisbeth, my entrant into the quantum Bayesian weapons race.

Lisbeth is a software package, written in Java, for generating code that can be followed by a GATE MODEL quantum computer. The code generated by Lisbeth enables a QC to do certain AI-related tasks, particularly those associated with classical BAYESIAN NETWORK calculations. The algorithms used by Lisbeth are all based on AFGA, a patented improvement of GROVER’S ALGORITHM. As with the original Grover’s algorithm, AFGA-based algorithms are quadratically more efficient (measured by time complexity) than the best available counterpart classical algorithms.
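The linked papers spell out AFGA itself; purely as a point of reference for the "quadratically more efficient" claim, here is the textbook Grover bookkeeping, roughly (pi/4)*sqrt(N/M) iterations for M marked items in a search space of size N (a hedged sketch of standard Grover, not of AFGA):

```java
public class GroverIterations {
    // Textbook Grover's algorithm: each iteration rotates the state by 2*theta toward
    // the marked subspace, where sin(theta) = sqrt(M/N).  The optimal iteration count
    // is about pi/(4*theta) - 1/2, i.e. roughly (pi/4)*sqrt(N/M) for small M/N.
    static long groverIterations(long n, long m) {
        double theta = Math.asin(Math.sqrt((double) m / n));
        return Math.round(Math.PI / (4 * theta) - 0.5);
    }

    public static void main(String[] args) {
        for (int bits = 10; bits <= 40; bits += 10) {
            long n = 1L << bits; // search space of size 2^bits, one marked item
            System.out.printf("N = 2^%d : ~%d Grover iterations (vs ~%d classical queries on average)%n",
                    bits, groverIterations(n, 1), n / 2);
        }
    }
}
```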

SOFTWARE
Lisbeth comprises 5 Java applets. These applets are limited in the types of cases they can handle, but this is because they are meant for demonstration purposes only. The 5 applets are based on a Java class library that is much more general and malleable. The applets are:


  1. qSym calculates a symmetrized version of a given function.
  2. qMobius calculates the Mobius Transform of a given function.
  3. qMargi calculates a marginal probability distribution of a given probability distribution (a classical sketch of this operation appears right after this list).
  4. qMean calculates the mean value of a given function.
  5. qJennings allows one to discover from data the structure of a classical Bayesian network.
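The applets generate quantum circuits; purely as a classical reference point for the task that qMargi addresses (item 3 above), here is a sketch of marginalizing a toy joint distribution over two bits. The numbers are invented for illustration and have nothing to do with the applet's actual interface:

```java
public class MarginalSketch {
    public static void main(String[] args) {
        // Toy joint distribution P(a, b) over two bits, indexed as p[a][b].
        // (Made-up values; they just need to sum to 1.)
        double[][] p = {
            {0.10, 0.20},
            {0.30, 0.40}
        };

        // Marginal P(a) = sum over b of P(a, b).
        double[] pA = new double[2];
        for (int a = 0; a < 2; a++)
            for (int b = 0; b < 2; b++)
                pA[a] += p[a][b];

        System.out.printf("P(a=0) = %.2f, P(a=1) = %.2f%n", pA[0], pA[1]);
        // Prints P(a=0) = 0.30, P(a=1) = 0.70
    }
}
```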


Source code can be found at my website www.ar-tiste.com

Check out my previous post entitled “qJennings Thinks qWatson is a Moron”.

DOCUMENTATION
The algorithms used by the 5 Lisbeth applets, and the interfaces of the applets, are documented in the following 5 papers (the first 4 are on arXiv).

  1. “Quantum Circuit for Calculating Symmetrized Functions Via Grover-like Algorithm”, by R.R. Tucci (abstract)
  2. “Quantum Circuit for Calculating Mobius-like Transforms Via Grover-like Algorithm”, by R.R. Tucci (abstract)
  3. “Quantum Circuit for Calculating Mean Values Via Grover-like Algorithm”, by R.R. Tucci (abstract)
  4. “Quantum Circuit For Discovering from Data the Structure of Classical Bayesian Networks”, by R.R. Tucci (abstract)
  5. “qSym, qMobius, qMargi, qMean and qJennings, 5 Code Generators for Generating Quantum Circuits that Perform Some Artificial Intelligence Related Tasks”, by R.R. Tucci (Not in arXiv yet. You can download pdf here)


The Lisbeth software is protected by 4 pending patents, namely the last 4 entries in this list of patents owned by the Artiste company. Sorry for the patents, but my competitors are very dishonest and I have to protect myself.

Emblem for Operation Lisbeth, the goldfish with the dragon tattoo

March 30, 2014

The Inflationary Quantum Computer

Filed under: Uncategorized — rrtucci @ 9:53 am

Wow-wee! Check out the following article in The Guardian:

MIT Boffins Conjecture that the Inflationary Universe is a Quantum Computer (April 1, 2014)

an excerpt:

In the past, Seth Lloyd, MIT professor of Mechanical Engineering, has conjectured that the Universe is a quantum computer.

More recently, an experiment called BICEP2, conducted at the South Pole, has detected evidence of gravitational waves in the cosmic microwave background. Boffins worldwide believe this to be a smoking gun for a universe with a Big Bang beginning, followed by a brief period of extremely rapid expansion called Inflation, followed by our current, slow-as-molasses rate of expansion.

So now Seth Lloyd and his two collaborators, the identical twins A. Minion and B. Minion, are adding a new wrinkle to their theory. They now claim that the inflationary universe (or, alternatively, the inflationary Everett-Linde multiverse) is a quantum computer.

According to Seth Lloyd, his inflationary theory not only solves the 3 problems which the old, fuddy-duddy inflationary models solve (viz., the horizon problem, the flatness problem, and the magnetic-monopole scarcity problem), but it also solves the conundrum of why P!=NP in our present universe.

According to Lloyd, a field of Dark Pion particles exploded 10^(-36) seconds after the Big Bang. Pions are the quanta of the field of problems that are solvable in Polynomial time. The Dark Pion explosion ended 10^(-32) seconds after the Big Bang. The duration of this explosion is called the Lloyd inflationary period. During the Lloyd inflationary period, many of the most sacred laws of physics were violated with impunity: information traveled faster than the speed of light, energy was not conserved, even P!=NP did not hold.

Let R be the radius of the bubble of incompressible quantum information and quantum entanglement that is our universe. Lloyd believes that during the Lloyd inflationary period, R grew at a rate far higher than the speed of light. This was possible because during that period P=NP, so the universe was able to perform calculations at a prolific, promiscuous rate, called the “Twitter Rate Upper Bound” (named after Oxford U. Prof. Robin Twitter).

Yakov B. Zeldovich once said that the Universe is the poor man’s accelerator. Lloyd likes to add that the Universe is the poor man’s inflationary quantum computer.

Alan Guth often says that cosmic inflation is the ultimate free lunch. Lloyd reflects that cosmic inflation is the ultimate free, open-source, life-giving-app of the cosmic quantum computer.

March 29, 2014

“You have to think like a high energy physicist”

Filed under: Uncategorized — rrtucci @ 4:51 am

The title of this blog post is a quote of John Martinis, taken from the following talk he gave at Google LA in October of last year. The talk has only recently (Feb 28) been put on the web. Check it out. It’s about 1 hr long. I highly recommend it. It’s crystal clear and covers quite a lot of territory. If, after you finish watching it, you want even more detail, I recommend Martinis’ website at UCSB, where he has links to all the doctoral theses of his past students. Doctoral theses are a treasure trove of information.

I predict that John Martinis’ QC designs will totally dominate quantum computer hardware R&D for the next 5-10 years.

D-Wave’s QC is fine and I wish them well. However, I specialize in writing QC software, and all the QC software I’ve ever written is for gate model QCs, not adiabatic QCs, so my heart is invested in the gate model. Perhaps history will prove that this is a good analogy:
D-Wave QC ~ analog computer,
Martinis’ QC ~ first vacuum tube computer like the Eniac.

There are other gate model QC designs out there besides Martinis’, but his design is already FAR more advanced than those of his competitors. Furthermore, his qubits are bigger (100 microns) and therefore easier to connect to with leads than those of competing QC designs (e.g., ion traps). I never liked the fact that with the ion trap design you have to physically move the qubits from location to location. Smells too mechanical to me. Mechanical means are always slower than electrical means. And they also break down more quickly.

Here are some brief notes I took from Martinis’ Google talk. I just wrote down some numerical values. I didn’t write down anything about all his wonderful explanations and diagrams of the physics and math that describes his QC.

What would Martinis need to factor a 2048-bit number?

2E8 physical qubits,
10 MWatt,
1 day runtime,
cost = $100-500 million,
length of machine = 1-10 meters

Fault Tolerant Error Correction With Surface Code

How efficient is surface code?
Let
r_n = (number of physical qubits)/(number of logical qubits)

r_p = p/p_max = (probability of error per physical gate)/ (maximum allowed probability of error). p_max = 1E-2 for surface code. Other, less forgiving codes have p_max = 1E-3.

r_e = number of errors per unit of time

Then
r_p = 1E-1, r_e = 1/(age of universe) ===> r_n = 3000
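To see where a few-thousand-to-one ratio like that comes from, here is a back-of-the-envelope sketch using standard surface-code heuristics (logical error rate per cycle of order (p/p_max)^((d+1)/2), and roughly 2*d^2 physical qubits per logical qubit at code distance d). The constants and the 1E-20 error budget are textbook-style assumptions of mine, not numbers from the talk, so treat the output as an order-of-magnitude check only:

```java
public class SurfaceCodeOverhead {
    public static void main(String[] args) {
        // Back-of-the-envelope only.  Assumptions (mine, not Martinis' exact model):
        //   logical error rate per cycle ~ (p/p_max)^((d+1)/2)  for code distance d,
        //   physical qubits per logical  ~ 2 * d * d.
        double rP = 1e-1;      // p/p_max, as in the notes above
        double target = 1e-20; // illustrative "roughly once per age of the universe" error budget

        // Number of suppression steps (d+1)/2 needed so that rP^steps reaches the target.
        long steps = Math.round(Math.log(target) / Math.log(rP)); // ~20 here
        long d = 2 * steps - 1;                                   // smallest odd code distance, ~39

        System.out.println("code distance d        ~ " + d);
        System.out.println("r_n (physical/logical) ~ " + (2 * d * d));
        // Prints d ~ 39 and r_n ~ 3042, the same ballpark as the ~3000 quoted above.
    }
}
```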

Type of technology?
standard integrated circuit technology,
Al/Al-O/Al Josephson junctions on sapphire substrate
frequency = 5 GHz (= 240 mK, 2E-5 eV, 6E-2 meters)
temperature = 20 mK
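The parenthetical equivalences for the 5 GHz qubit frequency follow from E = h*f, T = E/k_B and lambda = c/f; here is a quick check of the conversions (standard constants, my own rounding):

```java
public class QubitFrequencyUnits {
    public static void main(String[] args) {
        // Convert the 5 GHz qubit frequency quoted above into an equivalent
        // temperature, photon energy, and free-space wavelength.
        double f = 5e9;        // Hz
        double h = 6.626e-34;  // Planck constant, J*s
        double kB = 1.381e-23; // Boltzmann constant, J/K
        double c = 2.998e8;    // speed of light, m/s
        double eV = 1.602e-19; // J per eV

        System.out.printf("T      = %.0f mK%n", 1e3 * h * f / kB); // ~240 mK
        System.out.printf("E      = %.1e eV%n", h * f / eV);       // ~2E-5 eV
        System.out.printf("lambda = %.2f m%n", c / f);             // ~6E-2 m
    }
}
```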

His CNot quality?
40ns, 99.45% efficiency

Size of his current research group?

about 50 researchers, 1 Pomeranian dog

Why surface code?
“Surface code really looks quite ideal for building integrated circuits” because
(1) only 2D nearest neighbor interactions are required
(2) It has the highest p_max (i.e., it is the most forgiving) of all known error correction codes

Martinis’ 5 year plan?
Over the next 5 years, demonstrate a 1000-physical-qubit machine with the control electronics at room temperature, outside the chip and the cryostat.

Must bring control lines to qubits. Control lines separated by 100 microns. Have to bring 100-1000 control lines to edges of wafer.

Near the 56:10 mark, Martinis says: “You have to think like a high energy physicist” (LHC detectors have a huge number of wires).

Martinis’ 10 year plan?
After achieving a 1000-physical-qubit machine, Martinis plans to put the control electronics right on the chip. Superconducting IC technology for doing this already exists and has existed for many decades; recent advances in it were made by D-Wave.

March 15, 2014

Craig Venter Dreams About Quantum Computers

Filed under: Uncategorized — rrtucci @ 8:47 pm

Recently (Oct. 21, 2013), Craig Venter was interviewed by Charlie Rose on Rose’s TV show. They discussed the contents of Venter’s new book, “Life at the Speed of Light”. When asked by Rose which of his many scientific achievements he was proudest of, Venter said (I’m saying it in my own words, not his) that he was proudest of having pioneered a way of doing scientific research that is faster than the conventional way normally used by academia and publicly funded research.

Venter certainly has a very good track record of doing just that: setting up large, highly effective, quick paced, privately and publicly funded, mini-Manhattan projects that bring together a vast array of very talented scientists and engineers.

Let me review some history in case you have forgotten it or never learned it. The privately funded company Celera, founded by Craig Venter when he couldn’t get funding from the NIH, put fire under the feet of the publicly funded Human Genome Project (HGP), causing HGP to finish mapping the human genome years ahead of schedule. Furthermore, HGP did so using the shotgun approach to mapping genomes, an approach which is today the de facto standard for doing this, but which HGP had been deprecating before Venter’s competition caused it to adopt the approach. Celera and HGP published the first-ever human genome maps simultaneously (one day apart) in Feb 2001.

To me, this is a great example of how private industry can accelerate scientific progress by providing competition and/or additional funding to publicly funded research.

Check out the following article in which Venter, now 68 years old, admits to having wet dreams about using quantum computers for genetic research.

Science: Can we extend healthy life?
(by Clive Cookson, FT magazine, March 14, 2014)

excerpts:

Craig Venter, who became a scientific celebrity by sequencing the human genome in the 1990s and then moved on to microbial synthesis, is returning to human genomics on a grand scale.

He has set up a company in San Diego called Human Longevity Inc or HLI, which “will build the largest human [DNA] sequencing operation in the world”. As the name suggests, HLI aims to discover how we age in order to improve health as the process takes hold.

The DNA reading technology comes from Illumina, the genetic instrumentation maker, whose latest HiSeq X Ten machines can sequence tens of thousands of human genomes a year, each containing three billion letters of genetic code. Illumina has joined a group of wealthy private investors in putting up $70m to fund HLI for its first 18 months.

To make sense of these various components – human genomes, microbiomes, metabolomes, stem cell science and, of course, participants’ health records – will require data analysis on an epic scale. Venter believes his computers will be up to the task but he is not overconfident. “That’s not clear yet,” he admits. “A quantum computer would be ideal for this but we can’t wait for quantum computing to solve these problems.”

Other organisations in the public and private sectors in the US, Europe and China are embarking on similar projects to sequence 100,000 or more genomes and relate these to participants’ health records. But none has the depth and breadth of HLI, Venter maintains.

The most mysterious venture is Calico, which Google launched last September with an apparently similar mission to HLI, to extend healthy lifespan. Google has released few details about Calico and Venter still knows little about its activities.

I think it’s a sure thing that QCs will eventually be indispensable to genomics. I believe this can occur in the next 10 years if Venter tactics are used to accelerate QC development. I advise Craig Venter, his coworkers, and Illumina workers to watch this YouTube video. It’s a QC talk given by John Martinis at Google LA in Oct 2013. I sure hope Martinis and Venter become friends if they aren’t already.

I’ve mentioned Martinis many times before in this blog. I will say more about this superb Martinis Video in a future post.

March 14, 2014

P, NP, Sad Tale

Filed under: Uncategorized — rrtucci @ 3:47 am

Recently, Scott Aaronson posted on his blog 2 nice teaser posts about Computational Complexity Theory.

In his first post, Scott discusses 2 papers. One paper, by Lenny Susskind, proposes that the number of elementary operations in a quantum circuit be used as a clock, to measure time intervals. The second paper, by Terry Tao, uses complexity theory to study the solutions of the Navier-Stokes equation.

In his second post, Scott tries to explain to the non-specialist why he believes that P\neq NP.

I know next to nothing about complexity theory, so you better take anything I say next with a grain of salt.

NP = problems whose proposed solutions can be verified in poly time.

P = problems that can be solved in poly time. P \subseteq NP. P problems are the easiest-to-solve problems in NP.

P and NPcomplete are both subsets of NP. They are believed to be disjoint.

NPcomplete = NPhard \cap NP. NPcomplete problems are the hardest-to-solve problems in NP, but their solutions can still be verified in poly time. The 3-SAT problem is an element of NPcomplete.

If P and NPcomplete intersect, then P = NP; the two sets collapse into each other and into the whole NP set (give or take some trivial exceptions). This doomsday scenario is called P=NP.
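To make the “verified in poly time” half of the story concrete, here is a toy 3-SAT verifier (the clause encoding is my own choice for this sketch). Checking a proposed assignment takes time linear in the size of the formula; the hard part, if P != NP, is finding the assignment in the first place:

```java
public class ThreeSatVerifier {
    // A clause is 3 integer literals: +k means variable x_k, -k means NOT x_k.
    // (Encoding chosen just for this sketch.)
    static boolean satisfies(boolean[] assignment, int[][] clauses) {
        for (int[] clause : clauses) {
            boolean clauseTrue = false;
            for (int lit : clause) {
                boolean value = assignment[Math.abs(lit)];
                if (lit > 0 ? value : !value) { clauseTrue = true; break; }
            }
            if (!clauseTrue) return false; // one falsified clause sinks the formula
        }
        return true; // every clause has at least one true literal
    }

    public static void main(String[] args) {
        // (x1 OR x2 OR NOT x3) AND (NOT x1 OR x3 OR x4)
        int[][] clauses = {{1, 2, -3}, {-1, 3, 4}};
        boolean[] assignment = {false, true, false, false, true}; // index 0 unused
        System.out.println(satisfies(assignment, clauses)); // true
        // Verification is linear in the number of literals; finding a satisfying
        // assignment among the 2^n candidates is the (conjecturally) hard part.
    }
}
```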

In one of his posts, Scott compares the boundary between sets P and NPcomplete to an “electrified fence” for electrocuting frogs. Raoul Ohio observes in the comments that the P, NPcomplete separation reminds him of the phenomenon of “eigenvalue avoidance”, wherein a random matrix rarely has two degenerate eigenvalues. Raoul’s comment moved me to tears and inspired me to write the following very sad tale.

The Sad Tale Of The Two Sets That Once Kissed and Then Parted

Matrix M(t) had two eigenvalues x_1(t) and x_2(t) where t\in[0,1]. Let

Set_1 = \{ (t, x_1(t)) : t\in[0,1]\}
Set_2 = \{ (t, x_2(t)) : t\in[0,1]\}

Originally, everyone thought that there was no symmetry in the system that M(t) described, which meant that Set_1 and Set_2 did not intersect.

Then someone discovered an “accidental” symmetry that led to an “accidental degeneracy”, which led the two sets to kiss at a single point.

Then someone realized that the original model was too naive and that in real life there is a perturbation that breaks the accidental symmetry, so an eigenvalue “gap” develops, which led the two sets to stop kissing each other and part forever.

THE END
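For anyone who wants to see the eigenvalue avoidance numerically, here is a two-by-two toy model (my own choice of M(t), not anything from Scott’s post or Raoul’s comment): with the symmetry-breaking off-diagonal term eps set to zero, the two eigenvalue curves kiss at t = 1/2; with eps nonzero, a gap of 2*eps opens and they part forever.

```java
public class AvoidedCrossing {
    // Eigenvalues of the 2x2 symmetric matrix  M(t) = [[t, eps], [eps, 1 - t]]:
    //   x_{1,2}(t) = 1/2 -/+ sqrt((t - 1/2)^2 + eps^2)
    // eps = 0  -> the curves cross (kiss) at t = 1/2;
    // eps != 0 -> they repel, with a minimum gap of 2*eps.
    static double[] eigenvalues(double t, double eps) {
        double r = Math.sqrt((t - 0.5) * (t - 0.5) + eps * eps);
        return new double[]{0.5 - r, 0.5 + r};
    }

    public static void main(String[] args) {
        for (double eps : new double[]{0.0, 0.05}) {
            System.out.println("eps = " + eps);
            for (double t = 0.3; t <= 0.7001; t += 0.1) {
                double[] x = eigenvalues(t, eps);
                System.out.printf("  t = %.1f : x1 = %.3f, x2 = %.3f, gap = %.3f%n",
                        t, x[0], x[1], x[1] - x[0]);
            }
        }
    }
}
```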


March 2, 2014

US Patent Office and Rip Van Winkle

Filed under: Uncategorized — rrtucci @ 7:49 pm

For my non-American readers, Rip Van Winkle, written by Washington Irving, is an American folk tale in which a man wakes up after sleeping for 20 years.

Last Thursday (Feb. 27, 2014), I submitted 4 patent applications to the US Patent and Trademarks Office (USPTO). I will soon (in the next two weeks) post all 4 of them (plus supporting software and its documentation) at my website. They cover what I have referred to in previous blog posts as Operation Lisbeth, or the goldfish with the dragon tattoo. They deal with the use of quantum computers to do artificial intelligence and big data. I won’t say any more about them here, in this blog post. I’ll do that in future blog posts over the next few weeks. Instead, I’d like to use this blog post to praise effusively the Patent Office for the enormous strides it has made in modernizing its online submission systems.

According to this article, the USPTO first launched its EFS (electronic filing system) in March 2006. Last Thursday, I had the pleasure of using it for the first time, and it worked flawlessly and painlessly for me. I love it.

I was able to do everything I needed to do for my 4 patent applications, file all necessary documents and pay all fees, completely electronically and online, without ever using any paper copies or snail-mail.

Basically, as long as you can turn a document into pdf or txt format, you can submit it (with some minor exceptions). I typed all my documents in LaTeX and turned them into pdf using the Windows application WinEdt. I drew all my figures using the application Inkscape, which lets you save your drawings as pdf.

In the past, to submit an appendix containing computer source code, you had to mail the Patent Office a CD (Compact Disc) with the stuff. Now you can create a single txt file containing all your source code and send them that electronically. Vast improvement. (I used a free application called TXTcollector to create the required single text file from all my separate .java files)

In the past, for what is called the Information Disclosure Statement, you had to mail to the Patent Office a paper copy of each of your references. Now you can just send them electronically a pdf copy of your references. Much, much easier.

It’s easy nowadays to convince oneself that the US government is declining dangerously. So I find some solace in the fact that the Patent Office appears to have bucked that trend and improved significantly in the past 7 years or so. Like Rip Van Winkle, it is an institution that is waking up after being deeply asleep and behind the times for many years.

I have only one minor quibble. They still don’t allow LaTeX submissions and generate the pdf themselves from that, the way arXiv does. This means that they still retype the patent from its pdf version. If they allowed LaTeX submissions, they could do what most physics and engineering journals have done for the last 15 years: add a few reformatting commands to the LaTeX source and publish that, without any need to ask a human to retype things, which is boring for the re-typist and introduces a lot of typos. Of course, adding this LaTeX capability to their EFS is still possible, and it would be a natural next step in their path towards improvement.

February 22, 2014

YouTube Video From The Future, BQP

Filed under: Uncategorized — rrtucci @ 1:25 pm

I was browsing through some videos from the year 2020 and I found this one which might be of interest to my readers:


Transcript of the video:

Hello, I am Scott Aaronson, the director of BQP (British Quantum Petroleum, a subsidiary of British Petroleum). Two years ago, BP made a commitment to quantum computing and it plans to keep that commitment. BP has already invested 1 billion dollars in my quantum computing company called BQP, which has 50 employees. Once upon a time, Facebook spent 19 times as much money for the same number of employees and we all know how that ended up.

BQP’s offices are located on the Gulf coast. So BP is continuing to keep its commitment to the Gulf by bringing high-tech, high-paying American jobs there. Today, the beaches of the Gulf are restored, and some areas are reporting their best tourism season in years.

Is this another Silicon Valley cockamamie idea?

Absolutely not.

Silicon Valley is not the quickest learner in the class, but it has learned a lot from the AOL/Time-Warner, Facebook/WhatsApp, and Google/D-Wave mergers. Furthermore, BP is not a Silicon Valley insider. BP wants to add sanity to Silicon Valley. BP has a track record of prudent, successful investments outside of Silicon Valley. BP is, after all, the 5th largest company in the world. In 2013, BP had 396 billion dollars in revenue, and 85,700 employees. BP plans to succeed where the Google/D-Wave merger failed, by harnessing the superpowers of Complexity Theory, the BQP complexity class and BosonSampling.

February 13, 2014

Particle Physics and Quantum Computing, Two Parallel Worlds

Filed under: Uncategorized — rrtucci @ 10:19 pm

Welcome, aficionados of Sociology and Anthropology.

Quoth Merriam-Webster:

  • Sociology- the science of society, social institutions, and social relationships; specifically : the systematic study of the development, structure, interaction, and collective behavior of organized groups of human beings.

  • Anthropology- the science of human beings; especially : the study of human beings and their ancestors through time and space and in relation to physical character, environmental and social relations, and culture.

It seems to me that lots of parallels can be drawn between how Quantum Computing society and culture are developing and how the older, more mature (some would say senile) Particle Physics society and culture developed. It’s like two Shakespearean plays whose characters and plots bear a strong resemblance to each other, as if the bard of Avon were cribbing from his younger self because he had run out of ideas. Like a Dan Brown (Yuck!).

Let me explain with a picture:

[Image: parallel-worlds]

January 30, 2014

Our Expanding Fan Base

Filed under: Uncategorized — rrtucci @ 4:00 pm

We are proud to announce that this blog has received a ringing endorsement from the eminent, platinum frequent flyer, MIT Professor Scott Aaronson in his blog Shtetl Optimized:

Check out comment #180 here. Quote:

And Michael #179, I became better able to appreciate what humor there occasionally is in rrtucci’s snarks once I stopped looking for intellectual coherence in them!

Wow, we have heard Scott lay the accolades on thick before, for Sean Carroll, Max Tegmark and Christopher Fuchs (“I read the samizdat as a beginning graduate student, and it changed my career.”), but nothing like this. Those people must now be mighty envious, because Scott didn’t go the extra mile to praise them as he has done for Quantum Bayesian Networks, the first quantumly, bayesianly incoherent blog, the diary of a potted plant (with an IQ to match that of a potted plant).
