Apollo 11 mission:
Duration: 8 days, round trip.
Lunar Module (lunar lander): the Eagle.
Touchdown site: Sea of Tranquility.
Neil Armstrong: “Houston, Tranquility Base here. The Eagle has landed.”
Neil Armstrong: “That’s one small step for [a] man, one giant leap for mankind.”
Neil Armstrong has been quoted as saying that at the time he thought his chances of surviving the Apollo 11 mission were 50/50. Yet he did his job without flinching. Truly a man made of the right stuff!
2009 is the 40th anniversary of the Apollo 11 mission. It’s also the zeroth anniversary of my new paper on Quantum Gibbs Sampling. One small step for mankind, one giant leap for Bob (me).
My new paper (Ref. Tuc3 below) extends two previous papers of mine (Tuc1, Tuc2). Papers by other workers that particularly inspired me in this work are Refs. Som1 and Woc1 on quantum simulated annealing. All these papers fall under the general category of quantum computerizations of MCMC (Markov Chain Monte Carlo), a subject that I find extremely interesting. Why, you ask? (Okay, maybe you didn’t ask, but let me tell you anyway.)
MCMC allows you to sample probability distributions. If you’ve ever needed to sample a simple probability distribution P(x) defined for x in a real interval, you are probably familiar with the inverse transform method. Unfortunately, this method requires finding the cumulative distribution function and inverting it, a task which is very difficult for many probability distributions. Suppose, for example, that your probability distribution P(x) is defined for x = (x_1, x_2, …, x_n), where each x_j can assume values in a discrete set S_j. (Such a distribution can be described as a classical Bayesian network.) Unfortunately, the inverse transform method does not work for sampling most B. nets. One method that does work and is very popular is called MCMC (this includes Gibbs sampling, Metropolis-Hastings sampling and Simulated Annealing). If you are new to this subject, a wonderful introduction can be found in the chapter entitled “Monte Carlo Methods” of David MacKay’s magnificent, highly intuitive book “Information Theory, Inference, and Learning Algorithms” (available for free in electronic pdf form, or for money in paper form).
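To make the accept/reject idea concrete, here is a minimal sketch (in Python, with made-up helper names; this is classical random-walk Metropolis-Hastings, not the quantum method of any of the papers below) for a one-dimensional target known only up to normalization:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis: propose y = x + N(0, step), accept with
    probability min(1, p(y)/p(x)). The proposal is symmetric, so the
    Hastings correction factor cancels."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step)
        # Accept/reject in log space; the tiny offset avoids log(0).
        if math.log(rng.random() + 1e-300) < log_p(y) - log_p(x):
            x = y  # accept the proposed move
        samples.append(x)  # on rejection, the old point is repeated
    return samples

# Target: a standard normal, specified only up to its normalization.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=50_000)
mean = sum(samples) / len(samples)
var = sum(s * s for s in samples) / len(samples) - mean * mean
```

The same chain-of-local-moves idea drives Gibbs sampling (where proposals are drawn from exact one-variable conditionals, so they are always accepted) and simulated annealing (where the target is sharpened over time).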
Why is sampling a probability distribution P(x) (i.e., a classical B. net), where x lives in a multidimensional domain, so difficult? A domain with a large number of dimensions is huge. The majority of the samples come from those regions of the domain where the probability is high. But finding those regions requires that we explore the entire domain!
Why is sampling a probability distribution useful? If you have a sample of points x^(1), x^(2), …, x^(N), distributed according to P(x), then you can evaluate approximately the expected value of any function f(x) using

E[f(x)] ≈ (1/N) ∑_{i=1}^{N} f(x^(i))
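In code, that estimator is just a sample average. A tiny illustration (using a uniform sampler, so the exact answer is known):

```python
import random

def mc_expectation(f, sampler, n):
    """Approximate E[f(x)] by the sample average (1/N) * sum_i f(x_i)."""
    return sum(f(sampler()) for _ in range(n)) / n

rng = random.Random(1)
# E[x^2] for x uniform on [0, 1] is exactly 1/3.
est = mc_expectation(lambda x: x * x, rng.random, 100_000)
```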
You can also approximate conditional probability distributions P(h|e), where h is the subset of (x_1, …, x_n) which constitutes your Hypothesis variables, and e is the subset of (x_1, …, x_n) which constitutes your Evidence variables. Such Bayesian probability distributions are very useful for making inferences (in AI, bioinformatics and data mining, for example).
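As a toy illustration (the two-node network and all names here are made up for this sketch), P(h|e) can be estimated from joint samples by keeping only those consistent with the evidence:

```python
import random

def conditional_estimate(samples, h_match, e_match):
    """Estimate P(h|e): among samples matching the evidence e,
    return the fraction that also match the hypothesis h."""
    e_hits = [s for s in samples if e_match(s)]
    return sum(1 for s in e_hits if h_match(s)) / len(e_hits)

# Toy two-node B. net: rain -> wet grass.
rng = random.Random(2)
def draw():
    rain = rng.random() < 0.3
    wet = rng.random() < (0.9 if rain else 0.1)
    return {"rain": rain, "wet": wet}

samples = [draw() for _ in range(200_000)]
# Exact answer by Bayes' rule: P(rain|wet) = 0.27 / 0.34, about 0.794.
p_rain_given_wet = conditional_estimate(
    samples, lambda s: s["rain"], lambda s: s["wet"])
```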
Why is quantum computerization of MCMC useful? Speed! Typically, for any precision ε > 0, a quantum computer can sample a B. net with precision ε in O(1/√δ) steps, whereas it takes a classical computer O(1/δ) steps to achieve the same precision. Here δ is the distance between the two largest eigenvalue magnitudes of the Markov chain being used.
References:
- Tuc1 – R. R. Tucci, “Use of a Quantum Computer to do Importance and Metropolis-Hastings Sampling of a Classical Bayesian Network”, arXiv:0811.1792
- Tuc2 – R. R. Tucci, “Code Generator for Quantum Simulated Annealing”, arXiv:0908.1633
- Tuc3 – R. R. Tucci, “Quantum Gibbs Sampling Using Szegedy Operators”, arXiv:0910.1647
- Som1 – R. Somma, S. Boixo, H. Barnum, “Quantum Simulated Annealing”, arXiv:0712.1008
- Woc1 – Pawel Wocjan, Anura Abeyesinghe, “Speed-up via Quantum Sampling”, arXiv:0804.4259