A question I am often asked is: what is the difference between tensor networks and quantum Bayesian networks, and is there any advantage to using one over the other?

When dealing with probabilities, I prefer quantum Bayesian networks, because B nets are a more natural way of expressing probabilities (and probability amplitudes), whereas tensor nets can denote many physical quantities other than probabilities, so they are not tailor-made for the job the way B nets are. Let me explain in more detail for the technically inclined.

In Fig. 1, I show the common way of denoting tensors graphically. In particular, I show a tensor T with two incoming arrows and two outgoing arrows. The incoming arrows are denoted by upper indices and outgoing arrows by lower indices. The indices are enumerated counter-clockwise starting at the “stub” of the node. When one takes the complex conjugate of the tensor, the lower indices are raised and the upper ones lowered, which corresponds to changing the direction of all the arrows.
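The arrow-reversal convention is easiest to check in the simplest case: a tensor with one incoming and one outgoing arrow is just a matrix, and conjugating the entries while swapping the upper and lower index is the Hermitian conjugate. A minimal sketch in plain Python (the tensor entries are made up for illustration):

```python
# A rank-(1,1) tensor (a matrix) U with one upper (incoming) and
# one lower (outgoing) index, stored as U[upper][lower].
U = [[1 + 2j, 0 - 1j],
     [3 + 0j, 2 - 2j]]

def arrow_reverse(T):
    """Complex-conjugate every entry and swap the roles of the
    upper and lower index (i.e., reverse all the arrows)."""
    n, m = len(T), len(T[0])
    return [[T[i][j].conjugate() for i in range(n)] for j in range(m)]

Udag = arrow_reverse(U)
# For a matrix, arrow reversal is the Hermitian conjugate:
assert all(Udag[j][i] == U[i][j].conjugate()
           for i in range(2) for j in range(2))
```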

In Fig. 2, I show the same tensor as in Fig. 1, but expressed as a Bayesian network of 3 nodes called 1, 2, 3. **In general, a single tensor node can always be viewed as a B net composed of several B net nodes.** Nodes 2 and 3 in Fig. 2 are what I like to call “marginalizer nodes”. (The deltas denote Kronecker delta functions.) I first used marginalizer nodes in my first quantum Bayesian networks paper in 1997, and I also used them in my quantum B net program Quantum Fog. I’ve been using marginalizer nodes for more than 15 years to solve exactly this problem: how to express a tensor node as a B net composed of several B net nodes.
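The role of a marginalizer node can be sketched numerically: its transition matrix is a Kronecker delta that copies out one component of a composite parent state. A minimal sketch, assuming a parent node whose state is a pair of bits with a made-up joint distribution; the two delta nodes play the roles of nodes 2 and 3:

```python
# Node 1: a tensor node whose state is the composite (a, b),
# with a made-up joint distribution P(a, b) over two bits.
P_joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def delta(x, y):
    """Kronecker delta: the transition matrix of a marginalizer node."""
    return 1.0 if x == y else 0.0

# Node 2: marginalizer extracting the first component a.
# P(a) = sum over (a', b') of delta(a, a') * P(a', b')
P_a = {a: sum(delta(a, ap) * p for (ap, bp), p in P_joint.items())
       for a in (0, 1)}

# Node 3: marginalizer extracting the second component b.
P_b = {b: sum(delta(b, bp) * p for (ap, bp), p in P_joint.items())
       for b in (0, 1)}

# Contracting with the delta is just marginalization.
assert abs(P_a[0] - 0.3) < 1e-12 and abs(P_b[1] - 0.6) < 1e-12
```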

In Fig. 3, I show the chain rule for a probability distribution with 4 variables. Then I show the B net that corresponds to that chain rule. As you can see, B nets are tailor-made for denoting the chain rule for probabilities. That’s what I mean when I say that B nets are a more natural way of expressing probabilities graphically than tensor networks are. If you try to express the same chain rule using tensors, it looks quite unnatural (some might say, ugly).
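The chain rule of Fig. 3 can be checked numerically. A minimal sketch with four binary variables and a made-up joint distribution: each B net node stores one conditional in the chain P(x1)P(x2|x1)P(x3|x1,x2)P(x4|x1,x2,x3), and multiplying the node entries recovers the joint:

```python
from itertools import product
import random

random.seed(0)

# A made-up joint distribution P(x1, x2, x3, x4) over four bits.
states = list(product((0, 1), repeat=4))
w = [random.random() for _ in states]
Z = sum(w)
P = {s: wi / Z for s, wi in zip(states, w)}

def marg(P, keep):
    """Marginal over the variable positions listed in `keep`."""
    out = {}
    for s, p in P.items():
        key = tuple(s[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

m1, m12, m123 = marg(P, [0]), marg(P, [0, 1]), marg(P, [0, 1, 2])

# Chain rule: P(x1,x2,x3,x4) = P(x1) P(x2|x1) P(x3|x1,x2) P(x4|x1,x2,x3).
for (x1, x2, x3, x4), p in P.items():
    prod = (m1[(x1,)]
            * m12[(x1, x2)] / m1[(x1,)]
            * m123[(x1, x2, x3)] / m12[(x1, x2)]
            * p / m123[(x1, x2, x3)])
    assert abs(prod - p) < 1e-12
```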

Another advantage of quantum B nets over tensor networks is that one can express classical and quantum processes with the same diagram; the only difference is that the nodes in a classical B net stand for conditional probabilities, whereas in a quantum B net they stand for probability amplitudes. I’ve been able to generalize many of the results of classical B nets to quantum B nets (for example, d-separation and Pearl’s causality do-calculus).
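This point can be made with the smallest possible net, a two-node chain A → B. A minimal sketch (the node matrices below are made up for illustration): the same diagram is used in both cases, but the classical nodes hold conditional probabilities, while the quantum nodes hold amplitudes that are summed over the internal node and only then squared, which allows interference:

```python
import math

# Classical B net A -> B: node matrices are conditional probabilities.
P_a = {0: 0.5, 1: 0.5}
P_b_given_a = {(0, 0): 0.9, (1, 0): 0.1,        # keys are (b, a)
               (0, 1): 0.2, (1, 1): 0.8}
P_b = {b: sum(P_b_given_a[(b, a)] * P_a[a] for a in (0, 1))
       for b in (0, 1)}

# Quantum B net, same diagram: node matrices are probability amplitudes.
s = 1 / math.sqrt(2)
A_a = {0: s, 1: s}
A_b_given_a = {(0, 0): s, (1, 0): s,
               (0, 1): s, (1, 1): -s}
# Sum amplitudes over the internal node first, THEN square.
Q_b = {b: abs(sum(A_b_given_a[(b, a)] * A_a[a] for a in (0, 1))) ** 2
       for b in (0, 1)}

assert abs(sum(P_b.values()) - 1.0) < 1e-12
assert abs(sum(Q_b.values()) - 1.0) < 1e-12
# Interference: Q_b[1] vanishes even though every amplitude is nonzero.
assert Q_b[1] < 1e-12
```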

I predict that in the future, at least for some tasks, quantum computers will be programmed first using quantum Bayesian networks, and those nets will later be translated by a computer program called a compiler into quantum circuits. In fact, one of my first quantum Bayesian net papers was entitled: “How to compile a quantum Bayesian Network”.

Hello again, and thanks! This is clarifying to me, and now I understand your marginalizer nodes. What I find interesting in the tensor network picture of Bayesian nets is that the tensor product is the numerical, concrete, and tangible instance of the corresponding abstract product giving the monoidal structure of the category-theoretic string diagram expressing the tensor net. In the quantum case, the nLab points to dagger-compact categories as the setting for finite quantum systems [*]; classically, the star in star-autonomous categories is the duality in finite-dimensional real vector spaces (but here only row-stochastic matrices must be allowed for rank-(1,1) tensors, and there must be an analogous restriction for higher-rank tensors). So both cases have their string diagrams.

[*] http://ncatlab.org/nlab/show/finite+quantum+mechanics+in+terms+of+dagger-compact+categories

Comment by Jesús López — March 30, 2015 @ 8:53 pm

Thanks, Jesús. Where are you from? I grew up in Puerto Rico.

Comment by rrtucci — March 31, 2015 @ 12:40 am

Well, I didn’t know you spoke Spanish. I’m from mother Spain, and a vocational category theorist.

Comment by Jesús López — March 31, 2015 @ 10:35 am