I was Googling last night to see if I’m crazy. Well, maybe I am, but specifically about my perception of the frequentist tendencies of complexity theorists. And I found this funny (in an understated way) post on Larry Wasserman’s blog “Normal Deviate”. Check it out:

“Aaronson, COLT, Bayesians and Frequentists”(May 5, 2013)

Excerpts:

I am reading Scott Aaronson’s book “Quantum Computing Since Democritus”

…

Much of the material on complexity classes is tough going but you can skim over some of the details and still enjoy the book. (That’s what I am doing.) There are at least 495 different complexity classes: see the complexity zoo. I don’t know how anyone can keep track of this.

…

So my claim is that computational learning theory is just the application of frequentist confidence intervals to classification.

There is nothing bad about that. The people who first developed learning theory were probably not aware of existing statistical theory so they re-developed it themselves and they did it right.
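Wasserman’s claim can be made concrete with a small sketch. A PAC-style generalization guarantee is, in essence, a frequentist confidence interval for a classifier’s error rate, obtained from Hoeffding’s inequality. The toy classifier and thresholds below are invented for illustration, not taken from either blog:

```python
import math
import random

def hoeffding_radius(n, delta):
    """Half-width of a (1 - delta) confidence interval for the mean of
    n i.i.d. [0, 1]-valued samples, via Hoeffding's inequality."""
    return math.sqrt(math.log(2.0 / delta) / (2.0 * n))

# Toy setup: predict label 1 iff x > 0.5, but the true label is 1 iff
# x > 0.4, so the true error rate is 0.1 (the mass of x in (0.4, 0.5]).
random.seed(0)
n = 10_000
errors = 0
for _ in range(n):
    x = random.random()
    pred = 1 if x > 0.5 else 0
    label = 1 if x > 0.4 else 0
    errors += (pred != label)

emp_err = errors / n
radius = hoeffding_radius(n, delta=0.05)
print(f"empirical error {emp_err:.3f} +/- {radius:.3f} (95% confidence)")
```

The statement “with probability at least 95%, the true error lies within `radius` of the empirical error” is exactly a frequentist confidence interval; uniform versions of the same bound over a hypothesis class are what learning theory adds.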


Interesting thought, Robert. I think you may be on to something. One of the wonders of contemporary physics is how the advances in statistics and machine learning of the last fifty years completely passed it by. Physics has the sophistication of statistics in 1930 and has not budged since. Perhaps the same is true of complexity theorists?

One of the more interesting works post-1930 was R.T. Cox’s “The Algebra of Probable Inference”, in which he showed why the Bayesian perspective makes sense as a “generalized logic”. In that work, Cox demonstrated that the laws of probability are uniquely determined once you adopt a few simple desiderata of simplicity and consistency regarding the use of “indeterminate” truth values. Jaynes later recognized that the “fuzzy logic” of Zadeh and co. in the 1960s was simply an attempt to do the same without using the word probability. It seems that scientists must constantly re-invent the wheel, only they generally add edges and flat portions so it turns less smoothly. In many quarters, wobbly wheels are regarded as “progress” over smooth ones.

So far as the present topic is concerned, I think you are on to something rather deep. Complexity theorists clearly do not understand the Bayesian revolution. Beyond that, they have failed to comprehend that the so-called “quantum computing revolution” is just another example of men struggling to free themselves from “fixed truth values”. In attempting to go beyond the one and zero of truth and falsehood, they invented for themselves the “qubit”, which could be “both at once”. However, the same folks cannot free themselves of the idea that the qubit is somehow still a 1 and a 0 in magical superposition. This is a really old problem in Western thought, with roots that run very deep. Western man is obsessed with the notion of logical certainty. Hence, when you have a calculus of the imperfectly known, such as Bayes provided, Western man cannot absorb it. He runs to the safety of logical truth values occurring in frequentist fashion.

This mental problem leads me to the view that Western science will soon be eclipsed. The time of Western science is past now. The East has no difficulty whatsoever with this philosophy of reasoning accurately with imperfectly known truth values; it has, after all, been at the heart of Eastern philosophical instruction for thousands of years. To the extent that reasoning with probabilistic truth values is now the practical beating heart of nearly every machine learning algorithm, Eastern societies are at a distinct cultural advantage. They do not have the persistent mental hangup which causes Western scientists (in general) to lack comprehension of the Bayesian revolution at work. It is as if a new language of discourse came to all of science, and only a few grasped it. No wonder the rest are confused and so prone to zealotry.

Comment by Kingsley Jones — June 10, 2013 @ 3:48 am
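For readers unfamiliar with the Cox result mentioned in the comment above, here is an informal sketch (my paraphrase, not Cox’s own wording): assign plausibilities as real numbers, demand consistency, and require that the plausibility of a conjunction depend only on the plausibilities of its parts in the natural order. Up to a monotone rescaling, the only calculus satisfying these desiderata obeys the familiar rules:

```latex
\begin{align*}
  p(A \mid C) + p(\lnot A \mid C) &= 1
    && \text{(sum rule)} \\
  p(A \land B \mid C) &= p(A \mid C)\, p(B \mid A \land C)
    && \text{(product rule)}
\end{align*}
% Bayes' theorem follows from the symmetry of the product rule in A and B:
%   p(A \mid B \land C) = p(B \mid A \land C)\, p(A \mid C) / p(B \mid C).
```

This is why Cox’s argument is read as showing that Bayesian probability is a “generalized logic”: two-valued logic is recovered when every plausibility is 0 or 1.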

This reminds me… I once wrote a short draft for an article, “On Zealots and Zenbots”. It is lurking somewhere on an old hard drive 🙂 I had some very interesting and stimulating correspondence with Karvel Thornber (one of Feynman’s students) on analogical reasoning (he had developed an analogical calculus). This is the logic of “Like a…” relations rather than “Is a…” relations. If you want to do a good job on practical A.I., then analogical reasoning is vital, since it provides the means to abstract fresh concepts from known examples. E.g., a rubber duck “is not” a duck, but it “is a” duck in respect of certain properties. Such splitting of categories to make new categories involves the conjunction of opposites, followed by the refinement of that conjunction with newly recognized properties (alive vs. dead, made of rubber vs. made of flesh and feathers). Western thinkers have so much trouble with this thought process that they do not yet have a name for it. It is all unconscious and unrecognized. It is very sad, really. This void is what has made Western philosophy so sterile and useless.

Comment by Kingsley Jones — June 10, 2013 @ 4:07 am

Thanks Kingsley. It’s ironic that Turing, the patron saint of complexity theory, was a gay Bayesian who used Bayesian techniques to break the German U-boat codes.

Comment by rrtucci — June 10, 2013 @ 7:22 am

Love it! Can Gay Bayesians get married in the State of NY??? Or, do you need to go to Copenhagen and risk Complementarity?

Comment by Kingsley Jones — June 10, 2013 @ 7:31 am

Hi all, I would add a comment regarding the differences between Bayesian and fuzzy probabilities. Fuzzy probability has axioms which, if pushed, violate the axiom of choice. Bayesians do not attempt this sort of generalization.

Zelah

Comment by Zelah — June 28, 2013 @ 11:49 am