
Complexity Classes Meet Particle Physics

July 30, 2009


A possible way to look at complexity classes as fundamental particles


Juris Hartmanis is one of the founders of computational complexity, for which he was honored with the prestigious Turing Award. His theorems helped start the field, his PhD students are some of the best ever, and his conjectures have had huge influence on the development of the field. I have already posted about one of his wonderful conjectures on real-time computations.

Today I want to talk about a recent breakthrough result that proves that {\mathsf{QIP} = \mathsf{PSPACE}}. I think that this terrific result, due to Rahul Jain, Zhengfeng Ji, Sarvagya Upadhyay, and John Watrous, raises some interesting questions about the future of the field, and more generally of all of theory.

I still remember a talk that Juris gave at Yale University back in the mid 1970’s. At one point he put up a slide—no powerpoint then—with the following formula on it:

\displaystyle  \mathsf{L} \subseteq \mathsf{P} \subseteq \mathsf{NP} \subseteq \mathsf{PSPACE}.

He pointed out that the ends of this chain were easily known to be different. He then smiled and said that, therefore, at least one of the subset relations had to be proper. But which one? Was there just one that was proper? Or two? Or all three? This simple observation made a big impression on me. It is too bad that over three decades later we have gained no further insight into which of them are proper.
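
For the record, the separation of the two ends is just the space hierarchy theorem: logarithmic space is strictly weaker than linear space, which sits inside polynomial space,

\displaystyle  \mathsf{L} = \mathsf{SPACE}(\log n) \subsetneq \mathsf{SPACE}(n) \subseteq \mathsf{PSPACE}.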

Back then, Yale computer science used the post-talk-food normal form. That is, after the talk, donuts were served to the audience and the speaker. Most places then and now use the pre-talk-food normal form, but Yale was different. I always wondered why we were different, but it was Yale.

While we were eating our donuts, one of my systems colleagues said that he had an intuition about P=NP. A few of us perked up and asked what it was. He said he thought that they should be equal. We asked why. He said: look, nondeterminism does not matter for finite state automata and does not matter for pushdown automata, so why should it matter for Turing machines? I liked this general analogy, but I pointed out that nondeterminism does help for pushdown automata: not all context-free languages are accepted by a deterministic pushdown machine. He said never mind. Oh well.

The Complexity Classes

I have often thought that complexity classes could be viewed as our analog of particles in Physics. Really, I am not kidding. It is not April first.

Today there are over 200 known “fundamental” particles. See this for a short introduction. Not all particles are indivisible; some are made up of others. I am not a particle physics expert, so please forgive me ahead of time; no doubt I will say something wrong about particles.

The central thesis that strikes me as interesting, however, is:

\displaystyle  \mathbf{ particles} \leftrightarrow \mathbf{ complexity \,classes }.

Here are some of my reasons for this analogy:

{\bullet } Complexity classes are the basic objects that we study.

{\bullet } Complexity classes have properties. Instead of mass, spin, charge, and so on, we have our own properties. Is the class closed under complement? Does it have a complete set? Does the complete set, if it exists, have special properties such as random self-reduction? Is it low? Recall that a complexity class {B} is low for a complexity class {A} if {A} with an oracle for {B} is equal to {A}; there is an example after this list. And so on.

{\bullet } Complexity classes sometimes are known to be different—unconditionally. For example, {\mathsf{L}} is not equal to {\mathsf{PSPACE}}.

{\bullet } Complexity classes sometimes are known to be the same. The recent result about {\mathsf{QIP}=\mathsf{PSPACE}} is a perfect example of this behavior. Here two “particles” were known, but eventually were shown to be actually the same.

{\bullet } Complexity classes have a natural global mapping between them. For each class {\cal C} there is another class {\mathsf{co-}\cal C}. This is a mapping from the classes to the classes. Of course there are some classes that are fixed points: that {\mathsf{P}} is a fixed point is trivial, while {\mathsf{NL}} being one was a big result.
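
To make two of these properties concrete, here are two standard facts. The co-mapping sends a class to the complements of its languages, and the Immerman–Szelepcsényi theorem is exactly what makes {\mathsf{NL}} a fixed point:

\displaystyle  \mathsf{co-}{\cal C} = \{ \overline{L} : L \in {\cal C} \}, \qquad \mathsf{NL} = \mathsf{co-NL}.

As for lowness, {\mathsf{NP} \cap \mathsf{co-NP}} is low for {\mathsf{NP}}; an oracle from this class adds no power:

\displaystyle  \mathsf{NP}^{\mathsf{NP} \cap \mathsf{co-NP}} = \mathsf{NP}.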

So What?

One question you might ask is: why is this a useful analogy? I like it because I think that the huge zoo of complexity classes is unlikely to consist only of fundamental ones. I like the analogy to particle physics more than the analogy to a zoo, with all due respect to the keepers of this valuable web resource: Scott Aaronson, Greg Kuperberg, and Christopher Granade.

I think that we should model our behavior closer to physics than to zoos. For example, the physics world is constantly searching for a reduction in the number of “fundamental particles.” We are doing the same: every time two classes are proved the same, we have reduced the number of fundamental particles.

This analogy raises some interesting ideas. Forget for a moment the conventional wisdom on the complexity classes. What is the fewest number of distinct classes that we could have? For example, what possible worlds could exist if huge collapses happen? I would like that, because the world would become very simple: there would be only a few distinct classes.
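
As a purely hypothetical illustration, suppose {\mathsf{P} = \mathsf{PSPACE}}. Since {\mathsf{L}} is unconditionally different from {\mathsf{PSPACE}}, the chain from Juris's slide would then resolve itself:

\displaystyle  \mathsf{L} \subsetneq \mathsf{P} = \mathsf{NP} = \mathsf{PSPACE},

and four “particles” would collapse to just two.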

On the other hand, what is the most complex world? I guess the obvious answer is that all the known classes not yet proved to be the same are in fact different. To me that seems like a pretty complex and unappealing situation. I would hope that this does not happen, but it could.

Open Problems

The co-mapping is a kind of symmetry between classes. Could this be like the matter and anti-matter symmetry of particle physics? Note that in physics the symmetry is known not to be perfect. The same seems to be true for our classes: for example, {\mathsf{NP}} may behave differently from {\mathsf{co-NP}}.

Where are the accelerators? Unfortunately, I cannot imagine how to build a multi-billion dollar machine that we could use to discover the structure of our fundamental particles. If we could, that would be a great way to get more dollars flowing into theory. Any suggestions?

I think, in a sense, we are the accelerators. As we smash ideas together sometimes we discover the further structure of our fundamental particles—our complexity classes.

Comments
  1. rjlipton
    July 30, 2009 11:17 pm

    Glad you enjoyed the post. Definitely was fun to write…rjlipton

  2. rjlipton
    August 1, 2009 2:44 pm

    Check this out.

    http://blogs.discovermagazine.com/cosmicvariance/2009/08/01/when-do-we-get-donuts/

    …writes an interesting post suggesting that complexity classes play a similar role in complexity theory as elementary particles in high-energy physics. All very fascinating stuff, no doubt. But along the way a much more important issue is raised: when there is a seminar, should we have donuts before, or after? Back then, Yale computer science used the post-talk-food normal form. That is after the talk donuts were served to the audience and the speaker. Most places then and now use pre-talk-food normal form, but Yale was different.

  3. August 6, 2009 1:27 pm

    The result can be interpreted as saying that “quantum” doesn’t add as much power to computation as we thought—and likely hoped. Certainly not at the level of PSPACE.

    The next step seems to be, what about at the level of PH? At COCOON’09 last month (in Niagara Falls, NY) I was told that “conventional wisdom” is BQP not being contained in PH. I believe instead that BQP is in Sigma_3^p intersect Pi_3^p, which would put it at most one level higher than the containment known for BPP.

    One of my Princeton physics courses had a research-survey essay assignment. I chose the hunt for magnetic monopoles, and referred to the “zoo” of particles—which closes the circle on the analogies :-). Professor P.J. Peebles actually took exception to my writing that experimenters wanted to “shoot the monopole and exhibit it in their trophy case”, preferring a less-violent analogy! The analogy to physics that I use in theory courses, however, is how remarkable it is that so many thousands of decision problems “quantize” into so few completeness levels…

