Neil Jones, 1941–2023

April 6, 2023

Neil Jones, sad to relate, just passed away. He was Professor Emeritus of Computer Science at the University of Copenhagen, which he joined on a permanent basis in 1982 after gaining tenure at Penn State and a full professorship at the University of Kansas.

Eric Allender wrote a post on Neil for Lance Fortnow’s and Bill Gasarch’s famous blog. We point to it, hopefully with their full approval, and add some supplementary remarks. Eric’s tribute leads with Jones’s work with Alan Selman characterizing logical spectra via languages in nondeterministic {2^{O(n)}} time. He quotes remarks by D. Sivakumar that echo what Siva wrote for our memorial to Alan.

The Space Problem

Steve Cook and Dick Karp started the quest to understand the {\mathsf{P = NP}}? question. What many, including Neil, have long considered the “second big problem” involves the power of space rather than nondeterminism, specifically: is {\mathsf{LOGSPACE = PTIME}}? Neil, like the rest of us, had his thoughts on the conjecture that they are different, but like the rest of us, did not know for sure. His thoughts on the subject ranged from two early papers to the last paper on his DBLP page.

This last paper is joint with Siddharth Bhaskar, Cynthia Kop, and Jakob Simonsen, and is titled, “Cons-free Programs and Complexity Classes between LOGSPACE and PTIME.” It appeared at the 2020 joint meeting of the Horn Clauses for Verification and Synthesis (HCVS) and Verification and Program Transformation (VPT) workshops. They call a program “cons-free” if it does not allow agglutination of data: neither by cons on lists, nor by successor, +, or * on integers, nor by any storage allocator. They further keep recursion from mushrooming by mandating certain forms of tail recursion. Problems decided by cons-free programs form the class “{\mathsf{CF}},” and those decided by programs further restricted to tail recursion form “{\mathsf{CFTR}}.”
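To convey the flavor—this is our own toy illustration in Python, not an example or syntax from the paper—a cons-free, tail-recursive program may inspect its input and move pointers over it, but never build new data. Such a program can, for instance, decide whether a string has an even number of a’s:

```python
def even_as(s, i=0, parity=True):
    # Cons-free, tail-recursive style: no lists are built, no arithmetic
    # beyond stepping a pointer bounded by the input length, no allocation.
    if i == len(s):
        return parity
    if s[i] == 'a':
        return even_as(s, i + 1, not parity)  # tail call: flip the flag
    return even_as(s, i + 1, parity)          # tail call: keep the flag
```

The pointer {i} and the Boolean flag need only {O(\log n)} bits of state, which is the intuition behind the characterization {\mathsf{LOGSPACE = CFTR}}.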

This use of functional languages represents both a more modern viewpoint—than how complexity was founded on Turing machines in the 1960s—and an older one, insofar as Lisp and other ideas of functional languages were fertile before the 1960s. They prove the characterizations {\mathsf{P = CF}} and, with the further tail-recursion constraint, {\mathsf{LOGSPACE = CFTR}}. The former holds some surprise, as cons-free programs are allowed to run for exponential time. If they are constrained to run in polynomial time, a class intermediate between {\mathsf{LOGSPACE}} and {\mathsf{P}} emerges. Is it equivalent to a known class? They leave that open.
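To see how exponential running time can arise even without data construction, here is another toy sketch of ours (again in Python, not the paper’s language): unrestricted recursion over input positions, with the input graph read-only, can branch exponentially while still deciding a problem that lies in {\mathsf{P}}:

```python
def edge(adj, u, v):
    # The input adjacency matrix is only inspected, never modified or copied.
    return adj[u][v] == 1

def reach(adj, u, v, steps):
    # Can u reach v in at most `steps` edges?  Unrestricted (non-tail)
    # recursion over input indices, building no new data: this naive
    # search may take exponential time, yet reachability itself is in P.
    if u == v:
        return True
    if steps == 0:
        return False
    return any(edge(adj, u, w) and reach(adj, w, v, steps - 1)
               for w in range(len(adj)))
```

The gap between this naive exponential-time behavior and the polynomial-time problems such programs decide is part of what makes {\mathsf{P = CF}} surprising.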

Non-Turing Time

Jones’s work on other models besides Turing machines caught my attention 30 years ago—this is Ken writing this section. I was interested in models that have a constant-factor time overhead for universal simulation {U} of any other machine {M}, in contrast to the standard multitape Turing machine model where {U} incurs an {O(\log t)} time overhead for simulating {t} steps of machines {M} with more tapes than {U} has. This {\log t} factor also shows up in the deterministic time hierarchy theorem, though for natural instances it can be shaved down to {O(\log t)^\epsilon} for any fixed {\epsilon} by techniques of padding and translation.

Martin Fürer had shown that when the number {k} of tapes is fixed, the log factor goes away. I was interested in leveraging random-access models that have constant-factor overhead while imposing locality restrictions on the random access. The resulting time hierarchy theorems are only “tight” in the sense of needing {t_1(n) = o(t_2(n))} in order to construct a language decidable in time {t_2(n)} but not in time {O(t_1(n))}.

Jones went one better by building a natural programming-based model that has a time hierarchy for a fixed constant factor. His STOC 1993 paper was titled, “Constant Time Factors Do Matter” with italics on the Do. This grew into a paper in Acta Informatica 2000 with Amir Ben-Amram. It also became the basis for his 1997 textbook, Computability and Complexity: From a Programming Perspective.

The prominence of “{\mathsf{P = NP}}?” masks that our lack of knowledge of lower bounds takes effect at linear time. For circuit models the status is even worse: we still have not refuted the possibility that every language in deterministic {2^{O(n)}} time has (possibly nonuniform) {O(n)}-sized circuits. I thought that a new kind of super-linear lower bound could get a grip on peeling the end of a coiled tape that might then freely unroll. As with {\mathsf{P}} versus {\mathsf{LOGSPACE}}, however, getting such a grip remains open.

Open Problems

I—Dick again—worked on related stuff in a 1979 paper that had a cast of famous brilliant scholars: “Random Walks, Universal Traversal Sequences, and the Complexity of Maze Problems”—by Romas Aleliunas, Dick Karp, me, Laci Lovász, and Charlie Rackoff. There were ways to build off this to greater results, particularly Omer Reingold’s blending-in of Irit Dinur’s PCP proof technique to show that undirected maze problems belong to deterministic logspace. Is there a way to build more off Neil’s results—the newer and the older ones?
