…And To Make A Long Story Short—
All: Too late! —from the 1985 movie “Clue”
Lane Hemaspaandra was among the youngest present at the creation of the Computational Complexity Conference (CCC). We should thank him and all those who promoted this great conference in the green days of our field—indeed we will. He is older now, as are we all, but continues to make important contributions to complexity theory.
Today Dick and I want to talk about an older idea from the kitchen of Lane and others, and how it adds mustard to ideas we have already discussed on this blog.
Initially CCC was called Structure in Complexity Theory to emphasize the scientific position that the real-world behavior of many thousands of disparate computational problems is channeled by a small number of information-theoretic properties. Although the online conservatory called the Complexity Zoo is about to go over 500 classes, most problems are tied with rope into the completeness levels of barely more than a dozen classes. Many complexity classes have complete problems; many are closed under various operations and types of reductions; others involve languages with various densities and properties such as self-reducibility that affect worst-case/average-case behavior. Other structural notions have been proposed, studied, furthered, or put aside according to their utility.
The concept that Dick and I have just found reason to study further originated with Andrew Goldberg and Michael Sipser in their 1985 STOC paper “Compression and Ranking.” The ranking task for a given language $A$ is to compute, given any string $x$, the number $r_A(x)$ of strings $y \in A$ such that $y \le x$ in the standard ordering of strings. If this function $r_A$ is computable in polynomial time, then $A$ is $\mathsf{P}$-rankable. In his single-author paper at the second Structure in Complexity Theory conference at Cornell in 1987, Lane expanded one of their implications into equivalences that hold for several weaker notions of ranking as well, and gave many related results. Stephen Rudich had obtained several of these results plus some others, and became a co-author on the journal version.
Lane is also an expert on voting systems. Indeed Lane and his wife Edith Hemaspaandra and his postdoc Jörg Rothe began in 1997 by analyzing the work of Lewis Carroll on voting systems, which we mentioned in passing here, and this led to a large ongoing project with its own library of three dozen papers in computational social choice theory. Lane né Hemachandra and Edith née Spaan amalgamated their names when they married. I (Ken) have been happy to have them in nearby Rochester, though this year they are on sabbatical in Düsseldorf, Germany, visiting Rothe—whose name is akin to scarlet.
Succinctness
Large objects are succinct if they have small descriptions that are easy to operate with. Psst—to take you out in the hall to explain the title of this post:
People who say, ‘…and to make a long story short’ usually have already missed being succinct.
Consider a large graph with $N = 2^n$ vertices. The graph may be fairly sparse, with each vertex having at most $n$ or $n^2$ or so neighbors, but in toto it is too big to write down in $\mathrm{poly}(n)$ time. However, to navigate this graph, it may suffice to have a small program that given any vertex $u$ outputs the polynomial-sized list $N(u)$ of neighbors of $u$. Or weaker, given any $u$ and $v$ we would at least like to know whether they are connected by an edge.
For the simpler case of a binary string, the condition which we have discussed before goes as follows:

Definition 1 With regard to some fixed polynomial $q$, a binary string $x$ of length $N = 2^n$ is (weakly) succinct if there is a Boolean circuit $C$ of size at most $q(n)$ with $n$ input gates such that for all $i$, $0 \le i < N$, $C(i)$ outputs bit $x_i$ of $x$.
By size of $C$ we mean its number $s$ of wires; if we require instead a binary encoding of $C$ of length at most $q(n)$, then $q(n)$ is also an upper bound on the polynomial-time Kolmogorov complexity of $x$. Compared to standard Kolmogorov complexity notions which allow programs arbitrary time to lounge around generating $x$ serially from a short description $d$, the circuit description $C$ provides fast parallel computation of the bits of $x$.
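As a toy illustration of Definition 1 (an ordinary Python function standing in for the Boolean circuit $C$, purely for intuition), the alternating string $x = 0101\cdots$ of length $2^{30}$ is weakly succinct because its $i$-th bit has a constant-size rule, even though writing $x$ out is infeasible:

```python
# A function in place of the circuit C: bit i of x = 010101...
def bit(i):
    return i % 2

n = 30
N = 2 ** n                             # x itself would be a gigabit long
# Spot-check a small prefix against the explicit string.
prefix = "".join(str(bit(i)) for i in range(16))
assert prefix == "0101010101010101"
```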
New Definition
The above sounds good, but is it good enough? When trying to extend the notion to graphs and matrices in general, we encountered situations where we know there is a single $1$ in a large interval, and would like to find where it is. We tried making special-case definitions with wording like “strongly succinct” for some of the cases that we worked on, yet this issue threw a monkey wrench or spanner into what we wanted to do next. Finally we hit upon extending the base definition instead. Again with regard to a fixed polynomial $q$, the new property reads:
Definition 2 A string $x$, $|x| = N = 2^n$, is succinct if there is a circuit $C$ of size at most $q(n)$ with $n$ inputs and $n+1$ outputs such that for all indices $i$, $C(i) = |\{j < i : x_j = 1\}|$.
That is, $C$ computes the census of places previous to $i$ where $x$ has a $1$. Now we can recover bit $x_i$ itself as $C(i+1) - C(i)$. To be strictly-ballroom of course we put $C(0) = 0$.
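Here is a minimal sketch of Definition 2 in code, with a Python function standing in for the circuit $C$. The toy string has $x_i = 1$ exactly when $i$ is a multiple of 5, so its census has a closed form:

```python
# census(i) = #{j < i : x_j = 1} for the string with 1s at multiples of 5.
def census(i):
    return (i + 4) // 5                # number of multiples of 5 below i

def bit(i):
    return census(i + 1) - census(i)   # recover bit i as C(i+1) - C(i)

assert census(0) == 0                  # the strictly-ballroom convention
assert [bit(i) for i in range(6)] == [1, 0, 0, 0, 0, 1]
```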
More to the point, we can quickly find a place with a $1$ between any places $i$ and $j$ by binary search. We call a family of strings $\{x_n\}$ succinct if there is a polynomial $q$ with regard to which each $x_n$ is succinct. Although doing binary search bumps up the runtime to $O(n \cdot q(n))$, in asymptotic usage this is still “polynomial.”
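The binary search uses only census queries, as a short sketch shows (the oracle here is an arbitrary Python function; the precondition is that some $1$ lies in the interval):

```python
# Find an index k in [i, j) with bit k = 1, assuming census(i) < census(j),
# using O(log(j - i)) census calls.
def find_one(census, i, j):
    while j - i > 1:
        m = (i + j) // 2
        if census(m) > census(i):      # a 1 lies in [i, m)
            j = m
        else:                          # no 1 in [i, m), so look right
            i = m
    return i

# A single 1 hidden at position 1234 of a length-2**20 string:
census = lambda i: 1 if i > 1234 else 0
assert find_one(census, 0, 2 ** 20) == 1234
```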
Relation to Ranking
Given any language $A$ and string $x$ of length $n$, take $\chi$ to be the binary string that encodes $A$ on strings of length up to $n$. Then $C(i)$, for $i$ the index of $x$, gives the census of $A$ on indices up to that of $x$. Alternatively we may consider the family of strings $x_n = \chi_{A^{\le n}}$ for each $n$. Then the family is succinct if and only if $A$ is rankable by $\mathrm{poly}(n)$-sized circuits.
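The connection can be made concrete with a toy example (explicit strings rather than circuits, purely for illustration): take $A$ to be the binary strings with no two consecutive 1s, and let `chi` encode $A$ on all strings of length at most 2 in the standard length-then-lexicographic order. The rank $r_A(x)$ is then a prefix census of `chi`:

```python
from itertools import product

# Standard ordering of binary strings up to length n.
def standard_order(n):
    for length in range(n + 1):
        for bits in product("01", repeat=length):
            yield "".join(bits)

in_A = lambda s: "11" not in s
strings = list(standard_order(2))      # ["", "0", "1", "00", ..., "11"]
chi = [1 if in_A(s) else 0 for s in strings]

def rank(x):
    # r_A(x): members of A at or before x -- a prefix census of chi.
    i = strings.index(x)
    return sum(chi[: i + 1])

assert rank("11") == 6                 # of the 7 strings, only "11" is outside A
```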
This is actually called strong rankability in the Hemachandra–Rudich paper. The base definition puts $f(x) = r_A(x)$ whenever $x \in A$, while their “weak” definition allows $f(x)$ to be arbitrary when $x \notin A$. Neither weakening seems to allow binary search—though both allow equivalence theorems for $\mathsf{P}$ and $\mathsf{P}^{\#\mathsf{P}}$. The angle here is that the task of ranking the members of $\Sigma^*$ can be restricted to the members of $A$.
We also want to extend the notion to vectors and matrices over arbitrary sets $S$ of entries. Here the small circuit $C$ computes $C(s,i) = |\{j < i : x_j = s\}|$ for all $s \in S$ and indices $i$. We can still do binary search to find some entry $x_i$ where $x_i = s$. Indeed if there are only $\mathrm{poly}(n)$ such entries, then we can find them all. This is the property needed to navigate various kinds of sparse graphs. It is a plum that a single idea allows us to treat various combinatorial structures as partially-white boxes.
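A sketch of the extended census over a small entry set $S = \{a, b, c\}$, again with a function in place of the circuit, for the toy vector whose entries cycle $a, b, c, a, b, c, \ldots$:

```python
# census(s, i) = number of appearances of symbol s before position i.
def census(s, i):
    full, rem = divmod(i, 3)
    return full + (1 if "abc".index(s) < rem else 0)

def entry(i):
    # Recover x_i as the unique symbol whose census steps up at i.
    return next(s for s in "abc" if census(s, i + 1) > census(s, i))

assert "".join(entry(i) for i in range(7)) == "abcabca"
```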
Matrix Computations
Now let $A$ stand for a big $N \times N$ matrix. We say $A$ is succinct if the string or vector of length $N^2$ obtained by unrolling $A$ in row-major order is succinct. It is weakly succinct if the string satisfies the original succinctness notion, i.e., if a small circuit can compute individual entries $A[i,j]$.
For a matrix, succinctness means we are computing, for each position $(i,j)$, the census of $1$’s to the left of $(i,j)$ in row $i$ plus all those in rows above. We can do binary search in row $i$, and thus in a sparse graph, find all neighbors of vertex $i$ in polynomial time.
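For intuition, here is a hedged sketch (plain functions standing in for circuits) of the row-major matrix census for the ring graph on $N$ vertices, where vertex $i$ is adjacent to $(i-1) \bmod N$ and $(i+1) \bmod N$; the adjacency matrix has $N^2$ entries but its census has a closed form:

```python
N = 2 ** 10          # number of vertices

def census(p):
    """Number of 1s among row-major positions 0..p-1: two per full row
    above, plus the current row's 1s lying left of column j."""
    i, j = divmod(p, N)
    in_row = sum(1 for c in ((i - 1) % N, (i + 1) % N) if c < j)
    return 2 * i + in_row

def entry(p):
    # Individual entries come back as census differences.
    return census(p + 1) - census(p)

# The 1s of row 5 sit exactly at the neighbor columns 4 and 6:
row5 = [entry(5 * N + j) for j in range(N)]
assert [j for j, b in enumerate(row5) if b] == [4, 6]
```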
We do not know whether succinct matrices, even Boolean matrices, are closed under matrix product. This involves the problem of computing or estimating the inner product of two succinct exponentially-long vectors, which is a knife-edge challenge all its own. We can, however, prove some special cases.
Lemma 3
- (a) The product of two succinct permutation matrices is succinct.
- (b) The product of a succinct sparse matrix and a weakly succinct matrix is weakly succinct.
Proof: For (a), let $C = AB$ for permutation matrices $A, B$ and let any position $(i,j)$ be given. We know the census of $1$’s in rows above $i$ will be $i$, so we need only tell whether the $1$ in row $i$ is to the right or left of (or in) column $j$. We can first find the unique $k$ such that $A[i,k] = 1$. Then in row $k$ of $B$ we find the unique column $\ell$ with a $1$ and compare $\ell$ to $j$.
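The (a) argument can be sketched with permutation matrices given succinctly by their permutation maps: $A[i,k] = 1$ iff $k = \pi_A(i)$, and similarly for $B$. The maps `pi_A` and `pi_B` below are toy choices; any efficiently computable permutations would do:

```python
N = 2 ** 16
pi_A = lambda i: (i + 3) % N          # cyclic shift
pi_B = lambda k: (N - 1) - k          # reversal

def product_entry(i, j):
    # (AB)[i,j] = 1 iff j is where row i's unique 1 lands after both maps.
    return 1 if pi_B(pi_A(i)) == j else 0

def product_census(i, j):
    """Row-major census of AB before (i, j): one 1 per row above i, plus
    row i's unique 1 at column pi_B(pi_A(i)) if it lies left of j."""
    return i + (1 if pi_B(pi_A(i)) < j else 0)
```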
For (b), we have $(AB)[i,j] = \sum_{k \in S_i} A[i,k] \cdot B[k,j]$ where $S_i$ is the $\mathrm{poly}(n)$-sized set of columns in which row $i$ of $A$ has nonzero entries. Since $A$ is succinct this set can be found by binary search, and then the availability of a small circuit computing $B[k,j]$ for those values completes one for $AB$.
Is the Notion Strong Enough? Too Strong?
To multiply by a succinct sparse matrix on the right, for $AB$, we need to find the sparse set of rows in which column $j$ of $B$ has a nonzero entry. This may require symmetrizing the notion of succinctness for matrices by stipulating that the column-major string also is succinct. This ensures the notion is closed under transpose $A \mapsto A^T$—note this is $A^\dagger$ in the case of real entries.
Stronger still, we could stipulate that a small circuit computes the census $Q(i,j)$ of the quadrant northwest of $(i,j)$. Then $Q(i,N) + Q(i{+}1,j) - Q(i,j)$ gives the row-major census, and the column-major census is similar. Is this equivalent to both, or to either?
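The northwest reduction is easy to check on a toy matrix: if $Q(i,j)$ counts the $1$s with row $< i$ and column $< j$, then the $1$s strictly before $(i,j)$ in row-major order number $Q(i,N) + (Q(i{+}1,j) - Q(i,j))$, i.e., all of the earlier rows plus row $i$’s $1$s left of column $j$:

```python
N = 8
ones = {(1, 6), (3, 2), (3, 5), (6, 0)}          # a toy 0/1 matrix

def Q(i, j):
    # Northwest-quadrant census: 1s with row < i and column < j.
    return sum(1 for (r, c) in ones if r < i and c < j)

def row_census(i, j):
    return Q(i, N) + (Q(i + 1, j) - Q(i, j))

# Agrees with the direct row-major count (tuple order is row-major):
for i, j in [(0, 0), (3, 5), (6, 1), (7, 7)]:
    assert row_census(i, j) == sum(1 for rc in ones if rc < (i, j))
```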
It is also possible that counting is stronger than we need—note that part (b) of the lemma falls short of getting census counts for $AB$. Perhaps we can make our desired applications work by crafting an intermediate notion that allows binary search but stops short of full counting? Ryan Williams has pointed out to us that for applications with for-contradiction assumptions that entail $\#\mathsf{P} \subseteq \mathsf{P/poly}$, the assumption makes the counting-based definitions equivalent to the original one.
In any event, the strongest definitions do seem to be met by important examples of succinct objects in complexity theory. Some of them come from circuits that simulate Turing machines $M$. These circuits have a regular grid structure that replicates a fixed-size gadget for the finite control of $M$. Owing to the regular structure, global counting—whether north, west, or northwest in the grid—does not impose any greater difficulty. Another source is quantum computation. Deterministic gates such as controlled-NOT and Toffoli are represented by succinct permutation matrices, and the lemma above suffices to compose them at will while preserving succinctness.
Why fret these details? Proving another nice lemma could make the choice a lead-pipe cinch. In non-uniformity and some other ways the quest for lower bounds suffers from the lack of a ‘standard candle’ for complexity, but first one needs to make a standard candlestick. We feel there should be a unique solution to this puzzle of logic and esthetics, but we are still picking up clues.
Open Problems
Admittedly we’ve bounced around like balls in a billiard room, but to make a long story short, what is the best notion of succinctness?
Are the row-major and/or column-major and “northwest” notions of succinctness for matrices equivalent?
For fans of the game “Clue” or Cluedo: who did it, where, and how?




Interesting post! I suppose that the Hemaspaandras’ interest in voting systems is not independent of the fact that they shared a university with Riker, the heresthetician.
They certainly cite Riker. Wow, I learned a new word! This page on “Heresthetics” says that “Riker coined this term from a Greek root meaning ‘choosing and electing.'” I would have preferred to see it used to describe the iconography of iconoclasm.
SPOILER ALERT! on the Clue(do) puzzle below,
hence
this
comment
is
being
elongated
vertically
by
as
many
words
as
I
have
the
patience
to
type
right
now.
Looks like you don’t have many fans of Clue reading your blog. Sigh…
It was Miss Peacock, in the dining room, with the revolver.
I completely missed that. I thought those phrases like “tied with rope” and “it is a plum” were just references to Clue, not pieces of an actual puzzle.