
Logicians are Everywhere

January 4, 2023


So where were they between 1720 and 1820?

Helena Rasiowa was a famous logician from Poland. She visited Case Western Reserve University when I was an undergraduate a million years ago—in the 1960s.

I have always loved mathematical logic. I took undergraduate courses with two famous logicians. Richard Vesley taught me my first logic course at Case. I later took an advanced course there, also as an undergraduate, from Rasiowa. Vesley became a Professor in the University at Buffalo mathematics department, where Ken also knew him before he passed away in 2016.

One of my memories from her class is of a single statement. One day we were stuck on a tricky insight and kept asking questions, perhaps too many. She finally said:

“You will understand.”

I still recall this like it happened yesterday. She was eventually right. But at the time we were scared that we might have trouble getting it.

Logicians Named Lewis Are Everywhere

Rasiowa’s dissertation was titled Algebraic Treatment of the Functional Calculus of Lewis and Heyting. The names are Clarence Lewis and Arend Heyting. Although Lewis was American, he adopted the British habit of going by his initials as C.I. Lewis. This made him confusable with C.S. Lewis, the writer Clive Lewis in this blog’s style.

An unrelated Lewis is Harry Lewis, who is an American computer scientist known for his research in logic—and for books on theoretical computer science, computing, higher education, and technology. He is the Gordon McKay Professor of Computer Science at Harvard University, and was Dean of Harvard College from 1995 to 2003.

Another logician named Lewis whom I could have known at Princeton was David Lewis. He is best known in logic for rigorizing counterfactual conditionals. An example he gave is, “if kangaroos had no tails, they would topple over.” In complexity theory, many results are like, “if pigs could whistle then horses could fly.” Scott Aaronson wrote about one such result here. Maybe we could have used Lewis to organize the logic of these results.

Lewis is also famous for making true the antecedent of the implication “if a cat could get a published paper then …” His cat, under the byline Bruce Le Catt, was credited for this article until the journal recently corrected the record. The Cheshire Cat—

[The Cheshire Cat, image via Wikipedia]

—brings up another logical Lewis: Lewis Carroll. Well, in this blog’s style he is Charles Dodgson. By whichever name, his work in mathematical logic was substantial.

Our point is, there have been so many logicians in the past century-plus that we can point to several with the same name. But that was not always the case. There is a previous century-plus when we can find hardly any logicians at all. To explain why this surprises us, we need to go back further, to Gottfried Leibniz.

Leibniz

Leibniz was of course one of the great mathematicians of all time. He published nothing on formal logic in his lifetime—he wrote only working drafts. Bertrand Russell claimed that Leibniz had developed logic in his drafts to a level which was reached only two centuries later.

Harry Lewis wrote a book with Lloyd Strickland titled Leibniz on Binary: The Invention of Computer Arithmetic.

Harry’s website says about it:

The definitive edition and translation of 32 of Leibniz’s works on binary arithmetic. He works out all the arithmetic operations, and realizes that base-16 would be a more usable notation, so invents several different notations for what we now call the hexadecimal digits.

Leibniz may have been the first computer scientist and information theorist. Early in life, he documented the binary numeral system (base 2), then revisited that system throughout his career.
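
To see concretely why base-16 is the natural companion to binary, here is a small modern illustration of our own (in Python, and of course not in Leibniz's own notations for the extra digits): each hexadecimal digit packs exactly four binary digits, so base 16 reads as a shorthand for base 2.

    # Our own small illustration: each hexadecimal digit stands for exactly
    # four binary digits, which is why base 16 is a compact way to write
    # binary numbers.

    n = 52618  # an arbitrary example value

    print(bin(n))  # 0b1100110110001010
    print(hex(n))  # 0xcd8a

    # Group the binary digits four at a time and each group is one hex digit:
    # 1100 1101 1000 1010
    #    c    d    8    a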

Among testimonials on the book’s MIT Press page is this by Donald Knuth:

“This book is a model of how the history of computer science and mathematics should be written. Leibniz pointed out the importance of putting ourselves into the place of others, and here we get to put ourselves into the shoes of Leibniz himself, as we’re treated to dozens of his private notes, carefully translated into idiomatic English and thoroughly explained.”

The publisher’s description, echoed on the book’s Amazon page, chimes in about readability:

The [translated] texts are prefaced by a lengthy and detailed introductory essay, in which Strickland and Lewis trace Leibniz’s development of binary, place it in its historical context, and chart its posthumous influence, most notably on shaping our own computer age.

A Historical Logic Gap?

The “shaping of the computer age” seems to have started no earlier than the work of Charles Babbage on mechanical computation beginning in the 1820s. Even so, Babbage’s Difference Engine dealt only with numerical calculations. It took his later Analytical Engine to involve programming logic as we conceive it.

Ken has had several thoughts along these lines, going back to his graduate student days at Merton College, Oxford University:

  • The Merton College Library had one half-height stack of mathematics books. Shelved right alongside modern texts—this was the early 1980s—was an 1854 first edition of George Boole’s book The Laws of Thought. We refer to Boolean logic and Boolean algebra because of this book. These terms came from a book placed with the moderns, not from the centuries-older books growing out of Aristotle and other Classical-era works that Ken could find in the grand Upper Library. This struck Ken as a warp of time.

  • Ken says that the watershed for doing computational logic is realizing that NAND and NOR are universal gates; a small sketch below illustrates this universality. The older name for NAND is the Sheffer stroke, after the American logician Henry Sheffer, but that came only in 1913, and Russell and Alfred Whitehead later picked it up. The polymath Charles Peirce had discovered this about NAND and NOR in the 1880s, so NOR is also called the Peirce Arrow. He also conceived an electrical implementation of AND and OR:

[Peirce's drawing of electrical AND and OR circuits, via Advent of Computers]
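
To make the point concrete, here is a minimal sketch of our own, in Python rather than in gates, of why NAND alone suffices: NOT, AND, and OR can each be built from it, and a symmetric construction works for NOR.

    # A minimal sketch of universality: NOT, AND, and OR expressed using
    # only NAND. A symmetric construction works for NOR.

    def nand(a: bool, b: bool) -> bool:
        return not (a and b)

    def not_(a: bool) -> bool:
        return nand(a, a)

    def and_(a: bool, b: bool) -> bool:
        return nand(nand(a, b), nand(a, b))

    def or_(a: bool, b: bool) -> bool:
        return nand(nand(a, a), nand(b, b))

    # Exhaustive check over all inputs confirms the constructions.
    for a in (False, True):
        for b in (False, True):
            assert not_(a) == (not a)
            assert and_(a, b) == (a and b)
            assert or_(a, b) == (a or b)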

The century-plus between Leibniz and Babbage had Leonhard Euler. It had Carl Gauss. It had all the Bernoullis. It had Joseph-Louis Lagrange, Augustin-Louis Cauchy, Adrien-Marie Legendre, Jean-Baptiste Fourier, and Marie-Sophie Germain. But where are the logicians? As we quoted Russell above, Leibniz's preliminary work connects only to developments two centuries later.

So why the gap? That is the puzzle. One further question is how close Leibniz came to perceiving the universality of NAND and its significance. Harry, who also drew the Peirce drawing to our attention, tells us that Leibniz invented XOR and also wrote bitwise AND for binary strings. Another question is how far back the ideas of Polish notation go. Gottlob Frege anticipated it, but that was still in the late 1800s. Polish notation and its reverse form have had enduring value in programming and compilation since Jan Łukasiewicz invented the notation in 1924. Being taught by Rasiowa in the 1960s brought me closer to the origins of these logical fundamentals than, on historical reflection, I might have expected.
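
As a small illustration of that enduring value (our own sketch, not tied to any particular compiler), a reverse-Polish expression can be evaluated with nothing more than a stack: no parentheses or precedence rules are needed, which is exactly what makes the form congenial to compilers and stack machines.

    # Evaluating a reverse-Polish expression needs only a stack.

    def eval_rpn(tokens):
        ops = {"+": lambda x, y: x + y,
               "-": lambda x, y: x - y,
               "*": lambda x, y: x * y}
        stack = []
        for tok in tokens:
            if tok in ops:
                y = stack.pop()
                x = stack.pop()
                stack.append(ops[tok](x, y))
            else:
                stack.append(float(tok))
        return stack.pop()

    # (3 + 4) * 5 written in reverse Polish notation:
    print(eval_rpn("3 4 + 5 *".split()))  # 35.0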

Open Problems

Are you puzzled by the gap? Can you explain it?

How might history have changed if the French greats had developed Boolean logic? For some jokes, spelling Louis in their names as Lewis might have helped. There are no US Senators named Lewis, though the introducer at Monday’s Rose Bowl tried to create one. Not jokes: both Leibniz and C.I. Lewis figure in this section of Wikipedia’s bio of David Lewis. Maybe now the rapper Louis Logic will help people named Louis catch up—to the logician Louise Hay, whom Ken knew when she visited Peter Neumann and others at Oxford.

9 Comments
  1. January 4, 2023 1:15 am

    This is interesting and well written, thank you for sharing!

  2. January 4, 2023 8:30 am

    Drawing a blank on logical Louies, will dig up some logic histories later, but for now I’ll just post a few loose links pertaining to Leibniz and Peirce. (NB. When I was coming up one of my profs taught us to use NNOR (read “neither nor”) instead of NOR because he said the latter left people confused which OR was meant, so watch out for that in my notes.)

    Here’s one from Peirce on what he dubbed the Ampheck.

  3. John Cherniavsky permalink
    January 4, 2023 10:22 am

    Helena’s son – Zbigniew Ras – is a computer scientist at University of North Carolina, Charlotte. He has been quite active in Knowledge Discovery and Data Mining.

  4. January 4, 2023 1:16 pm

    A few thoughts.
    1) What accounts for the long gap between Leibniz and Boole? I don’t know (maybe Lloyd, who is an intellectual historian, will weigh in with a better account). I think it was probably a combination of factors. Leibniz’s reputation suffered during the 18th century, especially in England, where the dispute with Newton about the invention of the infinitesimal calculus was unhelpful. Leibniz also had himself to blame — his philosophical optimism was considered silly by Enlightenment thinkers such as Voltaire (who lampooned him as Dr. Pangloss, that is, “Dr. Say-Anything,” in Candide). And he got quite carried away, in ways that look strange to modern eyes, trying to connect binary notation to Christian and Chinese traditions. Most of his writings on binary remained unpublished — most of what is in our book has never appeared in print before.
    2) We take the connection between binary and logic for granted, but these were not the same thing. Leibniz’s work on binary was mostly related to arithmetic. As Ken notes, Leibniz did invent a notation (a + sign with a tilde over it) for XOR, but only so he could more easily describe the algorithm for adding binary numbers. At the same time he had a much less developed program to develop a full-fledged calculus of ideas. One essay on this topic is in my other recent collection, Ideas That Created the Future: “… if we could find characters or signs appropriate for expressing all our thoughts as clearly and exactly as arithmetic expresses numbers or geometric analysis expresses lines, we could accomplish in all matters, insofar as they are amenable to reasoning, everything that can be done in arithmetic and geometry.” He made an uncertain start at that project (see Part II, Chapter 13, of Struik’s Source Book in Mathematics), but it didn’t go anywhere, and I don’t think it was strongly related to his work on binary arithmetic.
    3) Finally, I’d note that the metamathematical development of logic as the tool of foundational studies of mathematics itself has a complicated legacy for computer science. Richard Hamming has a thought-provoking paper “Mathematics on a distant planet” (American Mathematical Monthly 105:7, Aug-Sep 1998, 640-650), arguing, among other things, that hitting on Boolean logic sooner or later might have been inevitable in any advanced civilization, but logic qua foundation of mathematics looks awfully contingent and arbitrary. So while there would have been no Church-Turing theorem without Whitehead and Russell or something like that, I wonder how our field would have developed if some prescient pre-Boole had written The Laws of Thought in the mid-18th century rather than the mid-19th.

  5. January 6, 2023 11:24 am

    To expand a little on point 1 of Harry’s post: while several collections of Leibniz’s work were published in the decades after his death, the editors (naturally) selected writings they judged to be of public interest at the time, and neither his work on binary nor his logical writings qualified. On top of that, both sets of writings are far from polished – they are mostly explorations-as-he-is-writing. There’s gold in all of that, to be sure, but his eighteenth-century editors weren’t particularly concerned with mining to find it.

    With regard to Leibniz’s reputation: well, in England, sure, Leibniz’s name was mud for a long time after he died, though there was a grudging respect for him (coupled with an almost complete ignorance of his published work, which was better known through the summaries and discussions of others than it was first hand). As for Voltaire, I’d have to disagree about that. The idea that Voltaire somehow sullied Leibniz’s reputation is a 19th-century fiction. There’s no evidence for it at all in the eighteenth century. Voltaire actually got a lot of blowback for Candide. And Leibniz’s philosophical optimism was actually seen as a respectable position for a good part of the eighteenth century, and it was eventually undermined by a series of philosophical and theological arguments (usually based on misunderstandings, it has to be said) rather than by satire, even if today we think the satire is really rather good. I’ve a long paper on all this here if interested: https://jmphil.org/articles/10.32881/jomp.3/

  6. Dave Lewis permalink
    January 8, 2023 1:51 am

    I resemble this remark.

  7. January 9, 2023 11:22 am

    A favorite passage from Leibniz —

    The Present Is Big With The Future

    Leibniz’s annunciation of the present as pregnant with all futurity, as seen from the standpoint of a fully informed observer, not only describes determinacy in the ideal limit but reveals the secret springs of his differential calculus and foreshadows a picture of the holographic universe.

Trackbacks

  1. Rabin-Scott Time | Gödel's Lost Letter and P=NP
  2. Novel Proofs of the Infinitude of Primes | Gödel's Lost Letter and P=NP
