Robert P. Crease discusses the results of an unusual popularity contest: the "twenty greatest equations ever".
I feel some affinity with this strange quest, since a few years ago I tried to persuade my colleagues that "The Ten Greatest Equations of All Time" would be a good organizing principle for a general-education course in mathematics and its applications.
However, three of my personal top-ten list are missing from Crease's top twenty: Shannon entropy, Bayes' theorem and Euler's generalization of Fermat's little theorem. Entropy is arguably the most important new concept of the twentieth century*; Bayes' theorem is fundamental to most statistical pattern recognition, whether by computers or by animals; and Euler's totient theorem is the basis of most current public-key cryptography. Also, all three of these are easy to understand, both intrinsically and in their applications.
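For the record, here are the three in standard notation -- my transcription, not anything from Crease's list:

    H = -\sum_i p_i \log_2 p_i
    P(A \mid B) = \frac{P(B \mid A)\, P(A)}{P(B)}
    a^{\varphi(n)} \equiv 1 \pmod{n} \quad \text{when } \gcd(a, n) = 1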
Crease's poll had a pretty small N -- he got 120 proposals, with the top of his list getting 20 votes and the bottom just 2. That makes it all the stranger that my three were missing. I guess it's because he was surveying mainly physicists rather than psychologists or engineers, but still...
There are a lot of other great equations that are not on that list, but it's just wrong to leave those three out. I'll grant that Fermat's little theorem isn't as fundamental as the other two, though it does let you bring up the question of whether P = NP and related matters.
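To make the cryptography connection concrete, here's a toy sketch of RSA in Python. The primes and the message are made up for illustration; real keys use primes hundreds of digits long, and the presumed difficulty of factoring n is exactly where the P = NP question comes in.

    # Toy RSA, resting on Euler's totient theorem:
    # a^phi(n) = 1 (mod n) whenever gcd(a, n) = 1.
    p, q = 61, 53                # two (toy) secret primes
    n = p * q                    # public modulus: 3233
    phi = (p - 1) * (q - 1)      # Euler's totient of n: 3120
    e = 17                       # public exponent, coprime to phi
    d = pow(e, -1, phi)          # private exponent: e*d = 1 (mod phi); Python 3.8+

    message = 42                 # any m with gcd(m, n) = 1
    cipher = pow(message, e, n)  # encrypt: c = m^e mod n
    plain = pow(cipher, d, n)    # decrypt: m = c^d mod n

    # The round trip works because m^(e*d) = m^(1 + k*phi)
    # = m * (m^phi)^k = m (mod n), by the totient theorem.
    assert plain == message

Factor n and you recover phi, d, and the whole secret -- which is why whether factoring is tractable matters so much.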
Are there any equations that come out of linguistics that should be included in my hypothetical course? Well, Shannon entropy is all about the information content of messages, and so it belongs to linguistics as the field should properly be defined. Another candidate would be Zipf's Law (~ Pareto's Law, Benford's Law, etc.). Of course, many of the other Great Equations have obvious linguistic applications, though I haven't been able to come up with any plausible ways to bring E = mc² to bear.
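Here's what the Shannon side of that looks like in practice: a minimal Python sketch (standard library only, with a toy text of my own devising) that estimates per-character entropy and lists the word rank/frequency pairs that Zipf's law is about.

    import math
    from collections import Counter

    def entropy_bits(symbols):
        """H = -sum p_i log2 p_i, from observed relative frequencies."""
        counts = Counter(symbols)
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total)
                    for c in counts.values())

    text = "the cat sat on the mat and the dog sat on the log"
    print(round(entropy_bits(text), 3), "bits per character")

    # Zipf's law: frequency falls off roughly as 1/rank, so on a real
    # corpus, log(frequency) against log(rank) is close to a straight line.
    for rank, (word, freq) in enumerate(Counter(text.split()).most_common(), 1):
        print(rank, word, freq)

A one-sentence corpus won't show the Zipfian slope, of course; swap in any decent-sized text file and it emerges quickly.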
*OK, Boltzmann wrote in 1877, and the Boltzmann equation S = k ln W is one of Crease's top twenty, but the implications of Boltzmann's work were not worked out until the twentieth century, and Shannon's form of the entropy equation is more general and more important...
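To spell out the relationship: Shannon's entropy reduces to Boltzmann's when all W outcomes are equally probable,

    H = -\sum_{i=1}^{W} p_i \ln p_i \;\overset{p_i = 1/W}{=}\; \ln W

so S = k ln W is the equiprobable special case, up to the constant k and the choice of logarithm base.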
[via Ray Girvan at the Apothecary's Drawer weblog.]
Posted by Mark Liberman at October 13, 2004 12:00 PM