Email address autocompletion is an underappreciated peril of modern life. Once, the danger was failing to distinguish between "reply" and "reply all", or failing to notice that even a simple "reply" would go to a mailing list rather than to an individual. There are many entertaining stories, some of them true, about the consequences of this sort of carelessness.
But if you use an email agent that helpfully provides autocompletion suggestions from your address book, and harvests addresses from every passing message, then a whole new set of mistakes becomes possible.
Years ago, I sent... well, never mind. More recently, I composed an elaborate explanation of issues in porting some speech-technology software from Linux to OS X, and sent it to the CEO of a health-care start-up whose only connection to the software in question was sharing a first name with the person I meant to correspond with.
The July 10 Achewood offers an example of autocompletion embarrassment that also includes a dig at the lolcats phenomenon. Chris Onstad seems to agree with Geoff Pullum about this, despite being the author of a comic strip that "portrays the lives of a group of anthropomorphic stuffed toys, robots, and pets":
(I feel Ray's pain -- though in my defense, I never actually transferred the slogan from the "cookie" picture, much less sent it to a journalist.)
Anyhow, there's an application for simple-minded AI here: an email agent that could associate sets of "topics" (represented as regions in n-gram space, or something like that) with individuals, evaluate how well a given message fits with different people's profiles, factor in your own past communication patterns, and act accordingly. The program could re-order suggested addresses, red-flag addresses that seem unexpected, or whatever.
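For the curious, the idea above can be sketched in a few lines of Python. This is only a minimal illustration, not anyone's actual implementation: character n-grams plus cosine similarity stand in for "regions in n-gram space", and the function names, example addresses, and the flagging threshold are all invented for the sketch.

```python
from collections import Counter

def ngrams(text, n=3):
    """Character n-gram counts for a lowercased message body."""
    text = text.lower()
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram Counters (0.0 to 1.0)."""
    dot = sum(a[g] * b[g] for g in set(a) & set(b))
    na = sum(v * v for v in a.values()) ** 0.5
    nb = sum(v * v for v in b.values()) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def build_profiles(history):
    """history maps each address to a list of past message bodies;
    each correspondent's profile is the pooled n-gram counts."""
    return {addr: sum((ngrams(m) for m in msgs), Counter())
            for addr, msgs in history.items()}

def rank_addresses(draft, profiles, threshold=0.2):
    """Return (address, score, flagged) tuples, best-fitting first;
    an address is flagged when the draft looks unlike past traffic."""
    d = ngrams(draft)
    scores = sorted(((addr, cosine(d, prof)) for addr, prof in profiles.items()),
                    key=lambda t: t[1], reverse=True)
    return [(addr, s, s < threshold) for addr, s in scores]
```

A mail client could then sort its autocompletion list by these scores, and warn before sending when the chosen recipient comes out flagged. A real system would of course want word n-grams, smoothing, and recency weighting, but the shape of the computation is the same.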
True, you could make things worse, by introducing an obtrusive "helper" like Clippy or a damaging meddlesome misfeature like the "December 1 DWIM effect". But you wouldn't do that, would you? And an unobtrusive and lossless intervention might actually be helpful.
Posted by Mark Liberman at July 11, 2007 09:31 AM