Edwin Henry Landseer’s first show at the Royal Academy came when he was thirteen. He was a full member before he was thirty; he was knighted before he was fifty. A booming commercial trade in engravings prepared by his brother made him wealthy. He was a favorite of Queen Victoria’s, and his art is still familiar today, if perhaps more for its sensibility (and commercial appeal) than its artistic merits. Before being driven to drink, madness, and the creation of allegorical depictions of polar bears rending explorers, he had accrued all the successes that a Victorian artist could expect, and one unexpected one. Thanks to his portrait, A Distinguished Member of the Humane Society, Landseer has a tall, powerful, well-built breed of dog named after him. The dog was the distinguished member of the Humane Society, of course, because the Humane Society was concerned not with the prevention of cruelty to animals but with saving human lives, and Bob (of the Newfoundland breed that some today call “Landseer”) was not an honorary member.
A Belarusian Jew named "Israel" might not seem the most likely person to write the great American Christmas carol. But Israel Isidore Baline, the greatest songwriter of the dying days of Tin Pan Alley, had a brainwave one winter day at a spa in Phoenix, and wrote the best-selling song of the first hundred years of recorded music. Baline, working under his adopted name of Irving Berlin, excised the somewhat sardonic framing narrative of the song, about a California millionaire wishing for the snowy Christmases of his youth, and things took off from there. Fred Astaire, to whom he presented the song, liked it, but it ended up sung by Bing Crosby in the Crosby/Astaire film Holiday Inn. Although the hotel chain took its name from the movie, "White Christmas" was Holiday Inn’s true legacy.
The small town of Kale in Turkey has a few claims to fame. There are some lovely Roman ruins around town; one of Turkey’s major industrial conglomerates, the Kale Group, is based there; and the town surrounds the ancient city of Myra, jewel of the Lycian League. Myra was part of early Christendom, and its bishopric produced a man of renown in the fourth century: Nicholas of Bari. Little is known about Nicholas’ life; veneration began relatively soon after his death, and to this day, his remains are said to miraculously generate water ("manna of Saint Nicholas") held to have curative properties. Nicholas is one of the most important saints in several Orthodox traditions; he is the patron saint of sailors, pawnbrokers, children, and thieves. And yet in America, he is, by and large, remembered thanks to two lines from one poem, written by a farmer named Henry Livingston and for a hundred years published under the name of Clement Moore:
Some holidays are regional. Sweetest Day is celebrated almost entirely in the Great Lakes region. It’s Valentine’s Day in October, a day invented out of whole cloth by Cleveland candy manufacturers to spur October chocolate sales in an era before Halloween was a festival of cavity-inducing gluttony. Huge amounts of newspaper advertising, candy gifts to newsboys, and a promotional push by the great Theda Bara, Hollywood’s original vamp, failed to break it out of its regional shell; Buffalo and Detroit are among the few areas other than Cleveland where Sweetest Day continues to move heart-shaped boxes off the shelves. Some holidays are pointless; not even the popcorn people admit to knowing who came up with National Popcorn Day. And some are wholly spurious; the purportedly Inuit Festival for the Souls of Dead Whales probably goes over well among String Cheese Incident fans, but bears no relationship to any historically celebrated Native Alaskan holiday. But some, like World Hello Day (link via the similarly bewildered LGM), manage to combine all three qualities. Who are these McCormacks who came up with the idea? Why November 21? Why on earth do they want us to say hello to people, when holidays would ideally celebrate people leaving one another alone? And perhaps they do things differently in Arizona, but even if Esquivel, Seamus Heaney, and Olivia de Havilland all say it’s a fine idea, a holiday dedicated to introducing oneself to strangers seems likely to get one killed in, say, New York. Far better than World Hello Day would be Hello World Day, celebrating 32 years of every programming manual’s stock first example. Global diversity could be honored by recognizing our rainbow of programming languages, from Pascal to Brainfuck, and when we were done we could all sing a song, examine some art, and return to our homes without bothering anyone.
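For anyone counting those 32 years: the stock first example itself, here in Python for the sake of argument (the tradition is commonly traced to Brian Kernighan’s early-1970s programming tutorials; any hue of the rainbow above would do just as well):

```python
# The canonical first program: compose a greeting and print it. That's all.
greeting = "Hello, world!"
print(greeting)
```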
Carthage’s most famous son is best known today for his tactic of marching elephants through the Alps (or possibly for having a name that rhymed with "cannibal"). But his sharp mind for tactical advantage, including the cavalry techniques that enabled him to maintain a fifteen-year campaign against the Romans in Italy, wasn’t the first instance of his military innovation. Hannibal was the first man in recorded history to engage in biowarfare, flinging urns full of snakes at enemy ships in the Mediterranean. Use of cavalry predates the Battle of Cannae, of course; the chariot was invented around the year 2000 B.C., and the invention of the stirrup — a profoundly simple invention with drastic world-historical effects in both Europe and Asia — meant that cavalry could ride horses directly. (Other animals, such as the Carthaginian elephant or the Arizona camel, were also used.) But Hannibal’s snakes were another technology entirely: animals not as a tool of soldiers, but as weapons themselves.
It’s about that time of year again; while the rest of the world worries about the beautiful (if bloody and tragic) game, nebbishy number crunchers here in the last remaining superpower fret about Melky Cabrera’s VORP. Fantasy baseball may be widely derided as having turned a manly sport of drunken brawlers into something that only sabermetricians, lonely shut-ins, and boys named Theo care about. But this point in the season, when weaknesses start to become really glaring, offers a form of ritualized combat that most people in today’s effete society can only dream of: the chance to rook one’s friends and coworkers and then mock them mercilessly for having made the dumbest trade of the year.
Kevin Kelly is a technophile, and has been since his Whole Earth Review days; like many people who really like tools, he often finds that specialized and old-fashioned implements are the cheapest, most efficient, or most aesthetically pleasing way to get jobs done. And if his theory that "species of technology do not go extinct" is correct, then he’s in luck — those lovely speed levers and screw punches will be around for generations to come. Kelly acknowledges the counterexample of Greek fire, the terrifying napalm-like weapon used by the Byzantines to ensure their naval superiority. Various glazes, perfumes, and dyes are gone, but it’s doubtful that Kelly would consider the failure to replicate a particular shade of blue stained glass a "species extinction". He’s right that in an amazingly wealthy world, it’s possible to find almost anything. Even if we can’t duplicate a Stradivarius (ignoring the debate about whether Stradivari himself had any secret techniques), it’s possible to buy a violin in any well-equipped music shop; given enough time and money, you could acquire a brand new pianoforte or glass harmonica. The problem isn’t with Kelly’s insight; it’s with his metaphor.
When Gen. Edward Bragg nominated Grover Cleveland to be the Democratic candidate for president in 1884, he said that the American people loved him most for the enemies he had made. The DC Comics superhero the Flash must therefore set a record for the least-loved long-running character in the four-color world. It’s not just that he travels through time with a Cosmic Treadmill; that the original character’s death was the capstone on the most baroque storyline in the history of American comics; or that he was played by Dawson’s dad on a dreadful TV show. It’s the bad guys.
Isaac Asimov was one of the most successful science fiction writers of the twentieth century. His Foundation series was Gibbon’s Decline and Fall of the Roman Empire, projected into the far future and with one man attempting to turn back the waves; his "Three Laws of Robotics" led to both an OED entry and a Will Smith movie. Asimov spent the vast majority of his life within the Northeast Corridor, venturing out from the Boston-to-New-York axis only occasionally. He stopped teaching as an associate professor at Boston University in 1958, with no major research to his name (although he wrote a college textbook; given his consummate skill as a popular science writer, it was probably a better read than most). He was afraid of flying (he flew only twice in his life, both times in the course of his military service during World War II). Isaac Asimov spent the vast majority of his life staying in one place and writing. It showed, both in his often prolix novels and his staggeringly lengthy bibliography. He is, most likely, the boringest man ever to inspire a deranged Japanese death cult.
Any American child who has had to perform in a Christmas concert or gone caroling has suffered through the interminable verses of "Good King Wenceslas". Caroling is basically a medieval English tradition adopted (like so many others, with its edges filed off) by the Victorians; Sir Arthur Sullivan, of Gilbert and Sullivan fame, tried his hand at a few, but "Good King Wenceslas" is by an earlier writer, the Anglican scholar and hymnist John Mason Neale. The carol itself contains no references to the Nativity or to Christ; instead it talks about King Wenceslas as a saint and reminds "Christian men" of the virtues of charity. Wenceslas, also known as Václav, was a tenth-century Czech ruler, the Duke of Bohemia. He was raised by his grandmother, Ludmilla, a royal convert to Christianity following the proselytizing mission of Saints Cyril (of Cyrillic fame) and Methodius; his mother, however, was a pagan. The conflict between the pagan nobility and the Christian Václav was not just over religion but also about Václav’s relationship with Henry the Fowler, king of the Germans, whose son Otto would become the first Holy Roman Emperor, and the relationship between the Czechs and the Germans. Tenth-century primary sources being scarce, it’s difficult for modern historians to say what stirred emotions — religious conflict, a move by Henry to make Bohemia into a client state — but rebellion broke out during Václav’s reign. Ludmilla’s martyrdom preceded Václav’s own (at the hands of his brother) by only fourteen years. Today he’s the patron saint of the Czech Republic, and Wenceslas Square is at the heart of downtown Prague. He was, as the carol tells us, known for his good works and charity towards the poor; miracles associated with the saint include healing the lame and the blind. But it’s not a Christmas carol at all; the only reason most non-Czechs have ever heard of Saint Wenceslas is that Neale decided to set his song during the Feast of Saint Stephen — Boxing Day.
Celebrity has nothing to do with talent, even for thousand-year-old saints.
At one point in R. A. Lafferty’s story "Nor Limestone Islands", a Miss Phosphor McCabe requests zoning permission to build a structure on her plot of land: a thirty-acre pagoda, four hundred feet high and built of three hundred thousand tons of pink limestone. It will be, she asserts, real pretty, and a tourist attraction to boot. This being a Lafferty story, Miss Phosphor has an entirely sensible plan for obtaining a three hundred thousand ton limestone pagoda: she will ask her friends on the Grecian flying island to touch down and cut off a chunk. Baldasare Forestiere didn’t have any such floating friends, which is why he carved his underground gardens the old-fashioned way. There are a surprising number of these fairy castles and visionary landscapes dotting the globe. The reasons for crafting them vary. Howard Finster’s Paradise Gardens and Benedictine monk Joseph Zoettl’s Ave Maria Grotto were created as expressions of deeply felt religious belief; Simon Rodia’s Watts Towers and eccentric millionaire Edward James’ surrealist masterpiece Las Pozas seem to have been created as consciously artistic endeavors. Some, like Alex Jordan’s House on the Rock, were commercial ventures; others, like the Winchester Mystery House, are sad legacies of madness. But to me the most wonderful are those that seem to stem from the British tradition of follies, the ones that suggest someone just thought to him- or herself that it would be swell to live in a bizarro junk castle or a bottle house. Why shouldn’t one live in a home made entirely from paper products? As Miss Phosphor wrote:
In 1942, a madman sent a note to the Manhattan police. In the cut-and-paste letters of bad kidnapping comedies across the world, it read: "I will make no more bomb units for the duration of the war — my patriotic feelings made me decide this — later I will bring the Con Edison to justice". He’d delivered a dud pipe bomb to Con Ed in 1940 and left a bomb with an unwound alarm clock as a defanged trigger a few blocks from Con Ed’s offices in 1941. No one paid much attention, and George Metesky, the Mad Bomber of New York, was as good as his word; he didn’t plant another bomb until 1950. Then he began planting bombs that detonated, targeting Con Ed for their dastardly deeds, and other targets for no apparent reason at all: the New York Public Library, Grand Central Station, movie theaters throughout the city. Dr. James Brussel, the state’s director of mental hygiene, eventually helped crack the case with what may have been America’s first psychological profile of an unknown criminal. Among his conclusions about the Mad Bomber’s grievances, background, and education, he noted that F.P., as the Bomber signed himself, was probably a Slav. He was right. Everyone knew that Eastern Europeans used bombs as their weapon of choice.
Hardy’s admonishment that "if you blind yourself to the deeper issues of a tragedy you see a farce" is harsh medicine in the face of true devastation. One might need to wait for the aftermath. It won’t just be the spectacle of the Federal Emergency Management Agency, a crucial cog in the intricate machinery of the New World Order, reduced to putting on a show to defend the indefensible. (Although revelations that the head of FEMA was apparently let go for incompetence from his previous job as head of an Arabian horse association may be the punchline in the dark comedy that we’ll see about this ten or fifteen years from now, when the dead are buried and college students from across the nation again have a place to drink themselves senseless on the public streets every February.) It’s not just the carrion birds, big and small, that follow any disaster (those of a certain mindset might suggest that they represent a continuum). Desperate people and reconstruction money draw con artists like moths to a flame; even though the people who most lose out are going to be the dispossessed of Louisiana and Mississippi (with the American taxpayer in a distant, distant third place, if Congress has its way), if everything is kept in soft focus, maybe it will serve as a comedic backdrop.
When Marcus Rediker of the University of Pittsburgh decided to write a book about the golden age of Atlantic piracy, he chose the title with care. Villains of All Nations is the story of a brawling, violent culture, where life was cheap but the maimed received pensions, where infractions could be punished by summary execution but the captains were chosen by the crew (a radical doctrine that inspires swooning among anarchist visionaries to this day). They were of all nations; Rediker makes a convincing case that the astounding levels of institutionalized (and frequently rewarded) sadism of the average eighteenth-century naval vessel, no matter what flag it flew under, made piracy an attractive option for the average sailor. Pirates didn’t need press gangs (although they used them; "Black Bart" Roberts, the most successful pirate of his day, had been third mate on a slave vessel before it was captured by Howell Davis and he was offered the Articles or the plank). The wickedest city in the West, Port Royal, swelled with Dutch and English sailors, Frenchmen, Spaniards, an occasional escaped African slave. Bartholomew Roberts, William Lewis, and Henry Morgan (who led a gigantic pirate fleet to burn Panama City to the ground before entering into a fitfully successful political career that would lead to his name one day gracing America’s most popular rum) were all Welshmen.
In his magnificent study of the seedy underside of nineteenth-century New York, Low Life, Luc Sante notes the long history of the Bowery stage. The neighborhood was once second only to Broadway as a site for legitimate theater in New York. Well before the Bowery had turned into a haven for ethnic theaters (particularly Yiddish theater) and burlesque joints, dime novelist Ned Buntline struck gold with his "Mose cycle". The plays starred Mose, the Bowery Paul Bunyan, a brawling, boozing Irish b’hoy who fought whole fire engine companies and drank vats of beer at a go. A certain amount of accuracy in the role was assured by casting a neighborhood tough, Frank Chanfrau, the younger brother of the man who had beaten the real Mose Humphreys in a fight for the first and last time in 1838. Humphreys afterwards quietly decamped for parts unknown, leaving the path clear for Chanfrau to ascend to stardom simply by having the right accent and a certain jaunty way with his hat and cigar. Neighborhood loyalty was assured, and loyalty in the Bowery was no small thing; a few years later, native son actor Edwin Forrest incited the Astor Place riot (22 were killed) after feeling insulted by his rival (and, more importantly, British) Shakespearean William Macready, who was appearing uptown in a competing performance of Macbeth.
They called it the "goodgod" or the "Lord God bird"; folk etymology has it that when someone saw the ivory-billed woodpecker, they’d say, "Lord God, what a woodpecker." A wildlife artist named Don Eckelberry painted one along the Tensas River in Louisiana in 1944 and wrote that he was "impressed by the majestic and wild personality of this bird." That bird, with "its vigor, its almost frantic aliveness," was the last ivory-billed woodpecker confirmed to be seen alive until this year. For sixty years, the Lord God bird was missing and presumed dead, despite occasional reports floating in of one seen here and there at a distance. They were the so-called Elvis sightings of ornithology. But years of intensive searching (combined with some good guesses on the part of the Nature Conservancy about what lands to buy in the hopes of preserving woodpecker habitat) paid off when a kayaker saw one in the Big Woods of Arkansas in 2004. Intensive video surveillance followed, confirming the bird’s presence, and this year it was announced: Elvis had not left the building.
"There are no second acts in American life," wrote Scott Fitzgerald, before drinking himself to death in Hollywood, convinced that he had squandered his talent. But there are. "Old soldiers never die," said Douglas MacArthur, "they just fade away." MacArthur, the FDR antagonist, the man who ordered the Bonus Army attacks and eventually resigned from the Army, had gotten his second act in World War II. But even a war hero could find himself constrained; Harry Truman removed him from his command after MacArthur repeatedly argued in favor of a nuclear strike on China. He tried for a third act, positioning himself for the Republican presidential nomination in 1952 (he would eventually land on the ticket of Charles Lindbergh’s ultra-right America First Party, receiving only a few hundred votes). The last ten years of his life were quiet ones. But old soldiers don’t always go out quietly; Col. David Hackworth, laid to rest today in Arlington National Cemetery, went with a lot of noise.
In two thousand years, even the most mundane and plodding entity could acquire a great many secrets. A political power with its reach spread across a continent would acquire rather more; a political power that was also the Church Militant headed by the Vicar of Christ, rife with genuine secret societies and prone to receiving the occasional (and often officially sanctioned) divine revelation might well be the world center of esoteric knowledge. For that reason, it was somewhat disappointing when the Roman Catholic Church revealed the third secret of Fatima in 2000.
In the sixth century, Procopius, the Byzantine historian, published a book. The good bits — scandalous, even pornographic in parts — of his latest missive contained information that the average reader might have found a bit more interesting than Procopius’ previous "Justinian’s Buildings", a straightforward account of all the buildings in the eastern Roman empire for which the emperor was responsible. While the latter has proved invaluable for historians and archaeologists, the former contained such information as the emperor’s (and his wife Theodora’s) apparent supernatural origins:
"Alas," said the Roman emperor Vespasian in 79 A.D., "I feel myself becoming a god." Vespasian, a one-time courtier of Nero’s, is known mainly for his reign’s relative tranquility; his death is mostly remarkable for its placidity (he died in his home province) and his deathbed witticism. And by Roman theological standards, he was right: becoming gods was what dying emperors did. And of these emperors, the first of the Caesars were undoubtedly the greatest. Gaius Julius reshaped Rome and marched his armies down its streets, ending the Republic. His great-nephew, Gaius Octavius, later Gaius Julius Caesar Octavianus, later still Caesar Augustus (Caesar the Exalted), was the first true emperor of Rome. Special emperors deserve special recognition, which is why we have months called "July" and "August" but none called "Claud" or "Flav."
When Columbus landed on San Salvador on October 12, he had discovered what was to Europeans a brave new world. The Vikings may have been first, but they didn’t introduce the Norwegian rat, smallpox, and gunpowder to the Americas, nor did their adventures bring back quantities of silver sufficient to eventually destroy the economy of their home country, so they don’t count. Columbus’ discovery, such as it was, was epoch-making and its results so intertwined in Europe’s subsequent history that science fiction writers and bar trivia fiends could argue about counterfactuals for years; it was, however, no scientific revolution.
Will Eisner died this month, at 87 years old. His old protege Jules Feiffer ("Now he’s as bald as me!" Eisner once chortled) writes that "alone among comic book men, Eisner was a cartoonist other cartoonists swiped from." And even a cursory look at his bibliography suggests that even disregarding the first forty years of his career, Eisner was still a giant of the form. A Contract with God was, if not actually the first graphic novel to use the name, certainly the most influential of the early graphic novels, and Eisner’s Comics and Sequential Art is something like a Data Structures and Algorithms for the comic book artist; with Scott McCloud’s Understanding Comics it remains one of the definitive works in critical theory for the field. But looking at his career from the Sixties on slights his early work. Eisner did war comics (most notably Blackhawk) and Army comics (Jeep maintenance guides for the literate and semi-literate during World War II). Like most artists of the day, he drew the occasional superhero ("Wonder Man" was sued out of existence as a blatant copy of DC’s Superman; the incident is adapted, as was much of Eisner’s career in the ’40s, in Michael Chabon’s The Amazing Adventures of Kavalier and Clay), but Eisner’s most lasting creation wore nothing more memorable than a blue serge suit.
In the early fourteenth century, the settlers on Greenland were a flourishing colony, the farthest-flung outpost of Viking culture during the medieval warm period. They were a prosperous, cathedral-erecting people, and then they vanished. The colony’s extinction — and the tantalizing pieces of physical evidence the colonists left behind, some more controversial than others — has led to widespread speculation about whether any refugees made their way to North America, but everyone agrees that Greenland was populated only by Inuit by the sixteenth century. Had the colony been destroyed by Inuit, English, or cod-seeking Basques? Did some unknown plague finish them off? Jared Diamond’s theory is the one currently accepted by most archaeologists: the Greenlanders starved to death.
Pedantry aside, it’s now five years into the twenty-first century; we are living in science fictional days, the bright cold days of omnipotent totalitarianism and the shiny white cleanliness of Pan Am flights to the moon. But instead of dealing with cold fusion cars, let alone jetpacks and space babies, the world is digging the dead out of the mud of Aceh. The tsunami was an order of magnitude more deadly than Krakatoa. It will be one of the deadliest natural disasters in recorded history; depending on how many people die from the diseases and privation that follow disasters outside the developed world, it may be the worst ever to happen outside China. The shiny technopoli of our imaginations never resembled this. War and horror and even unsought, random catastrophe — and the aftermath — can be compelling, but there’s very little story in a natural disaster that strikes a part of the world mired in grinding poverty that nobody much cares about anyway. There are sales in the tale of surviving the apocalypse; how many books will be published about the efforts of the Red Cross and Doctors Without Borders and the U.S. Marines to make sure that sugar packets and clean water can get to communities of starving Indonesians so they don’t die of cholera? Appropriate technology, like providing an electricity source or a cheap water pump to a rural South Asian community, is still of vast importance in a world of leapfrogging Asian tigers. At a science fiction convention in Texas a few months ago, people asked whether there was room for cyberpunk in a world after 9/11. Their answers were insightful, but even a writer who has spent some time thinking about the problem of poor, drowning communities missed one possible answer. Bruce Sterling participated via self-parodic letter, but didn’t give an answer that surely has occurred to him: for most of history, for most of the world, war, pestilence, and famine were simply the natural order of things.
For people who don’t read People, let alone appear in it, the future looks a lot like the past.
Christmas is under siege again. Every year, the story of how Christmas is being ruined plays out in the newspapers. Much ink is usually spilled over the question of whether Christmas is too commercial, although some people hold that it’s not commercial enough; this latter group presumably believes that elbowing aside fellow shoppers to grab the last Lego set on the shelf at Target represents the height of holiday fun, beating out making snow angels and caroling. This year, however, an organized attempt seems to be underway to convince America that the religious aspects of Christmas are about to be stripped away by forces of secular oppression. The stories are somewhat thin gruel, of course, but Christmas present has been compared unfavorably to that of Christmas past for centuries.
The First World War ended amidst the sounds of church bells on the eleventh hour of the eleventh day of the eleventh month of 1918. On that final day of the Great War, British and Commonwealth forces suffered 863 deaths. In America, we celebrate the 11th as Veterans Day, a day to honor the sacrifices of all the men and women who have served in our country’s military, people like my grandfather, who lost much of the use of his arm after being hit by shrapnel in World War II. In England, however, it’s not called "Veterans Day"; it’s Armistice Day, soon to be followed by Remembrance Day. America lost 116,000 men in the First World War, but the casualty rates for Western European nations were unimaginable. England, a small nation, suffered 700,000 dead; Italy, a smaller one, lost 600,000. Mocking the French for their cowardice was briefly fashionable; over the four years between 1914 and 1918, over 1.4 million French soldiers, roughly 3% of the entire population of the country, died, largely in the blood and muck of the Western Front. Ninety thousand Bulgarian troops were killed, almost two Vietnam Wars’ worth of dead from a country the size of Tennessee. Afterwards, throughout the twentieth century, wars grew less bloody, because the science of soldiers killing soldiers and being killed in turn had reached its apex. Today, there are still a few survivors: a 108-year-old Scot, a 111-year-old American, a 103-year-old Australian. Eric Bogle (and later Shane MacGowan) called them "tired old heroes from a forgotten war". Today is Veterans Day, and I thank every American wearing a uniform. But Sunday is Remembrance Day, a time to remember Europe’s spasm of murderous self-destruction, the most gruesome war in the history of mankind, and General Sherman’s dictum about war, true in the twentieth century as it was in the nineteenth and today: "Boys, it is all hell."
The Red Sox got their championship; the Yankees got to feel what it’s like to be on the losing side of a historic collapse; the Cardinals got to watch Albert Pujols, probably the best position player in the National League, strut his stuff in games that counted (and he’ll be around; St. Louis, which is apparently packed with baseball nuts, has a legitimate excuse to say "Wait until next year"); and the rest of the world can finally, finally stop hearing about Beantown and its curse. I am not a Sox fan, although I work with Sox fans, I have lived within Red Sox Nation, and my mother is from Belmont, Mass. I know of the pain of the Bostonian baseball lover. But as sports pages spend the next few days running articles with titles like "Curse Reversed", it’s worth noting that other cities have their own problems. The 1954 Indians ran up the best record in the modern history of baseball, but Cleveland hasn’t won a championship since 1948. The city of Philadelphia hasn’t had a championship, despite four frequently excellent teams, in twenty years. And I imagine that we’re all going to have to hear about the billy goat if the Cubs ever make a run at the pennant again. Talking about curses is a fun way to distract from real issues, like the Sox’s refusal to hire black players in the 1950s (while other teams, including the Indians, transformed themselves with the new talent), or an ownership’s habit of running off players, or even the bad luck that Bill Buckner must curse every morning. But thanks to whiz-kid GM Theo Epstein (whose grandfather wrote Casablanca; I’m sure Boston thinks this is the beginning of a beautiful friendship), the wild card (which I still think was a bad idea), and a run of luck, the Sox are champions.
If the Red Sox are able to overcome Tim Wakefield’s shaky command of the plate and their frequently inept defense to score more runs than the Cardinals, they win and the Cardinals lose. If Black manages to force checkmate, White loses the game. If a salesman sells a car objectively worth $4,000 for $5,000, the buyer has lost $1,000. These are known as zero-sum games; there is only so much winnable goodness to be spread around (entries in the win column; dollars in the economic system), so for one participant to gain, another must lose. On the other hand, many games are non-zero-sum; proponents of free trade, for instance, hold that net gains result from shifting production to places with a comparative advantage, and they’ve got board games to back them up. A simple example of a non-zero-sum game is the prisoner’s dilemma. Two criminals have been arrested and are being interrogated in separate rooms. If both refuse to talk, each will be out of prison in a year; if one plea bargains and confesses, he’ll be out in six months while the other spends five years in jail; if both confess, each will spend three years in jail. The best solution for the criminals is for neither to talk, but can they really trust one another not to fink?
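The dilemma can be checked mechanically. A minimal sketch in Python, using the sentences from the paragraph above (measured in years of jail, so lower is better; the labels are mine):

```python
# Sentences indexed by (my_move, other_move). "quiet" = refuse to talk,
# "confess" = plea bargain; values are years in jail, from the text above.
SENTENCES = {
    ("quiet", "quiet"): 1,      # both stay silent: out in a year each
    ("quiet", "confess"): 5,    # the silent one spends five years...
    ("confess", "quiet"): 0.5,  # ...while the finker is out in six months
    ("confess", "confess"): 3,  # both confess: three years each
}

def best_response(other_move):
    """Return the move that minimizes my own sentence, given the other's move."""
    return min(("quiet", "confess"),
               key=lambda my_move: SENTENCES[(my_move, other_move)])

# Whatever the other prisoner does, confessing is individually better...
assert best_response("quiet") == "confess"
assert best_response("confess") == "confess"

# ...yet mutual silence (2 years total) beats mutual confession (6 years),
# which is exactly what makes the game non-zero-sum.
total = lambda a, b: SENTENCES[(a, b)] + SENTENCES[(b, a)]
assert total("quiet", "quiet") < total("confess", "confess")
```

So each prisoner’s individually rational move leads both to a worse outcome than trust would have — the whole point of the puzzle.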
London in the early years of the twentieth century feasted upon the fruits of an empire upon which the sun never set; from Burma to Rhodesia to British Honduras, the Union Jack flew proudly. Today the countries are known as Myanmar, Zimbabwe, and Belize, and the British come, if at all, as tourists. As the nineteenth century ended, the Urabi revolt in Egypt and the death of Gen. Charles "Chinese" Gordon defending Khartoum against a Sudanese uprising were a foreshadowing of the problems of empire. And the new century brought with it two surprises out of China. In 1900, a mystical secret society known as the Fists of Righteous Harmony led the Boxer Rebellion against the European devils; an alliance of eight nations (six European powers, the Americans, and the Japanese) invaded, seized Beijing, and in 1901 imposed the Boxer Protocol. Four years later, after Russia violated the terms of the protocol by refusing to withdraw its troops from Manchuria, the Japanese declared war on Russia. The world watched, shocked, as the Japanese devastated the Russian navy in the Russo-Japanese War. The war seems today like a distant memory, but it was the first time since the Industrial Revolution that an Asian nation had so soundly defeated a Western power. The English called the nineteenth century struggle for control of central Asia the Great Game; the Russians called it a "Tournament of Shadows". England’s refusal to supply coal to the Russians was a major factor in their defeat, but to Englishmen concerned with the fate of the new imperialism, the defeat of Russia only added new concerns. There was trouble stirring in the East, and that trouble was soon to gain a face.
A few months ago, one of the writers for Physics World asked his readers to send in their nominations for the most beautiful equation of all time. Physics World being a journal devoted to physics, and not economics or pure mathematics or accounting, its respondents offered a number of candidates from that field: the equation defining Hubble’s constant, the Balmer formula, Maxwell’s equations. (Strangely, Newton’s Law of Universal Gravitation seems to have been relatively unpopular, perhaps because of lingering suspicions that Newton stole the idea from hunchbacked microscope enthusiast Robert Hooke. Given Newton’s smearing of Leibniz over the discovery of calculus, perhaps the Physics World readership felt that the Royal Society, alchemy, and gambling studies were enough historical credit for the nonce.) I probably would have chosen a pure math equation, possibly the prime number theorem or one of the leading vote-getters, Euler’s identity, which elegantly ties together every important number in undergraduate-level math (e, π, i, 1, and 0). Of the nominees selected by the magazine’s readership, however, I have a sneaking fondness for the oldest of them (not counting 1 + 1 = 2). Some 2500 years ago, Pythagoras of Samos was leading a cult of philosopher-mathematicians. Dedicated to studying mathematics, purifying their souls, and avoiding beans, the Pythagoreans viewed mathematics as a window into the true nature of reality. They showed how to construct the Platonic solids, discussed the personalities of the integers, and proved a theorem known to the Egyptians that today bears Pythagoras’ name: "for a right-angled triangle the square on the hypotenuse is equal to the sum of the squares on the other two sides". The fact that a² + b² = c² is hardly novel today (although there are some notable implications), and there are a staggering number of ways to demonstrate it, including some not based on geometry at all.
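Both candidates are easy to check numerically; a quick sketch in Python:

```python
import cmath
import math

# Euler's identity, e^(i*pi) + 1 = 0, checked to floating-point precision:
z = cmath.exp(1j * math.pi) + 1
assert abs(z) < 1e-15

# And the Pythagorean theorem for the classic 3-4-5 right triangle:
# hypot(a, b) computes sqrt(a^2 + b^2), the length of the hypotenuse.
assert math.hypot(3, 4) == 5.0
```

No proof, of course; just the satisfying spectacle of every important number in undergraduate math cancelling out to (nearly) zero.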
Chinese mathematician Liu Hui proved it in the third century A.D. (had he been born someplace a bit more central to Western civilization, it might be known as the Hui theorem); President James Garfield proved it; in Gödel, Escher, Bach, Douglas Hofstadter uses the fact that a computer has proved it as the starting point for discussing artificial intelligence and the meaning of creativity and intent. A sabermetric formula for estimating a baseball team’s expected wins shares its name. And once upon a time, a New York Times editorial urged humanity to contact Mars by means of carving a gigantic geometric proof on the Siberian steppes. Any equation can be simple or well-known or meaningful, but what could define beauty better than vast triangles carved into the permafrost in order to signal our intelligence to an alien race?
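The sabermetric namesake is Bill James’ "Pythagorean expectation", which estimates a team’s winning percentage from runs scored and runs allowed. A minimal sketch in Python (the 800-run, 700-run season is an invented example):

```python
def pythagorean_wins(runs_scored, runs_allowed, games=162):
    """Bill James' Pythagorean expectation:
    expected win fraction = RS^2 / (RS^2 + RA^2)."""
    pct = runs_scored ** 2 / (runs_scored ** 2 + runs_allowed ** 2)
    return pct * games

# A team scoring 800 runs while allowing 700 projects to about 92 wins.
print(round(pythagorean_wins(800, 700)))  # → 92
```

The exponent 2 is what earned the formula its name; later refinements use exponents closer to 1.83, but the squares are where the resemblance to the triangle theorem comes from.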
There is a plaque that stands in an alley in San Francisco: "On approximately this spot, Miles Archer, partner of Sam Spade, was done in by Brigid O’Shaughnessy." And it’s true — if you go to Burritt Street, you can see the spot where Brigid, that woman who told Spade that she was "bad, worse than you could know", knocked off a man with ten thousand insurance, no children, and a wife that didn’t like him. The Maltese Falcon was filmed by John Huston, a screenwriter for Warner Brothers who had been offered the chance to direct a movie of his choice. Dashiell Hammett’s The Maltese Falcon had been adapted twice before, once as The Maltese Falcon and once as Satan Met a Lady (featuring a reluctant Bette Davis in the Sydney Greenstreet role). Huston, adapting the novel himself, thought he could do better. The signs weren’t promising: Peter Lorre was a B-movie actor in America, best known as the Japanese detective in the Mr. Moto series; Greenstreet had never made a movie; Mary Astor was the second choice to play Brigid O’Shaughnessy. Perhaps with that in mind, George Raft turned down the lead role, and a character actor named Humphrey Bogart — who had burst into prominence in another role Raft rejected — was selected to play private investigator Sam Spade. Bogart mostly played heavies, the hood gunned down in the second reel. The Falcon’s success got him the lead (alongside Falcon costars Lorre and Greenstreet) in Casablanca, which the AFI ranks among the very best movies of the twentieth century — "Two cliches make us laugh. A hundred cliches move us," wrote Umberto Eco. Bogie became an icon because The Maltese Falcon showed he could play a hero. But how heroic was Sam Spade?
Washington is no Cleveland. ESPN has dubbed Cleveland "America’s Most Tortured Sports City", with its seemingly unending string of agonizing sports losses. Cleveland, 1974, edges out Washington, 1964, on the list of horrific years for cities in sports. But after a brief flirtation with excellence and a run at the Stanley Cup, the Washington Capitals have descended into consistent mediocrity; the Washington Wizards are a joke, among the worst franchises in professional sports; and surely zillionaire adman Daniel Snyder never imagined the trials he’d endure after buying his favorite boyhood team. What a city with three mediocre teams needs is a fourth to root for, and so DC stole the Expos, giving the good people of Montreal an excuse to shout "Allez les Expos!" one last bitter time. Not to dismiss the thrilling 1925 World Series, featuring the great Walter Johnson, but was robbing Montreal of its minuscule baseball tradition worth the inevitable pundit blatherings about baseball in the nation’s capital, the shameless pillaging of the public trough that stadium deals represent? (And 70% of the city doesn’t even want the team.) Washington is a football town without a decent football team, but the Senators — or Nationals, or Grays, or Monuments — seem poorly positioned to capitalize; under the control of the 30 other major league owners, the Expos were bled down to nothing. As recently as 1994, they were a powerhouse; last year, they lost nearly 100 games. And so they will rumble into Washington. The last two teams to play in Washington struggled and ended up in Minnesota and Texas. Who knows if the third time will prove lucky? Perhaps DC should name its team not after senators or monuments but after an American hero and a team that, unlike so many others, almost never let fans down.
Were a Victorian Englishman of good breeding seeking a book for his children, he would have many choices; the explosion of publishing had trickled down into the children’s market, producing a wealth of offerings. Many were didactic and moralistic; many were repetitive and wooden; many, then as now, were both. But in 1865, there would have been only one choice that mattered: the phenomenally best-selling and critically acclaimed book by an obscure Oxford lecturer in mathematics named Charles Dodgson. Dodgson had entertained his boss’s daughters on a rowing trip in the summer of 1862, and the story he told the Liddell sisters — Edith, Lorina, and most especially Alice — was going to make him famous beyond his wildest dreams. Child mortality was a constant concern in nineteenth century London, so children needed to learn right from wrong lest they go to Hell. But despite his position as a deacon — at one time in line to become a rector — at Christ Church, Dodgson wasn’t concerned with children’s spiritual salvation; he had embraced heterodox ideas, rejecting the idea of eternal damnation. Boys of Dodgson’s class were expected to learn the manly virtues of honesty, fair play, and hard-charging competition at sports. But Dodgson, unlike Tom Brown, had loathed his time at Rugby School. He was terminally shy, uncomfortable around other boys, unathletic, a stammerer. What Dodgson liked was little girls, logic, the theater, nonsense, and the light verse he tirelessly penned under the name "Lewis Carroll".
Once, there was a company with a monopoly so entrenched, a technological advantage so large, a place in the economy so vital, that the government felt that it needed to step in to protect consumers. Actually, it was more than once; ever since Henry Lloyd started raking muck and talking about exactly how the Vanderbilts operated, making moves against the largest and most competition-free companies in the country had been a pursuit of the federal government, depending on the political climate; the Sherman Antitrust Act wasn’t only for hassling unions. And so it was that, in addition to railroads and oil companies, the federal government has occasionally seen fit to investigate information technology. U.S. vs. Microsoft gave Larry Lessig a platform, David Boies a new career, and the world quality entertainment, but it produced little else. The antitrust case launched against IBM in 1969 was such a notorious debacle that many claimed that antitrust was dead. But against the technological juggernaut of the American Telephone & Telegraph company, the Justice Department drew blood, spurred innovation, saved money for tens of millions of Americans, and crippled one of America’s most storied companies.
Before they were comic book legends, Jerry Siegel and Joe Shuster were a couple of Cleveland high school students. Their "Man of Tomorrow", a New Deal superman fighting gangsterism and leaping tall buildings in a single bound, became a huge commercial success, giving the world an icon, the Golden Age of Comics its selling point, and the industry a character to clone; when Shuster and Siegel thought they weren’t getting their fair share, they sued. Beginning in 1947, a series of lawsuits was fought over ownership of the character. Amazingly, 50 years later, the legal issues of who owned Superman were still murky. The two struggled through years of poverty before DC Comics offered them a modest annual stipend. Before he was a comic book legend, Gardner Fox was a lawyer, and he could see the writing on the wall. When editors stopped giving him work in the late Sixties, Fox resumed his one-time career of writing pulp fiction, making a successful living churning out spy and romance novels, westerns, mysteries, and science fiction under a variety of pen names until his retirement. Fox wrote everything from romance comics to funny animal books, but his lasting stamp was on the superhero genre. Frank Gorshin’s Riddler is based on a Fox script from the 1960s; he created Hawkman and the Flash; with Julius Schwartz, he almost single-handedly started the Silver Age of comics through his revivals of the Flash and Green Lantern; and he created the superhero team-up.
For most of his lifetime, only a privileged few knew about the work of the late, great Ed Wood. Wood’s foibles were obvious: the dreadful actors he dug up, jaw-droppingly appalling production values (Wood’s best known movie, the breathtaking Plan 9 from Outer Space, has a continuity and common-sense errors list that runs to dozens of entries, and that’s not even counting the famous double for the late Bela Lugosi, who died just after filming had begun), a seeming lack of familiarity with the basics of establishing a scene or a character or even a shot. But they brought him some notoriety late in his life; an alcoholic ruin, he had drifted away from the little slice of Hollywood he had carved out for himself and was making softcore movies and writing hardcore books. The Golden Turkeys rescued him from obscurity. Harry Medved (who with Randy Dreyfuss had already catalogued the fifty worst films of all time) and Michael Medved cranked out a book giving their views of the worst films in the history of Hollywood, and Ed Wood, that conflicted visionary who had brought the world Bride of the Monster and The Sinister Urge, was their selection for the worst director in Hollywood’s history and Plan 9 the worst movie. Wood’s reputation grew; film festivals and midnight screenings began to occur; people like Tim Burton, who would go on to make his best and least likely movie about Wood’s life, discovered his work and became fans. And why not? People love exploitation films. Whether they’re made by the delightful Russ Meyer (who describes his work variously as "crowded on one side by hardcore films and on the other side by major product that is very explicit" and a worthwhile opportunity to get into starlets’ pants) or the repugnant Andy Milligan, cheap movies can be dreadful (the climax of a current favorite of mine, Rock ‘n’ Roll Nightmare, features a hair metal guitarist wrestling Satan WWF-style), but they are rarely boring.
People can misplace many things: hundreds of poems (link via Bookninja), silent film masterpieces, the occasional ship’s crew. But losing an entire city still stands out as an impressive feat. Plato’s Atlantis — and his apparent original source material, the Egyptian "pillar of the sky" Keftiu — has been variously identified as the Aegean island of Thera, destroyed by a volcanic explosion, the Iberian peninsula, Antarctica, and, most recently, Ireland. (It has also been identified as the product of space aliens and a home for prehistoric life, the latter being a particularly nice lost world touch; the Lovecraftian city beneath the waves off the coast of Cuba seems to have fallen out of fashion, but presumably somewhere in the world someone is working on a book proving the relationship between Havana, Atlantis, and Innsmouth.) People who are convinced that they’ve stumbled onto the true meaning in Plato’s words seldom seem to consider the possibility that Plato was just borrowing a myth, using an invented example to prove his points. Arguments that Plato was fond of just this technique, or that their theories defy Occam’s razor, go unheeded. Fortunately, Johann Ludwig Heinrich Julius Schliemann was just such a true believer.
Believe it or not, Leroy Ripley was a ballplayer; he played semi-pro ball in Santa Rosa and tried out for the New York Giants while still a cartoonist. But he never made it to the bigs, so he had to fall back on every sports fan’s backup plan: a headful of trivia. Ripley was working as a sportswriter; his great contribution to the vernacular of his trade was to coin the phrase "Murderers’ Row", commemorating the potent bat of hard-luck Wally Pipp, the man who missed a game for a headache and saw an unknown rookie named Lou Gehrig take his place (believe it or not!). His cartoon, Champs and Chumps, arrived in 1918, a few years after an injury had ended Ripley’s dream of being a professional baseball player. It featured sports trivia: the first strip, written and drawn as a spacefiller, contained items about backwards walkers and broad jumps on ice. Ripley soon abandoned the sports conceit and sloughed off the name, just as he had abandoned "Leroy" in favor of the more athletic-sounding "Bob". His new strip, Believe It or Not (originally drawn and researched by Ripley; eventually researched by a crack staff aided by the New York Public Library and drawn by assistants and freelancers, including an adolescent Sparky Schulz), was a modest success. And then Ripley took a jab at Charles Lindbergh. He claimed that the celebrated aviator, then at the pre-isolationist height of his popularity, was not the first or the second man to fly across the Atlantic, but the sixty-seventh. Outraged letters poured in by the tens of thousands, but Ripley was right — two airships, carrying dozens of men, had made the journey before — and Ripley’s next book sold hugely. From there, Ripley moved on to radio, television, and, of course, his Odditoriums. The Odditoriums are descendants of the dime museum and the carny sideshow; as any visitor will tell you, the wax figures and portraits in toast have a certain charm, but what I think really sold tickets was the shrunken heads.
The Jivaro shrunken heads may not be Körperwelten, that Germanic corpseworld, industrial charnel house (link via Long story, short pier), but shrunken heads are plenty outlandish enough to put fannies in the chairs. The Mütter Museum has the Ripley-esque thrills of giant tumors, human horns, and monster babies, the Philippi Mummies are both creepy and unthreatening (with the appeal of the mysterious embalming fluid created to rival that of ancient Egypt), but the Jivaro heads (most originally from Ripley’s own collection) appeal to impulses that would have made Barnum proud. They allow tourists to think about hearts of darkness (no matter that they’re from South America) and the superiority of Orlando, Chicago, and other outposts of Ripleyana. They allow people to gape at freakish spectacle while masking it as mere anthropological interest. They invite the viewer to ask whether they’re real or fake, and, in either case, how they were made. And they are, in their own horrible way, iconic and even cute. Racing backwards and broad jumping couldn’t hold a candle (not even that of the Lighthouse Man) to a genuine artifact of oddity like a shrunken head. And with a showman’s insight, Ripley prospered, becoming the first cartoonist to make a million dollars, marrying a beauty queen, and devoting less and less time to his strip and more to his travels before dying young during the filming of the first season of his television show. It’s since been revived twice, and today the name, the still-running comic strip, and all forty-odd Odditoriums are owned by Jim Pattison, a Canadian billionaire. And if you read the story in the newspaper, you just might not believe it.
Mathematician and engineer Claude Shannon single-handedly invented the field of "information theory" with his seminal paper "A Mathematical Theory of Communication". Shannon spent the Second World War working on anti-aircraft weaponry and codemaking (his trans-Atlantic colleague Alan Turing had worked more as a codebreaker); after the war, he worked as a researcher for Bell Labs, and it was while engaged in this more prosaic pursuit that he made the conceptual leap that will mean that scientists two hundred years from now will know his name even if his wonderful toys, scientific juggling, and computer chess are forgotten. While working on the problem of noisy phone lines, Shannon proved that it was possible to make the probability of error in a transmission arbitrarily small, so long as the rate of transmission stayed below the channel’s capacity, simply by using well-designed error-checking algorithms. (Designing these algorithms then occupied very smart people in the telecommunications industry for the next fifty years.) Showing this required Shannon to invent the idea of a communication’s "entropy", a measurement of the information it actually carries and thus of the redundancy padded around it. Language is fragile where it is not redundant; a missed digit in a binary number could be disastrous, but every English reader can figure out who "Wm Shksper" is meant to be. From Shannon forward, information theorists looked at language as hovering somewhere between redundancy and noise. Information was surprise.
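Entropy is easy to compute for the simplest case. Here is a minimal sketch in Python of the character-level version, assuming the standard definition H = −Σ p·log₂(p); Shannon’s own estimates of English redundancy used far richer models than single-character frequencies:

```python
from collections import Counter
from math import log2

def entropy_bits(text):
    """Shannon entropy of a string's character distribution, in bits
    per symbol: H = -sum(p * log2(p)) over the observed frequencies.
    Lower entropy means more redundancy."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A string of one repeated symbol carries no surprise at all...
assert entropy_bits("aaaa") == 0.0
# ...while four equally likely symbols need two bits each.
assert abs(entropy_bits("abcd") - 2.0) < 1e-9
```

Run over real English prose, this comes out well under the 4.7 bits that 26 equiprobable letters would require, which is the padding that lets "Wm Shksper" survive the noise.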
Fay Wray died last week at 96. Her career started in the days of the silents; she starred in Erich von Stroheim’s great (if not universally acknowledged by her contemporaries) The Wedding March. But it was three decades’ worth of B-movies that made her famous. Wray starred in Doctor X (not the Humphrey Bogart-starring sequel or some other Frankensteinian mad scientist flick), Dragstrip Riot, The Most Dangerous Game, Hell on Frisco Bay. And it was one B-movie, perhaps the greatest of them all, that prompted eulogies for a 96-year-old actress who hadn’t been in the movies in over twenty years and whose period of greatest fame was seventy years behind her. But what a B-movie it was: King Kong. Kong, the first and greatest of screen giants, was inspired by naturalist Douglas Burden’s 1926 expedition to capture the Komodo dragon, which proved the existence of the gigantic lizards, long dismissed as travelers’ tales. Producer Merian Cooper had done work on nature films — King Kong is basically a nature film writ large — and he reimagined Burden’s giant lizard as an even more giant ape. Skull Island is the lost world of the picture; the first time one sees the movie, it’s almost shocking how much of the film is spent watching Kong battle the denizens of Skull Island rather than travelling to New York. Animation pioneer Willis O’Brien created a masterpiece, a stop-motion ape who seemed to be genuinely acting, genuinely interacting with the other characters (thanks in no small part to Cooper’s novel improvements to the rear projection technique used to superimpose actors on previously filmed footage). And then, of course, there were the Manhattan sequences, setting the standard for rampaging urban monsters for all time. All it took for cinema immortality for Vina Fay Wray, a ranch girl from Alberta who took drama classes at Hollywood High, was one great ape’s fist, one iconic Art Deco skyscraper, and one mighty scream. They are forever linked.
For fifteen minutes last week, the Empire State Building dimmed its lights, paying tribute to the most famous woman ever to make it to the top floor the hard way.
Almost from the very first, there were monsters stalking the silver screen. In 1895, the Lumiere Brothers invented the experience of seeing a movie with an audience, and a man at that first performance, Georges Méliès, decided to buy a camera of his own. But the Lumieres were documentarians; Méliès was a magician, and he was astounded by the tricks he could commit to film. His Le Voyage Dans La Lune, with its Wellsian Selenites (acrobats Méliès recruited from a Parisian theater), is a classic; its famous shot of the man in the moon remains instantly recognizable. But Méliès went beyond whimsy and charm; he wanted to shock, to thrill, to put the audience in his theater’s seats, even if they’d need only the edge of them. He played the devil twenty-four times, made movies about walking skeletons and mad scientists, and crept up towards the first vampire film. Georges Méliès was interested in making magic on the screen; the horror movie was simply an intersection between commerce and his love of special effects.
It was wrapped in a poplar’s roots when Olof Ohman found it, grubbing a stump outside Kensington, Minnesota, in 1898. He couldn’t read it, so he gave it to a banker in town; the banker couldn’t read it, so he sent it on to a professor at the university, and there it was translated: "After we came home found 10 men red with blood and dead." That professor, O. J. Breda of the University of Minnesota, dismissed it as a fake. The runes were wrong, the dates seemed wrong, and there weren’t any Goths and Norwegians in Minnesota in 1362 to chisel a record of their mysterious fate into stone. A few years later, Newton Winchell, a geologist working with the Minnesota Historical Society, took a closer look at the stone and Ohman’s account of finding it and pronounced the Kensington Runestone genuine. And experts have gone back and forth on the matter ever since. For instance, there were discrepancies between the runes on the stone and those that scholars in the early twentieth century would have expected from a fourteenth century expedition. Later scholars noted that a number of the oddities can be found in historical sources. Still later, a nineteenth century letter using the same slightly weird runic alphabet was discovered; what was thought to have been proof of a fourteenth century origin might simply have been a Scandinavian trade code. But if the runestone’s supporters were not discouraged when expert opinion universally held that it was fake, they won’t be satisfied by the mere appearance of a suspiciously similar set of letters five hundred years too late.
Some people cruise to Vancouver to watch whales; some people cruise to the Bahamas to get tan; some people, equipped with $20,000 and a complete disregard for scientific opinion, will cruise to 84.4 N, 141 E aboard a chartered Russian icebreaker in search of the entrance to the hollow earth (link via Les Orchard). The idea of mysterious underground civilizations reached via a hole at the North Pole dates to John Cleves Symmes’ 1818 declaration that "the earth is hollow and habitable within, containing a number of solid, concentric spheres, one within the other, and that it is open at the poles twelve or sixteen degrees"; it wouldn’t be much fun to just take a cruise to a hole, however, so passengers and crew of Voyage Hollow Earth are probably hoping to discover something more exciting — perhaps a "lush green hole", a hidden piece of jungle within the earth in the Arctic Circle. With any luck, this savage land will not contain bare-chested Tarzan clones. But the idea of a lost world filled with prehistoric creatures wasn’t original to Stan Lee; he cribbed it quite neatly from Arthur Conan Doyle’s well-titled book about a mysterious plateau in South America. Part of a series of scientific romances by Doyle, the book was a reasonable depiction of state-of-the-art Victorian paleontology. If the Yanomami people of the Amazon could live isolated, without the wheel, steel tools, or a counting system that went beyond "more than two", why couldn’t the lost valley of Shangri-La exist? (It did, almost; it was part the mountain kingdom of Tibet, not opened to the outside world until the Great Game struggle for Central Asia ran straight into it, and part Hunza, that secluded valley of green-eyed Pakistanis in the Hindu Kush.) If Australia could be home to oddities like the Tasmanian tiger — which some insist did not die out sixty years ago — why should it be so odd for the coelacanth to turn up off the coast of Africa in 1938, calmly swimming as though it were 400 million years ago?
In 1933 and 1934, the quiet towns along the Great Glen Fault were overrun by newspapermen looking for a dinosaur-like creature in Loch Ness. The creature might have existed, once, but the surgeon’s photo that fascinated cryptozoologists for decades was, alas, a fake. It would be a clever wee creature that could survive thousands of years in a populous area without anyone knowing, but the idea sold newspapers, and when reporters squinted just right, it seemed possible. And so a decade earlier, when Arthur Conan Doyle showed what seemed to be newsreel footage of dinosaurs at play, the New York Times admitted that it had no idea what to think. Revolutionary discovery? Spiritualist message from beyond? Clever hoax? It was none of them; it was the work of a genius named Willis O’Brien, and they called it The Lost World.
"I believe the moon is rich in gold!" That’s how the script to the great Fritz Lang’s tension-packed 1929 silent film Frau im Mond (Woman in the Moon) begins. As is traditional when crackpot visionaries make bold pronouncements in science fiction movies, the philistines in the staid scientific community jeered, but Professor Manfeldt proved them wrong by constructing an atomic rocket ship and voyaging to the moon (with his chief engineer, a rugged German pilot, a cunning American financier, his daughter, and a young stowaway, neatly rounding out the standard spaceship travelling pack). Lang’s movie is surprisingly technically plausible for 1929 (including use of a recognizable countdown to liftoff), thanks to his technical consultant, rocket scientist Hermann Oberth. Oberth was, in fact, a crackpot visionary jeered at by philistines; his doctoral thesis had been rejected, the idea of rockets in space being too ludicrous to satisfy his committee. But Oberth’s young assistant, Wernher von Braun (who would create both Hitler’s storied V-2 and NASA’s Saturn V booster rocket), would live to see the professor’s critics proved wrong when man landed on the moon. There was no breathable atmosphere, unlike in Lang’s script, but there were no Selenites, either. The Selenites were the beings that lived on the moon in H.G. Wells’ The First Men in the Moon (in which the moon is reached using the motive power of an anti-gravity sphere); the term is more generally applicable to any race of moondwellers, such as the terrifying Kalkars, the savage race of moon people warring with the civilized U-ga and promoting disquieting racial stereotypes in Edgar Rice Burroughs’ The Moon Maid; the earthlings in this one reached the moon via a rocket ship, the Barsoom, but the aliens all live in the moon’s interior.
Cyrano de Bergerac travelled to the moon in a much more civilized fashion, using a sort of homemade glass balloon; Baron Munchausen’s method was simpler yet, involving simply being blown off course by a monstrous gale. Munchausen’s moon was a remarkable place, with fleas the size of sheep and deadly radishes. When the Eagle landed in the wonderfully named (thanks to selenographer Giovanni Battista Riccioli) Tranquility Base, fulfilling Kennedy’s promise to land a man on the moon before the decade was out and sparing Nixon the need to announce the astronauts’ deaths, the astronauts, of course, found no such thing. They found a lot of rocks and dust. Is it any wonder that conspiracy theorists, Hollywood screenwriters, and the Fox network are so eager to assert that the Apollo landings were faked? Unless Buzz Aldrin bounced around in a sound studio somewhere (a touchy subject for him at this point), Charles Fort was wrong, the astronauts’ footprints will be the only transient lunar phenomena, and the only man in the moon is the one we tell ourselves we see.
We may never know who killed union boss Jimmy Hoffa or where his body ended up (Giants Stadium is a more romantic option than a gravel pit in Michigan). The Monster of Florence may remain forever anonymous, despite tantalizing hints of wealthy libertine occultists getting their kicks through ritual murder. And despite the role it played in fanning the flames of the "Popish Plot" hysteria in Restoration England, leading to the martyrdom of Saint John Wall, and the attention of no less a scholar of mysteries than John Dickson Carr, the people who killed Sir Edmund Berry Godfrey probably got clean away. On the other hand, history has no doubt who killed Captain James Cook. He died at the hands of angry Hawaiians, bludgeoned to death while his boat was in Kealakekua Bay in February 1779. He was one of the most celebrated and travelled men in the world, and his death was recorded by his crew; the question they couldn’t answer was why he died.
Laura Ingalls Wilder’s Little House series has delighted generations of readers; Wilder, the daughter of homesteaders, a prairie girl and a schoolteacher at fifteen, was a heck of a storyteller, even if (as many suspect) the books were largely ghostwritten by her daughter, journalist and novelist Rose Wilder Lane. What the television show never quite communicated was that Ingalls Wilder’s story was largely one of privation; during the winter of 1880–81, shortly after the term "blizzard" was first applied to massive snowstorms, a breathtaking 132 inches of snow fell in the Dakotas (and Minnesota didn’t have it much better, although perhaps some took it as a point of pride). It was, without a doubt, damn cold in Laura Ingalls’ little town of De Smet during that long winter. Life on the prairie was rough. Easterners travelling to California didn’t call the plains states the Great American Desert for nothing. Food was scarce. With wood hard to come by on the Great Plains, the pioneers built sod houses (and other sod buildings), but the wind howled constantly, sometimes literally driving people insane. While some pioneers sang "Home on the Range", others sang "Dakota Land":
Looking at early maps of North America can be disconcerting; in the sixteenth century, only the bare outlines of the shape of the continent were known. But the explorers of the New World knew what they would find: dragons. Not actual dragons, necessarily; only one map ever read Hic sunt dracones, and dragons were more properly situated in Asia. But they would find strange and unusual things, things they nonetheless had a context for. Maps of the world at the time of Columbus' journey were far more detailed than those of the days of Herodotus, the Greek "father of history"; European exploration had mapped out much of the Atlantic coast of Africa, for instance. Francis Bacon's dictum that empirical evidence should be believed before received wisdom had yet to take hold, however, so the leading minds of Europe strenuously tried to fit the new continents into classical knowledge. It was a conflict between the power of tradition and the shock of discovery. John of Plano Carpini, the Papal ambassador to the Mongol court, had written of his voyages and the wonders he had seen, but when Marco Polo, the Italian merchant, described fanciful nonsense like furry chickens and nuts in Sumatra as big as a man's head that were filled with a clear liquor, few believed him. Herodotus had described the flying snakes of Arabia and gold-digging ants of India. Ptolemy had told of the Mountains of the Moon. But those were ancient times.
Ovid didn't even have a guess; when his Metamorphoses told the story of Phaeton, the son of Helios, and his fateful joyride in his father's blazing chariot, he gave details about the sun's descent and the effect it had on the rivers of the ancient world:
Samuel Slater disguised himself when he came to Rhode Island because leaving the country meant breaking the law, becoming an intellectual property pirate; the crown, seeking to protect English industrial supremacy, restricted the ability of skilled craftsmen like Slater to leave the country. The patent system was originally designed to help people make money from an idea without forcing them to keep it secret like the formula for Coke or the blueprints for the eighteenth-century spinning mill. But patents were designed for actual mechanical devices and chemical formulae; simple ideas were never supposed to have the same level of protection. Ideas are a dime a dozen; if the great Theda Bara had been allowed to sue Myrna Loy for stealing her trailblazing "vamp" persona, Loy's career might never have gotten off the ground (and today almost all of Bara's hugely popular films, including her Cleopatra, are lost). If Moxie had been able to tie up the idea of sweet, fizzy drinks, the soft drink ecology would never have flourished in the New World. And if one person — even if that person was a litigation-prone mercenary looking for trouble in Afghanistan — could have exclusive rights to stories about nuclear terrorists being thwarted by cocky Green Berets who never played by the rules, late-night cable would be a lot more barren. But millions of dollars can be made from ideas, so it's not surprising that people try to cash in. Small software developers who feel (rightly or wrongly) that Apple Computer, which arguably stole the rudiments of modern mouse-and-windows computing from Xerox, has bullied its way into their territory often wonder what can be done about it; can their breathtakingly clever idea be patented? It shouldn't be, but ever since the Lotus "look and feel" case, which established that copyright alone couldn't protect the way a piece of software operated, companies have been trying.
The gates were unlocked when the courts ruled that business methods could be patented; since then, it's been open season. Crustless peanut butter and jelly sandwiches and using a laser pointer to entertain your cat are now patented with the full weight and majesty of the United States Patent and Trademark Office. Is there a better way? A small band of entertainers in England came up with one in the 1940s; to ensure that they didn't violate anyone else's intellectual property, they created a voluntary central repository that newcomers to the industry could search and which could settle disputes over precedence. The Clown Repository paints portraits on goose eggs to ensure that no two clowns look alike. Details of the Repository's enforcement mechanisms are sketchy, but surely restraining orders and million-dollar lawsuits are as nothing compared to the threat of a pie in the face and the tears of a clown.
Today, with great fanfare, Alan Greenspan and the Federal Reserve Board announced that interest rates would be raised. Greenspan's role in the world would have seemed remarkable to any number of past generations. Thirty years ago, Greenspan's visibility might have been surprising, but Paul Volcker's importance in breaking the back of the inflation of the Seventies (and Greenspan's near-deification during the Nineties boom) vastly improved the Fed Chairman's visibility. Eighty years ago, it might have shocked government officials that interest rates would be used as a tool of economic policy to manage growth, but we're all Keynesians now. Two hundred years ago, the very idea of a national bank — a series of federal reserve banks, in fact — was a controversial one; Andrew Jackson fulfilled his election promise by killing the Second Bank of the United States (the first's charter having expired in 1811), and the reserve bank system was not established until the twentieth century. And a thousand years ago, the mere act of charging interest could have gotten Alan Greenspan excommunicated or worse; following the various dictums of Aristotle, Leviticus, and the Koran, usury was prohibited.
In its prime, US Route 40 was America's Victory Highway. It ran from New Jersey to California, its dusty signs and roadside attractions — from the muffler men to metaphors rising in the desert — providing an American iconography. Along with its sibling, Route 66, Route 40 provided an echo of the American frontier; the nation's families could pile into their tailfinned chariots and stick a thumb in Frederick Jackson Turner's eye, if only for the space of a vacation. But at the end of June, 1956, Dwight Eisenhower created the interstate highway system with the stroke of a pen, and Route 40 began its slide into irrelevance. In 1964, California decommissioned Route 40; it was a national highway no longer. The diners and flyspecked tourist traps remained to be examined by students of Americana, but the towns that relied on the commercial traffic of Route 40 were doomed. Brownsville, Pennsylvania, is a ghost town in a way that not even distressed steel towns like Aliquippa can quite match; its attractive downtown has simply been boarded up and abandoned, perhaps in hopes that its stretch of US 40 will one day hum with commerce again. But when cities lose their commercial reason for being, when jobs go away, they tend not to come back. Mining towns out west and in the Far East don't have much reason to exist once the mines aren't being worked. Diesel engines killed the need for pump towns along the railroad lines of rural America. Engines have been the lifeblood of Motor City, USA since Buick started making them there in the nineteenth century, but as industry has moved away from Detroit and Flint, little has sprung up to replace it. Ruins can coexist with a thriving city, but the slow depopulation of postwar Detroit (dramatically accelerated by white flight after the riots of the '60s and '70s) has left a hollow city, and neither the population decline nor the disappearance of the city center has stopped yet.
A thousand years ago, the Fremont people vanished, leaving their ruins — whole towns, in some cases — behind them. Five hundred years ago, the Khmer empire began to disintegrate, with the amazing ruins of Angkor Wat remaining as a remnant of its power. Living cities don't have abandoned lots taken over by farmland, because living cities have a reason to exist. When the reason goes, the people go, and the weeds and the rats move in. Without their purpose, cities revert to jungle and prairie. If Detroit's fate is to be that of Angkor Wat and Pumpville, Texas, what stories will we tell ourselves to explain where it went?
Many people can lay claim to the title "father of the computer": John Mauchly, Atanasoff and Berry, even Charles Babbage, who died in 1871. But British mathematician Alan Turing's claim is as good as any. Turing's university studies under Max Newman led him to the "halting problem", the question of whether one could tell in advance whether a given algorithm, set loose on a formally defined problem in mathematics, would finish its work in a finite number of steps. For instance, an algorithm could be written to test every possible factorization of a number to determine whether it was prime, which would take a large but finite number of steps. An algorithm attempting to find an odd perfect number, however, might run forever or it might not; while mathematicians haven't been able to find one and can demonstrate that none smaller than 10^300 (a number many orders of magnitude larger than the number of atoms in the universe) exists, they haven't been able to prove that there are none to be found. Solving the halting problem for the odd-perfect-number-finding algorithm (which can be written in just 43 characters of C code) would prove the existence or non-existence of such a number, and a general solution would enable mathematicians to settle an enormous number of currently intractable problems. But through a jujitsu similar to that used by Gödel in his incompleteness theorem, Turing demonstrated that it couldn't be done. There is no general solution to the halting problem; we can wind up a mathematical clock without knowing whether it will wind down. Turing's proof led him to develop the concept of the Turing machine, a sort of idealized computer (back when the term "computer" was used to describe people with adding machines), the building block of theoretical computer science. But a hugely important proof, even one linked to Gödel's (beloved and often misunderstood) theorem, doesn't always result in one getting namedropped in Neuromancer and used as a character by Neal Stephenson.
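The odd-perfect-number search makes the dilemma concrete. Here is a sketch in Python rather than terse C (the function names are my own, invented for illustration): a number is perfect when it equals the sum of its proper divisors, and a loop hunting for an odd example is a program whose halting status nobody on Earth can determine.

```python
def divisor_sum(n):
    """Sum of the proper divisors of n (every divisor less than n)."""
    total = 1  # 1 divides every n > 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            total += d
            if n // d != d:   # count the paired divisor once
                total += n // d
        d += 1
    return total

def is_perfect(n):
    """A perfect number equals the sum of its proper divisors."""
    return n > 1 and divisor_sum(n) == n

# Checking even numbers halts quickly and finds the classics:
print([n for n in range(2, 10000) if is_perfect(n)])  # → [6, 28, 496, 8128]

# But nobody knows whether this loop would ever terminate,
# and Turing proved no algorithm can tell us in general:
# n = 3
# while not is_perfect(n):
#     n += 2
```

Running the commented-out loop is a bet against three centuries of number theory: it halts if and only if an odd perfect number exists.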
Alan Turing’s fame in popular culture comes not from his mathematical work but from a bit of philosophy.
Guy Ballard went up Mount Shasta a seeker after wisdom; he came down possessing the knowledge imparted to him by those who had manifested the "Luminous Essence of Divine Love." Ballard, who had long studied the Theosophical writings of Madame Blavatsky and her followers, adopted the pen name Godfre Ray King and wrote bestselling books about what the Ascended Masters atop Mount Shasta had told him; first in Los Angeles and then in Chicago, New Jersey, and Santa Fe, Ballard and his wife Edna began spreading word of the three-fold truth:
The Detroit Pistons just beat the Los Angeles Lakers in the biggest upset in the NBA finals in the last thirty years. Sharp-eyed observers may note that while there are plenty of pistons in Motor City, there are no lakes in Los Angeles. (There is, however, a river; the description of it as "treated sewage and oily street runoff" can be used as a metaphor depending on how one feels about L.A.) How did the most successful professional sports team in the City of Angels get such an unrepresentative name? They brought it with them; like so many Angelenos, the Lakers are immigrants. The Minneapolis Lakers played in the land of ten thousand lakes between 1947 and 1960, before leaving for the lights of Hollywood. The Lakers kept the name upon arrival in California. The Utah Jazz, similarly, get their name from the fact that they originally played in New Orleans, not from a secret fondness for Max Roach out on the Great Salt Lake. Team names used to simply be nicknames. The Pittsburgh Pirates got their name thanks to their success in snatching successful players from other clubs; the dreadful Cleveland Spiders became the Naps, after their star Napoleon Lajoie, then the Indians in reference to Louis Sockalexis, a Penobscot Indian from Maine (or, quite possibly, as a cheap marketing gimmick); the storied Brooklyn Dodgers were first the Trolley Dodgers, then briefly the Brooklyn Robins. But names are more stable now; basketball's Washington Wizards were unveiled in 1996, the owners having sensibly decided that the murder capital was a poor choice to host a team named the Bullets (and having resisted the urge to go back to calling them the Zephyrs). The Houston Colt .45s became the infinitely less bad-ass Astros in 1965. As a general rule, teams keep the same names; major league sports are a big business, and the brief uptick in sales when a name switches might not make up for the long-term damage to the brand. (How many more Colt .45 jerseys would sell throughout America?)
So even generic names that bear no real relationship to a team or its environs — the New Orleans Hornets, the Tampa Bay Lightning — have a long and happy life ahead of them. Or they will until the teams they are attached to move elsewhere: wherever Macon's former minor-league hockey team is now, I wish them well, but when you're not in Macon, you're just not a Whoopee.
Beaver stole fire from the pine trees. Man stole fire from Kondole the whale. And Prometheus stole fire from the gods. But every advance in illumination since has resulted more from hard work and a little bit of luck. Michel-Eugène Chevreul invented the clean-burning candle after years of studying the chemical composition of animal fats. William Procter, candlemaker, and James Gamble, soapmaker, were two young craftsmen in the bustling river town of Cincinnati when they met; their wives were sisters, and on the back of that connection, a vast commercial enterprise was built; as the more interesting historical bits of Richard Powers' Gain lay out, the waste products of soap could be used to make candles and vice versa. Procter & Gamble and their cousins (and the nineteenth-century invention of the candle-molding machine) helped drive the practice of chandlery out of the home and into the factory. The discovery of spermaceti, the substance that gives sperm whales their name, provided gainful employment to Melville's whalers, as they set forth from Nantucket and the rest of New England looking for sperm whales. Whale oil was used not just in lamps but also to make hard, sweet-smelling, long-burning candles. And when the Civil War disrupted the whaling industry, the men drilling for oil in Pennsylvania created a market beyond patent rheumatism cures; the invention of kerosene and kerosene lanterns vaulted fossil fuels ahead of whale oil as personal lighting technology. Then came Edison. Although the technology behind the lightbulb predated Edison's team, they created the first long-lasting carbon filament, capable of lasting for hundreds of hours (and more; one bulb has lasted for a hundred years, meaning that Phoebus should be around to turn the poor thing off any day now). But the incandescent bulb generates light as a side effect of its electrical resistance; it might be better thought of as a heat source we can read by.
Newer technologies, such as compact fluorescent bulbs, which are some four to six times more energy efficient, have become more popular; the energy savings that would be realized by a nationwide conversion are immense enough that Lawrence Berkeley Laboratories, more usually associated with nuclear physics and supercomputer engineering, has a design lab for better lamps. But there's life in the incandescent bulb yet; bulbs using longer-lasting, more efficient "nanotube" filaments may hit the market within five years. That's five more years, at least, before the world faces a crisis even LBL's scientists — even Gyro Gearloose — might not be able to solve: when the incandescent lightbulb goes, what icon will the world use to indicate a bright idea?
In the fourth century, the former bishop of Avila, a man named Priscillian, was executed. He was a heretic; his beliefs were largely Manichaean, he asserted the accuracy and doctrinal relevance of apocryphal scriptures, and his claim that lying in service of a greater good was no sin roused Augustine to issue a rebuttal. These were not the crimes he was executed for. The crime that earned Priscillian of Avila the sword in the year 385 was sorcery. Priscillian was killed for witchcraft, and he would not be the last. In 1324, for instance, Alice Kyteler, the Witch of Kilkenny, escaped a death sentence with the aid of her family, but her maid, Petronella of Meath, became the first accused witch burned at the stake in Ireland. Pope Alexander IV had ordered investigations limited to those witches who were also heretics; by 1398, theologians had decided that all witchcraft was innately heretical. In 1486, two experienced witch hunters published a manual, the Malleus Maleficarum (The Hammer of Witches), and a new day dawned for witch hunters across Europe.
A handful of people who got out of bed at dawn this morning and strapped on welders' goggles got to witness something no living person had seen: the Transit of Venus, a sort of miniature eclipse (as Venus appears to be much smaller than the moon from an earthbound perspective, the vast majority of the sun is not blotted out). Last observed in 1882, the Venus transits were once major events. Seventeenth-century observers became astronomical icons; eighteenth-century observers travelled the world in attempts to see what could be seen; in the nineteenth century, Congress put aside hundreds of thousands of dollars to finance the American Transit of Venus Expeditions and John Philip Sousa commemorated it with a march. But as the near-neighbor who writes Slate's "Chatterbox" notes, today's transit was largely a non-event. The nineteenth century was the high point of America's fascination with the sky; it was the dawn of American universities on the European model, with observatories springing up at colleges and even high schools. Drawings of the sun had been crucial pieces of data ever since Galileo first recorded a sunspot, and the 1882 transit was to be the moment when the art of photography could be wed to the science of astronomy. And it was; reams of data were recovered, and the accurate measurements allowed astronomers to determine the parallax of the sun, and thus its distance from the earth. But that was a hundred and twenty years ago; we now know the distance of the sun from the earth. Men have walked on the moon and robots have discovered evidence of water on Mars since then. In most cities, it's too bright at night to see the stars. The last time a large group of people eagerly awaited a rare yet recurring astronomical event, it ended poorly.
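The arithmetic the expeditions were chasing is simple in outline: the solar parallax is the angle the Earth's radius subtends as seen from the sun, so once the angle is measured, the distance falls out of basic trigonometry. A quick sketch, using the modern parallax value rather than any figure from the 1882 expeditions:

```python
import math

earth_radius_km = 6378.1        # equatorial radius of the Earth
solar_parallax_arcsec = 8.794   # modern value of the solar parallax

# sin(parallax) = earth_radius / distance_to_sun, so solve for the distance
parallax_rad = math.radians(solar_parallax_arcsec / 3600)
distance_km = earth_radius_km / math.sin(parallax_rad)

print(f"{distance_km / 1e6:.1f} million km")  # → 149.6 million km: one astronomical unit
```

Shave a few hundredths of an arcsecond off the measured angle and the computed distance swings by hundreds of thousands of kilometers, which is why the nineteenth-century astronomers wanted photographic plates instead of sketches.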
So the event was largely ignored: a few brief mentions in the newspaper, a flurry of announcements and information from astronomy fans, and the Chatterbox family gathered out on a parking garage in suburban Washington to see a twice-in-a-lifetime event. But whenever there's a scientific anniversary or event, at least one highly influential group can be counted on to successfully bring it to the attention of millions of members of the general public, along with references to expert opinion about just why it's important — as long as it can be represented using the letter "O". John Philip Sousa would be proud.
Rudyard Kipling was perhaps the most popular poet in England at the dawn of the twentieth century. His Barrack-Room Ballads had been a tremendous success upon its publication in 1892, and it propelled him to the top of the list of possible successors to the post of Poet Laureate upon Tennyson's death; Kipling declined, and in 1896, the post was filled by minor pastoralist Alfred Austin. In 1907, he won the Nobel Prize in literature, the first English writer to do so. His reputation as a major writer has been in decline ever since. T.S. Eliot referred to him as a writer not of great poetry but of "great verse", and George Orwell's typically thoughtful 1942 essay (link via Stuttercut) on Kipling's place in history responded by calling Kipling a "good bad poet".
Rugby players, as we know, are a tough bunch. As the bumper sticker says, they eat their dead (even if the film version used tofu legs); the modern sport was born when William Webb Ellis, a student at the Rugby School in Rugby, England, violated all the rules of football and civilized conduct by picking up the ball and running with it. The play soon became accepted, but one imagines that Webb Ellis was promptly and soundly thrashed. Rugby students didn't have to eat their dead, but one imagines they were a tough bunch as well, formed under the tutelage of Thomas Arnold, an ordained deacon in the Church of England, who preached that manliness was Christian and Christianity manly. Arnold, the headmaster at Rugby during the early nineteenth century, was the chief proponent of what came to be known as "muscular Christianity" (the phrase comes from a review of Tom Brown's School Days, a schoolboy novel about Arnold's Rugby). Arnold felt that the church of his days was effete, emasculated, aesthetic. His Jesus, the one he taught his charges about, was no rose of Sharon or lily of the valleys. Arnold's Jesus was representative of schoolboy virtues: athleticism, honesty, giving things the good college try. His was the Jesus of the moneychangers in the temple and the Harrowing of Hell.
The idea that suddenly, out of nowhere, a long-lost uncle would pass on a fortune has inspired artists from Frank Capra to Jamaica Kincaid to those interested in less weighty fare. Millions of people want to believe it, which has been the basis for a hundred con games, many notably successful. How much better is it if the rich uncle left not only money, but a crown? So if fashion photographer Phillip von Hessen were a savvy New Yorker, he'd have been hanging on to his wallet when he was contacted by the Helsingin Sanomat, a Finnish newspaper, advising him that he should come to visit the country that might have been his. Finland only gained independence in 1917, and its constitutional planners briefly considered appointing a monarch; their neighbors Sweden and Norway were kingdoms and seemed to get along fine, and naming Kaiser Wilhelm's son-in-law, Friedrich Karl of Hesse, to the throne could counterbalance the looming threat of Russia. Phillip von Hessen, Friedrich Karl's grandson, is the man who would be king of Finland, had the German defeat in World War I not forced Finnish politicians to forego their plan and create a republic. For every Don Juan — exiled, stripped of his title, and eventually returned to Spain to watch his son ascend to the throne — or Prince Ra'ad of Iraq, who still has a faint hope of another Hashemite restoration (and has gathered a few supporters, to boot), there are a dozen members of the Greek royal family who are never going to have a hope of returning to the throne. (Rumor has it that the student directory at Brown University listed Prince Nikolaos' last name as "Ofgreece".) And for every Greek prince in exile, there are probably a dozen or more people like Peter Pininski, who discovered that he could trace his ancestry not only back to Polish nobility (Pininski is a count), but to Bonnie Prince Charlie, whose Stuart line was believed to have died out in 1807.
Pininski, unlike his celebrated ancestor, shows no sign of plotting the overthrow of the British government. And if computer scientist Mark Humphrys' theory is right, for every Pininski, there are a thousand of the rest of us, each distantly descended from a Hapsburg or a Plantagenet; Humphrys and colleagues assert that just as almost every actor can be traced to Kevin Bacon, almost every person of European descent can trace their ancestry back to a member of the nobility. The real-life man who would be king, a Pennsylvania adventurer named Josiah Harlan, was driven from Afghanistan (where he had become a tribal leader) when the British arrived; he went home, vented his spleen in a memoir which was never published, then left for California, where he began practicing medicine without a license. History does not record attempts to regain his throne. The last prince of Korea's Joseon dynasty, Lee Gu (link via Stavros), lives in Japan and cannot speak Korean, but he has spent his life pining for a homeland he never knew. What need do we have for fantastic royalty, the Royal and Serene House of Alabona-Ostrogojsk and all the rest? If Mark Humphrys is right, a thousand secret kings walk down every city street, untroubled by the weight of their crowns.
Chung Ling Soo was a New Yorker by birth, a Scot by ancestry, and "the Original Chinese Conjurer" by choice. The magician, born William Robinson, gained prominence thanks to a feud with an actual Chinese stage magician, Ching Ling Foo (né Chee Ling Qua; the good people of New York in 1898 were not sticklers for authenticity in their Chinese names). Robinson basically stole Foo’s act; Foo called him out (a practice magicians such as Harry Houdini engaged in frequently in the nineteenth century; it was often good for business for both parties, and beyond that, magicians were high-paid entertainers who often poached tricks from one another and carried grudges), but then backed down, leaving Robinson to perform as Soo. The newspapers of the time either believed him or played along, and so he was the undisputed Chinese stage magic champion of New York. But Robinson would be as forgotten today as the great Tampa were it not for the manner in which he died; Bill Robinson, Chung Ling Soo, the Original Chinese Conjurer, botched the bullet catch.
The sleepy town of Newport, Rhode Island, is home of the Naval War College and dozens of spectacular robber barons' mansions dating from its days as a summer resort. It doesn't seem like it would have been the site of rioting Allman Brothers fans, but in 1971, at the Newport Jazz Festival, it was. George Wein invented the contemporary music festival with his Newport Jazz Festivals, dating back to 1954. By 1969, he had branched out to folk festivals and some rock music, and was looking to try something new, so he booked Frank Zappa, Led Zeppelin, and a little-known Southern blues-rock band featuring Gregg and Duane Allman. (Jimi Hendrix was turned away, as there was no room on the bill.) By the summer of '71, the Allmans were the hottest band in the country, and all hell broke loose. Wein decided never to have anything to do with rock music again. But the Newport Folk Festival's place in rock history had been cemented six years earlier. In 1965, the folk fans of America booed and the folk musicians of America went apoplectic. The Newport Folk Festival, the most visible stage for commercially successful folk music, was where Bob Dylan went electric.
Were it not for two accidents of history, Ibn Saud and his family might have fallen into the deepest recesses of the history books. But in 1915, Ibn Saud entered into an alliance with the British against his clan rivals, the Rashidi, who were supporting the Turks. The well-timed backing of the nation that would become the dominant power in the Middle East after the First World War led to increasing influence and eventually a direct confrontation with the Hashemite dynasty that led to Saudi control over the holiest city of Islam, Mecca. The second accident was less political and more geological: in 1938, Aramco, the Arabian American Oil Company, struck a gusher. In the nineteenth century, John Rockefeller, son of a patent medicine salesman, got his start in the oil boomtowns of Pennsylvania; Quaker State and Pennzoil got their names from the original center of the world petroleum industry, where a former railroad conductor named Edwin Drake drilled the first oil well in Titusville. Rockefeller would come to dominate Pennsylvania's oil industry, and as later strikes occurred, in Oklahoma, Texas, California, his power and influence would grow. He became America's first billionaire, the richest man in the world, the sort of man people build conspiracy theories around. But none of Rockefeller's Standard Oil strikes matched the great mother of all gushers lying beneath the sands of the country called Saudi Arabia.
P.T. Barnum was a politician, a circus promoter, a cheerful fraud. If his conscience ever bothered him, history does not record it, but perhaps he salved his soul by attending church. Barnum was a Universalist; Tufts University was the first Universalist college in America. And so, as a wealthy promoter of his sect (Barnum’s tract, Why I Am a Universalist, sold 100,000 copies in the first three years after its publication in 1890), Barnum provided Tufts University with Barnum Hall. Being P.T. Barnum, however, he filled the hall with an elephant: Jumbo, whose very name was synonymous with size. Jumbo had been hit by a train in 1885; never one to turn down an attraction, Barnum had Jumbo stuffed and took his body on a world tour. When he returned home, he donated Jumbo and a number of other effects to Tufts, where they quietly sat for decades. (Students placed pennies in Jumbo’s trunk to ensure luck on tests.) In 1975, Barnum Hall burnt down; a quick-thinking university employee managed to salvage some of Jumbo’s mortal remains, however, and today Jumbo retains a place of honor in a school safe, where his ashes are occasionally taken out (in their ceremonial peanut butter jar) and given to coaches for luck before the Tufts Jumbos take on rival teams. It’s a hard life, being a dead celebrity.
"Voting is for old people" hasn't reached the hallowed heights of banal t-shirt slogans; it's no "I ♥ NY" or "Virginia is for lovers." It's not even "Pennsylvania has intercourse". But when the commodifiers of the dubiously hip at Urban Outfitters offered the shirt for sale, hoping to make a few bucks off camp, it triggered a very brief media frenzy. The fact that the founder of Urban Outfitters is a semi-closeted conservative, a financial backer of Sen. Rick Santorum, gave the story a little bit of bite, but mostly the concern seems to have been that someone was attempting to make money from teenagers willing to drop thirty bucks on a cheaply made shirt to advertise their apathy. But advertising has always been the t-shirt's job; although the plain white t-shirt spread into America thanks to the U.S. Navy (with assists from Marlon Brando and James Dean), it really came into its own when someone realized it could turn the wearer into a walking billboard. People have been proclaiming their political identity through their t-shirts since at least 1948.
Since 1992, a small band of scholars has dedicated itself to the study of an obscure language. They've translated Hamlet and the Bible into tlhIngan Hol, their field of study. And although the language is extinct — there are no native speakers of tlhIngan Hol — their efforts and the efforts of a few sympathizers have ensured that millions of people throughout the world have heard at least a few words of it. Documentarians have even put together a movie about their efforts. Many mainstream linguists would kill for the visibility of the brave few at the Klingon Language Institute. Unless Michael Dorn knows something he's not telling, Klingon is, of course, an artificial language, the project of linguist Marc Okrand.
It must be an odd and amusing moment for members of the American Dialect Society when they select their annual word of the year. It's a brief moment of fame, a chance to crop up on the front page of the Style section and in filler segments on "Morning Edition", but the words and phrases they select are so often ephemeral: "Mother of All", millennium bug, chad, information superhighway. The coining of words and phrases that last is scattershot; when Sheridan wrote his punningly-named Mrs. Malaprop into The Rivals, did he think that he would be introducing an entire neatly packaged concept into the English language? Lewis Carroll's portmanteau words were an amusing game (from a man who loved games), but "chortled" (part "snorted", part "chuckled") well and thoroughly stuck, providing translators with something to chew on. When Gen. Ambrose Burnside gave his barber some most unusual instructions, did he think that an entire category of facial hair would be named after him? And then Burnside was denied even that chance at lasting fame among anyone other than Civil War buffs who recall his failures at Fredericksburg and Spotsylvania and Brown University students mooning around Swan Point Cemetery, when someone misunderstood "burnsides" as "sideburns." San Francisco Chronicle columnist Jon Carroll is a tireless popularizer of mondegreens (the term is from Sylvia Wright's mishearing of "The Bonnie Earl of Murray"), misunderstood and frequently hysterical song lyrics of which Jimi Hendrix's "Excuse me while I kiss this guy" may be the best example. But the phenomenon is an old one, and responsible for words much longer lasting than any sniglet. "Adder", "apron", and "umpire" were originally "nadder", "napron", and "noumpere", just as "nickname" was "ekename" before that pesky indefinite article got in the way.
Hoppin’ john was quite possibly "pois pigeon", stories about John hopping around the table notwithstanding (unless, of course, it was just a misunderstanding of an obsolete term, "hopping"). In Arkansas alone, you can visit Smackover, Pair o’ Geese, and Buggywitch, the Anglicized replacements for "Chemin Couvert", "Pirogue", and "Bourgawich." So it is that a dashing young pilot bailing out over the English Channel in a war movie will call in a "Mayday" even in the middle of winter; the term is an international radio convention settled upon by virtue of its similarity to the French for "Help me!", "M’aidez!" It has absolutely nothing to do with the calendar year, pagan fertility rituals, labor demonstrations, or Soviet propaganda, except when headline writers get very, very lucky.
William Shakespeare (who, as we all know, was a prosperous grain merchant from Stratford-upon-Avon and certainly not a cultured and brilliant playwright) was born around April 23, 1564. No one knows exactly when, however; all we have to go on is the day that Shakespeare was christened, and any number of factors could have affected the date. It could have been three days before St. George’s Day, or it could have been more than a week before. We’ll never know; for most Elizabethans, there wasn’t any point in recording the day a baby was born. Instead, they recorded the day he or she was christened. Baptism is a sacrament, but christening serves a purpose beyond baptism, beyond allowing a child’s godparents to reject Satan and all his works: it’s traditionally where children were given their names. "Christian" names, as distinct from family names (given at birth) or confirmation names (given on reaching adulthood in the eyes of the church). There’s a reason Clark Kent became "Superman": names are powerful things. From Nigeria to Lithuania, naming ceremonies mark beginnings. The tradition is at least as old as Abraham (born "Abram" and taking his new name before becoming a father of nations) and is at least as current as Russell Jones becoming Ol’ Dirty Bastard (and then Osiris, then Big Baby Jesus, and now Dirt McGirt). Names are a way of marking the dawn of something new. And what happens when we build a ship? We christen it: have a ceremony, give it a name, hit it with a bottle of champagne. That’s what I call a birthday party.
Charles Finney was a newspaperman in Tucson, Arizona, for thirty years. On the side, he wrote fiction, but either it wasn’t much of a concern or he worked slowly; after two efforts in the 1930s, he stopped for twenty years. In the late 1950s and early 1960s, he published a bare handful of short fiction in such magazines as Harper’s and the Paris Review. In 1968, he wrote a novella. And, assuming he was a good reporter, that was it for his career as a writer of fiction. His reputation rests entirely on the slim volume that was his first published work, an odd little tale about a circus. The circus is The Circus of Dr. Lao, and it brings some curious characters along to a dusty little town in the desert:
In New York, the musicians’ union is gearing up for a battle against a new technology, the Sinfonia (links via Girlhacker). The issue is whether the Sinfonia, a souped-up keyboard that has enabled the producers of a new Off-Broadway musical, The Joy of Sex, to run its orchestra pit with only three musicians, is an instrument or a "virtual orchestra machine". If it’s simply a new and powerful electronic instrument, the musicians’ union will be forced to stand down, but union representatives say that the Sinfonia is no ordinary synthesizer. Ever since Elisha Gray hooked keyboards to oscillating metal reeds and Thaddeus Cahill realized that electric signals could generate musical tones, the sawtooth sounds of electronic music have been improving. (The improvements didn’t come in time for Cahill; he and his family managed to build his seven-ton masterpiece, the Telharmonium, but the lack of electronic amplifiers was crippling; after a brief flurry of interest, the project began losing money hand over fist, and today not a scrap of the several models of Telharmonium — one weighing two hundred tons and occupying two floors — remains.) There were various oddities along the way, such as theremins and the bowed Choralcello, but generally electronic instruments stuck to keyboards and progressed in a rigorously linear fashion, with electronic music improving all the while. The Pianorad, developed by science fiction and radio pioneer Hugo Gernsback, gave way to machines with well-known names made by people like clockmaker Laurens Hammond and engineer Robert Moog. Even if the Russian models looked like they could withstand a direct artillery strike, they were still clearly a single musical instrument. Not so the Sinfonia.
Brood X is about to invade the nation’s capital. Brood X is not, alas, something dreamed up by Grant Morrison (or a cheap Alien knockoff by Chris Claremont). Brood X is the largest population of periodical cicadas to hit the eastern seaboard; the cicadas are entirely harmless and vulnerable to predators, so they’ve developed a unique strategy for ensuring that they survive to make the next generation of baby periodical cicadas. They spend the vast majority of their lives underground, and when they surface en masse they do so in such quantity, often in densities of over a million per acre, that predators simply get overwhelmed. There are twelve seventeen-year broods, each locked to a different year of the cycle, and Brood X, the tenth, is the largest, which means people will be sweeping vast quantities of discarded cicada exoskeletons off their stoops and dogs will be eating themselves sick. (Some people may eat themselves sick, too, although I doubt the comparison to soft-shelled crabs will convince the general public.) And they’ll be loud, too; Bob Dylan may have enjoyed their song when he experienced it two cycles ago in 1970, but he didn’t have to sleep with ten thousand cicadas outside his window. If the 1987 brood is anything to judge by, it will be maddening. But it’s hard to begrudge them their moment; they hide underground for years, drinking from plant roots, only to emerge upon a hidden signal when the time is right. If they were called Barbarossa bugs and were released when the nation was in danger, maybe they and their brethren would get a little respect. But then again, their name alone may do that; that is not dead which can eternal lie, and fear for your sanity when Brood X wakes.
Every now and again, a survey comes out designed to show how stupid Americans (and particularly American schoolchildren) are compared to the rest of the world. Americans don’t know where Mexico and Canada are! They can’t identify what language the Dutch speak! They think The Flintstones is solid anthropological fact! This tradition has now moved on to England, where a recent poll seems designed to make the British feel as bad as Americans. It found that Britons are confused about whether the Battle of Hastings or the Battle of Helm’s Deep was real, and so forth. Some of it seems calculated to confuse; the Battle of Endor, featuring those irritating (yet beloved among a certain segment of the population) Ewoks, sounds real. If the Witch of Endor could serve as the basis for
The world heard that today is the tenth anniversary of <a href=”http://www.rockrap.com/archive/arch124a.html
A hundred years ago, a craze swept through Europe and America. In the offices of America, it was banned as a distraction that kept workers from their desks; in the cafés of Paris, it was handed from person to person; statesmen and politicians sat down with it in the halls of the Reichstag. It was a puzzle, the "Fifteen Puzzle", the masterwork of the American puzzle king, Sam Loyd. The Fifteen Puzzle was unquestionably Loyd’s most popular work; it sold tens of thousands of copies, spurred on by Loyd’s offer of $1000 to the first person who was able to provide the solution to the puzzle. This sort of offer isn’t terribly unusual; the lure of prizes drives armchair treasure hunts (as opposed to the more active kind, which seems to be mainly driven by prestige). Celebrity magician David Blaine commissioned an armchair treasure hunt from game and puzzle designer Cliff Johnson; Blaine’s puzzle was recently solved, bringing the winner a hundred-thousand-dollar prize. Unlike David Blaine, Sam Loyd didn’t have to worry about paying up; the Fifteen Puzzle has no solution.
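Why no solution? The challenge configuration swapped the 14 and 15 tiles, and every legal slide preserves a parity invariant that the swap violates. Here is a minimal sketch of the standard parity test for the 4×4 puzzle (illustrative code, not Loyd’s own argument):

```python
def inversions(tiles):
    """Count pairs of tiles that appear out of numerical order, ignoring the blank (0)."""
    seq = [t for t in tiles if t != 0]
    return sum(1 for i in range(len(seq))
                 for j in range(i + 1, len(seq)) if seq[i] > seq[j])

def solvable(tiles, width=4):
    """Parity test for an even-width square sliding puzzle, given as a flat
    list in reading order. Solvable iff the inversion count is even exactly
    when the blank sits on an odd row counting from the bottom."""
    row_from_bottom = width - tiles.index(0) // width  # 1-indexed from the bottom
    return (inversions(tiles) % 2 == 0) == (row_from_bottom % 2 == 1)

solved = list(range(1, 16)) + [0]           # 1..15 in order, blank last: solvable
loyds  = list(range(1, 14)) + [15, 14, 0]   # 14 and 15 swapped: not solvable
```

Every slide either leaves the inversion count alone (a horizontal move) or changes both the inversion parity and the blank’s row parity together (a vertical move), so the combined parity never changes; the single 15-before-14 inversion can never be undone.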
In 1911, Ivor Gurney won a scholarship to the Royal College of Music. In 1914, he volunteered to join the British army. In 1915, he was accepted. In 1917, he was gassed at Passchendaele. In 1922, he was certified insane and institutionalized, and he never got well. The story that "Batty Gurney" was eternally reliving the horrors of Passchendaele is apparently a myth; he continued to write poetry about the war, but Gurney was not suffering from "deferred shell shock" and he knew it. Gurney freely admitted that he had invented the symptoms of shell shock in order to get improved medical benefits. Although there is some debate about what his condition actually was, he was genuinely ill; one doesn’t spend a decade and a half in a mental institution on a lark. Still, he could have had it worse. The idea of Gurney, the lost (and last) war poet, reliving and responding to the events that shattered him is a romantic one, but twenty years of life during imagined wartime would have served only (as one of Gurney’s poems put it) God’s purpose of pain.
It started, of course, with a bet. Before the Edison Trust and movie stars, there was Occident. Occident was a prize horse owned by railroad baron, former California governor, and wealthy horse lover Leland Stanford, and Stanford wanted to settle an argument. When Stanford argued with some of his colleagues about whether a horse in full gallop took all four hooves off the ground, he didn’t mess around. Stories of a $25,000 bet are probably apocryphal, but Stanford was still willing to spend thousands of dollars — a positively enormous sum in the 1870s — to prove himself right. He turned to the well-known photographer Eadweard Muybridge, probably the most respected name in the still-new field then working west of the Mississippi. Muybridge was born Edward Muggeridge; he had worked, earlier in his career, under the name "Helios", taking landscape photographs and scenes of California life as well as doing survey work for the railroads. Several years later, the great experiment was complete. Muybridge’s photographs were taken using wet plates, which are terribly slow compared to modern film but were the fastest then available; the pictures of the horse were little more than faint blurs, but they proved the point. A galloping horse did, in fact, leave the ground entirely.
Atlantic City owns a small and curious space in the American imagination. Despite Louis Malle’s best efforts, it’s not the locale of gangster fantasies; the city has a storied and criminal past, including a 1929 conference of East Coast bosses that led to the carving up of regional territories among the mobs in 1931. But for gritty gangster stories, we look to Vegas, to New York, to Jersey and Miami Beach. Atlantic City is the home of Miss America, but increasingly no one cares. Donald Trump reshaped the skyline and destroyed his company. And, alas, nobody has heard of Lucy the Elephant. But as America turns to other places for lost weekends and reminiscing about the good old days of organized crime, Atlantic City will always have one thing that no other city does: Rich Uncle Pennybags.
"Thousands are sailing across the western ocean," sang the Pogues, and they were right. The Irish diaspora dispersed the starving Irish throughout the world: in the Pacific, in South America, to Canada, and, of course, to America. Other poor sections of the British Isles, notably Scotland and Cornwall, had mass migrations in the nineteen century, but the Scottish and the Cornish didn’t plan foreign unrest and stoke fearful nativism. That was the role of the Irish; a conquered people whose way of life had been taken from them. The Irish were a cow-owning people: beef eaters, cattle thieves, and (according to 17th century visitor John Stevens), "the greatest lovers of milk I ever saw, which they eat and drink about twenty several sorts of ways and what is strangest, for the most part love it best when sourest." But the English had come and taken their land and, for the most part, taken their cattle; what they had given the Irish in return was the potato (and possibly not even that; although most credit Walter Raleigh for introducing the potato to Ireland, it may have arrived via trade with Catholic Spain; the English didn’t really know what to do with the newfangled American root). The potato became the centerpiece of Irish cuisine from sheer necessity:
There were giants in the earth in those days. That’s what the book of Genesis says about the post-Fall world of the sons of Adam. These were the nephilim; the origin of the word is somewhat unclear, and although many Bible scholars advance the perfectly sensible interpretation that the nephilim were great in the sense that they were the descendants of antediluvian kings and heroes, a whole mythology of the nephilim, complete with a family tree, sprang up. If the nephilim were giants and not "sons of Adam", where did they come from? Were they demons? Extraterrestrials? British goths? One thing was sure to true believers, though: there were giants in the earth in those days. And if there weren’t, George Hull would put them there.
Wet your finger and run it around the edge of a wine glass just right, and you can make a strange keening sound. Adjust the amount of wine in the glass, and you can change the pitch of the sound. That’s the principle behind an instrument developed by Benjamin Franklin, the glass harmonica; Franklin’s version eschewed the water, using a series of glass bowls of different sizes, but the principle is the same. The glass harmonica (alternately the "glass armonica") and its descendants — "the melodion, the eumelia, the clavicylindre, the transpornierharmonica, the sticcardo pastorate, the spirafina, the parnasse instrument, the glassharfe…, the uranion, the hydrodaktulopsychicharmonica" — enjoyed quite a vogue at the end of the eighteenth century, until the work of Franz Mesmer, who used the glass harmonica as background music in his fashionable mesmerism sessions, led people to associate the instrument with madness. But hundreds of works had been written for the glass harmonica. The composer Donizetti scored an opera’s mad scene for it. What could music purists do when faced with demands for a keyboard glass harmonica for an authentic Mozart piece? People usually arranged the piece for a different instrument, but that often wouldn’t do. The dance of the sugarplum fairies wouldn’t be the same without the tinkling celesta. Early music enthusiasts giving historically informed performances can do all the research in the world, but if results from more modern artists are any guide, guesswork is always going to be a factor. The new gospel compilation Goodbye, Babylon asserts that Texas gospel singer Washington Phillips is heard accompanying himself on the Dolceola; painstaking research by Dolceola enthusiasts (it’s a cousin of the Autoharp, apparently, and only obscure instrument fans have ever heard of it) shows that he was playing no such thing, although blues giant Huddie "Leadbelly" Ledbetter apparently used one in a few sessions.
And that’s for a commercially produced instrument used after the dawn of recorded music. Trying to reproduce the rattling sounds of Tom Waits’ Bone Machine or a Skeleton Key album would be nigh impossible, as the percussion is largely generated by a motley collection of random junk. There’s a thriving theremin community that’s been keeping that electronic instrument’s sound (the backbone of "Good Vibrations" and a thousand B-movie soundtracks) alive since Leon Theremin was kidnapped and forced to build listening devices for Stalin, but the electronic music pioneers <a href=”http://www.terrascope.org/silver.html
Who can tell when they’re living through history? On April 12, 1994, people logged on to their Internet providers and checked Usenet, which, in the days before widespread adoption of the Web and bulletin board websites, was the preeminent means of mass communication on the Internet. In addition to the usual — the cranks on sci.physics, the discussion of future Squiddy winners on rec.arts.comics, the bizarre talk on talk.bizarre — people had an exciting message awaiting them: "THE DEADLINE HAS BEEN ANNOUNCED". An Arizona law firm, Canter & Siegel, had targeted an ad at anyone on the Internet trying to enter the American green card lottery. Whether this service was useful was beside the point; Canter & Siegel had figured out how to abuse the system to contact hundreds of thousands of potential customers. As one somewhat bewildered user reported, "Everywhere you went, it was Green Card, Green Card, Green Card." The green card lawyers had, for all intents and purposes, invented spam.
In early 1964, the Beatles came to America, and they hit the ground running. Three nights of appearances on Ed Sullivan, following a dazzlingly successful American publicity campaign (that led the Herald Tribune to call them "75% publicity, 20% haircut"), stirred up a genuine frenzy of teenage excitement. Seventy-three million people watched the first night of Sullivan, then a record; when George Harrison celebrated his twenty-first birthday at the end of February, he got 30,000 birthday cards from fans. Liverpool had been a center for teenage pop music since the skiffle craze of the late ‘50s, and the Merseybeat sound — three guitarists harmonizing and a drummer in back — had made it to America. The larger "British invasion" of the Kinks, the Who, the Troggs, and the Rolling Stones would soon follow. The names said it all: this was not American music.
They sought him here, they sought him there, those Frenchmen sought him everywhere, but they never quite figured out that the damned annoying Scarlet Pimpernel was neither in Heaven nor in Hell but right under their noses. Baroness Orczy wrote other books, but none as popular as The Scarlet Pimpernel; perhaps it was the scintillating prose or perhaps everyone really likes rooting for a swashbuckling Englishman who fights for the French aristocracy, but I think the answer is simpler:
Today 12-year-old Brian Ash of Zeeland, Michigan, is celebrating his third birthday. Brian Ash is a person of a certain distinction; leap day babies are few and far between, as a list of celebrities born on leap day demonstrates. Dennis Farina is a fine character actor, and Patricia McKillip’s The Forgotten Beasts of Eld a fine book, but those are pretty slim pickings. All other things being equal, children have roughly a 1 in 1461 chance of being born on Leap Day; I was always faintly disappointed that my father missed by a day (leaving aside that he wasn’t born during a leap year). February has been stretched every four years since Roman times, other pieces of calendar reform notwithstanding, but throughout history very little seems to have happened on Leap Day. The <a href=”http://scienceworld.wolfram.com/astronomy/FrenchRevolutionaryCalendar.html
Juan Carnaval is about to die. The kings and queens of the San Pedro Carnival have written his will; tomorrow San Pedro will hold a wake, and his brides will weep behind their fishnet veils before removing their widow’s weeds and revealing themselves to be men. His will will be read, and he will be burnt, and Carnival will be over until next year. Four hundred years ago, a Flemish painter named Pieter Bruegel, in one of his most Boschian works, depicted a different sort of celebration of the same holiday and called it The Fight Between Carnival and Lent (larger view). Over a hundred peasants gather around two figures: the hatchet-faced Lent, drawn by a nun and priest and wielding a scourge, and Carnival, fat as a lord and brandishing a roast pig as his weapon. The duel is about to commence.
Upstate New York in the nineteenth century was an unsettled place. The opening of the Erie Canal had brought an economic boom, but also unsettling changes, as the Canal unlocked the west and brought about the first major population migration in the United States, towards the wilds of the old Northwest Territory. There were other influences keeping things roiling as well; the Erie area was known as "the burned-over district" for the number of revivals that took place there during the Second Great Awakening. John Humphrey Noyes founded Oneida, a Christian utopian commune in which the men and women were all considered married to one another. The hints of sexual license around Matthias the Prophet’s Mount Zion community had become a major scandal. And just outside Palmyra, New York, Joseph Smith received the first of the angelic visitations that would eventually lead him to found the Mormon church. Against this backdrop, the religious proclamations of a farmer named William Miller might not have seemed all that noteworthy, were it not for one thing: William Miller had been told by God that the world was in its end days, and he was willing to say exactly when we would all be summoned home.
In 1477, a Norfolk woman named Margery Brews wrote her fiancé, John Paston, a letter. As the BBC explains it:
I put on about ten pounds my freshman year of college, and a lot of that was thanks to my regular lunch at Louis’ Restaurant (and, in particular, their eggplant parmesan sub with a small fries), a greasy spoon favorite for generations of students. Louis’ was then still run by the gnomic, perpetually irritated Louis Gianfrancesco, who started it with his brother in the late ‘40s. That was the heyday of the American diner, before a Multimixer salesman named Ray Kroc came calling at a hamburger joint in San Bernardino, California, to find out why they were buying so many milkshake machines. The McDonald brothers, Dick and Maurice, told him it was because they sold so many milkshakes and simply wore out the Multimixers. Kroc listened and got an idea, one which would eventually lead to a grease-powered empire of vast global reach.
In the dawn of Internet time, a spelunker and computer programmer named William Crowther was looking for something to do to take his mind off his pending divorce. An avid Dungeons and Dragons fan, Crowther decided to put together a computer game he thought his daughters might enjoy, a fantastic crawl through an underground world loosely based on Mammoth Cave in Kentucky. He wrote a program that accepted commands in broken English — "GO NORTH", "GET LANTERN" — and called it Adventure. Adventure, also known as Colossal Cave, was a success with his children; it was also a success with the (then quite small) Internet community. Passed from user to user and computer to computer, it eventually reached Stanford’s not-yet-storied SAIL system. In 1976, a SAIL user and game fanatic named Don Woods fell in love with the game, got in touch with Crowther (through the simple and then tolerable expedient of mailing the "crowther" account on every machine on the Internet), and expanded and polished it. A year later, a handful of MIT Adventure fans wrote a substantially upgraded homage called Zork; when they founded a company named Infocom a few years later and were looking for a product, they reached back to Zork, and the commercial text adventure game was born.
Once upon a time, there was a plucky boy reporter for WHIZ radio named Billy Batson. He was led by a mysterious dark figure down the abandoned subway tunnel at Slumm Street and Ninth, past grotesques of the Seven Deadly Enemies of Man, to a cave where the ancient Egyptian wizard Shazam awaited him. Sitting on his throne, beneath a stone block hung by a thread, the wizard told Billy to speak his name and be transformed into Captain Marvel, the world’s mightiest mortal! From 1940 until 1953, hundreds of thousands of children enjoyed the wonderfully ridiculous adventures of the "Big Red Cheese", Captain Marvel, and a pulp writer of no particular distinction named Otto Binder hit his stride. As Binder took over the writing, things got weirder and weirder. Captain Marvel already had an arch-nemesis, the cackling, bald-pated mad scientist Dr. Sivana (whose appearance was based on a pharmacist in the neighborhood of chief artist C.C. Beck), a romantic interest (Sivana’s daughter, Beautia, a Betty Grable lookalike) and a similarly superpowered sister, Mary Marvel (whom Beck apparently decided to transform into Judy Garland). Binder took things further. Tawky Tawny, the celebrated talking tiger, appeared as a major character; the sports-coated feline was a good friend of Billy’s. Black Adam, Captain Marvel’s evil counterpart, was the first person given the gift of Shazam, but went evil, as if his name didn’t suggest it. (Billy’s invocation of Shazam’s name gave him the wisdom of Solomon, strength of Hercules, stamina of Atlas, power of Zeus, courage of Achilles, and speed of Mercury; Black Adam relied on distinctly dodgier notables like Shu and Zehuti.) An epic, two-year storyline (in 25 parts!) pitted Captain Marvel against the sinister Mister Mind, who was in the end revealed to be a superintelligent talking worm. (I can only assume that Stanley Kiesel remembered this when he wrote one of my favorite childhood books.) Beck modelled Captain Marvel himself on My Three Sons star Fred MacMurray.
Although Beck came to have mixed feelings about his most recognized character and denigrated his contributions to the character, the clean, cartoonish feel he mastered made the book instantly recognizable. Captain Marvel’s wholesome yet bizarre world was immensely popular, and that led to the downfall that Sivana could never cause.
In 1516, Sir Thomas More introduced a new land to his English readership, an obscure corner of the New World: Utopia. More, then an up-and-coming lawyer and politician, put forth Utopia as the tale of his encounter with Raphael Hythloday, a widely-traveled sailor, while on a diplomatic mission. Hythloday recounts his encounter with More’s real-life mentor, Cardinal John Morton, lays out his view of Utopia as an ideal state, and attacks England’s excesses:
Within the jungle of Central America, chicleros still practice their art: tapping the chicle tree every other day, then boiling the sticky white sap to make chewing gum for Yuppies. When an exiled Santa Anna introduced chicle to the United States, hoping to sell it for use as artificial rubber and thus fund an army to lead him triumphantly back to Mexico City, one of his contacts, Thomas Adams, saw him chewing the stuff and decided to see if it would go over with his neighbors. The chewing gum available at the time was made of flavored paraffin; unsurprisingly, people preferred Adams’ version. He soon added sassafras and licorice flavorings, creating Black Jack, and Adams, later the American Chicle Company, was on its way. Competitors sprang up, many of which (including the maker of Chiclets) were purchased by Adams; William Wrigley founded his eponymous company in 1891; Walter Diemer of the Fleer Chewing Gum company invented Dubble Bubble, the first bubble gum, in 1928. Other innovations came later; Topps’ Bazooka bubble gum, with their desperately unfunny comics, didn’t come out until after World War II, and Jim Bouton, the Yankees pitcher and Ball Four author, didn’t get the idea for Big League Chew until 1977. Today, a dozen or more varieties of gum, many with classic "bubblegum" flavor, are available at any supermarket or gas station, but in the waning years of the nineteenth century and the first few of the twentieth, gum was an advertising gimmick, a huge moneymaker, a cultural phenomenon. People patented holders for already-been-chewed gum. Dr. Edwin Beeman rode his gum fortune to the Cleveland city council; William White, a gum salesman whose Yucatan brand was the first peppermint gum, rode his to a term in Congress. But it wasn’t until 1924 that "Does the Spearmint Lose Its Flavor on the Bedpost Overnight?" was written and became a huge hit. Other novelty foods — breakfast cereal comes to mind — have leapt from being fads to being staples.
(Veggie burgers may be walking that path today.) Bananas were a huge fad food, partially because they were available in the winter, and today they are the most popular fruit in America. Bananas even inspired a competing music-hall hit. But only bubble gum returned to the charts, surfacing in the top ten as a novelty "skiffle" hit by Lonnie Donegan in 1961, bubblegum pop indeed.
The camera obscura is based on a simple principle. If you go into a dark room (thus the name: the Latin camera, "room", and obscura, "dark") and punch a small hole in the wall, the image outside will be projected inside. Francis Bacon understood the apparatus; da Vinci described it in his notebooks; Frisius and Kepler used camera obscura projections to help perform their observations of the sun. The camera obscura became a staple of Victorian seaside resorts; one built in the 1940s at Cliff House in San Francisco provides a pleasant view of the elephant seal rocks. Recently, one controversial book suggested that the Dutch master Johannes Vermeer may have used the camera obscura in his art. Certain hints in the perspective Vermeer used, the physical evidence suggesting that Vermeer was able to render objects’ proportions very accurately without measuring them, the apparent finding that many of Vermeer’s paintings were made in the same room, and the tantalizing question of whether one object in a painting represents Vermeer’s darkened booth all piqued architect Philip Steadman‘s interest. The question isn’t settled, and it may never be. For one thing, why wasn’t Vermeer’s lens, which would have been a rare and quite valuable item, recorded in his effects when he died? Vermeer, though a master painter and member of Delft’s painters’ guild, was primarily an art dealer, and could quite likely have afforded one, but would it have vanished out of history? But Steadman’s thesis is nothing compared to that of painter and photographer David Hockney.
When technology pundit Clay Shirky wanted to attack the idea of the Semantic Web, he didn’t do it the way Cory Doctorow did. Doctorow pointed out that "metadata", information about information, is created by fallible people, people who make mistakes, misjudge what’s important about their work, and have incentives to lie. Shirky went after the underlying technology instead; the Semantic Web, he said, was nothing but a syllogism processing engine. The designer Paul Ford, whose Harper’s Magazine design is one of the first commercial endeavors to be built around Semantic Web principles, strongly disagreed. Ford didn’t mention, however, that Shirky’s examples were very bad syllogisms. Were Shirky playing Lewis Carroll’s "Game of Logic" (its rules proposed in an 1887 pamphlet), he’d be in trouble. It requires proving or disproving statements such as:
If there’s a real analog to the History of the Land Called Uqbar, it’s the Codex Seraphinianus, that remarkable traveller’s sketchbook of an alien world. However, it was created by the Italian designer Luigi Serafini in 1981; for a hallucinatory encyclopedia of more uncertain provenance, cast an eye on the Voynich manuscript. It is full of strange drawings: of bathing women, zodiac figures, astrological diagrams, and curiously half-recognizable plants. (The fact that the human figures are naked makes something as simple as dating the document difficult, as no costumes are depicted.) The alphabet it is written in only vaguely resembles anything ever known in Europe. Is it nonsense carefully designed to look halfway legible, a sort of sixteenth-century book from the sky? Investigators "lack decisive tests for distinguishing between nonsense babble, crafty cipher, and language," but the manuscript seems not to be simply random characters. Statistical analysis, the domain of computational linguists and information theorists, shows it to more closely resemble language, albeit a strangely curtailed one. The "words" of the Voynich manuscript are highly repetitive, and its vocabulary is much smaller than that of a typical English or Latin text of the same size. Is it perhaps an encoded religious text of the Cathars, the heretical Catholic sect, written in a synthetic alphabet? Manipulated Latin? The shorthand notes of the great philosopher of science Roger Bacon, recordings of his experiments with telescopes? Or perhaps something even less likely?
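The word-level statistics behind that claim are easy to reproduce on any transcription: count tokens, count distinct "words", and look at how often the commonest ones repeat. A minimal sketch (the sample string below is a made-up stand-in shaped like the EVA-style transliterations Voynich researchers use, not a real passage):

```python
from collections import Counter

def word_stats(text):
    """Basic vocabulary statistics for a whitespace-separated text."""
    words = text.lower().split()
    counts = Counter(words)
    return {
        "tokens": len(words),                          # total word count
        "types": len(counts),                          # distinct words (vocabulary size)
        "type_token_ratio": len(counts) / len(words),  # low ratio = repetitive text
        "most_common": counts.most_common(3),
    }

sample = "daiin daiin chedy qokeedy chedy daiin shedy qokeedy daiin chedy"
stats = word_stats(sample)  # 10 tokens, only 4 types: highly repetitive
```

Run over a full transcription and a comparison corpus of English or Latin, the type-token ratio is the kind of figure that lets analysts say the manuscript’s vocabulary is "strangely curtailed" rather than random.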
Once upon a time in the West, spring marked the New Year. The original Roman calendar ended the year with Februarius, a month of repentance and restoration before the Ides of March marked the New Year. But the Republic’s civil calendar began in January, when newly elected consuls assumed office, so in 153 BC, officials simply rewrote things so that popular life matched the bureaucracy. There were still problems, however. The lunar cycle suggested months of roughly twenty-nine or thirty days, as in the Egyptian calendar, probably the world’s first. Twelve lunar months, however, fall noticeably short of the solar year, and lunar calendars slowly crept out of true with the actual cycle of the seasons. The Romans dealt with it by occasionally intercalating an extra month, but the process was complicated. The pontiffs who set the calendar were persuadable; a well-rewarded nudge here or there could affect the results of an election or keep an incumbent in office a while longer. When Julius Caesar seized control, he decided that enough was enough and instituted the Julian calendar. Caesar added 90 days to the year to get the seasons back in true; his system then relied on a set sequence of 31- and 30-day months, with a short February expanded by a day every four years to adjust for the fact that the year is a little more than 365 days long. Caesar did not renumber the years in his honor; the month of July was presumably enough reward.
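The arithmetic behind the leap day is simple enough to sketch (my numbers, not the post’s): one extra day every four years makes the Julian year average 365.25 days, which is slightly longer than the actual tropical year, so even Caesar’s fix drifts — about three days every four centuries, the error the Gregorian reform would eventually correct.

```python
# Back-of-the-envelope check on the Julian calendar's residual drift.
# All figures are standard astronomical values, used here illustratively.

JULIAN_YEAR = 365 + 1 / 4     # 365.25 days: one leap day every four years
TROPICAL_YEAR = 365.2422      # mean interval between spring equinoxes

drift_per_year = JULIAN_YEAR - TROPICAL_YEAR
print(round(drift_per_year * 400, 1))  # days of drift per 400 years → 3.1
```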
It’s hard to make a Christmas movie. No film adaptation of A Christmas Carol has quite lived up to Lionel Barrymore’s magnificently hammy renditions for Orson Welles’ Mercury Theater. The Bishop’s Wife has a stellar cast — Loretta Young as a neglected woman, Cary Grant as an angel just this side of seductive — but it’s been largely forgotten. The film for the season should be Miracle on 34th Street, the movie that cemented a New York Christmas in the popular imagination. (If Barbara Stanwyck’s boss in the wonderful comedy Christmas in Connecticut had appreciated Manhattan in the snow, there wouldn’t have been any movie.) It features a prepubescent Natalie Wood, far less annoying than she should be, as she learns the true meaning of Christmas, and veteran character actor Edmund Gwenn just nails Saint Nick, doing his part to cement the image. But really, it’s 90 minutes spent demonstrating that yes, Virginia, there is a Santa Claus (and also that New Yorkers are magnificently tolerant of crazy people who are clean and don’t bother anyone). The movie that seems to have run away with the season is a rather strange choice, all things considered. People love to hate It’s a Wonderful Life.
It is a common complaint — although not a universal one — that Christmas is too commercial. For anyone who’s waded through crowds in the weekend before Christmas, looking for the perfect gift, the idea that the modern image of Santa Claus was invented by Coca-Cola has a certain satisfying ring to it. After all, Rudolph the Red-Nosed Reindeer was created as a holiday promotion for Montgomery Ward; in a Christmas miracle, the department store allowed the creator, Robert L. May, to have the copyright back several years later, and it made him and his brother-in-law, songwriter Johnny Marks, quite wealthy. But Santa-with-Coke advertising, memorable as it was, had very little to do with the modern conception of Santa Claus. St. Nicholas (the patron saint of thieves, pawnbrokers, and children) was a traditional German Christmas figure, of course; he gave good boys and girls presents, and was accompanied by a figure who punished bad children. There was much regional variation; in northern, Protestant Germany, the Santa figure was "Kriss Kringle" (that is, "Christ Child"), and whether the naughty were given what-for by Black Peter or Krampus (link via Drew McDermott) seems to have depended on where in Europe you lived. Three men were responsible for the modern figure. Thomas Nast, the political cartoonist who created the donkey and elephant symbols, drew him as a gift-giving elven figure and placed his workshop at the North Pole. Louis Prang, a Boston engraver and inventor of the Christmas card, nailed down his costume. And Clement Moore, the Bible professor and son of the former president of Columbia University, invented the eight tiny reindeer and all the rest when he wrote "A Visit from St. Nicholas" on Christmas Eve, 1822. The problem is that Moore apparently did no such thing; the poem was published anonymously in 1823, and like many things anonymous, its authorship was in some doubt.
Moore included it in a book of his collected verse in the 1840s, and that seemed to settle the matter for everyone except the descendants of Henry Livingston, a New York farmer, politician, and amateur poet. A few years later, two of them, Stephen Livingston Thomas and Mary Van Deusen, managed to put samples of Livingston’s later work in front of Vassar’s Don Foster, America’s most famous textual investigator. Foster concluded that there was no doubt that the poem was Livingston’s, and comparing the two authors’ poems on similar themes (and a Christmas poem that’s unquestionably Moore’s) certainly suggests that he was right. Livingston’s name is slowly supplanting Moore’s as the true author of the piece. Amazingly enough, 150 years later, an Air Force Lieutenant Colonel named Bruce Lovely apparently took credit for a soldier’s retelling of "’Twas the Night Before Christmas". The poem, "Merry Christmas, My Friend", was by a Marine corporal, James Schmidt, who was probably less offended by the plagiarism (his authorship was easy to establish, as he’d published it in Leatherneck magazine two years before Lovely claimed to have written it) than by the fact that Lovely had stripped out Schmidt’s references to the Marine Corps:
Leopold and Loeb, the Chicago thrill-killers, killed because they thought of themselves as Nietzschean supermen; Clarence Darrow, whose bravura courtroom performance saved the pair from execution, took the case specifically because of the publicity it would attract — he wanted a national soapbox from which to air his opinion about the death penalty. But for all of the twentieth century’s famous trials — O.J. Simpson’s murder case launched a cable network — only one was the "crime of the century": the Lindbergh kidnapping. When a suspect, a German carpenter, petty criminal, and illegal alien named Bruno Hauptmann, was caught, sixty thousand people crowded into Flemington, New Jersey (current population: 4,200), to be closer to the case. And newspapers jumped on the story with both feet:
In Kill Devil Hills, North Carolina, on a clear December morning in 1903, the Wright Brothers flew. The world collectively yawned. The Wrights’ secrecy — they were engaged in negotiations with the American military and fiercely protective of their technology, which had not yet received its patents — and their complete lack of interest in planning a public test didn’t help. Just a week before, physicist and Smithsonian Institution director Samuel Langley’s catapult-launched Aerodrome had, to widespread publicity, crashed on takeoff; the Wrights didn’t ask anyone to witness their flight in 1903, and even two years later never formally invited reporters to attend their test flights. Initially, many thought they were hoaxers; the airship mania of the late 1890s was then a recent phenomenon. By the time the Wrights were willing to go public, the world had largely caught up with them. The two, still tinkering with their manufacturing processes, rejected Ernest Archdeacon’s direct invitation to participate in a 1907 air competition (with a fifty-thousand-franc prize). In 1908, they finally silenced critics with the longest sustained flight to date (and the greatest control over piloting; the Wrights were the first to understand roll, pitch, and yaw), then shrugged off the potential publicity of being the first to fly across the English Channel. The Wrights were not interested in showing off. Glenn Curtiss was an innovator — his planes’ ailerons may have been adopted because they avoided the Wrights’ patents, but they were an improvement over the Wrights’ system that is still used today — but, more importantly, he understood the advantages of being in the public eye. The Wrights had their "Flyer"; Curtiss had his "June Bug". His planes cost a fifth what the Wrights’ did, and Curtiss used them to win speed trials throughout the country. By 1920, the Wrights were gone from the airplane business, although their name lived on in that of the Curtiss-Wright Corporation.
Their claim to fame is not that they were the best, but that they were first. If it weren’t for them, one of the other students of Octave Chanute might be celebrated today, France’s Voisin brothers or the Franco-Brazilian Alberto Santos-Dumont (who killed himself when he saw that airplanes would be used as a weapon). But being first is no small thing, and the Wrights made important discoveries. More importantly, they flew. Even with all the advances since, that’s nothing to take for granted.
The lost dauphin — or part of him, at least — has finally been laid to rest and so, perhaps, has the myth of the escaped Louis XVII. The boy prince was jailed during the Revolution by "the Shoemaker", Antoine Simon, a Communard who was eventually killed in the 9 Thermidor coup. Simon and his wife brutalized the young Louis, and his death from tuberculosis in 1795 must have been something of a surcease of pain. However, in the years after his death, more than two dozen claimants to the prince’s title appeared. Most of them were obvious frauds (although American missionary Eleazer Williams influenced Huckleberry Finn and inspired an opera), but one, Karl Wilhelm Naundorff, convinced several people who knew the real Prince Louis and managed to get the Netherlands to acknowledge him as the rightful heir to the throne of France. Royalty who die under mysterious circumstances have a knack for turning up again. Henry VII had to deal with two pretenders: Perkin Warbeck, who claimed to be Richard, Duke of York, and Lambert Simnel, who claimed to be Edward, Earl of Warwick. The two "false Dmitris" each briefly managed to grab the throne in early 17th century Russia, the time of Boris Godunov. Anna Anderson convinced some German relatives of the House of Romanov that she was the miraculously survived Grand Duchess Anastasia Nikolaevna, daughter of the last Czar of imperial Russia; Anderson failed DNA tests, but did eventually marry an amateur historian.
Last week, a 26-year-old engineering grad student at the University of Michigan got back from a meeting, sat down at his computer, and discovered that he had earned his own footnote in the history of mathematics. Michael Shafer did a victory dance, then called his wife to let her know that he had found the largest known prime number. Prime numbers are those, like 7 and 101, that have no divisors other than 1 and themselves. Although there are an infinite number of primes, it’s impossible to predict precisely where they will show up. The prime number theorem gives a rough estimate of how many primes exist below a given number — the degree of error in this estimate is intimately tied to the Riemann Hypothesis, the most important unsolved problem in number theory (and one worth a million bucks to the mathematician who solves it) — but there’s generally no easy way to tell if a given number is prime. The type of prime that Shafer discovered, the Mersenne primes, is an exception. Mersenne primes are primes of the form 2^p − 1, where p is a prime (although not all primes will form Mersenne primes in this way), and these numbers have special properties that make them easier to find. The five largest known primes are all Mersenne primes, and Shafer’s discovery is a whopper: over six million digits long.
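Those "special properties" come down to the Lucas-Lehmer test, a squaring recurrence that checks whether 2^p − 1 is prime far faster than general-purpose methods can. Here’s a minimal sketch of the test (mine, not the heavily optimized code the distributed search actually runs):

```python
# Lucas-Lehmer test: 2**p - 1 (p an odd prime) is prime iff the
# sequence s(0) = 4, s(k) = s(k-1)**2 - 2 satisfies
# s(p-2) ≡ 0 (mod 2**p - 1).

def is_mersenne_prime(p: int) -> bool:
    if p == 2:
        return True  # 2**2 - 1 = 3 is prime; the recurrence needs p > 2
    m = (1 << p) - 1  # the Mersenne number 2**p - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# Which small prime exponents yield Mersenne primes?
print([p for p in (2, 3, 5, 7, 11, 13, 17, 19, 23) if is_mersenne_prime(p)])
# → [2, 3, 5, 7, 13, 17, 19]  (11 and 23 give composites)
```

Note that not every prime exponent works: 2^11 − 1 = 2047 = 23 × 89. The real searches add clever modular arithmetic so that squaring six-million-digit numbers is feasible, but the structure of the test is just this loop.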
In 1908, Henry Ford started selling the Model T, a workingman’s car that cost $825. In 1958, the Ford Edsel — "Impressive, yes! Expensive, no!", as the disastrous marketing campaign put it — cost less than $4,000. Today, a snazzy Ford Thunderbird would set you back almost ten times as much; a nice, sensible Focus would still cost you thirteen thousand dollars. Reading a nice nineteenth-century comedy of economics can be slightly perplexing — how rich was Mr. Darcy, anyhow? (Economist Brad DeLong came up with two different answers.) Science fiction writers, accustomed to setting their stories in the future, have a related problem. In 1984, when William Gibson wrote Neuromancer, he gave his protagonist (on the run with few allies and less cash) one asset; Case just might be able to get a ticket out of town if he can find a buyer for his "three megabytes of hot RAM in the Hitachi." Today’s price for 32 megabytes of perfectly legitimate RAM is somewhere around $5. Dated technology is bad; dated prices are worse. Few things are as jarring in a story where passenger ships fly to the moon as a cup of coffee that costs a dime. Some writers create new currencies to avoid this problem. Who’s to say how much a credit, work unit, New Dollar, or piece of gold-pressed latinum is worth? Robert Heinlein occasionally took this route, but more often, he kept his terms vague (a tourist’s visit to the interstellar gates costs a few coins in Tunnel in the Sky; Johnnie simply spends all his money while on leave at Sanctuary in Starship Troopers). When he was specific, it can be hard to judge his meaning — the half-million-dollar settlement in "The Man Who Sold the Moon" would be a lot for me, but for a billionaire of today, let alone the future, wouldn’t it just be the price of doing business? Neal Stephenson’s Snow Crash had a nice solution to the problem; his characters tossed around trillion dollar bills like they were twenties, acknowledging a universal truth: inflation always wins.
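The compounding behind that universal truth is easy to sketch (the 3% rate and the 1950 dime of coffee are my illustrative assumptions, not figures from any of the books mentioned): a fixed price in fiction ages exponentially badly.

```python
# Why dated prices sting: inflation compounds. At an assumed steady
# 3% a year, a ten-cent cup of coffee bought in 1950 costs nearly
# two dollars a century later.

def future_price(price: float, annual_rate: float, years: int) -> float:
    """Price after compounding inflation for the given number of years."""
    return price * (1 + annual_rate) ** years

print(round(future_price(0.10, 0.03, 100), 2))  # → 1.92
```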
December is here. The first real cold snap has arrived in Washington, and the first snow is just around the corner. New York has the Rockettes and skating at Rockefeller Center, but the District’s major holiday tradition is the Pageant of Peace. Every President since Eisenhower has used the Pageant of Peace as an opportunity to light the National Christmas Tree. There’s no room for a forty-foot spruce in my apartment, so I might just get a snowdome. Snowdomes are older than the National Christmas Tree, dating to the late nineteenth century. They’re wonderful pieces of chintzy Americana; I can see the appeal to collectors (1, 2). When I get sick of snow, I could switch to a more Western motif. Prisoners escape from Alcatraz; hippies do their thing in San Francisco; hula dancers enjoy the Hawaiian beaches; people die in Texas. If the London Riot Reenactment Society sets up an American branch in Pennsylvania, they can enjoy a Whiskey Rebellion snowdome; if, on the other hand, a dimensional vortex is breached and angry slaad or shirokinukatsukami storm the mid-Atlantic region, perhaps a Frank Frazetta snowglobe will appease them. L.A. may suffer from poor air quality, but given how badly D.C. drivers handle the snow, I might prefer the smog — I might even prefer a swarm of locusts to the inevitable SUV wipeouts and eight-car pileups that winter brings. And when winter has receded, I can spirit Marion Davies away on my yacht, lie back in bed, and let a snowglobe fall from my hand. Nothing says "spring" like a rosebud.
Forty years ago this month, President John F. Kennedy was shot dead on the streets of Dallas. Forty years ago this December, the President’s Commission on the Assassination of President Kennedy — the Warren Commission — began its investigation of the assassination. Forty years ago next September, it made its report. The report ran to 26 volumes and over 5000 footnotes; the National Archives stored 363 cubic feet of the commission’s materials. And still, people feel that the matter is not settled. The ongoing and pervasive belief that Lee Harvey Oswald was not a lone gunman — a "silly little Communist" — might just be a need for meaning for most of its adherents, an outcome of a culture of conspiracy that might tell us something about the conflicted essence of the national story of America. But it seems rather unkind to say that the true believers, those who brush conventional logic aside, are simply searching for meaning. They’re not trying to make Kennedy into a martyr for something larger than himself; they think that they’ve found the traces of something monstrous. Going through 363 cubic feet of documents, how could they not?
In 1964, an essay by Henry Littlefield called "The Wizard of Oz: Parable on Populism" appeared in American Quarterly. It initially caused little stir, but over the years it’s been repeated and has created a widespread belief that the book is a political allegory. Littlefield noted that Dorothy’s magic slippers were silver in the novel, and he began to map political meaning onto the various characters. The Tin Woodman is the Eastern industrial laborer; the Scarecrow, the Midwestern farmer; the companions travelling on the Yellow Brick Road are Coxey’s Army. The Cowardly Lion, Littlefield wrote, represented William Jennings Bryan, the "Great Commoner", and The Wizard of Oz is an allegory about the magical benefits of monetary reform.
Dwight Eisenhower created Veterans Day in 1954 to honor all of America’s veterans. My grandfather won a Purple Heart in the Army Air Corps in World War II; I owe a great debt to him and to the rest of the armed forces, past and present. It’s entirely right that there be a day dedicated to remembrance and thanks. There’s an intentional vagueness to Veterans Day, however; it’s intended to honor all military veterans, every soldier and sailor from World War II through today. November 11 was originally a celebration of Armistice Day, and the legislators who drafted 44 Stat. 1982 in 1926 could be specific:
Peg Entwistle was a failure. In New York, she had been a player (though not a successful one) with Lawrence Langner‘s Theater Guild. When the Depression came, she had trouble getting stage work, so she headed out to California; a play with Billie Burke closed quickly, and her first film role came in a disastrous Irene Dunne comedy that RKO held back to re-edit. Her contract, RKO informed her, would not be renewed. One drunken September evening, she went up to the Hollywood Hills and killed herself. Today she is one of Los Angeles’ most famous ghosts, not because of who she was but because of where she died; Peg Entwistle is the woman who threw herself off the Hollywood sign.
"When hinges creak in doorless chambers and strange and frightening sounds echo through the halls, whenever candlelights flicker where the air is deathly still, that is the time when ghosts are present, practicing their terror with ghoulish delight." So begins the soundtrack of the Haunted Mansion, Disney’s take on the "dark ride" genre of amusement park attraction. It’s not surprising that Disney’s version has attracted a number of fan sites (1, 2, 3); it’s a bit more surprising that rides like the Haunted Mansion at Rehoboth Beach, Delaware’s Funland or Elysburg, Pennsylvania’s
Sixty-five years ago this Halloween, CBS’s Mercury Theater scared the living daylights out of America. Long-standing belief in life on Mars, some talented performances, a bit of razzle-dazzle with a fake program of dance music, and a forty-minute gap between announcements that they were performing H.G. Wells’ The War of the Worlds combined to produce a classic example of mass hysteria. The Mercury Theater’s young founder, Orson Welles, must have loved it. There have been pranks in the American mass media for as long as there’s been an American mass media, from Edgar Allan Poe’s airships to George Plimpton’s Sidd Finch. But Welles had a particular love of untruth. Richard Hell (that Richard Hell) notes that Welles’ father owned a hotel filled with retired vaudevillians and suggests that Welles developed his fascination for performance there. Welles was a regular at the Magic Castle in Hollywood and, according to his friend Jim Steinmeyer, quite an accomplished amateur magician. His breakthrough role as the Shadow was a part particularly suited to someone interested in stage magic, puzzles, false identities, illusions. Welles’ lost documentary on Brazil was to have been called It’s All True, the sort of objection that only a man who would later go on to create something like F for Fake would need to raise. The somewhat boggling claim that Stalin targeted John Wayne for assassination makes a bit more sense with Welles as a major source. Welles said that he wanted descriptions of him to be flattering, not accurate; why shouldn’t his story of the cowboy and the dictator be the same? And so we arrive at Welles’ Batman. Why shouldn’t Lamont Cranston have taken a crack at playing Bruce Wayne? Basil Rathbone would have made a brilliant Joker! Welles could have wrung a magnificent performance out of Marlene Dietrich as Catwoman! (For that matter, Welles discovered Eartha Kitt.)
Alas, the whole thing seems to have been a product of Mark Millar‘s fecund imagination, but it was believable for a minute. Orson Welles would have been proud.
There’s a certain class of con games that involves selling something that is not what it seems. The fiddle game involves selling a cheap pawn shop violin as a Stradivarius. Pigeon drops can involve forged lottery tickets or cashier’s checks. The gold coin game is a slight update on selling gold bricks, a con that dates to the 1880s and was already tired in O. Henry’s day:
Silent film stars had reason to be nervous at the arrival of "movitones", the New Yorker‘s 1928 term for the "singies" or "talkies." Film sound was a remarkable technical and entrepreneurial achievement. It was also going to put a lot of people out of work. Japan’s benshi, the silent film interpreters who had honed their performances to an art, hung on until the mid-thirties; the most famous of the benshi, Tokugawa Musei, was narrating films as late as the early 1960s. Argentina’s tango orchestras relied on income from performing film soundtracks. But the most visible targets of the "cruel and relentless myrmidons of science," as Robert Benchley put it, were those actors "whose speaking voices could hardly be counted on to put across the sale of a pack of Fatimas in a night club."
The Chicago Cubs had a chance to reach the World Series for the first time since 1945. Last night, it ended in tears, of course, but just like the ‘86 Series, everyone’s going to remember Game 6, when the Cubs suffered a complete (though not unprecedented) meltdown, giving up a staggering eight runs in the eighth inning. The curse of the Bambino may be a myth, but Cubs fans across the nation are wondering if the curse of the billy goat is real. During the Cubs’ collapse, 26-year-old fan Steve Bartman prevented Cubs left fielder Moises Alou from catching a foul ball that drifted into the first row of the stands. Unlike the wretched Jeffrey Maier, Bartman wasn’t actually reaching over the wall, but his instinctive reaction to make the catch was unfortunate. In a statement released yesterday, he apologized, saying that he just didn’t realize Alou was going to try to make a play. Cubs fans were not immediately forgiving; Bartman was pelted with beer and escorted out by security for his own protection. He skipped work the next day as online fans started baying for his blood. Remixes of his picture are cropping up on the Internet. His parents received death threats. Last night, as the Cubs added another chapter to their history of failure, a police helicopter hovered over his home. Bartman’s life isn’t going to be very pleasant for the next few months, but he should be consoled by the fact that America isn’t a soccer-worshipping nation.
Friendster, the hugely popular "social networking"/hookup engine, has attracted attention from venture capitalists, lonely twenty-somethings, and at least one friend of mine looking to add a little extra oomph to a breakup by defriendsterizing her ex. It’s also attracted fakesters, people who have decided to enliven the site by adopting identities like Giant Squid, Bat Boy, or Patty Hearst. People like the fakesters; Giant Squid had 335 friends tied to his account. The Friendster corporate team doesn’t care for them nearly as much, and they’ve been angering some users by shutting down fake accounts left and right. Fakesters like "Ecstacy" and "Tootsie Roll" have entered the graveyardster; innocent non-fakes have been caught in the purge. One particularly vulnerable group, apparently, has been DJs; they’re sociable, well-liked, and oddly named, three symptoms of fakester behavior. It’s not easy being popular.
The Progressives of the late nineteenth and early twentieth centuries fought for the recall initiatives that brought Gray Davis down. They also fought against the "smoky room" nominating process that allowed party elites to control nominees to elected office. Over the last hundred years, campaigns have changed in ways the Progressives could never have imagined; one of the equalizers for unseating an incumbent is name recognition. Celebrity has power. Turning fame into a political career is hardly original to Arnold Schwarzenegger. California turned out not just a well-known actor turned Governor, but also Congressman Sonny Bono, dancer turned right-wing Senator George Murphy, and Congresswoman Helen Gahagan Douglas. Douglas, a singer and Broadway actress, wasn’t really a film star; her only Hollywood role was as the title character in the film adaptation of H. Rider Haggard’s She. But her husband, Melvyn Douglas, provided enough Tinseltown credibility for two. Later, two-term Congresswoman Douglas was smeared as a Communist sympathizer by Richard Nixon during their 1950 battle for a California Senate seat; she’ll go down in history as the first person to label him "Tricky Dick".
In 1950, a man named George Lerner invented a toy. His brainstorm was a set of comical pin-backed noses, mouths, and hairpieces. Given a set of the pushpins, Lerner reasoned, children could amuse themselves for hours with only a starchy vegetable into which to jam the things. Two years later, Lerner and a small Rhode Island toy company bought back the rights to the toy from the cereal company then manufacturing it as a premium. Later, the toy company realized that actual potatoes got kind of gross after you let kids play with them for a couple of days and switched to plastic; four decades later, Mr. Potatohead is still going strong. Mr. Potatohead was the first major success for the Hassenfeld Brothers (his grin is part of the company’s logo); today Hasbro is America’s second largest toy company, the corporate parent of Milton Bradley, TSR, and Tonka. The Pawtucket-based company is also one of the largest in Rhode Island, and when the state decided that it would commission some themed public art, Mr. Potatohead was a natural choice. Rhode Island was following in the path of cities large (Chicago’s cows) and small (Bloomington’s corn) and everywhere in-between in trying to attract some tourist attention with a series of identically shaped, differently decorated statues placed throughout the city. Statues of Rich Uncle Moneybags painted a festive gold or Serpentor wearing a rain slicker probably seemed like a bad idea, so the project’s planners chose the unthreatening and iconic Mr. Potatohead. Except for some complaints that the statues were ugly (and one accusation about a supposedly racist, dark-skinned Potatohead), the program went smoothly enough. The program ran throughout 2000; eventually, most of the six-foot-high statues were donated to a charity auction. Last week, one of the statues was stolen.
People do all kinds of terrible things to public statues; three beloved Muscovite ducks were stolen in 1999, and last month someone dynamited Copenhagen’s Little Mermaid (link via MetaFilter).
There’s a crisp autumnal breeze in the air, baseball’s regular season is over, and we may well awake sometime soon to see the skies opening up with a rain of blood: the possibility exists that we could see a Cubs vs. Red Sox World Series. The Cubs won their last World Series in 1908; the Red Sox in 1918 (beating, of course, the Chicago Cubs). The two teams have compiled a remarkable record of futility ever since. The Cubs were crippled by decades of managerial incompetence, penny-pinching owners, and fans who are often content to go see a game at baseball’s prettiest ballpark regardless of whether the team wins or loses, a fact that drives more competitive fans absolutely bonkers. Some Red Sox fans — my grandfather was one of them — aren’t willing to accept mundane explanations like owner apathy or the fact that the Red Sox were the last team in the majors to begin signing Negro League players, a full twelve years after Jackie Robinson broke the color barrier. They want a more interesting answer. And so: the curse of the Bambino.
Hurricane Isabel shut down DC, of course; any weather heavier than a gentle breeze can do that. But even in the Carolinas, where the storm was stronger and the usual band of brave and foolhardy residents refused to evacuate, it didn’t do much real damage. Isabel caused fewer than two dozen deaths (largely due to traffic accidents, although several people died from carbon monoxide poisoning while running generators during blackouts). Hurricanes are still devastating storms, but they’re not nearly as deadly as they once were. If another Long Island Express occurs (the practice of giving hurricanes people’s names didn’t start until the Fifties), the damage will be incredible. The 1938 hurricane devastated a rural Long Island largely made up of farming communities and was still the sixth costliest storm of the twentieth century. But, thanks to meteorologists and hydrologists, people will have warning; that wasn’t always the case.
This summer’s outage that hit New York City and much of the Great Lakes region was much more widespread, but the five and a half days I just spent without power in the wake of Hurricane Isabel was plenty of time in the dark, thank you. The earlier blackouts highlighted problems with the power grid, the amazingly complex system of interconnected utilities. England has a single grid operator (not that this helped stop London’s summer blackout) compared to the multiplicity of grid operators in America (each composed of many utilities). It’s no wonder that people are starting to talk about "smart grid" programs, although those won’t do any good against the sort of power outage that I suffered, the sort that results from a very large storm knocking over a great many trees. The hundreds of imported power crews are doing the best they can, but by day four of the outage, I was dredging up old thoughts about green houses and going off the grid (I used to read Backwoods Home the way I now read shelter magazines, and for much the same reason). Thank goodness for battery-operated lanterns. The blackouts in New York get all the attention, but the tedious cleanup required to get power restored after vast numbers of transmission lines have been downed is no fun, even if it doesn’t come with widespread rioting.
Those who fell ill in medieval Europe were in trouble. Medieval physicians didn’t know about germs, infections, or, in many cases, the rudiments of human physiology. Medieval surgeons were barbers (who were busy men; in 1450, Parliament felt obliged to restrict English barbers to bloodletting, toothdrawing, cauterization, and "the tonsorial operations"). Doctors couldn’t prevent or cure many of the day-to-day maladies that could befall people in the best of times; when an epidemic hit, especially one as devastating and world-shaking as the fourteenth-century Black Death that killed a third of Europe’s population, there was little left to do but pray. Several saints, most notably Saint Roch and Saint Sebastian, were thought to have particular power to intercede against plagues. When the going got really tough, they could turn to the Fourteen Holy Helpers, saints thought to be particularly effective at interceding on behalf of sufferers of different ills. The Holy Helper who dealt with plague was Saint Christopher.
Three years ago last August, I moved back to the East Coast from California. I went about the usual routine, getting phone service taken care of (difficult to do in the middle of the Verizon strike), transferring my auto insurance, getting a driver’s license. It took me a while to do that last one. Every day when I look in my wallet, I see a reminder of exactly how long it took: "Issued Date: 09-11-01". I spent the morning of September 11, 2001 in a server room, frantically trying to keep things running. I’d seen the first plume on CNN in the office lobby and assumed that a horrible accident had occurred; only on my emergence hours later did I find out what had happened. My boss told me that my face turned gray. I work in northern Virginia; with very little rejiggering, I can imagine myself, my coworkers, my loved ones in the E-ring of the Pentagon that morning.
When children think of a very large person, they think of Shaq. Shaquille O’Neal stands 7’ 1" tall. Thanks to a nagging toe injury that hampered his training, last season he weighed more than 360 pounds. If a reported weight of 382 pounds before the NBA playoffs was correct, Shaq was the heaviest athlete in professional sports last year (in American major sports, at least; Musashimaru could spot Shaq an Olympic gymnast). And life has been good to Shaq, making him not just wealthy and famous but a film star and recording artist, a man whose exuberant tastes and daffy immaturities are seized on as somehow meaningful. It’s a better fate than he might have had; sixty-odd years ago, Robert Wadlow, the world’s tallest man, died at the age of 22, slightly under nine feet tall and weighing over 400 pounds. Given that Shaq’s major crimes seem mostly aesthetic and attitudinal (the latter hardly surprising given that tall people are winners in life), one hopes he’d have met a happier fate.
Edgar Allan Poe fancied himself a cryptographer. Like many of his contemporaries, Poe played with the acrostic (1 2). He did a great deal to popularize the cryptogram; he published (and may have authored) the "W. B. Tyler" cryptograms; and in his famous story "The Gold Bug", Poe introduces a theme that has been hovering at the back of the public imagination ever since: solve the puzzles and find the buried treasure.
Forty years ago this week, a Southern preacher gave the defining speech of the civil rights era. And forty years ago this week, at the same event, a Southern preacher of a different sort handed out pamphlets telling blacks to get back to Africa, perhaps hoping to provoke a race riot. Martin Luther King’s "I Have a Dream" speech is one of the truly great speeches of the twentieth century, a masterpiece of oratory, steeped in the language of the church from which King sprang. A hundred years from now, students will still study his words in school, just as, a hundred-odd years later, I know the words of the Gettysburg Address. George Lincoln Rockwell, the Navy veteran, Brown alumnus (one of his college girlfriends would later become the only white columnist for Harlem’s Amsterdam News), and American Nazi Party founder who showed up at the march forty years ago this week, is a historical footnote. George Lincoln Rockwell gave speeches designed to stir up racial violence and carried placards that replaced King’s name with a racial slur. And yet, recalls Taylor Branch, when "George Lincoln Rockwell, the Nazi commander, showed up in Selma and accosted King during a march and said that he was going to prove that King’s philosophy was the work of the devil… King turns to Rockwell and says, ‘Well, Mr. Rockwell, I would really like to engage with you and talk about that, and we’re having a mass meeting tonight, and I will give you 15 minutes in my pulpit to discuss that, and now I would like to talk with you about it, either then or afterwards.’ And the silence after he said that — not just by Rockwell but by other people — was startling silence." King occasionally failed as a politician and as a man, but remained devoted to his philosophy of passive resistance, racial justice, and the betterment of the American soul. He remains by any measure one of the great Americans in the history of our nation.
Lincoln said at Gettysburg that "[t]he world will little note, nor long remember, what we say here, but it can never forget what they did here"; the miracle of the March on Washington for Jobs and Freedom is that we will remember both.
Summer is winding down into fall, and soon the harvest moon will be appearing on the horizon. The harvest moon and hunter’s moon are the two most stable names that appear on the list of moon names, those fanciful — "Worm moon"! "Sturgeon moon"! — descriptions of the calendar that farmers’ almanacs love to print. The "blue moon" is missing from this list, as it never defined a specific period on the calendar. A blue moon is the second full moon in a calendar month. The term originally meant an impossibility, Elizabethans not having seen the aftermath of the Krakatoa eruption or other events that introduce huge amounts of particulates into the air. Also missing from the list of moon names is the honeymoon. There are many folk etymologies explaining why a honeymoon is called a honeymoon, but the apparent true origin of the term is rather more cynical than one might have expected from a nice sixteenth-century lexicographer like Richard Huloet:
Already, in the confused improvisation of the first encounter, the possible future of a cohabitation is read. Today each of you is the object of the other’s reading, each reads in the other the unwritten story. Tomorrow, Reader and Other Reader, if you are together, if you lie down in the same bed like a settled couple, each will turn on the lamp at the side of the bed and sink into his or her book; two parallel readings will accompany the approach of sleep; first you, then you will turn out the light; returning from separate universes, you will find each other fleetingly in the darkness, where all separations are erased, before divergent dreams draw you again, one to one side, and one to the other. But do not wax ironic on this process of conjugal harmony; what happier image of a couple could you set against it? (Italo Calvino, If on a winter’s night a traveler)
Today, Andy Warhol would have been 75 years old. His work has aged well. By work, I don’t mean his art. Most of it is clever and some of it is good, but thirty years of progress have removed any novelty that Warhol’s insights once possessed. Warhol made art in the age of television and tabloids, but that’s been the state of art for at least as long as I’ve been alive. By work, I don’t even mean his prescient decision to surround himself with a gallery of Greenwich Village artists, beautiful people, and weirdos. I don’t mean the wonderfully apt idea that we’ll all be famous for fifteen minutes, or Interview magazine, or the charming museum he left behind, or his generally interminable films, or the re-introduction of wigs for men. No, I mean the Velvet Underground, and specifically the Velvet Underground’s role in the fall of Communism. The VU were responsible for their own genius, of course, but Drella helped make the world safe for democracy when he presented them to a larger world through his movies and Factory gigs. The line about the Velvet Underground, who never broke into the Billboard Top 100, is that they only sold a thousand records but every single person who bought one went out and started a band. Usually people are thinking of the New York club scene of the ‘70s and explicit followers like Jonathan Richman. But off in Czechoslovakia, a band of hippies called the Plastic People of the Universe (the name taken from a Frank Zappa song) heard the VU and felt the healing power of rock and roll. The Soviet invasion of 1968, however, brought with it bureaucrats who rejected the decadent bourgeois music, driving the Plastic People of the Universe — and their performances, multimedia extravaganzas inspired by Warhol’s Greenwich Village bashes — underground for a decade. In 1974, hundreds of their fans waiting to hear an illegal performance were herded away and beaten by police.
In 1976 they were arrested. On the first day of 1977, having attended their trial and been appalled by what he felt it said about Czechoslovak society, Vaclav Havel and a group of dissident intellectuals formed Charter 77. (The line between Czech artists, intellectuals, and politicians remains blurry today.) Havel spent years in prison for organizing against the Communists; in November of 1989, student protests led to a general strike. Havel’s Civic Forum organization demanded the resignation of the Communist government and the release of political prisoners. Amazingly enough, in the face of hundreds of thousands of demonstrating Czechoslovakians (and a Soviet Union not prepared for another Prague Spring), they got it. Vaclav Havel became Czechoslovakia’s first post-Communist president (and, later, the first president of the Czech Republic). Andy Warhol’s mother, nice Slovak woman that she was, would have been proud. And they called the transition to democracy and freedom the Velvet Revolution.
Eugene Podkletnov is a Russian materials scientist who claims to have discovered a gravity shield. An article he published in a peer-reviewed physics journal claimed that items suspended over his device — a series of rotating superconducting discs — lost a measurable amount of weight. Serious scientists, people who work for places like NASA and Boeing (which denies everything, a clear sign that something’s afoot), investigated his claims. Alas, they were unable to reproduce them. But if you’re not convinced that this amazing scientific breakthrough will… (http://paranormal.about.com/library/weekly/aa081902a.htm)
The English prisons of the eighteenth century were devoted to one thing: holding prisoners. Crime, even violent crime, continued unabated inside the prison walls. At the time, justice was somewhat arbitrary to begin with; the fact that wealthy prisoners could buy better accommodations while poor ones were given worse food and lodging (and died more often) makes it seem even more so to modern eyes. Utilitarian philosopher Jeremy Bentham thought he could do better. He wanted to rationalize punishment.
Last night at Fort Reno, the Dismemberment Plan played what is supposedly their final show ever in the United States. The Plan were probably the still-extant band I’d seen most often; now they’re just another entry on my "favorite local band" chart. DC seems to have a knack for producing bands that ninety-nine percent of America has never heard of but that record-shop-haunting types the nation over adore; I blame Dischord.
For over a thousand years, doctors in Europe depended on Galen. Galen represented the pinnacle of Hellenic medicine. He experimented, for one thing: he pioneered the diagnostic use of patients’ pulses, demonstrated that the kidneys produce urine, and explored the role of nerves in keeping animals alive. But he didn’t understand the role of the heart in blood circulation and didn’t, in fact, understand blood circulation at all. Numerous inaccuracies crept into his work, some possibly due to his philosophical beliefs and some due to his practice of dissecting not human corpses but those of pigs, sheep, and apes.
If Gutenberg had been born just a few centuries earlier, there might have been no need for the university system. A medieval culture of the book, relying on written authority in both secular and religious matters, was developing, but there simply weren’t many books to go around. Production of books at scriptoria was slow. At a time when an extensive private library might contain two or three dozen books, Oxford’s Oriel College had 52 books in 1375; Cambridge, then as now vying with Oxford for the title of England’s center of learning, had 122 volumes on its library shelves fifty years later. (On the Continent, where academic matters were somewhat more advanced, some monasteries of the period had hundreds, even thousands of volumes; these were the books copied in the scriptoria and circulated throughout Europe.) A gathering of scholars could share books. And there were other advantages.
One of the joys of the Internet is that people with unusual interests can achieve critical mass and find one another: if you are, say, a recumbent bike enthusiast in North Texas or someone who likes listening to auto races via scanner, you can find compatriots online. So too for coffee nerds. Still, as wonderful as resources like Coffeegeek are for true obsessives, all the websites in the world can’t match the impact of one ubiquitous chain. Starbucks is, of course, a frequent target of anti-globalization activists (as well as ornery individuals) because its amazingly successful business model has made it the most prominent coffee company in America. It’s often chided for its habit of setting up shop near smaller coffeehouses and drawing foot traffic away from them; sometimes the business lost to Starbucks is enough to make the difference between staying open and closing down. In the towns and suburbs where there weren’t coffeehouses of any sort before Starbucks arrived, however, its coffee can be a godsend. For all the twentieth-century coffee world’s technical developments (most especially the modern espresso machine, a recent arrival in coffee’s centuries-old tradition), none had as profound an impact on American coffee consumption as the rise of instant coffee. At one time, every major city in America had several coffee roasters; there were once 45 roasters in Ohio alone. Alfred Peet, whose chain of stores remains a Bay Area institution, started his first coffeehouse in 1966 as a side business to coffee roasting. But the slow move to bland yet highly caffeinated robusta beans — used as cheap filler in instant coffee — helped kill American coffee culture almost everywhere in the country until Starbucks brought it back. The company’s efforts at social activism might not matter nearly as much as the growth in coffeehouses that accompanied (and perhaps was even triggered by) the chain’s inexorable spread.
Starbucks is everywhere, and coffee smells like money. Even fast-food chains are now looking to make a buck off of gourmet coffee. And if Starbucks falters, another chain — or even an alliance of independent coffee shops, doing it Booksense-style — will rise up to replace it. America eats outmoded retailing concepts alive, and it doesn’t need an overroasted $4 mocha to wash them down.
Herschell Gordon Lewis is a marketer. He is the author of the wildly successful how-to book Writing Direct Mail Copy, as well as such other guides to selling as The Businessman’s Guide to Advertising and Sales Promotion, Writing Letters that Sizzle, and How to Handle Your Own Public Relations. He was the author of Omaha Steaks’ direct mail campaign. He’s a senior fellow at the International Society for Strategic Marketing, and has worked as a marketing consultant for Lenscrafters and Barnes & Noble. His current career is, in fact, only a few shades removed from his previous incarnation as "the Godfather of Gore". Herschell Gordon Lewis made exploitation films.
Why did capitalism in its modern form take root in Europe, particularly Western Europe, and not elsewhere in the world? Max Weber tried to explain this in his famous The Protestant Ethic and the Spirit of Capitalism. Weber was examining the sociological implications of Protestants’ — particularly Calvinists’ — attitudes towards work and investment. Drawing a more direct correlation between doctrinal belief and economic behavior is difficult. Viewing a church — particularly one with centuries of history behind it — through the reductionist lens of its effects on economic thought and development is essentially meaningless. Should Catholicism be primarily identified with Dorothy Day’s socialism, the development of international banking under the Templars, the Church’s teaching on usury, or the codification of double-entry accounting by Brother Luca Pacioli? The New Testament’s attitude ranges from the parable of the talents to the admonition that "it is easier for a camel to go through the eye of a needle than for a rich man to enter the kingdom of God." One can find interesting historical trends, like the prevalence of Jewish professionals throughout history, and there are any number of instances of religious movements that seem inseparable from their attitudes toward money. Saying that God wants you to be rich isn’t going to lose authors any fans, but Western religions — as opposed to Western religious figures — generally avoid a fixation on secular wealth. Strictures about tithing and the occasional more spectacular contribution to God’s works are usually as far as it goes among mainline religions. Which makes the Darqawiyya sect of Islam, run by Scottish-born Abdalqadir as-Sufi al-Murabit, so interesting; one of the tenets of the faith is the importance of returning the Muslim world to the gold standard (in this case the gold dinar coin, as mentioned in the Koran).
The sect is small, though fairly widely distributed, and where the Sufi tradition from which it arose is centuries old, the Darqawiyya goal of a coup de bank took a uniquely modern form. In order to crush the heretical banking system that al-Murabit identified as one of the great enemies of Islam and return the Muslim world to its rightful place, the sect invested in a revolutionary dot-com.
When Jeanne Calment died in 1997, she had outlived every single one of her contemporaries. At 122, she was the oldest living person in the world; married in 1896, she had outlived her husband by 55 years and her only grandchild by 37. At 90, she had sold her apartment to a French lawyer on a contingency basis; she would receive 2,500 francs a month, and when she died, he would inherit it. She collected the money for 32 years, receiving twice what the apartment was worth, and outlived the lawyer by a year. It will take Henrietta Lacks a while to catch up with Jeanne Calment, but she will someday. Ms. Lacks is at something of an advantage, however, having died in 1951.
The influenza pandemic of 1918 killed at least twenty million people. Some mutation had transformed a disease that usually induces a week of misery into something deadly, something that killed the young and healthy even more readily than it killed children and the elderly. La Grippe ripped through American and European hospitals, already strained by the casualties of World War I. More than forty thousand American soldiers died from influenza; in comparison, fifty thousand American soldiers died in action or from wounds throughout the Great War. In Europe and in America, the dead overwhelmed funeral homes and mortuaries; in some parts of the U.S., public funerals were outlawed. When people seem to be overreacting to SARS, this sort of global, widely transmissible pandemic is what they have in mind. Compared to the Spanish flu, the common cold is as comforting as a warm bath, but it sure doesn’t feel that way. As Ogden Nash wrote:
The census, conducted every ten years, is so important that it’s in the Constitution. The proportion of the House of Representatives allotted to each state is determined by the decennial census, which helps keep the Representatives representative. The nifty genealogical research that historic census data makes possible is merely an afterthought, but even so the American census is more pleasant than most censuses, which have historically been used to figure out the tax rolls. The census was known in Biblical times and makes its way into European history thanks to Servius Tullius, one of the last kings of Rome. The Roman practice continued for four hundred years, and the magistrates in charge of the census, the censors, were among the most important officials in pre-imperial Rome. The census enables taxation, taxation enables the state, and the state enables the census (whole bureaucracies spring up to support census-taking), so it’s unsurprising that there has long been some opposition to census participation. George Washington noted, during the first American census, that
By definition, I’ve never heard of any truly successful forgers. The best forgery is one which is never detected. A close second, however, would have to be one which is detected and then celebrated in its own right. Things don’t work out nearly so well for most skilled forgers. Tom Keating, who forged over a thousand paintings, including more than a dozen spurious Samuel Palmer watercolors, was reduced to making videos. John Drewe, who forged Giacomettis, is in jail. But the great forger of Vermeers, Han van Meegeren, found himself hailed as a national hero when, on trial for collaborating with the Nazis, he revealed that he had sold Göring a fake (and rather wooden) Christ with the Adulteress. After being found out and serving a jail sentence, Elmyr de Hory, one of the twentieth century’s most prolific great forgers (Modiglianis, Picassos, and Jackson Pollocks were among almost a thousand fakes he created over his lifetime), became famous in his own right. He was the central character of a biography by Clifford Irving and of Orson Welles’ F for Fake; today his fakes can command five-figure prices. In the age of mechanical reproduction, when it can be impossible for a layman to distinguish a van Gogh from a Wacker or a Schuffenecker, why not just collect the forgery?
The pop anthropologist Jared Diamond likes to use Easter Island as an example of how a human society can self-destruct. Over a few hundred years, a people that had once been prosperous enough to build the famous stone monoliths, the moai, crashed into extinction. The island was deforested, the societal structures fell apart, and by the time the Europeans arrived in the early eighteenth century, the once fecund island could barely support the Rapanui people; the Rapanui may even have turned to cannibalism. Diamond, in a lecture entitled "Why Do Some Societies Make Disastrous Decisions?", speculates that the Rapanui’s religious beliefs — particularly their continuing desire to build the moai — prevented them from altering their behavior until it was too late. But no one knows for sure, because we can’t read the Easter Islanders’ written records.
June Carter died last Thursday, and it’s a great loss. I don’t expect that Johnny will last out the year without her. June’s death also severs the last direct link to the original Carter Family, the group that basically invented recorded country music. A.P. Carter was a fiddler and traveling salesman with a fondness for the songs of the Appalachian hills; he met and married Sara Dougherty when she was 17, supposedly as she was sitting outside her family’s house singing "Engine 143". Sara’s cousin Maybelle (who married A.P.’s brother Ezra) joined them as a guitarist, and the Carter Family — after proving its commercial viability with a few singles recorded in Bristol, Tennessee — was born. The family was wildly successful before the Depression hit and sent A.P. wandering (collecting songs and looking for work). They recorded for the radio, including for the Mexican border station XET, but the rest of the family largely stayed in Maces Spring, Virginia. Sara and A.P.’s marriage fell apart, but Maybelle — whose method of picking on her extravagantly expensive $125 Gibson L-5 guitar came to be known as "Carter style" and was the standard bluegrass sound for twenty years — had decided that she liked show business, and with her daughters, June, Helen, and Anita, began recording as Mother Maybelle and the Carter Sisters.
Last week, the Texas legislature was brought to a halt by the flight of many of its Democratic members. Texan Charles Kuffner has been following the action. The fifty-odd Democrats holed up in Oklahoma, hanging out at a Denny’s just over the state border. This is great, farcical political theater, from Oklahoma state officials refusing to help the Texas Rangers and the Texas Department of Public Safety drag the "Killer D’s" back to Austin (the Texas Constitution gives the Legislature the power to compel attendance, following the lead of Article I, Section 5 of the U.S. Constitution) to House Majority Leader Tom DeLay’s use of federal law enforcement to find the Dems. The walkout — scheduled to end tonight, after Texas’ current legislative session ends — has the feel of an old-fashioned political donnybrook.
The oral tradition may not be dead, but it’s been buried in an academic journal. Nonetheless, as all-around smart guy Ray Davis noted in an email to me, if you squint right, the answer song looks like the latest — possibly the last — in a long line of forms taken by the oral tradition. Certainly Kool Herc, when he brought breakbeats and toasting to New York’s black music scene, wasn’t thinking about the praise songs of the West African griot. But I’m hardly the first to think that rap battles bear a significant resemblance to signifying or playing the dozens. And it’s not surprising; what else is communication for, if not talking trash about your enemies, bragging about your sexual prowess, and giving shoutouts to your friends? It probably dates back to Neanderthal days.
The answer song is not a parody. Weird Al Yankovic writes those for a living. Some people see parody songs as the best way to spread the Word (that last is via Waxy and Defective Yeti), but mostly they just are or are not amusing the first time you hear them. For a Canadian 7", Billy Childish did a brilliant sendup of the unmistakable Kingsmen recording of "Louie Louie" with the tune he wrote for the Headcoats, "Louis Riel" ("The Metis and the Cree did agree to live on the plains peacefully / At the battle of Batoche, the dream was lost / And with their lives they paid the cost / Louis Riel, oh man, you’re gonna hang / Hey-ah, hey-ah"), which would, in a better world, win him some kind of award, but the parody song is, for the most part, fodder for Dr. Demento.
The idea had been stirring in the brains of inventors and mad scientists throughout the end of the nineteenth century: use radio to transmit voices through the ether. Marconi’s sparks showed that radio waves could reach from England to the Americas. Reginald Fessenden (link via MeFi) harnessed the power of wireless for speech and music as early as 1906. Everyone knew that radio was revolutionary, a gold mine. But at the beginning of the twentieth century, people were just beginning to ask exactly what could be done with it.
On my birthday two years ago, I launched my Literary Year project. A year ago, I wrapped it up. Other people have been doing booklogs, in many cases better than mine. On the other hand, Diana of Field Notes, no slouch of a reader herself, said she was fond of Literary Year. I won’t flatter myself that I’m any great shakes as a critic — A. of Waggish is one of many far better sources for thought-provoking reaction to books — so I will ascribe Diana’s kind words to my catholic tastes. I’m starting a reviews log back up; unlike Literary Year, I will not even attempt to record everything I’ve read. Even aside from my inability to review everything, I read in bulk; this year, I reread almost the entirety of Steven Brust’s Vlad Taltos books, a handful of books by Raymond Chandler, three late Horatio Hornblower novels, and every book I own by Edward Eager. I love Eager, but it’s more difficult to write something interesting about the sixth treasured children’s book in a row than about, say, The Third Policeman upon first discovery. Nonetheless, I’ll give it a try every now and then, lest Diana think ill of me. Seven Things Lately will also contain more than just book reviews; it’ll be a consumption journal. In Providence and in Berkeley, I watched a hell of a lot of movies; now that the Silver Theatre has opened, I hope to resume the practice. Food and drink and comics and music and suchlike will all be fair game, as will websites I don’t think I can wedge into a proper Snarkout entry. (I have my own elaborate rules about what constitutes such a thing.) I’ve installed Movable Type and will be engaging in some shuffling over the next week or two; please forgive any broken links or hideous stylesheet foulups you encounter. In the interests of keeping this sort of tedious housecleaning update away from essays about con artists, decadent Frenchmen, the origin of nachos, and anti-Stratfordians, I’ve also added an announcement page that will be of little interest to anyone.
Thank you, readers, for indulging me; your continuing interest is always something of a surprise to me, and remains a wonderful gift.
Katie Roiphe is worried that people are reading What I Loved all wrong. She cites a New York Observer writer who carefully explicated all the similarities between the life of the author, Siri Hustvedt, and her protagonist. Roiphe seems to think that, mere coincidences like dates and biographies aside, it is an act of "sordid literary sleuthing" to feel that a novel about an academic (married to an artist) whose stepson is involved in a drug-related murder bears any relationship to the real life of a poet and novelist (married to a novelist) whose stepson was involved in a drug-related murder. Roiphe goes on to complain about the reception of her own family’s novels and notes that "in the mid-‘50s, when Mary McCarthy wrote her novel A Charmed Life, reviewers took her to task for myriad weaknesses, but not for the mere fact of writing about her ex-husband, Edmund Wilson." This is rather amazing. I know from The Morning After, Roiphe’s feminist-slamming book about date rape, that she earned a graduate degree in English at Princeton; it is a mystery why Roiphe, possessed of a Ph.D. in literature, refuses to mention the roman à clef.
Ben Franklin, the "first American", once declared that nothing in life was certain but death and taxes. Brother, if he only knew. In colonial and post-revolutionary America, the government funded itself using taxes levied against certain goods made in America and tariffs levied against imports. The occasional armed revolt broke out. Shays’ Rebellion in 1787 was at least partially about taxes, although largely about the more general economic hardship faced by Massachusetts farmers. The Whiskey Rebellion of 1794 was explicitly a response to a tax on whiskey dreamed up by Alexander Hamilton; the hard-drinking frontiersmen of western Pennsylvania didn’t take kindly to the idea, and rose up to fight until the rebellion was quashed by a militia under the personal command of President George Washington. In wartime, however, sin taxes and tariffs didn’t suffice. Income taxes were first levied during the War of 1812, and again during the Civil War. In 1894, a tariff-reduction measure brought with it the first peacetime income tax; the Supreme Court promptly struck it down as unconstitutional in 1895’s Pollock v. Farmers’ Loan & Trust. In 1909, the Sixteenth Amendment, legalizing the federal income tax, was proposed. To the joy of editorial cartoonists nationwide, it was ratified; decades later, cartoonists were still making a living off mocking the taxman (link via Kieran Healy). If one feels that income tax is bad or that progressive taxes are bad, one will find a great deal of company. The argument that income taxes are unconstitutional, however, would seem to be on shakier ground — there are all those laws and editorial cartoons as evidence! But people say it, and a whole microindustry has sprung up around providing escape mechanisms for tax protestors (link via Making Light).
One might think that the idea that Nebraska was not a state or that the flag in a courtroom annulled judicial authority or that a hidden Thirteenth Amendment made Congress obsolete would be harder to swallow than the idea that not paying one’s taxes is a crime, but apparently not. Just last week, a federal judge contemplated banning Irwin Schiff’s The Federal Mafia: How It Illegally Imposes and Unlawfully Collects Income Taxes in front of "a courtroom filled with vociferous tax opponents." I’m pulling for Mr. Schiff; banning books, even if they’re tax fraud manuals designed to sucker the unwary, creeps me out. I wonder, however, if Schiff is charging sales tax on those things.
The aftermath of the war in Iraq has begun; the images of Iraqis looting hospitals and museums have begun to filter back to the United States, where they’ve inevitably been spun. It’s been an amazingly bloodless campaign for the Americans, slightly less so for the Iraqis (but still much better than I had feared). The American military isn’t a police force, and thankfully it knows it’s not a police force; even if it were, it’s not at all clear that it has enough people to control a city of five million. Like Teresa, I’m appalled by the almost overnight dismantling of Mesopotamia’s history.
The baseball season is in its second week, and my fantasy team is already struggling to stay out of the cellar. My inability to draft decent starting pitchers is part of the problem, but the shoulder injury Derek Jeter suffered in the season opener didn’t help. I’m no fan of today’s Mr. Yankee, who I think is overpraised, but the hit he took looked ugly, and I’m happy that he won’t require surgery. Baseball is a non-contact sport. The death from heat stroke of Steve Bechler was shocking in part because serious injury on the diamond is so thankfully rare. There are only a handful of people (mostly players, a few umpires) who have been killed as a direct result of the national pastime, most famously Ray Chapman, a standout shortstop for the Cleveland Indians who was struck in the head by a Carl Mays pitch in 1920. Today, beanballs are much less common, and when people get hurt, it’s usually something relatively minor — Ken Griffey’s hamstring, Juan Gonzalez’s thumb — or a result of the fantastic stress of heaving a ball at 95 miles per hour. Pitching is strenuous enough that it’s somewhat surprising how long-lived some pitchers can be. Nolan Ryan’s career lasted 27 years after its inauspicious start; Nolan was 6-10 over his first two seasons in the majors before going on to win 324 games. Jesse Orosco isn’t nearly the pitcher that Ryan was, but if he continues to be used as sparingly as he is (a recent save was his third in the past five seasons), there’s no reason he can’t hold up for another few years. But it seems likely that no one will ever replicate the longevity of Hall of Famer Satchel Paige.
Today is the birthday of a very special little boy. In 1951, Osamu Tezuka, the "father of manga", created a character known as "Mighty Atom" in a Japanese comics magazine. "Mighty Atom", better known in America as Astro Boy, was a robot with the futuristic birthdate of April 7, 2003. Today, celebrations are taking place in Japan and at scattered locations in the U.S. (link via Boing Boing). Astro Boy, along with Tezuka’s Kimba, the White Lion, became the basis for the Japanese animation industry, today one of Japan’s major export industries. But April 7 has arrived, and despite the advent of breakdancing robots, the market for robot superheroes with a heart of gold is, as yet, not fully developed. That’s the danger of putting an expiration date on the future; the University of Illinois computer science department had to have HAL’s birthday party without HAL. As Bruce Sterling has noted, even when science fiction got things largely right, it missed on the particulars. And Sterling’s examples of getting things largely right weren’t so much the product of visionary futurism as a reading of history; Orwell’s 1984 was two parts Stalinist Russia and one part BBC bureaucracy, and Heinlein’s theocracy in Revolt in 2100 was Huey Long crossed with Father Divine. But Heinlein’s heroes sound suspiciously like Heinlein characters rather than residents of the future, and Orwell didn’t predict the control devices of today. With a few exceptions — "Deadline", a short story that described the atom bomb, is one famous example — predictions generally fall flat. Perhaps that’s for the best; a writer in 1951 might have been able to guess that the Soviet Union would fall or that we’d have access to battery-powered toothbrushes by April 7, 2003, but would he or she have predicted improvements in dental hygiene or a loosening of political repression under Khrushchev?
Could the facts of Braun’s product development cycle or the Velvet Revolution have the same narrative appeal as something that a talented writer just made up? Truth is stranger than fiction, but truth can’t provide a robot boy with seven different powers and a strong sense of justice. Thank you, Osamu Tezuka, and thank you, Dr. Elefun. Happy birthday, Astro Boy!
According to the New York Times, Michael Drosnin has been briefing intelligence analysts at the Pentagon (link via Charles Kuffner). Drosnin is the author of The Bible Code, the 1997 bestseller which asserts that the Torah contains specific predictions about the future hidden in equidistant letter sequences. Gematria, reducing the words and letters of the Torah to numbers, is a kabbalistic exercise with millennia of history behind it, but I sure hope that nobody’s putting too much stock in Drosnin’s methodology. Drosnin arranges the letters into grids of arbitrary width; unfortunately, the method is flexible enough that one can produce almost any result one chooses. Responding to a throwaway comment by Drosnin (who asserts that the Torah predicted the assassination of Yitzhak Rabin), skeptics have produced assassination predictions in Moby Dick. I’m sure if they wanted, they could produce textual evidence predicting Saddam Hussein’s death or the appearance of SARS or Marquette upsetting Kentucky. Given enough latitude, anything that a researcher recognizes as a pattern can be considered a valid result. The whole thing resembles nothing so much as the attempt to find anagrams proving Francis Bacon’s authorship of Shakespeare.
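To see how permissive the method is, here is a minimal sketch of an equidistant-letter-sequence search — my own illustration of the general technique, not Drosnin’s actual procedure; the function name and the skip limit are invented for the example:

```python
# A toy equidistant-letter-sequence (ELS) search. Strip a text down to
# its letters, then look for `word` spelled out by reading every
# skip-th letter, for every possible starting point and skip.

def els_positions(text, word, max_skip=50):
    """Return (start, skip) pairs where `word` appears as an
    equidistant letter sequence in `text`."""
    letters = [c for c in text.lower() if c.isalpha()]
    word = word.lower()
    hits = []
    for skip in range(1, max_skip + 1):
        for start in range(len(letters)):
            # Read len(word) letters, `skip` apart, starting at `start`.
            candidate = letters[start : start + skip * len(word) : skip]
            if "".join(candidate) == word:
                hits.append((start, skip))
    return hits
```

Run this over any decent-sized stretch of English and short words turn up at dozens of (start, skip) pairs; since the grid width — the skip — is chosen after the fact, the searcher can "find" nearly anything, which is exactly the skeptics’ complaint.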
I don’t write about politics very often. There are two reasons, neither of which is a lack of interest. First, there are enough dabblers in the subject on the Internet; the world doesn’t need to hear from another person who mistakes smarts for political insight. Second, I generally feel that one’s political beliefs are best left out of conversations with strangers, a rule that tends to protect both participants. (If you, Dear Reader, are not a stranger, I apologize heartily. Email me and I’ll willingly rant away.) My misgivings about the long-term effects of America’s disregard for our allies, my disgust with what I see as the Bush Administration’s deliberate attempt to blur the issues surrounding the conflict in order to gain domestic political capital, and my discomfort with what I see as reflexive and ill-thought-out anti-war stances on the left are political issues. I’m not going to talk about them. There are fine writers (and decent people) who seem to share a great many of my beliefs and concerns, and if you’re looking for that sort of thing, I recommend you visit Kevin Drum’s CalPundit and find your way from there. Discussing the war — be it my concerns about a looming Turkish-Kurdish-Iranian conflict, the breakdown of Donald Rumsfeld’s "Afghan model" (which worked so very well in Afghanistan), or simply mourning the dead — is not a political issue. On the other hand, while I’m as knowledgeable about politics as anyone else who reads The Economist and the Post, that level of familiarity with matters of war is rather useless. I can’t tell how the war is going; no matter how much I read of the excellent coverage at the BBC or the rather astounding news roundup provided by the Agonist, I’m not going to be much better informed. Misinformation and disinformation are the bread and butter of wartime journalism. I can read the comments over at the left-leaning Daily Kos or the right-leaning Tacitus.
Kos and Tacitus are veterans and know viscerally what Army life is like, but I’m not sure anyone who’s talking really has any idea what’s going on. This is the live television coverage war, but if there’s a 21st Century Ernie Pyle out there, I don’t think he or she has been revealed yet. So it’s a diet of worry and obsessive reading. I feel useless and stupid and disconnected from the people running the country, but there’s nothing to do but hope and pray that this ends as well as it possibly can (and from my stance on day seven, it’s pretty clear that "as well as it possibly can" is not going to be the same as "as well as it possibly could"). We’re going to win the war one way or another, but I want this to be as bloodless as possible, for our men and women to get home safely, for it not to stretch into years of occupation and guerrilla war or turn into a multinational bloodbath. Someone has made a giant mistake, and I want it not to have been America. I want the long-term outcomes of this war to be a free and prosperous Iraq, not fascistic nations getting ideas about how to discourage war (or, for that matter, a shattered U.N. and a series of new American client states). I don’t want to turn on the news tomorrow and find out that three hundred Marines have died. I don’t want for it to be accepted wisdom some day that they died in vain.
Even were it not one of the busiest public transit systems in the world, transporting nearly a billion passengers a year, the London Underground would surely be among the most famous. It has appeared in dozens of British films from the ‘20s onward. It’s inspired artwork in the Tate, fan weblogs, and its own museum. The history of the Tube, which began as several independent lines, is often fascinatingly tawdry. The Bakerloo Line, for instance, was begun by British industrialist Whitaker Wright in the 1890s; when Wright’s commercial empire collapsed, he was put on trial for fraud and committed suicide by swallowing a cyanide pill, leaving American subway magnate Charles Yerkes, who had himself been convicted and jailed for misappropriation of funds in the 1870s, to finish the line. But the fact that a subway system an ocean away from me is instantly recognizable has nothing to do with skullduggery in its past; it’s the result of the creative works of two men, Edward Johnston and Harry Beck.
Japanese journalist Shun Akiba thinks he’s discovered a secret city beneath Tokyo. There are discrepancies in old subway maps; huge swaths of land seem to have disappeared from them. Could the Japanese have built some sort of civil defense structure beneath the city? The United States had a number of bunkers from which the government could operate in case of nuclear war, Mount Weather and the Greenbrier being the best known. Those were kept secret for some time. And I can’t say that I’m an expert in the post-war Japanese psyche, but if films such as Godzilla and I Live in Fear are any guide, announcing that huge, expensive shelters had been built to protect the Emperor and the government when bombs started falling on Tokyo might not have been a wise decision. So it’s certainly possible that these shelters were built for perfectly understandable reasons but kept classified for decades. On the other hand, perhaps Akiba is making too much of some circumstantial evidence because he really wants to believe in the existence of a secret underground city. Who wouldn’t?
A number of websites I read (1, 2, 3, 4) have taken it upon themselves to create a guerrilla celebration: Oulipo Day (or possibly Month or Year). It’s no zanier than "For Pete’s Sake" Day or Polar Bear Day or any of the other manufactured holidays from last month, so why not? Mind you, I haven’t ever read any Raymond Queneau or Georges Perec. I have, however, read Gilbert Adair. Adair is something of a Renaissance man; he is the author of Love and Death on Long Island (which was made into a movie starring Jason Priestley, so it must have been good) and a noted film critic, as well as an essayist and theoretician (about such topics as the origins of Death in Venice and the poetry of Bruce Andrews). More to the point, he’s translated a lipogrammatic novel by Perec; Adair’s translation (called A Void in English) takes the plot of Perec’s La disparition, a whodunit about the mysterious death of Anton Vowl, and, more amazingly, also carries off Perec’s trick of leaving out the letter "e". It’s been done before in English, but the excerpts and reviews ("Arranging for many such omissions in this book is our lurking author, a lipogrammatic artist and assassin who both plots Vowl’s doom and plucks his customary signatorial pictograph.") I’ve read make Adair’s version sound like a good deal of fun. Poe’s "The Raven" is the jumping-off point for "Poe, E.: Near a Raven", the most impressive exercise in the Oulipian vein I’ve seen to date (no slight intended to the authors of the Theory of Relativity as rendered in Made Apex Dean). An instantly recognizable poem with easily mimicked cadences must lend itself to Oulipo zaniness, because where Perec wrote lipogrammatic transpositions of Hugo and Verlaine, Adair does his own take on "The Raven":
The Mauch Chunk Switchback, a stretch of gravity railroad near Jim Thorpe, Pennsylvania, was meant for transporting coal. Its place in history was assured by the unknown genius who saved himself a walk down to the station by hitching a ride and discovering just how fun it was to hurtle down the mountain. When the rail was rendered obsolete by a tunnel in 1872, its owners switched it entirely over to thrillseekers. Purpose-built coasters soon began to spring up, including the Switchback Railway, built in 1884, and the Leap the Dips coaster, built in 1902 and today the oldest standing coaster in the world; George Ferris’ celebrated creation for the 1893 Columbian Exposition, a huge financial success in its own right, sent engineers racing to refine the concept. The Switchback Railway was followed by a host of imitators thanks to its immense financial success and its location: the tony seaside resort and racing destination of Coney Island.
Are you looking for a hobby that screams "extreme"? Is Civil War reenactment too much about thread counting, drilling, and maggoty hardtack, and not enough about the glorious violence? Gator wrestling too traumatic (for the alligator)? Training as a blockhead for the Jim Rose Circus Sideshow too much effort? Why not join the London Riot Reenactment Society (link via Need To Know)? This wonderful group attempts to bring London’s vibrant history to life! Major American riots of the last hundred and fifty years have unfortunate political overtones: New York and my beloved Baltimore broke out in waves of anti-Union and anti-black rioting during the Civil War; Tulsa’s black neighborhood of Greenwood was burnt to the ground and dozens were killed as a white mob ran amok; the Watts riots ended a more innocent age, highlighted the schism between white and black America (the LA Times had no black reporters and relied on an advertising salesman for coverage), and brought the reflexive convulsions of the urban underclass to living rooms nationwide in living color. In contrast, reenacting Wat Tyler’s Peasant Revolt of 1381 ("In a dramatic climax which will take place at Smithfield a re-enactor dressed as the 14 year old King Richard II will meet a re-enactor dressed as Wat Tyler, who will then be murdered by a re-enactor dressed as the mayor"), the Spa Fields Riot of 1816 ("Eighty re-enactors dressed as police will attempt to disperse the crowd of re-enactors, and one re-enactor dressed as Joseph Rhodes will be stabbed") and the Fourth Hunger March of 1932 ("Many re-enactors and police will be injured") is for the most part race-issue free, so potential reenactors can take part with a clear conscience and enjoy the sort of hearty, educational experience that the LRRS surely intends.
The best riot to reenact, of course, would be the gin riots: expressing one’s contempt for Robert Walpole and love for Mother Gin, putting the fear of God into informers, and making two great political philosophers very proud.
Before it became a symbol of fin de siècle decadence, absinthe was simply an herbal cordial with an attractive green hue. Like sarsaparilla, absinthe was considered to be medicinal; French soldiers in Algeria were given the stuff as a tonic against fever. Unlike sarsaparilla, absinthe was roughly 140 proof, and the soldiers apparently developed quite a taste for it. Although absinthe had been invented in 1792, and Pernod Fils, owner of the original recipe, opened its first French distillery in 1805, it wasn’t until the soldiers came marching home in the 1840s that the drink achieved popularity. It soon became a recognizable symbol of the demimonde: Van Gogh painted it, Rimbaud wrote about it, and Oscar Wilde quipped about it ("After the first glass, you see things as you wish they were. After the second glass, you see things as they are not. Finally you see things as they really are, and that is the most horrible thing in the world."). But it was also popular among France’s upper and middle classes; cocktail hour was l’heure verte. In 1905, when Swiss peasant Jean Lanfray went on a murderous rampage after drinking a minibar’s worth of liquor, newspapers seized upon it as an "Absinthe Murder." As if to confirm the diagnosis, an absinthe drinker in Geneva killed his wife with a hatchet. Tens of thousands of signatures were collected on petitions to ban the drink. In France, La Fée Verte would soon be a memory.
As a twelve-year-old, Bobbi Trout decided she wanted to fly. Outside of girls’ adventure novels, there weren’t many models for a schoolgirl who wanted to be a pilot. Journalist and photographer Harriet Quimby received her pilot’s license in 1911, becoming the first American woman to do so. Blanche Scott had received lessons from Glenn Curtiss, become a stunt pilot, and worked as a test pilot for Glenn Martin. But when the first Women’s Air Derby, memorably dubbed the Powder Puff Derby by Will Rogers, put out the call for contestants, only forty women qualified. One of them was Bobbi Trout, who had earned her license in 1928. Nineteen women took part in the race, which began in Los Angeles, looped up through Washington State, and ended in Cleveland. Louise Thaden won the race, with Gladys O’Donnell and Amelia Earhart finishing second and third. Mechanical troubles had knocked Trout out of the running, but she soldiered on and was one of the fourteen pilots to finish. Half a million people paid to see the Derby’s finish in Cleveland.
And speaking of pirates, I have to confess that I, like Cory Doctorow of Boing Boing, am looking forward to Pirates of the Caribbean. It’s vaguely disconcerting that movies are now being based on circa-1967 Disneyland rides (although I suppose it can’t be worse than The Country Bears). It’s more disturbing that the movie will be a product of Jerry Bruckheimer, a man who seems to have legally changed his first name to "schlockmeister". But I can’t help it; I’m a sucker for pirate movies, and I doubt there will ever be that many new ones. But why? What could be more cinematic than a tale of good, evil, and swashbuckling on the high seas?
Some people are calling today’s Super Bowl the "Gruden Bowl" after Jon Gruden, who left Oakland for Tampa Bay last year. (His current team won.) But since it was the Oakland Raiders vs. the Tampa Bay Buccaneers, I think it should go down in the books as the Pirate Bowl. Pirates have obvious merits as sports team mascots. They’re instantly identifiable. They’re graphically interesting. They’re mean. Both teams depict their pirates as pasty white guys, which slights the proud tradition of African, Arab, and Chinese piracy but forestalls any of the complaints that spring up when you name your team something like "the Redskins". (At least Louis Sockalexis played for the Cleveland baseball team, even if they weren’t actually named after him; the great Jim Thorpe played for a number of football teams, but not Washington.) The Pittsburgh Pirates got their name after "pirating" Louis Bierbauer away from the Philadelphia Athletics, but both Oakland and Tampa are port cities. However, where Tampa celebrates regional (and presumably nonexistent) pirate José Gaspar with its annual festival of Gasparilla, Oakland celebrates pirates by dressing up like extras from The Road Warrior. Raider Nation may be full of crazed, aggressive fans, but they’re not really crazed, aggressive, pirate fans. Better luck next year, Oakland!
The American West is a little smaller today. Douglas Herrick is dead (link via Boing Boing). Herrick, whose name I did not know until today, was a young man in Douglas, Wyoming, when he and his brother returned from a hunting trip with an oversized jackrabbit:
Public Enemy’s last great song is a historical artifact now that even Arizona honors Martin Luther King’s birthday with a holiday. Unfortunately, Dr. King now occupies a role in American iconography similar to Washington’s or Lincoln’s. It’s an odd term to use for a man who was "the son, the grandson and the great-grandson of preachers", as his masterful "Letter from Birmingham Jail" put it, but King has become a secular saint. It’s not a fair characterization of a complicated man. Reading about his epic struggles with Richard Daley gave me more of an impression of how he operated as a politician. J. Edgar Hoover’s attempts to ruin him as a symbol of the civil rights movement (which transformed from an investigation of Communist influence at the Southern Christian Leadership Conference to a pure smear campaign focused on King’s sexual proclivities) brought to light not just a government agency that had gone out of control but also a very human, very fallible man. In his last speech, given in support of striking sanitation workers in Memphis the day before he was assassinated, Dr. King lectured on the parable of the Good Samaritan:
Occasionally I see someone online make a particularly insulting comment: "What a maroon!" When I am particularly lucky, I will see another person put forth an indignant defense (thanks to Dan of Tinyblog for the illustrative example) of the term "Maroon". Maroons, we are reminded, were free blacks who formed communities throughout the Caribbean and Americas, most notably in the Great Dismal Swamp and Florida. These people were bravely attempting to set up a parallel existence to antebellum American society! They fought in the Seminole War, the only Indian war that ended without a peace treaty! How can we use the term "Maroon" as an insult? The problem is that while all these things are true, they have very little relevance to someone quoting Bugs Bunny mangling the English language. It’s a case study in how associations are not always apparent to one’s audience. Language drifts.
Eighty-three years ago today, Bostonians were enjoying unseasonably warm weather. The temperature had shot up forty degrees in three days, allowing lunching workers near North End Park to doff their jackets. Little did they know that they were about to be swept up in American history’s most laughable urban disaster. A huge industrial tank was about to explode, sending a "roaring wall of death" down Commercial Street with "a horrible, hissing, sucking sound." January 15, 1919, was the day a "wet, brown hell" was unleashed, the day of the Great Boston Molasses Flood. It sounds like an urban legend, but contemporary reports make it clear that it was no joke. The tank held almost two and a half million gallons of molasses, roughly fourteen thousand tons. When it ruptured, molasses sprayed out with a pressure of several thousand pounds per square inch at an estimated speed of 35 miles per hour and leveled several city blocks.
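The gallons-to-tons figure checks out on the back of an envelope, if you grant my assumed round number for the density of molasses (it’s roughly 40 percent denser than water):

```python
# Rough sanity check: millions of gallons of molasses, expressed in tons.
# The density figure is an assumption for the sake of the estimate.

GALLONS = 2.3e6            # "almost two and a half million gallons"
LITERS_PER_GALLON = 3.785  # one US gallon in liters
DENSITY_KG_PER_L = 1.4     # molasses: ~40% denser than water (assumed)
KG_PER_SHORT_TON = 907.185

mass_kg = GALLONS * LITERS_PER_GALLON * DENSITY_KG_PER_L
mass_tons = mass_kg / KG_PER_SHORT_TON  # lands in the low teens of thousands
```

Which is to say: a tank that size really does hold on the order of thirteen or fourteen thousand tons of molasses.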
Last year, George Gelestino, the owner of the late, great Vinyl Ink record store in Silver Spring, Maryland, passed away. When I was a teenage indie kid in suburban Maryland, I only knew of a few choices for where to go to get that Harriet 7” all the zines were raving about: Soundgarden in Baltimore, Vinyl Ink, and later Go! in Arlington. The local chain, Kemp Mill, wasn’t going to cut it. Musicland was a joke. Tower was a distant urban dream; I mostly went when I was visiting relatives in Cambridge, Mass. Until I got a car, I did most of my record purchasing through the mail, buying from distributors like Parasol and a million and one tiny labels with ads in the back of Option and Punk Planet and Caught in Flux. But it was still good to go to Vinyl Ink, even if I perceived it as the sort of collector-scum-dominated store that knew people existed who would pay four hundred bucks for a rare Stereolab single on clear vinyl. As a deeply insecure 15-year-old, I always found it slightly dizzying to shop there; Vinyl Ink seemed cool, cool in a way that I was not.
Readers may have noticed that my posting frequency has plummeted over the last few months. I apologize; my life has come to feel something like a plate-spinner’s act. While I may not be up there with Leonardo or Jaq Schmidt (today’s plate spinners probably aren’t up there with Leonardo or Jaq Schmidt themselves; while everyone, myself included, loves to use plate spinning as a strained metaphor, and jugglers seem to be keeping the faith, when was the last time you saw one on network TV?), things have been really busy lately. Hopefully life will be somewhat calmer in the first few months of the new year and I can make something of the various notes and scribblings I’ve strewn around the office. (Some structural change will also be coming to Snarkout as soon as I can find the gumption to sit down and make it happen.) In the meantime, have you considered visiting one of the sites on my bookmarks page? Pure gold, the lot of them, even the ones that aren’t there any more. I read Boing Boing, Leuschke, and Girlhacker on a daily basis. Those creepily obsessed with my everyday life can get an idea what I’m eating by reading V.’s Hungry Tiger food journal; for those more interested in the sorts of things that rattle around my head, while Juliet is still (sadly) on hiatus, her archives remain dandy reading, and Portage has returned; WorldNewYork is back too. And it’s not remotely like what I write (primarily by virtue of being so damn funny), but Izzle Pfaff! should be read now, now, now.
The International Documentary Association has recently announced that Michael Moore’s Bowling for Columbine is the number one entry on its top twenty list of "all-time favorite non-fiction films." Now, I was a big fan of TV Nation. I thought Roger & Me, Moore’s apoplectic response to General Motors’ treatment of his decaying hometown of Flint, Michigan, was a fine piece of work. I’ll state up front that I haven’t seen Bowling for Columbine. Even ignoring the curious coincidence that the best "non-fiction film" ever made was made just this very year (despite the fact that non-fiction films date back to the very beginning of commercial cinema; a paying customer could have seen Burial of the "Maine" Victims during the Spanish-American War), however, it seems highly unlikely to me that Bowling for Columbine is the best non-fiction film ever made; it’s almost certainly not the best documentary. Moore isn’t really a documentarian. He’s a polemicist.
There are a number of things that are, even the most dedicated Francophobe would admit, best experienced à la française: New Wave cinema, cave-ripened cheeses, Jerry Lewis appreciation. To this list, one might add "anti-religious rhetoric". Seven years after the most famous American atheist was murdered, Madalyn Murray O’Hair still has people worried that she’s going to boot God off T.V., but "America’s most hated woman" was in many ways a rather bland figure, living quietly in Austin and scratching out "In God We Trust" from coinage. The Sisters of Perpetual Indulgence are all in good fun, and even Mark Hofmann, the "Mormon bomber", was at least partially motivated by covering up his many criminal enterprises while he attempted to destroy the Latter-Day Saints through the power of explosives and embarrassing forgeries. Hofmann was murderous, but somehow he just doesn’t seem as meanspirited as someone like Michel Mourre, who infiltrated the 1950 Easter Mass at Notre Dame in Paris. Dressed as a monk, Mourre stood in front of the altar and read a pamphlet proclaiming that God was dead. If it had been available, he might have played a recording of his fellow Frenchman Antonin Artaud.
Teresa Nielsen Hayden has produced a marvelous taxonomy of the con; like her, I’m not quite sure about the fine lines of the distinctions she draws (where would something like the fiddle game fall? Is it a bait-and-retraction or simple misrepresentation?), but it’s a pretty comprehensive list, covering everything from phony bonds to the tax avoidance methods sold by con artists to a ready market of people who really, truly, honestly believe that the I.R.S. is illegal. In some cases, like that of Frank Abagnale, the categories just blur; in his teens and early twenties, Abagnale, who was apparently an unflappable actor, passed himself off for two years as (among other things) a doctor and a Pan Am pilot, but his money mostly came from kiting checks, using both his impressive abilities at convincing people he was who he wasn’t and some early exploitations of the national check processing system. Abagnale now makes his living as a wildly self-promoting lecturer and security consultant; I’m not sure quite where that falls in Teresa’s taxonomy. A movie based on his life and directed by Steven Spielberg will be out this winter; I’m relatively certain I know where Hollywood falls. But Teresa has covered every major variety of con game that I know of; there are only so many basic ways to convince people to hand over their money, although there’s a vast number of variations. One of these, the Nigerian 419 scam, is (as Teresa notes) simply an updated version of the Spanish Prisoner.
The introduction to the Ace edition of Fritz Leiber‘s Swords Against Deviltry notes that
There are fewer truly great criminal masterminds than there might be. The average bank robbery nets less than $5000, and the would-be Willie Sutton should remember that most bank robbers are caught. Kidnapping tends to turn into murder all too frequently, and, like blackmail, depends on the victim playing along; when dealing with a hard nut like John Paul Getty, who negotiated his grandson’s ransom down $15 million, this is a tenuous proposition at best. In a world where corporate malfeasance can lead to billion-dollar frauds, what can the everyday street criminal do to measure up? Even the crimes of Adam Worth, the criminal mastermind upon whom Professor Moriarty was reportedly based, seem picayune in comparison. Where are the criminals who think big? One was Oscar Hartzell (thanks to Defective Yeti for bringing this book to my attention), the swindler who bilked Depression-era Americans out of millions by claiming that he was going to win an immense proportion of England’s national wealth in a lawsuit stemming from the disputed estate of Sir Francis Drake. When Hartzell was dying in prison, he apparently believed that his lies were true, that the Drake fortune was real, and that he was the rightful ruler of much of the free world. What if some Napoleon of Crime tried to take over the world or, if conquering the world was too difficult, a nation?
The company once known as British Petroleum now wishes to be called BP, in a rebranding effort that’s been a long time coming (at a temp job in 2000, Redfox helped put together an identity guideline kit for their helios logo). They’ve been running some commercials touting themselves as "Beyond Petroleum", and Slate’s Daniel Gross is unimpressed. He feels their move "inspires no small amount of cognitive dissonance"; the subhead of his critique claims their ad campaign "makes no sense." Gross is almost certainly correct that cynical, possibly hypocritical, self-promotion is behind the rebranding effort; that is admittedly quite often the basis of ad campaigns. John Browne, BP’s chief executive, comes off as one smart cookie; if a veneer of social responsibility provides benefits to BP, Browne seems like he’ll pick up on that fact. Further, while I have some issues with it as a design movement, I find something very reassuring in science fiction writer Bruce Sterling’s Viridian capitalist thought experiments. America hasn’t shown any desire for a Manhattan Project-style undertaking to wean ourselves off fossil fuels, so I’m all in favor of Sterling’s notion of market-driven environmentalism: driving environmental policy through the creation of gadgetry that consumers want, thanks to pricing, novel features, or desperate hipness (or potentially all three — link via Boing Boing), which also advances the state of the art in environmentally efficient disposable consumer culture. Sterling loves BP; a major player in the world petroleum market actively trying to make a buck on renewable resources is a good thing, assuming they’re better at staying in business than some folks. Gross is right that solar power isn’t going to be more than a blip on BP’s earnings for years, if not decades, but I don’t think they’re in it for their health.
I think BP thinks it can grow to be a leader in the solar and hydrogen (via its immense natural gas extraction capabilities) markets of the future, but if they can’t, they can at least get their money’s worth in good publicity.
What were the most beautiful experiments in physics (link via Making Light, among others)? The concept of beauty (or "elegance", as mathematicians tend to put it) in science is no better defined than it is in art or literature; it’s a subjective quality that’s been argued over for millennia, probably since humanity first put burnt stick to cave wall or arranged some shells. Physicist Chad Orzel mused about the list and wondered why the Michelson-Morley experiment (which disproved the existence of the luminiferous aether, long assumed to carry light waves) didn’t make the cut. His guess is that experimental complexity (or its close analog, the degree to which a physics professor can demonstrate the experiment during a lecture) had something to do with it. But the Michelson-Morley experiment seems elegant to me; its results were surprising and important, it was conceptually (if not in practice) a simple experiment, and it involved proving that something didn’t exist.
The poor people of Grover’s Mill never knew what hit them. On the evening of Halloween, 1938, as America watched war engulf Europe, the small town in New Jersey was visited by warfare of a different sort. On the Columbia Broadcasting System, an evening of dance music featuring Ramón Raquello was interrupted by one shocking announcement after another:
One hundred and twenty-three years after Luigi Galvani electrocuted a dead frog, making its muscles contract and its legs twitch, a crowd gathered at Coney Island waiting to see an execution. Topsy the Elephant had killed three people, and even if one of them had fed her a lit cigarette, no one was much interested in granting clemency. Luna Park on Coney Island featured a number of elephants who gave rides to the amusement park’s guests, and there was no way that they were going to let a rogue elephant remain at the park. The concept of animal cruelty was a more flexible one then; still, the ASPCA had objected to the original plan to hang Topsy, so unlike Mary the elephant, hanged in 1916, Topsy was going to die in the most modern and humane way possible: she was going to be electrocuted.
In 1453, Constantinople fell to the Turks; for two centuries more, Europe faced an Ottoman Empire that threatened to conquer Vienna and spill into central and western Europe. But as Byzantine refugees fled into Europe, they brought with them their scholarship, helping to spur the Italian Renaissance forward, particularly with their translations of (and abiding interest in) Plato. But they brought something else with them: the Corpus Hermeticum. These books were the works of Hermes Trismegistus, contemporary of (or perhaps even precursor to!) Moses and greatest magician of the age, Hermes the Triple Master, Hermes Thrice-Great.
Despite the paucity of my updates lately, I’m still out here. Last weekend, I moved into a new apartment a few blocks away from my old one, which was a great deal of sound and fury for very little physical translation. The new apartment has more light and more room than the old one, but a smaller and more poorly designed kitchen, which I hope will not impact V.‘s principal hobby too badly. I hope to be back posting regularly as soon as my DSL is up and running, but given that I’m dealing with the most Kafkaesque of the Baby Bells, I don’t expect that before next week. Some of my favorite compatriots seem to have succumbed to the beginning of the semester, but I have entirely different problems. Additionally, with the anniversary of Sept. 11th tomorrow, I haven’t felt hugely enthusiastic about writing something cheerfully trivial. Chad Orzel’s advice for commemorating that horrible day seems as good as any: "Buy a cop lunch. Buy a fireman a beer. Go to church, light a candle and pray." When the news was breaking, I immediately thought of giving blood; a year ago, donations surged beyond America’s needs (and even America’s storage capacity), but blood banks are still around and would still appreciate help. Thanks for bearing with me in my quiet time; I’ll be jabbering again real soon.
I’ve just found the solution to a minor mystery; every now and again when I’m reading some old science fiction, I run across a story or a well-conceived book review by H.H. Holmes. This in itself is not terribly surprising; I read a fair amount of older science fiction, and one of the local junk stores sells old magazines, including some back issues of Fantasy and Science Fiction. There are any number of writers whose work I’m vaguely familiar with and whose names hover right below a level at which I’m consciously aware of knowing who they are. I’d have assumed that H.H. Holmes was one of these, a John Wyndham or a Ward Moore, were it not for the fact that Dr. Henry H. Holmes, a Chicago druggist, was also America’s first serial killer, whose house of horrors was exposed in the 1890s after an insurance scam went awry.
You wouldn’t know it to look at the Western art he’s been painting since his retirement, but Al Feldstein played at least two small but crucial roles in American popular culture. Both stemmed from his position at Entertaining Comics. EC had been founded by Max Gaines, who had earlier worked at Dell, perhaps the first real comic book company. Gaines went on to found All-American Comics (later merged into DC; a number of All-American’s titles, penned by comic book pioneer Gardner Fox, are still available today in slightly different forms). The name originally stood for "Educational Comics" and later for "Entertaining Comics", but seems to have simply been a nod to Gaines’ history with DC. When Max Gaines died young, his son William inherited EC, which had by then largely abandoned the educational comics game to become a publisher of funny animal books. Gaines doesn’t seem to have been terribly interested in the company until Feldstein came along, but the two seem to have gotten along like a house on fire. They quickly launched into crime comics, following in the footsteps of the lurid bestseller Crime Does Not Pay, and then (writing the stories themselves) began publishing Tales from the Crypt, The Vault of Horror, and the rest of the EC horror comics. The two men had created a genre.
From summer 2001 to spring 2002, photographer Simon Høgsberg (link possibly via the excellent photography weblog Consumptive) camped out at Marble Arch in London and took pictures of pedestrians as they walked by. The photos themselves vary in quality, but there’s something wonderful about the project of making art based on everyday people. It’s vaguely reminiscent of Daniel Meadows’ "Photobus" work, a series of free portraits he took in England in 1973 and again (with many of the same participants) in 1998. It’s even more reminiscent of some of the work at the Hirshhorn’s current "Open Cities" street photography exhibit, particularly the work of Beat Streuli and (especially) Philip-Lorca diCorcia‘s wonderfully cinematic street photography.
Patrick Farley, the man behind "The Guy I Almost Was" and "Apocamon", has been working on a comic about Afghanistan. It’s called Spiders, and part three is now available. Scott McCloud is a big fan; Spiders gets up to some of the nifty tricks McCloud talked about in Reinventing Comics. Like When I Am King, my personal standard for showing how comics can take advantage of not being on paper, Spiders (and much of Farley’s other work) eschews rectangular layouts in favor of long horizontal and vertical strips; in part three, he uses vertically offset strips of panels to indicate simultaneous actions. And he foregrounds the medium of the Web, something I don’t think I’ve seen before in online comics. The "Voxpop" splash screen serves as a sendup of Salon (of course the alternate universe’s Salon would focus on Zawahiri’s danger fetish) and as a way of introducing the story; since we know that Farley’s war is largely being fought remotely, over the net, it also serves as a rather elegant infodump. Farley isn’t just writing an excellent online comic; he’s writing very good science fiction. Spiders reminds me of Bruce Sterling for more than just the (quite plausible, what with the heat rays, stink bombs, and other non-lethal weapons the Army is working on) future of war speculation. The conceit of the spiders reminds me of the "Chinese lottery" (or "Chinese radio") codebreaking thought experiment, in which mass production of consumer goods is harnessed for distributed computing. Farley imagines something a bit like distributed DES attacks or SETI@home crossed with reality TV, which is a brilliant conceit. And like all good science fiction, Spiders isn’t just taking a stab at the future; it’s saying something about today. In the real world Karzai’s government is shaky, bombs sometimes hit the wrong people, and we haven’t found Bin Laden. 
Farley is imagining instead a world of liberated, gun-toting Afghan women, MDMA bombs, and terrorist hunting as a non-lethal, interactive online game. It’s a classic piece of liberal wish fulfillment (down to the reference to President Gore, which reads as irony, wish fulfillment, and a marker of just how different this universe is, all at once). There’s nothing wrong with wish fulfillment in science fiction or in comics; in the real world, heavy pre-natal doses of radiation rarely make one into, say, a roller-skating disco superheroine. But Farley presents a worm in the apple; the righteous, bloodless war of vengeance isn’t bloodless, and it barely even comes off as righteous. I can’t wait for part four.
Poor yea-saying wannabe pop star Amanda Latona was recently given a less-than-glowing profile in the New York Times Magazine. I have zero interest in ever seeing Latona perform or hearing her music, but I don’t think that she’ll cheat her audience, and I don’t think that audience includes a lot of Times readers. When was it decided that singers had to be smart? Is Ben E. King an Einstein? Was Tammy Wynette the second coming of Madame Curie? I have no idea, but it seems largely irrelevant. What’s important to the music is the music; the idea that musicians should also be songwriters is a recent anomaly, spurred by Bob Dylan and the British Invasion. It would have confused people in Elvis’ day, much less the Carter Family‘s or Jenny Lind‘s. But Amanda Latona isn’t rock and roll; you don’t have to have given any thought to the question of whether punk rock equates to capitalism to be a little perturbed at watching the hit machine operate.
On April 18, 1956, the world rather spectacularly changed. The change itself was a minuscule one, but the spectacle was something to behold. Grace Patricia Kelly was a sort of homegrown aristocrat. Her father was a wealthy contractor and a winner of two gold medals in rowing at the 1920 Olympics (although stories that portray him as a brash upstart are overstated; he was perhaps the best-known oarsman in the world at the time). He later became the first oarsman inducted into the Rowing Hall of Fame. Her mother was a gifted swimmer and the first women’s athletics coach at the University of Pennsylvania. Her uncle George was a popular playwright, winner of the Pulitzer Prize for Craig’s Wife (though politics within the committee may have skewed the vote). But there’s American aristocracy, and then there’s aristocracy; Kelly’s marriage to the handsome (if somewhat shifty-eyed) Prince Rainier of Monaco put her in touch with the real thing.
Everyone has a part of the newspaper that they habitually turn to first. Some read the business section; some the sports page; some the comics. Others turn to the obituaries. Reading the obituaries over one’s coffee would seem a rather morbid way to start the morning, but obituaries have fans. Recently, New York Times obituaries have been collected in The Last Word: The New York Times Book of Obituaries and Farewells and 52 McGs, as well as a study of who gets commemorated. Goodbye Magazine tracks particularly notable celebrity obits, and London’s Daily Telegraph has spawned a whole series of obituary collections, each volume dedicated to, say, eccentrics or heroes and adventurers. British obituaries are prized by connoisseurs for their bluntness, superior sense of gallows humor, and occasional genuine spitefulness; American obituaries tend to be respectful, even when the subject doesn’t deserve it.
Walking around Berkeley at dusk last week, we saw a hydrangea that almost seemed to glow. It was an example of the Purkinje shift. The effect is named after Johannes Purkinje, a nineteenth-century Bohemian physiologist who discovered the Purkinje cell and the Purkinje fiber; Purkinje also gave blood plasma its name and was the first person to classify fingerprints. Purkinje noted the shift when looking at an Oriental rug one evening; as dusk settled, some colors appeared to grow relatively brighter. In low-light conditions, the rod receptors in your eye (scotopic sensitivity) take over from the cone receptors (photopic sensitivity). Rods and cones are most sensitive to different wavelengths of light, so as it gets darker, we perceive colors as changing in brightness: reds and oranges grow relatively dimmer and greens and blues grow relatively brighter; this Purkinje shift demonstration can be done with your computer in a dark room. Unlike many optical illusions — the waterfall effect, for instance, or relative length and angle tricks or this horrid thing — the Purkinje shift is not based upon fooling the brain. It’s a result of the mechanics of the eye, which doesn’t work the way our mental models of it (telescopes, cameras) do. It’s a slightly eerie notion; upon his discovery of the blind spot in 1668, Edme Mariotte was disturbed by the conflict between what he had just observed and Kepler’s model of the eye as a natural lens. It wasn’t until 1819 that scientific exploration of the blind spot really took off, both because nerves were poorly understood and because no one had a model of the eye good enough to displace Kepler’s that also accounted for the blind spot and the weird way it seemed to flow into the background. Nineteenth-century philosophy, of all things, began to provide this model. Schopenhauer sums it up at the beginning of On Seeing and Colors: We see nothing, save through reason.
In 1937, in the midst of the Depression, a young Nebraskan named Joycolon Knapp decided to hit the road with her family; her journal held her photos, notes, and an expense and mileage log she kept while visiting places like San Francisco, Las Vegas, and the Grand Canyon (link via Portage). In 1927, after graduating from Cornell, Japanese native Kiyooka Eiichi decided that he would take a forty-day car trip from Ithaca to San Francisco (where he would catch a ship to Tokyo); in 1989, Jeffrey Rouff came across a reference to Kiyooka’s trip (link via Dan at MeFi), tracked him down at Keiō University, obtained a copy of the home movie he took, and interviewed him. Kiyooka told Rouff, "The usual way would have been to take a train in Ithaca to San Francisco, but going across the country by train looked like a very stupid thing to do"; rail-jumpers, both modern (link via BoingBoing) and classic, would disagree, but I’m not sure I would. America is perhaps best experienced by road trip.
Las Vegas remains a remarkable city: a monument to gaudiness and lack of restraint, a Disneyland for adults that’s even more upfront about wanting your money in its pocket. I rather like the effect, as long as I’m not forced to stay there more than a few days. The Bellagio buffet was not as good as I remembered it (and the rooms suspiciously resemble those of a glorified Hilton), but Olives was excellent, and Paris Las Vegas is simply fun. Despite the availability of strategy charts for blackjack, I could not reliably remember when to surrender, and I was too embarrassed to look at a pocket flash card; of such frailties are billion-dollar casino companies made. A few days of modest winning pushed me close to break-even, and I called myself lucky (the more so for getting tickets, after a last-minute screwup, to the fabulous Cirque du Soleil "O").
Proposition bets? Con men? Randomness? Post-modernism? Air conditioning? If you’ve detected a theme over the last few weeks, you’re right (though it was only half intentional, I swear!). Red and I are taking a vacation starting tomorrow, and on our way out to California I’ll be spending two nights at the Bellagio, home of the world’s nicest buffet. It has its own museum and a Picasso-studded restaurant owned by the man who may be the best chef in the Southwest. A circus troupe you may have heard of performs there, in, on, and with a 1.5 million gallon pool; the hotel also features a 9-acre manmade lake (aping the Italian village along the shores of Lake Como that gave the hotel its name), and some giant fountains, all sensibly placed in the Nevada desert. Casino mogul Steve Wynn, the man who created modern Las Vegas when he built the Mirage, built the Bellagio at a cost of $1.9 billion (including a nine-figure sum for modern art; Wynn, now nearly blind, is a major art collector) and basically destroyed his company. The Bellagio didn’t exactly sink Mirage Resorts, but it was so lavishly expensive that it wasn’t a profit center in the best of times, and when the Asian high rollers it was meant to attract didn’t arrive, Wynn and Mirage were crippled, eventually selling out to rival company MGM Grand. The Bellagio remains the most ridiculously opulent hotel in a city that prides itself on ridiculous opulence. Posts will be sparse for a week or so; I’ll try to check in once or twice, here or at Notlost. If you’re bored, why not visit one of the fine sites I’ve bookmarked? If you’re really bored, close your eyes and think of me down to my last chip and trying to remember if I’m supposed to split sixes when the dealer shows a three, and wish me well, dear reader.
The seventeenth-century mathematician and physicist Blaise Pascal invented the world’s first digital calculator, experimented with creating an artificial vacuum, and was an early student of projective geometry (which is concerned with such projections as a globe onto a flat plane), but he is mostly remembered today for his triangle (an immensely pattern-laden representation of binomial coefficients) and his pioneering work on gambling. Among other things, Pascal is often credited with inventing the game of roulette. The origins of roulette are murky, but it’s at least possible; Pascal’s final work as a mathematician was on the cycloid, the curve formed by a point on the rim of a circle as it rolls, and he was one of the first modern mathematicians to seriously study probability.
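The triangle still makes a nice five-minute programming exercise; here's a minimal sketch (my own toy version, not anything of Pascal's, obviously) that builds each row by the additive rule — every entry is the sum of the two entries above it — which is exactly what makes the triangle a table of binomial coefficients:

```python
def pascal_rows(n):
    """Generate the first n rows of Pascal's triangle.

    Each interior entry is the sum of the two entries above it,
    so row i holds the binomial coefficients C(i, 0) .. C(i, i).
    """
    rows = []
    for i in range(n):
        row = [1] * (i + 1)          # edges of the triangle are always 1
        for k in range(1, i):        # fill the interior from the row above
            row[k] = rows[i - 1][k - 1] + rows[i - 1][k]
        rows.append(row)
    return rows

# Row four counts the ways to get 0, 1, 2, 3, or 4 heads
# in four fair coin tosses -- the kind of question Pascal
# and Fermat traded letters about.
print(pascal_rows(5)[4])  # [1, 4, 6, 4, 1]
```

Each row sums to a power of two (row four sums to 16, the number of equally likely outcomes of four tosses), which is why the triangle shows up whenever gamblers start counting.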
Last Friday, it was 90 degrees in Washington, and I gave my thanks once more for Willis Carrier’s good idea. The air conditioner celebrated its hundredth anniversary last week; Carrier designed the first air conditioner in 1902 to improve the consistency of the Sackett-Wilhelms Lithographing and Publishing Company’s work during the humid Brooklyn summer; the temperature and humidity fluctuations had made quality control at the printing plant nigh impossible. Carrier’s invention has had a huge impact on America and the world, but was really just the end result of millennia spent trying to cool homes; Susan Roaf’s history of climate control dates the icehouse (where blocks of ice were stored) back to 1800 B.C.
Novelist and short story writer Dale Peck doesn’t make it entirely clear what he dislikes about the modern novel, but he really, really dislikes "the worst writer of his generation," Rick Moody (essay first observed at Calamondin; discussed at Making Light). I’m not terribly familiar with Moody’s work; I believe I’ve read one (so-so) short story and part of a book in the store. Moody may well deserve the vitriol being heaped upon him. It’s very enjoyable vitriol, too, in the way that H.L. Mencken or Dorothy Parker can be delightful in their self-delighted meanness; you get a sense that Peck, who once wrote that "sometimes a bad novel is like a gift", has been storing this up for years.
The Culture Wars were supposed to be over years ago; the culture warriors in battleground disciplines such as history have picked out what elements of academic theory they find useful, and the vast majority of those few people who were wrapped up in the battles over the Western canon and radical deconstruction and post-colonial theory have moved on. Or not. New York Times "critic-at-large" Edward Rothstein called postmodernism’s moral relativism "perverse" in the wake of the September 11 attacks; noted theorist Stanley Fish has popped up to defend himself and his field, and the Culture War battle lines begin to be redrawn.
Alvin Clarence Thomas might have been one of the great American golfers of the 1920s, but it’s hard to tell; he refused to turn pro, he always said, because he didn’t want to take the cut in pay. Thomas, better known as "Titanic Thompson" ("Titanic" not, as some stories have it, thanks to an escape in drag from the doomed ocean liner, but from the way he kept sinking ‘em at pool; "Thompson" from a reporter’s error that he was fond of), could play golf right- as well as left-handed. He was a poker shark, a pool hustler, a crack shot. The basis for Sky Masterson in Guys and Dolls, he never found a gimmick he didn’t like.
How many ways are there for an enterprising confidence man to separate a mark from his money? Confidence games can be divided into two categories, the big con and the short con. Big cons send the mark home to get more money; short cons are designed to empty the mark’s wallet and send him on his way. The big con came in three main variations: the wire (in which the mark was convinced that con men were delaying the telegraph reports of horse races, allowing them to make sure-fire wagers; this is the method used in The Sting), the rag (a similar setup, only involving stocks; this shows up in The Grifters), and the pay-off (in which the mark believes he is putting his money on a fixed race). But the short con flourished in a thousand tiny variations.
Over at Graham’s, talk has turned to randomness. I got remarkably long-winded, but I get so excited when I get to show off my rapidly fading math chops, and it was a chance to talk about Claude Shannon, a man who in his own way did as much to revolutionize twentieth century science as Einstein. Shannon worked at MIT with hypertext pioneer Vannevar Bush. His master’s thesis was a landmark piece of engineering theory (on circuit switching); his doctoral thesis was on theoretical genetics. During World War II, he worked on anti-aircraft detection; after the war, his publication of "Communication Theory of Secrecy Systems" marked the beginning of modern cryptography (in the civilian world, at least). His house was filled with toys: chess-playing computers, a Roman numeral calculator, juggling machines (while he was at it, Shannon had formulated a science of juggling), a machine whose sole purpose was to reach out with a skeletal arm to turn itself off. And as a Bell Labs researcher working on the question of noisy telephone lines, he had two insights that made him the father of information theory: communication is transmitting a message from one place to another, and information is how much needs to be transmitted.
English mathematician Sir Roger Penrose has written about relativity, cosmology, artificial intelligence, and the nature of human consciousness (an anti-programmer debating society named itself after him). Even if you are not prepared to consider the merits of Penrose’s controversial arguments about the relation of Gödel’s incompleteness theorem or quantum phenomena like microtubules to the task of creating artificial intelligence, you can appreciate the creation that Penrose is perhaps best known for: Penrose tiling. Tessellation — tiling the plane so as not to leave any gaps — has long been of interest to artists, from the artists who created the Alhambra to M.C. Escher. Tessellations come in all shapes and sizes. When Martin Gardner published a list of eight convex pentagon tessellations, thought to be a complete list, a reader discovered a ninth type; Marjorie Rice, a San Diego housewife and mathematical layperson, then devised a method for investigating and found three more. (Fourteen tessellations using convex pentagons are now known, but this list has not been proved to be complete.) Escher’s tessellations are world famous, available on everything from posters to outdoor tiles (thanks, Kathryn!). But Penrose had worked out (by hand) a quasitessellation, a system by which two shapes could be used to tile the plane without ever repeating the pattern. This aperiodic pattern, based on "kites" and "darts", could be used to make anything from decorative floors to intricate, jigsaw-like puzzles. Penrose’s interest in tessellation perhaps triggered some happy memories; a teenaged Roger and his father had invented the impossible shapes Escher made famous, the tribar and the Escher cube. And it’s given rise to another distinction: Sir Roger is almost assuredly the only one of Stephen Hawking’s collaborators ever to file a lawsuit over toilet paper copyright.
I spent some of my time in Pittsburgh this weekend continuing my pokey voyage through Italo Svevo’s The Confessions of Zeno, which I’ve been reading slowly for a while now. It’s reminiscent of a more poignant and less slapstick Confederacy of Dunces; both are comic tales of men who can’t tell what a joke their lives have become. It’s not what I expected from the best-known Italian modernist, but I’m enjoying it. I doubt I’d have heard of it if it hadn’t recently been reprinted, and I might not have picked it up if the reprint hadn’t been an attractive and relatively inexpensive Everyman’s Library edition.
He fought spiritualism, lobbying Congress for laws against fraudulent mediums. His name was a verb: "to release or extricate oneself from confinement, bonds, or the like, as by wiggling out". He was a pioneering aviator, a self-taught historian, a star of stage and screen, a man who quite possibly knew more about locks than any living person, but the reason that we remember Erik Weisz, an Austrian rabbi’s son born in Budapest in 1874, is because he knew how to sell himself. America’s "five great magicians" (Harry Kellar, Howard Thurston, Alexander Herrmann, Dante, and Tampa), even Robert-Houdin, the father of modern stage magic, are today all but forgotten. Harry Houdini stands alone.
Last weekend marked the fifty-third anniversary of the creation of Flag Day as an annual observance. Flag Day commemorates the adoption of the American flag on June 14, 1777, although the modern observance dates to 1885, when Bernard Cigrand, a Wisconsin schoolteacher, had his class celebrate "Flag Birthday". A few years later, a former socialist minister named Francis Bellamy wrote the Pledge of Allegiance at the behest of James Upham, publisher of The Youth’s Companion, a popular magazine, in a dual effort to promote an idea of national unity and make a few bucks selling flags; until then most schools displayed state flags, rather than the national flag (which was largely used by the military).
The most prolific author in a survey of Western lit is not Shakespeare, a piker with fewer than forty plays; Georges Simenon, whose 84 Maigret novels were less than half of his corpus; or Isaac Asimov, with more than 250 books to his credit. Dame Barbara Cartland, who churned out over seven hundred books (largely interchangeable romance novels) in her seventy-seven-year career as an author, is a serious contender, but in the Harvard libraries, Anonymous has got her beat.
I’ve said before that paper is fragile (and a throwaway line about the fate of Mikhail Bakhtin’s Bildungsroman manuscript made it onto a site that aspires to be the "penultimate [sic] site for Bakhtinian Smoking Research"), but what happens if you lose an entire library — not lose it to fire, like the Library of Alexandria, but actually misplace it? A lost Roman library, buried by the eruption of Vesuvius, may soon be unearthed; already digital imaging is helping to decipher some of the recovered manuscripts. And then there’s the case of Tsarina Sofia Paleolog’s fifteenth-century library (link via Boing Boing), designed by Aristotele Fioravanti and built beneath the streets of Moscow. Sadly, nobody quite remembers where it is; neither Napoleon nor Khrushchev was able to turn it up.
The first paragraph of the introduction to Fantagraphics‘ new volume of the 1925 and 1926 Sunday Krazy Kat comic strips does not spend time discussing the strip’s legion of contemporary fans among the intelligentsia, fans like e.e. cummings and H.L. Mencken. Instead, it places Krazy Kat within a body of American vernacular art along with Chaplin and Twain, and then notes "…[T]here’s a pretty good chance [this book] won’t turn a buck. Krazy Kat doesn’t sell well at all."
The Fifth Amendment to the Constitution doesn’t just protect you from being forced to give self-incriminating testimony; tacked onto the end is what’s known as the "takings clause": "…nor shall private property be taken for public use, without just compensation." There’s a rather complex methodology for deciding if it applies to a given case, but at its most simple (say, Ohio seizing property through eminent domain to build a highway), it requires property owners to be compensated for the state’s taking of their property. Recently, in the sexily named Tahoe-Sierra Preservation Council v. Tahoe Regional Planning Agency, the Supreme Court ruled that a lengthy delay, one of thirty-two months, on development at Lake Tahoe did not constitute a taking under the Constitution. Steven Landsburg’s disingenuous Slate article on the Tahoe ruling managed not to mention the word "taking" at all, instead riffing on a distortion of Justice Stevens’ ruling and claiming that the Supreme Court might as well, by gum, have ruled government illegal.
In Canada, graduating students of engineering are presented with a hammered iron ring in an induction ceremony designed by Rudyard Kipling. The engineers’ rites are secret; although presumably no more menacing than Phi Beta Kappa, that most unthreatening of centuries-old secret societies, the idea of the Obligated Engineers is reminiscent of Masonic recognition symbols. Although Freemasonic mythology claims that the secret society dates back to Hiram, the apocryphal builder of Solomon’s temple, the rites and rituals seem more likely to have arisen with the stonemasons who travelled Europe building cathedrals. With no easy way of providing bona fides, the skilled laborers created secret means of identification — handshakes, words and phrases loaded with meaning — to prove that they were trained stoneworkers and engineers, master masons who could be trusted to work on a cathedral.
The Korean War — the "forgotten war" — seems to have been left behind by American literature. Dozens of Vietnam or World War II or Civil War novels can be found (many of them excellent), but the Korean War’s contribution to popular culture is largely limited to M*A*S*H (a half-funny satire of a book turned into a brilliant satire of a movie turned into a treacly swamp of liberal self-mythologizing of a television show). Brainwashing has legs, however; from new religions to John Walker Lindh, it’s still being discussed today.
Who was Kilroy? Despite the ubiquity of the slogan "Kilroy was here" — it was common enough during World War II to have served as the title of a period humor collection, not to mention a Styx album — there’s debate over the phrase’s origin. In 1946, the New York Times (or a radio program, or both, depending on the account) declared Massachusetts shipyard inspector James J. Kilroy to be the Kilroy, but the phrase may predate the war; Chad, the little peering face who often appears with the phrase, may have been a British contribution. Chad tended to appear with a three-word question, often related to rationing ("Wot, no bread?"; "Wot, no petrol?") and seems to have first been drawn by a British cartoonist, George Chatterton. Kilroy was briefly topical (Isaac Asimov wrote a pointless Kilroy story) and never quite forgotten (more for Chad and the opportunities for visual puns than anything else). But unlike the smiley face (invented by Harvey Ball), Kilroy’s creator may never be fully known; he belongs to all of history.
In 1893, a tragedy shook London. The greatest detective England had ever known had met his match. The Napoleon of Crime, Professor James Moriarty (once merely a professor of mathematics at one of England’s smaller universities), had wrestled with amateur violinist, anthropologist, chemist, swordsman, and consulting detective Mr. Sherlock Holmes at Reichenbach Falls; both men had fallen to their deaths. Ten years after Doyle published "The Final Problem", however, a miracle was revealed to the world; Holmes had survived and, deciding that discretion was the better part of valor, sequestered himself in Tibet under the name of Sigerson. Doyle’s attempt to kill off a character he was no longer terribly fond of had failed. But what if it had succeeded? Would legions of Holmesians have had to explain on their own, outside the Canon, just how Sherlock had survived and why he had hidden himself from Watson? And what about Mycroft, Inspector Lestrade, and the long-suffering Mrs. Hudson? Someone would have filled the void, and it would have been fan fiction (link via Making Light).
Some recent discussion on fantasy novel cover art led to a post about old TSR artists on MetaFilter (thanks Aaaugh!); the theme meant I couldn’t toss in some artists like Thomas Canty, a one-man pre-Raphaelite tribute band, or Charles Vess, the comic book artist probably best known for his work with Neil Gaiman. On the other hand, the thread prompted some folks to dig up information on Dave Trampier, the artist behind Wormy, the most successful comic to ever come out of Dragon magazine. Trampier simply dropped out of sight one day, leaving behind an unfinished plotline and, more surprisingly, unclaimed paychecks. Given that Wormy hasn’t been published in 14 years (Larry Elmore’s SnarfQuest and Phil Foglio’s What’s New are back in print), I’m surprised that anyone remembered enough to check, but I suppose that role-playing games are one of those small obsessions that people nurture.
In 1980, Photoplay faded to black. The last of the great movie magazines, Photoplay dated back to the days of the silents, back to the days before movies were called movies. It’s only been twenty years, but Photoplay and its ilk — The New Movie, Motion Picture — seem more like something an archaeologist would dig up than cultural ephemera contemporary with E.T. and 48 Hours.
Death Valley isn’t really dead; it contains the lowest point in the Western Hemisphere and is damned hot, but small animals come out at night to feed off the almost one thousand varieties of plant life that call the valley home. But once it crawled with animals that aren’t natural desert dwellers: people and mules. The famed 20 mule teams hauled millions of dollars worth of the miracle cleaner out of Death Valley, the "Borax Desert" (and gave birth to a brand name still used today). Francis "Borax" Smith, the mining and railroad magnate, began building his fortune off Death Valley borax. Smith made his mark on dozens of towns between his mines and his markets. Ghost towns dot the West; there are over 1600 in Nevada alone. Today borax powers concept cars and, along with other borates, goes into the manufacture of glass and ceramics. Borax Smith’s financial empire collapsed with the East Bay real estate market in 1913; he left his mansion to the city of Oakland, but his railroad rusts away. Except for hawks and desert mice and the occasional history buff, no one watches as it returns to the desert.
Recently the Guardian was atwitter over the revelation that William Shakespeare might have had a relationship with a young man, the Earl of Southampton. I can’t speak to whether or not Shakespeare slept with men; as far as I know, the idea of Shakespeare as queer dates back to Oscar Wilde’s "The Portrait of Mr. W.H.", an amusing story but not anything resembling serious scholarship, and there’s not really anything in the historical record to make definitive statements one way or the other. But there seems to be a real desire on the part of the semi-scholarly (where I myself reside) to reexamine Shakespeare’s sex life, and this Southampton portrait is providing an opportunity to do it, history be damned.
V., with what is sure to be sporadic assistance from me, has started a food blog, The Hungry Tiger. It will be, we hope, opinionated, well-written, and useful. There are recipe sites out there, both professional and homegrown. There are weblogs devoted to bacon and soup and cocktails. Newsletters have sprung up, while older ones have come online. But the Internet is a notorious breeding ground for cranks and obsessives! Chuck Taggart may be a devotee of traditional Cajun food, but he’s hardly a lunatic about it. Jan and Michael Stern get worked up about greasy spoons and pancake houses, but they aren’t lunatics. Where are the eight-page-long rants about the proper way to grill a steak or oysters Rockefeller made with spinach? There are funny food pages, like the pork martini or Stinkfactor, and I can find opinionated (perhaps even surly) reviews of beer or wine, but over a dozen pages devoted to Dr Pepper knockoffs? Where is the person who will obsess about artisan cheeses as though they were as important as GI Joes or the Transformers, much less something really important like vi versus emacs? Somewhere in the world, there’s someone with a fondness for heirloom tomatoes and aged range-fed beef, a copy of Dreamweaver, and far too much time and attention to spend explaining why everything we know about food is wrong. I eagerly await his or her arrival.
Somewhere out there, the MoneyMaker Plus is earning its keep. It’s a water pump designed for irrigation in Kenya, and despite its noble goals, it’s also designed to turn a profit. It’s marketed by a marketing guy, designed by design people (the next generation will be designed by Ideo) and sold by salespeople. Simultaneously, the builders, Approtec, hope to demonstrate a new approach for how the First World can help the Third, massively improve Kenyan agriculture, and help build a Kenyan middle class from the ground up. It reminds me of The Ugly American; despite what the term has come to mean, the title character of Burdick and Lederer’s prescient pulp novel about a fictionalized Vietnam on the eve of Communist takeover was an American engineer (physically unattractive, but a swell fellow nonetheless). Against the backdrop of continuing failures on the part of the American diplomatic corps, he learned the language, travelled the country, built dams, and talked to the peasants about what they really wanted. Sadly, his bicycle-mounted water pump didn’t save Sarkhan for democracy, perhaps because it didn’t have a name remotely as catchy as "the MoneyMaker Plus".
If you were approached by an ex-con, a high school dropout who said that he had used a little-known and poorly understood property of physics to create a means of transmitting video that was twenty times faster than the fastest method on the market, would you believe him? If so, you’re in good company; Intel networking subsidiary Level One, the video chain Blockbuster, and a handful of venture capitalists also fell for what seems to have been an intricate scam run by a man named Madison Priest (link via Lake Effect). If you approach a situation like that with skepticism, you might miss dealing with the next Philo Farnsworth, but when someone out of the blue starts promising deals worth hundreds of millions of dollars, it should set alarm bells ringing.
Readers Martha B. and Bob, quick on the draw the both of them, read last night’s post on alphabets and pointed me to the work of Chinese artist Xu Bing, whose current exhibit is showing at the Sackler through this weekend. The square word calligraphy is neat enough — English words represented as pseudo-Chinese ideograms; there’s apparently an OS X program at the exhibit which will do the trick for arbitrary user input — but the real showstopper is the installation "A Book from the Sky", in which Xu created a thousand plausible but nonexistent Chinese-style ideograms and then wrote a book using them. The idea of a real book in a fake language reminds me of "The Library of Babel", Borges’ story of the universal library which contained not just every book but every possible book. The idea of combinatorial writing, works that spanned the word-space, was taken up in the ‘60s by France’s avant-garde OuLiPo group (thanks, Mark!), who wrote poems based on things like linear algebra and repeated permutations of lines and stanzas. And Xu apparently is an immaculate craftsman, using beautifully realized calligraphy, bookbinding, and printing in service of his nonsensical works. I’ll have to stop by Saturday before I head out to a barbecue, because it sounds like a wonderful blend of concept and craft. Also, there are monkeys, and monkeys can only improve art.
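The permutational side of that OuLiPo trick is easy to play with yourself; here’s a toy sketch in Python (the quatrain is my own invention, not an actual OuLiPo text) that spans the whole "word-space" of a four-line stanza:

```python
from itertools import permutations

# Every ordering of the stanza's lines is a distinct "poem";
# four lines yield 4! = 24 of them.
stanza = [
    "the library holds every book",
    "the shelves recede without an end",
    "each volume differs by a letter",
    "somewhere the catalog is shelved",
]

variants = ["\n".join(order) for order in permutations(stanza)]
print(len(variants))  # 24
```

Raymond Queneau’s Cent mille milliards de poèmes does the same thing with ten interchangeable sonnets, which is where the 10^14 in the title comes from.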
In 1821, a former French army officer, Charles Barbier de la Serre, took his new invention to the Royal Institution for Blind Children; he had created a means by which his fellow artillerymen could read messages at night without betraying their position by lighting a lamp or candle. The director of the Royal Institution, Dr. Guillié, was unimpressed. Fortunately, Dr. Guillié was fired a little more than a week later after a scandal involving his affair with a female teacher at the school. André, who replaced him, was much more receptive, and resolved that Barbier’s "sonography", in which sounds were represented by patterns of raised dots, would be on the curriculum for all of the school’s students, including young Louis Braille. Braille would go on to survive tuberculosis (which left his health fragile for the rest of his life) and the burning of his Braille-system books by a future headmaster of the Royal Institution, P. Armand Dufau, who would later become a champion of the alphabet when he decided it was better for his career. His work survived his early death and spirited attacks from champions of other writing systems for the blind, such as New York Point, to become the worldwide standard.
It was not until he was in his forties that Jean Dubuffet devoted himself to his art. An art school dropout, Dubuffet had devoted time to his family’s wine business between stints of sculpting marionettes and painting portraits, but in 1942 he quit his job to paint full-time and in 1944 he received his first show. His work from the ‘40s and ‘50s flouts every rule he learned during his brief stint at the Academie Julian: his figures are disproportionate, childish; the textures of the paint and canvas are highlighted at the expense of the composition; he painted faces that were barely recognizable as human. Something had happened to Jean Dubuffet, something he thought wonderful. He had discovered that he admired the work of madmen.
Happy Loyalty Day! Today is "a special day for the reaffirmation of loyalty to the United States and for the recognition of the heritage of American freedom." I’m all for the heritage of American freedom, but forgive my cynicism in questioning why May 1 is the day we officially celebrate it. Loyalty Day seems to have been an early instance of counterprogramming, offering a contrasting holiday to celebrate for the millions of Americans unwilling to participate in May Day parades and rallies. Those parades were haunted by a specter: the specter of Communism.
My birthday was last weekend. It was fun; the weather on Saturday was just about perfect. And it marked an end to the Literary Year. Last year on my birthday, I decided to write a short review of every book I read for an entire year. Among the many goals was to answer a question: how many books do I read in a year? When I was twenty-five, I read sixty-seven books, start to finish, not counting novellas, short stories, magazine articles, and comic books. Good enough. But I had an ulterior motive in doing the Literary Year.
The temperature mellowed last week, but I fear for the summer. Our apartment doesn’t have central air, so I survive by drinking lots of ice-cold drinks. It’s unsurprising that the cold-drink industry’s champ was born in sweltering Atlanta before the invention of air conditioning, even if providing a relief from the heat wasn’t the original goal. Coca-Cola was born in 1886, when pharmacist and morphine addict John Pemberton created Coca-Cola syrup as a non-addictive pick-me-up, featuring stimulating coca leaves and refreshing kola nuts. Georgia businessman Asa Candler acquired the company and switched the formula to denatured coca (enabling it to survive a series of legal hurdles when cocaine was made illegal in the United States, including the magnificently named case of United States v. Forty Barrels and Twenty Kegs of Coca-Cola), and the Coke juggernaut, propelled by decades of memorable advertising, both print-based and televised, was born.
Although most of my used bookstore jaunts are in search of desperately out-of-print science fiction and the occasional classic I’ve been looking for, I do love it when I find a treasure of oddness. I’ve got a copy of Dr. Sylvanus Stall’s anti-Onanism tract What Every Young Boy Ought to Know and some determinedly strange children’s books, even some dandy anti-Communist literature, but I don’t have anything really weird. I’ve never found a copy of the Codex Seraphinianus or a hitherto unknown key to the Voynich manuscript. I’ve never found anything by Harry Stephen Keeler, thought by some (including Big Secrets author William Poundstone) to be the worst mystery writer ever, although others would apparently put forward Joel Townsley Rogers for the title. I’ve never found anything by Lionel Fanthorpe, the Ed Wood of British science fiction. I’ve found Masonic and anti-Masonic propaganda and all sorts of New Age hokum, but I’ve never found a hand-colored refutation of Einstein. I’ve never found anything worthy of mention in Book Happy, a zine published by Donna Kossy of Kooks fame, and I’ve certainly never found anything quite as remarkable as Down Home Gynecology, a women’s health book written in country-fried dialect that could have been (and possibly was) lifted from Li’l Abner. And for that I am grateful.
My apologies for the somewhat sparse posting of late, dear reader. I’ve been exploring a rather large book. Thomas Pynchon’s Gravity’s Rainbow tries to fit something approximating the whole of human experience into Europe at the end of and immediately after World War II. Like Pynchon’s immensely more approachable The Crying of Lot 49, it’s a novel of paranoia, as Tyrone Slothrop slowly investigates the truth about mad chemist Laszlo Jamf and the Schwarzgerät rocket, sought after by half the spies in Europe. It’s a novel about control: industrial, personal, sexual. It’s a novel about madness and love and extinction and language and lemmings and pigs. It’s terrifically funny at points and almost unreadable at others. It’s absolutely brilliant, and I have absolutely no problem remembering why I stopped reading it the first three times I gave it a whirl.
The founder of the Science Fiction Writers of America, Damon Knight, passed away Monday (link via Patrick Nielsen Hayden, a science fiction editor at Tor Books, who has written his own brief tribute to Knight). Knight is probably best known as the author of "To Serve Man", memorably adapted for The Twilight Zone and nicely spoofed on The Simpsons a few years ago, but his work in the field as a reviewer, teacher, and editor really surpasses his contributions as a writer.
This year’s winner of the Pritzker Architecture Prize was announced this week, and he’s Australian. Glenn Murcutt‘s claim to fame is that he’s an environmentally conscious designer; his work, mostly composed of single-family houses, is designed to blend in with the environment. Murcutt apparently dislikes air conditioning, and many of his houses are designed to use passive cooling and don’t have air conditioners. His houses are attractive and modern looking, but not weird; they use some non-traditional materials, but nothing as strange as cordwood masonry. Murcutt makes environmentally conscious houses that don’t look like a seventh-grader’s diorama, slapped together out of styrofoam and masking tape after reading Dune for the first time.
In 1933, an Oklahoman physicist, Karl Jansky, made headlines when he announced that we could listen to the galaxy. An employee of Bell Labs, Jansky had been investigating the background noise that interfered with transatlantic radio telephone calls. Much of the noise, he found, was from thunderstorms, but some was from another, unidentified source. Jansky eventually determined that the noise was being produced not by a terrestrial source but by the Milky Way; he was the first real radio astronomer, the first to successfully learn about celestial bodies by analyzing radiation (such as radio waves or X-rays) that they emitted outside the visible spectrum. Jansky wanted to investigate further, but Bell Labs, having received the answers they needed, put Jansky on another project; he never did radio astronomy again.
Adam Cadre has published one book to his name, but I know him better as the author of the spectacularly funny MSTing of "The Eye of Argon", a spectacularly funny (and bad) piece of fantastic fiction in the Conan vein. MSTing has a certain postmodernist flair (V. used Cadre’s version of "Eye" as an example in a linguistics paper she wrote), and recent mention on MetaFilter reminded me that Cadre is also the author of some fascinating interactive fiction.
As phonograph enthusiasts know, the first phonographic recordings were not pressed onto wax but onto tin foil (although one early experimenter used lead; his recording can still be played today). Tin foil was not, however, such a durable medium. Earlier recordings on paper have lasted just as well, if not better. Sheet music was a huge commodity in the nineteenth century (and one that was widely pirated internationally, although copyrights on sheet music and live performance applied after 1831), but it wasn’t so much recorded as transcribed. But the invention of automatic music, particularly the player piano, meant that an individual performer’s work could be recorded, duplicated, and played in your parlor or at the local saloon almost as it had sounded in the concert hall.
Baseball season began yesterday, but today is opening day for most of the league. As the season stretches on, prepare to see signs of the continuing strife between ownership and the players’ union; expect to see someone — Carl Everett or John Rocker, perhaps, now that Albert Belle has retired — make an ass of himself through a particularly churlish or stupidly criminal gesture; look forward to the first batch of stories comparing the current dross with the upstanding baseball of yesteryear, where love of the game was what mattered and no one ever entertained an ignoble thought. Except, as Roger Angell reminds us, it’s bunk.
The Musée Mécanique has been saved! I heard from Judith that the Park Service has been convinced by the outpouring of support to find a temporary home for my favorite place in San Francisco. The Musée is a collection of old penny arcade machines — love testers, fortune-telling gypsies, purportedly risqué moving pictures, hand-crafted miniatures dancing the minuet. Some photos are available online, including ones of the remarkable Laughing Sal, which surely terrified generations of San Francisco’s children.
Now that spring has sprung, we can spend another thankful year safe from deadly avalanches. Dry slab avalanches (the deadliest kind) are the result of stratification of snow on mountainsides due to temperature changes or wind-blown snow deposits; when the top strata all slide down the mountainside in a clump, you’ve got an avalanche. This demonstration for grade-school science students replicates the effect using burlap, flour, sugar, and potato flakes. The Westwide Avalanche Network (run by the American Avalanche Professionals Association) can answer all your avalanche questions.
In the 1940s, a young cook in Eagle Pass, Texas, faced a dilemma. The head chef had left, they were running low on ingredients, and people were hungry; he threw cheese and jalapenos on top of tostada chips, and Ignacio "Nacho" Anaya introduced his nickname to the English language. It may well have been a brave man who first ate an oyster, but that event is lost in the mists of history. Oysters Rockefeller, on the other hand, have a definite parentage; they were invented at Antoine’s in New Orleans and named after John D. Rockefeller, in honor of the sheer richness of the sauce. The recipe remains a guarded secret, although Chuck of the Gumbo Pages has taken a crack at reproducing it. Being a chef or bartender is one of the few professions I can think of — the others are biologist and astronomer — where you have a credible shot of naming something after yourself, your loved ones, or your patrons and having that name stick for generations.
I recently read (over on Ethel, and to my great sadness) that science fiction writer R. A. Lafferty has passed away after a long bout with Alzheimer’s Disease. He was 87 years old. I’ve written about my enthusiasm for Lafferty before; his work was a joyous muddle, both simple and deceptively complex. (See, for instance, his take on the fountain of youth trope in "Nine Hundred Grandmothers", collected in his short story anthology of the same name.) When I wrote before, I said that I had never read anything Lafferty wrote about the craft of science fiction; that is no longer the case.
You can draw a line from the glistening cobblestones of Vienna in Carol Reed’s The Third Man through the neon-reflecting puddles of Blade Runner. But you can draw a line backwards, as well; the noir vision of the city as rain-dappled menace has (by way of numerous mediocre Jack the Ripper movies, I’m sure) colored my vision of Victorian and Edwardian London, the stomping grounds of Springheel Jack and Crippen, as well as more literary monsters.
One of the great things about the Internet is the plethora of free things. Some are offered to draw an audience for which advertising can be sold; some are offered as advertisements themselves; some are offered out of pure missionary zeal. Most of these free things are digital — text files, software, music files, even short films and ISPs. And there are a number of illicit (tapes of the Goon Show, featuring Spike Milligan and Peter Sellers) and semilicit (Plunderphonics songs not commercially available due to American copyright law) distribution efforts that coordinate on the Internet and make things available for a nominal fee. There are collaborative mail art projects run over the Internet, such as 20 Things and Nervousness, in which you contribute art and get different art back. But I am unaware of anything exactly like Booklend, the lending library run by Mark Anderson. Mark sends real, physical books to complete strangers for free. He doesn’t assign due dates or charge late fees. He pays return postage. He’s got over a hundred books in his library, and he relies on nothing more than people’s innate goodness, honesty, and fear of public shamings at the hands of a stranger to make them return the offerings. It’s a fabulous, generous project, and I’m thrilled to have been able to help make it happen.
Six months later, twin beams of light shine into the New York sky where the Twin Towers once stood. I think it’s a wonderful memorial (memories of Leni Riefenstahl films notwithstanding): the lights suggest both memorial candles and photographic negatives or ghosts of the buildings that stood there. I confess that I was not a fan of the World Trade Center; I never managed to get around to going up to the observation deck to take advantage of the view, which in my mind was the saving grace of what I found to be disproportionately large International Style buildings that didn’t really mesh with the rest of New York’s skyscraper-heavy skyline.
Yesterday at work I found myself listening to some blues songs recorded in 1941 and 1943, a record of a folk festival at a college in Georgia. A number are gospel songs; a number deal with topical subjects (Roosevelt, Hitler, Joe Louis, Pearl Harbor). I haven’t listened to them all yet, but they are wonderful, and they are all available on the web (free, in MP3 format) through the American Memory Project of the Library of Congress, along with an essay providing some historical context. I sometimes forget just how much wonderful stuff memory.loc.gov has to offer: animation, beautiful color photos of imperial Russia, old advertising circulars, photojournalism both cheering and chilling, this scan of Amelia Earhart’s handprint that Judith uses as a desktop image. It’s a testament to the diversity of the American artistry (high art and low art both) and craftsmanship, the breadth of public domain works, the hard work of the people at the LOC to make this work available, and the sheer packratdom that is part of the American character, Walden notwithstanding. They call the Smithsonian "America’s attic", but the Smithsonian collects things like Enron ethics manuals and Archie Bunker’s chair. Compare the Smithsonian’s ruby slippers to the LOC’s Oz manuscripts. They’re both recording the American experience, but the Smithsonian takes souvenirs where the Library of Congress takes mementos.
I was a teenage nerd. I played roleplaying games. I liked comic books. I read dozens of science fiction books. For a brief, shining period, I ran the high school math club. I did science projects: painfully simplistic cryptography systems. But hard science was never my thing; I could talk your ear off about Galois theory, but four-word definitions of, say, the Krebs cycle or triboluminescence ("how cells digest glucose" and "why Wintergreen lifesavers spark", respectively) are probably the most I could muster. But catalogs like American Science & Surplus, Fisher Scientific, and Einstein’s Garage stir some sort of residual Tom Swift fantasy. They’re like the Archie McPhee catalog with fewer action figures and more possibilities for explosions. Strange things can happen when you merge an adult budget with a childlike inability to follow simple, sensible directions; I should probably keep these catalogs far away from me and stick with safer catalogs, like Penzey’s. But even though I doubt I would ever use it, the idea of putting together a chemistry set is somehow terribly appealing to me. I’m apparently not the only one. Maybe I’ll just get some Pyrex labware to drink whisky out of. Things go boom!
William Shakespeare, the greatest playwright in the history of the English language, was born around April 23, 1564, the son of an alderman from Stratford-upon-Avon. He married (in a hurry) in 1582 and moved to London around 1587, working for a theater company. Eventually he became a writer of some renown; he retired a wealthy man in 1611 and died in 1616. His works ring through history; as his colleague Ben Jonson wrote, they are "not of an age, but for all time!" That is, unless the above is a lie and "Shaxpere" is not "Shakespeare".
An article by Todd Anderson called "Punk Rock = Capitalism" from Popshot is making the rounds. There’s a glimmer of a point in it. Indie rock has created a thousand small businessmen and businesswomen; every kid in a garage who has ever pressed a single and then tried to recoup the money selling it at shows is a capitalist. But ignoring the ludicrousness of his examples (how many people in all the world ever think to themselves that they would like to start a peanut farm?), ignoring the only tangential relationship of his arguments to punk rock (it could be about starting a muffin shop, and then it could be titled "Muffins = Capitalism" or "Capitalism: Muffin as Fuck"), ignoring the well-known diddling that major labels deal out to the vast majority of artists who sign with them, ignoring the fact that every single person I’ve ever met involved in DIY music, even on the business end of things, is doing it for love and not money, Anderson is just not thinking things out.
Last week, I was chatting with dcehr about Frank Miller’s 300, a comic book adaptation of the Spartan stand at Thermopylae. It veered into a discussion of Herodotus, the "father of history" (or, depending on who you ask, the "father of lies"), who records one of the great tough-guy lines of history; when Dieneces of Sparta was told that the Persians had enough archers to darken the sky, Herodotus has Dieneces cheerily responding, "Our Trachinian friend brings us excellent tidings. If the Medes darken the sun, we shall have our fight in the shade." And V. was discussing Arthur Golding, an Elizabethan Protestant who translated both Calvin and Ovid (Golding’s Ovid is the one that Shakespeare used). "But Ovid is so naughty," I cried, though after some rather self-conscious discussion we decided that he is not nearly so naughty as Catullus.
A largely sensible article in Fortune magazine (via Slate’s Moneybox) dissects unintended consequences run amok in the tremendously complicated issue of asbestos litigation in America. It’s tainted, however, by the incredibly deceptive description of the reason for W.R. Grace’s bankruptcy. (For the record, it reads: “The chemical giant paid nearly $2 billion for using asbestos in its fire-protection products.”) Poor Grace, a mere asbestos consumer, a victim as much as anyone! W.R. Grace, once a conglomerate involved in everything from cement to shrinkwrap, owned and operated a vermiculite mine in Libby, Montana, for 27 years; the vermiculite was contaminated with asbestos, and both the miners and the general citizenry of Libby have had health problems ever since. In 1999, the direct death toll was estimated to be 192, with hundreds more contracting severe cases of asbestosis, many of which will probably prove to be fatal. An award-winning article in Mother Jones suggested that executives at the mining company that Grace purchased knew about the effects of asbestos exposure on their workers’ lungs as early as 1959. Grace knew by 1976. Neither company did a thing other than stonewall OSHA and write memos.
Guns, Germs, and Steel author Jared Diamond argues that a nation’s wealth is in part determined by how well its producers can gain economy of scale, and he uses beer as an example. At the turn of the century, there were hundreds of independent American brewers, one or more in almost every major northeastern or midwestern city; almost all have now vanished. The ten largest brewers now dominate the industry (as cited by Beerhistory.com). What happened? Baltimore’s National Brewing, maker of National Bohemian and smooth Colt 45, is representative: a few miscalculations, the dwindling importance of relationships between brewery sales reps and local publicans, a disastrous period of ownership by Black Label maker Carling Beer (itself recently purchased by Coors), and the rise of national television campaigns. Now the National Brewery building in Baltimore is being converted into condos and Mr. Boh is shipped in from North Carolina.
The Alphabet Synthesis Machine (which I discovered in a roundabout sort of way; it’s a project by Golan Levin, which I discovered after he responded to a post by Graham about another one of his projects, the wonderful Secret Lives of Numbers) is a fun little toy, designed to take a user-entered scribble and run it through a genetic algorithm until it produces something akin to an alphabet. It’s impressively neat — I really admire it, especially because V. suggested the idea to me a year ago and I dismissed it as too hard — but it doesn’t quite look like a real alphabet. There are a heck of a lot of alphabets out there, and they all seem just slightly more complex than those I managed to create with the Synthesis Machine.
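For the curious, the evolutionary loop behind a toy like this can be caricatured in a few lines of Python. To be clear, this is my own sketch under invented assumptions, not Levin’s actual code: I represent a "glyph" as a bundle of stroke segments on a unit square and use a crude fitness function (distance from a target stroke count, a stand-in for "about as complex as a real letter"); the real Machine evolves actual drawn forms from the user’s scribble.

```python
import random

TARGET_STROKES = 5  # assumed proxy for "letter-like complexity"

def random_stroke(rng):
    # A stroke is just a pair of (x, y) endpoints in the unit square.
    return ((rng.random(), rng.random()), (rng.random(), rng.random()))

def random_glyph(rng):
    return [random_stroke(rng) for _ in range(rng.randint(1, 10))]

def fitness(glyph):
    # Higher is better: penalize distance from the target complexity.
    return -abs(len(glyph) - TARGET_STROKES)

def mutate(glyph, rng):
    glyph = list(glyph)
    if rng.random() < 0.5 and len(glyph) > 1:
        glyph.pop(rng.randrange(len(glyph)))  # drop a stroke
    else:
        glyph.append(random_stroke(rng))      # grow a stroke
    return glyph

def evolve(generations=50, pop_size=20, seed=0):
    rng = random.Random(seed)
    population = [random_glyph(rng) for _ in range(pop_size)]
    for _ in range(generations):
        # Keep the fitter half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(rng.choice(survivors), rng)
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

Even with a fitness function this dumb, the cull-and-mutate cycle reliably converges; the interesting part of a project like Levin’s is entirely in choosing a representation and a fitness measure that make the survivors look like writing rather than scribble.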
If I were to offer my sweetheart a bouquet of heliotrope, hollyhock, and ivy, I wouldn’t just be offering her flowers on the day when florists do 30% of their annual business: I’d be sending her a message more specific than "I love you." Everyone knows that roses are for love and rosemary for remembrance; if we think about it long enough, we may recollect Ophelia saying that pansies are for thoughts. But the Victorians had a whole extended language of flowers. As an article in the delightfully named Collier’s Cyclopedia of Commercial and Social Information and Treasury of Useful and Entertaining Knowledge put it in 1882:
I hear via Kathryn that a new edition of Verne’s The Mysterious Island will be surfacing soon. My knowledge of Verne’s work is limited to 20,000 Leagues Under the Sea and Around the World in Eighty Days, and I’d rather watch a movie adaptation (the 1956 Around the World in 80 Days, with David Niven, John Gielgud, and little-known actor Noel Coward) or read Alan Moore’s cracked take on Captain Nemo (and others, including that symbol of capital and individualism turned monstrous and destroyed by the common people, the Invisible Man; he is, of course, an invention of the other oft-cited grandfather of science fiction, H.G. Wells, and the Wells that I’ve read, stiff though it is, has been more enjoyable than the Verne). Bruce Sterling has written a pair of essays (1, 2) linking Verne to photographer, caricaturist, writer, and portraitist Felix Tournachon (also known as Nadar); the 1848 Paris Revolution; the Bohemian movement; and other interesting French things of the time, so I should probably give Verne another shot. The really fascinating thing about the new Mysterious Island is that the translation was largely prepared by a non-professional fan, Sidney Kravitz, a retired engineer from Dover, New Jersey. It’s as if AnimEigo decided to turn to fansub creators for its next project, only more so, this being a world of literature almost completely dominated by professional scholars and translators. I’ll probably give The Mysterious Island a look-see for just that reason. Long live the amateur enthusiast!
How do you extract money from an idea? You can keep it secret and make sure that other people are reliant upon it. When Samuel Slater, father of the American industrial revolution, came to America, he had to disguise himself as an agricultural laborer. Trained mechanics were not permitted to emigrate from England, lest they share their knowledge and ruin the mercantile system that forced colonies to be importers of processed goods. You can make sure your ideas will be paid for by someone else — John Harrison, inventor of the chronometer, was working for a prize of £20,000. However, it took a personal intervention by George III, more than ten years after Harrison had successfully demonstrated his device, for Harrison to be awarded even a fraction of that. Since 1449, the English-speaking world has had another method: the patent.
Good food is capable of moving diners to tears of joy; bad food is capable of moving diners to tears of a different sort. But there’s surprisingly little information online about how we used to eat. If you’re looking for a medieval Norwegian cookbook, you’re in luck. Fanny Farmer is out there, and The American Matron, or Practical and Scientific Cookery (1851) and Mrs. Goodfellow’s Cookery as It Should Be (1865) are available as scanned pages, though not searchable (or easily downloadable) text. The Feeding America proposal to digitize and make web-accessible a dozen or two of America’s most important cookbooks is apparently waiting on a grant. But how did the average Betty or Joe eat in, say, 1950?
Rachel Laudan is a food historian, and she has a bone to pick with (in her phrase) the culinary Luddism of the Slow Food Movement. I first read Laudan when she tackled the question of authenticity in Mexican cookbooks (link via the website for the highly recommended newsletter Simple Cooking) and came up with the answer that authenticity is a sham. Her central points seem to break down into two concerns: first, that food wasn’t so great back in the day; second, that the very notion of authentic cuisine is problematic.
Henri de Toulouse-Lautrec may be more famous today for his life — a dwarfish, disaffected, aristocratic, womanizing drunk, he seems tailor-made to serve as a symbol for the seedy artistic demimonde of late nineteenth century Paris — than for his art. At the peak of his fame, however, he was the toast of Paris. Toulouse-Lautrec’s advertisement for the Moulin Rouge nightclub, where he was a regular, featured Louise Weber, la Goulue (that’s "the glutton", a reference to her hard-drinking ways), dancing a scandalous can-can and flashing her scandalous knickers; upon its release in 1891, it instantly made him the best-known poster artist in Paris. Toulouse-Lautrec would die ten years later, having produced a number of well-regarded oil paintings, over three hundred lithographs, and no heirs; the Toulouse family line, which could be traced back to the time of Charlemagne, ended with Henri.
Last weekend included a trip to the mall. I escaped with the purchase I had come for, a lingering headache, and an urgent need for a nap. Why had I spent so long in there? Perhaps it was due in part to the "Gruen transfer", the term for the shift from targeted buying to impulse buying that seems to have been publicized by anti-consumerism scold and Media Virus author Douglas Rushkoff. Architect Victor Gruen was one of the creators of the modern mall (and not, despite what you may read, the cannibalistic leader of Uganda); he designed America’s first fully enclosed shopping mall, Southdale Center in Minnesota, which is still in business today. Gruen had ideas about the new American city which were seized upon by Walt Disney, but while Gruen, like my home town’s founder, shopping mall and suburban developer James Rouse, seems to have genuinely wanted to improve urban America, his invention ended the era of the downtown department store and directly contributed to the demise of the city as an everyday destination. Gruen’s employer, the Dayton-Hudson department store chain, is still around, but it evolved as the department store declined and the big-box retailers (usually located in strip malls) arose to replace it; the company is now known as Target, and I can’t seem to leave its stores without buying something I didn’t intend to.
Peggy Lee, the impassive "Is That All There Is?" singer, passed away last week, and I’m sure some people out there are happy, not because they hated Peggy Lee or because they were traumatized by the Siamese song in Lady and the Tramp but because they had picked her in a dead pool. The dead pool — a contest based on predicting celebrity (or perhaps corporate) deaths in a certain period — is a game of long standing; one called the Game has been running since 1971, and they trace the idea back to either Maupassant’s Bel-Ami (in which a character tries to guess which members of the Academy will be the next to die) or the wagering on Papal lifespans in the 16th century.
Juliet at Eclogues has come out as a fan of creeping, eldritch horror, posting some great Weird Tales links, including a collection of Clark Ashton Smith short stories. Smith is really a lesser writer in the Lovecraft vein, but one of the great things about the Internet is that it only takes one fan of cult books (link via Baraita) to help spread the word. Unsurprisingly, Lovecraft’s friends and correspondents are well-represented on the web: Fritz Leiber (best known for the Fafhrd and the Grey Mouser stories, Leiber also wrote the subtle tale of architectural horror, Our Lady of Darkness); Robert W. Chambers, author of "The King in Yellow"; Conan creator Robert E. Howard, who discussed ice cream flavors and turkey dinners with Lovecraft; Arkham House founder and posthumous Lovecraft editor August Derleth; and, of course, the man himself. I’ve always found Lovecraft’s life (his feelings of inadequacy after dropping out of school and failing to attend Brown University; his xenophobia and racism; his failed marriage and the role of his aunts in its failure; his slow descent into poverty; and his convivial spirit in the face of these adversities and a literary career he assumed would be entirely forgotten) and what I’ve read of his correspondence more interesting than Lovecraft’s fiction, but going to school in Providence gave me more of an appreciation for his work. I cannot think of a city that would be more appropriate for tales of decay and the haunting past. Lovecraft died of cancer in 1937, having maintained a clinical journal of his symptoms that he thought might benefit research. He was buried in an unmarked grave; forty years later, fans took up a collection to buy him a headstone. It reads "I AM PROVIDENCE", and on nights when fog wrapped College Hill, I could believe it.
Stephen Ambrose is officially screwed. A politician might be able to survive a plagiarism scandal, but a historian? As examples of unattributed direct quotations continue to turn up throughout Ambrose’s work — more oral history than academic research, and, according to this Ambrose-bashing article I found linked on Talking Points, endlessly recycled — his career is going up in flames. I think, as Mickey Kaus puts it, "Ambrose’s best defense may be ‘I don’t really write my books.’" Ambrose, like Tom Clancy or James Michener, employed a bevy of researchers. I’m not sure that it would salvage his reputation to be revealed as someone who relied on ghostwriters, but wouldn’t that be better than being a thief?
In 1994, Marion Tinsley, the world’s best living checkers player, took on Chinook, the world’s best unliving checkers player, in the Second Man versus Machine Championship. Tinsley withdrew due to health concerns, and the man who replaced him, Don Lafferty, split the twenty-game series at one win a side (with eighteen draws). In 1995, Chinook beat Lafferty in a thirty-two-game series (one win, thirty-one draws) to become the undisputed world checkers champion. But two hundred years earlier, a Hungarian inventor named Wolfgang von Kempelen had created a mechanical chess player that could compete against Europe’s strongest players.
Urban legends — stories about $250 cookie recipes from Neiman Marcus or hippie babysitters putting babies in the oven — are a sort of folklore. They get passed around via email, Ann Landers prints them now and again, they slowly change to fit current events, and they end up getting debunked on Snopes. I never thought that an entire urban oral tradition, an underground mythology about the battle raging between angels and the forces of Hell, might exist, but according to this fascinating 1997 article about "shelter folklore" among Miami’s homeless children, it does. The Devil and his agent, Bloody Mary or La Llorona, hunt the children: "’If you wake at night and see her,’ a ten-year-old says softly, ‘her clothes be blowing back, even in a room where there is no wind. And you know she’s marked you for killing.’"
Somewhere between Kubrick’s The Killing and Peckinpah’s The Getaway, something happened to American culture. Crime novelist Jim Thompson wrote The Getaway and helped Kubrick spice up the dialogue of The Killing. But while Sterling Hayden can’t outthink fate in The Killing, Steve McQueen and Ali McGraw make it over the border to Mexico and freedom in The Getaway, escaping the nightmarish end their characters meet in Thompson’s novel. You could attribute the difference between novel and movie to the fact that Thompson was fired from The Getaway, part and parcel of his years of Hollywood failure. Thompson, variously a Catholic and a Communist, had a decidedly grim view of human morality and a rough sense of justice. You could attribute it to the decline of the Hays Code. But I’ll attribute it to James M. Cain.
The end of the old year and the ringing in of the new is often a time for lucky food. In China you might eat a plate of dumplings looking for one with a coin inside — shades of the bean-sized Jesus doll baked into king cakes eaten before the onset of Lent, the sixpence in Christmas pudding, or the coin Greeks bake into vasilopita for New Year’s Day. Germans can ensure their luck with pork and sauerkraut, while the Japanese eat herring roe for luck. Poles sometimes eat pickled herring, and those lucky Scots have the traditional Hogmanay haggis. But for my money, the best lucky New Year food is hoppin’ John or a plate of black-eyed peas.
The redoubtable Miss Manners says that when you get down to it, people really only tell two stories: "My, how bad things have gotten!" and "My, how clever I am!" A book on the dumb things college students say is therefore a timeless idea. However, various people inside and outside the university are telling us that American colleges are being overrun by humorless, dronelike knobs — even at rarefied campuses such as MIT and Princeton.
Unsurprisingly, it was a bookish Christmas. Among the many, many books given to me and V. by our friends and relatives was The Arts of Deception: Playing with Fraud in the Age of Barnum (thanks for the recommendation to G., who surely deserves a kinder present than the DVDs of the damned that I saddled him with; however, I needed to make good on a promise), which I finished yesterday. V. received Carter Beats the Devil, a book recommended by Judith, which is supposed to be quite good. The two books are about roughly contemporary historical periods, and moving on just a few years brings us to the time of Louis Feuillade’s bizarro French serial Les Vampires, a gift from me to V. (I swapped DVDs with my friend Andrew; I gave him The Stunt Man and he gave me Coup de Torchon, a French adaptation of pulp auteur Jim Thompson’s disturbing little fable, Pop. 1280.) And much more (stationery! Le Creuset! Sweaters galore! Being John Malkovich!) — it was a good loot year, even if I didn’t get Detective Comics #27, an authentic Navy howitzer from 1883 (pointed out to me by Aaaugh!), or Potato Island. Maybe next year.
Do you find the holidays stressful? Does dealing with in-laws get you down? Why not horrify them with an old-fashioned Yuletide tradition, the Christmas ghost story? The most famous, of course, is Dickens’ A Christmas Carol, a perennial classic that is also short enough to be read aloud on Christmas Eve, if you have a bit of patience and a durable larynx. If you are looking for something shorter, a little digging will produce many authentic Victorian Christmas ghost stories, for instance "My Cousin, the Ghost, or Something Like a Christmas-Box", by Alfred Paxton, a genuine Victorian original from The Boy’s Own Paper, January 6th, 1883. (One presumes that The Boy’s Own Paper was a penny dreadful, designed to appeal to the sort of nineteenth century ragamuffin that Scrooge would have gladly caned.) From the turn of the century until after the Great War, M. R. James, a medieval scholar at King’s College, Cambridge, made a tradition of writing ghost stories which he read aloud to friends and colleagues at Christmastime. James is considered one of the creators of the modern ghost story; his stories are widely available online, and Ash-Tree Press has collected his entire body of work within and about the ghost story genre into a single volume. Other writers are in the "Jamesian tradition" (Ghosts and Scholars defines this liberally, including such works as Fritz Leiber’s marvelous and creepy Our Lady of Darkness). Consider the Canadian novelist Robertson Davies, whose collection of ghost stories written for the University of Toronto Christmas party, High Spirits, features such less-than-terrifying works as "The Ugly Spirit of Sexism", "The Xerox in the Lost Room", and "When Satan Goes Home for Christmas". Davies himself cites Montague Summers, another scholar and ghost story writer; sadly, none of Father Summers’ ghost stories appear to be available online, but a well-stocked library will likely have a handful.
Next year, settle down by a roaring fire and give your loved ones something even scarier than a stocking full of coal to think about. And I would like to wish my readers a merry Christmas and a joyful holiday season; don’t be scared by things that go bump in the night!
In the late eighteenth century, Robert Barker, an Irish portrait and landscape painter living in Edinburgh, had a brilliant idea. He would paint a 360° view of London (making sure to get the perspective right), enclose it in a circular room, and charge an admission fee. Amazingly, the idea worked. Barker had invented (and eventually secured a patent upon) the panorama. The panorama proved to be wildly successful, and variations were created. The moving panorama had rollers, allowing the scene to slowly move past a stationary audience and create the illusion of motion; the diorama, the main source of income for its inventor, Louis Daguerre, early in his career, used tricks of light to dissolve from scene to scene. There was money to be made in the panorama business, and a man named John Banvard was determined to cash in.
Combine cheap wood-pulp paper, prose deemed by noted critics to be "turgid — even bombastic — involved, needlessly parenthetical, and superabundant in epithets", and a plot that manages to combine adventure, sentiment, and the forbidden thrill of miscegenation, and what do you get? A best-seller. Ann Stephens was the author of Malaeska, the Indian Wife of the White Hunter, an 1860 reprint of an 1839 piece written for The Ladies’ Companion. Published by Irwin Beadle, it was a massive success, selling upwards of 60,000 copies, a respectable number even today. The price of Malaeska gave birth to a term: Ann Stephens had written the first dime novel.
Last weekend, I introduced V. to Two-Lane Blacktop, a movie every bit as good as the reputation I wasn’t aware of when I first saw it. (One of the great advantages of working at the Fine Arts in Berkeley was that I could rely on Keith and Emily to know about such things.) Warren Oates, dressed in a dazzling array of pullovers, gives the best performance of his lengthy career, and both James Taylor and Dennis Wilson (of the Beach Boys) acquit themselves well. It’s pretentious as all hell, with the baffling last shot, the portentous character names (as the poster says, "James Taylor is the Driver. Warren Oates is GTO. Laurie Bird is the Girl. Dennis Wilson is the Mechanic."), and the incredibly laconic characters played by Taylor and Wilson. The director, Monte Hellman, was a Roger Corman house director. Corman, the frighteningly prolific director and producer, had an eye for talent, and hired Hellman to make two movies with a B-movie actor and screenwriter named Jack Nicholson. The two Westerns they made were The Shooting and Ride in the Whirlwind; when Nicholson became a star with Easy Rider (and, more importantly to American International Pictures, showed that the youth market could be worth a hell of a lot of money to a studio), Corman sent Hellman off to do a road movie of his own.
Recently a Slate piece on Monsters, Inc., made reference to today’s "golden age of American cartooning":
Most people don’t think about media criticism that often. They might gripe about the left-leaning tendencies of a particular columnist or paper; they might grouse about what they see as blatant lies spewed forth by talk radio hosts; they might occasionally catch wind of journalistic scandal, when a Pulitzer winner returns her prize for falsifying a story or a columnist admits to plagiarism. There are occasional flareups of interest, like those generated by Salon columnist David Horowitz’s attempt to show that right-wing speech in campus newspapers was being stifled. (Horowitz shopped a vaguely inflammatory ad denouncing slavery reparations around to numerous college papers. He got his wish when thuggish students at several campuses stole newspapers that ran the ad, but when Princeton’s newspaper called his bluff and ran both the ad and an accompanying editorial denouncing both the ad and Horowitz, Horowitz decided that the way to encourage free and open debate was to stiff the Princetonian on the thousand dollars he owed them for running the ad.) Since the attacks on New York and Arlington, media criticism has been perhaps more visible than usual, as various people who have the time to worry about such things have been beating each other up in the press about defining legitimate responses to the attacks, and the usual debate about the proper relationship between the American press and military breaks out once again. And now there’s another possible flareup of attention: the news is in (via MetaFilter) that the editor of Smartertimes.com will be involved with a new New York paper, the New York Sun. I’m a big fan of media criticism, and I wish that I liked Smartertimes more.
Scientists believe that HIV has existed since the early 1950s, if not earlier; a scattering of plasma and tissue samples (from an African, an American, and a Norwegian) shows that it infected at least some people before AIDS became a pandemic. What made the virus so virulent? What changed? Well, HIV is terrifically prone to mutation, so it’s possible that the forms that existed before the 1970s were just less easily transmissible. But it seems more likely that the modern world makes us more vulnerable — ignoring whether behavioral patterns changed (through increased casual sex or increased use of IV drugs), the formation of a more efficient blood bank system and hugely increased intra- and international travel made it much easier for HIV to travel. I don’t see this trend going away; just as political and economic instability halfway around the globe can now affect the United States virtually immediately, medical crises are not always going to stay regional in a more interconnected world. In 2000, an estimated 2.4 million sub-Saharan Africans died of AIDS; an estimated 780,000 South and Southeast Asians died; an estimated 5 million people were newly infected with HIV that year. I’m fortunate that no one close to me has acquired AIDS, although friends and relatives of friends and relatives have died, and in the United States, we’ve done a largely masterful job of limiting the transmission of HIV and helping HIV-positive people live longer and healthier lives; still, I don’t think we’re going to be able to ignore the spread of AIDS outside our borders forever. (Links via MetaFilter’s day of participation in Link and Think, a weblog project for World AIDS Day, and communication with AIDS-researching MeFilistine Sennoma.)
Lingua Franca, the magazine that bills itself as "The Review of Academic Life", may be shutting down. Ron Rosenbaum, the author of Explaining Hitler and The Secret Parts of Fortune (which I highly recommend), expresses hope that some "hero" will step up. Judith Shulevitz half-seriously suggests that the government subsidize little magazines. But little magazines have always gone out of business; the Little Magazine Project plans on indexing at least 2,500 titles from after World War II, and I’d suspect that no more than a twentieth of the magazines from the ‘40s, ‘50s, and ‘60s are still publishing.
From its beginnings as a European colony, Rhode Island (where I have spent a lovely Thanksgiving holiday) was a haven for religious dissenters. Despite the folk etymology that "Rhode Island" is a corruption of "Rogue’s Island", the name arose from a comparison of the size of Block Island, today a summer vacation destination for Southern New Englanders, to the Isle of Rhodes; the "Rogue’s Island" nickname might well have arisen later, as a Puritan response to the trickle of dissenters who were expelled or removed themselves from Puritan Massachusetts and headed south to religious liberty.
Why is it so ridiculous that America would hire an advertising executive to coordinate its propaganda campaign in Central Asia and the Middle East? Charlotte Beers broke new trails for women in the field of advertising. Now she is the State Department’s point woman for the dispersal of American propaganda. An article in Slate suggests that the move to hire Beers has been met with some sniggering from the press. I don’t know about that — mentions of Beers that I’ve read have largely (if not universally) been neutral to respectful. But if anyone is taking this cynically, it’s a shame. Beers is attempting something that will, if successful, make Americans safer: the partial defanging of anti-American rhetoric in the Muslim world. Beers isn’t trying to sell Americans on the Administration’s land grab of governmental authority for the executive branch or the shameless retroactive repeal of the corporate alternative minimum tax, a $25 billion hand-out to some of America’s largest corporations. Beers is trying to counteract a stream of falsehoods that contribute directly to the radicalization of the Islamic world. America need not be a perfect nation — or even a very good one — to be better than the Taliban; the American way of life doesn’t need to be misrepresented to offer something appealing to the average Muslim (probably young and poor, quite possibly jobless) in the Third World. Making sure that the truth can be heard should be enough.
The American Highway Project is dedicated to preserving images of "the architecture and cultural landscapes situated along the highways of the U.S." Texaco stations and abandoned signs are nothing more than historical artifacts waiting to happen — ghost towns in the making (links via Eclogues). If there’s something that the web is really good at, it’s providing a venue for these images of vanished and vanishing worlds. James Lileks’s photo and postcard archives of the forgotten Midwest are the product of one man’s obsession, but they’re as fascinating in their own way as the jaw-dropping Prokudin-Gorskii photographs of Russia in the years preceding the First World War. The difference is that one will be displayed at the Library of Congress and one is available to me only online. I’m a soft case for American cultural ephemera, and this sort of website fills a huge gap. Even beloved icons like San Francisco’s Doggie Diner heads, which have narrowly escaped destruction a number of times, are subject to the indignities of wind and rain. There are worse things to imagine than a world unfamiliar with tacky motel logos and old-fashioned urban advertising, but thank goodness people are putting together archives and making them available while we still have more than pictures left.
Suppose you are driving cross-country and you decide to stop in Abilene, Kansas, on a hot summer’s day. There are a few attractions where you can get into the air conditioning and buy a postcard. There’s the Dwight D. Eisenhower Museum, which you would probably need to pay me to visit, and the Museum of Independent Telephony, which is more interesting than you might think, as it reveals a whole world of small-town technological and societal infrastructure. You can almost taste the wheelings and dealings that led Abilene’s Brown Telephone Company to become one of America’s larger corporations. And then there’s the Dickinson County Historical Museum, which devotes much space to Abilene’s most famous lawman, Wild Bill Hickok.
The first indie rock show that I ever went to was Tsunami, with Franklin and Edsel opening, in the basement of an anonymous building at the University of Maryland. Or at least that’s the first that I can remember; I may have forgotten one or be repressing the thoughts of something that my modern self finds embarrassing (although after some cajoling and with a few drinks under my belt I’m willing to admit that I went to see Rush with Mr. Big opening while I was in high school, so the embarrassment bar must be set pretty high). So that’s my seminal indie rock show, just as Superchunk’s No Pocky for Kitty was my very first indie rock album. Tsunami was intricately tied to the DC/Arlington pop scene and to Simple Machines Records, whose "Mechanics Guide to Putting Out Records, Cassettes, and CDs" probably did as much as MRR’s Book Your Own Fuckin’ Life to foster the "Hey kids, let’s put on a show!" ethos of early- to mid-Nineties indie rock. Jenny Toomey of Tsunami was unfailingly nice to 16-year-old me, even when she probably shouldn’t have been; she let me interview her via email for the zine I did in high school, and there’s a direct line between my flailing away with a Xerox machine and a glue stick and the writing I’m doing now. Jenny’s been doing some thinking about the future of music (of the indie variety) in a post-Napster age, and she’s finally gotten herself a website. The nostalgia mills are recycling all kinds of things — hair metal bands, say — that should have been left buried, but Jenny is someone I’m thrilled to see resurface. It’s a shame I haven’t been keeping an eye out, because I bet she never left.
I’ve often wondered how it feels to be a reporter at the dual-natured Wall Street Journal. The Journal is one of the finest — if not the finest — sources of business reporting in the country. It’s just the place you want to turn for information on the nation’s economy, not to mention investigative journalism like last year’s discovery of fraud at AremisSoft (in which company officers apparently just made up foreign contracts to pad their numbers). It’s also a pretty good source of general news, although its bread and butter is clearly the business reporting. But the editorial page generally runs, to paraphrase Dorothy Parker, the gamut of opinion from A to B. The viewpoints are diverse: from Kissingerian fatuousness to Peggy Noonan’s loopiness, from corporate welfare apologists to the occasional conspiracy theorist. The Journal is entitled to write editorials and print op-eds with whatever slant it wishes, of course, but it’s a shame that the editorial page doesn’t even try for a semblance of even-handedness.
In 1635, the first patent on a perpetual motion machine was granted in England, and to this day, people are churning them out. The laws of thermodynamics mean that a closed system can only lose energy, not produce it, and the various schemes that have been proposed all overlook certain basic facts of physics that prevent them from churning out free energy forever. Tidal turbines and Stirling engines are not enough; these tireless inventors want to start a revolution. And can you really blame them? Given the panoply of human desires, it’s hard to take umbrage at people who want to get rich and famous off a revolutionary discovery that will give everyone clean, cheap power. And what could be more romantic than a grand windmill-tilt?
Halloween has come and gone. I wore no costume. I skipped the only Halloween party I was invited to; V. and I have been getting up at 6, and I just didn’t think I could bedeck myself in nacre and ormolu and swing, brother, swing after a ten-hour day. We bought two bags of candy and received not a single trick-or-treater. I don’t know if it’s the awfulness of the September 11 attacks, anthrax fears, or just a sense that partying would be inappropriate, but this was a very subdued Halloween. I’ve been invited to a Day of the Dead party tonight, but I may not go. I looked forward to this holiday so much when I was younger; even a few years ago, this was a big deal among my friends, who managed such costumes as an angel with a five-foot wingspan, a BART station, and a commedia dell’arte puppet. But this year? We went to my friend Andrew’s apartment last weekend and watched a double feature of Spielberg’s excellent Jaws (which V. had never seen before) and the impressively shot (for its budget) and innovative but not actually good Carnival of Souls. Other than that, no concessions were made to the holiday, and ersatz and commercial though it is, I kind of missed it. I recently discovered, to my immense delight, Haunted Attraction, the trade journal of the "dark amusement industry". Spend a few minutes looking at the wonderful, wonderful articles. Even if Halloween’s pleasures prove transitory, geekery is forever.
I’m now halfway through my original twelve-month plan for the Literary Year, and boy, writing book reviews is harder than I thought. I’m increasingly impressed by people like Jessamyn, Jeremy, or Cosma (whose site is worth exploring at great length; I first stumbled across it when I was in college, and I’ve enjoyed stumbling across it ever since) who can write reviews of a significant percentage of what they’ve read; I can read the darn things faster than I can review them, as evinced by my dozen-book backlog. I still haven’t managed to shame myself into finishing Gravity’s Rainbow (although perhaps the public declaration that I’m trying to will help), and the fact that I’m clearly reading fewer books than I did is vaguely depressing. Still, it’s a fun project; even if people don’t stumble across something worth reading and I don’t manage to polish up my reviewer skills, it still might be an interesting document twenty years from now as Older Me stares in awe at the incredible cheek and lack of discernment on the part of his younger self. These lists can be fascinating personal histories — witness the amazingly voluminous What I Have Read (found via MetaFilter, of course).
Gloomy thoughts have been on my mind lately. While pulling down links for a MetaFilter post on the Civil Defense Museum (which you should read, as the links other people posted are golden), I came across a page on nuclear holocausts in popular culture and, from the same professor, a fascinating compendium of plot summaries for atomic war fiction. It’s a popular story. The end of the world has always been with us.
Last weekend was, I suspect, the last gasp of summer. Two days of driving to strip malls in my new used car, Walter Mitty; listening to Guided By Voices (remember: "I am a heavy drinker who enjoys The Who. You are fired. Now, a high kick."); heading to the farmer’s market to buy lima beans and eggplant; buying books at the grungy used book store (two for me, one for V., and three as a swap to send to Judith); taking long walks in the warm sun; sitting outside the coffeehouse eating a cookie under a blue, blue sky. Goodbye, summer of 2001. We hardly knew ye.
For the last six years, I have had an October ritual: rooting for the Yankees to lose. For four of those years, I have been disappointed (and things don’t really look good this year). I grew up rooting for the Orioles; the Yankees have beaten out my secondary rooting interest, the A’s, the last two years; and I absorbed a healthy dose of Yankees antipathy from my grandfather, a Red Sox fan — but I’m not quite sure why I like baseball so much. There’s a tradition of enjoyable baseball writing (from Roger Angell, most notably, but also from people like Mark Harris and W.P. Kinsella). There’s the stat-hound factor; over the last thirty years Bill James and other sabermetricians (from SABR, the Society for American Baseball Research) have completely rewritten how many people — ESPN.com’s Rob Neyer being a prime example — think about baseball. You don’t need to get into real propeller-head detail, although I find that sort of thing interesting if it’s short enough; after reading one Bill James book and seeing how he codifies baseball by studying the reams of statistics the sport produces, I honestly think you’ll know more about baseball than most managers did twenty years ago (and more than some do today). But I think the thing that attracts me the most is the history — the football greats of the ‘60s probably couldn’t even make the practice squad of a modern NFL team, but I think that Josh Gibson or Lefty Grove or Rogers Hornsby would still be great. Baseball just hasn’t changed nearly as much as any other American sport. You can compare Honus Wagner to Cal Ripken or Alex Rodriguez and have a meaningful argument about which was better. When you see a game at Fenway Park or Yankee Stadium, the sense of history just boils off the field. I love making that brief connection to America’s pastime from days gone by — even if, then as now, the Yankees were probably winning.
Charles MacKay, in Extraordinary Popular Delusions and the Madness of Crowds, writes:
The discovery that the anthrax sent to Tom Daschle’s office was military-grade is rather shocking; anthrax is most deadly when made into a fine powder, something highly difficult to do. It had been thought that only the United States and the USSR had succeeded in creating aerosolized anthrax (which can be easily disseminated through the air and is more likely to be inhaled; anthrax is much deadlier in the lungs). I hope we find who did this and render them utterly incapable of doing anything like this again for a long, long time; I hope the yahoos who have been making fake anthrax threats enjoy the years they’re going to spend in jail. Obviously, this is terrible and scary and we need to get to the bottom of it as soon as we can. But my irony alarm started ringing when I read Senator Charles Schumer’s suggestion that generic ciprofloxacin be produced immediately.
Steven Baum over at Ethel the Blog is doing an excellent job throwing out links to articles I’m unlikely to stumble across on my own. I think many of the articles he links to are wrong, but they’re almost uniformly thought-provoking. I have to disagree vehemently with this one, though. Steven links to a story in The Guardian describing the life cycle of propaganda. I’ll be the first to admit that the U.S. has a long history of fighting a propaganda war; Richard Hofstadter‘s essays on the roots of American imperialism are convincing. Propagandizing is not always shameful. Should the U.S. have held back from printing "Loose Lips Sink Ships" posters? Should Edward Bok have resisted turning the Ladies’ Home Journal into an unofficial arm of Herbert Hoover’s Food Administration during the First World War? Should Wilson’s government have locked up Eugene Debs and other war protesters? (Although the first one is innocuous and the second one only mildly disturbing, I’ll side with Mencken on that last one; H.L., no friend of socialists, wrote that Debs was misguided and dangerous and foolish, but a gentleman and in his own way a patriot, then laid into Woodrow Wilson as only Mencken confronted with censorship could.) But beyond that, the article in the Guardian makes a few risky assumptions.
Barry Bonds hit home runs 71 and 72. The Oakland Athletics won their hundredth game, which I hope will lead to more stories on their excellent starting rotation (including certified weirdo Barry Zito). I bought a block of Colby cheese and a pound of coffee at the store. I bought a used Honda to replace my beloved but aging and infirm Maxima. Sam Coomes of Quasi humped his keyboard at the Black Cat, while I (inexplicably stricken by a black mood) glared at various people in the crowd. I bought pears and apples and peppers and garlic at the farmer’s market, then bought some vegetable pakora from a stall run by Hare Krishnas. The shooting war in Afghanistan broke out. I saw Neko Case and Her Boyfriends at Iota in Arlington, and they played my two favorite songs off Furnace Room Lullaby.
Giovanni Casanova was a preacher, a violinist, a secretary, a jailbreaker, an occultist, and a librarian. He was famously a lover, but the reason for that fame lies in his writing. Histoire de ma vie wasn’t published until twenty years after his death, and the French edition was butchered by its editor, one Prof. Jean Laforgue. But eventually more faithful translations emerged (the English edition, by Willard Trask, wasn’t published until 1966), and it’s a joy to read. Consider his account of how he lost the patronage of the Grimaldis:
I can’t do it any more. I can’t think about Afghanistan and Bin Laden every single day. I need to start thinking about other things. It’s not survivor guilt, or even (just) a sense of helplessness. It’s that America is in a kind of phony war interlude; we know something is going to happen, but we don’t know what or when. (And I hope it either happens soon or waits until the spring; if history has taught us nothing, I hope it has taught us that land wars in Asia in the winter are unwise.) And I think, and I worry. The phrase "NBC", for "nuclear/biological/chemical", is being tossed around. Paul O’Neill is reversing his previously stated opposition to a crackdown on money laundering in offshore banks, but Senate Republicans are still working on jamming Alaskan wilderness refuge oil drilling legislation through. War game planners are probably hard at work trying to figure out what’s next, having already established (in the "Dark Winter" exercise, starring Sam Nunn as the President) that the government "currently lacks adequate strategies, plans, and information system to manage" a large-scale bioterror attack. And it’s autumn. The moon is full. The air is getting a little more biting. The apples have started to come in. Half of all Americans support making Arabs carry identification, even if they are citizens.
Slate’s Inigo Thomas has a chilling speculation on Osama bin Laden’s master plan in the 9/11 attacks: force America into Afghanistan, then use a resulting anti-American backlash in Pakistan as cover for a raid on Pakistan’s nuclear weapons supply. Thomas quotes the Christian Science Monitor‘s description of Pakistan as a "powder keg". Pakistan’s army and immensely powerful Inter-Services Intelligence agency may not be able to stand by and let the Taliban, their most successful creation, be destroyed; America may not be willing to let them do anything else.
Is the CIA capable of investigating an Islamicist terrorist movement? One former operative says no, quoting a former member of the agency’s Near East Division: "The CIA probably doesn’t have a single truly qualified Arabic-speaking officer of Middle Eastern background who can play a believable Muslim fundamentalist who would volunteer to spend years of his life with shitty food and no women in the mountains of Afghanistan." There are, of course, dissenting opinions; a different foreign operative argues that "it’s easy to find anyone if you’re willing to pay enough money", although he makes the rather dubious case that Congressional oversight and CIA reluctance to recruit bad guys handcuffed intelligence operations. Given American backing of the various murderous right-wing regimes during the Cold War (as well as the more specific rebuttals linked above), this is somewhat hard to believe. But as William Saletan argues, anti-terrorism, like anti-Communism, is soon going to create its own moral framework in which smaller considerations (such as not dealing with murderous zealots) fall by the wayside.
I went to watch the Orioles beat the Yankees last night, and there were spontaneous chants of "U-S-A! U-S-A!" And it was weird, but not as weird as J.R. of Painted Land‘s remarkable trip to beautiful Fenway Park on Thursday.
War, war, war. The declarations that we are "at war" are coming furiously now — at war with an unknown enemy, at war on a battleground of the whole world. But we aren’t at war; what was done to New York (those poor dead secretaries and lawyers and janitors and insurance brokers; those poor brave fire fighters and police) and Arlington justifies military action, unilateral or multilateral, in a way that I think nothing in Kuwait ten years ago did. But is it war?
As a followup to Jason’s advice, you can contact the Red Cross for information about the nearest location to give blood. Their server is completely overloaded, and I can’t get through to 1-800-GIVE-LIFE, but they’re going to continue to need blood over the next few days. (The eligibility guidelines are very slow coming up at the moment, but if you’re above 110 pounds, don’t have a tattoo or bloodborne diseases, aren’t currently sick, and haven’t spent more than a few months in Europe in the last few years, you’re probably fine.) Here is the Google cache of their page; I hope that when things are slightly calmer, I can get through, find a location, and donate on my lunch hour.
I’m simply stunned by the events of today. I don’t think there’s anything that can be said at this point in time, except Jason of Queso’s sensible comment:
Man is a classifying animal. The urge to break things down into their component parts is an ancient and honorable one. Witness, for instance, the dietary laws delivered by God to Moses and Aaron in Leviticus:
What do you call something that’s big? Real big? Bigger than big? Maybe it’s mammoth, from the Russian mamut. Although the woolly mammoth has been extinct for approximately 10,000 years, it inspired art from Cro-Magnon days until modern times: "The mammoth roared again like thunder, / And charged as only mammoths can." (Mammoth links from the Field Notes Macropedia, where yet more can be found.)
One of the advantages to my girlfriend’s presence in grad school is that I can piggyback on her access to a research library. For instance, I have a decent chance of one day being able to read this essay on Stapledon’s immensely dense, often painfully expository novel of the repeated rise and fall of the human race. It’s an odd book and I’m looking forward to reading Lem’s take on it. Lem is, of course, no slouch of a writer himself — his mathematical love poem from The Cyberiad was one of the readings at my friends Josh and Kim’s wedding, and it’s a fine example of Lem at his best. Science fiction is blessed with good writers who are also good critics: Samuel R. Delany and Joanna Russ are two of the better known examples, but dozens of other genre writers have taken stabs at criticism. (Le Guin’s excellent critical writing is particularly noteworthy, as, in a different way, is Bruce Sterling’s aggressive little zine, Cheap Truth, almost as responsible as Neuromancer for creating cyberpunk.) I’m headed out of town for the weekend, but I’m bringing along a slim volume of R.A. Lafferty essays, It’s Down the Slippery Cellar Stair, which I bet will be as good as everything else he wrote. Then back to Science Fiction Studies to build up a reading list — let’s hear it for research libraries.
The state sport of Maryland is, naturally, jousting, but many have argued that the state should instead celebrate duckpin bowling. Duckpin bowling (and its New England cousin, candlepin bowling) stemmed from bowling alley owners’ efforts to differentiate themselves from alleys offering "big pin" bowling. Professional bowling and bowling leagues are less popular than in their heyday, but duckpin and candlepin bowling are almost entirely moribund, having fallen back from national semi-prominence to a flickering presence in their home regions.
Congratulations to Greg, who has arrived in Reno! (You can read dispatches from his trip at Notlost.) The arch seen in his final post of the trip has a history dating back to 1926, when it read "Reno Transcontinental Highway Exposition". Reno’s motto — "The Biggest Little City in the World" — has appeared on the arch since 1929, the winning entry in a contest to choose a city motto. Reno was named after Union Major General Jesse Lee Reno on May 9, 1868. Although Reno is most famous now for gambling, at one point Reno had a different reputation. Thanks to Nevada’s liberal divorce laws and six-week residency requirement, Reno was once the prime location for a peculiar sort of spa: the divorce ranch, where upper-class women could while away the time waiting for their divorce to come through. Today, the divorce ranch is largely forgotten, remembered only through the occasional reference in literature from the first half of the century and George Cukor‘s 1939 film, The Women. But if you’re ever in Las Vegas, stop by the Floyd Lamb State Park, the site of a former divorce ranch, and imagine a rich woman — a Rockefeller, a Vanderbilt — spending a few weeks on the Monte Crisco or the Lazy ME, taking a poolside nap, flirting with cowboys, and riding into the sunset.
Most literary frauds tend to have obvious goals: "Some forge for love, some for money, and some for the glory of having done it." The false Shakespeare plays of William Ireland were done for fame; Alan Sokal or the Spectralists wrote their literary fakes as parodies designed to show a lack of critical judgment in movements they disagreed with. The Necronomicon (as available in stores, not as cited in the works of Lovecraft, Robert E. Howard, and the like) was put together to help publishing companies make a cheap buck off gullible high-schoolers. But what about those who lie in between — what about, for instance, Abbé Charles Etienne Brasseur?
The last couple of days at work have been kah-razy, man, kah-razy. Now I’m home again, and I spent the entire drive home thinking about how to relax. My sweetie is making homemade sauce. I thought about what to listen to — Louis Armstrong‘s "Skokiaan" (also known as "that song from the end of Slacker", and one of my all-time favorite cheer-up songs) or Zumpano or the Pixies B-sides compilation on 4AD that I finally picked up. Something I find soothing. And a pot of tea, definitely tea. And something to read; I was thinking about starting Zilpha Keatley Snyder‘s The Egypt Game. I wanted to aim for the intellectual equivalent of nursery food, described by Jane and Michael Stern in Square Meals as food "that cannot fail to ease even the grumpiest crosspatch." No cilantro, nothing garlicky, no garam masala. I want to turn my brain off. But I discovered via Kathryn "the Oracle" Yu that Philip Pullman has been nominated for the Booker Prize. Maybe children’s books are an adult pleasure now. I’m thrilled for Pullman, but also irrationally peeved. Garlic and cilantro are fine, but tonight? Velveeta and buttered noodles, please.
Recently, I ran across a reference to the word "ecdysiast". It wasn’t attributed (by the AskOxford folks) to H.L. Mencken, so I dashed off (and clearly failed to proofread) a letter. Praising Mencken led me to read some of Mencken’s essays again, and what a treat they are. Mencken was the author of The American Language and several volumes of essays under the apt title Prejudices; he was a newspaperman, a columnist and reporter for the Baltimore Sun; he was an editor, founding both the American Mercury and the now-forgotten Smart Set. Mencken was virulently anti-Christian, anti-fundamentalist, anti-Appalachian, anti-black, anti-Semitic, anti-female, anti-FDR, anti-upper-class, anti-middle-class, and anti-lower-class. He was anti-people. He probably kicked dogs and babies. About the only things I can bring to mind that Mencken thoroughly approved of were free speech, lucid writing, good cigars, and the occasional tipple. One of the things he mostly approved of was Harriet Monroe‘s Poetry.
Last week, I got a song stuck in my head. "You threw out my Nancy Drew books / My model horses from Massachusetts / All my Barbies and all my Kens / My stuffed animals, my childhood friend!" The song is called "Nancy Drew", it’s by a band called Tuscadero, and even though I own the Teenbeat album it’s on, I probably haven’t heard that song in five years.
The name "Delia Derbyshire" may not be familiar to you, but if you were even a marginally geeky child in junior high, you’ve probably heard her work. Derbyshire, who died this July at the age of 64, wanted to be a recording engineer for Decca, but was turned away because she was a woman. Instead, she went to work for the Radiophonic Workshop at the BBC, and quietly developed into a prolific and pioneering electronic composer, working with comparatively primitive equipment to create haunting music for television, records, and live performance.
If you’re a casual reader of Shakespeare and you run across a word, phrase, or reference you don’t understand, what do you do? Grab an annotated edition! Annotated editions — whether of Shakespeare or the Bible, Alice in Wonderland or The Wizard of Oz — make faking scholarly knowledge of a work ever so much easier. Annotations flourish on the Internet, but they tend to be a bit more esoteric than Shakespeare and the KJV (or even Lovecraft and Carroll and Baum).
A MetaFilter thread on text adventure game writer Andrew Plotkin led to a few happy discoveries. Plotkin (whose website is at eblong.com: "Blong! You are a pickle." Nobody who likes Daniel Pinkwater could be an evil man!) has written a number of entertaining (and, in some cases, viciously difficult) interactive fiction games, some of which I have played and enjoyed. But there’s more to this story.
Tonight I made a batch of spicy peanut butter noodles, using a recipe loosely adapted from a Deborah Madison recipe. The secret ingredient? Magic Rooster sauce! My college roommates and I first discovered Tuong Ot Sriracha sauce — the one with the rooster on the bottle — at Apsara, home of the best Thai and Cambodian food in Providence. (Apsara is also dirt cheap; if you happen to be passing through Rhode Island, stop by for some nime chow.) Magic Rooster sauce was a delicious complement to our orders of noodles, and it’s still probably the best hot sauce for cooking with that I’ve found. Sure, a bad batch of sauce produced potentially explosive gas, but it adds some heat and a pleasant taste anywhere you might want to use cayenne. (For spicy faux-Thai peanut butter sauce, thin a scant cup peanut butter with a quarter cup of soy sauce, a quarter cup of water, and a half cup of rice vinegar. Stir in three cloves finely chopped garlic and a scant tablespoon each of sesame oil and peanut oil. Add a dollop of honey and Tuong Ot Sriracha to taste. Heat and stir until blended. Adjust the proportions if the sauce is too thin or thick — I don’t measure when I make the sauce, but it should be a fairly thick, gloppy liquid. Mix with rice noodles and garnish with finely chopped cilantro, scallions, or crushed peanuts if you have them on hand. It also makes a fine topping for fried eggplant.)
If you have been reading my book reviews, you may have noticed that I haven’t been keeping up. It’s not that I haven’t been reading (Spenser books are like popcorn, and I went on something of a spree); it’s that I’ve been working on a project for Greg, who is on his way to Reno, Nevada. Reno is the home of marriages a-plenty, the National Bowling Stadium, the Mackay School of Mines, and, in at least one bad movie, Clint Eastwood. Soon it will be the home of Greg and the 6 Car. Best of luck, Greg!
I live right outside a major tourist destination, but like most people who live near major tourist destinations, I rarely do tourist things — except when tourists are in town. My college roommate has been contemplating a move to DC, and he’s making an exploratory visit, which meant yesterday was spent wandering around the Mall and being reminded of just how nice tourist interactions can be. A man in a Sikh’s turban managed to communicate, using hand gestures and about three words of English, that he wanted me to use his camera to take a picture of him. In the National Gallery, an older man with a heavy Italian accent stopped me. "Excuse me, what does it mean, this word?" he asked, pointing to the "American Naive Art" sign. "It means, well, untrained," I responded. He turned to look at one of the paintings. "Oh, yes! That baby," he said, pointing at a particularly unrealistic depiction of an infant, "he is 30 years old!" Then he broke out into a big grin, and I did too — two tourists enjoying a joke.
Ayn Rand’s Atlas Shrugged, one of the cornerstones of Objectivism, would by most any standards be considered a book of science fiction. Robert Heinlein’s The Moon is a Harsh Mistress is, among other things, a case study of a libertarian society. Jerry Pournelle claims credit, along with frequent collaborator Larry Niven, for much of the initial push for Reagan’s Strategic Defense Initiative. So is science fiction a literature of the political right?
Things have been pretty hectic at work, and V. has been running around like a crazy woman trying to get ready for the big academic conference she’ll be at next week. But the big reason that I haven’t posted is that I already have most of a hefty post written. Leaving work unfinished is a grand old tradition. It happened to 19th C. authors Thomas Love Peacock and Charles Dickens. (Dickens famously died before revealing who did kill Edwin Drood, sparking a hundred years of debate and leaving Edgar Allan Poe the title of inventor of the modern mystery story.) It happened to Schubert. Skyscrapers stand unfinished on the Pacific Rim, waiting for the next economic boom; the Cathedral of St. John stands unfinished in New York, waiting for something else. Often it seems like artistic performance anxiety prevents one from finishing the deed — see, for instance, Ralph Ellison’s Juneteenth, a gargantuan manuscript lost once to fire and entirely reassembled, added to in dribs and drabs for thirty years and never, ever handed off by Ellison. I feel for the editor who had to make something of that manuscript, and I feel for Ellison. My upcoming post won’t be that long, never fear — it’s not even as long as my brief history of comics — but it’s just not right. While you’re waiting, why not sit down, crack open a beer, and read a short story? I’ll be done any minute now.
When I lived in Berkeley, there were at least five revival houses in the Bay Area. I went to a lot of movies — probably an average of two a week by the end. Part of that is because I was a volunteer at the Fine Arts, and usually managed to slip in to see at least one movie during my popcorn-counter shift there. But part of it is that I had a number of friends whose opinions I could rely upon — I knew where their tastes overlapped with mine, and I knew when I could trust them when they told me that I had to, had to, see a movie. Which is good, because most newspaper film critics suck. One who doesn’t, I’m pleased to discover, is Jonathan Rosenbaum.
My senior year of college, I worked at the library. Specifically, I worked for the acquisitions department. I helped them catalog (and cart to the sub-basement) their special collections; I sifted through boxes of donations that hadn’t seen the light of day in twenty years; I leafed through the collection of the late Austin Warren looking for marginalia and spent a lunch break walking through the hundreds of joke books a pair of brothers had donated (before sitting down to read The Wit and Wisdom of Spiro Agnew). My boss was a charming woman who had begun her career at the library working in the bindery. She told me that when workplace stress got to be too much, she still liked to pull out a book and start sewing on a new cover.
Those with no monetary need to rein in their flights of fancy do all kinds of entertaining things. Round-the-world ballooning has practically been done to death by moguls. 19th century rifle heiress Sarah Winchester built her house into a 160-room maze, the better to protect her from ghosts. The Cone Sisters collected art in the Teens and Twenties, just like British adman and Thatcher confidante Charles Saatchi does today. (Determining the relative merits of their collections is left as an exercise for the reader.) Museum-building has long been popular, and philanthropy is a perennial winner, from the Ford and Carnegie Foundations to today’s Bill and Melinda Gates Foundation or Soros Foundations. But more quixotic efforts are ever so much more interesting! Martin Gardner’s Fads & Fallacies in the Name of Science cites the case of financier Roger Babson, who devoted some of his fortune to the Gravity Research Foundation, dedicated to inventing an anti-gravity device. On a more earthbound note, the Clay Mathematics Institute was founded by money manager Landon Clay; "dedicated to increasing and disseminating mathematic knowledge," it offers million-dollar prizes for solutions to some of mathematics’ most important unsolved problems, including a final answer to the question of whether P = NP and a proof or disproof of the Riemann Hypothesis. And then there is the strange case of Marc Sanders, an amateur philosopher who has spent thousands of dollars to have experts on metaphysics review his manuscript. Godspeed, Mr. Sanders — debating the necessity of the divine is a hell of a lot more interesting than taking a balloon ride.
Friday night we went to see Will Oldham play at an art gallery in Baltimore. It was sold out, sadly, so we hung around outside the door to listen to a song or two while Amy practiced her cigarette-rolling skills. Oldham lives in Baltimore now, so I should get another chance to see him. From everything I’ve heard (and the one show I’ve seen, opening for Québécois art-school favorites Godspeed You Black Emperor!), Oldham’s not much of a live performer, cultivating a chilly distance from the crowd, but there’s always the lure of hearing a new song (or one from his extensive back catalog that I’m not familiar with).
Carrie Nation stood six feet tall and weighed 180 pounds. She was not a woman in whose way you wished to stand as she brandished her hatchet and tore up a bar. The divinely-inspired Nation and other temperance workers — most famously the Women’s Christian Temperance Union — were a powerful political force in turn-of-the-century America. The WCTU was associated with the Progressives, the coalition of good-government reformers, trust-busters, Christian reformers, and suffragettes that made huge strides in the Midwest and Northeast and produced such notable politicians as Teddy Roosevelt and Battlin’ Bob LaFollette. The WCTU’s agitation for a dry America led to the 18th Amendment, which in turn led to the growth of organized crime and ethnic machine politics in America’s large cities. The temperance movement was an example of the law of unintended consequences, but I suspect that even Carrie Nation wouldn’t disagree with me when I say this: demon rum and keyboards do not mix. I beg your indulgence, dear reader; I’ll be picking up a new keyboard at the Apple Store and be regularly posting again Real Soon Now.
"Chairman" Bruce Sterling, science fiction writer and Texan, has been on something of a crusade lately. He is attempting — in a half-serious way — to build an artistic movement from scratch; the Viridian Design movement will have as its central theme the pressing aesthetic issue of…carbon emissions. His notion, as expressed in this speech and on his many, many Viridian Notes, is that by creating better, fancier, more attractive, cooler tchotchkes that just happen to fight greenhouse emissions, one needn’t convince the Western world to do the environmental equivalent of eating one’s vegetables. The rabid forces of consumerism, honed over the past fifty years, will do the job for you. If it suddenly becomes sexy to drive a well-designed electric car rather than a lumpish SUV, you don’t need to make the environmental case. I have lingering doubts about whether Sterling is willing and able to start guruing some designers into getting products to market, but now he’s started talking to furniture designers.
The thing that is so great about literature of the paranoid — the Time out of Joints and Crying of Lot 49s of the world — is the way that things can assume a sense of weight and meaning totally out of character with their surface value. A misspelled postage cancellation or scrap of Life magazine showing Marilyn Monroe can become something of world-shattering relevance. Paul DiFilippo‘s best story, I think, concerns communications through subway sign and Police song, and you can’t get less important than that. On the way home, I was overcome by a sense of portent (aided, no doubt, by the tape of the Clint Mansell soundtrack to Requiem for a Dream on my car stereo). There was a train halted on an overpass, and the word "Uniglory" stood out as if spotlit. I have no idea if Uniglory is this Asian shipping company, but the railcar seemed to bear a secretive and slightly ominous message. As I continued on my way, the Kronos Quartet faded out. The red Jag behind me passed me, only to pull all the way off the road; its driver sat there, head tipped back and looking for all the world like he was waiting for an impromptu meeting or for the bomb in his trunk to detonate. Signs and portents are everywhere. I turned off the highway and headed home, thinking about how much I like music that makes me feel like I’m in a movie.
Do your children know that God has put us here to do a job? "That job is to tell others about Jesus and how he died for our sins." The Supreme Court recently ruled that a school in upstate New York can’t prevent the Good News Club, a Bible group for children aged 6 to 12, from holding after-school meetings. Despite the fact that it puts me on the same side as George Will, I agree with this decision.
(To be read in a tone of breathless exuberance.) So I told a joke on a bulletin board I frequent. And for some reason known only to him, Anil Dash put it on his weblog. And Jessamyn — who recently mentioned an athenaeum in Vermont that I shall have to visit someday — thought that she was the only one who knew that joke, and she emailed me. But I stole it off a whiteboard at work and not from Jessamyn, so we’re copacetic. And we traded awful jokes! And because the Eel, beloved host of snarkout, likes truly awful jokes, here it is, my present to him.
It’s now June, and clearly November will never die. There was something very, very odd about hearing Abigail Thernstrom sparring with Mary Frances Berry over the Civil Rights Commission’s report on Florida. I’m no fan of Berry — whose botched handling of the KPFA situation a few years ago pretty much solidified my opinion of her as a vindictive, tin-eared authoritarian — but Thernstrom is one of the principal academic opponents of the current interpretation of the Voting Rights Act of 1965. I’m not a social scientist, thank goodness, or terribly well-informed about the world of sociology, but I get the impression that her work is taken fairly seriously, this scathing article notwithstanding. (Charles Murray, author of The Bell Curve, gets hit much harder, as his reductionist argument deserves.)
Last Thursday, we went to the Black Cat to see Smog play. Smog’s "Be Hit" is perhaps #2 on the list of songs that invariably get stuck in my head at work and yet are wholly inappropriate to sing, even under one’s breath. (Also on the list: Quintaine Americana’s "And They Were Drinking…" and Shellac’s "Prayer to God".) That song is off the utterly brilliant Wild Love — the only Smog album I owned — and as Smog doesn’t have a tuba player with them any more, I was doomed to not recognize a single song from their (quite good) set. Singer/songwriter Bill Callahan sure does make funny faces when he sings. After the show, I picked up Red Apple Falls, and I’ve found it disappointing so far — it lacks the manic calliope vibe the best songs off Wild Love had, and it (mostly) lacks the country-fried energy Smog generated live, so it’s just Bill singing minor-key songs about minor-key lives. Maybe it will grow on me; a Pitchfork Media review has me longing to buy Dongs of Sevotion. (I feel compelled to mention Callahan’s ex-girlfriend’s band, the excellent and thematically-somewhat-similar Cat Power; Smog’s Knock Knock is apparently about the decline and fall of their relationship. Happy music is for suckers.)
I’ve been thinking a lot about the meaning of "art" and "literature" over the past week or so, prompted by a post over at Hobbsblog and the article it linked to, a rather interesting summary of the academy’s view towards J.R.R. Tolkien. (I’ll get my biases on the table right now: I don’t think Tolkien’s books are stupendous, although I enjoyed them a lot when I was younger; I’m unabashedly pro-English-professors; and I read a fair amount of science fiction.) There are three separate questions, and I think they each need to be addressed separately. Is Lord of the Rings an enjoyable read? Is it of literary merit? Is it a work worth studying?
This is fascinating — a writer for the Iranian Times is complaining about the praise Iranian film gets from Western critics: "I find the uncritical attention given to Iranian cinema by the Western press patronizing and the adoration showered upon it by Iranians living outside of Iran uncritically patriotic." This when Hollywood is cranking out movies so bad that reviewers must be invented to provide quotes for some movies — as if Jeff Craig of "60 Second Preview" wasn’t good enough. A backlash to Iranian film I could understand, but a backlash to praise of Iranian film?
George F. Will is not my favorite pundit. Not even a little. He goes into fits of indignation over the calumny of liberals at the least provocation, he plays loose with facts, and his arguments often fall apart when you give them a hard look (witness, for example, this column in defense of the S.A.T., in which Will waxes indignant over those vicious egalitarians trying to wipe out the supposed meritocracy of college admissions by eliminating standardized tests, leaping from the noble origins — skipping that nasty eugenicist bit — of the test to the present day while failing to provide a single piece of evidence that the S.A.T., fifty years later, is still a tool in the service of merit). I don’t know that the statement quoted in this Slate piece, about the McVeigh case and the FBI’s belated discovery of withheld evidence, was from one of his columns instead of Sunday afternoon blather, but if it was, the editor should be ashamed of herself. Will is quoted as saying "He’s guilty. He’s not remorseful about having killed 20 more people than were killed in the Gulf War." McVeigh is guilty, and McVeigh isn’t remorseful, but he sure didn’t kill 20 more people than were killed in the Gulf War. I’m not a pundit. I don’t get paid to think all day, bang out a smug column twice a week, and make an occasional talking-head TV appearance. But a five-minute Google search confirmed that even the lowest estimates say that over two thousand Iraqis were killed. Perhaps George thinks that those who never saw DiMaggio hit one over the fence aren’t people.
I don’t know what it is that makes me such a sucker for old things. Although it’s not my bag, I understand the impulse that leads people to collect old soda bottles or pictures of baseball players from their grandparents’ day — or old soda bottles with pictures of said ballplayers. One of the reasons I like Providence so much is that you can wander around Federal Hill or College Hill, surrounded by the buildings worn down by hundreds of years of mildly seedy history. It’s tangible. The obscenely talented James Lileks devotes his attention to the Midwest of his youth and glimpses of the faded past. I have no desire to ever go to Disneyland, but I could read Yesterland for hours. Old girlie calendars, old country singers, old buildings, old pulp novels, all the cultural detritus of the twentieth century holds more attraction for me than the recognized artifacts of real historical importance. There’s something touching about ephemera that’s passed its appointed hour. But the air of importance we attach to anything a few years old, anything which speaks of our connection to an exclusive cognoscenti who know just which blues guitarists or starlets or brands of bubble gum hold cachet through being nigh-forgotten is ridiculous, too. Which is why The New York that Really Never Was is such a brilliant sendup of the kind of inside skinny, the appreciation for history of the absurd, dished out by people like me. Someday I’ll go to New York and I’ll find the hidden speakeasy, preserved through neglect, that contains those "golden coffers…filled with remnants of departed partygoers as an even more morbid momento mori."
Memorial Day isn’t just about cookouts and parades, of course. It’s also about remembering those who have given their lives for this country through the form that suits this remembrance best: bombastic war movies. In the current wave of greatest-generation swooning, we’ve decided to bring back the jolly war-time propaganda movie. Even Saving Private Ryan decided to stop its visceral display of the horrors of war — even the last good war — in favor of a ridiculous morality tale in which the moral is "Just shoot the prisoner." It’s a wave of sentiment about the WWII-era sacrifice this nation went through without any underlying understanding of the myriad downsides of that period. My grandfather was a navigator during World War II; he didn’t tell war stories. He hid his Purple Heart in the basement; we only found it after his death. I don’t know if he was embittered about the war or reluctant to dredge up memories or just intent on proceeding with his life. But on this Memorial Day, I’d like to salute his memory. I worry that my grandfather’s reluctance to talk about the war is being replaced by second-generation revisionism, that our collective memory of the war is going to dwindle to rah-rah jingoism and a sort of Hogan’s Heroes notion of a relatively clean war. Great aerial combat sequences and the lovely Kate Beckinsale notwithstanding, that’s just not right. Let’s leave Michael Bay’s vision of the war, told "with such high zest / To children ardent for some desperate glory," for another weekend next time.
A long weekend for me is a time for much drinking — not just alcohol, either. I’m headed to the Southwest Baltimore Sowebo Festival tomorrow, and if it’s not raining, I plan on drinking plenty of delicious lemonade. There’s nothing better at a hot street fair than something cold to drink; I’m not a big soda drinker, having weaned myself off the ridiculous amount of Mountain Dew I drank as a college student, but I like a Dr Pepper now and then. Poker night regulars (and sons of the Palmetto State) Greg and Chas have tipped me off to a regional delicacy — Blenheim’s Ginger Beer. (It’s hotter than Reed’s Original, although I haven’t taste-tested it against Reed’s Extra; ginger beer, mixed with vodka and lime juice, makes a Moscow Mule, one of my favorite drinks.) And Jimmy at work informed me of Virginia’s Dominion Root Beer, which is just excellent: not harsh at all, but mellow and flavorful. V. and I had root beer floats tonight over a round of cards. Soda fountain classics and talking trash over cardplaying are a fine way to spend a rainy weekend night.
So the Bush Administration energy plan is out, with its accompanying war of spin. (Given the willingness of political reporters to regurgitate spin, the press battle may largely hinge on which side can press their case more effectively, rather than on a reading and review by the nation’s reporters — not that I blame them; I’ve only skimmed the thing myself, because it’s enormous.) I think looking at nuclear power is probably a good thing, but the rest of the plan’s particulars don’t thrill me. There are a number of real, technological solutions to help conserve energy, and the government knows it, even if Dick "Conservation may be a personal virtue" Cheney doesn’t.
It’s been a busy couple of days here at the Kozy Shack; it seems like everyone I know is moving to a different apartment (or a different city). After hauling furniture into a U-Haul on Friday, we were off to a wedding on Saturday morning; after the ceremony, we ferried the minister to the reception (and had a fascinating conversation with her in the car and waiting in the restaurant, about everything from the performative aspects of reading Scripture to Gullah and the use of Quechua in Star Wars). Our friends Greg and Colleen are moving, but managed to get everything into their new apartment by the time we made it there. So we had no commitments on this rainy day — we can just relax, drink a beer or two, listen to the Maytals on the stereo, clean the kitchen, do some laundry, read a DeLillo book, start making some mushroom sauce… Rainy weekends in spring are beautiful for their possibilities.
Like most Americans, I’m not very familiar with the Algerian war for independence (fought against the French between 1954 and 1962). What familiarity I do have is through a cursory knowledge of the life of Albert Camus (who was attacked by some French leftists for not coming out definitively in favor of the Algerians) and through a remarkable movie, The Battle of Algiers.
I just spent a few hours helping my friend Amanda load furniture into her U-Haul. I desperately need a shower, but first I need a cool, refreshing drink. I just won some Pocari Sweat, a Japanese drink akin to Gatorade, in a contest over at Metafilter run by all-around good guy Jeremy Bushnell. Unfortunately (or fortunately, because I have no idea how the stuff tastes), it’s not here yet, so water will have to do. (Check out the contest; there’s some weird stuff out there on the web.)
What sort of child wants to read Dickens rather than play SimCity? A Washington Post article posits that
Recently, The Atlantic had a little back-and-forth between one of its book critics and Barbara Ehrenreich. It’s worth reading — as is the MetaFilter discussion which led me to it — but Ehrenreich, who worked three different pink-collar "invisible" jobs (as a waitress, a house cleaner, and a Wal-Mart employee) and attempted to live off the income, tosses off an aside:
Followups on a couple of previous stories — those who managed to digest my long and rambling screed about comic books, or just those of you who are interested in comics as an art form, should read the Onion AV Club interview with self-deprecating illustrator-slash-genius Chris Ware. Further, mention of Michael Chabon on Memepool served as a reminder that, celebrated author though he may be, Mr. Chabon is a superhero-lovin’ dork. (And the always-amazing James Lileks is right; this is the worst Spirit ever.) Also, via rc3.org, you can drop in on this conversation at The Well about "zines and blogs." Finally, the fine folks at Invisible City do regular zine reviews; it’s good to know that the medium ain’t dead.
Ooh! Ooh! A Salon article on guides to being girlie mentioned in passing that Lynn of Mystery Date: One Gal’s Guide to Good Stuff, one of the dandiest zines I remember from the mid-Nineties (although it paled, like everything else, before the awesome juggernaut of weird trivia that was Johnny Marr‘s Murder Can Be Fun), has a book coming out next year! Lynn knows everything, and I’m sure her book will be just dandy. But what happened to all the other books we were going to see from those zinesters of yesteryear? Did the Web suck up all the publishing dollars that were going to go to people like Al Hoff and Johnny Marr?
I came across a reference to the McLibel 2 in Fast Food Nation and I’m glad to see that, three years after I wrote a piece on them for a college magazine, they’re still fighting. The McLibel 2, for those not up on British court cases, were two members of London Greenpeace who distributed a nasty little pamphlet outside a British McDonald’s, accusing the company of all sorts of bad labor, environmental, and health practices. McDonald’s took advantage of Britain’s strict libel laws and sued the individuals distributing the pamphlet for libel. Unfortunately for the company, two of the people they sued just wouldn’t cave; they had gotten their jaws into McDonald’s leg, they wouldn’t let go, and the case just kept going on and on in a public relations nightmare for Mickey D’s. In British libel law, the burden of proof is on the defendant, and a number of statements were found to be libelous. But a number of statements in the pamphlet were found to be true, and information the two (untrained and underfunded) defendants wormed out of documentary evidence and cross-examination was just amazing — everything from details about McDonald’s techniques in dealing with union organizations to the fact that McDonald’s had sent corporate spies (with help from the police) to infiltrate London Greenpeace. Apparently, there’s a book on the original trial (which lasted from 1994 to 1996) — I think I’ll see if the library can dig up a copy for me.
Salon recently ran a story on Amy Kapczynski, a Yale Law School student who was instrumental in pushing Yale into committing itself to less expensive access to AIDS drugs for Africans. Kapczynski and Doctors Without Borders started the ball rolling, but I think the key moment was a New York Times interview given by Prof. William Prusoff, the author of the work that resulted in Yale having a patent on the AIDS drug d4T. (Prusoff and Yale have, of course, profited immensely from the current patent on d4T, but Prusoff’s support for making generic versions available in the Third World was certainly candid. As far as I’m concerned, it was morally right. It was also, I assume, immensely embarrassing to Yale.) What made the story noteworthy, though, was some amazingly wrongheaded comments by a policy analyst.
What a dandy birthday weekend that was! Saturday was a birthday party cookout (involving my first purchase of meat in many, many years, as I picked up two packs of Hebrew National hot dogs for my meat-eating guests). Much beer-drinking occurred as we enjoyed the lovely day. Hart and Chas took turns on grill duty; I ate mushroom caps and veggie burgers and potato salad and crackers and avocado cheese dip and potato chips and corn chips and salsa. A random homeless guy swung by and was given a hot dog. Books and the occupation of teaching and non-profits and computers and housing in Boston and divinity school were all yammered about. As the evening lengthened, the hardcore among us — Andrew and Greg and Amy (celebrating her new job as an assistant to legal counsel in the cough adult entertainment industry) — settled in for more beer drinking and penny poker. Sunday was sleeping in and heading downtown to buy myself a present and walking around in the bright sunlight and eating mozzarella sticks while I read my new book, then coming home to watch The Limey. Thanks to all my friends for making the weekend so lovely. Special thanks to Fey, who wrote a poem to celebrate:
Whee, it’s my birthday! Before I head off to see Superchunk, let’s pause a moment in appreciation of the author of “Happy Birthday.” (Not the Weird Al version, or the misbegotten knockoffs they trot out at Chili’s and the like. The normal one.) The Hill sisters got the royalties, but credit is due to a largely-forgotten scofflaw, Mr. Robert Coleman. Wherever you are, Mr. Coleman, I thank you. America thanks you. And the descendants of the Hills thank you, as they light their cigars with twenties.
As my birthday looms ever-closer, gifts are starting to trickle in. My parents gave me the DVD of Carl Dreyer’s The Passion of Joan of Arc. Thanks, Mom and Dad! This is an amazing movie, probably the most contemporary silent movie I’ve seen. Falconetti gives one of the most raw and riveting performances I’ve ever seen; Pauline Kael and Roger Ebert don’t sing her praises enough. But it’s a minor miracle that I got to see this movie. The original negative was thought lost in a fire; Dreyer composed a second version, editing alternate takes, and that was thought lost as well. For decades, the only way to see Passion was through a reconstruction. In 1981, in a supply closet at a Norwegian mental institution, a seemingly-perfect print of the first version was found. That’s the DVD Criterion sells; that’s what I watched.
The phrase "he wouldn’t hurt a fly" has always struck me as somewhat odd, perhaps as a lingering result of my early exposure to the Disney version of the Grimm story of the brave little tailor. Who wouldn’t kill a fly? The tailor killed seven with one stroke! (The tailor is a subset of the more general Jack the Giant-Killer folktale; as I am not Vladimir Propp, I’ll leave it at that, with the sole caveat that the tailor is the only giant-killer I know of who started his career as an exterminator.) Maybe Jainists or Buddhists wouldn’t hurt a fly. Maybe a strict vegan wouldn’t. But even though I’m a pretty easy-going guy, I know I would. You know what else? I could poison a roach.
Yesterday was the 226th anniversary of the midnight ride of Paul Revere, one of those classic "did you know that…" events. Although Revere was the one celebrated in verse, he never finished the ride; Dr. Samuel Prescott — not to be confused with Colonel William Prescott, to whom the Battle of Bunker Hill order "don’t shoot until you see the whites of their eyes" is attributed — was the only one of the three riders to actually reach Concord. Why Longfellow chose Revere as the subject for his poem is a mystery to me, but Prescott just doesn’t seem to be a popular subject for poets; a Dawes-leaning parody version of Longfellow’s poem was written in the late 1800s, and it doesn’t mention Prescott either. Maybe it’s the lack of satisfying rhymes. In any case, the "one if by land, two if by sea" message was delivered, the riders warned the militia that British soldiers were coming, and the spectacularly improbable American Revolution was underway. Nonetheless, one can’t help but wonder what would have happened if such a message had to be transmitted a few years later, after Claude Chappe invented semaphore and the French built a network of towers to distribute information at the unheard-of speed of seventy miles a minute — "Listen my children, and you shall hear / Of the midnight telegraph transmission of dozens of tower semaphore operators" just doesn’t sing!
What hath Willis Haviland Carrier wrought? Carrier, the inventor of the air conditioner and the founder of the air conditioning company that bears his name, has my vote for the American with the highest influence-to-name-recognition ratio. Without the air conditioner, there would be no modern Sun Belt. Some attribute the Sun Belt’s growth to immigration from other countries or business-friendly attitudes, but I think it has to do with blasts of cool, cool air conditioning. The fact that Clinton Administration energy standards for air conditioning will be relaxed isn’t that surprising; the fact that the energy saved by the stricter 30%-savings standard would, "over 30 years [equal] the annual electricity use by all American households today" is. That’s not from a 30% savings versus no savings, mind you; that’s from 30% versus 20%. Contemplate the sheer volume of air that represents, if you will. In not-at-all-unrelated news, recent studies have provided further evidence of global warming and greenhouse gases’ role in the phenomenon. Given George W.’s support for (and from) the greenhouse-gas-emitting coal industry, I have a feeling there are some long, hot summers ahead; those clever engineers better get residential fuel cells developed soon — I don’t know about you, but I don’t want to spend my sweltering dystopian future without the ability to crank up the A/C.
Last week, there was a terrible incident at a soccer match in South Africa; over three dozen people were killed by a stampeding crowd as an estimated 60,000 fans charged into a stadium that was already full to capacity. This is a tragedy — 43 people are dead, because security guards took bribes to let in fans to a heavily anticipated game or because somehow too many tickets were sold or because crowd control was minimal or because these things just happen sometimes. I can only hope that some good comes out of this, reformed fire codes, whites and blacks reaching out to each other in a time of mutual grief (some of the footage I saw on CNN was of two fans, a white woman and a black woman, holding each other and sobbing), something that will help those people’s loved ones rationalize what happened.
I went to a Caps game this afternoon. Not much of note during the game — Lemieux scored, Bondra scored, the Caps lost — but before the game there was a well-produced little Mission Impossible parody running on the Jumbotron. "Your mission, should you choose to accept it…" it ended. And then words flashed on the screen: "Eliminate Pittsburg."
To add a brief note before I stagger off to bed (suffering from an overdose of 5k), a world that allows the Mekons’ brilliant album Rock ‘n’ Roll to go out of print could, in the immortal words of the Dead Milkmen, use some fixin’. For easily-digitized media like books and records, the day when the term "out of print" is meaningless will be here soon.
The athenaeum is a private club which is also a library. As an idea and an institution, it is largely defunct — I am familiar with it due to the presence in Providence of a still-extant example. Before the public library’s arrival in the United States, almost all libraries were private, and most were quite small. John Harvard’s library, bequeathed to the grateful university that now bears his name, was a scant 400 volumes. The athenaeum (besides sounding like a delightful place to hang out, combining the retrograde, cigar-smoking, brandy-swilling pleasures of the Drones with the opportunity to settle down with a really good book) was an early, privatized attempt at making large libraries available to the less-than-spectacularly rich. Notorious libertine and rabble-rouser Benjamin Franklin’s library was a cross between the private and public models; non-members were allowed to borrow if they bonded themselves by leaving a sum of money to be returned when the book was. Copyright libraries tend not to allow borrowing; university library policies differ, but use of the library is often restricted to the community of scholars that the university serves.
Hey, it’s that time of year again. No, I’m not talking about spring. Nope, not baseball season or the annual thrashing the Penguins give the Capitals in the playoffs. Not even the Cadbury Bunny’s annual leadup to Easter. No, it’s time for Project Censored‘s underreported stories list to break. I’m always of two minds about Project Censored; it’s a noble goal, and often the stories really are important, but none of these stories are "censored" as such, and misusing the word weakens the case. "Project Ignored by Major Media, as Predicted by the Chomskyian Model of News Control" doesn’t quite do it, I suppose. And except for story #4 (in which it is alleged that America bombed the Chinese embassy in Belgrade with malice aforethought) all of these stories were either familiar to me or of somewhat dubious importance. OSHA’s failure to protect blue-collar workers and the use of H1-B visas are hardly breaking news — the rhetoric on Project Censored’s recap of the H1-B story is embarrassingly overheated. Story #3, CNN’s employment of Army psychological operations officers, was covered in TV Guide, for crying out loud!
Those who made it through my lengthy comics post below may be interested in this newspaper article on the multi-talented Chris Ware, whose retro-y, design-forward comics are some of the best being done now. The comics-criticism zine The Imp did an issue about Ware; I think it’s an important read for anyone who, not content to just be awed by Ware’s craftsmanship, wants to think about his narratives in a more abstract way. Ware does good stuff. Check him out. (Times link stolen from Dan Hartung’s Lake Effect.)
With the possible exception of boy-band songs and Saturday morning cartoons, no American art form has been so firmly relegated to the kiddy table of the cultural dining room as the comic book. This was not always the case — in the early days of the medium, comics (and not just Tijuana bibles) were read by adults. I’ve read an account of a West Indian sociologist being greatly surprised at the level of readership among blue-collar Americans, only to get sucked in and start reading them himself. (If anyone can point me to a source — or even a name — to attach to this anecdote, I’d be grateful; drop me a line.) But it shouldn’t have been that surprising — in the ‘30s and ‘40s, comics may have been garishly drawn and poorly written, but they were anything but monolithic.
In further baseball news, A’s leadoff man Johnny Damon will commute to the Oakland Coliseum via skateboard. In a slightly expanded (but still quite tiny) version of this story at the Sporting News (sorry, no permalink), it was revealed that Damon will be riding BART to Oakland from his house in Pleasanton.
Surprise! I couldn’t stop thinking yesterday. April fool!
Things that sound like fun possibilities for this afternoon: Going to see The Damned, sitting around drinking coffee and reading a Julian Barnes novel, twiddling around with my website, trying out a few Perl tricks and plotting out my forthcoming "literary year" project.
V. jokingly suggested that I make my next entry all about Caliban. "You’ll only have to buy one new letter!" she said. Little did she know.
Like most people, I’m appalled by the Taliban‘s destruction of priceless Buddhist statuary. But as reprehensible as the Taliban’s willful destruction of Afghan history is, I’m sorry that this is what it took to bring the regime to the attention of the general public. The Taliban-led Afghan government is reportedly among the worst violators of human rights in the world. The State Department report has such cheerful factoids as: “Prison conditions are poor. Prisoners held by some factions are not given food, as normally this is the responsibility of prisoners’ relatives, who are allowed to visit to provide them with food once or twice a week” and “There were reports that Hindus are now required to wear a piece of yellow cloth attached to their clothing to identify their religious identity; Sikhs reportedly were required to wear some form of identification as well.”
Holy cats! Some kind soul willing to suck up the dizzying losses caused by printing an obscure writer’s obscure books has taken it upon him- or herself to bring R. A. Lafferty back into print. If you haven’t heard of Lafferty, do yourself a favor: read one of the paeans (1 | 2) to him out on the web or read the pure product itself, as represented by his story "Nine Hundred Grandmothers". If you didn’t find those at all interesting, thanks for trying them solely on my say-so, and please check back on Sunday for my next thrilling installment.
I was doing well at penny poker night, until V., perennially scornful of the game, showed up like a god of gamblers. "I don’t like poker," she said. "Only losers play poker." And with that, she drew for another full house, her third of the evening. Well, yes, we’re all losers now.
I spent some happy hours at work cranking out ColdFusion while listening to WFMU, freeform radio out of Orange, NJ, thanks to the sound-driver-twiddling skills of Brian in the cube next to mine. (I doubt that he’ll ever read this page, but if so, thanks, Brian!) Listening to Terre T‘s shift as DJ, I heard not only a song by (Neko Case side project) the New Pornographers but also “Identity” by poppy brat-punk pioneers X-Ray Spex. Coincidentally, last weekend I went on a drag-me-out-of-the-store-before-I-spend-more-money shopping spree at Soundgarden in Baltimore and finally replaced my copy of X-Ray Spex’s "Germfree Adolescents", which I’ve been lacking since someone walked off with about 20 CDs of mine in college. I don’t know how I’ve done without it all these years; it’s so damn exuberant and full of life that it makes being a teenager again sound almost fun. Poly Styrene makes those days of coding fly right by!
Well, spring has officially arrived. Bring on the accoutrements: cherry blossoms, baby peas, the Sweet Sixteen, Cadbury Eggs, campaign-finance reform struggles in Congress. The farmer’s market should be opening soon, which is absolutely delightful; cabbage, root vegetables, eggplant, and squash are all lovely things, but I’m craving some variety in my fresh produce, thank you very much. Soon there will be all sorts of lovely vegetables to experiment with.
The ongoing struggle between pharmaceutical companies and activists concerned with the spread of AIDS in the Third World entered its next stage, as the World Health Organization and the World Trade Organization stepped in to try to hammer out some sort of deal. As leftist economist Dean Baker (of Economics Reporting Review fame) notes, "On previous occasions, the drug industry has made offers of low cost AIDS drugs that turned out to have very limited effect, since they applied stringent conditions on the countries receiving the drugs."
I must confess to many, many bad habits, gentle reader. Along with biting my nails, humming the occasional tuneless little ditty in elevators, and chronically allowing the deadline for personal projects to slip past (the original mockups for snarkout.org were done in 1998, when CSS was but a meaningless acronym to me), I am a compulsive classifier. The world can be divided neatly into halves any number of wonderful ways: the Blue-vs.-Red political map Salon loves so; people who read for pleasure and those who would be caught dead before doing any such thing; cat lovers and dog lovers; pie people and their nemeses, cake people. Playing this game with food classifications makes a dandy little icebreaker, in fact, as it’s one of those things — like the Oscars or crimes of the century — that almost everyone has an opinion on. Determining what people called a carbonated soft drink in their part of the country (soda, pop, soda pop, and, strangely enough, Coke) was one of the ways people in my freshman dorm started conversations, which is probably to be expected, as I went to school in a state where milkshakes are called "cabinets."
"The Library is unlimited and cyclical. If an eternal traveler were to cross it in any direction, after centuries he would see that the same volumes were repeated in the same disorder (which, thus repeated, would be an order: the Order). My solitude is gladdened by this elegant hope." — Jorge Luis Borges, "The Library of Babel"