
Monday, July 18, 2011

Oh, Gee - Ostalgie

More a spelling error than a portmanteau, Ostalgie comes from the German word Nostalgie (nostalgia), cleverly missing the N to begin with Ost, the German word for East. Ostalgie thus refers to a nostalgic attitude towards former East Germany.

As with many formerly Soviet-led countries (including Russia), people yearn for the perceived ease of life under communist rule. State-owned industry meant that everyone could have a job and everyone could have food (when the country wasn't stricken with famine, anyway). In times of high unemployment and with the vestiges of bloc architecture slowly fading from East German cities, Ostalgie has developed into a defining characteristic of East German culture.


The Road to German Reunification

After World War 2, Germany's borders not only shrank, but the Allies split the nation into occupied zones - one each controlled by the UK, the USA, and the USSR (later the US and UK would split their zones and give one to France). Eventually in May of 1949 the western allies (UK, USA, France) unified their occupied zones into the Federal Republic of Germany (Bundesrepublik Deutschland in German). In response, the Soviet Union had its zone formalized as the German Democratic Republic (Deutsche Demokratische Republik). For 41 years the FRG and GDR existed as separate nations.

In August of 1989 Hungary (a member of the Soviet-led Warsaw Pact) opened its border with Austria (not a member of NATO, but pro-West). East German tourists then flocked to Hungary in September...to escape to the West via the opened border. Subsequently, East Germany decided to open its borders, resulting in the fall of the Berlin Wall on November 9th, 1989 and a flood of people to the west.

With free elections in March the following year, East Germany started on the rocky road to unification with West Germany. Despite resistance by many NATO members (most famously Margaret Thatcher), German diplomats eventually secured the reunification of the country. The final step was the formal institution of 5 new German states at midnight on October 3rd, 1990 (October 3rd is subsequently celebrated as reunification day instead of November 9th due to some unfortunate implications with that whole Nazi thing).


Who Loves a Trabant?

Ostalgie materializes in a love for the Ampelmännchen - the little traffic signal man. With a hat and powerful strut he adorned many of the Walk/Don't Walk signs in Eastern Germany (vintage signs can still occasionally be seen today). Due to his former ubiquity on every street corner (with a stop light, anyway), the Ampelmännchen has become the dominant symbol of Ostalgie today.

The Trabant from the title of this section is also a prime example of Ostalgie. By far the most prominent car in East Germany, the Trabant was designed and built solely to be a cheap, working man's car. A small two-stroke engine gave the car little power, but the flimsy Duroplast body and small frame kept it light enough to push 4 adults around at modest speeds. When the checkpoints to the West opened in late 1989, waves of Trabants streamed out of East Germany since few people owned any other brands. To the casual viewer a Trabant looks like a heap, but the Trabbi remains beloved for its simplicity and its part in history.

Many stores in East Germany mark certain goods with an Ostprodukt label, indicating they were manufactured in (former) East Germany. Due to import bans on much of what the West had to offer, local products reached a rather large consumer base in the East. The government demanded a non-alcoholic drink to serve the masses, so they had a chemical company whip something together. So East Germans drank Vita-Cola (a sort of citrus-cola mix) instead of Coca-Cola or Pepsi (the latter's products remain relatively uncommon in Europe, particularly in eastern Germany). Ostalgie all but revived Vita-Cola.


Old ladies reminisce about how great it was that everyone had work and how the trains ran on time. And in some ways these memories prove correct. In the GDR unemployment ran at nearly 0% thanks to a state-run economy that handed out work details. East Germany exported a large amount of industrial and engineering equipment. By the 1980s they had begun to dabble in computers (essentially the Soviet bloc's equivalent of tech-savvy 80s Japan...except much more expensive and not as successful).

But by the late 1980s the East German government was running a large deficit. In order to maintain the standard of living and import necessary raw materials for the industrial sector, East Germany began amassing large debts.


Black and Blue Tinted Glasses

Soviet-controlled East Germany was no picnic. Understandably upset at the loss of millions of Soviet citizens, the Soviet Union was not kind in its occupation of East Germany. After Germany surrendered to end World War Two (in Europe, anyway), the Soviet Union proceeded to take any heavy machinery that wasn't bolted down. And some that was. The GDR (East Germany) came out with a crippled economy, a puppet government and - most notoriously - a brutally repressive secret police.

East Germany had to contend with large amounts of unrest as the population suffered prolonged depression with their weakened industrial base. Party loyalty got you employment much faster than ability, so skilled technicians were often relegated to lower tier jobs. A severe brain drain further stunted the economy as the intelligentsia and youth attempted to flee to the West for greater personal freedom and the potential for a higher standard of living.

The GDR eventually attempted to stymie these developments by integrating East Germany into the economic interdependencies of the Soviet Eastern Bloc. As mentioned, the GDR became the focal point for the bloc's machinery and computer manufacturing. The GDR erected the Berlin Wall to prevent flight into the West (less famously, they also built a barbed wire fence along the entire East German-West German border). But the Stasi represented the GDR's efforts to curb unrest in much more brutal ways.

The Ministerium für Staatssicherheit (Ministry for State Security, colloquially abbreviated to the Stasi) was created for counterespionage and monitoring unrest amongst the population. Loyalty to the incumbent communist party was paramount and dissenters were brutally repressed. The Stasi had a presence in nearly every town in East Germany (Magdeburg even had a Stasi prison). The Stasi routinely held and interrogated citizens and kept them for prolonged periods in prison-like conditions. Making jokes about the government could keep you there indefinitely.


In My Day We Were Oppressed Only Once or Twice a Day!

For many people Ostalgie just means remembering the good parts of the past.

But for others Ostalgie remains not just nostalgia, but a genuine desire to return to that way of life. The east still has a lower standard of living and higher unemployment than the west, which breeds resentment. And as the older generation sees the youth grow up amid outrageous modern fads and Western culture, they yearn for days of simplicity and respect. Just as they often do in the USA and elsewhere (stress of imminent nuclear war? I don't know what you're talking about).

Every once in a while people yearn for the good ol' days. When behatted men helped you cross the street and your car was made out of plastic and plant fiber.



So, Good Bye, Lenin; hello...Merkel?

Wednesday, December 8, 2010

The Succulent Succession of Swine Side: A Brief History of Bacon

Although not the most popular animal in the US, the pig remains a "wonderful, magical animal" and has become a staple of the American diet (and internet memes - some more NSFW than others). It is, in fact, even more popular in Europe. Of course, if you happen to eat the pig it becomes pork - like transubstantiation. Pork just happens to encompass a wide swath of carnivore favorites, including ham, fatback, soki, pork chops, and - arguably most important of all - bacon. While consumption of pork in general hasn't changed much in the last 50 years, bacon has become more popular than ever.

Depending on how you define bacon, of course.


Defining Deliciousness

Bacon in the US comes from the pig belly, which gives it the distinctive parallel stripes of light fat and darker meat (coincidentally, 80s movies often mention pork belly futures, which seems like an outlandish trading commodity, but is actually a means of alleviating risk for meat packers by helping to stabilize the price). In other parts of the world this bacon goes by the name American bacon or breakfast bacon. The USDA even appetizingly defines bacon as "the cured belly of a swine carcass." But this is a bit backwards, as etymologically bacon comes from the Old French and Germanic words for "back."

In order to preserve the meat and give it its distinctive flavor, it has to be cured - usually by sitting in a smoke house or in a barrel with a heavy brine. Curing defines bacon, but location on the pig is also important (in fact, the only thing initially separating ham from bacon was that ham came from the legs and bacon came from almost anywhere else). Bacon in general often refers to any non-organ cut of meat behind the front legs and excluding the rear legs, although modern consensus limits bacon to the belly, the sides below (or behind, if you don't want anthropomorphic pigs) the ribs, and the fatty portion of the back.

As noted, American-style or streaky bacon comes from the belly. The back portion is usually called Canadian (or Irish) bacon in the US (when it's not ham) and tends to have much less fat. Side bacon represents a mix of the fatty and meaty American and Canadian associations. The Kevin Bacon tends to have a lot of Golden Globes and SAG awards and isn't considered very edible. These distinctions between different cuts of bacon - and even the difference between ham and bacon - have not existed for long.


A Brief History of Deliciousness

Pigs were one of the earliest domesticated animals. Human diets have included various types of pork for quite a while. Chinese historians often point to pickled pork bellies from around 1500 BC as the first ancestor of bacon. While whole animals became the focus of feasts and banquets, armies used cured cuts of meat as a marching staple. Unlike the American association with long strips of smoked pig belly, most early forms of "bacon" likely came in chunks and were heavily salted to prevent decomposition and to remain edible.

The Romans issued soldiers rations with pork (about 2lbs of grain and 1lb of meat when possible, augmented with what was available nearby). They distinguished pig by two types: perna (hind-quarter/ham) and petaso (fore-quarter/shoulder bacon). Soldiers were often given petaso (often just called bacon in English); a contubernium (squad of eight) had its own frying pan to bake bread and fry meats.

During their occupation of Britain, the Romans brought numerous Roman foods to the Celtic populace - including soldiers' bacon. Subsequent generations of immigrant Angles and Saxons enjoyed using bacon grease (and pork) in their cooking. Distinguishing bacon from other types of pork has happened in numerous countries recently, but for much of the last few centuries the culinary distinction existed primarily in the Anglosphere.

But the Roman practice of distinguishing different cuts of pork mostly disappeared after the empire collapsed. In the centuries afterward the etymological ancestors of bacon simply referred to cuts of meat in general. By the 12th century in England bacon was being used to refer to cuts of meat from the back - initially adopted as a synonym of flicche or flitch (this corresponds to the adoption of an abundance of Old French words as the Normans came in).

Subsequently, the famous story of the Dunmow flitch has helped solidify bacon's place in history. Supposedly the tale spawned the phrase "bring home the bacon" because a married couple could bring home a flitch of hog (i.e. side of bacon) if they had not quarreled for twelve months and a day. Supposedly the custom was so widespread that Chaucer referenced it. Bacoun (Middle English compared to Old French bacon above) soon referred to any cut of pork.

By the 1600s, bacon referred to a cut of pig meat cured as a single piece (back when slices (or rashers if you're British) were called flitches). By the 1750s bacon was synonymous with the cured side or back of a pig (close to the current general definition). Northern England (not Scotland) had pickled pork - a close equivalent to modern bacon. By the late 1700s, ranchers and industrialists bred pigs to emphasize particular portions and flavors of the meat (even breeding them for bacon that could more easily be cured).

Although most bacon was heavily salted or smoked in a chimney, more refined curing processes began to develop. Wiltshire curing, one of the oldest styles of modern bacon, emerged in the 1860s. As ice became more common, so did ice houses. The cold temperature let butchers cure meat over longer periods of time, requiring less salt - allowing for sweeter, more flavorful bacon.


Now let's see some petaso lingerie.

Sunday, October 10, 2010

Which Came First: The Stick and the Egg

Media featuring WW2 have a flair for the dramatic. Americans wield iconic M1 Garands and invariably drive around in jeeps. Germans sleep at MG42s and will inevitably scrounge up a tank. In movies and video games there is a satisfying duality between the Axis and Allies; not only ideology separated the two, but a dichotomy of technology as well. Americans throw "pineapples," Germans throw "potato mashers" (what Damian Lewis is holding in this picture).

Undoubtedly this is because the Stielhandgranate (stick hand grenade) is a recognizable piece of military equipment from the Second World War unique to the Germans (that whole Eastern Front thing? I don't know what you're talking about). Its unfamiliar design and curious operation evoke a very foreign feeling towards Wehrmacht soldiers. Americans (and - when pictured - British and other Allies) invariably appear with grooved, fist-sized grenades like the Mk II "pineapple" grenade. Never mind the fact that the US had copied the basics of this design from the British Mills Bomb during the First World War.

Depicting Germans exclusively using Stielhandgranaten is a convenient avenue for influencing the audience's perception that the Germans were vastly different in culture. The truth, however, is that the favored German grenade was very similar to the Allies' design, with a funnier (yet still food-based) name: the egg hand grenade.

German military production favored two types of grenades: the Stielhandgranate (stick hand grenade) and the Eihandgranate (egg hand grenade). Mass production of the Eihandgranate began in June of 1940 and soon far surpassed the production numbers of the iconic potato masher. Even including production numbers from 1939 and 1940 (before the "egg" grenade came into use), 9 million more egg-shaped grenades were produced during the war (84.2 million to 75.4 million stick grenades). In fact, the only modern movie or video game to actually display the Eihandgranate is The Pianist (where they are never used).

Aside from potential identification errors by the audience, there's a reason Eihandgranaten aren't usually depicted in movies and games. They don't appear in historical photographs as prominently as their stick-y counterparts. This has little to do with their actual prevalence on the battlefield; the grenades tended to be kept in a soldier's pockets until needed (unlike American Mk II grenades, which tended to be clipped on to assault webbing). This gives them much less visibility than Stielhandgranaten, which tended to be tucked into the belt.

Here's another chance to catch a glimpse of these elusive eggs.


In case you're wondering why all these eggs and sticks seem to have roots: explosive German grenades used a friction-based fuze. The soldier would yank on the bead, pulling a cord attached to a wire coated with abrasive, which scraped through the friction-sensitive compound in the detonator...similar to lighting a match (or pulling a stick of sandpaper through a tube of match heads). Like a Rube Goldberg machine, except it explodes. Generally the cord was secured inside the grenade's housing, except when they would need to be used on short notice (instances of exposed cords getting caught and prematurely detonating the grenade happened occasionally).

The Eihandgranate consisted of a fuze and detonator in one convenient package. So convenient in fact that by 1943 the German army had essentially changed its stick grenades to be Eihandgranaten with attached handles. Originally, the stick grenade's detonator was housed in the handle of the grenade. This meant the soldier had to unscrew the handle and insert a detonator into the head of the grenade and then screw the handle back on before the grenade could be used, prompting the famous text on the side of the charge: VOR GEBRAUCH SPRENGKAPSEL EINSETZEN ("insert detonator before use"), reminding the soldier to unscrew the handle and insert a detonator before taking the grenade into battle.

(If you think that implies soldiers are incredibly aloof, you probably don't want to know what's printed on the M18 Claymore mine)

Both types of grenades favored concussive force over fragmentation - and for this reason are often dubbed "offensive grenades" (as opposed to passive-aggressive grenades, maybe?). That is, German grenades relied on the raw force of the explosion to incapacitate enemies, allowing soldiers to more safely use the grenades at shorter distances (such as charging an enemy trench). Although the German army had developed fragmentation sheaths for use on stick and egg grenades, it was really the Allies who preferred fragmentation (the Mills Bomb mentioned earlier had an effective range farther than any soldier could throw it, thus the idea of a "defensive grenade," one a soldier would only want to use in cover). They tended to use a smaller charge of explosive to blast apart a shell of metal, which would break apart into high-speed fragments.

So the next time Band of Brothers is on TV, just imagine that the German soldiers have some Eihandgranaten nestled in their pockets for warmth.



"Speak softly and carry a big egg" just doesn't have the same weight...unless your antagonist has ovaphobia.

Wednesday, June 2, 2010

The Importance of Auxiliary Verbs

The United States officially entered World War II on December 7th, 1941 after the Japanese attack on Pearl Harbor. In what is often considered a terrible strategic blunder, Hitler subsequently declared war on the US on December 11th. These actions dragged the United States into the Battle of the Atlantic against Germany's commerce raiding navy - primarily submarines. It also brought us the rhyming phrase "loose lips sink ships." However, the original poster adds a very important auxiliary verb to the mix: "might."


Battle in the West (Atlantic)

Between the fall of France in June of 1940 and the loss of numerous u-boat aces in early 1941 (most famously Günther Prien and his magic torpedoes (like magic fingers, but half a meter wide and explosive instead of tingly)), German submarines and commerce raiding ships proved frighteningly successful. In the four months after the Fall of France German submarines sank 282 Allied merchant ships totaling nearly 1.5 million tons of shipping. Their success was so great that after the war Churchill commented, "The only thing that ever really frightened me during the war was the U-Boat peril...It did not take the form of flaring battles and glittering achievements, it manifested itself through statistics, diagrams, and curves unknown to the nation, incomprehensible to the public."

Ever increasing British anti-submarine efforts resulted in waning commerce raiding opportunities for German submarines in late 1941. However, Hitler's declaration of war allowed u-boats to hunt new targets off the coast of North America ("Canada? What's Canada?"). The first wave of long range Type IX submarines departed as part of Operation Paukenschlag (Operation Drumbeat). For the next 8 months, German submarines saw a resurgence of success (hence the Wikipedia article title "Second Happy Time," from German commanders referring to a period of success as "glückliche Zeit," meaning "happy time" or "fortunate time").

Germany only had 12 available Type IX boats, so crews of smaller (more famous) Type VII boats suffered more cramped quarters and meager rations to risk the journey to hunt down fresh American vessels. They proved so successful that a newly created American Office of War Information began a campaign of information control by mid-1942.


Look at that Ess (c)ar gee-oh

Ironically, the US likely started the campaign to prevent Americans from learning about sinking ships rather than to prevent Germans from learning information that would let them sink ships.

Like much of the propaganda from the period, the "Loose Lips Sink Ships" (LLSS) poster was created by a large company to aid the war effort. The poster came as part of a series of eight drawn in 1942 by the art director at Seagram Distillers' branch in New York (think Pepsi Co., but with more alcohol). Seagram printed the posters for placement in taverns. Apparently they felt bad for liquoring up all those intelligence officers ready to provide vital war information to German spies. Loose lips might sink ships, but lots o' sips loosen lips.

Obscurely signed as Ess-ar-gee, the poster's artist went without much recognition, despite creating one of the most recognizable phrases from World War Two. Ess-ar-gee enigmatically disguises the initials SRG, which refer to Seymour R. Goff Jr. (some places slip a Henry in there...we could probably throw in a John or William to cover some other common turn-of-the-century names - or maybe he just didn't like hens).


What Might Sink Ships

The American navy and coast guard were initially unprepared for the waves of German submarines that came. The British immediately recommended switching to the Commonwealth's convoy system - the American navy eschewed the perceived burden of the system. The British recommended flying constant reconnaissance and sending out available ships for escort duty (sick of that 'fishy' smell, the British had commandeered many fishing trawlers for anti-submarine duty in 1940 and 1941) - the American navy again abstained, unwilling to seize civilian vessels and lacking available destroyers. Ships leaving American ports suffered heavily between January and August 1942. The British recommended blackouts in coastal cities - and yet again the American navy refused the suggestion.

At the height of the "Second Happy Time," German submarines were operating within sight of American harbors - identifying ships' silhouettes against illuminated cityscapes and sinking them as they ventured out to sea. The British tanker Coimbra was sunk within 30 miles of Long Island; residents who spotted the wreck's burning load of oil called the authorities. Due to light air patrols and a lack of available escort ships (many having been "lent" to the British in 1941 in Roosevelt's Destroyers for Bases program), occasionally German submarines sank ships during the day as well - such as the Dixie Arrow sunk 12 miles from the Diamond Shoal anchored light buoy off the coast of North Carolina.

Loose lips might sink ships, but a poorly prepared navy, obstinate leadership and an aversion to adopting proven strategic decisions do sink ships.


"Negative Buoyancy Sinks Ships" just didn't rhyme well enough.

Friday, July 31, 2009

Milwaukee Trivia Collection - Back to the Future Edition

People clamor for shorter articles, so I present the first in a series of Milwaukee-oriented trivia collections randomly organized into arbitrary themes. Today: I link Milwaukee's history with Back to the Future because of a note listing the time and date a building was struck by lightning.


Ten Years Early; One DeLorean Short


Flux capacitor not fluxing and out of plutonium? The Wells Building on Wisconsin Avenue was struck by lightning on July 9th, 1945 at 3:45pm. Unfortunately old ladies won't be handing out fliers asking you to save the clocktower; there aren't any clock faces on the building.

Thinking of speeding down Wisconsin Avenue in order to hit a metal wire at 88 miles per hour? Milwaukee's mass transit might have something to say about that. Not only would you have to contend with an abundance of streetcar wires, you'd also have the streetcars themselves, which would still be around for nearly a decade.


"Tab? I can't give you a tab unless you order something."

Putting aside the fact that Marty orders a drink marketed towards women, Tab did not exist until 1963. Coincidentally (in the realm of missing beverages), Milwaukee didn't have any taverns on record in the 1920s. Still want to wet your whistle? The Milwaukee city directories can point you to page after page of soda fountains.


Road names? Where we're going we don't need... road names.

There's plenty of dispute over the origin of the city of Milwaukee's name. But many of the streets have unique stories as well. Some changes came through convention, some came through history, and some came because urban planners like trying to confuse Polish immigrants.

As mentioned previously, Milwaukee went through many street renaming phases, but the most extensive happened in 1930. Almost every East-West street in East Milwaukee had a different name before the 1930s. Directional indicators were appended to street names (making something like Wisconsin Avenue into East Wisconsin Avenue and West Wisconsin Avenue - or something more fun like changing Aldrich Street into East Bay Street and South Bay Street). Unfortunately, they also decided to move the numbered streets.



The city of Milwaukee does not have a Main Street (although Brown Deer Road becomes Main Street in Waukesha County). Broadway was formerly Main Street, before the name was changed in 1871.


The only reasonable explanation is that it's the main street to use to get out of the city, right?

Wednesday, June 17, 2009

This Old House on 1 That Street

I generally don't drive, which often means that my knowledge of specific street names is limited to my immediate vicinity. My navigation relies much more on landmarks and directions, since I don't bother looking for street signs (I generally don't see them, anyway). But recently I've needed to accustom myself to the lay of the streets a bit more.

So if you're from Milwaukee I'll ask you this: do you know about where in Milwaukee this house was from?



Here's a hint: its address was 519 Astor Street. Not helpful? It's from near the intersection of Biddle Street and Astor Street. Still not helpful? That's because all of Milwaukee's urban planners aspire to be Hunter S. Thompson. The house's description may prove useful in figuring out this problem.

Milwaukee has gone through a series of street name changes, but the largest happened in 1930. Some streets were renamed, but most simply had a direction appended to them. Astor Street became North Astor Street. In addition, people couldn't pick whatever wacky address number they wanted (do you really want to live on 1 Bay Street next to 1 Aldrich Street?). Numbers were assigned according to a property's frontage (every 15' had a new number, every new block increased this number by 100). This house's address changed from 519 Astor Street to 913 North Astor Street.
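
Out of curiosity, the renumbering scheme is simple enough to sketch in a few lines of Python. This is just an illustrative toy built from the rules as described above (the city used official conversion tables, and the block indexing here is my guess, not gospel), but it does land on the right number for this house:

    def address_number(block_index, feet_into_block):
        # Each block bumps the address by 100; each 15 feet of
        # frontage within the block bumps it by one.
        return block_index * 100 + int(feet_into_block // 15)

    # 519 Astor Street became 913 North Astor Street; under these
    # rules the lot would sit about 9 blocks up the street and
    # roughly 195 feet (13 * 15') past the corner.
    print(address_number(9, 195))  # prints 913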

Unfortunately, even having an address number and street name doesn't help us with this house. Keen visitors that click on my fancy links will already know the other problem. This house neighbored the intersection of Biddle and Astor. The problem is that Biddle Street became Kilbourn Avenue when Cedar and Biddle were widened and connected by a bridge over the Milwaukee River. The intersection of Astor and Biddle now encompasses the intersection of Kilbourn, Prospect and Astor.



You know what will teach you streets pretty quick? Poring over fire liability maps of a city. Unfortunately, half those streets may not even exist anymore - like Biddle Street. The Sanborn Fire Insurance Map Company (talk about knowing your target market) created detailed, scale maps of thousands of American cities between 1867 and 1970. You can view black and white versions of these maps for Milwaukee online, but you may want the key.

Physically they're giant bound volumes about two feet square. With a scale of 1 inch to 50 feet, this means one page can show about 1000 feet on each side. Milwaukee is a rather large city, and the maps from 1910-1926 come in 6 volumes (about 20 giant bound books of maps). New volumes were ordered for expansions of the city limits, but the index map remains the same (which makes finding page numbers for modern streets not listed on the index that much more fun).
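
As a quick sanity check on that coverage figure, here's the arithmetic in Python (the two-foot page size is the ballpark figure above, not an official spec):

    sheet_side_inches = 24   # a page roughly two feet on a side
    feet_per_inch = 50       # the stated scale: 1 inch to 50 feet
    # 24 * 50 = 1200 feet per side, closer to ~1000 once margins
    # and the legend eat into the drawable area.
    print(sheet_side_inches * feet_per_inch)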

Aside from being unwieldy in size, the individual pages are a bit strange. Directional north is not necessarily at the top of the page; each page has its own compass rose to denote which way is north. Unlike the online versions mentioned above, the actual maps are color-coded to match the key. Each color indicates a specific building material (and therefore a building's status as a potential fire hazard). And finally, in order to provide up-to-date fire hazard information, the company provided new versions of buildings and streets that could be pasted into the volume on hand. This kept the maps current, but isn't so helpful when you're looking for an older demolished house, since it's bound to be under 3 layers of pasted revisions. In older versions of the map the presence of gas and electrical lines is often marked as well (if I had been around in 1910 I'd get an electrical line to my house just so some lazy surveyor has to pencil in "Electric Line" on some giant map).

So where was that house from? It was located in downtown Milwaukee, where Kilbourn Avenue starts and the Regency House Condos now stand.


And maybe it's still there, like some sort of Morlock house.

Friday, May 29, 2009

Collection of Curiosities - Cynic's Edition

Hoaxes, practical jokes and confidence schemes have featured in entertainment (and the art of separating people from money) for hundreds of years. Despite massive amounts of cynicism present in today's society, hoaxes still manage to catch quite a few people, but only a select few have impacted entire nations and imprinted themselves on the public psyche.


Of Mars and Men

Substantially famous already, the War of the Worlds hoax isn't actually so much a practical joke or a hoax as it is a testament to people's tendency to jump to conclusions. Taking place on October 30th, 1938 (and continuing on for a few people in the population), the hoax generated panic among part of the population fearing a Martian invasion.

The War of the Worlds hoax was a radio adaptation of H. G. Wells' The War of the Worlds directed by Orson Welles. The broadcast followed a format of "breaking news" bulletins interrupting a performance by an orchestra. Each of the bulletins provided the audience with updates that followed the plot of The War of the Worlds - essentially a Martian invasion. Subsequently, portions of the public panicked at the thought that the invasion was real.

You know how some people miss the first five minutes of television shows or movies? Well, most television shows produced today are designed so that audiences can watch them without paying close attention. After all, viewers may be making dinner or folding clothes and they might not be pouring all their energy into watching. This means that many shows (particularly sit-coms) restate the plot or update viewers quite frequently. Unfortunately for listeners to the Welles broadcast, many missed the disclaimer introduction.

The broadcast also functions as a testament to some people's short attention spans. The story eventually switches to one man's narrative of his attempts at survival. This makes it readily apparent the broadcast is not real...but by that time people had run screaming from their radio sets and were busy stockpiling supplies before the impending Martian invasion force arrived.

You can relive the mystical aura of a panic-stricken 1930s by listening to the broadcast here or here. Just like when you're watching TV, remember to skip the first few minutes.


Nothing for the Spaghetti Weevils

Did you know spaghetti grows on trees? Of course you don't... because it doesn't. That didn't stop the British show Panorama from broadcasting a fake documentary on April 1st, 1957 showcasing the bountiful Swiss spaghetti crop. Panorama happened to be one of the most trusted news sources on British television, so the hoax managed to get two types of responses: people who loved the joke and people who wanted to know how to grow their own spaghetti trees. This was the first - and only - time that the news program decided to air an April Fools joke. This is the benefit of having a cameraman who knows a good practical joke.

Through the glory of the internet you can pretend to be British in the 50s by watching the video here.

Pasta has become so ubiquitous now that everyone knows its secret ingredient: flour. Of course, you can add things like salt and eggs to the mixture, but pasta is essentially flour formed into fancy shapes. Unfortunately for much of the British population in the 50s, who had suffered rationing throughout the 40s, pasta was still considered something of a delicacy and remained relatively rare (I'm guessing their college students must have subsisted off some sort of Dickensian gruel, instead).

As a side benefit, the hoax documentary created and exterminated the adorable spaghetti weevil.


Soaking in Cynicism

Have you heard of the dangerous chemical dihydrogen monoxide that's responsible for thousands of deaths per year? I'm sure you have, but it was probably called something different, like hydrogen hydroxide or hydroxic acid. Or maybe water.

Developed in the 1990s, this hoax plays on a lack of scientific knowledge. Usually the hoax pops up in the form of a petition banning dihydrogen monoxide (DHMO), with a spokesperson listing off many of the dangerous-sounding aspects of water to persuade someone to sign. At face value all of this information is true, but used in an exaggerated manner (after all, thousands of people die every year from drowning). The hoax even has its own pseudo-advocates with a website listing off the dangers of DHMO.

This hoax ends up a source of amusement for chemists and a source of embarrassment for politicians. The hoax relies on exaggeration and a listener's lack of specific knowledge (or attention). It tells us that pretentious language and specialized jargon can often be used to circumvent people's logic and reasoning. Unfortunately it also showcases people's willingness to make uninformed decisions. Who would ban water? Plenty of people, if they don't know it's water.

Also lava monsters.


Many people might believe that modern hypercynicism would prevent hoaxes from even gaining a foothold anymore, but the case of DHMO shows that people as a whole are as gullible and misinformed as ever. After all, there's more information now than ever before; how do you know whom not to trust? Unfortunately this results in more work for individuals because it takes even more effort to form factual, informed decisions... many people often don't bother.

The predominant forms of hoax are no longer steeped in April Fools Day jokes and emails that can be disproved with a single Google search. They rely more on confidence schemes and human fallibility. Even in the digital age, con artists still rely on surreptitiously gaining information directly from people more often than through brute force cracking of electronic information. For less criminal misinformation there's plenty of help around. The website Snopes exists to discredit modern hoaxes and urban legends which manage to find their way into chain mailings and conversations.


At least we don't need to worry about spaghetti weevils. Or do we!?

Thursday, March 5, 2009

Arabian Golf

The Persian Gulf has been a hotspot of contention ever since some Sumerians decided they wanted to live next to each other a few thousand years ago. Recently, Arab-dominated lands have referred to the body of water as the Arabian Gulf, which has led to a vehement outcry among Iranians (or Persians, for anyone alive before 1935). This nationalism has evolved to the extent that Iran now has a Persian Gulf Day (on April 29th, in case you planned on taking the day off). You might also notice the rather undiplomatic language that seems to permeate Iranian literature on the subject. To their credit, the UN and some random guy at MIT (someone in Iranian Studies, anyway) have determined that Persian Gulf (or variations thereof) has functioned as the de facto name for the gulf in European circles for centuries and should stay that way. I'm not really sure where the Arab push to change the name is coming from - they have a perfectly fine Red Sea to the west that could do with a spruced up name. Maybe they're hoping the next war in the area will be a more eponymous Arabian Gulf War instead of a Persian Gulf War.

Now, my History 104 course with Professor Wick also featured a bit of discussion on the popular gulf (he also writes a mean introduction to the History of the Peloponnesian War by Thucydides). One of my favorite professors, he mixed immense topical knowledge with dry wit, and his lectures provided an exceptional historical background for future learning and critical thinking. When covering the topic of ancient civilizations in Mesopotamia, the Persian Gulf featured in the discussion to a fair extent. (Un)fortunately, we didn't cover any sort of historiography on the subject of the gulf's geography.

The depth of the modern gulf does not exceed 90m, which is helpful since sea levels rose by about 90m when the glaciers from the last ice age melted. The Persian Gulf of the time was likely a fertile valley, but there was no recorded history at that point. The Sumerian great flood (an early analogue to the Biblical tale of Noah) is likely unrelated to the inundation (or Deluge, if you're still going all Biblical on me) of the Persian Gulf. The gulf sits at the collision zone of the Eurasian and Arabian tectonic plates, which still periodically undergo tectonic activity related to orogeny (that is: mountain upheaval and usually accompanying subsidence somewhere else).

For quite a while the historical coastline of the Persian Gulf was believed to have been about 200 kilometers to the northwest of its present position. An archeological geologist named Jacques de Morgan theorized around 1900 that the Persian Gulf had slowly been filling in with sediment deposited by the Tigris, Euphrates, and Karun rivers. He suggested that the Tigris and Euphrates emptied into the gulf without forming a confluence (the Shatt al-Arab estuary (or the Arvandrud, if you're Persian - not to be confused with the Evinrude)), and that the Karun river's sediment formed a series of shoals, which eventually built up into the modern shoreline. Through an in-depth survey of archeological sites in Mesopotamia, de Morgan hypothesized that the coast of the Persian Gulf would have been near Baghdad in the 4th millennium BC... Never mind that his use of historical sites relied on his own survey of historical voyages whose point of origin we still don't definitively know (turns out you can say you've found anything if no one knows the actual location).


There are a few problems with de Morgan's assertion (besides his mélange of potentially made-up historical sites). Much of the rock in the area appears to be from freshwater sediment. There's also the problem of Lake Hammar in southeastern Iraq, which miraculously hasn't really filled with sediment and wasn't there six thousand years ago. While sedimentary accretion is an accepted geological phenomenon, de Morgan was missing a few important bits of information (mostly the geology of his archeological geology).


The predominant theory behind the coastline of the Persian Gulf seems to still be Lees and Falcon's subsidence theory. With their fancy use of geological sampling, they hypothesized that the Persian Gulf had intermittently undergone (and continues to undergo) subsidence, which counteracts the silt deposits to a great extent. Iraq collides with Persia, building mountains, but the creation of mountains requires a complementary subsidence zone. So, the story of Noah had it wrong: the land wasn't being flooded by water, the water was being flooded by land (...and was slowly sinking to cover it up, like some geologist-fantasized episode of CSI). Sure, Noah's flood is supposed to be from rain and rivers overflowing, but you can't have quality jokes and accuracy - what do you think this is, the Daily Show?

Through the use of aerial photography and charts from the 1800s, Lees and Falcon determined that the primary coastal change was a migration of the Shatt al-Arab's outlet further to the northeast. Subsidence and silt deposits have resulted in some ancient sites being buried under a substantial depth of sediment (and occasionally water) as the shoreline meanders northeast. Or maybe Sumerians were just subterranean, Tolkien-esque dwarves with gills. So maybe in a few millennia the main river outlet into the Persian Gulf will be in Iran and we can avoid squabbles about preferred geographic names. Or half of the region will be buried under ten meters of silt, and everyone will turn into Morlocks. Either way I see a great future for the science fiction community and historians. I preemptively dub it historical futuristic science fiction.


Now we just need to work on renaming Lake Michigan to Lake Wisconsin.

Thursday, February 19, 2009

Primer up to Primers

The modern concept of an arms race is almost exclusively tied to the proliferation of nuclear weapons between the USA and USSR and the attempts at building bigger, longer-ranged weaponry. Or to anyone who's ever played Civilization: racing Gandhi to nuclear weapons before he destroys the world. Like the nuclear arms race and the related idea of the Red Queen theory, history is populated by arms races. Just like the light bulb or personal computer (or even modern firearms), inventions stand on the innovation that came before them.

Firearms constitute a prime example of a technological race. Now, through the miracles of etymology, firearms might be either "flaming arms" or "arms that make use of fire". Although early firearms may have led to plenty of conflagrated limbs (and torsos), the concept of arms here ties in with weapons (Old French armes from Latin arma (weapons)) and not arms (Old English earm from Latin armus (shoulder/upper arm)) - both of which are from the Indo-European root ar- (to fit or join). Firearms have three separate components that influence their effectiveness: the gunpowder, the bullet, and the design. These three factors facilitate faster firing rates, more range, cheaper manufacturing, more mobility, and easier use - all of which have been desired ever since firearms were invented.

General consensus lands the discovery of gunpowder sometime in the 800s in China. The use of recognizable guns in China dates to the 1100s (occasionally Arabic scholars argue this point). This often leads to the misconception that early gunpowder use focused on fireworks or the idea that the peace-loving Chinese couldn't find a use for it in wartime. The advent of gunpowder coincides with the downfall of the Tang dynasty (not to be confused with the Tang dynasty - easy way to differentiate them: only one is "orangey") and the emergence of the war-filled Wudai and Shiguo (Five Dynasties and Ten Kingdoms - wu (五) being five, shi (十) being ten) period and the warfare that accompanies more than ten states in an area less than half the size of modern China. Recognizable firearms developed during the subsequent Song dynasty (not to be confused with the, uh...song dynasty?), but despite the empire's relative stability, they weren't any strangers to war either. Turns out Chinese people were all about burninating the countryside.

Early gunpowder in all spheres of influence around the world (Chinese, Arabian, and European) lacked explosive force. Gunpowder functioned more as an incendiary in the early years of its adoption in each area. Before the arrival of black powder to European armories, naphtha (think pitch or oil) occasionally filled this role. Soldiers filled small ceramic or glass pots with the flammable liquid. When lit and thrown, the pot would break and the liquid would splash out and engulf an area in fire. Greek fire (or liquid fire) occasionally pops up alongside naphtha and pitch, but writers and transcribers often used the term carelessly (...let me paraphrase: "Holy crap, guys - we just got our asses handed to us by those Saracens. They totally used Greek fire on us."). The primary users of Greek fire consisted of the Byzantines (Greeks to the rest of Europe) and the myriad Arabian states around the Mediterranean.


Ingredients and Processing

Early incendiary mixtures consisted primarily of sulfur and charcoal. These mixtures wouldn't provide any explosive punch, but worked well enough for burning things (I'll avoid linking non-Trogdor again...for now). The third essential ingredient of medieval gunpowder was saltpeter (or potassium nitrate, KNO3). Mined from areas of China and India, Chinese alchemists had relatively easy access to the chemical. In Europe, saltpeter remained elusive until alchemists uncovered suitable amounts of the chemical in a more obscure form. The initial source for saltpeter in Europe came from Batman bat caves in the form of guano. Disgusting you say? The larger scale (far more odoriferous) production of saltpeter inevitably required copious amounts of aged urine (or by LeConte's recommendation: dung-water) or manure.

Here's another fun fact: saltpeter often functioned as a food preservative in the middle ages. Hopefully you're not reading this during lunch. The historical reenactors among you may be happy to know that the modern production of black powder does not rely on urine.

The basics of effective black powder had finally arrived by the early 14th century. Recipes proliferated, offering a variety of additives and proportions. The theoretically most effective ratio nears 75% saltpeter, 12% sulfur and 13% charcoal; however, medieval chemists tended to use far less saltpeter (the hardest component to produce). Recipes usually took the form of basic ratios, such as 7 parts nitre, 5 parts brimstone and 5 parts charcoal. The standardization of black powder in a form close to its theoretical explosive limit didn't occur until the late 18th century.
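
Since the recipes above mix percentages and parts, a couple of lines of Python can normalize a parts-based recipe for comparison (the numbers are the ones quoted above; the helper function is just an illustration):

    def to_percentages(parts):
        # Normalize a parts-based recipe into percentages.
        total = sum(parts.values())
        return {name: round(100.0 * amount / total, 1)
                for name, amount in parts.items()}

    medieval = {"saltpeter": 7, "sulfur": 5, "charcoal": 5}
    # Prints {'saltpeter': 41.2, 'sulfur': 29.4, 'charcoal': 29.4} -
    # well short of the ~75/12/13 theoretical optimum, with saltpeter
    # (the hardest component to produce) taking the biggest cut.
    print(to_percentages(medieval))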

Beyond the evolution of the formulas, methods of transporting and mixing the constituent components developed as well. Engineers discovered that mixing the saltpeter into the sulfur and charcoal just before firing resulted in a more reliable explosive force (although it tended to produce a lot of powder dust which was prone to ...exploding). Gunpowder was often milled as part of the mixing process, providing relatively consistent powder (compared to mortar-and-pestle mixing, anyway).

One of those major lightbulb-esque developments came about in the 1420s with the advent of corning. Also known as granulating, corning is the process of wetting the gunpowder and forming it into kernels or grains. Initial liquids for corning were spirits, vinegar, and - that old medieval standby - urine (apparently urine was medieval duct tape). Turns out water works best. It helps the saltpeter fill in the pores of the charcoal, allowing for a very consistent and more powerful propellant (it's also mentioned in Timeline, in case you're interested). Subsequent developments focused on standardizing grain size and providing additives to stabilize the powder (like the addition of graphite to avoid static discharges setting off the powder - if you go back in time bring a pencil).


Materials and Shapes

I would be remiss if I didn't point out that gunpowder developed into a general purpose explosive. Arriving sometime after 1250, firearms developed relatively quickly into projectile weapons but saw other uses as well. Black powder was used in civil engineering (mining, canal building, etc.), but proved especially dangerous due to the inconsistency of the powder and the lack of reliable fuses. Besiegers also used the explosive mixture to great effect. The most famous of these for us now are petards, which come to us with the phrase "he was hoisted by his own petard" as in "he was foiled by his own plan". But, like our contemporary association with the word firearms, the most common weapons to make use of fire were guns.

European ventures into projectile firearms began with cannons. Walter de Milemete's illustration in De Nobilitatibus, Sapientii, Et Prudentiis Regum (On the nobility, wisdom, and prudence of kings) features the first picture of a European cannon (on the left). This type of cannon (or gonne) was referred to as a vaso (ingeniously, Italian for vase). It is unlikely that de Milemete had actually seen the cannon fired at this point due to the ambiguity of the gun's carriage (despite how stable putting a cannon on a sawhorse may be).

During the early centuries of gunpowder use, siege engineers favored cast bronze cannons, while leaders preferred forged or cast iron guns for their economical price. Metallurgical processes of the period meant that it was easier to cast bronze (or much more expensive brass) than iron. The pliability of bronze also made it easier to notice when a cannon had undergone too much stress, because a large bulge would form. Iron cannons tended to just explode due to their brittleness (which tends to be the reason operators preferred bronze guns). Although iron is generally the stronger metal, iron metallurgy and refining processes lagged behind bronze. The piece in de Milemete's depiction is a cast one, probably of bronze or brass (judging by the color). Forged iron cannon consisted of a tube (often of wood covered in metal slats) held together by rings of iron. This led to the idea of a gun barrel, since the weapons originally resembled barrels (a cylinder of wood slats held together by iron rings).


The Part That's Supposed to Hurt People

Projectiles came in a variety of shapes and sizes. De Milemete's depicted cannon fired a dart (or shortened arrow). Early ammunition often came from rock, something that tended to be rather plentiful. Round bullets or shot became the preferred standard (as round as you can make a rock, anyway). As gun calibers became standardized, metal ammunition began to readily replace stone. Shot, darts, and bullets all had separate tactical applications and saw use as their production processes became more refined. This is the part where the euphemism of a cannon as a big hard tube with balls comes in.

Until the advent of the cartridge and primer, the vast majority of firearms were muzzle-loaded (loaded down the barrel and then rammed into position). Only very small cannon tended to be breech-loaded (loaded into the rear of the weapon, right into the firing position). Until machining caught up, these required a removable chamber held in place by a wedge to make them nearly air-tight for firing.

Bullet makers (generally lead casting metal smiths) spent time perfecting the art of shot dropping - the act of creating spherical rounds for use in firearms. Molten lead would be dropped from a tower through a sieve so that it could cool into a nearly perfect sphere as it fell; this resulted in a vast assortment of shot towers dotting the countryside. This didn't work quite as well for artillery rounds, which needed to be cast and hand corrected. Spherical rounds remained the standard until the advent of inexpensive manufacturing processes for conical bullets such as the Minié ball in the early 1800s (never mind that something called a Minié ball wasn't a sphere). The combination of a rifled gun barrel and a more aerodynamic bullet provided greater accuracy and a more damaging impact.


Hand-held Firearms and Locks

Hand-held firearms took off when someone decided to make a cannon small enough to be held by one person (one crazy person, these things often exploded when firing after all). So it's no surprise that early handguns (or handgonnes or whichever phonetic spelling you prefer) looked like miniature cannons. Like their cannon-y counterparts, these weapons required their operators to insert a charge of powder, ammunition, and then light the whole thing off with an open flame. The flame would ignite the priming powder (held in a small receptacle called a flash pan), whose flame would travel through a touch hole and fire the weapon.

Turns out even people who ran at each other with big knives thought this was dangerous. In early artillery and handcannons, a linstock (essentially a big fork) held the match so that the weapon operator could try not to die when he fired his weapon. Eventually safer and more useful firing mechanisms (or locks) developed.

The first of these - the matchlock - appeared in the mid-15th century; it was essentially a clamp holding a burning wick that snapped down when a lever was pulled (igniting the flash pan, going through the touch hole and so on). The first lever, or trigger, came in the form of an S-shaped piece of metal called a serpentine (because an S is always a snake). The matchlock remained the primary firing mechanism during the early years of gunpowder, when arquebusiers ran around the battlefields of Europe with arquebuses. Well, without the running part, anyway.

Early firearms like the arquebus and the later, heavier musket often required soldiers to rest their weapon on a window sill or a Y-shaped fork in order to aim. The word arquebus, like its shotgun-like partner the blunderbuss, comes from Dutch. Arquebus - and its counterparts harquebus, hackbut, hagbut and the like - comes from the Old Dutch hākebusse and German hakenbuchse or hook gun (due to hooks that were originally cast onto the barrel so that it could connect to the Y-shaped firing stand). While blunder may be an appropriate word for a weapon prone to blowing up in the operator's hand, it was most likely named for its loud, thunderous report (so, thunder gun).

The addition of these intricate parts encouraged developments in the shape of the weapon into something we could reasonably call a gun today. And because running around with a flaming wick was considered dangerous, development towards safer and more reliable locks proceeded.

The next advancement, in the early 1500s, was called the wheellock (or German lock); it allowed the operator to carry the weapon loaded and fire it without an open flame dangling about. A piece of fool's gold (iron pyrite, FeS2) would snap against the flash pan's cover, pushing it out of the way and falling onto a rotating wheel to create a spark. The mechanism required the operator to fire the weapon gangsta-style so that the spark would actually ignite the powder in the flash pan (...and because medieval soldiers were gangsta, yo). It also took longer to fire than simply touching a match to the flash pan, but avoiding exploding oneself is probably preferable.



Further advancement brought us the familiar firestarting trick of snapping flint against steel. Coincidentally enough, this type of firing mechanism is referred to as a snaplock. The snaplock functioned much like the wheellock, except the flash pan had to be opened manually (and it was cheaper to produce with fewer moving parts). The flintlock (or French lock (or even English lock, depending on who's doing the shooting)) solved this problem by combining the flash pan cover and the steel target for the flint into one simple L-shaped piece of steel called a frizzen. The simplicity of the flintlock led to its dominance in weapon manufacturing for over two centuries until the implementation of percussion caps and primers in the 1860s.


Firearms: proving the versatility of Urine™ since 1326.

Sunday, December 21, 2008

An die Freude

Often cited as a 'symphony within a symphony', Beethoven's Symphony No. 9 in D minor, Op. 125 "Choral" is arguably one of the greatest pieces of classical music. It's also one of only a few symphonies to have vocals. I'll refer to it as Beethoven's 9th Symphony because I am not musically inclined. The fourth movement, named after Friedrich Schiller's poem An die Freude or "To Joy", is often the most recognized portion of the entire symphony. Because the word ode exemplifies the lyrical content of the poem, we usually end up with the name "Ode to Joy". The piece has some musical flourishes that I really like, but since my musical talent is limited to knowing which violin strings are which through a clever mnemonic device (G'Day, or GDAE) I can't really tell you much about the in-depth musical facets of the piece.

The European Union took its anthem from one of the more famous musical portions of Ode to Joy. The German national anthem, on the other hand, consists of the same piece it has for nearly 90 years: Haydn's Gott erhalte Franz den Kaiser (God save Emperor Franz), set to lyrics written by August Heinrich Hoffmann von Fallersleben in 1841 and picked as the anthem of the Weimar Republic in 1922 (and subsequently adopted by West Germany). After 1945 they just dropped that whole "Deutschland, Deutschland, über alles" first stanza (today only the third stanza gets sung). This is probably for the best, because the first stanza also traces the boundaries of Germany along a collection of rivers (and one strait) that don't actually border Germany anymore.

As many are well aware, the American national anthem takes its musical foundation from a British drinking song popular during the early 1800s. The lyrics, meanwhile, stem from the War of 1812, not the Revolutionary War. We've all heard plenty of renditions with notes held just a bit too long, so perhaps I could interest you in an instrumental version without singing.

Since it's Christmastime I figured I'd present (hah!) a simple analysis of the German lyrics in Ode to Joy. The primary theme of the piece is universal brotherhood (results may vary). Concert versions of the entire symphony, separated by movement, can be found here. Ode to Joy is the fourth and final movement (directly linked below if you want to save a click). Don't worry - it might seem half an hour long, but five minutes of that is applause. Plus it's cool music. I apologize in advance to all the unfortunate cube-dwelling people who don't have speakers; I can hum along with you, but humming comes with a no-money-back guarantee.

Symphony No. 9 in D minor, Op. 125 'Choral' - IV. Finale: Ode to Joy
Music by Ludwig van Beethoven
Lyrics by Friedrich Schiller (with additions by Beethoven)
(original German is in italics, loose translation is underneath)

Performed by the Vancouver Symphony Orchestra here.

O Freunde, nicht diese Töne!
Sondern laßt uns angenehmere anstimmen,
und freudenvollere.
Freude! Freude!


Oh friends, not these notes!
Instead let us sing more pleasant
and joyful songs.
Joy! Joy!


The first few minutes of the movement recycle musical themes from the first three movements before finally settling on a definitive theme just ahead of this first lyrical interlude. The music grows more tense and less joyous before the singing starts, so the baritone's all, "please stop playing somber music - let's keep it a bit more happy, eh?" (translated to Canadian for the recording linked above).


Freude, schöner Götterfunken
Tochter aus Elysium,
Wir betreten feuertrunken,
Himmlische, dein Heiligtum!
Deine Zauber binden wieder
Was die Mode streng geteilt;
Alle Menschen werden Brüder,
Wo dein sanfter Flügel weilt.


Joy - beautiful divine spark
Daughter of Elysium -
Filled with fire we enter,
Holy one, your sanctuary.
Your magic mends again
What tradition has sternly parted;
All men become brothers,
Wherever your gentle wing descends.


The common interpretation of the lyrics, and particularly this stanza, is mostly literal: the joy and happiness provided by God elate mankind, overcome the troubles of history, and unite everyone. God imbues mankind with the capacity for joy, a commonality among all men. Of course, if Schiller (the lyricist) were talking about beer, this would explain a lot too (why the chorus happens to be feuertrunken, or 'drunk with fire', for starters). It would also imply Germans are a lot friendlier once they've been liquored up a bit.

A common interpretation that starts from this portion of the lyrics is that pure joy presents a divine replacement for the Christian God (whom Schiller devoutly followed), although some Christians combined God with the concept of hedonism as early as the 1700s. This reading derives from Götterfunken ("divine spark" or, even more heathenish, "spark of the gods") and Elysium. Elysium - as astute viewers of Gladiator or students of mythology may know - was the Roman resting place for the heroic and virtuous (an evolution of the Greek Elysion).


Wem der große Wurf gelungen,
Eines Freundes Freund zu sein;
Wer ein holdes Weib errungen,
Mische seinen Jubel ein!
Ja, wer auch nur eine Seele
Sein nennt auf dem Erdenrund!
Und wer's nie gekonnt, der stehle
Weinend sich aus diesem Bund!


Whoever has had the fortune
To be a friend's friend;
Whoever has won a fair woman,
Let him add in his jubilation!
Yes, and whoever can call even one soul
His own in this wide world!
And whoever never could, let him steal away
Weeping from this band!


So apparently the only ones that don't get to join in this cheerful celebration are those who can't call a single soul their own. Guess that means Faust is still out. People with wives or friends seem to be in, though. Wem der große Wurf gelungen, eines Freundes Freund zu sein literally means "whoever has succeeded in the great throw [of the dice], to be a friend's friend"...proving that Schiller loved to gamble his friends away in games of chance.

There's a tendency with modern German to explain away the masculine nouns here as male chauvinism (gender neutrality for nouns is even more of a pipe dream in German than in English). I guess his stipulation that you can join in if you have a nice wife limits it to heterosexual males and the occasional lesbian, but he sort of overrides that with "or if you've got a soul, come on in." That's one of the problems with noun genders (the other being that they annoy foreign speakers by forcing them to memorize a gender for every noun). This is also one of the reasons Freude is a daughter of Elysium: die Freude is feminine.


Freude trinken alle Wesen
An den Brüsten der Natur;
Alle Guten, alle Bösen
Folgen ihrer Rosenspur.
Küße gab sie uns und Reben,
Einen Freund, geprüft im Tod;
Wollust ward dem Wurm gegeben,
Und der Cherub steht vor Gott.


All creatures drink joy
At the teat of nature;
All good, all bad
Follow her trail of roses.
Kisses she gave us - and wine -
A friend, proved in death;
Pleasure was given to the worm,
And the cherub stands before God.


The idea is that nature is part of life and every creature experiences joy and pleasure (hence 'pleasure was given to the worm', meaning even the lowly can experience pleasure). The exceptions here are angels (the cherub), who don't get a choice and have to hang out with God. Maybe Schiller is the original inspiration for Dogma. Schiller's apparent love for alcohol pops up again here; it may be important to note that Reben is the plural of die Rebe, the grapevine (often translated as grapes or wine here).

This is where that 'Joy as a god' thing comes back too, with the deification of nature - and, incidentally, its anthropomorphization. There's probably something about the uniformity of death buried in there as well (what with everything following the path ordained by nature and all). Or it could just be a bunch of worms having sex and angels standing in front of God for no discernible reason.


Froh, wie seine Sonnen fliegen
Durch des Himmels prächt'gen Plan,
Laufet, Brüder, eure Bahn,
Freudig, wie ein Held zum Siegen.


Jubilantly, as his suns fly
Through the heavens' glorious design,
Run, brothers, on your way,
Joyfully, like a hero on to victory.


Schiller wrote his poem during the Age of Enlightenment, so it may be important to note that die Bahn also means orbit (fitting the suns flying through the heavens). The stanza could be read as a call to run your own course as joyfully as the heavens run theirs, or simply as an instruction to live every day joyously. Or both. Unfortunately, the alcohol thread from earlier stanzas doesn't fit so well in this one.


Seid umschlungen, Millionen!
Diesen Kuß der ganzen Welt!
Brüder, über'm Sternenzelt
Muß ein lieber Vater wohnen.
Ihr stürzt nieder, Millionen?
Ahnest du den Schöpfer, Welt?
Such' ihn über'm Sternenzelt!
Über Sternen muß er wohnen.


Be embraced, you millions!
This kiss for the entire world!
Brothers - above the canopy of stars
A loving father must dwell.
Do you fall to your knees, millions?
Do you sense the Creator, world?
Seek him above the canopy of stars!
He must dwell beyond the stars.


The first portion of this stanza has probably been translated best by Chet Powers. This stanza diminishes the 'Joy as a god' concept a bit, although we might assume Schiller meant 'above the stars' as a euphemism for 'being high'...except that construction didn't really exist in English or German in Schiller's day.

The symphony recycles the rest of its lyrics from the earlier stanzas (Beethoven apparently loved him some Götterfunken). It may be a bit heavy in the religious department, but the theme still works universally.

At the very least I think we can all prefer the God of peace and friendship from the Age of Reason over the God of Providence and Original Sin from the era of the Inquisition and the Salem witch trials. A song about embracing joy and all of mankind is a bit more touching than one about flagellating yourself and burning all the women in your village for having mind powers.


Proving Enlightenment Age German poets and lyricists are drunks since 2008.