People clamor for shorter articles, so I present the first in a series of Milwaukee-oriented trivia collections randomly organized into arbitrary themes. Today: I link Milwaukee's history with Back to the Future because of a note listing the time and date a building was struck by lightning.
Ten Years Early; One DeLorean Short
Flux capacitor not fluxing and out of plutonium? The Wells Building on Wisconsin Avenue was struck by lightning on July 9th, 1945 at 3:45pm. Unfortunately old ladies won't be handing out flyers asking you to save the clock tower; there aren't any clock faces on the building.
Thinking of speeding down Wisconsin Avenue in order to hit a metal wire at 88 miles per hour? Milwaukee's mass transit might have something to say about that. Not only would you have to contend with an abundance of streetcar wires, you'd also have the streetcars themselves, which would still be around for nearly a decade.
"Tab? I can't give you a tab unless you order something."
Putting aside the fact that Marty orders a drink marketed towards women, Tab did not exist until 1963. Coincidentally (in the realm of missing beverages), Milwaukee didn't have any taverns on record in the 1920s. Still want to wet your whistle? The Milwaukee city directories can point you to page after page of soda fountains.
Road names? Where we're going we don't need... road names.
There's plenty of dispute over the origin of the city of Milwaukee's name. But many of the streets have unique stories as well. Some changes came through convention, some came through history, and some came because urban planners like trying to confuse Polish immigrants.
As mentioned previously, Milwaukee went through many street renaming phases, but the most extensive happened in 1930. Almost every east-west street in East Milwaukee had a different name before the 1930s. Directional indicators were appended to street names (making something like Wisconsin Avenue into East Wisconsin Avenue and West Wisconsin Avenue - or something more fun like changing Aldrich Street into East Bay Street and South Bay Street). Unfortunately, they decided to move the numbered streets as well.
The city of Milwaukee does not have a Main Street (although Brown Deer Road becomes Main Street in Waukesha County). Broadway was formerly Main Street, before the name was changed in 1871.
The only reasonable explanation is that it's the main street to use to get out of the city, right?
Friday, July 31, 2009
Monday, July 27, 2009
Gravity Gone Ballistic
Some of my paternal grandfather's war stories involved his time in the army. The most action-packed involved a time on patrol when a spent round bounced off his upper chest and simply fell to the ground because it had effectively lost all of its kinetic energy. More action-packed than stories of siphoning gasoline out of military vehicles, anyway. Strangely, the story has provoked disagreement among people who doubt its veracity. I can't vouch for the authenticity of the story, but I can vouch for its plausibility.
However, it wasn't enough that people simply believe me on my birthday, so I was forced to break out the W/t of SCIENCE (If I can have a dogion joke I can have a power = energy/time joke). Unfortunately, this led to a point of contention involving the physics of gravity.
Some people are surprised to know that the Aristotelian view of gravity is not true (despite how cool Aristotle is). Heavier objects do not inherently accelerate faster than lighter objects. This belief persists because lighter objects tend to be more buoyant and have more air resistance, and hence drop more slowly on Earth.
Current scientific thinking links gravity with the curvature of spacetime. Unfortunately, quantum mechanics disagrees (but quantum mechanics is like that frizzy-haired uncle no one likes to talk to because he seems to only speak gibberish). Despite all that, most situations still work fine with Newton's "simple" Law of Gravity. Here we also see that laws are meant to be broken, even when they're scientific, since relativity and quantum mechanics have shown that Newton's law does not apply to all possible scenarios. Basically, given a vacuum (to negate air resistance and buoyancy) and objects of negligible mass (relative to a planet), the objects will fall to the ground at the same rate (both would accelerate at approximately 9.8 m/s² on Earth).
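For the skeptics in the back row: the fall time pops straight out of the kinematics, and mass never shows up anywhere. A minimal sketch (my own illustration, using a 10 m drop picked arbitrarily):

```python
import math

G = 9.8  # gravitational acceleration near Earth's surface, m/s^2

def fall_time(height_m: float) -> float:
    """Time for an object to fall height_m meters in a vacuum.

    Solved from h = (1/2) * g * t^2 for t; note that the object's
    mass never appears in the equation.
    """
    return math.sqrt(2 * height_m / G)

# A hammer and a feather dropped from 10 m take exactly the same time:
print(round(fall_time(10.0), 2))  # 1.43 (seconds), for both objects
```

Same function, no mass parameter - that's the whole point.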
Astronauts make everything better, so fortunately the concept was illustrated by our good friend Commander David Scott during the Apollo 15 mission.
Transcript for people without video: "Well, in my left hand I have a feather. In my right hand, a hammer. I guess one of the reasons we got here today was because of a gentleman named Galileo a long time ago, who made a rather significant discovery about falling objects in gravity fields. And we thought, 'Where would be a better place to confirm his findings than on the moon?' And so we thought we'd try it here for you. The feather happens to be appropriately a falcon feather...for our Falcon. And I'll drop the two of 'em here, and - hopefully - they'll hit the ground at the same time. [hammer and feather hit the ground simultaneously] How 'bout that? This proves that mister Galileo was correct in his findings."
Returning to our opening statements: people also tend to believe that horizontal motion negates gravity. If one object is shot horizontally and another is dropped simultaneously from the same height, both will hit the ground at the same time. Gravity's pull is uniform regardless of horizontal motion. This, too, is illustrated by crazy science teachers around the world. Exhibit A:
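The dropped-versus-fired claim falls out of the same kinematics: vertical and horizontal motion are independent, so horizontal speed changes how far the object travels but not when it lands. A quick sketch (heights and speeds are my own made-up numbers, drag ignored):

```python
import math

G = 9.8  # m/s^2

def landing(height_m: float, horizontal_speed: float):
    """Return (fall time, horizontal distance) for a projectile.

    Fall time comes only from the vertical equation h = (1/2)*g*t^2;
    horizontal speed only scales the distance traveled, never the time.
    """
    t = math.sqrt(2 * height_m / G)
    return t, horizontal_speed * t

t_dropped, _ = landing(1.5, 0.0)     # ball dropped from waist height
t_fired, dist = landing(1.5, 300.0)  # round fired horizontally at 300 m/s
print(t_dropped == t_fired)          # True: both hit the ground together
```

The fired round lands about 166 m away, but at the exact same instant as the dropped ball.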
(I apologize in advance for the lack of more astronauts)
...There is no Exhibit B.
Edit for 2013-02-06: There is an Exhibit B; I've since become aware that the Mythbusters have also performed this experiment using actual firearms, with a result well within the margin of error (basically the only improvements you could ask for are better timing on the drop/shot simultaneity and doing it in a vacuum).
Randomly teaching people about gravity since 2003; just another reason not to follow me when I walk home from school.
Wednesday, June 17, 2009
This Old House on 1 That Street
I generally don't drive, which often means that my knowledge of specific street names is limited to my immediate vicinity. My navigation relies much more on landmarks and directions, since I don't bother looking for street signs (I generally don't see them, anyway). But recently I've needed to accustom myself to the lay of the streets a bit more.
So if you're from Milwaukee I'll ask you this: do you know about where in Milwaukee this house was from?
Here's a hint: its address was 519 Astor Street. Not helpful? It's from near the intersection of Biddle Street and Astor Street. Still not helpful? That's because all of Milwaukee's urban planners aspire to be Hunter S. Thompson. The house's description may prove useful in figuring out this problem.
Milwaukee has gone through a series of street name changes, but the largest happened in 1930. Some streets were renamed, but most simply had a direction appended to them. Astor Street became North Astor Street. In addition, people couldn't pick whatever wacky address number they wanted (do you really want to live on 1 Bay Street next to 1 Aldrich Street?). Numbers were assigned according to a property's frontage (a new number every 15 feet; each new block increased the number by 100). This house's address changed from 519 Astor Street to 913 North Astor Street.
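If you like the renumbering rule as arithmetic, here's a toy sketch. This is my own hypothetical simplification of the scheme - it ignores odd/even street sides, baselines, and every other real-world complication the city surveyors dealt with:

```python
def address_number(block_index: int, feet_from_block_start: float) -> int:
    """Hypothetical sketch of the 1930 frontage-numbering rule.

    Assumptions (mine, not the city's official formula): each block
    starts a fresh hundred, and every 15 feet of frontage advances
    the address number by one.
    """
    base = block_index * 100                   # each block bumps the hundreds
    offset = int(feet_from_block_start // 15)  # one number per 15' of frontage
    return base + offset

# A lot 195 feet into the ninth block from the baseline:
print(address_number(9, 195))  # 913
```

Which is at least consistent with a house landing at 913 North Astor Street, even if the real surveyors had far more edge cases to argue about.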
Unfortunately, even having an address number and street name doesn't help us with this house. Keen visitors who click on my fancy links will already know the other problem. This house neighbored the intersection of Biddle and Astor - the problem being that Biddle Street became Kilbourn Avenue when Cedar and Biddle were widened and connected by a bridge over the Milwaukee River. The intersection of Astor and Biddle now encompasses the intersection of Kilbourn, Prospect and Astor.
You know what will teach you streets pretty quick? Poring over fire insurance maps of a city. Unfortunately, half those streets may not even exist anymore - like Biddle Street. The Sanborn Fire Insurance Map Company (talk about knowing your target market) created detailed, scale maps of thousands of American cities between 1867 and 1970. You can view black-and-white versions of these maps for Milwaukee online, but you may want the key.
Physically they're giant bound volumes about two feet square. With a scale of 1 inch to 50 feet, this means one page can show about 1000 feet on each side. Milwaukee is a rather large city, and the maps from 1910-1926 come in 6 volumes (about 20 giant bound books of maps). New volumes were ordered for expansions of the city limits, but the index map remains the same (which makes finding page numbers for modern streets not listed on the index that much more fun).
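The page-coverage arithmetic checks out, by the way (the margin allowance is my guess, not Sanborn's spec):

```python
SCALE_FT_PER_IN = 50  # Sanborn scale: 1 inch = 50 feet
PAGE_SIDE_IN = 24     # a roughly two-foot-square page

# Ground distance covered by one page side, before margins and legends:
coverage_ft = PAGE_SIDE_IN * SCALE_FT_PER_IN
print(coverage_ft)  # 1200 feet; call it ~1000 once the borders eat their share
```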
Aside from being unwieldy in size, the individual pages are a bit strange. North is not necessarily at the top of the page; each page has its own compass rose to denote which way is north. Unlike the online versions mentioned above, the actual maps are color-coded to match the key. Each color indicates a specific building material (and therefore a building's status as a potential fire hazard). And finally, in order to provide up-to-date fire hazard information, the company supplied revised drawings of buildings and streets that could be pasted into the volume on hand. This kept the maps current, but isn't so helpful when you're looking for an older demolished house, since it's bound to be under 3 layers of pasted revisions. In older versions of the map the presence of gas and electrical lines is often marked as well (if I had been around in 1910 I'd get an electrical line to my house just so some lazy surveyor has to pencil in "Electric Line" on some giant map).
So where was that house from? It was located in downtown Milwaukee, where Kilbourn Avenue starts and the Regency House Condos now stand.
And maybe it's still there, like some sort of Morlock house.
Friday, May 29, 2009
Collection of Curiosities - Cynic's Edition
Hoaxes, practical jokes and confidence schemes have featured in entertainment (and the art of separating people from money) for hundreds of years. Despite massive amounts of cynicism present in today's society, hoaxes still manage to catch quite a few people, but only a select few have impacted entire nations and imprinted themselves on the public psyche.
Of Mars and Men
Substantially famous already, the War of the Worlds hoax isn't actually so much a practical joke or a hoax as it is a testament to people's tendency to jump to conclusions. Taking place on October 30th, 1938 (and lingering on for a few people afterward), the hoax generated a panic among part of the population fearing a Martian invasion.
The War of the Worlds hoax was a radio adaptation of H. G. Wells' The War of the Worlds directed by Orson Welles. The broadcast followed a format of "breaking news" bulletins interrupting a performance by an orchestra. Each of the bulletins provided the audience with updates that followed the plot of The War of the Worlds - essentially a Martian invasion. Subsequently, portions of the public panicked at the thought that the invasion was real.
You know how some people miss the first five minutes of television shows or movies? Well, most television shows produced today are designed so that audiences can watch them without paying close attention. After all, viewers may be making dinner or folding clothes and they might not be pouring all their energy into watching. This means that many shows (particularly sit-coms) restate the plot or update viewers quite frequently. Unfortunately for listeners to the Welles broadcast, many missed the disclaimer introduction.
The broadcast also functions as a testament to some people's short attention spans. The story eventually switches to one man's narrative of his attempts at survival. This makes it readily apparent that the broadcast is not real...but by that time people had run screaming from their radio sets and were busy stockpiling supplies before the impending Martian invasion force arrived.
You can relive the mystical aura of a panic-stricken 1930s by listening to the broadcast here or here. Just like when you're watching TV, remember to skip the first few minutes.
Nothing for the Spaghetti Weevils
Did you know spaghetti grows on trees? Of course you don't... because it doesn't. That didn't stop the British show Panorama from broadcasting a fake documentary on April 1st, 1957 showcasing the bountiful Swiss spaghetti crop. Panorama happened to be one of the most trusted news sources on British television, so the hoax managed to get two types of responses: people who loved the joke and people who wanted to know how to grow their own spaghetti trees. This was the first - and only - time that the news program decided to air an April Fools joke. This is the benefit of having a cameraman who knows a good practical joke.
Through the glory of the internet you can pretend to be British in the 50s by watching the video here.
Pasta has become so ubiquitous now that everyone knows its secret ingredient: flour. Of course, you can add things like salt and eggs to the mixture, but pasta is essentially flour formed into fancy shapes. Unfortunately for much of the British population in the 50s, who had suffered rationing throughout the 40s, pasta was still considered something of a delicacy and remained relatively rare (I'm guessing their college students must have subsisted off some sort of Dickensian gruel, instead).
As a side benefit, the hoax documentary created and exterminated the adorable spaghetti weevil.
Soaking in Cynicism
Have you heard of the dangerous chemical dihydrogen monoxide that's responsible for thousands of deaths per year? I'm sure you have, but it was probably called something different, like hydrogen hydroxide or hydroxic acid. Or maybe water.
Developed in the 1990s, this hoax plays on a lack of scientific knowledge. Usually the hoax pops up in the form of a petition banning dihydrogen monoxide (DHMO), with a spokesperson listing off many of the dangerous-sounding aspects of water to persuade someone to sign. At face value all of this information is true, but used in an exaggerated manner (after all, thousands of people die every year from drowning). The hoax even has its own pseudo-advocates with a website listing off the dangers of DHMO.
This hoax ends up a source of amusement for chemists and a source of embarrassment for politicians. The hoax relies on exaggeration and a listener's lack of specific knowledge (or attention). It tells us that pretentious language and specialized jargon can often be used to circumvent people's logic and reasoning. Unfortunately it also showcases people's willingness to make uninformed decisions. Who would ban water? Plenty of people, if they don't know it's water.
Also lava monsters.
Many people might believe that modern hypercynicism would prevent hoaxes from even gaining a foothold anymore, but the case of DHMO shows that people as a whole are as gullible and misinformed as ever. After all, there's more information now than ever before; how do you know whom not to trust? Unfortunately this results in more work for individuals, because it takes even more effort to form factual, informed decisions... many people often don't bother.
The predominant forms of the hoax are no longer April Fools' Day jokes and emails that can be disproved with a single Google search. They rely more on confidence schemes and human fallibility. Even in the digital age, con artists still rely on surreptitiously gaining information directly from people more often than on brute-force cracking of electronic information. For less criminal misinformation there's plenty of help around. The website Snopes exists to discredit modern hoaxes and urban legends which manage to find their way into chain mailings and conversations.
At least we don't need to worry about spaghetti weevils. Or do we!?
Friday, April 17, 2009
The Low High Brow
Because I've got exciting papers and projects and presentations due in ever-increasing amounts in April and May, there's been a lull in the quality of writing at the site over here (that is... there hasn't been any). In order to satiate my swooning audiences who desire only the best quality writing and humor, I've decided to release one of my rare (i.e. only) highly-prized, hand-drawn, meticulously-inked comics.
...That is, hand-drawn in MS-Paint with copious amounts of Photoshop blur effects and paint bucket fills.
That's right, I drew croissants instead of crumpets on the plate, what of it?
Thursday, March 5, 2009
Arabian Golf
The Persian Gulf has been a hotspot of contention ever since some Sumerians decided they wanted to live next to each other a few thousand years ago. Recently, Arab-dominated lands have referred to the body of water as the Arabian Gulf, which has led to a vehement outcry among Iranians (or Persians, for anyone alive before 1935). This nationalism has evolved to the extent that Iran now has a Persian Gulf Day (on April 29th, in case you planned on taking the day off). You might also notice the rather undiplomatic language that seems to permeate Iranian literature on the subject. To their credit, the UN and some random guy at MIT (someone in Iranian Studies, anyway) have determined that Persian Gulf (or variations thereof) has functioned as the de facto name for the gulf in European circles for centuries and should stay that way. I'm not really sure where the Arabian prompt to change the name is coming from - they have a perfectly fine Red Sea to the west that could do with a spruced-up name. Maybe they're hoping the next war in the area will be a more eponymous Arabian Gulf War instead of a Persian Gulf War.
Now, my History 104 course with Professor Wick also featured a bit of discussion on the popular gulf (he also writes a mean introduction to the History of the Peloponnesian War by Thucydides). One of my favorite professors thanks to his mixture of immense topical knowledge and dry wit, he gave lectures that provided an exceptional historical background for future learning and critical thinking. When covering ancient civilizations in Mesopotamia, the Persian Gulf featured in the discussion to a fair extent. (Un)fortunately, we didn't cover any sort of historiography on the subject of the gulf's geography.
The depth of the modern gulf does not exceed 90 m, which is convenient, since sea levels rose by about 90 m when the glaciers from the last ice age melted. The Persian Gulf of the time was likely a fertile valley, but there was no recorded history at that point. The Sumerian great flood (an early analogue to the Biblical tale of Noah) is likely unrelated to the inundation (or Deluge, if you're still going all Biblical on me) of the Persian Gulf. The gulf sits at the collision zone of the Eurasian and Arabian tectonic plates, which still periodically undergo tectonic activity related to orogeny (that is: mountain upheaval, usually accompanied by subsidence somewhere else).
For quite a while the historical coastline of the Persian Gulf was believed to have been about 200 kilometers to the northwest of its present position. An archeological geologist named Jacques de Morgan theorized around 1900 that the Persian Gulf had slowly been filling in with sediment deposited by the Tigris, Euphrates, and Karun rivers. He suggested that the Tigris and Euphrates emptied into the gulf without forming a confluence (the Shatt al-Arab estuary (or the Avrandrud, if you're Persian - not to be confused with the Evinrud)), and that the Karun river's sediment formed a series of shoals, which eventually built up into the modern shoreline. Through an in-depth survey of archeological sites in Mesopotamia, de Morgan hypothesized that the coast of the Persian Gulf would have been near Baghdad in the 4th millennium BC... Never mind that his use of historical sites relied on his own survey of historical voyages whose points of origin we still don't definitively know (turns out you can say you've found anything if no one knows the actual location).
There are a few problems with de Morgan's assertion (besides his mélange of potentially made-up historical sites). Much of the rock in the area appears to be from freshwater sediment. There's also the problem of Lake Hammar in southeastern Iraq which miraculously hasn't really filled with sediment and wasn't there six thousand years ago. While sedimentary accretion is an accepted geological phenomenon, de Morgan was missing a few important bits of information (mostly the geology of his archeological geology).
The predominant theory behind the coastline of the Persian Gulf seems to still be Lees and Falcon's subsidence theory. With their fancy use of geological sampling, they hypothesized that the Persian Gulf had intermittently undergone (and continues to undergo) subsidence, which counteracts the silt deposits to a great extent. Iraq collides with Persia, building mountains, but the creation of mountains requires a complementary subsidence zone. So, the story of Noah had it wrong: the land wasn't being flooded by water, the water was being flooded by land (...and was slowly sinking to cover it up, like some geologist-fantasized episode of CSI). Sure, Noah's flood is supposed to be from rain and rivers overflowing, but you can't have quality jokes and accuracy - what do you think this is, the Daily Show?
Through the use of aerial photography and charts from the 1800s, Lees and Falcon determined that the primary coastal change was a migration of the Shatt al-Arab's output further to the northeast. Subsidence and silt deposits have resulted in some ancient sites buried under a substantial depth of sediment (and occasionally water) as the shoreline meanders northeast. Or maybe Sumerians were just subterranean, Tolkien-esque dwarves with gills. So maybe in a few millennia the main river outlet into the Persian Gulf will be in Iran and we can avoid squabbles about preferred geographic names. Or half of the region will be buried under ten meters of silt, and everyone will turn into Morlocks. Either way I see a great future for the science fiction community and historians. I preemptively dub it historical futuristic science fiction.
Now we just need to work on renaming Lake Michigan to Lake Wisconsin.
Thursday, February 19, 2009
Primer up to Primers
The modern concept of an arms race is almost exclusively tied to the proliferation of nuclear weapons between the USA and USSR and the attempts at building bigger, longer-ranged weaponry. Or to anyone who's ever played Civilization: racing Gandhi to nuclear weapons before he destroys the world. But like the nuclear arms race (and the related Red Queen hypothesis), history is populated with arms races. Just like the light bulb or personal computer (or even modern firearms), inventions stand on the innovation that came before them.
Firearms constitute a prime example of a technological race. Now, through the miracles of etymology, firearms might be either "flaming arms" or "arms that make use of fire". Although early firearms may have led to plenty of conflagrated limbs (and torsos), the concept of arms here ties in with weapons (Old French armes from Latin arma (weapons)) and not arms (Old English earm from Latin armus (shoulder/upper arm)) - both of which are from the Indo-European root ar- (to fit or join). Firearms have three separate components that influence their effectiveness: the gunpowder, the bullet, and the design. These three factors facilitate faster firing rates, more range, cheaper manufacturing, more mobility, and easier use - all of which have been desired since firearms were invented.
General consensus lands the discovery of gunpowder sometime in the 800s in China. The use of recognizable guns in China dates to the 1100s (occasionally Arabic scholars argue this point). This often leads to the misconception that early gunpowder use focused on fireworks or the idea that the peace-loving Chinese couldn't find a use for it in wartime. The advent of gunpowder coincides with the downfall of the Tang dynasty (not to be confused with the Tang dynasty - easy way to differentiate them: only one is "orangey") and the emergence of the war-filled Wudai and Shiguo (Five Dynasties and Ten Kingdoms - wu (五) being five, shi (十) being ten) period and the warfare that accompanies more than ten states in an area less than half the size of modern China. Recognizable firearms developed during the subsequent Song dynasty (not to be confused with the, uh...song dynasty?), but despite the empire's relative stability, they weren't any strangers to war either. Turns out Chinese people were all about burninating the countryside.
Early gunpowder in all spheres of influence around the world (Chinese, Arabian, and European) lacked explosive force. Gunpowder functioned more as an incendiary in the early years of its adoption in each area. Before the arrival of black powder to European armories, naphtha (think pitch or oil) occasionally filled this role. Soldiers filled small ceramic or glass pots with the flammable liquid. When lit and thrown, the pot would break and the liquid would splash out and engulf an area in fire. Greek fire (or liquid fire) occasionally pops up alongside naphtha and pitch, but writers and transcribers often used the term carelessly (...let me paraphrase: "Holy crap, guys - we just got our asses handed to us by those Saracens. They totally used Greek fire on us."). The primary users of Greek fire consisted of the Byzantines (Greeks to the rest of Europe) and the myriad Arabian states around the Mediterranean.
Ingredients and Processing
Early incendiary mixtures consisted primarily of sulfur and charcoal. These mixtures wouldn't provide any explosive punch, but worked well enough for burning things (I'll avoid linking non-Trogdor again...for now). The third essential ingredient of medieval gunpowder was saltpeter (or potassium nitrate, KNO3). Mined from areas of China and India, Chinese alchemists had relatively easy access to the chemical. In Europe, saltpeter remained elusive until alchemists uncovered suitable amounts of the chemical in a more obscure form. The initial source for saltpeter in Europe came from bat caves in the form of guano. Disgusting you say? The larger-scale (far more odoriferous) production of saltpeter inevitably required copious amounts of aged urine (or by LeConte's recommendation: dung-water) or manure.
Here's another fun fact: saltpeter often functioned as a food preservative in the middle ages. Hopefully you're not reading this during lunch. The historical reenactors among you may be happy to know that the modern production of black powder does not rely on urine.
The basics of effective black powder had finally arrived by the early 14th century. Recipes proliferated, offering a variety of additives and proportions. The theoretically most effective ratio is about 75% saltpeter, 10% sulfur, and 15% charcoal; however, medieval chemists tended to use far less saltpeter (the hardest component to produce). Recipes usually took the form of basic ratios, such as 7 parts nitre, 5 parts brimstone, and 5 parts charcoal. The standardization of black powder in a form close to its theoretical explosive limit didn't occur until the late 18th century.
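To see just how saltpeter-starved the medieval recipes were, you can convert a "parts" recipe into percentages and set it next to the near-optimal modern ratio. This is a quick illustrative sketch (the helper function name is mine, not anything from a period source):

```python
# Convert a "parts" recipe into weight percentages so it can be
# compared against the near-optimal modern ratio (~75/10/15).

def parts_to_percent(parts):
    """Turn {ingredient: parts} into {ingredient: percent of total}."""
    total = sum(parts.values())
    return {name: round(100 * qty / total, 1) for name, qty in parts.items()}

# A typical medieval recipe: 7 parts nitre, 5 brimstone, 5 charcoal.
medieval = parts_to_percent({"saltpeter": 7, "sulfur": 5, "charcoal": 5})
modern = {"saltpeter": 75.0, "sulfur": 10.0, "charcoal": 15.0}

# The 7:5:5 mix works out to only ~41% saltpeter - barely half of
# the ~75% that makes black powder a proper explosive.
print(medieval)  # {'saltpeter': 41.2, 'sulfur': 29.4, 'charcoal': 29.4}
```

The gap between ~41% and ~75% saltpeter is the difference between powder that burns enthusiastically and powder that actually detonates, which is why the cheap-to-make 7:5:5 style recipes lingered for centuries.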
Beyond the evolution of the formulas, methods of transporting and mixing the constituent components developed as well. Engineers discovered that mixing the saltpeter into the sulfur and charcoal just before firing resulted in a more reliable explosive force (although it tended to produce a lot of powder dust which was prone to ...exploding). Gunpowder was often milled as a function of the mixing process, providing relatively consistent powder (compared to mortar-and-pestle mixing, anyway).
One of those major lightbulb-esque developments came about in the 1420s with the advent of corning. Also known as granulating, corning is the process of wetting the gunpowder and forming it into kernels or grains. Initial liquids for corning were spirits, vinegar, and - that old medieval standby - urine (apparently urine was medieval duct tape). Turns out water works best. It helps the saltpeter fill in the pores of the charcoal, allowing for a very consistent and more powerful propellant (it's also mentioned in Timeline, in case you're interested). Subsequent developments focused on standardizing grain size and providing additives to stabilize the powder (like the addition of graphite to avoid static discharges setting off the powder - if you go back in time bring a pencil).
Materials and Shapes
I would be remiss if I didn't point out that gunpowder developed into a general purpose explosive. Arriving sometime after 1250, firearms developed relatively quickly into projectile weapons but saw other uses as well. Black powder was used in civil engineering (mining, canal building, etc.), but proved especially dangerous due to the inconsistency of the powder and the lack of reliable fuses. Besiegers also used the explosive mixture to great effect. The most famous of these for us now are petards, which come to us with the phrase "he was hoisted by his own petard" as in "he was foiled by his own plan". But, like our contemporary association with the word firearms, the most common weapons to make use of fire were guns.
European ventures into projectile firearms began with cannons. Walter de Milemete's illustration in De Nobilitatibus, Sapientii, Et Prudentiis Regum (On the nobility, wisdom, and prudence of kings) features the first picture of a European cannon (on the left). This type of cannon or gonne was referred to as a vaso (ingeniously, Italian for vase). It is unlikely that de Milemete had actually seen the cannon fired at this point due to the ambiguity of the gun's carriage (despite how stable putting a cannon on a sawhorse may be).
During the early centuries of gunpowder use, siege engineers favored cast bronze cannons, while leaders preferred forged or cast iron guns for their economical price. Metallurgical processes of the period meant that it was easier to cast bronze (or the much more expensive brass) than iron. The pliability of bronze also made it easier to notice when a bronze cannon had undergone too much stress, thanks to the large bulge that would form; iron cannons tended to just explode due to their brittleness (which tends to be the reason why operators preferred bronze guns). Although iron is generally the stronger metal, its metallurgy and refining processes lagged behind bronze. The cannon in de Milemete's depiction is a cast piece, probably of bronze or brass (judging by the color). Forged iron cannons consisted of a tube (often of wood covered in metal slats) held together by rings of iron. This led to the idea of a gun barrel, since the weapons originally resembled barrels (a cylinder of wood slats held together by iron rings).
The Part That's Supposed to Hurt People
Projectiles came in a variety of shapes and sizes. De Milemete's depicted cannon fired a dart (or shortened arrow). Early ammunition often came from rock, something that tended to be rather plentiful. Round bullets or shot became the preferred standard (as round as you can make a rock, anyway). As gun calibers became standardized, metal ammunition began to readily replace stone. Shot, darts, and bullets all had separate tactical applications and saw use as their production processes became more refined. This is the part where the euphemism of a cannon as a big hard tube with balls comes in.
Until the advent of the cartridge and primer, the vast majority of firearms were muzzle-loaded (loaded down the barrel and then rammed into position). Only very small cannon tended to be breech-loaded (loaded into the rear of the weapon, right into the firing position). Until machining caught up, these required a removable chamber held in place by a wedge to make them nearly air-tight for firing.
Bullet makers (generally lead-casting metalsmiths) spent time perfecting the art of shot dropping - the act of creating spherical rounds for use in firearms. Molten lead would be dropped from a tall tower through a sieve so that it could cool into a nearly perfect sphere as it fell, which resulted in a vast assortment of shot towers dotting the countryside. This didn't work quite as well for artillery rounds, which needed to be cast and hand corrected. Spherical rounds remained the standard until the advent of inexpensive manufacturing processes for conical bullets such as the Minié ball in the early 1800s (never mind that something called a Minié ball wasn't a sphere). The combination of a rifled gun barrel and a more aerodynamic bullet provided greater accuracy and a more damaging impact.
Hand-held Firearms and Locks
Hand-held firearms took off when someone decided to make a cannon small enough to be held by one person (one crazy person - these things often exploded when firing, after all). So it's no surprise that early handguns (or handgonnes or whichever phonetic spelling you prefer) looked like miniature cannons. Like their cannon-y counterparts, these weapons required their operators to insert a charge of powder and ammunition, and then light the whole thing off with an open flame. The flame would ignite the priming powder (held in a small receptacle called a flash pan), whose flame would travel through a touch hole and fire the weapon.
Turns out even people who ran at each other with big knives thought this was dangerous. In early artillery and handcannons, a linstock (essentially a big fork) held the match so that the weapon operator could try not to die when he fired his weapon. Eventually safer and more useful firing mechanisms (or locks) developed.
The first of these - the matchlock - appeared in the mid-15th century, and essentially clamped a burning wick down onto the flash pan when a lever was pulled (igniting the priming powder, which flashed through the touch hole and so on). The first lever, or trigger, came in the form of an S-shaped piece of metal called a serpentine (because an S is always a snake). The matchlock remained the primary firing mechanism during the early years of gunpowder, when arquebusiers ran around the battlefields of Europe with arquebuses. Well, without the running part, anyway.
Early firearms like the arquebus and the later, heavier musket often required soldiers to rest their weapon on a window sill or a Y-shaped fork in order to aim. The word arquebus, like its shotgun-like partner the blunderbuss, comes from Dutch. Arquebus - and its counterparts harquebus, hackbut, hagbut and the like - comes from the Old Dutch hākebusse and German Hakenbüchse, or hook gun (due to hooks that were originally cast onto the barrel so that it could connect to the Y-shaped firing stand). While blunder may be an appropriate word for a weapon prone to blowing up in the operator's hand, it was most likely named for its loud, thunderous report (so, thunder gun).
The addition of these intricate parts encouraged developments in the shape of the weapon into something we could reasonably call a gun today. And because running around with a flaming wick was considered dangerous, development towards safer and more reliable locks proceeded.
The next advancement in the early 1500s, called the wheellock (or German lock), allowed the operator to carry the weapon loaded and fire it without an open flame dangling about. A piece of fool's gold (or iron pyrite, FeS2) was clamped in a spring-loaded arm; pulling the trigger snapped the arm down, pushed the flash pan's cover out of the way, and pressed the pyrite against a rotating steel wheel, grinding off sparks to ignite the priming powder. The mechanism required the operator to fire the weapon gangsta-style so that the sparks would actually ignite the powder in the flash pan (...and because medieval soldiers were gangsta, yo). It also took longer to fire than simply touching a match to the flash pan, but avoiding exploding oneself is probably preferable.
Further advancement brought us the familiar firestarting trick of snapping flint against steel. Coincidentally enough, this type of firing mechanism is referred to as a snaplock. The snaplock functioned much like the wheellock, except the flash pan had to be opened manually (and it was cheaper to produce, with fewer moving parts). The flintlock (or French lock (or even English lock, depending on who's doing the shooting)) solved this problem by combining the flash pan cover and the steel target for the flint into one simple L-shaped piece of steel called a frizzen. The simplicity of the flintlock led to its dominance in weapon manufacturing for over two centuries, until the widespread adoption of percussion caps and primers in the 19th century.
Firearms: proving the versatility of Urine™ since 1326.
Firearms constitute a prime example of a technological race. Now, through the miracles of etymology, firearms might be either "flaming arms" or "arms that make use of fire". Although early firearms may have lead to plenty of conflagrated limbs (and torsos), the concept of arms here ties in with weapons (Old French armes from Latin arma (weapons)) and not arms (Old English earm from Latin armus (shoulder/upper arm)) - both of which are from the Indo-European root of ar- (too fit or join). Firearms have three separate components that influence their effectiveness: the gunpowder, the bullet, and the design. These three factors facilitate faster firing rates, more range, cheaper manufacturing, more mobility, and easier use - all of which were desired since firearms were invented.
General consensus lands the discovery of gunpowder sometime in the 800s in China. The use of recognizable guns in China dates to the 1100s (occasionally Arabic scholars argue this point). This often leads to the misconception that early gunpowder use focused on fireworks or the idea that the peace-loving Chinese couldn't find a use for it in wartime. The advent of gunpowder coincides with the downfall of the Tang dynasty (not to be confused with the Tang dynasty - easy way to differentiate them: only one is "orangey") and the emergence of the war-filled Wudai and Shiguo (Five Dynasties and Ten Kingdoms - wu (五) being five, shi (十) being ten) period and the warfare that accompanies more than ten states in an area less than half the size of modern China. Recognizable firearms developed during the subsequent Song dynasty (not to be confused with the, uh...song dynasty?), but despite the empire's relative stability, they weren't any strangers to war either. Turns out Chinese people were all about burninating the countryside.
Early gunpowder in all spheres of influence around the world (Chinese, Arabian, and European) lacked explosive force. Gunpowder functioned more as an incendiary in the early years of its adoption in each area. Before the arrival of black powder to European armories, naphtha (think pitch or oil) occasionally filled this role. Soldiers filled small ceramic or glass pots with the flammable liquid. When lit and thrown, the pot would break and the liquid would splash out and engulf an area in fire. Greek fire (or liquid fire) occasionally pops up alongside naphtha and pitch, but writers and transcribers often used the term carelessly (...let me paraphrase: "Holy crap, guys - we just got our asses handed to us by those Saracens. They totally used Greek fire on us."). The primary users of Greek fire consisted of the Byzantines (Greeks to the rest of Europe) and the myriad Arabian states around the Mediterranean.
Ingredients and Processing
Early incendiary mixtures consisted primarily of sulfur and charcoal. These mixtures wouldn't provide any explosive punch, but worked well enough for burning things (I'll avoid linking non-Trogdor again...for now). The third essential ingredient of medieval gunpowder was saltpeter (or potassium nitrate, KNO3). Mined from areas of China and India, Chinese alchemists had relatively easy access to the chemical. In Europe, saltpeter remained elusive until alchemists uncovered suitable amounts of the chemical in a more obscure form. The initial source for saltpeter in Europe came from
Here's another fun fact: saltpeter often functioned as a food preservative in the middle ages. Hopefully you're not reading this during lunch. The historical reenactors among you may be happy to know that the modern production of black powder does not rely on urine.
The basics of effective black powder had finally arrived by the early 14th century. Recipes proliferated, offering a variety of additives and proportions. The theoretically most effective ratio nears 75% saltpeter, 12% sulfur and 13% charcoal, however medieval chemists tended to use far less saltpeter (the hardest component to produce). Recipes usually took a form of basic ratios, such as 7 parts nitre, 5 part brimstone and 5 part charcoal. The standardization of black powder in a form close to its theoretical explosive limit didn't occur until the late 18th century.
Beyond the evolution of the formulas, methods of transporting and mixing the constituent components developed as well. Engineers discovered that mixing the saltpeter into the sulfur and charcoal just before firing resulted in a more reliable explosive force (although it tended to produce a lot of powder dust which was prone to ...exploding). Gunpowder was often milled as a function of the mixing process, providing relatively consistent powder (compared to mortar-and-pestle mixing, anyway).
One of those major lightbulb-esque developments came about in the 1420s with the advent of corning. Also known as granulating, corning is the process of wetting the gunpowder and forming it into kernels or grains. Initial liquids for corning were spirits, vinegar, and - that old nedieval standby - urine (apparently urine was medieval duct tape). Turns out water works best. It helps the saltpeter fill in the pores of the charcoal, allowing for a very consistent and more powerful propellant (it's also mentioned in Timeline, in case you're interested). Subsequent developments focused on standardizing grain size and providing additives to stabilize the powder (like the addition of graphite to avoid static discharges setting off the powder - if you go back in time bring a pencil).
Materials and Shapes
I would be remiss if I didn't point out that gunpowder developed into a general purpose explosive. Arriving sometime after 1250, firearms developed relatively quickly into projectile weapons but saw other uses as well. Black powder was used in civil engineering (mining, canal building, etc.), but proved especially dangerous due to the inconsistency of the powder and the lack of reliable fuses. Besiegers also used the explosive mixture to great effect. The most famous of these for us now are petards, which come to us with the phrase "he was hoisted by his own petard" as in "he was foiled by his own plan". But, like our contemporary association with the word firearms, the most common weapons to make use of fire were guns.
European ventures into projectile firearms began with cannons. Walter de Milemete's illustration in De Nobilitatibus, Sapientii, Et Prudentiis Regum (On the nobility, wisdom, and prudence of kings) features the first picture of a European cannon (on the left). This type of cannon and gonne were referred to as a vaso (ingeniously, Italian for vase). It is unlikely that de Milemete had actually seen the cannon fired at this point due to the ambiguity of the gun's carriage (despite how stable putting a cannon on a sawhorse may be).
During the early centuries of gunpowder use, siege engineers favored cast bronze cannons. Leaders preferred forged or cast iron guns for their economical price. Metallurgical processes of the period meant that it was easier to cast bronze (or much more expensive brass), than iron. The pliability of bronze also made it easier to notice when a bronze cannon had undergone too much stress due to a large bulge that would form. Iron cannons tended to just explode due to their brittleness (this tends to be the reason why the operators preferred bronze guns). Although generally stronger as a metal, iron metallurgy and refining processes slacked behind bronze. The picture in de Milemete's depiction is a cast piece, probably of bronze or brass (due to the color). Forged iron cannon consisted of a tube (often of wood covered in metal slats) held together by rings of iron. This lead to the idea of a gun barrel since the weapons originally resembled barrels (a cylinder of wood slats held together by iron rings).
The Part That's Supposed to Hurt People
Projectiles came in a variety of shapes and sizes. De Milemete's depicted cannon fired a dart (or shortened arrow). Early ammunition often came from rock, something that tended to be rather plentiful. Round bullets or shot became the preferred standard (as round as you can make a rock, anyway). As gun calibers became standardized, metal ammunition began to readily replace stone. Shot, darts, and bullets all had separate tactical applications and saw use as their production processes became more refined. This is the part where the euphemism of a cannon as a big hard tube with balls comes in.
Until the advent of the cartridge and primer, the vast majority of firearms were muzzle-loaded (loaded down the barrel and then rammed into position). Only very small cannon tended to be breech-loaded (loaded into the rear of the weapon, right into the firing position). Until machining caught up, these required a removable chamber held in place by a wedge to make them nearly air-tight for firing.
Bullet makers (generally lead casting metal smiths) spent time perfecting the art of shot dropping - the act creating spherical rounds for use in firearms. Molten lead would be dropped from the towers through a sieve so that it could cool into a nearly perfect sphere as it fell. This resulted in a vast assortment of shot towers dotting the countryside. This didn't work quite as well for artillery rounds, which needed to be cast and hand corrected. Spherical rounds remained the standard until the advent of inexpensive manufacturing processes for conical bullets such as the Minié ball in the early 1800s (never mind that something called a Minié ball wasn't a sphere). The combination of a rifled gun barrel and a more aerodynamic bullet provided greater accuracy and a more damaging impact.
Hand-held Firearms and Locks
Hand-held firearms took off when someone decided to make a cannon small enough to be held by one person (one crazy person, these things often exploded when firing after all). So it's no surprise that early handguns (or handgonnes or whichever phonetic spelling you prefer) looked like miniature cannons. Like their cannon-y counterparts, these weapons required their operators to insert a charge of powder, ammunition, and then light the whole thing off with an open flame. The flame would ignite the priming powder (held in a small receptacle called a flash pan), whose flame would travel through a touch hole and fire the weapon.
Turns out even people who ran at each other with big knives thought this was dangerous. In early artillery and handcannons, a linstock (essentially a big fork) held the match so that the weapon operator could try not to die when he fired his weapon. Eventually safer and more useful firing mechanisms (or locks) developed.
The first of these - the matchlock - appeared in the mid-15th century, and was essentially a burning wick that clamped down when a lever was pulled (igniting the flash pan, going through the touch hole and so on). The first lever, or trigger, came in the form of an S-shaped piece of metal called a serpentine (because an S is always a snake). The matchlock remained the primary firing mechanism during the early years of gunpowder, when arquebusiers ran around the battlefields of Europe with arquebuses. Well, without the running part, anyway.
Early firearms like the arquebus and the later, heavier musket often required soldiers to rest their weapon on a window sill or a Y-shaped fork in order to aim. The word arquebus, like its shotgun-like partner the blunderbuss, comes from Dutch. Arquebus - and its counterparts harquebus, hackbut, hagbut and the like - comes from the Middle Dutch hakebusse and German Hakenbüchse, or "hook gun" (due to hooks that were originally cast onto the barrel so that it could connect to the Y-shaped firing stand). While blunder may be an appropriate word for a weapon prone to blowing up in the operator's hand, it was most likely named for its loud, thunderous report (so, "thunder gun").
The addition of these intricate parts encouraged developments in the shape of the weapon into something we could reasonably call a gun today. And because running around with a flaming wick was considered dangerous, development towards safer and more reliable locks proceeded.
The next advancement, in the early 1500s, was called the wheellock (or German lock), and it allowed the operator to carry the weapon loaded and fire it without an open flame dangling about. A piece of fool's gold (iron pyrite, FeS2) was held in a spring-loaded arm pressed against a serrated steel wheel; pulling the trigger spun the wheel against the pyrite - much like a modern lighter - throwing sparks into the flash pan as its cover slid out of the way. The mechanism required the operator to fire the weapon gangsta-style so that the sparks would actually ignite the powder in the flash pan (...and because medieval soldiers were gangsta, yo). It also took longer to fire than simply touching a match to the flash pan, but avoiding exploding oneself is probably preferable.
Further advancement brought us the familiar firestarting trick of snapping flint against steel. Coincidentally enough, this type of firing mechanism is referred to as a snaplock. The snaplock functioned much like the wheellock, except the flash pan had to be opened manually (though it was cheaper to produce, with fewer moving parts). The flintlock (or French lock (or even English lock, depending on who's doing the shooting)) solved this problem by combining the flash pan cover and the steel target for the flint into one simple L-shaped piece of steel called a frizzen. The simplicity of the flintlock led to its dominance in weapon manufacturing for over two centuries, until the spread of percussion caps and primers in the mid-1800s.
Firearms: proving the versatility of Urine™ since 1326.
Tuesday, January 13, 2009
Subways Without Sandwiches
Public transportation in the United States leaves much to be desired. Due to a confluence of factors, the US has a much higher reliance on personal motor vehicles than other nations. This is unfortunate not only because mass transit can be efficient and more environmentally friendly than congested motor vehicle traffic, but because mass transit networks often feature interesting infrastructure and engaging station designs. For me, subway systems are the pinnacle of quality mass transit, but there are many options available to the inquisitive urban planner within all of us.
Trams (or streetcars or cable cars or trolleys, whichever name you're going for) suffer the most from a culture infatuated with cars. Their rails run in the street, where they can hamper traffic flow and cause accidents with unwary drivers - which makes them less desirable in American cities despite their numerous advantages. Over the middle of the 20th century most tram networks in the US fell on hard times as car ownership boomed and passenger counts dropped - even Milwaukee used to have streetcars. Buses tend to be the mode of choice now, due to their ease of integration into traffic patterns and their comparatively low initial cost (trams require a hefty preliminary investment in infrastructure, though they are supposed to be cheaper to maintain).
Another alternative is an independent transportation network that doesn't share infrastructure with private traffic. Building a separate network for mass transportation can be expensive, but also avoids traffic problems and allows the transport of more passengers per operator. Light rail falls into this category and includes trams with a dedicated rail network or elevated trains like Seattle's monorail or Chicago's 'L'.
This is also where subways come in.
Due to their sleek infrastructure and efficiency, I enjoy subways as aesthetic and functional components of modern transportation. Of course, subways have varied reputations and qualities around the world. As a good starting point for this we have Berlin.
Berlin's transportation network is a combination of buses, trams, light rail (S-Bahn, or Stadtbahn - city rail), and subways (U-Bahn, or Untergrundbahn - subterranean rail). All of these together are run by BVG - the Berliner Verkehrsbetriebe - the Berlin Transportation Company (as you might notice the acronym doesn't make any sense unless you spell with imaginary Gs; it's a holdout from the company's older name, the Berliner Verkehrs-Aktiengesellschaft - the Berlin Transportation Corporation). The BVG works in conjunction with the German national rail company, Deutsche Bahn, because the individual S-Bahn trains travel beyond Berlin and Brandenburg and run on track dedicated to intranational rail traffic. The Berlin-Brandenburg network is divided into three sections called - conveniently enough - A, B, and C.
For a majority of visitors to Berlin the A-section will suffice. It includes all stations within the ring created by the S41 and S42 S-Bahn trains (the well-defined octagon in the picture here). Colloquially the ring is called the S-Bahn Ring (conveniently enough) or Ringbahn (circle line) or the Hundekopf (dog's head) due to its true geographical layout (it's not really an octagonal shape). The B-section contains Berlin's suburbs, and a few tourist-worthy sites. The C-section is in fact not Berlin (or a Cesarean), but Brandenburg (the city of Berlin sits in an administrative island surrounded by the German state of Brandenburg). The C-section is also useful for visitors to Berlin, because it includes travel to Potsdam (see things like World War 2 or Prussian history for more information).
One of the pitfalls of such a vast network is the difficulty in determining your route (still, probably easier than the MTA's map for New York City). This results in stations with their own passenger help kiosks and shopping malls, like Friedrichstrasse, which sees the convergence of six lines, a collection of trams and buses, and a regional train station. There's also the more famous Alexanderplatz, which saw a bit of action in The Bourne Supremacy due to its size and complexity (the triple-layered Alexanderplatz subway station is pictured here, with the entry level on top and two train platforms below). However, this vast number of stations also allows for great variation in station architecture and design, as well as great opportunities for exploration. The enjoyment of sights, sounds, and smells is subjective for each individual station, but it's hard not to like at least one.
There's quite a bit you can glean about a country just by examining some of its network layout and stations. For example: the Berlin subways are remarkably clean - this cleanliness being maintained by a legion of custodial staff who seem to constantly make rounds. Most stations have clearly posted electronic signs displaying the time until the next train's arrival, as well as timetables for other trains. Those that don't have fancy signs usually at least have clocks (which some may tell you exemplifies German punctuality - those people are liars). Most trains run on 10 or 15 minute intervals during peak hours, and usually arrive on time. Most transportation shuts down around 2AM and has a few hours of downtime before starting up for morning commutes.
Riding on the trains of Berlin is incredibly easy, due to the fact that there are no turnstiles or transit authority personnel checking tickets at every station entrance. Passengers purchase tickets (coincidentally) at ticket vending machines or at passenger help kiosks. The process is highly automated and simple if you know where you want to go. You might think this setup would be prone to passengers riding without a ticket - known as Schwarzfahren in colloquial German (or if you want to brush up on your German legalese: Beförderungserschleichung - essentially "avoidance of paying a fare") - but it seems most people are fine buying a 3€ ticket instead of paying a 50€ fine.
This results in a number of ticket controllers going from train to train checking tickets like Indiana Jones (except with fewer people getting thrown out of blimps). This means buying a day ticket gives you free rein to ride whatever you want, to and from wherever you want, whenever you want (...in a day), with minimal hassle. A day ticket for all sections is 6.50€, which isn't bad for having hundreds of destination options and the span of two major cities. Compare it to New York City's 1-day Fun Pass MetroCard at $7.50 (half of that price is probably paying for the ink to print the name of the card).
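The honor-system economics above can be sketched with a quick back-of-the-envelope calculation (assuming a risk-neutral rider and the prices quoted: a 3€ single ticket versus a 50€ fine; the inspection rate is a made-up illustrative figure, not a BVG statistic):

```python
# Fare-dodging (Schwarzfahren) only "pays" if the chance of being
# checked is low enough that the expected fine stays under the fare.
TICKET = 3.00   # EUR, single ticket (price quoted in the post)
FINE = 50.00    # EUR, penalty fare for riding without a ticket

# Expected cost of dodging at inspection probability p is p * FINE.
# Dodging beats buying only when p * FINE < TICKET.
break_even = TICKET / FINE  # inspection rate at which the two costs match

print(f"Break-even inspection rate: {break_even:.0%}")

# If controllers catch you on, say, 1 ride in 10, dodging is a bad bet:
p = 0.10
print(f"Expected cost of dodging at p={p:.0%}: {p * FINE:.2f} EUR vs {TICKET:.2f} EUR ticket")
```

So as long as roving controllers check more than about 6% of rides, the rational move is simply to buy the ticket - which is roughly what the post observes.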
Just like in New York, that ticket will also net you some public entertainment in the form of people who cannot refrain from playing music in public.
Thursday, January 8, 2009
English - Enemy of the State
Americans - beyond a small enclave of Texans - will not need to listen to George W. Bush speak after January 20th, 2009 (although I'm sure quite a few haven't been listening recently, anyway). His mastery of the English language has brought us gems like "rarely is the question asked 'is our children learning?'", along with a rendition of Sunday Bloody Sunday. To be honest, his gaffes are minor compared to those of the illustrious James Danforth Quayle, who served as George H. W. Bush's vice president (maybe that's where Bush Jr. picked up some of those speech habits...). Many of the comedic, misspoken quotes ascribed to Bush are, in fact, Quaylisms. I suppose these sorts of things are going to crop up every once in a while if half the populace is voting for a guy they could "have a beer with."
But let's not be too hasty in judging their eloquence and use of English. Although it is relatively easy to be understood in English, the language is difficult to master. So, I'd like to point out a few factoids and interesting usage nuances in the English language. First up:
Factoid
Contrary to its common usage in news broadcasting, a factoid is not a "little fact". The suffix -oid is not used to form diminutives. That is: adding -oid to the end of a word does not make it tiny or small. The suffix "-oid" means "resembling" (often "imperfectly resembling"). Proper diminutive suffixes would be -ette or -let, like cigarette or piglet (so, factette or factlet).
Compare factoid with cuboid (resembling, but not quite a cube) and humanoid (resembling, but not quite a human) and the problem becomes immediately apparent. A factoid resembles a fact, but is not quite a fact. The term was coined in the 1970s referring to a bit of misinformation that was repeated so often that it appeared to be factual. Apparently trivia is a much too trivial word for many news anchors.
Discreet and Discrete
Not a factoid: the robots in Futurama almost exclusively have square pupils (Hedonismbot and others have circular ones on occasion).
That's right, I jumped right into the homophones. While less common than their/there/they're, to/too/two, and for/fore/four, discrete and discreet are exciting words that even Microsoft Word won't fix. Discrete means separate or distinct; discreet means prudent or unobtrusive (the more usual meaning). Just to confuse the Romans in the crowd: both words derive from the Latin discretus (separated/discerned). Discreet entered common speech and developed into 'prudent', while discrete stuck with the more intellectual groups that used Latin and retained something closer to its original meaning.
Ado About Adieu
Let us discreetly separate these two out into their discrete uses.
Adieu, like many English words, is stolen from French (by way of Old French, with a near-identical cousin in Occitan...but let's see you try to find a school that teaches that). Its original meaning - now buried under a husk of modernism and evolved connotations - was "(I/we commend you) to God". Convention in English was that when one party departed from another, the departing party would say "adieu" to those that remained, while those that remained would say "farewell" to the departing group ("fare thou well" if you want to get all Middle English on me). I don't think that convention lasted all that long, but it may be important to note that English was not the primary language of much of the nobility in England until around at least the 1360s. Now "adieu" just functions as a fancy way of saying goodbye.
Adieu swaps positions with ado quite frequently - particularly in "without further ado". Ado is a contraction of at do, or "to do" - a construction from the Norse-influenced dialects of northern England. It has since developed into a noun meaning "trouble" or "fuss". "Without further ado" announces that the speaker has said everything necessary and the ceremony/play/whatever may continue unhindered.
Without further stalling for time...
Going Farther (or Further) with Farther (and Further)
This pair is the most subjective of the group: some people don't bother acknowledging a difference between "farther" and "further" at all. Technically, farther should only refer to physical or spatial distance ("That field is farther away than this one."), while further should refer to the extent or degree of something ("Further understanding will require a lot of work."). That said, the use of either is fine for most people...unless they're a grammar Nazi - in which case you'll probably want to avoid mixing up affect and effect, too.
Just trying to effect understanding of English.