Wednesday, October 31, 2007

Significant

Hmm. I may have a bit of a block on this. Let's just try to bull ahead then, and see what happens.

Over on Mark Thoma's Economist's View blog, there were a couple of discussions about a, well, let's call it a "raging debate," albeit one in fairly slow motion. The backstory papers are here:

McCloskey and Ziliak, "The Standard Error of Regressions," Journal of Economic Literature 1996.

Ziliak and McCloskey, "Size Matters: The Standard Error of Regressions in the American Economic Review," Journal of Socio-Economics 2004.

Hoover and Siegler, "Sound and Fury: McCloskey and Significance Testing in Economics," Journal of Economic Methodology, 2008.

McCloskey and Ziliak, "Signifying Nothing: Reply to Hoover and Siegler."

These papers were pulled from an entry on "Significance Testing in Economics" by Andrew Gelman, and there followed two discussions at Economist's View:

"Tests of Statistical Significance in Economics" and later, a response by one of the main players (McCloskey), followed by my arguing with a poster named notsneaky. That led to my essay, "The Authority of Science."

Okay, you are allowed to say, "Yeesh."

So let me boil down some of this. McCloskey published a book in 1985, entitled The Rhetoric of Economics, in which she argued that the term "Statistical Significance" occupied a pernicious position in economics, and some other sciences. The 1996 paper by McCloskey and Ziliak (M&Z) continued this argument, and the 2004 paper documented a quantitative method for illustrating the misuse of statistics that derived from what was, basically, an error in rhetoric: connecting the word "significant" to certain sorts of statistical tests. The forthcoming (to be published in 2008; the link is to a draft) paper by Hoover and Siegler (H&S) finally rises to the bait, and presents a no-holds-barred critique of M&Z. Then M&Z reply, etc.

Any of my readers who managed to slog through my criticisms of the use of the word "rent" (See "Playing the Rent" and subsequent essays) in economics (as in "rent-seeking behavior"), will understand that I start off on the side of the rhetoricians. When a technical subject uses a word in a special, technical sense that is substantially different from its common language use, there is trouble to be had. "Significant" carries the meaning of "important," or "substantial" around with it, but something that is "statistically significant" is simply something that is statistically different from the "null hypothesis" at some level of probability. Often, that level of probability is arbitrarily set to a value like 95%, or two standard deviations, two sigma, which is about 98% (one-sided) for a normal distribution.

(I'll note here that in statistical sampling, one usually uses something like the t-distribution, which only turns into the normal distribution when the number of samples is infinite; its extra width accounts for the additional uncertainty of a finite sample. The t-distribution also assumes that the underlying distribution being sampled is normal, which is rarely a good assumption at the levels of reliability that are being demanded, so the assumption train has run off the rails pretty early on.)
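
To see that finite-sample effect in numbers, here is a minimal sketch (Python with scipy; the sample sizes are arbitrary choices, purely for illustration) comparing the t-distribution's two-sided 95% critical values with the normal's:

```python
# A minimal sketch: the t-distribution's 95% critical value is wider than
# the normal's at small sample sizes.  Sample sizes here are illustrative.
from scipy import stats

z_crit = stats.norm.ppf(0.975)  # two-sided 95% point of the normal, ~1.96

for n in (5, 10, 30, 100):
    t_crit = stats.t.ppf(0.975, n - 1)  # same 95% point under Student's t
    print(f"n={n:4d}  t critical = {t_crit:.3f}  (normal: {z_crit:.3f})")
```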

But some differences make no difference. Given precise enough measurements, one can certainly establish that one purchased pound of ground beef is actually one and one-thousandth pounds, but no one who purchased it would feel that they were getting a better deal than if they'd gotten a package that was one-thousandth of a pound light. We just don't care about that small a difference; some of the beef is going to stick to the package.

I saw something written recently that referred to a result as "statistically reliable," and on the face of it, that would be a much better phrase than "statistically significant," and I will use it hereafter, except when writing about the misused phrase, which I will put in quotes.

So, okay, "statistically significant" is not necessarily "significant." Furthermore, everyone agrees that this is so. But one disagreement is whether or not everyone acts as if this were so. And that is where M&Z's second criticism comes in: that many economics journals (plus some other sciences) simply reject any paper that does not show results at greater than 95% reliability, i.e. the results must be "statistically significant." M&Z say outright that the level of reliability should adapt to the actual importance of the question at hand.

The flip side of this is that, in presenting their work, authors sometimes use "statistically significant" as if it really meant "significant" or "important," rather than just reliable.

Alternatively, one can simply report the reliability statistic, the so-called "p value," which is a measure of how likely the result is to have come about simply because of sampling error. I have, for example, published results with p values of 10%, meaning that there was one chance in 10 of the result being just coincidence. I've seen other published p values that were weaker still, and those are usually given in the spirit of "there might be something here worth knowing, so maybe someone should do some further work."
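
For concreteness, here is a minimal sketch (Python with scipy; the data are simulated, purely for illustration) of where a reported p value comes from:

```python
# A one-sample t-test on simulated data.  The p value is the probability,
# under the test's assumptions, that a sample mean this far from zero
# would arise from sampling error alone.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
sample = rng.normal(loc=0.4, scale=1.0, size=20)  # small true effect, noisy data

t_stat, p_value = stats.ttest_1samp(sample, popmean=0.0)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A p of 0.10 would mean one chance in ten of a result this size
# arising by coincidence, under the test's assumptions.
```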

In fact, this practice of reporting weaker p values, or of using error bars at the single-sigma level, is fairly standard in some sciences, like physics, chemistry, geology, and so forth. Engineers usually present things that way as well. On the other hand, the vague use of "significant" that M&Z criticize is common in social sciences other than economics, e.g. psychology and sociology, as well as some of the biological sciences, including, especially, medicine.

It's in medicine where all this begins to get a tad creepy. In one of their papers, M&Z refer to a study (of small doses of aspirin on cardiovascular diseases like heart attack and stroke) as having been cancelled, for ethical reasons, before the results reached "statistical significance." "Ha!" exclaim H&S (I am paraphrasing for dramatic effect). "You didn't read the study, merely a comment on it from elsewhere! In fact, when the study was terminated, the aspirin was found to be beneficial against myocardial infarction (both lethal and non-lethal) at the level of p=0.00001, well past the level of statistical significance! It was only stroke deaths and total mortality that had not reached the level of p=0.05!"

Well, that would surely score points in a high school debate, but let's unpack that argument a bit. M&Z say that the phrase "statistically significant" is used as a filter for results, and what do H&S do? They concentrate on the results that were found to be statistically reliable at a high level. How about the stroke deaths? What was the p value? H&S do not even mention it.

(As an aside, I will note that the very concept of a p value of 0.00001 is pretty ridiculous. Here we have an example of the concept of statistical reliability swamping actual reliability. The probability of any distribution perfectly meeting the underlying statistical assumptions of the t-distribution is indistinguishable from zero, and the likelihood of some other confounding factor intervening at a level of more than once per hundred thousand is nigh onto one.)
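
To put a number on that objection (a sketch in Python with scipy; the degrees of freedom are arbitrary choices of mine): the test statistic that yields p=0.00001 under a normal assumption gives far larger p values if the tails are even modestly heavier:

```python
# An extreme p value leans hard on the assumed distribution's tails.
from scipy import stats

stat = stats.norm.isf(0.00001 / 2)  # statistic needed for p=1e-5, two-sided
print(f"required statistic: {stat:.2f} sigma")

for df in (5, 10, 30):  # heavier-tailed Student-t alternatives
    p = 2 * stats.t.sf(stat, df)
    print(f"df={df:3d}: p = {p:.2g}")  # orders of magnitude larger for small df
```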

Furthermore, H&S use a little example involving an accidental coincidence of jellybeans seeming to cure migraines to show why one must use "statistical significance." Then, when discussing the aspirin study, they invoke the jellybean example. On the face of it, this looks like they are equating migraines with heart attacks and strokes, again, completely ignoring the context in which samples are taken, in order to focus on the statistics. In many ways, it looks like H&S provide more in the way of confirming examples of M&Z's hypothesis than good arguments against it.

Also consider what H&S are saying about the aspirin study: that there was a period of time when members of the control group were dying, when the statistical reliability of the medication had been demonstrated, but the study had yet to be terminated. Possibly the study did not have an ongoing analysis, and depended upon certain predetermined analysis and decision points. But how would such points be selected? By estimating how long it would take for the numbers to be "statistically significant"?

Some studies definitely do use ongoing statistical analyses. Are there really studies where a medication has been shown to be life-saving, to a statistical reliability of 90%, where patients are still dying while the analysts are waiting for the numbers to exceed 95%? How about cases where medications are found to have lethal side effects, but remain on the market until the evidence exceeds "statistical significance"?

The blood runs a little cold at that, doesn't it?

Tuesday, October 30, 2007

The Duck as Spirit Guide




I’m not sure when I first began to take the duck as my animal spirit guide.

Of course the Carl Barks duck comics informed my childhood, as did Daffy, Bugs, and Elmer. Sure, I saw portions of Duck Soup on afternoon television, and later in its entirety, and “Why a Duck?” resonates, as does the duck that comes down and gives you a hundred dollars (or as I later joked, the Groucho Marxist duck that comes down and gives everybody a hundred dollars, and hey, wait a second, didn’t Congress just talk about doing that?).

But if I had to pick a point in time, it would probably be either the first time I saw the Dillards’ album, Tribute to the American Duck.




Or when Howard first walked in out of the swamp in Man-Thing comics, later to star in his own magazine, tempting many a comics speculator to folly.



By the time I got to Berkeley, I was committed to the idea that you could not go wrong by following the ducks. This was sorely tested by the discovery of some old issues of a comic called Super Duck that had been published by Archie Comics in the late 1950s, but in the spirit of true believers everywhere, I soon concluded that he was neither super nor a duck.

So, crisis of faith averted, when my apartment mate at the time, Steve Ellner, pointed out a show at U.C. Berkeley by some comedy group named Duck’s Breath Mystery Theatre, I was bound to attend.




A little later, I commissioned Dan Stefan to do a faux movie poster: Duck Savage. It’s too bad I don’t have a scan of it (or permission to show such a scan), because it’s really choice quality stuff, as is Wendy Lindboe’s piece, Drakula.

And, of course, I have various decoys, little sculptures, etc. of ducks, generally in some sort of tableau with a like number of Godzillas, plus the occasional rooster, just to keep it real.

Like I say, I think that you can do a lot worse than following ducks. Oh, and the real ones have a really cool “Quack!”

Monday, October 29, 2007

Nashville

As I drive from your pearly gates
I realize that I just can't stay
All those mountains, they kept you locked inside
And hid the truth from my slighted eyes…

I fell on my knees to kiss your land
But you are so far down
And I can't even see to stand
In Nashville
You forgot the human race
You see with half a mind
What colors hide the face
Nashville
I'd like to know your fate
I'd like to stay a while
But I've seen your lowered state…


Tennessee, like Caesar's Gaul, is divided into three parts.

East Tennessee is Appalachia; it has two real cities, Knoxville and Chattanooga. Chattanooga has a north Georgia feel to it, while Knoxville was historically the home to the Tennessee Republicans, who ran the state in the Reconstruction years, to be beaten back once the Southern Democrats wrested control of the region from the Damn Yankees, and their fellow travelers, carpetbaggers, in the jargon of resentment. The University of Tennessee, Knoxville is a major presence in the city; some sections are a bit like a southern University Town, perceived by the outlying region as horribly liberal, but really staid and conservative, just with different things to conserve.

West Tennessee is Mississippi River land, and the magnet city for West Tennessee is Memphis, which also serves as the cultural capital of Mississippi. Cross the Tennessee river heading west towards Memphis and you have entered the Deep South, but I have little experience with the Tennessee version. I've spent far more time in south Georgia than I have in West Tennessee, and that's another kind of Deep South entirely.

There was a time, when I was still strongly connected to the lands of my youth, when I could place a Southerner within about 200 miles of their home by their accent, a South Carolina accent being so different from a North Alabama accent as to be apparent within a sentence or two. It's been a long time and I'm no longer calibrated for playing 'enry 'iggins, and besides, there has been so much southern migration that I doubt the old standards apply.

The third part of Tennessee is Middle Tennessee, and its capital is also the state capital, Nashville.

I was born and raised in Middle Tennessee, in Davidson County, and the county government merged with the Nashville city government when I was in high school. We lived primarily in Donelson, a suburb of Nashville, and did not leave until I was 18, with two minor exceptions. Our family lived for about a year in, and near, Greensboro, North Carolina. I started school in a small town outside of Greensboro named Summerfield. We moved back to Nashville two months after I started school, so I was the "new kid" in school in a town that I'd lived in for years.

The last time I was back in Nashville, we discovered that Donelson Elementary School, where I'd spent the first and second grade, had been turned into a nursing home. I briefly flashed on the idea of some fellow who'd attended Donelson Elementary now in that home, and how he might wake up at night terrified that there was going to be a test the next day, and he hadn't studied. Would it come as a relief to discover that he was merely old?

A new elementary school greeted us for the third grade; this was during the feverish school construction of the Baby Boom, after all. So I attended Hickman Elementary until about six weeks into the sixth grade, when the family moved to Russellville, Kentucky, where we stayed a little less than a year, because the bowling alley that my Dad had been hired to manage failed. Still, the experience gave him something to put on his resume, so he wound up getting another job as an assistant manager at an alley in Donelson, then got a better job at Pla-Mor Lanes, near the 100 Oaks Shopping Mall off I-65, pretty much in Nashville proper, but not downtown (note the shopping mall). That job lasted until after I'd graduated high school. The family moved to Illinois while I was in college, because my grandmother got sick and (for good reason) Dad did not trust the few remaining relatives to take care of her.

So all of this is basic suburban white boy growing up, southern edition. Except for the wanderlust, if that's what it can be called.

I explored practically every neighborhood in Donelson, on foot, before I was 14. Sometimes I had the excuse of selling candy and cookies, to pay for going to YMCA camp. Sometimes it was to visit friends. But mostly it was just exploring the world. I walked from one end of Donelson to another, a fetch of about 2 miles, maybe 4 or 5 square miles worth of territory to memorize.

When I was eight, my parents gave me the choice of continuing with the Boy Scouts in Donelson, or going to the downtown Nashville YMCA three times a week. It was no contest, really. The Y had a pool, a gym, and it was downtown.

There were probably some rules about where I was allowed to go downtown. Right. That worked. The only things that really limited my wanderings were the time limits; I did have to catch the bus to get back to Donelson at a reasonable hour. Once I extended the time limit, calling my folks to get permission to go to a movie. I forgot to mention that it was a double feature. By the time I got home my mother was literally worried sick, nearly to the throwing up and needing sedatives to unclench her stomach part. I was careful not to worry her like that again, but it was many years before I realized the sheer courage it took, on an ongoing basis, for her to let me make those trips downtown, at such a young age.

In high school, as a member of the Donelson High Forensics Club (more nerd points for me), I went on numerous trips to other schools, public speaking venues, etc. By then I had a driver's license, and I tended to date girls from other high schools, often from other forensics clubs, because they were smart, verbal, and not from my high school. Oftentimes they were considerably more upscale than I was, so I saw how the other 5% lived pretty often. On the flip side, at the YMCA, then later when I was doing day labor the summer after my freshman year in college, I saw the bottom 5% pretty frequently, too.

The major industries of Nashville are banking and insurance, religious publishing (that Bible in your hotel room had a good chance of being printed in Nashville), and country music/show business, in that order, with the music biz being a distant third. But it gets all the publicity, of course, partly because it runs in part on publicity. Then there's the glamour bit.

There was recently a short-lived "reality show" called "Nashville." I tried watching it once, then fell back and tried to sample it by putting it on a channel surfing rotation. Even that was unbearable. Okay, it's not fair to single it out; one major type of reality show is about putting hot twenty-somethings in some sort of group arrangement and generating whatever salacious interest can be generated. In "Nashville," the group arrangement came from the bunch trying to "make it in the music business," so it was trying to cross-pollinate from the other sort of reality show, Ted Mack's…American Idol. The talent part of it was a dismal failure as well.

Robert Altman's "Nashville" suffered from a related definitional problem, though Altman, being a semi-genius, managed to make a decent movie. It had almost nothing to do with Nashville, of course, and everything to do with Los Angeles. Show biz people think that all show biz is like it is in Los Angeles. Maybe it is, but that isn't what Nashville is about.

Nashville, the reality show, never got anywhere near Nashville, the city. Altman's Nashville managed one actual touch that rang true: practically every character in the film was in church on Sunday morning. Only the real outsiders were somewhere else.

I stopped going to church when I was 14, after "membership training" in the Methodist Church, and the whole process that was supposed to make me a member for life. Bless my folks for understanding that after all that, it's okay to say, "no more."

I haven't lived in Nashville for years. Maybe the place I knew is gone, but every time I've been back it looks like it's still there. I wonder if anyone sees it but me?


Now I'm leaving - I've got all these debts to pay
You know we all have our dues - I'll pay 'em some other place
I'd never ask that you pay me back
We all arrive with more - I left with less than I had

Your town is made for people passing through
A last chance for a cause I thought I knew
But, Nashville, you tell me what you're gonna do
With all your Southern style - it'll never pull you through
Nashville, I can't place no blame
But if you forget my face I'll never call your name again
(No, never again, never again)
--"Nashville," Amy Ray, The Indigo Girls

Sunday, October 28, 2007

The Authority of Science

In Conservatives Without Conscience, John Dean relates a description of Authoritarian Followers, who make up the great mass of those with authoritarian psychology. There are three basic underpinnings of authoritarian psychology:

Submission to Authority
Aggressive Support for Authority
Conventionality

It's easy to go deeper, of course, to fear of the unknown, a desire for stability, and an intolerance for ambiguity, all related, all consistent. Similarly, it's easy to find traits that the three basic traits can lead to. Sara Robinson on Orcinus blog has done some heavy lifting in condensing Dean's arguments to clarify the case, especially in her "Cracks in the Wall" series, that begins here.

We tend to speak disparagingly of authoritarians; the word itself is pejorative, as evidenced by the fact that people whom you or I would call authoritarian will simultaneously point the finger at us.

To some extent, they would have a point, in that Rightful Authority does exist, insofar as there are rules and laws of both man and the natural world, and there are networks of people (lawmakers, courts, and lawyers on the one hand, scientists, engineers, and technicians on the other), whose authority within their own domain matters. Courts can send you to jail, and your plumber can clear the water line.

Some of the examples of rightful authority are trivial, and some are deeply held, supported aggressively by their constituents. If you are visiting a construction site you damn well do what the site manager tells you to do. If you don't, someone could get hurt. And if someone you are with doesn't follow the rules, you help keep him in line.

The fact is that most conventional wisdom is, in fact, wisdom, just as each of us makes a thousand decisions every day, almost every one of them right, and almost every one of the wrong ones gets immediately corrected. We don't fall down the stairs, and we don't turn left when we should have turned right on the way to work.

I doubt that anyone who has read even a small fraction of what I've written would take me as a servant of conventionality, and I score very, very low on tests that measure authoritarian tendencies. But I seldom break rules for the sake of breaking rules (well, okay, sometimes I break a rule just to see what will happen, but come on now, we've got to have a little fun). I do believe in process, and I do believe in rigor.

So I want it understood that my intention here is descriptive more than it is prescriptive. It's part of a fight I've been having (or so I come to understand) for a long time now. I've come to understand that it's a necessary fight, a perpetual fight, and a fair fight.

When I was starting out in the then bright shiny new field of computer simulation modeling, I quickly came to believe that the models we were using were complicated machines that needed an experienced and talented operator. There were a lot of moving parts, and a lot of things that could screw up. Moreover, you needed to have a feel for what you were doing; the sort of pattern recognition that we call "intuition" mattered.

You could not, as the saying goes, "Just turn the crank."

In fact, the people who were paying the bills wanted exactly that. They wanted something with a crank that could be turned to grind out the right answers. I've come to understand that there is a good reason for wanting that, incidentally. If you just turn the crank, if you don't know exactly what's going on, and what buttons to push to twist the results, then it looks like it's less likely that someone will cheat and deliberately give results to advance a particular cause.

But just because there is a good reason for wanting something doesn't make it ultimately work out that way. If you have several groups doing the work and you pick and choose whichever one accidentally gives you what you are looking for, then you've accomplished essentially the same thing as if you had someone twisting the knobs. If there is an accepted protocol, you use whatever flexibility exists in the protocol to give the results closest to what you want. You "game the system," to use the current phrasing.

And there you have the argument for the most inflexible possible system. It is less likely to be gamed.

Science (and scholarship, and the law, and tradition) exists in a constant tension between protocol and choice, between methodology and judgment. Some people, because of whatever combination of their experience and natural tendencies, wind up being strong adherents to, and defenders of, protocol and methodology. Others emphasize insight and judgment. As a strong proponent of insight and judgment, let me stipulate the importance of protocol and methodology. The methodology and protocols of science represent a tradition of judgment and a condensation of insight that cannot be replaced by a single individual, no matter how smart and clever.

Still, we must use judgment and insight to examine methodology and protocol, on an ongoing basis. Otherwise, the process of science becomes stagnant, at best stuck in an endless loop of reinventing the wheel, just maybe this year in a different color.

All of this is basically an epilog and an introduction. There is a very interesting debate that has flared up in statistical economics, about the phrase "statistically significant": what it means as a descriptor and as a protocol. There is also an element of another phenomenon that I've remarked upon before: the use of a technical term whose meaning is substantively different from its meaning in ordinary speech, which means that it can have a pernicious effect on thought.

And, of course, I had an argument about it. But I'll write about that in a later essay.

Saturday, October 27, 2007

Meanwhile...

Meanwhile, deep in the heart of the Ural Mountains, the secret Great Communist Conspiracy powers up their insidious Tesla Coil to send another burst of Dark Chaos at the heart of the One Nation Under God. The conspirators had already achieved their first purpose, of lulling the Western Powers into believing that Communism was dead, by pretending to relinquish control over the vast Soviet Empire. But the wily Commies were never interested in mere empire; they required world domination. So they banked their fires in Mother Russia, and intensified their efforts to put across the Greatest Hoax of All: the Global Warming Conspiracy.

Now their plans were accelerating. Dark Prince Gore, having failed to acquire the job of President of the U.S. (only the heroic efforts of the Godly men--and woman--of the Supreme Court had managed to thwart that scheme!), forged a new alliance with Soros, that Elder of Zion, first to create a propaganda masterpiece of a movie, and then, through their Swedish dupes to steal a Nobel Prize.

Now the Tesla Coil was turned to the task of igniting fires on the West Coast of the United States. Significantly, the surreptitious heat ray was aimed at parts of the State of California that understood that the Communist Conspiracy was not dead, that its goal was to wreck the U.S. economy by a combination of air quality regulations and socialized medicine, because once those were under its insidious control, the Nation itself would soon fall....

Friday, October 26, 2007

I Have Officially Scared Myself

The folks at The World's Fair call it a meme, but it's really just a game:

...the premise is that you will attempt to find 5 statements, which if you were to type into google (preferably google.com, but we'll take the other country specific ones if need be), you'll find that you are returned with your blog as the number one hit.

This takes a bit of effort since finding these statements takes a little trial and error, but I'm going to guess that this meme might yield some interesting insight on the blog in question.

To make it easier, we'll let you use a search statement enclosed in quotations - this is just to increase your chances of turning up as number one, but if you happen to have a website with the awesome traffic to command the same statement without quotations, then flaunt it baby! Of course, once you find your 5 statements, pass the meme on to others.

Okay, this blog I have here is not a high traffic site; I'm pretty sure I have yet to hit triple digits for a single day. But...

Searching on Killus gives my sff.net web page. But the following single words bring you here:

  • trademarksism
  • semi-karma
  • vertkrieg
  • vuutle

But wait! There's more! The following search strings, sans quotes, also come up #1 on Google:

  • brittle strategies
  • John Galt is a slan
  • fox hollow cafe lena
  • nonrepresentational art pollack
  • nonrepresentational art richard powers
  • creepy little smile
  • death rays and disintegrators
  • playing the rent
  • dead catfish under the drivers seat
  • rocket boys meet radioactive boy scout
  • max headroom stirner
  • magic market fairy dust
  • phony tough crazy brave alsop
  • privation morality
  • tihs not typo
  • dorothy slave girl of oz
  • knurdly lasers
  • slan a critique
These need quotes:
  • "black swan bashing"
  • "son of on the road"
  • "moral equivalent of socialism"
  • "why barry goldwater lost tennessee"

I'm giving up now. And maybe going upstairs to pull the covers over my head.

Reasons to be Cheerful

One



Two


Three

Thursday, October 25, 2007

Alcohol

Forget the caffe latte, screw the raspberry iced tea
A Malibu and Coke for you, a G&T for me
Alcohol, Your songs resolve like
my life never will
When someone else is picking up the bill
I love you more than I did the week before
I discovered alcohol
O Alcohol, would you please forgive me?
For while I cannot love myself
I’ll use something else
–”Alcohol,” Barenaked Ladies

If you take a molecule of the simplest hydrocarbon, methane, remove one of its four hydrogen atoms and replace it with a hydroxyl group (-OH), you get methanol, the simplest alcohol. The hydrogen at the end of the hydroxyl is more “labile” than the others, so it’s relatively easy for methanol to lose it. That leaves the oxygen with a very friendly bond dangling, and it likes to hook up with its nearby carbon buddy, to form what is called the carbonyl bond. Since carbon is pretty firmly quadrigamous, it has to give up something, and since the carbon already has three hydrogens, one of them just has to go. Essentially, the methanol gives up two hydrogens, enough for a hydrogen molecule. In smog photochemistry, the hydrogens go one at a time, as part of a process involving “hydrogen centered radicals.”


Dehydrogenating methanol thus yields formaldehyde, the simplest aldehyde. “Aldehyde” is, in fact, a contraction of “alcohol de-hydrogenated.” Rounding out the “simplest of its kind” bestiary is formic acid, the simplest carboxylic acid. It has an alcohol group (-OH), and a carbonyl group (C=O), and a single, lonely hydrogen remaining with the carbon, though it has another hydrogen in the hydroxyl group, which it easily loses in solution, giving formic acid its acidic character.

Formic acid is pretty nasty stuff; ants make it and it’s what they use to sting you with. In fact, formic acid was first isolated by distilling dead ants. Formic acid is specifically toxic to the optic nerve, so the ingestion of formic acid, or a formic acid precursor, can cause blindness.

Methanol is a formic acid precursor, biologically, so formic acid is responsible for most of methanol’s bad effects when ingested. The enzymes that turn methanol into formic acid bind ethanol preferentially when both are present, so ethanol is an antidote to methanol poisoning. The metabolization of ethanol substitutes for the metabolization of methanol, giving time for the methanol to be excreted via lungs or urine.

Ethanol is our old friend grain alcohol, the active ingredient in the demon rum. Bootleg ethanol during Prohibition and at other times was sometimes cut with methanol, to give it “more kick” or simply because denatured alcohol is cheap. “Denatured alcohol” is usually made unsafe for consumption by the addition of methanol. There’s an urban legend that says you can make denatured alcohol fit for drinking by filtering it through pumpernickel. It’s not an urban legend that people have tried this, of course; clearly people have tried it. The question is whether or not it does any good.

Ethanol has two carbons to methanol’s one; a way of looking at the setup is that if you replace one of methanol’s hydrogens with a methyl group (-CH3), you get ethanol. The same thing happens for formaldehyde/acetaldehyde and formic/acetic acid. However, acetaldehyde and acetic acid (vinegar) are much more biologically benign. Acetaldehyde forms a trimer in the presence of acid catalysts such as sulphuric or phosphoric acid, to make paraldehyde, a pharmacological sedative.

But it is ethanol, not paraldehyde that most people are familiar with. Simply stated, ethanol ingestion gets you drunk. It gets you blotto, looped, lit, loaded, hammered, wasted, pickled, pissed, polluted and plastered. It intoxicates, inebriates, befuddles, besots, bewilders, and stews. It makes you three sheets to the wind, and either more or less interesting than you are when sober. Whatever it does, a lot of people want it done to them, at least from time to time, so ethanol technology has an ancient history.

Many of the most basic tricks of the chemical laboratory were first used to do something interesting to ethanol, especially to concentrate it into hard liquor. Distillation is the best known, and can be used to concentrate alcohol to 96% purity, the rest being water. Ethanol and water at the 96/4 proportions form an azeotrope, which is a mixture of stuff that boils in the same proportions as it is in liquid form. You have to work hard to break up the ethanol and water azeotrope, and if you do, you’ve probably wasted your time, because exposure to air will allow the ethanol to absorb enough water to form the azeotrope again. Besides, for most people, 192 proof is quite enough.

When I lived in upstate New York we’d drive out into the country (which was a short drive) every fall and buy fresh apple cider in big plastic jugs. Sometimes we couldn’t finish a jug before it went hard; sometimes we’d just let it sit on the back porch until it went hard. Then we’d make applejack, a traditional New England drink.

The principle of applejack is pretty simple: water freezes at a higher temperature than alcohol, and when you cool a water alcohol mixture, water freezes out. Here’s a list of the volume of ethanol in a water/alcohol mix, and the freezing point of the water in that mixture (in degrees F and C):

  % ethanol     °F      °C
       0         32       0
      10         25      -4
      20         15      -9
      30          5     -15
      40        -10     -23
      50        -25     -32
      60        -35     -37
      70        -55     -48
      80        -75     -59
      90       -110     -73
     100       -175    -114

To put it another way, if you have some dilute ethanol mixture, and you cool it to the requisite temperature, the water will freeze out until you get the water/ethanol mixture above. So if you start with a 14% mix of ethanol and water (the highest alcohol you can get from fermentation alone), the water will begin freezing at somewhere near 20 degrees F. Most freezers are around 0 degrees F, so you can boost your ethanol concentration to around 35%. In applejack there is additional freezing point depression caused by the sugar, so you actually wind up with more like a 30% concentration of ethanol, but that’s 60 proof, and that ain’t bad. If you happen to have a real cold snap (the coldest it got while I was at RPI was -28 F), you can get upwards of 80 proof.
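
Here’s a minimal sketch (plain Python, interpolating the table above; the linear interpolation is my simplification, and sugar is ignored) of the freeze-concentration arithmetic:

```python
# Given a freezer temperature, what ethanol strength can freeze-concentration
# reach?  Linear interpolation of the freezing-point table above; sugar and
# other freezing-point depressants are ignored.
ETHANOL_PCT = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
FREEZE_F    = [32, 25, 15, 5, -10, -25, -35, -55, -75, -110, -175]

def max_strength(freezer_f):
    """Percent ethanol whose freezing point equals the freezer temperature."""
    if freezer_f >= 32:
        return 0.0  # too warm; no water freezes out
    for i in range(len(FREEZE_F) - 1):
        hi, lo = FREEZE_F[i], FREEZE_F[i + 1]
        if lo <= freezer_f <= hi:
            frac = (hi - freezer_f) / (hi - lo)
            return ETHANOL_PCT[i] + frac * (ETHANOL_PCT[i + 1] - ETHANOL_PCT[i])
    return 100.0  # colder than the whole curve

print(max_strength(0))    # a typical freezer: ~33%, near the estimate above
print(max_strength(-28))  # the RPI cold snap: ~53%, comfortably past 80 proof
```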

Those old New Englanders knew their stuff.

The theater group at RPI had a traditional beverage they called “Players’ Punch,” or alternately, “blog” (no relation to weblogs, of course; it was probably a backformation of “grog” and possibly “blotto”). It consisted of various fruit juices and sodas, plus laboratory ethanol (the commercial equivalent is Everclear), or, failing that, whatever liquor was available, usually vodka or rum. To this was added dry ice, which chilled it mightily, and froze out some of the water. Potent stuff, and pretty dangerous, because the perception of alcohol content involves smelling the alcohol vapors, and if you get the drink cold enough, it deceives. Also, the dry ice added some carbonation, and carbonation enhances alcohol absorption by the digestive tract.

A high school friend of mine who went to Vanderbilt University told me of a concoction called the “Funderburg,” no doubt named for its creator. It was a blender drink; I think it used frozen concentrated grape juice. I decided to make blender daiquiris by a similar method. That was the year I ran a lab course for undergraduates, which meant that I had access to the fabled laboratory ethanol.

The drink was simplicity itself: one 6 oz. can of frozen limeade concentrate, then the same can filled with 95% ethanol. To that was added a tray of ice. Then hit the max button on the blender. The final result looked a bit like a slushy, and was very cold. The liquid itself was somewhere around 80 proof, by my estimate, but because of the cold, it tasted about as alcoholic as wine. Very dangerous stuff.
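
The arithmetic behind that estimate runs roughly like this (a sketch in Python; the melted-ice figure is a guess, since most of the ice stays frozen in the slush):

```python
# Back-of-the-envelope ABV for the blender daiquiri described above.
limeade_oz    = 6.0   # frozen limeade concentrate, no alcohol
spirit_oz     = 6.0   # laboratory ethanol, 95% by volume
melted_ice_oz = 2.0   # guess: only a little of the ice melts into the liquid

ethanol_oz = spirit_oz * 0.95
liquid_oz  = limeade_oz + spirit_oz + melted_ice_oz
abv = ethanol_oz / liquid_oz
print(f"{abv:.0%} ABV, about {200 * abv:.0f} proof")  # ~41% ABV, ~81 proof
```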

Whenever I think about this particular concoction, I’m bound to remember one particular night in 1972 involving the blender daiquiris plus the Quicksilver Messenger Service’s extended version of “Who Do You Love?” by Bo Diddley. Modesty and discretion compel me to refrain from giving specifics. I will note, however, that the effects of ethanol are such that, while one may still remember that actions have consequences, the relative values placed on the actions vs. consequences may change substantially. Suffice it to say that it all Seemed Like a Good Idea at the Time.

Ozone in the Troposphere

…well yes you did get some kind of award for “Mostest Detailed Information on an Obscure Topic”. –JP Stormcrow

[D]on’t tempt me to go all photochemical on your ass. If you want detailed information on really obscure topics, I can bury you. –James Killus

Ozone is the key ingredient in photochemical smog. Air quality standards for smog are designed to limit ozone on the assumption that, if ozone is reduced, other photochemical smog constituents will also be reduced. While other air pollutants like carbon monoxide and fine particulates are, by and large, directly emitted, ozone is a “secondary air pollutant,” meaning that it is formed by chemical processes in the atmosphere.

There is, however, a natural background of ozone in the troposphere, the layer of air that contains 90% of the atmosphere, and the part of the atmosphere where we breathe, where weather happens, etc. Some of the tropospheric ozone is from the stratosphere; it slowly leaks down from above, through the tropopause, a very strong thermal inversion that exists, in fact, because the stratosphere contains so much ozone. Ozone absorbs ultraviolet radiation, so the stratosphere heats up and forms a thermal inversion “cap” that suppresses convective mixing. (The stratosphere is hot only in the relative sense. It's still very cold; it's just not as cold as the air immediately below it.)

Most ozone in the stratosphere is formed by the direct photo-dissociation of oxygen by very short wave UV light, i.e., a production pathway that cannot exist in the troposphere, because the stratosphere absorbs all the UV in those wavelengths before it reaches the troposphere. But near the bottom of the stratosphere, some of the ozone that is produced is formed by the “smog reactions,” from trace hydrocarbons (mostly methane) that manage to get through the tropopause, plus some nitrogen oxides that are formed from solar proton events, emitted from high altitude aircraft, and produced from the photolysis of nitric acid.

The smog reactions also work in the troposphere, of course, and not just in urban areas, although that’s where they are most obvious and where they were first studied. But there are natural sources of both reactive hydrocarbons (from trees, oil seepage, etc.) and nitrogen oxides (forest fires, lightning, bacterial action), and it’s pretty obvious (and inevitable) that some ozone will form that way.

So how much of the troposphere’s background of ozone begins in the stratosphere, and how much is formed in situ? There have been a lot of studies of this, from different directions. Leakage from the stratosphere, for example, can be estimated via tracers, including radioisotopes that were left over from nuclear bomb tests as they slowly leaked out of the stratosphere, and CFCs, as they slowly leaked into the stratosphere from below.

Estimating the strength of the tropospheric source is a little more difficult. I made a stab at it in the early 1980s, after an earlier stab at estimating the natural sources of nitrogen oxides.

The key to the analysis is a sort of “pseudo-stoichiometry” that exists in the smog reactions. Essentially, the process that forms ozone also destroys nitrogen oxides, although at a rate that depends on a lot of factors. One example of this destruction is in the case of the hydroxyl radical.

The hydroxyl radical (HO) is necessary for the oxidation of hydrocarbons that drives photo-oxidation, but HO also reacts with nitrogen dioxide, which is also necessary to form smog, to give nitric acid, the termination species (except in the stratosphere). This is only one of the radical species in the smog reactions that produces a sink for nitrogen oxides, so, as smog is formed, nitrogen dioxide is destroyed.

The ratio between ozone produced and nitrogen oxides being destroyed varies substantially, from less than 1 at high concentrations of precursors (such as in NOx rich plumes from power plants), to as high as 5 or even 10 to 1 at low concentrations under ideal conditions. I made various plausible estimates of the production ratios under various conditions for various NOx sources, and estimated the additional tropospheric ozone background that could be achieved from each source.
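
To make the bookkeeping concrete, here is a toy version of the calculation (Python; every number below is a placeholder of roughly plausible magnitude, not a value from my actual analysis):

```python
# Toy pseudo-stoichiometry estimate: natural NOx sources, each multiplied by
# an assumed ozone-per-NOx production ratio, converted to a steady-state
# tropospheric mixing ratio.  All inputs are illustrative placeholders.
M_N, M_O3, M_AIR = 14.0, 48.0, 28.97  # molecular weights, g/mol
TROPOSPHERE_G = 4.1e21                # ~80% of the atmosphere's mass, in grams
OZONE_LIFETIME_YR = 0.06              # a few weeks

# (source, emission in Tg N per year, assumed mol O3 formed per mol NOx lost)
sources = [("lightning", 5.0, 5.0),   # clean free troposphere: high ratio
           ("soils",     6.0, 4.0),
           ("fires",     8.0, 3.0)]   # dirtier plumes: lower ratio

production_g_per_yr = sum(e * 1e12 / M_N * r * M_O3 for _, e, r in sources)
burden_g = production_g_per_yr * OZONE_LIFETIME_YR      # steady-state burden
ppbv = burden_g / TROPOSPHERE_G * (M_AIR / M_O3) * 1e9  # mole fraction, in ppb
print(f"~{ppbv:.1f} ppb of in-situ tropospheric ozone background")
```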

My estimates wound up being in the same ball park as prior estimates of the stratospheric ozone sources, around 1-4 ppb ozone, give or take. Given the uncertainties in just about every type of data going into the process, that was pretty good. It also had some bearing on a number of other questions, like the general photochemical background of “clean air,” and the fact that, over the past few decades, tropospheric ozone has been increasing. Since ozone is a (minor) greenhouse gas, the matter even has some application to climate change and global warming.

I presented these results in a poster session at a conference in the early 1980s, and one fellow who came by told my boss that he thought I was just the sort of smart fellow the field needed. That fellow was Paul Crutzen, who later got the Nobel Prize in Chemistry for his work on stratospheric ozone depletion. Having a future Nobel laureate call me smart is one of those things that I don't seem to be able to work into conversation nearly often enough.

Saturday, October 20, 2007

Enlightenment is not a Competitive Sport

One of the post-Firesign Theater “rock and roll comedy” groups was The Conception Corporation. They put out two albums, A Pause in the Disaster and Conceptionland. Both were good for Progressive Radio play in the 60s, which is to say the early 70s. (As I understand it, there is a third, live album now available as well.)

One of the cuts on Conceptionland was “Rock and Roll Classroom,” another “What if Freaks Ran Things?” idea (see also “Returned for Re-Grooving,” by Firesign Theater). In this case, what if high school were really hip, or at least trying to be?

In one bit, gym class is taught by “Fizz Ed,” who announces to the class in a drill instructor voice, “Today, we’re going to learn to meditate! On three, one, two, meditate! One, two, meditate! You! Over in the corner, you’re not meditating!”

In short, it was a lot like a Pilates class.

A fair number of Aikido practitioners also practice zazen, meditating in seiza, the standard Japanese sitting posture, knees and feet on the floor, buttocks resting on the feet (feet tops flat on the floor). Seiza is a natural and comfortable posture—provided you have been sitting that way since you were a child. For us Westerners, it can rapidly become torture; our bones, ligaments, and blood vessels did not grow into seiza as we matured, so at the very least, our legs tend to fall asleep, not to mention the cramps, aches, etc. that also accompany extended periods in seiza.

In zazen, a zafu, a small circular cushion, is often used to alleviate the seiza problem, but many students don’t use them, or they don’t use enough of a cushion to straighten the legs enough to really make the thing less of an ordeal. So then we get all sorts of rationalizations about “letting go of the pain,” etc.

It’s quite true that meditation is often used as a way of alleviating chronic pain, but pain isn’t actually the point of meditation. Meditation itself isn’t supposed to hurt, nor is the posture you’re in supposed to hurt. Actually, being in a painful posture during meditation is dangerous, since you are then basically ignoring an important message from your body, and you can cause or exacerbate an injury by doing so.

But, of course, “letting go of the pain” feels like such an accomplishment.

At a college reunion many years ago, one of my freshman buddies was there with his wife, and they were explaining their study of kundalini yoga. The posture in which you begin meditation is supposed to cause some group of muscles to stretch, though not to anything approaching pain. Then, as meditation progresses, the stretched muscles relax, releasing kundalini energy. The reason why advanced students wind up tying themselves into knots, so to speak, is that it becomes more and more difficult to give your muscles that necessary stretch, because the practitioner becomes more and more limber. But the limberness is not the point of it, the kundalini is.

“Kundalini energy” sounds like woo-woo Asian mystical mumbo jumbo, but actually, all of the related concepts, prana, ki, gi, chi, and all the related “energies” are fairly easy to perceive if you put a bit of work into the matter. As for the “woo-woo” part of it, one can just as easily talk about dopamine, serotonin, norepinephrine, and any of the other myriad neurotransmitters that have been discovered and studied. If freeing up kundalini energy went with an increase in dopamine levels, what then? Does that “explain” the matter, or just make it more palatable to a mechanistic world view?

In any case, the question becomes, what is all that energy for? In the martial arts, of course, we have one explanation: it makes you stronger, more able to practice the art, and, if necessary, win the fight. But there are other answers, some more prosaic, some downright cosmic. Meditation becomes yet another tool, whatever your goal, be it better health or a doorway into infinity. Whatever floats your boat.

It is noteworthy, however, that so much energy is expended in one-upmanship. “I’m more enlightened than you” seems to be, on examination, a self-canceling statement. To have it issued (albeit usually indirectly) by someone who has achieved that state by spending hours staring at a wall and literally doing nothing, well, that just makes the leap into paradox, doesn’t it? But then, zen thrives on paradox, and art thrives on irony, even the unintentional sort.

Friday, October 19, 2007

Killing the Goat

[another WAAGNFPN crosspost]

I read the story some years ago, in The Wall St. Journal, I think, in the first page center column where they put their “strange but true” features. It concerned an occurrence at a semiconductor plant in Indonesia.

The work was semi-skilled labor, of the sort that required close eye/hand coordination, for which the local native women were well suited. Much of it was done under the microscope. The overall situation was stressful: clean room standards, long hours of intense concentration. After some months the women began seeing things under the microscope. They called these things ghosts, and told the supervisors that the place had become haunted with the spirits of the dead.

The plant engineer, being a practical fellow, paid a visit to the local shaman. The shaman confirmed the diagnosis of ghosts. “What must I do to rid the plant of the ghosts?” asked the engineer, doubtless in the spirit of playing along with local customs.

An infestation of this magnitude requires the sacrifice of a goat, he was told.

“How much do you charge to do this?” asked the engineer, probably in the spirit of expecting a bit of a shakedown.

No, the shaman explained, the head of the enterprise must do it, (thereby demonstrating that the principle of separation of powers is not a purely Western invention). If it were in the village, the chief would be the one, but since it is in the manufacturing plant, the engineer himself would have to do it.

Needless to say, this took the engineer aback somewhat. This was more playing along with local customs than he’d bargained for. But after much thought, and probably silently cursing his fate, the engineer went through the ritual sacrifice. He killed the goat.

And the ghosts went away, and the women went happily back to work.

Now the first question that you should ask yourself is “were the ghosts real?”

I realize that the expected “rational” answer is that they were not “real,” that they were “merely” the products of the women’s imagination. Then, having blamed the women for their perceptions, a “rational” procedure would have been to put the women into counseling or perhaps give them tranquilizing drugs. We know pretty much how well these procedures work; not very well, and they are expensive. Is this rational?

No, the engineer did the rational thing. He performed the ritual and the ghosts went away. Cost, one goat, and a certain amount of embarrassment on the part of the head guy. Sounds fair, doesn’t it?

But wait, what about the ghosts? I’m not really saying that they were real, am I?

Yes, I’m saying that they were real, but I’ll explain more about that when I answer the second and third questions.

The second question is, what were they made of? The third question is, were they supernatural?

The ghosts were made out of silicon, and plastic, and glass, and microscopes, and clean rooms, and Indonesian women. They were not supernatural. How can something so ordinary be supernatural?

Ah, so you think I’m saying that they were not real after all, yes?

Now I ask that you consider that sentence. You read it, it makes sense. Is that sentence not real? It is made out of phosphors on a computer screen, or paper and ink; it doesn’t matter. It’s the same sentence either way. So what is it made of?

The same thing as ghosts. The same thing as all magic. Whatever is available.

People say “occult” and I will leave the room. Occult means hidden, and magic is not hidden. We live in it. You can’t escape it. You can refuse to look at it, however.

I told the story of killing the goat at lunch once with a friend of mine, Al, and his co-workers. One of the party, a guy named Mike, just would not stand for my stating that the ghosts were real. “No, no,” he kept saying. “They were merely imaginary.”

“What’s ‘mere’ about imagination?” I asked. “The nature of a hammer is different from a hurricane, but they are both real. Magic is of another different nature. But it’s still real. The ghosts were real.”

“No, no, that’s wrong,” he kept insisting.

Al later told me that Mike had a Korean wife, and about a year before she had a child that died at birth. Some months later, after seeing the ghost of the child several times, the wife had performed an exorcism. Mike would have nothing to do with it.

“He’s still pretty touchy about the whole thing,” Al told me.

“The exorcism or the death?” I asked.

“Both,” he said.

“How’s his wife doing now?” I asked.

“Oh, she’s fine.” he replied. Al knew what the story was about.

Wednesday, October 17, 2007

Bowling

[Crossposted at WAAGNFPN]

At Eastshore Aikikai, where I practice Aikido, we’re pushing the geriatric envelope pretty hard. I’m in my mid-fifties and I’m in no way the oldest person in the dojo; there are also several students who are only a few years younger than I am. Get off my lawn, you whippersnappers or I’ll throw you off.

My mother is in her 80s, though, and she still belongs to a bowling league. Granted, bowling is a lot lower impact than Aikido. It’s also one of only two sports I know of where people regularly die during participation, the other being golf. Of course the reason for that is that both are sports that have participants of any age, including the very old.

Or the very young. Tiger Woods famously appeared on The Mike Douglas Show at the age of 2. I wasn’t that young when I began bowling, but I started before I entered grade school, although I didn’t join a league until some years later. My dad was manager at a total of three bowling alleys over the years, and I was in a league in all of them.

You’d think I would have gotten really good at it, but actually I was only in the upper reaches of average. The highest I ever scored was 236, and my league average hovered around 170 from late high school to the last league I was in, sometime in the late 80s. The highest it ever got was about 175. I’ve fallen away from the faith since then, so nowadays I bowl a little less often than I golf, which is to say every year or two, when I’m visiting relatives. My bowling is still a lot better than my golf, however.

Being in a bowling league is interesting from an intellectual snob’s standpoint. Despite numerous attempts to move upscale, bowling is still a pretty working class activity, so you wind up rubbing shoulders with truck drivers, beauticians, policemen, and mail carriers. That’s all to the good, in my opinion. It did become less fun when automatic scoring came in, since it eliminated my natural ecological niche: scorekeeper. In fact, I was keeping score in my parents’ leagues well before I belonged to a league on my own.

I’ve occasionally joked that I’m genetically selected for bowling, given that my parents met in a bowling league. It was a better joke when I was bowling regularly and I could show people that my right thumb was bigger than my left. I still own two bowling balls, plus bags, shoes, etc.

After my car was broken into in college, where the thieves smashed a window to get in, I took to keeping my car unlocked, never leaving anything in it worth more than the cost of repairing a broken window. The only significant thing that’s been stolen since then was my bowling bag, with ball. The humor there was almost worth the loss; a used bowling ball is worth maybe fifty cents, the bag a dollar or two. I like to imagine the thief hauling the sixteen pound ball into the pawn shop or thrift store, only to find out that he’d have made more money panhandling.

Some of my earliest memories are of bowling alleys (I still call them alleys, never having gotten used to saying “bowling lanes”). I’m just barely old enough to remember pin boys, before automatic pin spotters came in. In Nashville, they were invariably black, with the phrase “pin boys” echoing the generally demeaning practice of calling all black men in a service job “boy.” I don’t have any specific recollection of any of the pin boys, but I do have the general recollection that they were all teenagers, given the athletics necessary to hop up onto a perch above the pins when the ball came down the alley, then jump down and reset the pins when time came.

The woman who handed out the bowling shoes was not in her teens, however. She was a middle aged black woman whose name I don’t quite recall, but I want to say, “Miz Abigail,” or “Miz Abbey.” She befriended the little 2-3 year old tyke and let me stay behind the desk with her and hand out the shoes.

I learned, many years later, that the bowling league that my parents belonged to included a company team sponsored by the firm of Acuff-Rose, which included one of the principals, Fred Rose, and one of the artists managed by Acuff-Rose, Hank Williams. Williams died in 1953, and Rose in 1954. The dates are such that it’s just barely possible that I once handed Hank Williams his bowling shoes.

I’ve heard a possibly apocryphal story that Lyndon Johnson insisted on including the public accommodations portion of the 1964 Civil Rights Act so that his old housekeeper wouldn’t have to “pee in the bushes” when she drove from Texas up to Washington to visit. It has the ring of truth, because Johnson took his politics personally. He might listen to varying interpretations of the commerce clause, but in the end he knew that politics came down to a pissing match, and a good many southern boys have at least one Miz Abigail in among their warmest memories.

Tuesday, October 16, 2007

Ancient Traditions – Bushido III

The bottom line.

Training in the martial arts has many virtues: discipline, concentration, physical conditioning, and, sometimes, an indefinable elevation of the spirit. "Cheated death again," I often say at the end of a class, and it feels like a real accomplishment.

Many of the Asian martial arts are also suffused with various philosophies, Buddhism in one or another of its variations being common, and that is the one that often appeals to Western students. The founder of Aikido, Morihei Ueshiba Osensei, was a devotee of Ōmoto-kyō, a mystical religion deriving from Shinto.

But martial virtue is martial virtue, and it would be foolish to believe that the Asian arts are somehow intrinsically superior to Western arts like fencing, boxing, wrestling, or longbow archery. The Western arts likewise have "ancient traditions," and if the links to ancient times are as tenuous as the connection of the modern Olympics to its Greek predecessor, well, that is par for the course, isn't it?

Likewise, there are many philosophical and spiritual covers for the practice of warfare and the arts of war. I've mentioned Chivalry, which included the Christian doctrine of the "just war," and I'm sure that both sides often find their cause to be just; that is course par as well.

When all of life is nasty and brutish, then the nasty brutishness of war does not seem like such a breach. But what does one do when the nastiness is lessened? What is the state of man when peace is an option?

I have, upon occasion, justified Aikido with these words: You understand the calm serenity that is the object of the meditative exercise? Those practicing meditation spend long hours in darkened, quiet rooms attempting to find that center of peace. It is the goal of Aikido to reach that peace even on the battlefield.

A paradox, perhaps. But what to make of those who attempt to find war even in the midst of peace?

We train. Aikidoka train in that art. Other martial artists train in their own arts. Soldiers train. Competitive athletes train. Do we train for combat, real combat? If that combat never comes, do we feel that we have wasted our time?

What happens to the athlete who trains and trains but never gets to play the game?

What happens to the soldier who never fights in war?

Those seem like they should be different questions, because playing the game rarely kills people, so there is no obviously creepy downside to the simulated combat of competitive sports. Except perhaps that the sports themselves are considered simulations of war, and the winning player of games looks to find a more exciting game.

They sometimes call it "the Great Game." That's how the British referred to the contest with Russia for control of central Asia in the 19th Century. Quite a lot of blood was spilled. The idea was to "civilize" Asia—a truly laughable notion given the respective ages of the civilizations involved.

Warriors are rarely philosophers and philosophers are rarely warriors. But warriors do fascinate philosophers, who often yearn for the "reality" or the "authenticity" of war and fighting. Read Norman Mailer on boxing, or Hemingway on bullfighting. But do not read them for real insight into boxing or bullfighting; what you will find instead is the writers' projections of their own internal conflicts onto the external reality of the fight. If you want to learn about boxing, put on the gloves and find a teacher.

When the philosophy of war is formulated, the Necronomicon is opened. People who read and fantasize about war puff themselves up with martial spirit. They imagine themselves as heroes, never as cowards. They lose their real selves in the distorting mirror of wannabe. They feed off what they imagine to be the experiences of others.

Is there a martial artist alive who has not imagined himself/herself fending off an attacker with some artful move or maneuver? God knows I have. Sometimes I also make myself pay penance, by imagining myself simply handing over my wallet to the mugger, or darting down the alleyway, running away, as fast as my aging legs can take me. But the lure of the imagined fight remains. I'm lucky; I get to put these fantasies into fiction, where I know that I'm the puppet master pulling all the strings.

Find the strings. I guess that is the nub of what I'm saying. Follow the strings back to the fingers that pull them. But don't be surprised if you wind up staring at your own hands.

Monday, October 15, 2007

Ancient Traditions – Bushido II

In my essays on Privation Morality, I tried to get a handle on how the authoritarian tradition begins: in times of privation, when the world reduces to zero sum, the game becomes not just a matter of "him or me," but rather "us or them," and the "us" includes your family, your children, all you hold dear. Is there one among us who would not steal for our children, lie if it meant the lives of our parents, kill if the alternative were the deaths of all we hold dear?

So who do you pray to and who do you prey upon? Those are real world questions when the times are sufficiently rough.

The nuclear family becomes an extended family, and families merge to form tribes. And tribes war upon each other: conquer or be conquered. Is there a single human on the planet whose ancestors did not pass through this cultural experience? Is there one of us who can claim a lineage that did not?

I've been trying to find a passage that I believe I saw on the blog Orcinus, written by Sara Robinson, who has become one of the better analysts of the authoritarian culture. It outlines the authoritarian family bargain: The husband is the leader, the voice of the highest authority. His command is law and all must do his bidding and attend to his wishes and comfort. In return, he provides for the family and defends it with his life, if necessary.

The Paterfamilias has the power of life and death within the family. It falls to him to decide if a newborn infant is taken into the family or left to die of exposure. If any family member angers him sufficiently, he may kill them with impunity. "My home is my castle" is not just a figure of speech to a true authoritarian.

Over time, tribes merge with other tribes. Cities arise and different needs emerge. The story goes that the tradition of Paterfamilias in Rome was broken when a father ordered the death of his son—who happened to be an important Roman general. Whether the tale is true or not, the power of the head of a household is bound to ebb when the household is connected to a larger society.

One way that power is restricted is through the power of law, or the precursor of law, tradition. Familial roles become codified and legalized. The power of the father becomes the power of a caste. Since the most important role is aggression and defense, there comes to be a warrior class. The power of tradition and law often comes to rest with the priests. The True King requires the support of both.

None of this is very controversial. What comes next is where the real meta-combat begins.

The 19th Century upended everything. Distance was obliterated. Every sort of production exploded, and every commodity market in the world found itself in a game of crack-the-whip. Napoleon remade the map of Europe and the New World colonies of Spain and Portugal nominally freed themselves from their former masters, only to experience a different sort of colonialism, with economies tied to the aforementioned commodities, whose markets were not under their influence or control. Those few nations in possession of modern weaponry conquered the rest of the world with almost pitiful ease.

All the "ancient traditions" were rewritten in the new light. What became the new codes of conduct were called by the same names as the older codes, but it was a new world, and it was not the same.

The Japanese code of Bushido, the code of the warrior, was not written by warriors. It was written by a "warrior class," the Samurai, who were perhaps descended from warriors, but who, by the 19th Century, were mostly administrators, bureaucrats, at most policemen. Their martial experience was in the dojo, not on the battlefield. Actual warriors rarely articulate philosophy.

Even now, you can see the "martial spirit" in the southern United States, an echo of an echo of an echo. The Antebellum plantation elite considered themselves to be the heirs of Chivalry. They fought duels. They carried the favors of their ladies with romantic flourish (and visited the slave quarters when their carnal needs arose). They devoted themselves to fine houses and fine clothes.

And when they lost, they spent the next century mourning the "noble cause" and convincing themselves that their culture was something special, unique, lost in a torpor of nostalgia for an "ancient tradition" that never existed, could never have existed, and which, in any case, was utterly inappropriate for a time when war is simple slaughter, without glory, without honor, and largely without purpose. But the mythology lives on, first transmuted by romantic literature into the noble gunfighter of the Old West, then into the Top Gun aviator of the aerial dogfight. Denatured still further, we get Spectator Sports, those little Kabuki Plays that simulate warfare as unrealistically as the playing fields of Eton simulated Waterloo (or am I the only one who always found that sentiment to be a crock?).

The Wikipedia article suggests these as the hallmarks of Bushido:

  • Rectitude
  • Courage
  • Benevolence
  • Respect
  • Honesty
  • Honour, Glory
  • Loyalty
  • Filial piety
  • Wisdom
  • Care for the aged

A recent essay by my sensei's sensei called out these:

  • Courage
  • Endurance
  • Nobility
  • Honesty and Faithfulness
  • Unsullied Integrity
  • Simplicity and Directness
  • Compassion (toward the weak)
  • Honor
  • Self-sacrifice
  • Non-attachment to worldly importance (including life and death)

One might think that these are all noble sentiments, as indeed they can be. So how is it that the spirit of Bushido found its way into Japanese militarism in the early 20th Century, one of the greatest national disasters in history?

Well, here is an example of the code of Chivalry:

  • Thou shalt believe all that the Church teaches, and shalt observe all its directions.
  • Thou shalt defend the Church.
  • Thou shalt respect all weaknesses, and shalt constitute thyself the defender of them.
  • Thou shalt love the country in which thou wast born.
  • Thou shalt not recoil before thine enemy.
  • Thou shalt make war against the Infidel without cessation, and without mercy.
  • Thou shalt perform scrupulously thy feudal duties, if they be not contrary to the laws of God.
  • Thou shalt never lie, and shalt remain faithful to thy pledged word.
  • Thou shalt be generous, and give largess to everyone.
  • Thou shalt be everywhere and always the champion of the Right and the Good against Injustice and Evil.

That gave us the American Civil War. Somehow, I don't think that it is the Words' fault.

Ancient Traditions – Bushido I

CP Snow once said that all ancient British traditions date to the second half of the 19th Century, and his only error was to limit this claim to Britain. The great majority of real traditions having been swept away or reduced to irrelevance with the rise of capitalism, the 19th century saw the rise of a whole set of new ones, which were then fixed in shape by the system of nation-states, each with their own newly-codified language and officially sanctioned history that took shape at the same time. – John Quiggin

There are basically no publications in English on ninja worth reading–it’s all junk. The only serious academic scholarship available outside Japanese language publications would be the material on Roy Ron’s website at ninpo.org. Roy is a (fairly) recent PhD graduate from U Hawaii, and has spent a number of years doing research on ninja and related topics.

The most reliable reconstructions of "ninja" history suggest that "ninja" denotes a function, not a special kind of warrior--ninja WERE samurai (a term which didn't designate a class until the Tokugawa period--AFTER the warfare of the late medieval period had ended--before that it designated only an occupation) performing "ninja" work.

Movie-style ninja, BTW, have a much longer history than the movies (although the term "ninja" does not appear to have been popularized until the 20th century). Ninja shows, ninja houses (sort of like American "haunted houses" at carnivals), and ninja novels and stories were popular by the middle of the Tokugawa period. The "ninja" performers may have created the genre completely out of whole cloth, or they may have built on genuine lore derived from old spymasters. Either way, however, it's clear that much of the lore underlying both modern ninja movies and modern ninja schools has both a long history AND little basis in reality outside the theatre.
--Karl Friday, Instructional Coordinator & Associate Head, Dept. of History, University of Georgia

Another problem stems from the nature of Japanese society and Japan's social history. From the early seventeenth century until the middle of the nineteenth century (the Meiji Restoration), Japanese society was locked in a rigid class structure that allowed little or no mobility at all. That meant that members of a social group within a certain social class had no choice but to accept their place in society. In addition, there was a clear distinction between the ruling class--the samurai--and the other classes--peasants, craftsmen, and merchants. Within each class as well, there was a certain hierarchy according to which members of the class had to act their social role, with little opportunity to change their status. This reality produced strong identifying characteristics for each social class, to which the individual had to conform. Outside these social classes, as designated by the ruling samurai elite, were the classless people and outcasts, who were placed below everybody else. Ninjutsu, for the most part, comprised the fighting skills and methods practiced by a small number of families who belonged to the lower samurai class, peasants, and even outcasts, and only rarely by warriors belonging to the samurai elite. Consequently, ninjutsu since the Edo period has been identified as different from the noble traditions of the samurai, and those practicing it were usually regarded by the rest of society as lowly people. In other words, ninjutsu was anything but conformity to the predefined social rules. As such, it could never have received a seal of approval as a recognized martial tradition, not even when the samurai elite were actually employing warriors proficient in ninjutsu.

The social conditions and the strong tendency toward conformity I have just discussed produced another problem. Fighting methods or weapons that were not practiced by the samurai elite were considered mysterious at best, sometimes demonic, often supernatural, and certainly unworthy of respect. Here again is the problem that arises from social conformity. For the samurai elite, who were bound by rules of behavior and a code of honor and ethics, fighting methods were confined to a small number of weapons, namely bow, sword, staff, jutte, and spear. This resulted in little creativity in fighting. However, for warriors other than the samurai, those who were not constrained by their position in society, creativity was a necessity for winning. They maintained unusual and innovative fighting methods and weapons that had been developed in earlier periods, while systematizing, recording, and adding to them during the Edo period. Consequently, ninjutsu came to be perceived very negatively, and when Japan moved into the modern period, ninjutsu gradually disappeared while its dark and mysterious image, which had already become folklore, came to be viewed as historical fact.
--Roy Ron

And yet, in the 1921 Ryūkyū Kenpō: Karate, the first fully published karate text, little of this [historical development of karate traditions] appears: karate is not a dō, lacks mythology, and is frank about recent Chinese influences. Reaching further back, to the unpublished writings of Itosu Anko, karate lacks even a name, makes no claims on the spirit, and mentions history not at all. Beyond that, the writing is in Chinese. Strangest of all, and most easily overlooked, is that through the 1920s there was only really one name: karate, the Chinese hand…

But the historical imperative was not a simple descriptive one, for it included certain strategic silences. They needed to know the details of their mystical origins, but they also needed to be at a loss to make a full accounting of the middle of karate history. Make no mistake: karate history soon had a middle, but it was indistinct—an outline with many precise gaps, a carefully composed picture of fog—because along with the questions that required answers were ones that could not be asked at all… --Craig Colbeck: Karate and Modernity, A Call for Comments

Bushidō, meaning "Way of the Warrior," is a Japanese code of conduct and a way of life, loosely analogous to the European concept of chivalry. It originates from the samurai moral code and stresses frugality, loyalty, martial arts mastery, and honor unto death. Bushidō developed between the 9th and 12th centuries, as set forth by numerous translated documents dating from the 12th to 16th centuries (as mentioned below). However, some dependable sources also state that the code might not have been formulated until the 17th century.

According to the Japanese dictionary Shogakukan Kokugo Daijiten, "Bushidō is defined as a unique philosophy (ronri) that spread through the warrior class from the Muromachi (chusei) period." Nitobe Inazō, in his book Bushidō: The Soul of Japan, described it this way: "...Bushidō, then, is the code of moral principles which the samurai were required or instructed to observe... More frequently it is a code unuttered and unwritten... It was an organic growth of decades and centuries of military career."

Under the Tokugawa Shogunate during the Edo Period (1603-1867), Bushidō became formalized into Japanese Feudal Law. –Adapted from the Wikipedia

Sunday, October 14, 2007

The Heat Death of Science Fiction

I haven't yet read all the stories in Helix #6, nor even properly digested the ones I have read, but I'm going to go on a bit about John Barnes' essay, "Reading for the Undead." In it, Barnes suggests that genres tend to last about 70 years, roughly three generations, and he describes the dynamics of a genre's origin, development, and final flowering, until it becomes, in his phrasing, "undead": a fixture on the cultural landscape that still moves from time to time, but which isn't growing, breaking new ground, or even becoming more artistically interesting, because all the tropes have been explored and exploited until they are merely familiar.

The "70 year lifetime of a genre" caught my attention right away, because my friend Dave Stout has been saying that for quite a while, and he derived the time span from pondering the history "picaresque novel," which had a lifespan of about 70 years. So we've been talking from time to time about the nature of genre lifespans for quite a while around our house.

I myself tend to use more of a combustion metaphor, with a genre beginning from tinder, moving to kindling, then burning up most of the available fuel. So my word for the phenomenon that Barnes describes as "undead" is "banked." That's less provocative, but then, Barnes has more reason (and greater stature) to be provocative than I do.

But there are exceptions that test the rule ("test" being the original meaning of "prove"), and it's instructive to look at a few of them.

The "puzzle" form of the detective story, discounting Poe, can be dated from Sherlock Holmes, and Holmes had a number of well-know imitators. Eventually, many of the tropes became codified in the "drawing room mystery," where all the suspects eventually wind up in the drawing room and the Great Detective explains to one and all whodunit. A period of 70 years from 1887, and the 70 year rule would place its death somewhere around 1960. Agatha Christie was still wringing the last few drops out of the handkerchief at that time, but it's true that the drawing room mystery was largely a banked fire—at least in its literary incarnation.

But as a pop culture artifact, we had yet to see Columbo, McMillan & Wife, Murder, She Wrote, and Monk, to say nothing of the various lesser cop shows, detective shows, etc.

Literary snark would now claim that this is further proof, not just of Barnes' concept, but of his terminology. What can be a better indication of "undead" than TV Zombieland? Moreover movie and TV tie-in books will tend to crowd out any valid attempts to revive the literary genre, something that has indeed occurred in science fiction, where Star Trek and Star Wars novels routinely outsell the "serious stuff."

Something very similar happened to the Western. Born in the Dime Novel era, circa 1875, its climax practitioner, Zane Grey, died in 1939. But 1950s television was dominated by Westerns, just as motion pictures had been dominated by the genre a few years before. Then, practically in a puff of smoke, the genre evaporated, almost vanishing from pop culture entirely (Heaven's Gate certainly helped remove it from the motion picture landscape), leaving only Louis L'Amour to tend the embers, with the occasional Silverado homage or Rustler's Rhapsody spoof to hold the motion picture fort.

But notice that I wrote "puzzle form" of the detective story up above. The drawing room mystery was still going strong when the "hard-boiled detective" came on the scene, and while a Black Mask story might very well feature a puzzle mystery, and even, occasionally, a gather-all-the-suspects-into-a-room scene, those were not the dominating tropes. For hard-boiled detectives, it was the action, the social commentary, the atmosphere, and a different blend of characters than you find in the drawing room.

Then hard-boiled got even darker and tougher, and noir came along, a bank shot off of motion pictures, where B movies were cranked out of the studios' aging black-and-white units, quickly written, even more quickly filmed, and verging on experimental in their art direction. Cue the cigarettes, the gunfire, and black, black blood. On the literary side, the original paperback novel was invented, and suddenly Mickey Spillane was the best-selling novelist in the English language.

In the 1960s, the spy novel broke out of its genre and became a fad, and anyone who missed the connection between hard-boiled noir and the nihilistic secret agent was simply not paying attention (Spillane paid attention; he began writing spy novels). The fad finally culminated in parodies of spoofs of satires, like Get Smart and The Man from U.N.C.L.E. A plethora of parodies is often the signal of the end of a fad; genres tend to go less willingly.

The spy novel is usually considered to have originated in the early 20th Century. By the 70 year rule, the spy fad of the 1960s was its swan song. But the Cold War was still in operation, and the spy novel refused to bank down. It also gave rise to the "techno-thriller" which often used spy novel tropes.

Back at the detective novel, after noir came the procedural, whose origin is usually dated from the post-WWII period, i.e. a similar time frame to noir. The 70 year rule would suggest that we are now seeing the final flowering of both, unless noir is just a refinement of hard-boiled, in which case it's dead already. That would mean that I'm hauling two dead genres with Dark Underbelly.

So now let's consider a sci-fi subgenre: superheroes.

The superhero comic has dominated that field for quite a while, being almost the sole reason for comics to exist as a medium for a considerable period, say from 1960 to 1990, barring the brief explosion of the Undergrounds. Moreover, Odd John and Gladiator notwithstanding, the superhero originated as much in comics as in science fiction, with Superman and Slan approximately contemporaneous, 1938 and 1940, respectively. Intriguingly, Superman's creators, Siegel and Shuster, wrote a fanzine story in the early 1930s, entitled "The Reign of the Superman," featuring a Slan-like (telepathic mind control) character.

The 70 year rule would suggest that superheroes are nearing the end of their run, which may be true, given the ongoing colonization of television and motion pictures by the genre. But notice that the superhero genre began as a sub-genre of science fiction. SF conventions used to be where comics fans could meet; now Comicon attendees vastly outnumber Worldcon attendees. Moreover, gamers are the real growth demographic.

But let's not forget the external dynamics here. Comic books are a natural precursor to motion pictures and television because of basic mechanics: a comic book is substantially like a story board. By contrast, novels are dreadful movie precursors, because a novel is much too long a form to translate to the screen. The amount of story in an average movie is approximated by a novella. Even given that most novels these days are at least 50% padding, there is still a mismatch. By contrast, game-based movies tend to be heavy on the glitz of special effects and martial arts choreography, but the stories tend to suck.

The short story itself had an economic lifetime of close to genre length (by "economic lifetime," I mean the period when the time given to writing a short story paid its own way, so that it was actually possible to make a living as a writer of short stories). But the short story flowered because of a combination of the pulp magazine (the low end, with a low sales price) and the mass circulation magazine (the high end, with a heavy advertising base). Both markets sputtered and largely died in the 1950s, owing to external forces (television sucking away the mass advertising dollar, and the death of the pulp magazine distribution network). The short story form is hardly dead, or even undead, but it's no longer a paying proposition. If the venues came back, there would be plenty of both readers and writers; the lack of venues is due to a different market dynamic than a loss of readership.

All of this may seem a bit far afield from science fiction as such, but it's worth noting, as a friend of mine recently said, that the "science fiction" section of his local bookstore is still growing. Part of that may be simple book bloat, but I rather suspect that it's because other genres are encroaching. Epic fantasy was revived after decades of ember tending by writers like Fritz Leiber and Henry Kuttner, but The Lord of the Rings and the Frazetta Conan put it back on the shelves and fired up the genre. Can we count on 70 years of elves and magic starting from 1965? If not, then who is the originator, Howard or Lord Dunsany?

Similarly, vampires, werewolves, urban and gothic horror, all seem to be churning right along, yet there was a time when such stories had to masquerade as science fiction to find an audience. Now it's often the other way around.

What we may in fact be witnessing is the great hybridization of fiction. Something similar has been underway in music; it's not uncommon to hear a fugue riff in a hip-hop number, or a salsa variation of a hot jazz piece. Similarly, most literary fiction now has a pop culture awareness, so a showdown with ray guns becomes just another bit of wallpaper.

A genre is a literary form where the willing suspension of disbelief is aided by an appeal to the conventions and tropes of the genre itself. When categories break down, the number of unacceptable things declines, while at the same time, the opportunities for theft, er, I mean, "influences" or "homage" or "pastiche" increase. Ultimately the real question is the same as it always was.

"How can I make money off of this?" –Jim Turner, "The Brain that Wouldn't Go Away"

Helix is at:

http://www.helixsf.com/index.htm

Be nice. Toss some money in the tip jar.

Saturday, October 13, 2007

Hi Ho, Doin' the New Lowdown

I should probably mention that I put up a new chapter of Dark Underbelly on Thursday. I've been doing it on weekends, and I'll still try to put one up tomorrow as well. Call the greater frequency an experiment, and here's an old cartoon. If you wait until near the end, you get "Doin' the New Lowdown."


Friday, October 12, 2007

Privation Morality II

When times get really desperate, “Us vs. Them” becomes a real matter of life and death, not just for individuals, who might choose death rather than savage their fellow man, but for those dependent upon those individuals: the women and children of the tribe. So tribal warfare is an almost inevitable result of extreme privation. If you would not loot someone else’s village to save your own children from starvation, you perhaps shouldn’t have children.

Rationalizations follow. I had a psych professor who liked to quote from The Brothers Karamazov: “I once did him dirt and I’ve never forgiven him for it.” Your enemies deserve to be looted. They’d do the same to you. They have too much as it is. They worship a false god. They’re weak, and the world belongs to the strong. They breed like rats. Whatever it takes to make your own actions palatable to you.

And teams are more effective than individuals. The hunting band can hunt people as well as game. The warrior is respected when he is what keeps the tribe from disaster, either by plundering the “other” or keeping the “other” from plundering you.

Furthermore, the warriors must be fed first. They are the seed corn when it all goes zero sum. They are the protectors of the tribe.

But warfare doesn’t just redistribute; it consumes resources and it kills off part of the population, most frequently the men. So the strongest warriors also get more than their share of women (and more than their share of danger and death).

So the ideas, concepts, and mental models meld into a worldview, the worldview of perpetual struggle. The warrior lives for struggle. It is what he is all about. Valhalla is his heaven.

Good times threaten this worldview. When there is plenty of food and other goods, where is the need for heroes? “We’ve gone soft,” say those who have the worldview of perpetual struggle. “God will punish us for our sins of gluttony, lust, and avarice.”

And God always does—eventually. He’s just that sort of guy. And people fall back into line. And if God seems to be taking a little too long, then he gets a helping hand, because God is so very weak sometimes, or maybe absent-minded. Or maybe he’s just saving it up. No matter; it’s always easy to cook up a war. Or to make your business “lean and mean” with a “take no prisoners” attitude. See? Easy.

When punishment is always just over the horizon, or, more accurately, always in the back of your mind, when privation is just a few streaks of bad luck away, then what? You can never have enough wealth to absolutely guarantee your survival. Indeed, it’s only death that is preordained. If you are wealthy, there is always someone who wants to take it away from you. If you are poor, you’re always on the ragged edge of disaster. For those of us in the middle, both those specters lurk in our shadows.

It is a recipe for the Social Darwinist vision of the world, privation morality even in the midst of plenty. It demands racism, class, and caste. It craves people on top and people at the bottom. It is the wellspring of the authoritarian personality. At its very core is fear: fear of death, fear of failure, fear of loss, not just personal fear, but fear for all you hold dear and for the destruction of your way of life. Fear is not a meme. Fear is a primal emotion that overwhelms everything in its path. And it spares no one.

Sometimes you gotta be an s.o.b.
You wanna make a dream reality
Competition? Send ‘em south
If they’re gonna drown
Put a hose in their mouth
Do not pass ‘go’
Go straight to hell
I smell that meat hook smell
Or my name’s not Kroc
That’s Kroc with a ‘k’
Like ‘crocodile’
But not spelled that way, now
It’s dog eat dog
Rat eat rat
Kroc-style
Boom, like that
--Mark Knopfler, “Boom Like That,” Shangri-La