Tuesday, July 15, 2008
So, in honor of that last bit, I'm going to repost something I put up on my Newsgroup a couple of years ago, following the 2006 World Science Fiction Convention:
[originally posted Sept. 5, 2006]
I was watching the News Hour on PBS last Friday, the Shields and Brooks segment, and Jim Lehrer asked a question about Bush’s latest PR offensive, equating the War on Terror and War in Iraq (two phrases Bush always uses interchangeably) to the Cold War and WWII. And Mark Shields. Just. Went. Off. He was nearly ranting, forcefully demanding to know why, if it was all so important, why the country hadn’t been put on a war footing, why there weren’t enough troops in Iraq, why taxes hadn’t been raised, etc. etc. etc.
Brooks was obviously taken aback, and tried his best to shift the argument, talk about how the country would never stand for such measures, and so forth. But mostly he looked nervous and, well, dare I say it, wimpy, irresolute, even lost. After all, Shields is the liberal; he’s not supposed to be the one spitting fire.
The World Science Fiction Convention isn’t anywhere close to the political mainstream, actually. Most convention-going fans are college-educated, intellectual-leaning, and more respectful of science than of, say, religion. Certainly there are plenty of right-leaning fans, but they tend toward the libertarian or Social Darwinian right, rather than the religious right that forms the core of the current Movement Conservatism.
Still, I’ve heard plenty of support in the past from various SF types for various portions of the Conservative Movement agenda, especially the anti-tax, liberal bashing part of it.
Not this last convention, however. In fact, several times, sometimes from panelists, but just as often from ordinary convention-goers, the subject would flash over to current politics and the Bush Administration and someone would. Just. Go. Off. On a tirade, a screed, a rant, a whatever-you-want-to-call-it.
And there would be no response. Not even afterwards, in the men’s room, where all the important political thoughts are voiced. No one was willing to say in public, or even semi-private, that they supported the Bush Administration.
The secret ballot covers a lot. It especially covers a lot of bigotry. David Duke the white supremacist always polled about 10% higher in the actual race than he did in preliminary opinion polls. So it’s always hard to predict elections beforehand, especially when fevers run high on issues like immigration, where race matters even more than it does otherwise.
But authoritarians are bullies, and the truism is true: bullies are cowards. At a certain point, pulling in your horns becomes reflexive, especially when you’re not sure what you’re voting for in the first place. Moreover, bullies really, really, hate to lose. Better to not fight, then tell yourself that you’re the victim here.
So while I’m not exactly predicting a surprising shift in voter turnout this fall, with the authoritarian right sitting on their hands, it wouldn’t surprise me one bit.
Sunday, July 13, 2008
But I want to examine Herzberg's essays about capital punishment here. His liberal moderation forbids him from calling it "state sanctioned murder," and he notes that this would be similar to calling incarceration "state sanctioned kidnapping." (The actual analogous crime is "false imprisonment," however; make of that what you will).
Then there are the usual arguments about accidentally executing the wrong man, and how capital punishment actually demeans us, the rest of society, which is true, but only from the liberal perspective. Conservatives and right-wing ideologues actually glory in that demeaning; it's part of their vision of how society should operate.
But there is an argument that Herzberg doesn't make that I heard once from a capital punishment opponent whose name escapes me, and it is a much more powerful argument against that barbaric practice. That capital punishment is capricious is a feature, not a bug, in this view, and I find it persuasive.
We see the mechanism displayed in popular culture repeatedly, in the procedural television shows that have become so prevalent, from all the Law and Order flavors to the various CSI clones. Somebody dies and the interrogating authority figure brings up the death penalty as a threat, to shake loose information, or perhaps to force a plea bargain from the suspect. Of course in these shows the accused "perps" are almost always guilty, so it's all just part of the tools of the trade.
But in real life, the accused are frequently not guilty, at least not of the crime for which they are accused, and therein lies the problem. Because the threat of the death penalty can make an innocent man cop a lesser plea, as death is permanent, while the lesser crime allows at least some future.
Moreover, sometimes prosecutors know this, they know that the accused hasn't done this particular crime, but they are sure that he has done something for which he deserves punishment. Thus does the death penalty make a mockery of the idea of the rule of law, substituting the opinion of a D.A. for that of a judge and jury. It is a system that is made for abuse, and abused it certainly is.
Thursday, July 10, 2008
Yet another Medusa
This is one serious woman with snakes. The original source can be found here.
Tuesday, July 8, 2008
Giving Up the Ship
Partly this is due to my multiple lives during high school. I have lifeguard memories that occurred at that time, but those were from the Downtown Nashville YMCA. There were girls I dated, primarily from distant high schools, possibly owing to a fear of gossip, or the instinct of self-preservation that realizes that breaking up with someone who is in your English class is Very Rough on the System. There are a number of Forensic Club memories, but I can categorically attest to the basic lameness thereof.
Besides, the best memories are about mischief made, and I was such a sweet kid, really I was.
Still, I believe that I held some sort of record for being tossed out of Mr. R's 8th Grade class, which was either technically in High School because it was in the same building, or technically not, since it was still Jr. High School. Whatever. Mr. R was an ignorant twit, and 40+ years has not dimmed that assessment.
I will leave aside the times when he said foolish things like "Rock and Roll is a Communist Plot." I seldom bothered to call him on things that everyone knew were stupid. But when he told the class that Earth satellites stay in orbit by balancing between the gravitational attraction of the Earth and Moon, my calm demeanor vanished and I said something like, "That's idiotic." That got me a quick trip to the library (Oh, throw me in the briar patch, Mr. Detention).
Then there was the pop quiz on the Revolutionary War, where one of the questions was, "This man said, 'Don't give up the ship!'" Sadly, I knew that it was a dying quote from Captain James Lawrence during the War of 1812, and that said ship, in fact, was given up shortly thereafter (though the words became a slogan used in the Battle of Lake Erie). R had, as so many before him, confused that saying with "I have not yet begun to fight," which was from John Paul Jones, the correct answer to the wrong question. I said, "I don't think you have the correct quote," and R said, "How do you know who I'm talking about?" So I levelly answered, "All I know is that John Paul Jones never said, 'Don't give up the ship.'" My fellow students' pencils scratched the answer that R had been looking for, and I, once again, was ejected from a class that was trying very hard to make me know less than I already knew.
Monday, July 7, 2008
Does Science Fiction (Still) Matter?
But in "Why Science Fiction Matters," I argued that science fiction is (or was) more than an escapist literature of popular culture, that it fulfilled a central role in the lives of at least one major segment of the post-war generation, the upwardly mobile children of working class (or agrarian) parents. For the tech oriented Baby Boom generation, plus a segment of a couple of generations before and after, SF provided a world view, including a program for the future, access to a social network, and a window into transcendence of the sort that usually falls to religion and philosophy.
The period of SF ascendance can be demarcated by the two major science fictional events of the 20th Century, the advent of nuclear weapons and the Apollo Man-in-Space Program. In truth, I would tend to move the starting point a little earlier, to the beginning of World War II, because that war was, in many ways, a science fictional war. New weaponry, especially radar, but also missiles, submarine technology, and so forth, played a major role in the fighting and winning of the war. Moreover, commentaries prior to the war speculated on whether or not the next war would be the end of civilization. And the savagery loosed during that war met or exceeded the most nightmarish visions of pulp literature.
At the other end, the lunar landings and the space program generally seemed to validate everything that had ever been written about space exploration, and an entire generation was sure that colonies on the Moon and Mars were just around the corner. That they were disappointed in this contributed to the anti-government backlash that occurred thereafter. To this day, there are SF fans who are sure that it was only the incompetence of U.S. bureaucrats that stood in the way of their dreams of interstellar civilization.
In fact, it was reality that nixed the deal. Space is far bigger and more hostile (and less economically valuable) than most people imagined.
That's the problem with reality. It keeps intruding and messing up our dreams. There was a time when it seemed like science and scientific authority would loom large on the political landscape. But science kept delivering bad news, like warning about environmental degradation, limitations on energy use, changes in the global atmosphere and the implications thereof.
And, in truth, SF had always been a bit anti-establishment when it came to science. Astounding, under Campbell, spent over a decade pushing ESP/PSI, to no good effect. Then there was the entirely embarrassing Dianetics episode. Toward the end, a good many SF types jumped on the SDI bandwagon, as yet another excuse for space research, just as solar power satellites (a truly silly idea), had gripped imaginations earlier, and still do, to this day.
But subsequent generations took a look at all this and saw what was basically an escapist literature that had been co-opted into a number of big budget motion pictures. Fun, but nothing to wrap your life around. Now, things like World of Warcraft take more of the escapist freight than does reading SF. For religious transcendence, people are showing a disturbing tendency to turn to—religion. And, as I say, the Authority of Science has a lot of people claiming to speak for it, or attacking it outright.
So where does that leave science fiction? The aging of the SF fan community has been much remarked upon, and it's a fact of life. To my eye, it doesn't really look like SF is currently the place where someone with something to say goes to say it (I suspect that blogging now occupies that ground), nor is it the place where people go to read what such people have to say. SF has spawned several sub-genres, like military SF, alternate history, paranormal romance, and the like, so maybe that is where the energy has gone, into smaller and smaller niches.
The center no longer holds, or at least it no longer holds that much attraction. But maybe that's just me being an old fart.
Friday, July 4, 2008
Woman and Snake: Amano
This one was sent to me by a friend. Amano is a Japanese artist that I don't know nearly enough about, because he's really good. In addition to the .jp site, there's a .com site for those who insist on English.
Wednesday, July 2, 2008
But there is the matter of how to treat those who are dead, those who can neither grant nor deny permission. I can see both sides to the argument there, so it’s inevitably a case-by-case basis. In this case, I think that Barry would not mind, and besides, I think he deserves some sort of eulogy from me, the sort that I could not deliver at his actual memorial service, because there were friends and family there who might have been wounded by some of the details.
Barry was a drug dealer. He began peddling marijuana in the early 1970s, and had graduated to cocaine by the time I met him in the early 80s. He was, when I met him, at something of a low ebb. A while before, he’d planned on giving up the trade, getting married, the whole bit, then his fiancée ran off with someone he’d thought was his best friend, and then a series of ill-advised deals went sour, rendering him, if not impoverished, at least less able to finance the sybaritic lifestyle that it turned out most of his other “friends” were interested in. So he was struggling to keep from foundering financially, a struggle made much more difficult by the lure of the product that he was back to peddling.
We met through a mutual friend, and Barry and I became friends. I won’t say that there weren’t drugs involved, because at that time, if you were around Barry, drugs were usually involved. However, we also joined a bowling team together, went to concerts, parties, etc. Barry was an amateur theater person, so I went to some of his performances, and he was interested in standup comedy. I never saw him perform, but we did go to see various comics he was interested in, and the SF comedy scene has had a lot of talent pass through.
When I started to feel the effects of “the Lurgy,” the maybe-it-was-chronic-fatigue-syndrome that I got in the mid-80s, drugs were one of the first things I gave up (with the occasional lapses, of course; I’m not some sort of iron-willed paragon). But Barry remained a friend, one of those who was willing to help with those minor things that sick people need, like the occasional trip out to a restaurant when I just couldn’t get it together to feed myself.
Barry got busted in the late 80s, but he was only holding grass at the time, the dealers’ paranoia serving him well enough that he didn’t take any coke to that particular meeting. The prosecutors tried to get him to set up his own suppliers (just as he’d been set up), but Barry was having none of that. While awaiting trial he dumped the rest of the business, cleaned up his act, and took a job as a night security guard. He sufficiently impressed the arresting officer that the man recommended straight parole, no jail time, and that was that for Barry as dope dealer.
Still, Barry was ever the entrepreneur, making frequent trips to Bali, to buy artwork that he then sold when he returned home. And he’d always been a comics collector, so he ramped that up as a small business as well.
Barry had an occasional girlfriend, whom I’ll call Di. It turned out that Di had lived near RPI during my time there, at a place called “The Farm” (did every college in the early 1970s have a “Farm” associated with it?). She’d been involved at that time with a guy I put in the acknowledgements to SunSmoke, Jim Nagy, who’d played McMurphy in the RPI Players production of “One Flew Over the Cuckoo’s Nest” and had been given tepid reviews because he obviously hadn’t been acting, just playing himself. Which is to say that Nagy had major charisma, and I had the unfortunate task of telling Di that Jim had died in the mid-80s. He’d lived an amphetamine fueled life for years and had cleaned up too late, apparently, the damage having been done, and something critical finally gave out.
It can be a small world in the fast lane, even if you’re usually just a passenger.
After I married Amy, she and I had dinner a few times with Barry and Di. Amy told me after one of them that, during a time when I’d left the table to go to the restroom, Di had confided to her, “If ever any man looks at me the way he looks at you, I’m his forever.”
Barry quit the drugs, but he kept his Corvette, and that was what did it for him. He was on Highway 101 in Northern California, a twisty turny stretch that shrinks to two lanes for periods, and he was behind a camper and he was always impatient. He tried to pass when he couldn’t see far enough ahead, and someone was coming.
I developed a theory about the illegal drug business back when I had a better vantage point for observation. It seemed to me that the nature of it was primal, stark, a world of black and white. There is no legal system to enforce contracts, and the regular criminal justice system is often deliberately manipulated by criminals to their own advantage, creating situations where the police and the law were the instruments of injustice, rather than of justice. And it looked to me like the tradesmen were either snakes or honorable men, with a great gulf between the two.
A friend of Barry’s told me he saw the highway patrol report of the accident, and it suggested that Barry deliberately went through a railing and down a steep embankment, rather than hitting the oncoming car. I believe it, because that was Barry, always taking risks, but trying his damnedest to confine the damage to himself.
Tuesday, July 1, 2008
The two protagonists are a galactic exploration team, and they have discovered that the galaxy is awash with Homo Sapiens, practically one inhabited world in every viable solar system, and all of them primitives who greet space explorers with either worship or homicidal intent. It's a puzzlement.
Then they come across a civilized world, but one that is oddly decadent. They have such technology as automatic translation machines, but have no idea how they work. When asked, the inhabitants reply, "We asked the Oracle how to make one and it told us."
So, first error in presentation. You don't build things by just being told how to make them. To build a translator (or automobile, or even a stone house) you need pre-existing infrastructure like semiconductor fabs, or foundries, or stone quarries. Knowledge alone isn't enough.
Next, one source of answers simply would not work for an entire world. This is the alien-planet-as-desert-island analogy that I once railed against when critiquing Clarke's Law. A civilized world has billions of people on it, far too many to crowd into a room.
But the Oracle does indeed reside in a room, and our explorers are given an audience. It reveals that it was created by an extra-galactic race (from the Magellanic Cloud) as a weapon that worked by answering all questions truthfully. This destroys the institution of science in those who possess it (no need to pursue answers when they are handed to you on a plate), and when taken to an empire's home world, wrecks said empire.
One of the two explorers wants to steal the Oracle and take it back to Earth, rigging it to answer only his questions. The other wants to head back empty handed and warn Earth. They fight. The first guy dies. The second realizes that he is now stranded, since their ship required two men to operate. But the Oracle could tell him how to save his own life, so….
In one of the Foundation stories, Asimov makes a swipe at what happens when you trade science for scholarship, i.e. when you stop experimenting and just look up the answers. I never bought that argument. Nobody verifies everything that they are told under the authority of science; to attempt to do so would result in another end state, one where science keeps reinventing the wheel, over and over again.
However, "Perfect Answer" gets the reaction of the two explorers right, or, more specifically, that of the one who wants to monopolize the gizmo. That is how it would actually work, so our travelers should not have found a happy-go-lucky decadent society; they should have found an authoritarian state in the grip of those controlling access to the answers from the Oracle.
Now you can replace Oracle with "simulation model." But you still need that infrastructure that I spoke of earlier. In science, the infrastructure consists of scientists and the community of science. The community of science is not command-and-control oriented, as many have discovered, to their discomfort.
There is a difference between authoritative and authoritarian, which some people get and some people do not. Authoritarians don't get it. They never do.
Monday, June 30, 2008
The View from 30,000 Feet
The only other faces you see are those who are in the plane with you. Your inner circle looms larger than entire counties. It's no wonder that cronyism becomes the watchword from high above. Who else matters except the nearby few?
Pilots are used to privilege. They sometimes fancy themselves as "mavericks," like the character in Top Gun, but really, they're at the apex of a pyramid with the lower orders devoted to keeping them in the air, and they cannot stray far from the pyramid. Every pilot depends utterly upon the dozens of maintenance personnel who keep the plane from failing, the hundreds who built it, the thousands (and more) who have paid for it.
Can the country afford another 30,000 ft. President? I think not. John McCain may not have Bush's sadistic streak, nor his superstition and prejudices, but the sense of privilege is fully intact, an inevitable result of heritage and the flyboy mystique. Moreover, McCain has killed, directly, by dropping bombs on targets from high above, in a different war that was also instigated with lies. It would be termed murder if it ever went to trial, the sole defense being "I was following orders," and we know how that works.
It is part of American Exceptionalism that our country claims the right of aerial bombardment, to kill from on high, with only the phrase "collateral damage" serving to cover the deaths of women, children, or innocent men merely doing their jobs. "Strategic bombing" is entirely a doctrine of total war, the belief that war is inevitably genocidal, a duel to the death between two tribes of humans. There are no civilians in total war, only nits and gnats, and body counts, if you care to make them, which our country no longer cares to do.
But explosions do look beautiful from high enough. So do hurricanes and the damage done. Almost everything looks beautiful at a distance. It's only up close where the pain and suffering reside.
Saturday, June 28, 2008
It's certainly true that many, if not most, Americans believe this is true, at least at some level. And it does lead to all sorts of pernicious behavior and attitudes, including denial of all the tragedy that our country has wrought over the years, decades, and centuries.
[Here, incidentally, is where I'm supposed to insert something about all the good that America has done, in order to prove that I "don't hate America," that I do love my country, and wouldn't think of living anywhere else, etc. etc. Then we all sing The Star Spangled Banner. But I'm kinda tired right now, and I'm not sure I could hit the high notes].
It seems to me that there are plenty of other countries that think they're pretty special. Britain once "ruled the waves," and the Brits certainly thought they were better than the "wogs [who] begin at Calais." China has always seemed pretty full of itself (in so many ways). I promise you that the Japanese feel plenty exceptional. The French? Do tell. Germany? You don't try to take over Europe if you feel ordinary, and the hair shirt they've worn for the past half century was tailored just for them. Israel? Check. Egypt, Saudi Arabia, Iran? Check, check, and check.
People write about these different exceptionalisms to varying degrees, and the U.S. gets the lion's share of ink. But every country seems to have an exception clause built into its national character, as nearly as I can tell. I would be interested to hear of some country whose inhabitants all say, "Ah, our little country is pretty ordinary. We exist more or less by accident, you know, and if we vanished tomorrow as a nation, history and the world would probably never notice."
I mean, that would be quite extraordinary, wouldn't it? Even exceptional.
Friday, June 27, 2008
Dylan: Lay Down Your Weary Tune
For Black Dog Barking, who wondered in comments why the invisible hand of the market never put this on a commercial album. My answer would be that the invisible hand works in mysterious ways, only a few of which have to do with markets and money.
As noted, the photography is from Miranda Jane, who was unknown to me until now. More of those "invisible college" things that the pointy heads such as myself talk about.
Wednesday, June 25, 2008
Women and Snakes: Simplicity
Because sometimes, you just want to see a photo of a scantily clad, good-looking woman, in heels, on a bed, with some snakes.
Tuesday, June 24, 2008
The Occult History
I won the raffle at a party thrown by a job agency a while back, and one of the prizes was a gift certificate from Starbucks. I don’t drink coffee; a single glass of Coke at dinner is enough to move my sleep time back an hour or more. But Starbucks sells other stuff, so I had a cup of hot chocolate and bought the Dylan No Direction Home CD.
The PBS special on Dylan was directed by Scorsese, and covered Dylan’s career up to the point where he had his motorcycle accident. It feels important, somehow, that Dylan survived the accident, that he didn’t follow the “good career move” that got so many of the other 60s icons. Dylan was always the Trickster, so it also feels appropriate, and besides, living is better than dying. I don’t care how many mediocre albums he’s made since then, how many unmemorable songs. He’s alive; good for him.

One of the things that was very obvious, and left very unmentioned, in No Direction Home was how thoroughly ripped he was for much of the time. The scenes from “Don’t Look Back” were particularly obvious, with Dylan’s speech, wordplay, little tics and gestures, all showing the obvious signs of amphetamine use. Gee, a pop star in the 60s on tour, using speed. What a shock. And a lot of his songs are scornful; just watch one of his press conferences and take a guess as to why.
The obvious reason why the documentary didn’t mention the drug use is that, once drugs get mentioned in any narrative, the overall narrative gets hijacked by the drug narrative. It’s pretty much the same with sex; once sex gets mentioned, it takes over the story line, because that’s what people are most interested in. I’m not sure what happens to a sex narrative once drugs are mentioned, or vice versa. I suspect it’s just that there is a sex/drugs story, and well, there you are.
The drug narrative, in order to be palatable, pretty much has to follow either the “I saw the light and now I am redeemed” plot line, or the “descent into hell followed by death” plot line. No others are really acceptable to a mass audience, although there are specialty tastes, of course, and times do change. It used to be the case that adultery had to be punished, for example, but not so much these days. Of course, adultery used to be actually illegal, and drugs still are. More accurately, the drug narrative is the illegal drug narrative. Legal drugs tend not to get much of a mention because there is no “moral principle” involved, unless the drugs are procured illegally, of course.

This wasn’t always the case. Opium was legal when Coleridge wrote “Kubla Khan” and that got turned into a morality fable, of a sort. When you examine it closely, however, it’s hard to find what the moral of the tale is. Without opium, there wouldn’t have been a poem; its incompleteness is usually blamed on the “gentleman from Porlock.” It’s also worth asking if “Kubla Khan” would have received the same response if it had not been known as an opium dream, and if it had been finished. Again, no way to know, is there?
Drugs affect art. Hell, everything affects art. But drugs, owing to their effects on the psyche, modify art more than most other things. Of course, love has had more effect on art than, say, heroin or amphetamines, but love has its own neurotransmitters. Pharmacologically, love is a stimulant. Add War, in its most general terms, to the list, and you’re still talking about internally generated substances linked to external events. Love, War, Drugs, go write about those. Let me know if you find richer subjects.
Most of the attention given to drugs in the narrative of the artist and the artistic life follows the plot of seduction and corruption. He had such a promising career until he became an addict. By the time she was 40, she looked 70 and her voice was shot, owing to the combination of alcohol and drugs. And so forth.

I once saw a bio of F. Scott Fitzgerald that referred to the matter as a “Faustian Bargain,” and that has more truth to it. We can decry the art lost to early death from alcoholism, but we cannot know how much of the art during life was the product of that alcoholism. One is supposed to hew to the line that drugs and alcohol only subtract, never add, but can one really listen to “Subterranean Homesick Blues” and believe that it would have been the same song without a speed boost?
Dylan in particular was old school beat poetry with a rhyming dictionary. Try to imagine Kerouac without the liquor and speed, Burroughs without the heroin, Ginsberg without the peyote. One might as well imagine Nick and Nora Charles without martinis or Hemingway without the guns and fights.
In SF, there are also plenty of overt examples of the occult history. In my essay Sleeping in Fritz Leiber’s Bed, I note that Leiber’s alcoholism informed a number of his stories, including “The Thirteenth Step”, “Gonna Roll the Bones”, and “The Secret Songs”. Leiber might have written other stories had he not been an alcoholic, but they would have been different stories. The same is surely true of Philip K. Dick, whose habit (until his health failed him) was to sell a book contract, then take enough amphetamine to “speed rush” (Dick’s own phrase for it) the book into existence. And really now, does anyone think that Dick’s paranoid, reality-shifting, dark-yet-glittering visions would have been the same without the meth and dex?
The title of this essay is the short-hand that my friend Dave and I use to refer to the general subject of the effects of drugs on history generally, but on art in particular. I think Dave first used it when he’d listened to Elvis: The Sun Sessions and discovered a cut entitled “Bop Pills.” “Jim,” he said, “We overlooked something very important. Elvis was a truck driver.” What he meant, of course, was that truck drivers have always used uppers, and always knew where to get them. “Bop Pills” was part of how you got to bop.
You can certainly make your own list of all the art that has had drugs as part of the pervasive influence. “Wine, women, and song,” has been replaced by “Sex, Drugs, and Rock ‘n Roll.” It’s the same old song, though everything else about it has changed.
Well hell, without sex and drugs, what would the songs be about?
Monday, June 23, 2008
Another real gem was The ABC of Technocracy, by Frank Arkright. The word “technocracy” means “rule by experts” and a lot of people were talking about it near the beginning of the 20th Century, people like H. G. Wells and Thorstein Veblen. But by the time the Great Depression rolled around, it had turned into a crank economic theory, holding that the problem was that the value of money fluctuated (which is mostly true), so it should be instead based on something whose value didn’t fluctuate (which is probably impossible). The Technocracists decided that money should be based on energy, with the basic unit being the erg.
I think I recall a mention of Technocracy in Martin Gardner’s Fads and Fallacies in the Name of Science but there’s no substitute for the pure uncut stuff. What I mostly recall from The ABC of Technocracy is just how tired I got of the endless repetition of the slogan, “an erg is always an erg.” (And you thought you got tired of the phrase “Guns, Germs, and Steel,” in the PBS series). Yes, from a physics standpoint it’s sorta kinda true that the erg is invariant, but from an economics standpoint, context still matters. An erg of electricity in my toaster is still more valuable to me than an erg of heat on my roof.
I’m guessing that the notion of a unit of energy as money came from the labor theory of value, the notion of Ricardo (and Marx) that all economic value is derived from human labor. Confuse “labor” with “work,” and confuse the latter’s meaning in economics with its meaning in physics, and Bob’s your uncle.
Of course, even in physics, “work” isn’t the same as “energy,” since thermodynamics limits the amount of work that can be extracted from any given source of energy, but that’s hardly the most egregious error in the mix, is it?
And jeez, why the erg? I mean, that’s a tenth of a microjoule, and a joule is much closer to human scale: one watt-second, enough to lift a kilogram about a tenth of a meter. An erg will lift about a milligram one centimeter. What good is that? It would be like trying to base your money on micrograms of gold. That’s too small to even see.
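If you want to check the scale argument, the back-of-the-envelope numbers are easy to verify. A quick sketch, with g rounded to 9.8 m/s² and the heights chosen to come out even:

```python
# Sanity check: the energy to lift a mass m through a height h is m*g*h.
g = 9.8  # gravitational acceleration, m/s^2 (rounded)

def lift_energy_joules(mass_kg, height_m):
    """Potential energy gained lifting mass_kg through height_m, in joules."""
    return mass_kg * g * height_m

one_joule = lift_energy_joules(1.0, 0.102)   # a kilogram raised about 10 cm
one_erg = lift_energy_joules(1e-6, 0.0102)   # a milligram raised about 1 cm
print(one_joule, one_erg)                    # ~1 J and ~1e-7 J (one erg)
```

Seven orders of magnitude below the joule, which is itself a modest unit; that's the scale problem in a nutshell.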
Gold, at least, has some advantages as a commodity basis of money. It’s not a consumable, for example. It lasts more or less forever. It’s nice and compact, so it’s easy to store. It’s pretty, so you can always make a necklace out of it.
Of course any commodity-based money puts your money supply at the mercy of changes in relative commodity values. Gold in California resulted in a huge local inflation (e.g. the legendary ten dollar eggs), followed by a national inflation, which was then followed by the inevitable compensatory deflation. It was such a joy to be a commodity producer in the 19th Century, though I admit, it did beat being an inhabitant of Central America in the 16th Century.
The essential error here is confusing what are called “institutional facts” and “brute facts.” The former depend upon human institutions, like the value of money, the location of a state line, the name of the President of the United States, or, indeed, the existence of the Office of President, or even the United States itself.
By contrast, water freezing is a brute fact, as is the weight of a certain volume of gold, or the conversion of one form of energy to another. All proceed untouched by human hands.
There’s a related error here, however, and that is the notion that brute facts are somehow superior to institutional facts. One can make all sorts of conjectures and claims about “objectivity” vs “subjectivity” and the nature of human institutions and the physical world, but I rather suspect that a big part of the attraction of Technocracy and its erg-based money was the idea that scientists and engineers would run things better than politicians, bankers, or even economists. After all, energy is better understood than money, right? So why not use energy as money?
And there lies the error in the idea of technocracy in its more general meaning, “rule by experts.” It has at its center certain prejudices about what constitutes valid expertise. But a politician is an expert in his own field; if you don’t believe me, watch what happens if you try to get any given physicist elected to Congress. Everyone believes that their own job (or class, or race, or political philosophy) is more difficult and more important than the next guy’s, so why not try to gimmick the system to make sure that the “right” people run things?
And there’s no idea that is so loopy that someone won’t re-invent it:
Quoted in The Economist’s View:
A new kind of money, by Julian Darley, Alternet: The decline in the availability of cheap energy is likely to be accompanied by an equally ominous possibility of world financial meltdown. That we are facing both of these threats now is not an accident: energy and financial stability are intimately linked. I believe the solutions for dealing with these twinned threats are equally linked. To build an environmentally sustainable, monetarily stable world, we need to create an economy in which locally produced energy provides the backing for local currencies. ...
Friday, June 20, 2008
Many years later, I forget the exact circumstances, but I think it involved parking my car in a very small space. Dave volunteered to get out and talk me in, but I brushed him off and parked the car by eye and feel. Afterwards, he said something like, “You didn’t trust me to do it, did you?” I thought for a moment and confessed, “Probably not, but don’t take it personally; I don’t really trust anybody.”
I don’t know whether it’s a “guy thing” or an American thing, or a Southern thing that I project, but I’ve known a lot of people who aren’t real big in the trust department. When you view life as a struggle, red in tooth and claw, trust is pretty scarce. I can relate, given my own experience and reactions. For one thing, just ordinary acting in good faith doesn’t seem that thick on the ground these days. I have a natural tendency to take people at face value that has eroded pretty badly over the years.
I remember one project manager who, I’m pretty sure in retrospect, was trying to sabotage my career, by telling a client lies about me, and telling me lies about the client, and making sure that I never got a chance to talk with any of the client representatives directly. He later quit to become an EST trainer and got personally screwed over by Werner Erhard, after which he had a psychotic break and more or less complete mental collapse. I am not so highly evolved that I didn’t enjoy watching that Karmic Komedy play out.
But simple treachery hardly accounts for the matter. When does “passive aggressive” turn into “doesn’t give a damn?” I don’t know, but I’ve learned not to rely on unsecured promises. Beyond that comes the frequent simple inability of many people to accomplish what they set out to do. They start with the best intentions, but something comes up, something invariably comes up, and there you are, stuck holding that bag.
So I make allowances, and I’ll bet you do too, so often that you don’t even realize you’re doing it most of the time. You tell the chronically late fellow that you’re going to leave two hours before you really need to leave. It started out as fifteen minutes, but the chronically late guy caught onto that, so it’s been clock creep ever since.
Back in the 70s and 80s, there were all the “human potential movement” tropes, one of them being the “trust exercise” where you stood up, closed your eyes and fell backwards, trusting the person behind to catch you. How pathological is it of me that I cheated on trust exercises? I never trusted the folks behind me to catch me; I just decided that I didn’t mind falling on my back. Besides, I knew how to fall.
But then there’s this. A while back, Ben made a comment about my “owing Amy my life.” He was referring to her quite heroic endeavors on my behalf immediately following my melanoma diagnosis. She cut through the county health bureaucracy in probably record time; by the end of the day that I’d been given the diagnosis, I had a gatekeeper physician appointment for the very next day. The next day I got scheduled for surgery, and an appointment with an oncologist for the following Monday. Time counts when you have cancer, and delay can make the difference between a good and bad outcome.
“Owing your life” sounds like a debt, though, and this doesn’t feel like a debt. It’s impossible to be sure of the “what ifs?” Without Amy’s efforts it would probably have taken longer, but I expect I’d have managed it. I usually manage. That isn’t really the point. The point is that I didn’t have to, and the point is that I can (and do) trust Amy with my life. That isn’t a debt; it is a much more blessed state of mind.
Thursday, June 19, 2008
“Amy and I had dinner last weekend with Brad and a couple of other guys. One of them was a Buddhist who was also an Elvis Costello fan and we spent some time swapping concert stories. He’d been to Madam Wong’s and had seen the Naughty Sweeties during their heyday. On the other hand, he hates Jackson Pollock.”
“It’s always a little surprising when someone who likes the same things doesn’t like all the same things.”
“Yeah, it makes the idea of ‘shared experience’ a little dicey as a way of separating ‘us’ from ‘them.’”
“Then there’s the flip side, when people seem to like the same things, but for such completely different reasons that they might as well be from Mars.”
“Ah, sure. Welcome to my world. Except I’m the Martian.”
“Where’s the Kaboom? There was supposed to be an Earth Shattering Kaboom!”
“The Illudium Pu-36 Explosive Space Modulator! That creature has stolen the space modulator!”
“Speaking of Martians, what do you think of Tom DeLay’s asking The Colbert Report for the clip of Colbert asking Robert Greenwald, ‘Who hates America more, you or Michael Moore?’”
“I think DeLay and some of his people may be brain-damaged. There are certain sorts of deficits that make people unable to comprehend irony. It’s like aphasia; they just don’t hear it.”
“Well, that’s charitable.”
“Yeah, I’m a philanthropist. But maybe that’s another way to run the password thing. Remember when we were doing IQ guessing?”
“Yeah, you were pretty good at it.”
“It’s not that hard. Most IQ tests load primarily onto verbal acuity. You can get that by just talking to someone for a few minutes. I do remember that a friend of mine in college asked me what I thought the IQ of his fiancée was. I thought about it a moment and realized that she was smarter than I’d have first thought, IQ around 135, which turned out to be an exact hit. She’d laughed at the right places in our jokes, not a beat behind, like someone who is following someone else’s laughter.”
“You think it would work for the sentry?”
“I don’t see why not. Guy comes up to the checkpoint and the guard says, ‘Halt, friend or foe?’ and the other guy says, ‘Friend.’ So the sentry says, 'A priest, a minister, and a rabbi walk into a bar…'”
Wednesday, June 18, 2008
Was there ever really a World War II movie where the sentry asked the guy coming up to name the team that won the American League Pennant in 1940? (Ha! Bet you said the Yankees! But actually the Detroit Tigers won it, the only break in what would otherwise have been an eight-year streak for the Yankees). There must have been some movies where that sort of thing happened, but I’ll be damned if I can think of one offhand.
Anyway, you can see the danger in that sort of password. All the enemy needs is a knowledge of American baseball, and you’re screwed. Real passwords need to be arbitrary, hard to guess, like swordfish, or taiyo kamuri.
We may be hard-wired to have a sense of “us” and “them.” There have been news stories that reported on the “implicit bias” tests that I mentioned in an earlier post as demonstrating that people are “naturally” racist. That argument fails both because those tests show the effects of learning, and also because “natural” doesn’t mean “inevitable” or “good.” That last part applies to any “us-ness” and “them-ness” as well. We may perceive such things as part of our basic functions; what we do with those perceptions is something else again.
How we decide who is “us” and who “they” are also matters. Sometimes it’s appearance, certainly. At other times it’s dress, language or dialect, behavior, or abstract notions like nationality and religion. When the demarcation gets abstract, as it is in things like religion or political faction, what then? What is the litmus test?
Let me suggest that, like the password during wartime, the way to tell us from them needs to be something that can’t be simply guessed by being rational; irrational requirements make a much stronger test. So the crucial test becomes adhering to some behavior that looks at least a bit weird to an outsider. You can eat meat, just not meat from “unclean” animals. Or you have to pray a certain number of times a day, facing a particular direction. Or you’re not allowed to dance, or sing to musical accompaniment. Or you have to believe that some well-respected scientific theory is a hoax.
Obviously, the more irrational the behavior, the greater the cost of belonging. Paradoxically (but in accord with human psychology), this enhances the perceived value to the believer.
Fortunately, irrationality isn’t the only thing that’s hard to guess. Experience itself isn’t rational, it’s non-rational, so shared experience can bind a group together as tightly as a hunting band or jazz combo. The shared experiences don’t require direct interaction amongst those who share them, either (although obviously such interaction intensifies the connections). It’s often quite enough to have seen the same sights, felt the same emotions, to make you one of “us.”
So we come full circle back to popular culture. There are a lot of folks writing in the blogosphere, who, whatever their primary interest, suddenly stop to post an iPod playlist. For the past several generations, music has been a crucial part of the shared experience, a way of affirming that, yes, we do all share some common ground.
When Ben first loaned me his iPod shuffle, I loaded it up with T-Bone Burnett’s The Criminal Under My Own Hat, Chris Isaak’s Speak of the Devil, the CD from the Dylan No Direction Home documentary, INXS, Welcome to Wherever You Are, The Chieftains, Long Black Veil, and a CD called The Heart of the Forest, music of the Baka people of Cameroon. The rest of it mostly came from a mix CD I made a couple of years ago. I've written previously about the art of the segue, and set to shuffle, the thing sounds like a radio show that my people would like to hear, one where they'd feel they belong wherever it played.
Tuesday, June 17, 2008
Woman and Snakes: A Loss for Words
This one comes from Crooked Brains. I'm at a loss to describe it. I'm pretty sure it's disturbing, but then what? Is it grotesque? Is it erotic? Is it beautiful? Is it even possibly obscene? Could it be said to be essentially exotic? Evocative? Rich in associational content? Bizarre? Wondrous? Unsanitary? Shocking?
I'm gonna go with, I wonder what it would be like to meet either the woman or her snakes?
Monday, June 16, 2008
Working the System
The first feedback control device was probably the float valve, used in ancient water clocks. I’m discounting biological and other feedback systems, obviously. Those are usually called homeostasis.
Prior to the 20th century, there aren’t a lot of examples of feedback devices. Watt’s governor is the one most commonly cited, and its 18th century origin was close to concurrent with the steam release valve, which is also a feedback device. The governor is also of note because it is an example of proportional control. It didn’t just shut the steam on and off; it throttled the steam by varying the size of an aperture.
On/Off control is the sort that you get with a thermostat. When the temperature drops, the furnace kicks on at full force, then it stops when the temperature rises. That produces a limit cycle because the process is non-linear. If the furnace heating were proportional to the difference between the room temperature and the thermostat’s set point, then you’d have proportional control. That also produces a cycle, but the cycle is sinusoidal, and the process is termed linear, because of the type of equation that describes it.
The 19th century invention of a torpedo control system by Robert Whitehead was probably the first mechanical invention that addressed the oscillation problem. The first torpedo designs used a simple hydrostatic valve to adjust the control fins, but this caused “porpoising,” an up-and-down motion that sometimes put the torpedo above the surface of the water. Whitehead realized that something was needed to damp out the fluctuations, so he devised a pendulum that crudely measured the torpedo’s angle and modified the control in the direction to reduce that angle. This added a rate-of-change term (aka, a derivative) to the control equation, and reduced the depth fluctuations of the torpedo from 40 ft. to less than 6.
The problem with a proportional controller with damping is that the system often settles to a point of stable error, because the small error signal is damped out by the derivative signal. The solution to that is to add what is called the integral term, so a small error signal is integrated over time, and thus builds to a large enough signal to move the settling point.
The first example of a full PID (proportional-integral-derivative) controller comes in 1922, when N. Minorsky devised an automatic controller for the steering of ships. The mathematical characterization of control systems was also advanced enough by then to properly analyze such systems.
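The progression traced above, from bare proportional control through Whitehead's damping (derivative) term to the integral term, can be sketched as a toy discrete-time simulation. This is purely illustrative, not Minorsky's formulation: the gains, the first-order plant, and the constant load are all invented:

```python
# Toy discrete-time PID controller driving a first-order "plant" toward a setpoint.
# The proportional term reacts to the current error, the integral term
# accumulates small persistent errors, and the derivative term damps oscillation.

def simulate_pid(kp, ki, kd, setpoint=1.0, steps=200, dt=0.1):
    """Return the final plant state after `steps` of closed-loop control."""
    state = 0.0
    integral = 0.0
    prev_error = setpoint - state
    for _ in range(steps):
        error = setpoint - state
        integral += error * dt
        derivative = (error - prev_error) / dt
        control = kp * error + ki * integral + kd * derivative
        prev_error = error
        # First-order plant with a constant disturbance (load) pulling it down.
        state += dt * (control - 0.5 * state - 0.2)
    return state

p_only = simulate_pid(kp=2.0, ki=0.0, kd=0.0)    # settles with a steady-state error
full_pid = simulate_pid(kp=2.0, ki=1.0, kd=0.5)  # integral term removes the offset
print(p_only, full_pid)
```

Run it and the proportional-only controller settles short of the setpoint, exactly the "stable error" described above, while the full PID version closes the gap.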
The “feedback loop” as it came to be called, seemed to offer some benefit to another, more general problem, of the sort that a wide variety of scientists and others were facing, that of the reductionist trap. When someone says, “We’re nothing but a bunch of atoms that think we’re alive,” that’s voicing the reductionist trap. A bunch of atoms we certainly are, but it doesn’t seem accurate to say that we’re nothing but a bunch of atoms. There are, after all, a lot of bunches of atoms around, but none of them behave just like me. I rather doubt that any of them think they are me, either.
Another way of addressing the problem is to use phrases like “emergent phenomena,” which is a fancy way of saying that the whole is more than the sum of its parts. Since a feedback loop is also more than the sum of its parts, and since homeostasis (feedback, remember) is a general characteristic of living organisms, there came a general belief that feedback analysis might offer some insights into biology, or psychology, or sociology.
Thus was born the Cybernetics Movement, which included some folks like A. H. Maslow, whom I mentioned in a recent essay, as well as Margaret Mead and Gregory Bateson, plus some heavy hitters like Claude Shannon, John von Neumann, and Norbert Wiener, whose 1950 book, The Human Use of Human Beings: Cybernetics and Society, became a best-seller. I’ll mention in passing that Claude Shannon had just pretty much invented information theory, which, aside from revolutionizing electronic communications, also became part of the cybernetics movement.
Later, Cybernetics became General Systems Theory, which was not exactly a cult and not exactly a movement. But it did have some Believers, and I was probably one of them. The systems guys were of the belief that systems theory could be applied to, if not everything, an awfully big part of everything, and that it could and would revolutionize everything it was applied to.
In my own case, as I’ve previously written, I was attracted to the idea of simulation modeling of large scale biological, environmental, and social systems. I started off doing lake ecology, then slid over to atmospheric chemistry with barely a hiccup, because the methods of analysis were so similar. So that part of the program worked pretty well, at least from my viewpoint. However, I hit the downside of it all pretty quickly.
The first downside was that, while the tools of analysis were top notch, using them in real-world situations requires a lot of data, and the methods of data collection weren’t really up to it. I hit that first in lake ecosystem modeling, where the data on sunlight, nutrients, and plankton were pretty good, but the data we had for fish populations were horrible. And, oddly enough, the fish were important. After that experience, atmospheric science was wonderful; there was so much data available.
The second drawback was the real killer: analysis isn’t enough. In order to “change the world” you have to change the world. You can have the right answer, but if people aren’t willing to use it, what good is it? And, if your way of doing things is different in any way from what people are already doing, what they are, in fact, trained to do, you’re not going to make much headway.
It doesn’t help to blame the other guy, either. Everyone thinks their job is hard and everyone else’s is easy. No, what they are is different. Getting the correct engineering analysis isn’t the same as getting the right policy analysis, and neither of them make getting the policy adopted that much easier.
The worst of it was with the physicians. They go through hell getting their medical education. If you want medicine to change, you’re going to have to wait for an entirely new cohort. Worse, because medical education is also controlled by those same people, you’re actually talking about many generations. I watched more than one systems engineer bash his head into that brick wall, over and over again.
The quote at the beginning of this essay is from July, 2006. It could just as easily have been from 1976. Or 1956. Maybe it will happen, but I’m not holding my breath.
Saturday, June 14, 2008
One important criticism of meme theory hinges on the following question: "If memes are the solution, what is the problem?"
“Meme” comes from that troublemaker Richard Dawkins, of course, as an analogy to "gene" in molecular biology. I usually don’t mind a bit of trouble, but really now, enough is too much. Meme, according to Dawkins, refers to a unit of cultural information transferable from one mind to another. Dawkins said, “Examples of memes are tunes, catch-phrases, beliefs, clothes fashions, ways of making pots or of building arches.”
So what exactly is wrong with using the words “tunes, catch-phrases, beliefs, clothes fashions, ways of making pots or of building arches” when you want to refer to them? Or, if you’re moving up the abstraction chain, what’s wrong with “idea, concept, notion, belief, ideology, etc. etc. etc.”? I mean, it’s not as if we have a dearth of words for these things.
Worse, “gene” actually means something; most genes code for a particular kind of protein formation, and those that don’t are modifiers of other gene expression (or they are “silent,” but what good is a “silent meme?”). What’s the analogy to a protein in the abstract universe of the “meme?” Behavior? Emotion? Doing the funky chicken? (Which may be considered both behavior and emotion if you’re doing it correctly). Moreover, genes are communicated primarily from parent to offspring, while “memes” are mostly just communicable. So memes are more like viruses and other diseases. If you want to look at the spread of ideas, it’s better to look at disease vector models than it is to look at heritability.
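To make the disease-vector comparison concrete, here is a minimal SIR (susceptible-infected-recovered) sketch, the standard epidemic model, with invented parameters; nothing in it is specific to ideas, which is rather the point of the analogy:

```python
# Minimal discrete-time SIR model: ideas, like infections, spread by contact
# between "susceptible" and "infected" individuals, not from parent to child.

def simulate_sir(beta, gamma, s0=0.99, i0=0.01, steps=500, dt=0.1):
    """Return (susceptible, infected, recovered) population fractions."""
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_infections = beta * s * i * dt   # contact-driven transmission
        new_recoveries = gamma * i * dt      # infected lose interest over time
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
    return s, i, r

s, i, r = simulate_sir(beta=0.5, gamma=0.1)  # R0 = beta/gamma = 5: it spreads
print(round(s, 3), round(i, 3), round(r, 3))
```

Transmission depends on contact rates and "recovery," not on anything resembling inheritance, which is why the epidemic framing fits the spread of ideas better than the genetic one.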
And, crap, here I am treating this as if it actually meant something. But “meme” has more meanings than “paradigm,” another fad word that caused more trouble than it was worth. And a word with too many meanings winds up having no meaning at all. These are words that make you stupid, or at least words that interfere with clear thinking.
No, I’m thinking that “meme” is more akin to the secret password (see message 106, May 29, 2006). It’s a word that is overwhelmingly used by the left/progressive side of the intellectual aisle, expressing solidarity with Dawkins, Darwin, and in-your-face you creationist, Christian conservative scum!
Well, dang, your goal is admirable; it’s your methods I question. At least that’s my “story,” (or idea, notion, plan of action) and I’m sticking with it.
I don’t much like mimes, either.
Friday, June 13, 2008
An Other James
I was at a party once, attended mostly by SF fan types, and the subject of name changes came up. I asked the group how many of us had changed our “go by” names in some significant way since we were young. It turned out that something like ¾ of us had.
My own change was fairly major; I was “Pete” in high school, then “James” or “Jim” in college. The former had made sense to distinguish me from my Dad (I’m actually a “Junior”), but I’d never cared for Pete as a name, and college is all about reinvention. I eventually had to settle for "Jim" because so many people just automatically shorten the name unless you're a bore about it. I did, however, draw the line at "Jimmy." There is one person in the world who is allowed to call me Jimmy, and he goes by Jimmy, and he's older than I am, so I can't/won't object.
So the guy I knew as James in high school, and who still goes by that name, knew me as Pete. He didn’t like me, and vice versa. I was what was known then as a “brain,” while he was a “hood,” short for “hoodlum” back then. (Now it's short for "neighborhood" and comes from African American slang). Truth to tell, in our white bread suburb, before drugs, the counter-culture, and the flood of semi-automatic weapons, “hoods” were usually pretty tame.
Which isn’t to say that they didn’t make trouble, nor that I had no trouble with them. Indeed, three times a week, just as school was letting out, I’d be down on the corner waiting for the bus that took me to the downtown YMCA, where I was a lifeguard and sometimes gym class leader for younger kids. But there on the bus bench, I was a target. Sometimes the rowdies would just yell at me; sometimes they would throw things like wadded up paper, empty cartons, or occasionally, half empty soda cups, or empty soda cans.
One day, after a week where it seemed like the barrage had been escalating, I reached down, grabbed a rock, and threw it back. It was a pretty heavy rock. I still remember the rather sickening thud it made against the car door.
The car pulled over and James got out and began stomping in my direction. He outweighed me by probably 40 or 50 pounds and I’m sure he expected me to run. I did not. Instead, I said something phony tough, like “Come on!”
I’m not sure what I’d have done if he’d followed through. I wrestled at the Y, (but within my weight class!), so I’d have probably gone in low, hoping for a leg grab to upend him. Most probably, he’d have beaten the crap out of me. But it never came to that. I was a brain and he was a hood, and it was a busy street, and plenty of people had seen the first object thrown at me. Even if he won the fight, he would have lost, because hoods get into trouble pounding on guys smaller than they are, especially if the smaller guy actually puts up a fight.
So he turned, said something that I didn’t catch, got back into the car, and he and his buddies drove off.
They left me alone after that.
Several years later, on my last trip back to Donelson before my folks moved away, my Dad bought me a $350 used car as a present to take back to graduate school. As I was leaving town, I stopped off for gas, and there was James, working at the gas station. I was enough of a snobbish snot to take pleasure in the thought of him being stuck in a dead end job for the rest of his life.
My high school class hasn’t had many reunions, partly because what had been our high school is now a middle school, so there’s no administrative push for reunions. But we did have a 20 year reunion, and I won the prizes for “came farthest” (you’d almost need to leave the continental U.S. to beat me), and “most changed,” which was basically my classmates voting on whose appearance had changed the most. The still long hair and beard carried the day. The guy voted “least changed” did indeed still look a lot like he did in high school; I just didn’t remember him as looking as gay as he obviously now is.
At the first party on Friday night, held in the “Don’s Den” building that was the after-the-game party place that I’d actually never been to before, I saw James. He looked very good, with no major weight gains, smile on his face, and a pretty wife nearby. “Hi,” I said (or something equally clever). “What are you doing these days?”
“Actually, Pete,” he said, “I’m a cop.”
Later, I learned that he was not only a cop, but he was a major honcho for the Nashville Police Kids Summer Camp (or something like that) the sort of place where you send kids who are starting to maybe be a problem, in hopes that some summer sunshine, clean air, and good role models will straighten them out. So that’s part of his job now: being a good role model.
Okay, okay, it’s almost as stereotypical as the gas station gig. Wild kid turns his life around and becomes a policeman. But narratives work like that. At some point, James had to decide who he wanted to become, and he chose to be like the authority figures he’d had experience with, the ones who had probably more than once cut him slack when he needed it, the ones who’d been there themselves in a previous turn of the narrative wheel.
James had been a major driving force behind organizing the reunion; he’d wanted to show the rest of us he’d turned out well: good family, good job, pillar of the community. And I liked him, and I liked that he was proud of what he’d become, and I liked that he wanted to show it off.
We get plenty of narratives about the curses that are passed down from the older generations to the younger: poverty, pedophilia, drunkenness, drug addiction, child and spousal abuse. I take from James the counter-narrative, that the right mix of kindness and authority is also contagious, at least if supplied to the right person.
I once took it as a major sign of my own personal evolution when I realized that I really didn’t believe that the world’s problems would disappear if everyone was like me. There are plenty of people in the world who aren’t much like me at all, who add to the world and my appreciation of it. James is one of them, and I hope he prospers.
Thursday, June 12, 2008
I once did pretty much exactly this for two New York newspapers for 1911, Hearst’s New York American and The Morning Telegraph (not a Hearst paper), because Damon Runyon wrote for the former and Bat Masterson wrote for the latter. I had a story in mind, and I managed 20,000 words of it, though I’d need to get back to the source material to do any more of it. It took a particular mindset, and that mindset came only with full immersion.
That’s more or less my first point here, that history looks a lot different when it’s happening, and primary sources are essential. Otherwise, you’re just taking sides in what amounts to literary criticism, comparing the narratives assembled by different historians, each with their own notions of what parts are important. That’s true with the newspaper accounts also, of course, but the narrative tissue is often easier to unwrap when it’s been hastily conjured in an ephemeral publication.
My experience also left me with a certain unease about historical fiction generally, including alternate history. Part of that comes from a realization that I had that it’s impossible to do historical figures justice in a modern narrative. Their actions made sense to them, embedded as they were in their own times, but modern audiences will not abide a true re-creation of those times (how many previous years’ bestsellers are even in print nowadays?), and translation to modern sensibilities smothers the real individuals.
I don’t have that sort of problem with out-and-out historical fantasies; it’s understood (at least by me) that the Edison or Coleridge that you encounter in a Tim Powers novel isn’t meant to be the real guy, and anyone who confuses them has trouble telling fact from fiction in the first place.
The problem is hardly limited to history, is it? Amy sometimes does transcription work, and seems to have found a small niche amongst a certain sort of documentary filmmaker. As a result, we have videotapes of various people talking about Sam Wagstaff, who was Robert Mapplethorpe’s lover, patron, and promoter, and who was, as much as anyone, responsible for the shift toward considering photography a fine art. One of the interviews is with Patti Smith, who lived with Mapplethorpe for a time in the 1970s, and whose presence, judging by the video, is absolutely riveting. That may be just the fan in me talking, since I consider Patti Smith one of the artists of the 20th Century who kept the word “poet” from becoming something risible. But however you figure it, I’ll bet that the eventual documentary doesn’t feel the same as the original source material, because there will be someone else’s notion of the narrative in between.
Another one of the transcription projects concerned the movement to ban military recruitment from high schools and colleges. There we got to see an interview with Cindy Sheehan, talking about her son Casey. This was several months before Sheehan became famous, and the raw emotion and the severity of the injury to her soul was just nakedly displayed. This is one of those cases where the competing pro and anti-war narratives have done a substantial job of smothering the original truth of the matter. But the original source material destroys all subsequent storylines, starkly projecting the central image of a woman shattered at the loss of her child.
Real history doesn't play into narrative that well. It often misses the good tricks. I had certain reasons for reading newspapers from 1911, reasons that didn't include the occurrence of the Triangle Shirtwaist Factory fire. A novelist would have had some foreshadowing, but with newspapers, you just turn the crank on the microfilm reader and suddenly you are staring at one of the most famous tragedies of the early 20th Century. No good storyteller would just hit you in the face like that, but history is a story made up after the fact, while original events are facts on the wing.
Wednesday, June 11, 2008
The first science fiction and fantasy convention I ever went to was something called CreationCon. There is now a regular comic convention by that name, but the CreationCon I went to with Ben, Johnny, and the Albany gang had nothing to do with comics. God knows, it had everything else in it though.
Someone had the idea that there was a crying need for a convention bringing together science fiction, fantasy, new age ideas, and the occult. I mean, it sounds like it might work; it just turned out that fantasy writers like L. Sprague de Camp and Lin Carter considered “the occult” to be pseudo-scientific rubbish.
But it was fun. I met David Gerrold for the first time, just after When HARLIE Was One came out, and a small group of us had lunch with him. I’ve met David maybe five or six times now, and every time it’s as if we’ve never met before, which is kinda cool, actually.
I also started my Freak File at that convention. That’s my collection of fringe material. Years later, Larry Janifer and I spent an evening comparing notes on the subject. He called his the “Nut Shelf,” so you can see where this is going. I also once loaned my collection to Jim Turner of Duck’s Breath Mystery Theatre, after seeing his one man show “The Brain that Wouldn’t Go Away.”
I think my real prize from CreationCon was Norman Bloom. Bloom was handing out copies of his self-published ‘zines, really well produced things, photo-offset on news stock, with sturdy staples, the works. Bloom thought he was Jesus, or, more accurately, “The Second Coming of Christ.” As nearly as I could tell, he was serious, and harmless. His booklets were filled with proofs of the existence of God, all of which boiled down to proof by improbability. You see this a lot in Creationist circles: “The odds of life forming are similar to having a 747 appear after a tornado in a junkyard.” Bloom, god bless him, stripped the whole thing down to fundamentals. He’d open up the phone book, look at a phone number, and calculate the odds of that particular number appearing. For a seven digit number, assuming all the digits are random (which they aren’t of course, but I’m not going to stop a crazy man on a roll), you’re talking 10 million to 1. And there are hundreds of thousands of numbers! Good lord, the improbability of it!
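Bloom’s arithmetic, for what it’s worth, is easy to check. Here’s a minimal Python sketch of it; the uniform-digit assumption is his, not the phone company’s:

```python
# Bloom-style "proof by improbability," stripped to fundamentals.
# Assume every digit of a 7-digit phone number is uniformly random
# (it isn't, but we're not going to stop a crazy man on a roll).
possibilities = 10 ** 7            # 0000000 through 9999999
odds_against = possibilities - 1   # against any one particular number

print(f"{possibilities:,} possible numbers")   # 10,000,000
print(f"{odds_against:,} to 1 against")        # 9,999,999 to 1

# The fallacy: every line of the phone book is guaranteed to show
# *some* equally "improbable" number, so observing one proves nothing.
```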
If you didn’t like that one, the one about how unlikely it is to have the Moon be just the right size and distance to just barely, yet completely, eclipse the Sun, well, tornado in a junk yard, here we are.
Bloom was, I believe, an engineer, so he had just enough statistical knowledge to get him into trouble. Nevertheless, the problem that he fastened onto is a real one. It’s known in philosophy as the Plenitude Principle, the notion that if the universe is big enough (infinite sounds about right), then everything that is possible must occur. Or alternately, why do some things happen when other, seemingly just as likely, things don’t?
Regular probability theory finesses the problem nicely: the probability of any event that has happened is 1. Bayesian statistics allow a bit of skew from that: you can’t always be sure that something has happened, so the probabilities then become a measure of your own ignorance.
The Wikipedia article on the Plenitude Principle goes all the way back to Aristotle, though it misses Nietzsche’s take on it: eternal recurrence. Old Friedrich decided that if everything happened once, it would happen over and over again; infinity is big enough, after all. Hard to argue with that, though it’s pretty easy to ignore or dismiss.
A lot of people have taken the entire “many worlds” idea a step beyond, to the notion that everything that you can imagine happening happens somewhere. The real problem with that line of thinking is that it’s possible to imagine things that can’t actually happen, like flying horses and FTL spaceships. Then there is the extended problem of people who think that they are imagining something when what they are really doing is imagining that they are imagining something. That, it turns out, is pretty easy. Just ask Norman Bloom.
Tuesday, June 10, 2008
Wind and Smoke
Trying to loosen my load
I've got seven women on my mind,
Four that want to own me,
Two that want to stone me,
One says she's a friend of mine
Take it easy, take it easy
Don't let the sound of your own wheels
Drive you crazy
Lighten up while you still can
Don't even try to understand
Just find a place to make your stand
And take it easy
-"Take It Easy" by Glenn Frey and Jackson Browne
Several times in my life I have done things that were courageous to the point of foolhardiness, or possibly desperation. Usually, I realize this only years later, after much reflection.
Attending RPI was one such action, a decision of existential change, necessitated by an abiding need to get far away from the land of my birth and upbringing, which is to say the mid-South of the United States, Nashville, Tennessee in particular. The transition to RPI didn't seem like such a stretch at the time. After all, my father was born in Montana, raised in Illinois, stationed in Utah and Alaska during World War II, and then settled in Nashville to marry and raise a family. But all these things were in the context of major support systems, family, the Army, his job. What I had was RPI, an unknown quantity that wound up treating me pretty well. Besides, I had the confidence of the true knurd.
So, upon graduating with my bright shiny Master's degree in Engineering Science, I decided first, that I would move somewhere that I wanted to live, then look for a job, rather than having the job search make my decisions for me. Then, because I was truly sick of snow and winter, I decided that California was the place I wanna be, to paraphrase the immortal words of the Beverly Hillbillies theme song. That I went to Northern California rather than Southern California was because I knew a guy.
And there we go. Rather than having a major support organization, I knew a guy. One. Guy. Douglas and I had been friends at RPI, and in 1974 he was a graduate student at U.C. Berkeley. He kindly agreed to let me sleep on his floor while I was looking for a job etc. So Berkeley it was. I had no idea how brave I was being.
Soon after I arrived, complications ensued. Specifically, another friend of both of ours, Henry, decided to move back to Berkeley and also look for work. I'd known Henry when he was a 'Tute student, but he'd later transferred to Berkeley and he and Douglas had been roommates for a while. Douglas could hardly turn him down, but it made Douglas' one bedroom apartment pretty small. Then there was the fact that Douglas had decided to leave the UCB Computer Science program, so he was also looking for a job.
Yep, three twenty-something guys living in a one bedroom apartment, all looking for work in the middle of the 1974 recession. What could go wrong?
Anyway, we tried to give each other space, but it was pretty tense. The gloom of looking for a job at that time was pretty thick and the apartment was kinda depressing. So I ate out pretty much every meal. Fortunately, there are a lot of cheap places to eat in Berkeley, it being a student town. Almost every day I'd have at least one meal at Salerno, an Italian restaurant, where I could get a bowl of minestrone soup and fill up on bread. Or sometimes the soup was from the aptly named Soup Kitchen, which was on the corner of Dwight and Telegraph. Henry and I would eat at Kip's pretty frequently. And so forth.
For entertainment, well, again, student town. Concerts in the park, Sproul Plaza and the like. Low cover charge clubs. And so forth. Plus libraries, used book stores, comic book shops, all geared to low disposable incomes.
On some bulletin board or another, I saw a card for a J. P. Sartre discussion group. That's how I met Steve L. (Another Steve, Steve E., was to become my roommate for the year beginning the summer of '75. Steve E. and I met through the California Mythopoeic Society, probably another contact gleaned from a card on a bulletin board). Steve L. organized the group to help him get ideas for his senior thesis in Philosophy. I admired his ingenuity on that matter. Besides, somewhere along the line I took a look around and thought to myself, "Hey, I'm discussing Sartre in a Berkeley coffee house. How cool is that?"
Steve also got me my first job on the west coast, a part-time low paying gig for a company called Contractor's License Information Service, which, as the name implies, taught would-be contractors to pass the State exams. CLIS had a "no fail" policy, which meant that, once having paid the fee, the contractor wannabe could attend as long as it took to pass the Licensing exam. CLIS was very much a "teach to the test" operation, to the point of sending its employees to take the tests and having them copy as many of the questions as they could get away with. The CLIS courses then taught the answers by rote. This is, at best, marginally legal, so it was not that much of a surprise the day that the tax guys shut the place down. Cut corners in one area and you're likely to cut a few elsewhere.
In any case, I was hired to gin up a new course, for the 1st Class Radiotelephone License. The fact is that rote learning is not so much a help for the FCC exams, as they routinely change the answers just enough to mean that you need to know what you're answering. But it was fun to write questions about push-pull circuits and the like. The CLIS gig paid enough that I didn't need cash infusions from home very often, and it lasted until I got a job at Mare Island on the nuclear submarine refueling crew, and then, six weeks later and before my security clearance had come through (hence, before I'd done any real work) my air pollution gig at SAI. I'd moved out of Douglas' with my first paycheck from Mare Island, so I was now living alone in my very own one bedroom apartment, two blocks from UCB. It turns out that the dropout factor makes it easy to find such a place in January.
So, living alone, which, as you might guess, can get lonely. But the Special Interest Group is a powerful networking tool, so I began to slowly expand my circle of acquaintances. Later, other friends followed me to California, taking advantage of the beachhead I'd established.
In the summer of '75, as I mentioned earlier, I moved in with Steve E. who had been accepted to a Cornell graduate program, but for the following year, so he had a year to burn. He spent his time working for CALPIRG, one of the Ralph Nader spin-off organizations. In some other essay, I'll describe the weekly Dungeons and Dragons game we were part of (yes, I really am that geeky). But to close out this essay, I'm going to tell the tale of a party.
I'd kept in touch with Steve L. and he had a band. It was something of a pick-up group, with a decidedly fluid personnel roster. Its name was Cargo Cult. That day, they were playing at a barbecue in somebody's back yard, probably an unpaid gig for a friend, just for the practice.
Steve played bass. They also had a drummer and the usual guitarist, but they also had a pedal steel player and for a few songs were joined by a girl who did an absolutely killer cover of Linda Ronstadt's "When Will I Be Loved?" As you can probably tell from the descriptions, there was a decidedly country rock flavor to the band that afternoon, and it took me a while to realize that the country licks were mostly coming from the guitarist, whereas the pedal steel man was playing jazz.
It was not that big an insight actually, because it came in the middle of an extended instrumental break where the jazz took over. A fair amount of the ceremonial herb had been passed around, the smoke joining with the barbecue scents and afternoon haze. Behind the band was the faintest glimmer of blue from SF Bay, with the fog gathering just beyond the Golden Gate, ready to overwhelm the sky as it usually does on summer evenings in the Bay Area. The wind had picked up and the day had turned cool despite the warm sun.
It was one of those Moments. The thought crossed my mind that Cargo Cult was Really Good, and that they knew it, but they probably also knew that they'd never make it as professionals. This was as good as it would get, and they were fine with that.
As for myself, well, you take your mystical insights where and when they happen. On that day, my thoughts were that we were creatures of wind and smoke, as ephemeral as the fog, as diffusely powerful as the sunlight. We coalesce and disperse; we merge with our surroundings. We sometimes accomplish great things. At other times we merely exist, as if there is anything "mere" about it. I was happy with all of it, a happiness that had taken just under a year in California to achieve. Right then, at that particular point in time, I was exactly where I was supposed to be.
Monday, June 9, 2008
I remember a documentary on I. F. Stone, in which he disclosed that the real secret of his journalism was listening to the exact words of politicians and government officials in order to spot the slight verbal tics that indicated the legalistic lie, the carefully worded truth meant to convey the wrong impression. I have a friend who was positively incensed when he learned of Clinton’s mislead, saying “I did not have an 8 year long affair with Gennifer Flowers,” when actually it was a 12 year affair. I’ve lost touch with my friend, so I don’t know how he felt about Cheney and Tenet’s description of the link between Saddam and bin Laden: “We have solid reporting of senior level contacts between Iraq and al Qaeda going back a decade,” which translated to “there have been no real contacts for the past 10 years.”
But I am reminded of a quote I recall from a high official in the last days of Polish communism, “The purpose of propaganda is not to get people to believe lies. The purpose of propaganda is to kill the idea of truth.” Twisting truth is more dangerous than merely telling lies; when the truth twists, the very ground beneath your feet becomes treacherous.
I happen to be a close-to-incompetent liar. I’m just not very good at it. So the Heinlein prescription held some attraction when I was younger and more naïve. But twisted truth still has threads of truth in it, and is easier to pull apart than a well-constructed fabrication. So let me start with this advice: if you’re going to lie, then tell a lie. Be a mensch. At least admit to yourself that you’re making it up. That, at least, saves you from the conceit that you’re better than those you’re lying to. The lie-by-telling-the-truth game lets you tell yourself that it’s your audience that’s too dim-witted to figure out what you’re really saying.
The next important point is that narrative is important. The best lies tell a good story, one with all the proper narrative tricks, like foreshadowing and thematic resonance. All well and good.
But the most important thing about a good lie is to tell your audience what they want to hear. And what they most want to hear is that they are important, they are worthwhile, and they are better than someone else.
That's my advice on how to write popular fiction, too. And I have trouble with the "popular" part, another indication of just how poor a liar I am.
Saturday, June 7, 2008
The small delay was often used to produce an "echo effect" on recordings and in the studio. For the echo effect, the tape output was mixed with the line in and patched back into the tape input. The tape speed controlled the echo delay, and the gain between output and input controlled the echo strength. A gain greater than 1 produced the "infinite echo" that rapidly became a sound pulsation with its frequency centered at the maximum frequency response of the system.
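That patch is easy to sketch in code. Here's a minimal simulation of the delay-and-feedback loop; the delay length and gain values are illustrative, not from any particular tape deck:

```python
# Simulate the tape-echo patch: the delayed output is mixed back into
# the input with some gain, i.e. y[n] = x[n] + gain * y[n - delay].
def tape_echo(signal, delay, gain, length):
    out = []
    for n in range(length):
        dry = signal[n] if n < len(signal) else 0.0
        fed_back = out[n - delay] if n >= delay else 0.0
        out.append(dry + gain * fed_back)
    return out

# A single impulse through the loop: each echo is `gain` times quieter.
echoes = tape_echo([1.0], delay=4, gain=0.5, length=16)
print(echoes[::4])  # [1.0, 0.5, 0.25, 0.125]

# With gain > 1 the same loop grows without bound; on real tape that
# runaway saturates the electronics, giving the "infinite echo" pulsation.
```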
One practical joke that was often played at radio stations was to hook up a tape deck to generate a delay, then feed the announcer's voice back to him with a fraction of a second delay. I was once trying to get an echo effect on my voice and I found that I'd practical joked myself; I had to remove my headphones in order to continue. The delay makes it almost impossible to speak. It's hard to explain why, but the experience is compelling.
In a course, Voice and Image Processing, that I took at RPI there was a similar demonstration with video. A ball was placed behind a small barrier, and a video camera showed the ball on a TV screen. Normally, you could just watch the monitor and reach behind the wall to pick up the ball. But with a half-second time delay, such a seemingly ordinary task became almost impossible. You soon found yourself reaching for the ball, overshooting, then overcorrecting, then overshooting, etc.
Such a thing is called a 'limit cycle' in systems control theory, but it's pretty eerie to be a part of a limit cycle and unable to break out of it. Eventually, you just stop moving entirely, then veeeeerrrrrrrryyyyyy slowly move your hand to get the ball. It could literally take 30 seconds or more to do that simple task.
There's a bunch of mathematics in systems theory that deals with time delay and "controllability." The upshot is that if you add enough time delay into a control system, it becomes uncontrollable. Your ability to affect events is slower than those events. Imagine trying to pick up the ball behind the wall if it is moving erratically.
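The ball experiment is easy to recreate numerically. Here's a toy sketch, assuming a simple proportional controller acting on stale measurements; the gain, delay, and step counts are made up for illustration:

```python
# Proportional control toward a target, but acting on observations
# that are `delay` steps old -- the ball-behind-the-wall setup.
def track(target, gain, delay, steps):
    pos = 0.0
    history = [0.0] * delay   # pipeline of stale observations
    trace = []
    for _ in range(steps):
        observed = history.pop(0)        # what we "see" is delay steps old
        pos += gain * (target - observed)
        history.append(pos)
        trace.append(pos)
    return trace

undelayed = track(target=1.0, gain=0.5, delay=1, steps=30)
delayed = track(target=1.0, gain=0.5, delay=6, steps=30)

print(round(undelayed[-1], 3))  # 1.0 -- settles smoothly on the target
# The delayed version overshoots, overcorrects, and oscillates ever
# more wildly: the limit cycle from the ball experiment.
print(round(max(delayed), 2), round(min(delayed), 2))
```

The only difference between the two runs is the delay; the same correction rule that converges with fresh information oscillates with stale information.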
One of my favorite jokes is about the economics professor walking through the Quad with his students. One of his students says, 'Look, there's a ten dollar bill on the ground.' The professor replies, 'Can't be. If it were, someone would have picked it up already.'
For a long time, economics was dominated by what are called "equilibrium calculations," models of an economy under steady state conditions, no shortages, prices in equilibrium, all the usual assumptions. Those are the simplest conditions to model and the easiest to calculate, so they were the first results. Evolutionary biology tended toward the same simplifications, for the same reasons. The advent of the computer, and the growing access to massive amounts of computing power, changed the landscape, but it took a while for theoretical models to catch up to the improved tools. In fact, the catch-up is still going on.
I had lunch with a colleague a while ago, and he asked my opinion about global warming/climate change/greenhouse gases. I told him that it was pretty obvious that the signal was out of the noise, the whole process was clearly underway, and was he surprised at this answer? He noted my well-known contrarian streak. I observed that James Hansen hadn't made a wrong prediction since 1988, and I wasn't going to challenge that sort of success.
In truth, I was a little late to the global warming party, partly because of that contrarian streak, but also because I was focusing on the science and not the policy. I was also perhaps yielding too much to my own libertarian leanings. So let's review why I should have been convinced sooner than I was, at least on the policy issues.
From the standpoint of political philosophy, one fact should be paramount: if we do not have a right to the air we breathe, then human rights, including property rights, are meaningless. And that should include the right to have that air remain unaltered. You shouldn't have to prove that harm is being done to you, any more than you should have to prove injury in order to object to a stream of trespassers walking across your lawn.
Now any given individual has no real impact on the contents of the entire atmosphere, although it's certainly possible for an individual to affect your current breathable air, and you generally have recourse. If someone smokes in your house and you don't like it, you can throw them out. If the neighbor's barbecue is noxious, you can usually complain to some agency, and I, for one, do not consider that to be an infringement on your neighbor's rights, though your neighbor may disagree.
But group behavior can, and does, affect urban, regional, and global resources. The industrial world's propensity for fossil fuels has had an undeniable effect on the concentration of some important trace gases in the atmosphere. Regulating group behavior is not the same as regulating individual behavior. Regulating corporations or national economies is not the same as regulating individuals, and giving free license to groups and organizations reduces individual freedom.
In the case of global climate change, regulating group behavior is essential. Actually, of course, group behavior is regulated. It just happens that it is regulated by those who rule, manage, control, and lead those organizations, the corporate boards, the CEOs, the congresses, presidents, agency heads, judges, and lawyers whose fingers are entwined with the strings of authority.
But authority and control are meaningless if the system is uncontrollable. The global climate system takes decades, if not centuries to equilibrate to any given greenhouse gas level. Glaciers take even longer to melt or rebuild. And the human political process likewise has major delays built into it.
There is a thin straw to clutch at, called feedforward in control theory. Using feedforward, you attempt to compensate for feedback delays by anticipating the system response. But feedforward control is seriously limited by your understanding of the underlying system. Without that understanding, feedforward is useless.
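Here's a minimal sketch of that idea, with made-up numbers: with a model of the system you can compute the right input directly instead of waiting on delayed feedback, and the error in your control is exactly the error in your model:

```python
# Feedforward control in one step: use a model of the system's response
# to compute the input directly, rather than iterating on stale feedback.
def feedforward_step(target, model_gain):
    return target / model_gain

true_gain = 2.0  # how the real system actually responds

good = feedforward_step(target=10.0, model_gain=2.0)
print(good * true_gain)  # 10.0 -- perfect model, perfect result

bad = feedforward_step(target=10.0, model_gain=1.6)
print(bad * true_gain)   # 12.5 -- a 25% model error lands 25% off target
```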
In regulatory policy, science is the feedforward control signal. Science, however, is currently under political attack from numerous quarters. And big money is being spent to target climate research in one part of that attack.
We're going to lose south Florida, and, my colleague suggests, most of Louisiana and Mississippi. California will acquire a new inland sea. Much of Bangladesh will vanish, as will plenty of islands in the Pacific and Indian Oceans. The fact that these things are going to happen long after you and I are dead does not make the future more palatable. It makes it more inevitable.