Thursday, April 26, 2007

Information Technology, Systems, and Simulation

Some years ago, at a time sufficiently distant that I was still a member of the IEEE (Institute of Electrical and Electronics Engineers), I attended one of that organization’s simulation conferences. There I heard an anecdote from two engineers who had prepared a queuing model for a particular warehouse operation. The model had the unfortunate tendency to fail with some regularity: the simulation would progress to a particular point and then cease to function, responding with a series of error messages.

The two engineers decided to visit the site of the warehouse that they were simulating, in order to observe a day at the loading dock, which was the input to the queuing model. They observed the following sequence:

  • A truck arrived at the loading dock.

  • The driver and dock attendant began unloading the truck, whose cargo was a number of cardboard boxes, all of identical size.

  • As each box was unloaded, it was placed on a moving conveyor belt, which transported the boxes into an automated stacking system within the warehouse.

  • When the final cardboard box was placed on the conveyor belt, the dock attendant also mounted the conveyor belt, and when it reached the entrance to the warehouse, he proceeded to kick and pummel the box until it entered the warehouse.

  • The engineers were nonplussed. “Why did you just do that?” they asked the dock attendant. The dock attendant then explained that there had been a blueprint error in the construction of the warehouse and the storage area that fed the automated stacking system was about nine inches too short, so it was necessary to force the last box into the warehouse. The boxes in the storage area had just enough give for that task to be accomplished, provided some substantial force was applied to the last box.

    In a sense, then, the simulation had been a success: the automated warehouse system would indeed fail if used as constructed. Human intervention, however, had compensated for what was basically a defective system design.
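    The failure mode the engineers kept hitting is easy to reproduce in a few lines. The sketch below is purely illustrative (the function name, box dimensions, and "give" parameter are all my inventions, not anything from the actual model): it simulates a holding area built nine inches shorter than the blueprint specified, so the last box of every truckload refuses to fit unless force compresses the boxes already stored.

```python
def unload_truck(n_boxes=40, box_length=24.0, shortfall=9.0,
                 give_per_box=0.25, apply_force=False):
    """Toy model of the loading-dock anecdote.

    The holding area was built `shortfall` inches shorter than the
    blueprint called for, so the last box of every truckload fails to
    fit -- unless force is applied, squeezing each box already stored
    by up to `give_per_box` inches.
    """
    capacity = n_boxes * box_length - shortfall  # the blueprint error
    used = 0.0
    for i in range(1, n_boxes + 1):
        # with force applied, boxes already in the area compress a little
        slack = (i - 1) * give_per_box if apply_force else 0.0
        if used + box_length > capacity + slack:
            raise RuntimeError(f"box {i} of {n_boxes} does not fit")
        used += box_length
    return used
```

    Run as-is, the model raises an error on the final box every time, which is exactly the repeatable failure the engineers saw; with `apply_force=True` (the dock attendant's kick), the full load fits.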

    In recent years, a substantial portion of the productivity gains in the U.S. economy has been ascribed to Information Technology (IT) as the causal agent. There is probably substantial truth to this conjecture, especially if IT is expanded to include telecommunications of all sorts: email, cellular telephone technology, broadband internet connections, etc. However, there are a number of implications of heavy IT involvement in business processes that should be borne in mind. I will speak primarily to two such implications. The first is that a high degree of IT involvement in a business tends to make that business “system centric” as opposed to “process centric.” There are several further implications of that observation, and I’ll get to those in a bit.

    The second thing to bear in mind is that IT systems are generally simulations of something, rather than res ipsa “the thing itself.” This is easy to see when we speak of a computer simulation model, but this hardly scratches the surface. A telephone, for example, does not deliver the voice of the person at the other end; rather, electrical information is sent from one place to another that then allows a simulated human voice to appear at the receiving end. Similarly, financial and accounting systems are simulations, and economists speak of “the financial economy” as different from “the real economy.” When decisions made for financial accounting reasons are derided as “coming from the bean counters,” that derision carries with it the idea that the financial simulation is missing some important part of the real situation.

    Nevertheless, accountants and financial officers are charged with protecting the financial systems of an organization, and with good reason; if the financial system fails, it likely mirrors a failure in the “real” organization, and even if it does not, financial system failure makes some part of the real organization either unobservable or uncontrollable, or both. As is often the case, financial systems are both simulation systems and control systems.

    The same may be said for IT systems. IT systems are generally either communications or control systems (or both), and failure of an IT system can have adverse or even catastrophic consequences, whether it be losing an inventory database, sensitive customer information, or a set of emails. Without adequate information, management decisions become mere guesses, and without adequate communications, those decisions cannot be implemented.

    Moreover, as the tale of the automated warehouse suggests, IT systems are not as robust as real processes, because human beings are inherently more adaptable than algorithms and hardware. If the box doesn’t fit, the dock attendant may try kicking it; if that doesn’t work, perhaps a box cutter will do. If that fails, then perhaps he stores the box temporarily, until the holding area clears. As long as a human being has a goal in mind, he or she will often find a way around obstacles. There is even a common name for this: “work-arounds.”

    So it is understandable, perhaps even inevitable, that IT-heavy organizations will tend to become “system-centric,” but this carries numerous drawbacks. The obvious drawback is the tendency to concentrate on system development at the expense of business process development. This has several follow-on failure modes.

    First, concentrating on systems rather than process tends to starve the processes themselves of needed resources. This is particularly problematic in a cost-cutting environment, where IT systems are, in themselves, supposed to reduce the need for personnel. Too often, however, the personnel savings are assumed rather than demonstrated, with an outcome that is deleterious to the underlying process. Moreover, IT systems usually carry their own overhead burdens, so it is not uncommon to see the effect of merely trading one set of employees for another set, the latter often being more expensive than the former. This, incidentally, is a problem that is not at all confined to IT. It is often true that attempts to implement cost-cutting through personnel reduction result in an increase in managerial overhead (“off-shoring” projects are notorious in this regard) to no great benefit to the organization as a whole. Hospital management is another area where this phenomenon has been observed, with reductions in nurse-to-patient ratios often occurring simultaneously with substantial increases in administrative (and sometimes legal) overhead.

    Another implication of “system-centricity” is that it allows, even encourages, “gaming the system.” A substantial portion of the enhanced productivity created by IT systems in large organizations comes from the deletion of several layers of middle management. The managerial review process is then replaced with a variety of system-derived “metrics.” However, employees then often find it easier to “work the metric” than to perform the work that the metric is supposed to reflect. This is the equivalent of the use of standardized testing in the public school system. Given the need for a certain percentage of students to achieve particular scores on the tests, teachers find themselves under pressure to “teach to the test,” emphasizing material that is known to be on the tests and de-emphasizing material that either does not appear on the tests or is inherently difficult to test. In some cases, school administrators and teachers have been found restricting the student population that takes the tests, sending those who will do poorly home on test days. In other cases, there has been instructor-assisted cheating.

    There are no easy answers or quick fixes to these problems. Certainly the older “thick middle management” model contained its own pitfalls and cannot be taken as a model of information clarity and inevitably sound judgment. If higher-level managers wish to be kept ignorant of problems within the organization, they will accomplish this task, whether by killing the messengers bearing bad news, promoting those who tell flattering lies, or receiving “dashboards” that do not adequately convey a picture of the organization. However, those wishing to be dutiful officers and representatives of the organization as a whole should listen to the whispered memento mori (“Remember Thou art Mortal”) and take heed.

    Wednesday, April 25, 2007

    Max Headroom

    Karl Hess once told me that he thought Ayn Rand stole all her best ideas from Max Stirner. I disagreed. I didn't think she was that well-read.

    Max Stirner was the pseudonym of Johann Kaspar Schmidt (1806-1856), a German philosopher who is considered to be one of the earliest proponents of egoism, nihilism, existentialism, and anarchism. Only Descartes (cogito ergo sum, and all that) and Goethe (who planned a novel entitled The Egoist, though I don’t know nearly enough about Goethe) have any real claim to priority.

    The "Max Stirner" name itself was his nickname from childhood, a pun on "Stirn," German for "forehead." So his nom de plume translates to something like "high brow," or perhaps "Max Headroom." So call him Max Headroom, if you like. There is a pretty good article on Stirner in the Wikipedia, and I recommend it.

    So, a bit of background: Stirner was writing in 1844, after the ice of religion and aristocracy had cracked, but before the floes were freely moving in the water, at least in Germany. Then, as now, there were a myriad of philosophies fighting for control of men’s minds; Church and State were no longer the only games in town, so a lot was at stake.

    Some of the players went under the banner of Liberalism (which, I’m sure you know, was considerably different from the doctrines now taking that name), some Socialism, and some Communism (though again, much different from the current notions grouped in those categories). One of the strands of liberalism had recently been espoused by a fellow by the name of Feuerbach, who seems to have been one of the first to achieve what we now call Secular Humanism, the substitution of Man in the place of God in all the moral equations. Stirner is particularly harsh with Feuerbach, and that seems to have been responsible for most of what splash Stirner’s book made at the time. However, Marx devotes a substantial portion of one book to attacking Stirner (in typical Marxist fashion: a personal attack rather than any attempt to meet the ideas themselves), so there was obviously something else going on.

    Stirner's Magnum Opus was Der Einzige und sein Eigentum, (DEusE). Its English translation was titled The Ego and Its Own. My German sucks as badly as a single semester of college German can suck, but I can recognize a clunky translation when it bangs my head, and "Ego" qualifies. The original English translator stipulated that his title was a bad translation, but felt that it was a good enough title for something largely untranslatable.

    "Einzig" means "only, sole, single" and "Eigentum" means "property," but there is also an internal bit of wordplay, because of the "Ein/Eigen" thing. Friend Ben Sano suggests that a better translation would be “Myself and Mine,” whereas I thought “The Only and the Owned” was pretty clever. The Wikipedia article says "The Individual and his Property" and "The Sole One and his Property" have also been used, but I don't like those for other reasons. In any case, The Ego and Its Own is the title you’ll find in the second-hand bookstore.

    Still, the reference to Ego (and Egoism) does allow the bringing of Freud into the matter later, which I may do, and there are various reasons for thinking that Freud was familiar with Stirner’s work. Stirner’s book was translated into English in 1907 and found some audience, though never a large one. I think there was a revival in Germany a little prior to that; the surrealist painter Max Ernst is known to have been influenced by Stirner.

    The "Ego" part also hints at another problem that comes with reading Stirner: it's not just the German-to-English translation that causes problems. Stirner wrote DEusE heading on two centuries back, and the meanings of the words have slipped some. "Ego" doesn't mean the same now as it did before Freud, or when Steven Byington translated DEusE into English in 1907 (when Freud was also just penetrating English consciousness). All in all, you get a problem similar to the one Walter Kaufmann dealt with in translating Nietzsche, leaving "ubermensch" untranslated, rather than using "superman," because of the pop culture that had accumulated around the latter term.

    The translation problems hardly stop there. The year of DEusE's publication, 1844, is inconceivably distant now. Stirner uses European Jewry as metaphors and exemplars at times, but there is no way for a post-Nazi world to understand how it read at the time. He wrote for his contemporaries, so jokes, allusions, all the rest, just slide right by all but the most devoted scholar of the period (which I am not), and such scholarship carries its own karma. And so forth. So what remains are the grand sweep of ideas, which are themselves open to misinterpretation. In fact, owing to the nature of Stirner's message, misinterpretation is almost demanded.

    There is a web site devoted to Stirner:

    Reading the essays there is enlightening. Are they reading the same book I have? Probably not, when you get to it. It's a phenomenon that in some ways validates Stirner: each reading is unique, each an Einzig’s Eigentum, as it were.

    So, enough introduction: what is it that Stirner said that was so damn fascinating? Well, he was specifically attacking certain sorts of abstract ideas, which he called “spooks,” and their elevation to a position above men, or specifically, above Stirner himself, since he stipulated that he was speaking for himself alone, and damn proud of it, thank you very much.

    What were these ideas (he also referred to them as “fixed ideas” in the translation, though in current times we would probably call them “fixations”)? Things like God, and the State, and Mankind, and the Common Good, the Law, the Public Interest, the Proletariat; well, you get the idea. Liberté, Égalité, Fraternité. Grand ideas that have been at the center of quite a lot of bloodshed throughout time, and, accordingly, ideas which are quite dangerous to attack. Ideas that are supposed to be larger than we are. “Bats in your belfry,” Stirner said of them.

    What was the nature of Stirner’s attack (other than sheer peevishness, which is often quite enough, isn’t it)? Let’s use a single one of the spooks to serve for all, but remember that the argument is fairly general. Let’s consider how Stirner would address a theist, someone who is religious in the old style, who demands that all men must live to serve God (which, in the practice of Stirner’s time meant that all men must live to serve the Church).

    Anyway, Stirner would say something like this:

    First, you cannot actually serve God; you can only serve your own idea of God, or perhaps someone else’s idea of God. But since God does not speak to you directly (unless you are Jeanne d’Arc, presumably), you are stuck with following a disembodied spook.

    And yet, let’s consider this idea of God. Who does God serve? Well, no one. God exists in himself and for himself, and asking who God serves may even be blasphemous. Well, why should I not demand the same? Why should I not serve myself? In fact (and here the argument gets specific to God, and I’m putting words in Stirner’s mouth, but it’s a pretty obvious extension of Max’s ideas), if God made Man in his own image, why did he leave out such an important bit, the right of existing for his own sake?

    Then, Stirner inquires slyly, aren’t theists really serving themselves when they claim to be serving God? Your worship holds out the promise of life after death, and eternal life at that. What is your eternal life to God? There are plenty more where you came from (wherever that may be). On the contrary, your eternal life is of importance to you and primarily you. You are actually an egoist pretending otherwise. Your selflessness, your disdain for selfishness, is a sham.

    Of course, that final argument is self-canceling to a substantial degree. If you are really acting in your own self-interest in being God-fearing (and if the Old Testament God really exists, you’d be daft not to fear Him), then how is owning up to it going to change you or your behavior?

    Well, it’s good to be rid of these fixed ideas, anyway, Stirner says, and there is some truth to it. If you’re behaving in your own interest in a certain way for a long time, if you have developed a fixed idea, in other words, you may get stuck and continue to behave that way long after it has ceased to serve your own interests. Then too, honesty can be refreshing. To believe in a God that is both a bully and all loving suggests a centralized hypocrisy.

    There is a similar argument made when Stirner gets to the Eigentum, property in other words. He notes that liberalism (old school) makes constant demands for freedom and liberty. But what are these good for? Nothing really; you’re just hoping that freedom and liberty give you an unimpeded path to what you really want, whatever that may be. Food, a clean bed, a palace or a nice house, whichever suits your taste. A respected position, perhaps, or your own private railroad car (Stirner noted the insatiability of desire with a demand for the possession of flight; having just been on a five-hour flight from Detroit to SF, I can say that the process is overrated). Sex, drugs, rock and roll. A happy life and a loving spouse.

    In other words, Stirner says, don’t confuse means with ends. Realize what it is that you really want, rather than what you think is the means to get what you want. Again, pretty decent advice, within its limitations.

    But there are those limitations. Indeed, when you follow Stirner as far as he goes, you find yourself bereft of most of the things that people want from a philosophy. Design an ideal society? Why would anyone want to do that (or think that it could be done)? Stirner himself spoke vaguely of a “Union of Egoists” but never explained how it would be different from what we, in fact, have. Abolish the State? What’s in it for me?

    There are no absolute prescriptions or proscriptions to be found in Stirner. If Mother Teresa were to tell him, “Very well and good, but I choose to serve my own idea of God,” then what?

    The philosophy of egoism has no real answer, other than maybe, “Good for you, as long as you’re clear on whose responsibility it is.” And figuring out what one wants, and then getting it: those are the hard parts. What happens when you want contradictory or unrealistic things? Resolve the contradictions? Isn’t “contradiction” a fixed idea? And how often have you gotten what you thought you wanted, only to discover that maybe that wasn’t it after all?

    So we come to my suspicion that Freud read Stirner and tried to answer some of those questions, and his answers wound up being so unpalatable and disturbing that we’ve spent a century trying to sweep Freud under the carpet one way or another (and by “we” I mean both Freud’s supporters and his detractors).

    However, I don’t think that Stirner--or Freud--is useless or even wrong. Here’s an example of why. A while back a friend of mine was trying to sort out a set of romantic entanglements that had developed around him. He was trying mightily to “do right” by everyone involved. After listening to him for a while, I said, “You know, you are allowed to consider your own interests in this.” He blinked, smiled, and said, “Thanks, I need to be reminded of that from time to time.”

    Tuesday, April 24, 2007


    The plural of anecdote is conjecture. --B. Sano

    Whenever sex enters the narrative, the narrative becomes all about sex. That’s true even when things seem to go in another direction, as in slasher films where the teens have sex, and are then gorily murdered. The deaths are all about how they shouldn’t have had sex.

    Back in the day, a friend of mine told me about a discussion he’d heard. I’m unsure about the context, but it was some sort of group discussion, and it was about communes. There was one woman in attendance who was living in one, and the first thing she said was, “Don’t ask about sex. It’s all very confused and I don’t want to talk about it.”

    A lot of people lost interest after that, apparently. The hows and whys of dividing up household chores, composting, and geodesic domes just didn’t have the, uh, sex appeal.

    Abstinence education was always a doomed idea. If you don’t want kids to have sex, you don’t spend a lot of time talking about them not having sex. Talking about not having sex is still talking about sex; it betrays the fundamental obsession. Try sublimation. Teach them chess, only you’d better redesign the pieces and make sure they never see “The Thomas Crown Affair.”

    I’ve occasionally wondered how much of the damage done by pedophilia is from the acts themselves, and how much from all the fuss about it. Realize, I’m not saying that raping a ten-year-old is innocuous, but I noticed on the “Inner Child” episode of Raines that everyone was behaving as if the rape of the child made the murder much worse. I confess to thinking that “dead child” trumps “rape,” but I know there are those who seem to disagree. Certainly the dramatic “my life is over/ruined,” etc., often appears in the public discourse.

    After I went away to college there was a scandal involving a Nashville disk jockey and a number of teenage girls, one of whom was a neighbor of ours. There were supposedly things like nude photographs, etc., and the DJ in question was fired and fled town. I’m not sure how illegal it was, nor even the age of consent in Tennessee at the time. For that matter, I have no information as to whether or not actual intercourse ever took place, though one wouldn’t be surprised, would one?

    Our neighbor committed suicide, ostensibly because no one asked her to the Senior Prom.

    It’s a fair cop, but society’s to blame. – Monty Python’s Flying Circus

    A friend of mine has a son who once did some adolescent “horsing around” with a buddy of his on a camping trip. Some would classify it as an adolescent homosexual experience. Still, my friend’s son, as an adult, is, by pretty much any measure, both stable and heterosexual, at least judging by his lovers. He never had any particular problem with the episode.

    His friend, however, was somewhat bent out of shape by it, or so I heard. He was very upset, broke off the friendship etc. I don’t know the final disposition. I do wonder what the result would have been if conventional wisdom had opined, “No big deal.”

    Which brings me around to Michael Jackson, a tragic case, no doubt. An abused child, worked harder than the sweatshops most histories describe (but all too common in show business), robbed of a childhood, so most theories of arrested development apply. Yet a bona fide pop genius.

    That's how much we love Michael. We love Michael so much we let the first kid slide... –Chris Rock

    No moral here. I know how much some people hate Jackson. Some people seem to love him just as much. But I don’t want to go out on such a bummer, so I’m going to change the subject, while still talking about Michael Jackson.

    Jackson was married for a while to Lisa Marie Presley. Elvis’ daughter, so what does that make her? The Kingette? No, that’s disrespectful, and I do respect Lisa Marie; far too much to think she deserves tabloid treatment, though obviously I’m not above the occasional joke at the expense of her fame.

    But here’s the thing: she has her father’s features, eyes, lips, cheekbones, you can see Elvis in his daughter. But it looks good; she’s pretty, and exotic too. I like exotic.

    Besides, Elvis was not “ruggedly handsome.” He was good looking, but he looked almost pretty, a little feminine perhaps. Full lips, soft eyes. There’s nothing wrong with that.

    Then there’s Michael Jackson, who seems to be suffering from surgical addiction, at the very least. The story goes that he’s been trying to make himself look like Diana Ross.

    Nothing wrong with Diana Ross, either. But she’s not a girly-girl. I’d even let it slide if someone were to suggest she looked a little bit mannish.

    So you’re probably ’way ahead of me here, to the conclusion of a shaggy dog essay: the Michael Jackson/Lisa Marie Presley marriage joined a man who has been trying to look like a woman who looks a bit like a man to a woman who looks a lot like a man who looks a bit like a woman.

    Not that there’s anything wrong with that.

    Friday, April 20, 2007

    Relativism vs Absolutism

    [originally written June 7, 2006]

    The original version of this brief follow-up to "What Moral Relativism Means to Me" came about because I ran across a set of blog postings that were valiantly trying to keep a discussion going, with multiple posts cross-linked across multiple blogs. The discussion was about moral relativism. Naturally the whole thing became very confused, but that’s not entirely due to the blog medium. Part of it is that academic philosophy itself has confused the issue with a plethora of nomenclature about all possible permutations of moral philosophy. Yet there seems to be very little comment concerning what seems to me to be the central issue (noted in my original essay): that people have a certain agenda when they decry “moral relativism,” and that agenda certainly looks like it’s designed to excuse a refusal to consider differing points of view.

    While it’s usually dangerous to use scientific analogies in moral and ethical inquiries, I’m going to take the risk and use Special Relativity as an example. What’s interesting about Relativity in the scientific sense is that it is not at all in opposition to objectivity; Special Relativity is objectively testable and has passed every such test. Similarly, there is nothing in the idea of Moral Relativism that requires it to conflict with objective reality. It is true that someone’s subjective viewpoint of what is good or bad (for them) is central to moral relativism (or at least my version of it), but it is often pretty easy to objectively determine whether or not a particular event is good or bad for someone who isn’t you. Give a hungry man a meal: probably good. Hit him over the head with a hammer: probably bad. The fact that there are exceptions to both of these general rules only underscores my point.

    What Special Relativity does is overthrow the ideas of Absolute Space and Absolute Time, or more technically, the idea that there is an Absolute Reference Frame. Other sorts of Absolutes are not absent from science, though there are often some interesting caveats. Absolute Zero, for example, is a perfectly respectable scientific concept; it just happens to be unattainable as a practical matter. Nothing wrong with that.

    So I’m holding that “Moral Relativism” does not stand in opposition to whatever one would mean by “Objective Morality” (though I think that the only meaning the latter can have is that one would judge an action by its objective consequences, and not, for example, by what one intended those consequences to be). Rather, “Relativism” stands opposed to “Absolutism.”

    That evens the odds, I think. I mean, how many followers would Ayn Rand have gotten if she’d called her philosophy “Absolutism”?

    Wednesday, April 18, 2007

    The Crime of Thomas Jefferson

    One way of looking at high energy particle physics is that it consists of hitting some stuff with the biggest hammer you can get, then making up stories about the resulting fragments. In human affairs, we do something similar, but the penetrating radiation consists of ideas and facts, and people’s reaction to them. In the United States, one of the big hammers is the Founding Fathers, their ideas and the facts about their lives. Modern Americans project all sorts of things on those guys, which tends to reveal more about modern Americans than the Founding Fathers.

    The most recent furor about Thomas Jefferson is a pretty good experimental smashup in that regard, though the first result is one I mentioned a while ago: when sex enters a narrative, the narrative becomes all about the sex. In Jefferson’s case, the burning issue of the day was whether or not he had a child by his slave Sally Hemings. DNA testing of Hemings’ descendants put the debate into the realm of the truly bizarre. The testing was “inconclusive” in that it could only say that someone in Jefferson’s immediate family was the progenitor, so it could have been either Thomas or his brother Randolph. Naturally, a lot of Jefferson scholars immediately set out to prove that it was Randolph, because otherwise, Thomas Jefferson would have had to have had sex with his slave Sally, who, it should be noted, was his deceased wife’s half-sister. That would have made Jefferson no better than…his father-in-law.

    Sally was also Jefferson’s property, and it’s hard for us to really grasp what that means. A southern slave owner could certainly legally have sex with his slaves. He could also beat them, mutilate them, force them to mate with anyone he chose, or kill them for any or no reason, all without any repercussion other than financial. Do you own a dog or a cat? Your pets have more legal rights now than did a slave in the Old South.

    On the other hand, for Jefferson’s brother to have had sex with Sally Hemings without Thomas’ permission would have been a serious breach of manners and ethics. If it were done with Jefferson’s permission, then, well, which is worse, sleeping with your wife’s half-sister or pimping her out to your brother?

    As I say, sex in the narrative tends to muddy the waters and muddle the thinking. In any case, Annette Gordon-Reed, in Thomas Jefferson and Sally Hemings: An American Controversy, pretty clearly demonstrates that Thomas Jefferson was the only possible father of Hemings' seven children. More importantly, Gordon-Reed addresses the issue of how it is that the fairly clear and compelling evidence of the relationship was ignored or explained away by scholars who were basically devaluing the evidence provided by historical sources who were black. Not to put too fine a point on it, implicit racist assumptions led to false conclusions.

    Jefferson, of course, was a paragon in the founding of America. His was the language of the Declaration of Independence. As President, he arranged the Louisiana Purchase. He was the prime mover behind the founding of the University of Virginia, the first secular university in America. His is the spirit behind the First Amendment, and his aphorisms in favor of freedom of speech, press, and religion are part of the discourse to this day. He was also a major scholar and scientist. As John Kennedy once quipped at a White House dinner for Nobel Prize winners, "I think this is the most extraordinary collection of talent, of human knowledge, that has ever been gathered at the White House, with the possible exception of when Thomas Jefferson dined alone." Jefferson’s library formed the basis of the Library of Congress. He was probably, after Benjamin Franklin and Benjamin Thompson (later Count Rumford), the most internationally famous scientist and intellectual in America at the time.

    So then, how to react when confronted by something like this:

    I advance it therefore as a suspicion only, that the blacks, whether originally a distinct race, or made distinct by time and circumstances, are inferior to the whites in the endowments of both body and mind. It is not against experience to suppose, that different species of the same genus, or varieties of the same species, may possess different qualifications. Will not a lover of natural history then, one who views the gradations in all the races of animals with the eye of philosophy, excuse an effort to keep those in the department of man as distinct as nature has formed them? This unfortunate difference of color, and perhaps of faculty, is a powerful obstacle to the emancipation of these people. --Thomas Jefferson, “Notes on the State of Virginia”

    Jefferson’s ownership of slaves made him a part of his culture, and his racist views were also part of that culture. This is that “cultural relativism” we hear so much about. Those who decry cultural relativism must then decide whether Jefferson was an evil man, or whether slavery wasn’t so bad as all that. Since I have no quarrel with cultural relativism per se, I’m willing to give Jefferson a pass on the slave owning, though not a full pardon. Washington gets a full pardon; he freed his slaves at his death. Jefferson supposedly wanted to do the same, but he’d run up so many debts that his estate couldn’t afford the gesture, so his slaves got sold off, families split, the whole horror show, all because Jefferson just had to add that extra staircase onto Monticello and import a few more varieties of plants for his experiments.

    But those sorts of moral transgressions are transient, personal, and local. The same cannot be said for the scientific racialism that he expounded as a whitewash for his own personal good fortune of having been born white and rich in a society whose wealth depended upon slave labor. It may be asking a lot for someone to give up all those benefits of position and privilege. But to use one of the finest minds of his era to rationalize that situation, that is a crime that continues to this day. Certainly scientific racism would have existed without Jefferson, but he was one of its originators in this country. And that was a crime against both free society and against science.

    Tuesday, April 17, 2007

    The Big Three

    [from archives; originally from April 2006, so Google results may vary]

    Three authors dominated post-WWII science fiction: Robert Heinlein, Isaac Asimov, and Arthur C. Clarke. They weren’t necessarily everyone’s favorite authors, but they were the ones that had to be acknowledged, and often they were the intro to the field for the new reader / neo-fan. And the general consensus has stood the test of time; these three are still among the most famous science fiction writers. A quick Google search shows 6.5, 11, and 14 million hits, respectively for the three, with only Philip K. Dick (14 million) and Ray Bradbury (8.4 million) as equivalents in fame.

    (For reference: Tolkien gets 26 million Google hits, Rowling gets 25 million, Hemingway gets 22 million, and Hunter S. Thompson gets 24 million. Shakespeare manages 122 million, and I’m guessing he’s the writer standard. To be more famous than Shakespeare, you have to either have a major religion – Jesus gets 280 million hits – or be the father of your country – Washington gets 2.6 billion hits. Founding a minor religion does help, however; L. Ron Hubbard gets 5 million hits, and it isn’t because of Typewriter in the Sky).

    So why these guys and not, for example, Fritz Leiber, Henry Kuttner, or Ray Bradbury? Kuttner, of course, died young, which was career limiting in his case. Leiber was as much a fantasist as a science fiction writer, though he did publish in Astounding on occasion, just not frequently enough. The Astounding factor was even more apparent with Bradbury. Not to put too fine a point on it, but Bradbury may have been the favorite SF writer of school teachers and Bugs Bunny, but he didn’t publish in Astounding, and that was that.

    Asimov, Heinlein, and Clarke, on the other hand, were science fiction writers. Even on those rare occasions when one of them wrote fantasy, it read like science fiction (e.g. "Magic, Inc."). Nevertheless, they filled different ecological niches.

    Heinlein is the easiest of the three to categorize, at least in terms of what he offered to the core audience of upwardly mobile bright guys. Heinlein offered the personal and political philosophy of the competent man. His authorial voice was pedantic in the best sense; readers got the feeling that they were learning something from his expository asides, and the antagonists in his stories were the same small-minded ignorant people that his readers dealt with (or at least thought they dealt with) every day.

    Clarke was the mystic and transcendentalist, as befits one strongly influenced by Olaf Stapledon. In reviewing a list of Clarke’s short stories, I was struck by just how often Clarke wrote about the end of the world. From his first story, “Rescue Party,” to the tongue-in-cheek “The Nine Billion Names of God,” to the novel Childhood’s End, Clarke seemed to delight in just doing away with it all.

    Yet, at the same time, Clarke’s science fiction offered immortality and transcendence. In Childhood’s End, humanity merged into a group mind and left the planet (destroying it in the process). In The City and the Stars, the citizens live for a thousand years, “die” and are then reborn with a set of edited memories, a technological reincarnation, with hints of karmic justice tossed in. In 2001, the Childhood’s End scenario runs again, only this time for a single individual, and the Earth is still around at the end of the book (though its fate is in apotheotic hands).

    What Clarke offered to the intent reader, therefore, was science fiction as an alternative to the faith-of-our-fathers. You get all the elements of fundamentalist Christianity (eternal life, apocalyptic visions, God-like vistas), but with faith in technology as the underpinning, rather than just faith alone.

    So what about Asimov? He was obviously the Secular Humanist of the trio, with the additional attraction of the “I am one of you” persona. He made no secret of his fanhood; indeed, he gloried in it. Then, when he decided to embark upon another career as a popularizer of science, he did so in essays that were accessible, personal, and fun. His readers came away with not just knowledge, but the belief that they’d like to have dinner with Isaac, and, moreover, that they could have dinner with Isaac, given a bit of luck. And they were right, because Isaac in person was pretty much the same as Isaac on the page: educated, garrulous, witty, erudite. He was many a fan’s image of his own ideal self, the nerd as hail-fellow-well-met.

    Following the establishment of the science fiction ecology, the niches were bound to be further carved up and refined. The entire genre of “military science fiction” owes most of its existence to Heinlein’s Starship Troopers, just as “libertarian science fiction” (and libertarianism generally) owes much to The Moon is a Harsh Mistress. In the 1960s, Harlan Ellison became Lenny Bruce to Asimov’s Henny Youngman, using the personal essay as an adjunct to a fiction career, while others noticed that there might be a market for science writing, if done appealingly. And if Heinlein’s Destination Moon gave a boost to any kid who wanted to work for the space program, Clarke/Kubrick’s 2001: A Space Odyssey gave notice to a whole generation of dopers that space might be really trippy, even if you didn’t actually go there.

    The Historian

    Nor is there any such creature as a self-made man or woman. We love that expression, we Americans. But every one who’s ever lived has been affected, changed, shaped, helped, hindered by other people. We all know, in our own lives, who those people are who’ve opened a window, given us an idea, given us encouragement, given us a sense of direction, self-approval, self-worth, or who have straightened us out when we were on the wrong path. Most often they have been parents. Almost as often they have been teachers. Stop and think about those teachers who changed your life, maybe with one sentence, maybe with one lecture…

    I met David McCullough in 1972, I believe, sometime near the publication of The Great Bridge. He typically speaks to large audiences these days, but on that day there were maybe eight of us, members of the student/faculty Library Advisory Committee at RPI. He spoke to us about Washington Roebling and the Brooklyn Bridge.

    Washington Roebling was the son of John Roebling, a designer of suspension bridges and the founder of a company that made “wire rope,” the cables, in other words, that were necessary to the construction of such bridges. John Roebling had begun the design and construction of the Brooklyn Bridge when he died, in 1869, of tetanus contracted after an injury during the construction; his foot had been crushed between a ferry and the dock.

    The task now fell to his son Washington, who produced the final bridge design and who oversaw the construction until “caisson disease,” which we now call “decompression sickness” or “the bends,” crippled him and more or less confined him to his bedroom. His wife Emily then took over day-to-day oversight of the construction, conveying her husband’s orders to the construction site. And just imagine the problems that this arrangement created.

    Washington Roebling was a graduate of RPI, as were a number of his assistants. This group of civil engineers then fanned out across the country and were in large measure responsible for “the golden age of bridge building” in the U.S. One of the perennial favorites at RPI was the Tacoma Narrows Bridge, old “Galloping Gertie,” which ran afoul of aerodynamics and mechanical resonance and fell down when a constant wind of the right speed hit it. It wasn’t designed by an RPI alum, but it was a ringer for the Bronx-Whitestone Bridge, which was. The Tacoma incident is the source of one of my favorite catch phrases: “Worrying about things I haven’t thought of yet.” No one expected aerodynamics to be important to bridge design, at least not until a bridge fell down.

    Some of Washington Roebling’s papers went to Rutgers, but a sizable number of them became the Roebling Collection at RPI. McCullough came to RPI to research his book. He found the Roebling Collection in cardboard boxes, stacked in a closet in the library. Judging by what SF author John Barnes says about such matters, RPI was doing fairly well by just not throwing it all away.

    Now, of course, the Roebling Collection is a big deal, housed in the special documents section of a nice new library (okay, it was new when I left, so it’s actually now over 30 years old). And David McCullough is a large reason for that.

    We did a big review of The Great Bridge in the Rensselaer Engineer, pulling in a bunch of material from the Roebling Collection that didn’t even make McCullough’s book. I don’t recall if the alternate “minaret style” bridge tower drawing is one of those, but we ran a drawing of the full bridge across the top of the multi-page article, emphasizing just how long that sucker is.

    Needless to say, McCullough’s talk was absolutely riveting. Everyone is now familiar with his voice and style from all these years of PBS narration, but imagine what it was like to get it fresh, live, without warning. And McCullough is an historian’s historian. He loves his subjects, and he loves the minutiae of scholarly research. That LAC talk was probably a big reason why I once spent a week reading an entire year’s worth of newspapers from 1911 New York. It’s impossible to convey the feeling unless you’ve done it, but McCullough manages to communicate the fascination of it.

    I seriously doubt that David McCullough would remember me at all, though he might remember having given a talk at RPI about The Bridge and the source material that helped him tell its tale. That’s the way it should be, after all. For that matter, I expect that he would be pleased to know that, however much I remember him and his talk, I remember more about what he said, and I then took the time to look at the Roebling Collection on my own, and to direct others’ attention to it, because the history is more important than the historian, even a very good and successful one.

    There’s a line in one of the letters written by John Adams where he’s telling his wife Abigail at home, ‘We can’t guarantee success in this war, but we can do something better. We can deserve it.’ Think how different that is from the attitude today when all that matters is success, being number one, getting ahead, getting to the top. However you betray or gouge or claw or do whatever awful thing is immaterial if you get to the top…

    We walk around everyday, everyone of us, quoting Shakespeare, Cervantes, Pope. We don’t know it, but we are, all the time. We think this is our way of speaking. It isn’t our way of speaking – it’s what we have been given. The laws we live by, the freedoms we enjoy, the institutions that we take for granted – as we should never take for granted – are all the work of other people who went before us. And to be indifferent to that isn’t just to be ignorant, it’s to be rude. And ingratitude is a shabby failing.

    All quotes from David McCullough.

    Monday, April 16, 2007

    Wisdom from Ed

    I cried because I had no shoes, 'till I met a man who had no feet. So I said, 'You got any shoes you're not using'?
    -- Steven Wright.

    My last summer at RPI, when I was putting some finishing touches on my Master’s Project, I had a roommate named Ed Kulis. This isn’t going to be a “funny name” essay, though I acknowledge the potential.

    I had a pretty good apartment down on 10th St., close enough to campus to walk if I wanted, far enough that it wasn’t embarrassing to drive. I’d had a couple of other roommates while there, but there were also several stretches where I lived alone. One of the nice things about the place was that it was cheap enough that I could do that.

    Anyway, Ed always drove. The first thing I ever noticed about him was that he walked funny. A short while into our acquaintance, when we were walking across a parking lot, having just climbed some stairs, I asked “What’s wrong with your legs?”

    “What legs?” he replied, and grinned a little.

    Several years earlier, Ed’s car had broken down on the Sawmill Parkway, just north of New York City. It was a VW, engine in back, and while he was checking the engine, a truck hit him. He woke up several days later in the hospital, minus much of his legs. One ended mid-thigh, the other had a bit left below the knee. His “funny walk” was nigh onto a miracle of rehabilitation therapy and athletic level control of his prostheses.

    The whorehouse Madame answers the door and at first doesn’t see anybody. Then she looks down a bit and sees a guy in a motorized wheelchair. He’s a quadruple amputee, no arms, no legs. “Don’t just stand there,” he says. “Let me in.”

    “But you’re a quadruple amputee!” she blurts out.

    “I rang the damn bell, didn’t I?” he replies.

    Ed suffered through multiple surgeries, then months of recovery, followed by grueling rehab. He once told me that the day he walked out of the rehab center, on his own power, was the happiest day of his life. And he’d vowed never to take anything in life for granted, ever again. Every day after was a gift, he’d told himself.

    “But you know,” he confessed to me. “Eventually it wears off. I still remember the feeling, and the promises I made to myself, but then homework has to get done, I have a little fight with my girlfriend, I get pissed at some asshole in the supermarket, and then I find myself at the end of the day just exhausted, watching some lame TV show that’s as big a waste of time as anything I’ve ever done. And I think to myself, yeah, today was a real gift all right.”

    Then he grinned and shrugged, “But it’s still better than being dead, and there’s always tomorrow, right?”

    So I knew what was coming, certainly. Ed had told me about it. I had a much smaller chunk of me chopped off than Ed did, and Ed never saw it coming, whereas I had weeks to contemplate my own mortality, including various unpleasant endgames that look much less likely now. The upshot of it was that, right after they amputated my finger, last December, I got that same euphoria and uplift, the sense of being really, really connected to it all. There were people I spoke with in the days following my surgery who thought that I was looped on some major pain medication, when the most I ever took was over-the-counter naproxen and half a Vicodin, that being close to my nausea limit in the narcotics department.

    High on life, yessiree Bob. That’s me.

    And I knew full well that it was transitory, and that I’d soon enough be back to the day to day of work, and worrying about the state of the world, and trying to figure out how long it will take to pay off the medical bills, and how that’s going to impact other plans for the future. I even knew that the loss of the use of a digit was going to be irritating, in many minor ways. But it still surprises me sometimes with the inconvenience of it.

    Then every now and then I think of Ed, and what he had to go through. He got a nice insurance settlement, so he had a tricked out car that let him control everything from the steering wheel, and if he retained any fear of automotive collisions, I think he overcompensated a bit by driving a little wild. I, on the other hand, now scrutinize every mole and blemish, then roll my eyes and tell myself to get a grip. So then I squeeze the exercise putty and do the breathing exercises as I return to full Aikido training. I go to the free practice every Tuesday and regular practice every Saturday morning. My shoulder rolls and back falls feel good again, but I haven’t yet fully committed to both classes on Thursday night, and it’s hard to tell the difference between pacing myself and fear.

    I hope Ed’s doing well, because that’s actually important to me, albeit in a pretty distant way, because I haven’t heard from him since we roomed together. I bet he’s still got the grin, though.

    It’s all a lot better than being dead, and there’s always tomorrow, at least until there isn’t, and that’s also part of life. One thing Ed didn’t warn me about though, was how everything I said about the matter in the first few weeks seemed really profound, but now it’s an effort to get it above trite.

    I cut my finger. That's tragedy. A man walks into an open sewer and dies. That's comedy.
    -- Mel Brooks

    Crossposted at We Are All Giant Nuclear Fireball Now Party Blog.

    Sunday, April 15, 2007

    Follow the Money

    For some reason, I know several people who might be considered academics without portfolio, or, more precisely, without academic positions. These are people who regularly behave as scholars, scientists, or mathematicians, even to the point of publishing papers in academic journals, but who nevertheless do not have positions at any university or similar scholarly institution. I grant you, this is a little weird, to publish without the prod of “publish or perish,” or really, any of the other usual incentives. But there you go.

    In some cases, the lack of apparent incentive is just that, appearance, which is to say that they are still keeping their C.V.s polished in the hope of some future connection. Others, by contrast, do it just because they can, and they got into the habit of doing so at some time or another. I think I’m probably in this latter group, in that I’ve got a few unpublished papers that I update from time to time, and every now and then I begin another. Maybe I’ll get around to publishing them someday, maybe not. It all depends on how I feel about it, I suppose.

    Anyway, long introduction out of the way, one such friend of mine once had a paper that he wanted to publish in the Journal of Indo-European Studies. For those in the know, this publication was long associated with Marija Gimbutas, the originator of the “Kurgan hypothesis” connecting the archeology of Neolithic burial mounds in southern Europe with the linguistics of the “proto-Indo-European” languages, to some acclaim (from Joseph Campbell among others) and considerable skepticism. Both the acclaim and the skepticism were linked to some pretty spectacular leaps she made in the formulation of “The Goddess hypothesis,” suggesting that Neolithic European cultures were matriarchal, later to be overwhelmed by invasions of patriarchal Indo-Europeans.

    My scholar friend is a “Red diaper baby” and sometimes referred to Gimbutas’s “impeccable leftist credentials,” for reasons that will soon become evident.

    The Journal of Indo-European Studies has a “sister journal,” Mankind Quarterly. Both were founded by a fellow named Roger Pearson. I forget whether it was Pearson or his co-editor who suggested to my friend that his paper might be published in Mankind Quarterly. It may have been the co-editor, J. P. Mallory.

    My friend had never heard of Mankind Quarterly, but I had. Why? Because a large number of the citations given in The Bell Curve are from Mankind Quarterly. And the reason for that is that Mankind Quarterly is the go-to journal for papers dealing with the scientific basis of race, or, put another way, the primary purveyor of scientific racism in U.S. academia.

    I’m going to get lazy here and just quote the Wikipedia entry:

    Pearson was brought to the United States in 1965 by Willis Carto of the Liberty Lobby, and contributed to some of Carto's publications, such as Western Destiny and at Noontide Press. At the end of the 1960s, he parted with Carto, and successively taught at Queens University, the University of Southern Mississippi and Montana Tech. During his tenure as dean at Montana Tech, Pearson received $60,000 from the Pioneer Fund.

    In 1975, he left academics and moved to Washington, D.C., where he founded the Council on American Affairs. He also joined the editorial board of Policy Review, the monthly Heritage Foundation publication in 1977, but was forced to resign in 1978, after the Washington Post exposed Pearson's background following the 11th Conference of the World Anti-Communist League — which he chaired.

    Pearson also held the directorship of the Institute for the Study of Man, a group which was alleged by Searchlight magazine to have received $869,500 between 1981 and 1996 from the Pioneer Fund (Mehler 1998) and which under Pearson acquired the peer-reviewed journal Mankind Quarterly in 1978. Pearson simultaneously took over as editor and has remained editor through to the present day, though his name has never appeared on the masthead...Auschwitz doctor Josef Mengele's advisor, Otmar von Verschuer, was on the editorial advisory board of this journal before his death in 1970.

    Pearson is a former officer in the British Army, serving in India when it was still a colony. He’s also a member of the Eugenics Society. I knew of maybe half the stuff mentioned in the Wikipedia article, having seen a piece on The Bell Curve in The Skeptic magazine. I conveyed all this to my friend.

    So my friend had run across the fact that Marija Gimbutas, one of the patron saints of the neo-Pagan movement, was academically in bed with a guy promoting eugenics and scientific racism. More to the point, my Jewish, Red-diaper-baby friend was being asked to publish in a journal that had once had a Nazi on its board.

    He was in a quandary. My opinion was basically, a publication is a publication. Both journals were academically respectable, despite everything, and neither would be a blot on his C.V. He did ask for other opinions, and, interestingly, perhaps the most telling came from his African-American colleague in the anthropology department of a nearby university. Said colleague opined that anthropology journals were all so racist anyway that it barely made any difference. Not exactly confirmation of my opinion, but pithy nonetheless.

    My friend finally made the decision to hold out for his first choice and his paper appeared in the Journal of Indo-European Studies, not Mankind Quarterly. Fair enough, but it got me thinking.

    Roger Pearson, whatever his vices or virtues, pulled in a lot of money from various sources, sources that I personally find more than a tad “icky.” Is there such a thing as “tainted money?” One might ask a neo-Pagan about that. I have no idea what sort of strings were attached to the money Pearson got; most likely there were none, at least not in the sense of quid pro quo. It’s just that Pearson is who he is, and the people with the money thought he was a good person to spend some of theirs. I doubt he disappointed them.

    There’s a saying about politics: “If you can't take their money, drink their booze, eat their food, screw their women, and still look them in the eye and vote against them in the morning, you don't belong here.” I’ve always thought more or less the same thing about science and academia, but the fact is that if you play it that way, you don’t necessarily get to have a long career, not unless you get tenure, and the early plays in that game separate a lot of sheep from goats. There’s a lot of money out there for anyone who genuinely believes that tobacco is harmless, that global warming isn’t caused by fossil fuels, or that rich people are rich because of virtue and intrinsic worth.

    But there’s just as much money for anyone who doesn’t believe those things, provided they’re just as able at making the case as those who believe them. And cynics don’t have any problem jumping from one funding source to another. The ones who are really committed to the King are the courtiers, and many of them don’t care who’s the King, so long as there is one.

    Friday, April 13, 2007

    Another Lifeguard Story

    Here is another “saving from drowning” story that I like to tell; it doesn't have anything to do with me, except that I read it in the Red Cross Lifesaving Manual when I was taking the YMCA lifesaving classes. It's attributed to an 1894 book on Swimming from something called the "Badminton Library." The Red Cross manual gives the bare bones of it:

    A boat load of mill workers were being ferried across the Clyde one evening. The boat was badly overloaded and had not proceeded twenty yards from the dock, when it listed suddenly and overturned. One James Lambert, a powerful swimmer by the record and a good waterman, found himself in the water gripped about by as many men and women as could lay hands on him while others held to them. With marvelous self-possession and cold courage he allowed himself to sink to the bottom with his burden and found the water to be about ten feet deep. Being quite unable to swim because of the manner in which he was held, he, nevertheless, contrived to get his feet down and shove diagonally to the surface and some few feet toward the dock before he sank again. Thus alternately driving off the bottom, getting a breath of air and sinking again he managed to near the dock where ropes and boathooks were used to relieve him of his burden. Upon checking, it was found that he had brought in sixteen or seventeen of the unfortunates, but he did not rest there. Plunging in again and yet once more, he brought to shore first, two girls and then another girl and her young man, who were drowning together. Then, and this is the irony of it, he found himself clinging to the quay so spent that he would inevitably have sunk and drowned if an old and decrepit man had not seen him and, extending his cane to him, towed him along the quay into shallow water and helped him out. Thus the most spectacular rescue of all time would have ended tragically for the hero, if the old man had not used effectively, if not as spectacularly, the only means at hand commensurate with his strength and ability. The story points its own moral.

    I've ruminated on this tale quite a lot, because I think that there is much more here than a good fable about the importance of unspectacular rescues. I think that there is something here for writers, for instance.

    This story is a good example of pure plot, in that there is really no "characterization" to it at all. We learn nothing about James Lambert other than his abilities and his heroism. Even the heroism is a little ambiguous, in fact, because the first part of the rescue was "thrust upon him." He didn't have that much choice in that part, though he obviously did then decide to go back for others; he was wired on adrenaline and hypoxia by then, of course. In short, the heroism could have been utterly characteristic of him, or the only heroic thing he ever did and a complete surprise to all who knew him.

    Of the old man, we know absolutely nothing. Interestingly, for many years, I remembered the second rescuer as being a young boy. Does that change the story? Not really. I like it better with an old man now, but then, I'm closer to being an old man now, so maybe that explains it.

    But imagine the way that this would get treated today. The tabloid reporters would try to learn all about James Lambert. Does he have a girlfriend? Go to church regularly? Is he in school? And so forth. In fiction, it would be even worse. We'd get stories from his past, formative glimpses of his childhood that went into making a hero. All sorts of prying little bits.

    All useless really. I read the story and feel that little catch in my throat that says the story is working. It's all there on the page, and all the background information in the world is not going to answer any important questions. Or the important question, which is "Why did he do it?"

    And my answer to the question is "Because he was there, and because he could."

    I think that "because he could," is often overlooked as a motive and explanation for human behavior, and that's too bad. There is a related motive: "Because he could get away with it." That sounds naughtier, but it doesn't have to be. You could use either explanation for Schindler, for example, with equal effect. Or you can even apply it to James Lambert. And he almost didn't get away with it.

    Then there is the connection between heroism and creativity. Almost any creative act occurs because someone could do it, and most other reasons are secondary. People play sports because they can, and when they no longer can do it, they weep, and for good reason, because they've lost something important. We love because we can and because we can get away with it; we live because we can and for as long as we can get away with it.

    Thursday, April 12, 2007

    Why Science Fiction is Important

    Some while back, I touched briefly on a profound subject, the unprecedented demographic changes that occurred in the 19th and 20th centuries, in America and elsewhere, though I’m primarily focused on the American change for now.

    The change is part of what we call “The Industrial Revolution,” where vast numbers of farmers first left farming, then left rural America for towns and cities. This demographic change first swelled what was called the working class, then the middle class. The vast majority of people in the U.S. at least say they consider themselves to be middle class, whereas even as little as a century ago, the major category would have been farmer.

    So the path went farmer->working class->middle class, with that last transition being really important in the last half of the 20th century, post-Depression, post-War, post-GI bill. And “the bright child of working class parents” describes so many of us, especially science fiction fans, (and just incidentally, me).

    This demographic transition coincided with the Baby Boom, and the transformation of science fiction from a genre among many other genres, to the genre, a dominating force in popular culture and indeed, many other parts of culture. Science fiction (and I'll toss in Fantasy, though in truth SF is a subset of F, not the other way around) is now so dominant in popular culture that "realistic" dramas often slip into SF&F tropes without even noticing.

    It seems to me, therefore, that a critical review of science fiction is necessary for an understanding of American culture in the last half of the 20th century, and the intellectual atmosphere that we all breathe. And I will try to make the case that you can’t really understand anything about our country without trying to understand the connections between science fiction and its readers and fans. The “bright children of working class parents” constitute the core population of technicians, engineers, and scientists who have created the modern economy, and this cohort also provides a great deal of the intellectual muscle that powers the present political conversation, for good or ill. The cohort is so massive that its gravitational attraction affects everything.

    The sort of criticism (in the literary sense) that this outlook provides makes many people nervous. It smacks of Marxism, deconstructionism, Freudianism, and a host of other “isms” that some find distasteful. Some of this distaste is legitimate, since deep analysis generally can be used as a club, a put-down, a sneering rejoinder: “You’re not really interested in space travel; you just want to escape from your boring, mundane existence.” “You don’t really care about the environment; you just hate business.” And so forth.

    Nevertheless, my intentions are honorable. I think that popular culture, including pop philosophy and pop psychology, can offer interesting insights into collective and individual belief and behavior. The ideas themselves can have intrinsic interest, but so can the nature of belief in those ideas, and what believers get out of that belief. Science fiction is a literature of ideas and a literature of belief. An analysis of the ideas that appeal to science fiction readers might very well offer insight into the nature of our current world.

    Wednesday, April 11, 2007

    Courtesy and Crime

    On March 22, I posted on my newsgroup an essay on pseudonyms and anonymity, “Pseudonyms Anonymous,” where I noted that, having experimented with pseudonyms and anonymity in some writings in the distant past, I now always use my own name, with full disclosure as to who I am. Anyone who wants to know who I am can find me, and even show up on my doorstep if they are of such a mind.

    Part of this apparent fearlessness is that I’m not that important; if I were seriously famous I might have to take steps against nutcases (see “We Get Letters”). But I’m not that famous and I don’t get many threats.

    Anyway, on March 26 (am I prescient, or what?), a tech blogger named Kathy Sierra posted about how she’d been threatened online, to the extent that she cancelled a public appearance.

    That resulted in a story in my local newspaper.

    This was followed by a Call for a Bloggers’ Code of Conduct by Tim O’Reilly, one of the Web 2.0 mavens.

    O'Reilly's machinations also made the local paper (one of the things about the SF Bay Area is that the Internet is a local story for us).

    The story has a link to O’Reilly’s proposed “Code.”

    Okay, reality check here. Threatening or implying a threat to another person is a crime. It's assault. If anyone made some of the threats that were made to Kathy Sierra in person, rather than anonymously over the internet, said person could be arrested and jailed, especially if there were any reason to suspect that the threat was serious. If it were said in any venue that had security, e.g. a bar, said person would be promptly ejected, probably with some prejudice. Alternately, if someone makes such a threat against you and you hit them over the head with a baseball bat, it would probably be called self-defense.

    One would think that this is an important point. It is nowhere to be found in the “Code of Ethics.” In this “Code,” “stalk or threaten others” is given the same weight as libel and copyright violation, and criminality is never mentioned.

    The first article in the Chronicle also deals a bit with the fact that this sort of harassment is most often directed at women. In each successive iteration, that aspect of the story becomes less prominent. The “Code” is completely gender neutral, as if misogyny weren’t the most prominent aspect of the initial story.

    I hear there’s no racism in America these days as well.

    To divert attention from what are basically anonymous hate crimes towards so ill-founded a concept as “courtesy” is beyond ludicrous. There are some people who consider “courtesy” to be violated if anyone uses “foul language” i.e. the usual sort of language used every day in the military or college dormitories. Others consider it to be discourteous to refer to a black Presidential candidate as “articulate” or “clean.” Context can mean a lot when it comes to “courtesy.”

    Hate speech is hate speech, racism is racism, and misogyny is misogyny. These exist on both the right and the left, although my own experience is that the right is somewhat more open about its racism and misogyny, while the left tends to get much more upset when theirs gets pointed out to them (that’s probably just me being discourteous).

    Just because women like Ann Coulter and Michelle Malkin get hateful emails doesn’t mean that the emails came from “lefties.” A lot of sick people are simply beyond politics. But such events do feed into the sense of grievance that both of them bring to the table, and I’m sure that both believe the hate mail comes from their political enemies. Similarly, abusive threats to feminists and women generally are political only in the sense that everything is political, although the orchestrated attacks on the two female bloggers who worked for John Edwards were specifically political. However, despite appearances, deranged nitwits aren’t that big a political bloc.

    The enablers of those deranged nitwits, however, are.

    A Little Bit Like Affirmative Action

    [From my archives]

    It would have been sometime in 1967, as nearly as I can recall. I was in high school in Tennessee, and I got a call to go down to the Guidance Office to see somebody. I may have gotten out of a class for it, or maybe it was during a study period. That doesn't matter at this distance, though it probably did at the time. Often you don't remember all that much about the events that change your life.

    The man who wanted to see me was from a college I'd never heard of: Rensselaer Polytechnic Institute. He was head of the RPI Admissions Office. Years later I learned that he had a habit of doing what he did that day. Whenever he was away from RPI on business, he made a point of making a recruiting call or two at several high schools in the area he visited. He'd call up the local guidance counselors and ask if they had any bright, scientifically and mathematically minded students who might consider attending RPI. I fit the bill, so I was it for that afternoon. We talked a while about RPI, what it was, why it would be a good school to attend, and he left a catalogue that I read carefully over the next few months. Ultimately, RPI was one of the schools I applied to, and it was the one that I attended, as both an undergraduate and a graduate student.

    The man's name was David Heacock. I met him several years later, at a student/faculty/administration party, and during a conversation, we realized that he was probably responsible for my attending Rensselaer. Neither of us had recognized the other, of course. We spoke for a while about his intentions on his recruiting trips. What did he have in mind?

    Rensselaer is an old school, the oldest engineering college in the nation, in fact, or it has a claim to the title. Located in Troy, New York (a goodly distance from Tennessee, which was one of my criteria for a college), it is quite well-known in some places, places like New York State, especially New York City and Long Island. It is also well-known in some circles, like among engineers generally, or within General Electric. GE has research laboratories just across the Hudson from RPI in Schenectady.

    But it wasn't unusual that I hadn't heard of it, because RPI is generally unknown outside of its own cultural niche, and its own geographical area. Most often, people in most parts of the nation, if they've heard of it at all, remember it as having a good hockey team. That isn't really quite what the RPI administration would like to be the school's main claim to fame, and some of the admissions people took steps to try to expand its reach. Steps like making a special effort to recruit students from Tennessee, when the opportunity arose.

    During our conversation, Heacock may have used the word "diversity." Maybe not. It was long ago, and the word did not have as much baggage loaded onto it as it does these days. Besides, I'm white, and the word isn't often applied to different groups of whites.

    Still and all, I understood early on that I had a bit of an edge. RPI wanted me to attend. They recruited me. They accepted my admission. They gave me scholarship money. Was I at the top of my high school class? No, I was 11th. SAT scores? They were pretty good, but there were guys from Long Island with higher scores (especially in math) who didn't get in. But I was on the forensics team, a lifeguard at the downtown YMCA; I'd won a local short story contest, placed second in a swim meet once -- all things that added to the notion that I'd be more than an average RPI student. Which, of course, I turned out to be. Average students didn't wind up at those student/faculty/administration parties.

    But all the things that gave me that edge over the generic New York student, all of them still come under the heading of difficult-to-quantify—judgment calls in other words. I say, “difficult-to-quantify,” but they did attempt to quantify it, with formulae for admission and financial aid, formulae that took all the extras into account. That moves the judgment call to the realm of deciding whether lifeguarding quantifies to the same thing as playing in the band, or being in the forensics club is as good as those last 10 points on the SATs. When all is said and done, what happened was that the admissions people looked at me and decided they wanted me to help change the mix of students at RPI, to make the place something more than a regional engineering school.

    And if that Long Island student had wanted to make the argument that he was being discriminated against, he'd get no counter-argument from me. It was discrimination, and it was to my benefit.

    Over the years, I've seen a lot of discrimination both against and in favor of various people, and various types of people. When I got to RPI, I spent the first couple of months getting rid of the last vestiges of my southern accent (I still remember my date who teased me because my speech rhymed "pen" with "tin."). People with southern accents, white or black, are assumed to be stupid, you see. On the other hand, being tall, blonde, and Anglo has often worked in my favor, and I'd be an idiot not to know it.

    When a kid gets into college because he's the son of an alumnus, that certainly discriminates in his favor; and how many black high school students can have that sort of edge? Not as many as whites, of course. Likewise, whites have an edge when it comes to just plain buying their way into schools. Several hundred years of economic discrimination leave effects that wouldn't disappear overnight, even if racial discrimination were to vanish, which, of course, it hasn't. People of African descent are still at a disadvantage, still discriminated against. There are people who think otherwise, or say they do. I hope that they are merely wrong, and not lying, but I know that many of them are, at best, deluding themselves.

    I've been writing mostly about discrimination in white and black, because that's what I grew up with. But the same arguments hold true for broad numbers of college-yearning boys and girls, children of immigrants, the racially or culturally disadvantaged, even (as was my case) the geographically disadvantaged. Most of these discriminatory disadvantages can be addressed without much comment. I was geographically disadvantaged when it came to the college I eventually attended: I'd never heard of it. A guy came by my school to try to change that.

    All perfectly legal, and not worthy of much comment, apparently. But if someone wishes to try to address the ongoing advantages that white people enjoy, and the disadvantages that African-Americans suffer, there comes a gnashing of teeth and a wailing that this is horrible, that racial discrimination is so odious that it must never, ever, occur. So the lackluster sort of racial discrimination that is called Affirmative Action comes under harsh attack. This attack, it seems to me, primarily comes from those who have all their lives been beneficiaries of exactly the sort of discriminatory edge that got me into college. A discriminatory edge that would apparently have been illegal if it had been because of my skin color, but was just fine since it came from where I lived. Indeed, such is the nature of racism that people often do not even notice the sorts of discrimination that don’t involve race, except possibly in the extreme cases, like opposing partisans noting that G. W. Bush did not get admitted to the Ivy League on the basis of his high SAT scores.

    Then there is this aspect: None of the advantage would have amounted to anything if Mr. Heacock hadn’t specifically recruited me. Universities like to recruit wholesale, because it’s cheaper, but I was recruited at retail. That is where so many Affirmative Action programs turn out to be just lip-service: the gatekeepers give the little extra edge in an admission formula, but there isn’t really any active recruiting to back it up; everyone prefers wholesale to retail. The minority individuals who do manage to work the system (which has been every American’s God-given right since the nation was founded), are then told that their achievements are suspect because of all the unfair advantage that they’ve been given. Odd that no one ever told Babe Ruth that his honors were suspect because he never competed against anyone who wasn’t white, but there you go.

    All in all, I have to say that I think diversity is something to applaud. It's done right well by me. And I'm under no illusions that it had to be that way. I know how lucky I was, and how lucky I continue to be.

    Sunday, April 8, 2007


    In November 1973, Leon Harmon, a researcher at Bell Labs, wrote an article for Scientific American titled "The Recognition of Faces." It included pixelated images, most notably this one of Abraham Lincoln:

    The Lincoln image in the Scientific American article had already seen wide distribution through the college engineering magazine circuit; it seemed like half of us put it on the cover. It had been part of a press release from Bell Labs, and well, the first time you see it, it’s a revelation. Or at least it was. Everyone is used to pixelation now, and few people think about it as such; they’re usually more interested in what it’s obscuring, i.e. the “naughty bits” on basic cable channels like E! when the show is about porn stars or Howard Stern re-runs are letting the strippers get professional.
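The pixelation effect in the Harmon image is simple to reproduce: divide the picture into coarse tiles and replace each tile with its average brightness. Here is a minimal sketch in Python, using plain lists of grayscale values to stand in for an image (no image library assumed):

```python
def pixelate(pixels, block):
    """Pixelate a grayscale image (list of rows of 0-255 values)
    by replacing each block-by-block tile with its average value."""
    h, w = len(pixels), len(pixels[0])
    out = [row[:] for row in pixels]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            # Gather the tile's values (clipping at the image edges)...
            tile = [pixels[y][x]
                    for y in range(by, min(by + block, h))
                    for x in range(bx, min(bx + block, w))]
            avg = sum(tile) // len(tile)
            # ...and paint the whole tile with the average.
            for y in range(by, min(by + block, h)):
                for x in range(bx, min(bx + block, w)):
                    out[y][x] = avg
    return out

# A 4x4 gradient reduced to 2x2 blocks:
img = [[0, 10, 20, 30],
       [40, 50, 60, 70],
       [80, 90, 100, 110],
       [120, 130, 140, 150]]
coarse = pixelate(img, 2)
```

Stand far enough back (or squint) and the eye's own low-pass filtering reassembles the face from the blocks, which was Harmon's point.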

    A few years after the Scientific American article, Salvador Dali entered the arena with one of his last truly striking works "Gala contemplating the Mediterranean Sea, which at 30 meters becomes the portrait of Abraham Lincoln (Homage to Rothko).":

    Several years ago, I spent several days looking at the face of Mika Tan. You may now think whatever thoughts you wish to think about the psychological meaning of my focusing on the face of a porn star:

    What I was doing was experimenting with various image processing filters. I’ve been fascinated by voice and image processing since college, and periodically I reacquaint myself with the tools available. Things that once took days of mainframe time now take seconds on a mid-level PC, and there are also a lot more tricks and methods available to the professional or the hobbyist, because people are clever, and there are a lot of people interested in images. Do tell.

    I’ve done a lot of experimenting with computer generated abstract art, often starting with more conventional photographic images. One of the things I’ve confirmed along the way is that faces are special; we see them easily, even when they are not there, and modified images of faces carry emotional weight far more easily than other images. We’re built to recognize faces (that’s part of the point of the Scientific American article), and the images go deep.

    So anyway, if you’re going to be nerding about in Photoshop, Irfanview, Corel Photopaint, and Pro Motion for days on end, it’s not a bad idea to begin with an image or images that are both evocative and easy to look at. So that pretty much answers the question of “Why Mika Tan?” At least to me it does.

    Finally, in an earlier essay, I noted the ability to create hidden codes and subliminal sexual images via image processing. So here you go:

    Good luck and never let it be said that I’m a prude, just way too subtle.

    Voyeurism goes deep. A Chicago friend once told me, “You’re not really into voyeurism until you’ve done things like put a photograph of a naked girl in a glass cabinet, lit by candlelight, and then spied on it from behind a chair across the room with a telescope.” – Fritz Leiber

    Saturday, April 7, 2007


    Puritanism. The haunting fear that someone, somewhere, may be happy.
    -- H. L. Mencken.

    Mika Tan bills herself as “The Nice Girl with the Naughty Job.” I ran across her MySpace page a few weeks ago, and I was struck by her self-description:

    I am a Taiwanese/Okinawan (mom) and Japanese/Samoan (dad) mutt who grew up in Oahu, Daly City, San Diego, and Guam, in that order. My dad was in the military, so went to 14 schools by the time I graduated high school. I have been described as a free-spirit, goofy and down-to-earth. I have only a few good friends and I treat them like gold; I am one of those kind of friends who will actually show up on moving day with a U-Haul.

    The “only a few good friends and I treat them like gold” might have been copied out of Jung’s Psychological Types. It’s one of the descriptors of an introverted thinking type.

    It might seem strange to consider that a performer, any performer, much less a performer in erotic films, aka “porn star,” might be introverted. The “thinking type” goes just as strongly against stereotype, doesn’t it? In fact, it easily, almost inevitably, fits into Jungian theory. The performing persona becomes an expression of “the shadow self,” and why do you think actors love playing villains, madmen, and the outlandish?

    My wife Amy used to be a librarian. She’s always been bookish, shy, and ill-at-ease in social situations—as herself, at any rate. But she has a performing persona, “Madame Ovary, The Lady with the Flaubert-ghasting Name,” and when she is in character, she is extroversion personified. She is outgoing, magnetic, flamboyant, and, above all, she takes energy from the audience. After a show she practically vibrates, and it sometimes takes days for her to come down. A bit bi-polar? Only in the theatrical sense, not the clinical.

    I am jolly and effervescent most of the time (whoa…but when I am tired, I totally crash). – Mika Tan

    The stereotype of the porn star is a sexually abused, drug addicted, exploited prostitute. I’m sure there is some exploitation involved; this is show business we’re talking about here. But a friend of mine once observed that the adult film industry is the only area of show business where the actresses routinely make more money than the actors. They also form their own production companies and move on to direct films themselves at a rate that I daresay exceeds that in “regular Hollywood.”

    I once read an interview with Georgina Spelvin, the star of The Devil in Miss Jones, where she said that she’d encountered “the casting couch” (the exchange of sex for a stage or motion picture role) much less often in adult films than in the theater or regular motion pictures. Again, exploitation is a relative term.

    One of the chasms in “The Great Divide,” the gap between both sides of The Culture War, is the attitude toward pornography and those who make it, especially the performers. Realize that it’s not the actual consumption of pornography where the differences lie. One of the exquisite ironies of the “community standards” court interpretations came when those accused of possessing or manufacturing pornography could point to the internet traffic to porn sites from the communities where they lived. It turned out that porn consumption in those small towns in Utah, Iowa, and Georgia was as high as anywhere.

    But the puritanical smut hounds hate themselves for it, and they hate those who provide it to them. In practice, that comes down to hating women. It finds expression in all sorts of ways, from the demeaning stereotypes to outright stalking.


    On the other side of the Culture Divide there is respect, even admiration for the performers. I remember an article describing how the Saturday Night Live cast treated Seka, another adult film star, when she appeared briefly on the show hosted by Sam Kinison. The word “reverence” was used.

    In many respects, the profession resembles that of the professional athlete. The performers in porn do things that the average person simply cannot do, both physically and mentally. Star baseball, basketball, and football players are sometimes spoken of in the language of the superhuman. Mainstream actors and actresses are praised for “digging deep,” or “exposing raw emotion.” Action stars are lauded for performing their own stunts.

    In mainstream discourse “hard core” is a superlative and a compliment, except when it’s hard core pornography, which, of course, was the origin of the term. But to those who have actually colonized the other side of the cultural divide, the hard core performers are heroic. They’ve broken chains that the rest of us lack the strength to break.

    Thursday, April 5, 2007

    Tihs is Not a Typo

    Many years ago, when I was hanging out on the Compuserve Graphics Forum, I got into a little set-to with the sysops. I was playing with my then new hand scanner, and I uploaded a scan of a sketch/cartoon Dale Enzenbacher had made of me:

    The sysop objected to the not-exactly-a-word “Sh*t.” Actually, the asterisk I’m using here was one of those stylized atom symbols, but you get the idea.

    I thought the guy was being prissy at best, and censorious generally. Well, anyway, we had an exchange, in public, with me, among other things, observing that he was making it out to be worse than it was, and not letting the other members of the Forum make their own judgments about it. Tough titty, was more or less his response, minus the “titty” part, of course.

    So I then proceeded to prepare a series of graphics images, each of which had, one way or another, the word “shit” in them somewhere or somehow. These little rebellions were not, however, easy to spot, and not a single one of them was ever rejected. In fact, for most of them, it was impossible to spot, because I’d camouflaged or otherwise encoded the offensive word.

    The easiest way was to put the word in, then change the color palette value for the color of the letters to match the background. In order to see the word, you had to either change the one color, or color cycle the palette.

    You can also see the message if you download it, then use something like Irfanview to shift the color palette.
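The palette trick works because an indexed-color image stores palette indices, not colors: the hidden word is written with its own index, and hiding it is just a matter of making that palette entry equal to the background. A toy sketch in Python (index arrays and a dict standing in for a GIF-style image; no image library assumed):

```python
# Indexed image: each pixel is an index into a palette of RGB tuples.
# Index 0 = background, index 1 = the hidden "ink".
palette = {0: (255, 255, 255), 1: (255, 255, 255)}  # ink matches background: invisible

# A tiny 3x5 index map spelling a crude "H" with index 1:
pixels = [[1, 0, 1, 0, 0],
          [1, 1, 1, 0, 0],
          [1, 0, 1, 0, 0]]

def render(pixels, palette):
    """Resolve indices to RGB values, the way a viewer would."""
    return [[palette[i] for i in row] for row in pixels]

hidden = render(pixels, palette)    # every pixel renders white: nothing to see
palette[1] = (0, 0, 0)              # shift the palette: the ink turns black
revealed = render(pixels, palette)  # the letterform appears; pixels never changed
```

Note that the pixel data is identical in both renders; only the palette lookup differs, which is why color-cycling the palette in a viewer exposes the message.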

    A variant of that was to take the color difference down to the subliminal threshold, then put a lot of other noise around it. Again, impossible to see unless you were looking for it.

    Then there was the image of the brick wall that I made with the words “This is Art” spelled out in differing sizes, colors, etc., except that one of the hardest to see ones had the letters in “This” rearranged.

    A more obvious one, but one that still got through:

    I did many more of these than I uploaded, because, knurd that I am, I was getting more interested in the process than the actual dispute that had begun it. I’d proved my point, a proof confined more-or-less to myself alone, unless someone else had noticed, and if anyone did, they were obviously on my side, since they never told the sysop. But I was branching out into more intricate and arcane methods of the “secret message” trick.

    There are, in fact, a plethora of methods of putting secret messages into graphics files generally. One trick is to modify the least significant bit of a pixel color code, then use that modification to encode your message. You need a “reference” image to take the difference against, and it helps to first compress the message (like the LZW encoding used in Zip files), which makes it look like noise. If you want, you can just have the reference be a solid color, so your message then looks like noise or static, or something like it, and if you have an image where that sort of texture is appropriate, then Bob’s your Uncle.
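The least-significant-bit trick can be sketched directly. Each pixel's lowest bit is overwritten with one bit of the message: the change is at most one gray level, visually imperceptible, and without a reference image (or knowledge of the scheme) the bits read as noise. A minimal grayscale sketch in Python (flat pixel lists; the compression step is omitted):

```python
def embed(pixels, message):
    """Hide message bytes in the least significant bit of successive pixel values."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("message too long for this image")
    out = pixels[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit   # clear the LSB, then set it to the message bit
    return out

def extract(pixels, nbytes):
    """Read nbytes back out of the pixels' least significant bits."""
    data = bytearray()
    for b in range(nbytes):
        byte = 0
        for i in range(8):
            byte = (byte << 1) | (pixels[b * 8 + i] & 1)
        data.append(byte)
    return bytes(data)

pixels = list(range(64))        # a toy 8x8 grayscale "image"
stego = embed(pixels, b"hi")
recovered = extract(stego, 2)   # b"hi" comes back out
```

Each pixel changes by at most one gray level, so the carrier image is visually unchanged.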

    There was a crank book called “Subliminal Seduction” by Wilson Bryan Key (1974) that claimed all sorts of “subliminal messages” were contained in various advertisements, especially subliminal sexual messages. The fact is that subliminal images have less effect than overt, visible images, but there are all sorts of paranoias, and this one fed the notion that people were being all sexed up by invisible images bombarding them from all sides. Think of the children!

    After playing around with my secret message images for a while, I had the notion of using some of those tricks on pornographic images, writings, etc. I’ll bet you could get a lot of free publicity for a “Subliminal Sex” art exhibition, where maybe only two thirds of the works actually had any sexual content, even of the subliminal kind, so the prissy folks would be getting headaches staring at a black canvas that contained a subliminal image of a puppy, and thinking it was erotic.

    I think it would be more trouble than it would be worth to me, but if there are any artists out there who are hungry for notoriety, you know how to get in touch.