Saturday, July 30, 2005

Defense: The Modern Cathedral

Americans take for granted that the military-industrial complex here is huge. What they simply aren't able to process is how huge. The numbers just get so large, they become meaningless. The US spent $455 billion on the military (and related matters) in 2004 alone. That's over $1,500 for every man, woman, and child in the country (of whom there are currently just shy of 300 million). But consider: publicly, the US armed forces have somewhere in the ballpark of 1,430,000 active troops. That means that, in a year, for each soldier in our armed forces, we're spending about $320,000 - this number drops, but only a little, when all army reserves and national guard are taken into consideration. By contrast, the biggest army in the world (China's) is spending about $13,288 per soldier. India (the 3rd-largest armed forces) is spending $12,907 per soldier. The obvious retort is that these countries have utterly different standards of living, so these figures are misleading. Keeping in mind comparable standards of living, the first European military force on the list (Germany, ranked 19th in the world) spends $105,729 per soldier. Much larger than China or India, but clearly still a huge drop from the States.
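(If you want to check the arithmetic yourself, it's a one-liner. Here's a quick back-of-the-envelope sketch in Python using the rounded figures above - the population and troop counts are the approximate numbers cited, not official statistics.)

```python
# Back-of-the-envelope check of the figures above, using the post's rounded inputs.
budget = 455e9            # US military-related spending, 2004 (USD)
population = 297e6        # "just shy of 300 million"
active_troops = 1_430_000

print(f"per capita:  ${budget / population:,.0f}")      # roughly $1,500
print(f"per soldier: ${budget / active_troops:,.0f}")   # roughly $320,000
```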

So the question is: why? Why does America account for 43.8% of the entire world's combined military spending? It certainly isn't to give soldiers an opulent lifestyle. The classic joke, of course, is that the military spends $3,000 for a hammer. It's the old myth that state-run bureaucracies are infinitely more inefficient than private business, and that this inefficiency, combined with "special projects" that don't make it onto the spreadsheet, inflates the prices. But look at America's recent plunge into deficit spending: we're creating the money from thin air by adding to our debt.

I posit that American military spending constitutes a very modern form of monument construction. Take, for example, the F/A-22 Raptor, said by some to be the most advanced fighter plane in the world. By the time the military's 280 or so planes are finished, they will have cost as much as $71.7 billion to develop and produce, to say nothing of the cost of their upkeep, making them by far the most expensive fighters ever created. Which raises the question: to what end? Did we really need to upgrade from F-16s? Probably not - but for the American military, being "the best of the best" is a matter of pride. So they've spent over $70 billion on a plane to one-up their apparently non-existent competition. After all, the Russian MiG-29 was the equivalent of the F-16, so apparently the US military just had to prove they're in a different ballpark. At the expense of its citizenry.

Amusingly, America is progressing toward some genuinely science-fiction-like behavior. With the advent of unmanned fighter craft, we'll soon be able to fight air-superiority wars (the only kind we seem to be really, really good at) simply by spending money on manufacturing robots. My economy vs. your military - fight! I find this amusing because life imitating science fiction always amuses me. It allows me to stick my tongue out at all those English teachers who asserted that science fiction was a waste of my time. But it raises concerns as well. When manpower stops limiting American military expansion, just how much will they be willing to spend?

It's like the Cathedrals in Europe. Sure, they're impressive now. They're treasured as wonders of their eras, and we weep at their destruction. But how many hundreds of years did they take to build? How many lives did they cost? And, honestly, what good did they do? I've always been in the camp that says that glorifying the deity of your choice is best done through means other than centuries of manual labor by serfs. Is military one-upmanship against enemies who use methods conventional military techniques fail to thwart (like terrorism) really worth glorifying by shaving off the top of every dollar that changes hands here?

Because let's face it - this is a cathedral that history won't look back on as kindly.

Tuesday, July 26, 2005

Personartlity

I'm a fan of improvisational stealth performance art. Mostly, I'm a fan because it's funny, but part of it is also that the people who do it are clearly doing it as a creative act. They're pushing at the boundaries of mundane reality, not by writing a song or painting a picture, but by their actions. The art is in the moment of performance, and everything beyond that is simply secondhand.

This has led me to think about those rare individuals who, while clearly artists, are arguably masters in the medium of personality or social interaction. No one can know if a conversation with such a person is really a performance piece - indeed, these guys never seem to let down the facade. More likely, their true nature is one of constant performance.

Dali is probably the classical master of the form. No one who achieved celebrity has been more continuously and calculatedly loony. Though shunned by the Paris Surrealists, Dali succeeded in making everyone around him a prisoner (or guest) in his own surreal (or even absurdist) world. Ironically, he seems to have been motivated by the most profound sense of iconoclasm. He seemed to adore money, but apparently only because it was taboo for artists to work for money (constituting "low art"). He painted Hitler, apparently to scandalize his contemporaries. He agreed to the making of a documentary about him, but flatly refused to answer any of the filmmaker's questions.

Andy Kaufman is probably the other modern master of this medium. His portrayal of "Tony Clifton" (a role sometimes played by his brother or his friend Bob Zmuda) maintained, for an extended period, the ruse that Clifton was, in fact, a different person. Amusingly, thanks to Zmuda, Clifton lives on (albeit for brief periods) well after Kaufman's death.

So, is it art? Well, I've seen a man wrapped in saran wrap writhe on a stage and be called "dance," so I'm willing to call it art. But it's especially difficult to see where the artistry ends and the art begins. If such a line matters.

Here's a list of the people who I think need to up the ante and really start to blur the line. These few have the potential to really "go nuts" and pull off some hilarious stuff: Robin Williams (It's not too late! Just go nuts!), Tom Cruise (All he has to do is turn around and say, "You thought I was a Scientologist? Fooled you!" and he'll go down in history as someone who wasn't just brainwashed and crazy), and (collectively) the Gorillaz (They don't exist. Anything they do is a little miracle).

Some people, though, are just nuts... or bad with grammar... I haven't figured it out...

Sunday, July 24, 2005

Not His Best Game Face


Now, I'm not a huge fan of the cheap shot. Lord knows I thought ill of Rush Limbaugh for his style of smear, which relied on clearly photoshopped pictures built from the worst shots released by journalists - and I wasn't even paying attention to politics then. But this is really the worst picture of Bush I've seen. Really. Get the man some sunscreen, for God's sake. Like it or not, he's representing us at diplomatic meetings and such.

Provided, I suppose, that he is engaging in diplomacy at all...

Wednesday, July 20, 2005

Fashionable Sitcom Alchemy

The sorts of sitcoms that get made in any particular period always seem similar to one another. Humor, to some extent, is driven by novelty, and so producers have "reinvented the wheel" sitcom-wise from time to time, trying to keep the best (read: popular) traits from previous incarnations. Olde-timey Nick at Nite sitcoms always seem to be "gimmick" shows (They're an ordinary couple, only she's a witch!) or odd-couple shows (He's a yokel, she's a snob! Hilarious!). Some have both (Sailors, an intellectual, a farm girl, a movie star, and rich people, on an island!). The 90s saw major innovation in the form of the "apparently normal weird friends" shows. Their contribution: strong support from a younger age bracket, and a departure from "family" sitcoms, where most everyone was related (entirely removing many sexual-tension gags from possibility). Often, the characters were downright bizarre. Dave Barry once described it as a machine that spits out sitcom ideas to the tune of "X quirky, attractive young adults share an apartment."

And now, there's a new breed: the Awful People sitcom. The protagonists are invariably flawed, often totally unbalanced. Humor is mainly achieved by their willingness to say, do, or publicly endorse things that are utterly unacceptable in the real public world. Usually, these comedy worlds are held together by the thinnest threads of believability, sustaining themselves by virtue of the fact that every character is an awful person (so there's a sort of balance).

There are degrees to this, of course. In some cases, the character flaws, though pronounced, are forgivable. Sometimes, certain characters are a lot more awful than others. In other cases, the show manages to generate sympathy for the protagonists against extraordinary odds. Sometimes, everyone just needs to get slapped around.

There's been some of this in film, recently. Some of it was really good, and a lot of it was really, really bad.

What I want to know is this: if you staff a narrative with awful people, but write them believably, will people still watch? Can people really become fans of the worst kinds of people, just because they want to hear a story?

If so, it would go a long way toward explaining soap operas, apart from the "believable writing" part.

Monday, July 18, 2005

Blame Rockstar!

I saw South Park: Bigger, Longer, & Uncut with my grandmother when it came out years ago. I was visiting her - a kindly old lady from the Midwest - when the movie opened, and in search of common ground, she announced that she wanted to go see this movie with me. Her reasoning was simple: it was "hip," and the critics had reviewed it glowingly (indeed, the New York Times and the San Francisco Chronicle, the newspapers whose movie reviews I most trust, gave it a hearty recommendation). I warned her that this might not be wise - that she might not be ready for the levels of profanity that my generation has become accustomed to. But she insisted. And how could I say no to my sweet old grandmother?

About thirty seconds into the movie's famously crude "Uncle Fucker" song, I glanced at her, to see that her already imperfect posture was being slammed into her seat by shock. By the time the song was over, her face was blank, her mind no doubt blown by the sheer level of crudeness, her body shrunk to a size so small that she was almost invisible. And, as is so often the case in this sort of transformative experience, she passed through the gauntlet and emerged on the other side a new person.

Well, not really. She was still the same person. But being an intelligent person, she "got" what the movie was about: namely, that the tolerance we have for violence in this culture is totally out of proportion to our intolerance for profanity. The movie's message about America's views on censorship can be summed up with one quote, said by the movie's chief zealous mother as part of the "Blame Canada" platform:

"Remember what the MPAA says: Horrific and deplorable violence is ok as long as you don't say any naughty words."

Which brings us to the debacle over Grand Theft Auto: San Andreas. The game has been universally acclaimed by the game industry and players alike, but universally deplored by those who feel a need to be the country's conscience. Let us review the facts:

1. Grand Theft Auto: San Andreas is a third-person perspective mission-oriented game by Rockstar Games, in which the main character is a gangbanger in the early 90s, cruising through an enormous Los Angeles-like game world and advancing a plot of criminal accomplishment which includes (but is not limited to) murder, carjacking, home invasion, drug dealing, pimping, vandalism, and basically every other imaginable crime described in the Gangsta-Rap of the period. Gameplay is open-ended (in that the player is free to explore and do things in any way he/she desires), but requires criminal activity to achieve any level of success.

2. Fans who modify games for their own entertainment (known collectively as "modders") often scrounge around inside the source code for games in order to find stuff that wasn't included in the game, allowing them to re-insert it into gameplay. Usually, this allows them to add their own content (such as new clothes to wear). Modders are usually motivated by two desires: to extend the game's value by adding their own touches, and to earn the respect of other modders for their contributions.

3. A group of modders discovered a mini-game in the source code that was not implemented in the published version of the game. Known as "hot coffee," the mini-game allowed the player to woo several women throughout the game world, and "have coffee" with them. Having coffee, in this case, means having sex. The mini-game can now be played after installing a modification, and involves interactively making love to women, controlling position, thrust speed, and camera angle.

4. Watchdogs have expressed total outrage at the game's graphic and uncensored depiction of sex, and have called for the game to receive the highest ESRB rating available: Adults Only. These watchdogs include possible presidential candidate Hillary Clinton and the Democrats' own closet Republican, Joseph Lieberman, as well as the more traditional vanguard of conservative talking heads.

5. Rockstar Games, makers of the Grand Theft Auto games, have responded that the minigame was not only never intended to be played, but never even really developed, with most of the elbow grease of development done by modders who, in changing the source code, have violated their end-user agreements. Thus, Rockstar asserts, they are no more to blame for the presence of the game than any other game publisher, because nothing (in principle) is preventing people from modding sex mini-games into a whole slew of other games.

Realistically, it's likely that Rockstar did, in fact, develop the mini-game, but pulled it out when they realized the reaction it would get, making their claims a little suspect. But frankly, what surprises me is the extent to which people have genuinely missed the point. By making GTA: SA an "adults only" game, the lesson will be the following: it's less important to shield our kids from playing a game that glorifies senseless murder than from a game that allows consensual sex.

Sure, games are violent. So are a lot of movies. But the GTA games have really, REALLY raised the bar over the years. San Andreas's predecessor, Vice City, was banned from sale in Switzerland for its violence, and Rockstar Games was sued by the Haitian government over a mission entitled "Kill the Haitians." In general, the games encourage a near-total disregard for human life, and reward "impulse crimes." The only crimes the GTA games haven't glorified are (thankfully) rape crimes. On the other hand, the games have been hailed for their design and depth. Various family-acceptable games (such as the Tony Hawk games and the PS2 Spiderman 2 game) have been fundamentally influenced by the design sensibilities Rockstar innovated. Vice City was one of the most successful games of all time, topping many movies in raw profits. San Andreas was billed from day one as "Everything Vice City, Only More So," but despite the occasional angry outburst, San Andreas has been without controversy. Until now.

Are we really such puritans in this country that we are only horrified by sex? Not rape (which is one of the few crimes that has the potential heinousness of murder), but mutually consensual sex. Vice City certainly had sex by the barrel (including a guest appearance by porn star Jenna Jameson), but always off screen, as did San Andreas (when using a prostitute to refill your health bar, for example, all the player sees is the car rocking while some ambiguous sound effects are played). Parents have been buying San Andreas for their kids happily enough, if the huge windfall Rockstar saw from the holiday season was any indication. I'm not saying that San Andreas is a good game for kids. It's not. I'm saying that a sex minigame is NOT the reason it should be an adults-only game. In fact, in my opinion, a game with a sex minigame need not be an adults-only game. Our attitudes toward sex are unhealthy enough as it is. But wanton violence and crime on the level Rockstar has implemented should offend America more than a sex game Rockstar didn't even enable for gameplay.

So why hasn't it?

ADDENDUM: So Rockstar did have the game intact in the source code after all. Well, they'll suffer for it.

Friday, July 15, 2005

Witch Trial

I recently played a great game called Witch Trial, made by Cheapass Games. If you're not familiar with the idea, their manifesto is that you probably have all the props you need to play board games (pawns, dice, fake money, etc), and just need a few new bits to play a new game, which they sell for minimal cost. Anyway, Witch Trial involves taking the role of an 18th-century lawyer and defending/prosecuting "witches," i.e. the homeless, free thinkers, vegetarians, unmarried women, and even the larcenous elderly.

A trial typically goes like this: a suspect is paired with a crime (crimes range from the mild "atrocious manners" to "showing of ankles" to the demonic "tampering with the post"), and "evidence" is brought into play. Evidence, amusingly, is often usable by either the prosecution or the defense. For example, having a loving family is proof of either guilt or innocence, depending on who is making the argument. In fact, of all the evidence in the game, only two can't be used by both sides (Confession being one of them). Witnesses can also be brought in, happily testifying either way, and the charges can get changed in the middle of the trial. In the end, it's still a matter of chance whether the jury will acquit or convict, so the lawyers (i.e. the players) usually plea-bargain at the end so they both profit from the experience.

Which put me in mind of modern politics. The extent to which the Public is unable (or unwilling) to make broad critiques of either party never ceases to amaze me. I long for the days of "It's The Economy, Stupid," but now it's always specific scandals, zeroed in on a particular topic while largely ignoring the general situation. During Clinton's presidency, there was Whitewater and Monica Lewinsky. During Bush's first term, the ambiguous threat of global terrorism somehow became exclusively about WMDs in Iraq. During the election, it was the Swift Boat Veterans and the CBS debacle. Earlier this year, it was the Schiavo case. Now, it's Karl Rove's leak of Valerie Plame's name to the press.

Karl Rove is a classic example of someone a lot of people hate, but no one seems to be up in arms about until a specific charge or issue can be raised, at which point the torches and pitchforks come out of the closet. Ask liberals why they hate Rove, and all you'll likely hear are vague platitudes about how "he's Bush's brain" and "he's the reason they play dirty and win" - rarely any concrete examples. It was exactly the same with conservatives and Clinton: prior to the single damning case, there was wide-ranging speculation that he was a terrible person, without a strong foundation. People judge first, then hunt for evidence, demanding conviction at the first shred of it.

In part, I think this is the result of the intellectual laziness Americans suffer from. In this country, we place great value on our beliefs, but almost none on the supporting evidence. "My opinion" is a phrase used, not to indicate that a topic is subjective, but rather as a solemn vow to defend the topic regardless of rational evidence. We've taken "freedom of speech" and used it as a justification for mindless zeal. Of course, I'm speaking really broadly - a lot of people do marshal the facts (albeit in a highly biased way). But those key moderates who swing the vote in critical states, those Median Voters, I get the impression they revel in their cluelessness.

That's the irony of it all: those most willing to let someone (anyone, really) take them by the hand and walk them through politics are the ones who win elections for candidates. So the candidates fight hardest over that segment of the population, and the political lesson of the 90s was "the sound bite beats the scholar." Politicians build their case to the median voter one wild attack at a time, and whoever builds the more pervasive bias in their favor wins, because those with strong beliefs know who they're voting for well in advance, leaving the undecideds to give the final, critical push.

I just wish people would use the vast information spout that is the internet to get the facts about today's "Scandal of the Week." At least then, they could sound like they had some idea why they were so upset.

Tuesday, July 12, 2005

The Draft

I've never really revised the things I write. Sure, I spell-check, and I check for those common instances where my grammatical intention shifts gear mid-sentence, leading to nonsense, but I don't usually write something with a structure in mind from the get-go, or with planning, or with a series of drafts that lead to a refined final product. I write like I talk, basically.

Doing debate in my youth is probably to blame. Among the tasks was "Impromptu," where you are given a topic and a timer starts counting down, requiring you to build and give a speech on the spot, from scratch. It's not as hard as you'd think, but it takes a little getting used to. Once you can do it, writing like you talk is easy (my typing isn't nearly as fast as my speaking). But it tends to make the process of revision all the more unpleasant, because you've already got a coherent document in hand.

What's really disturbing about it, though, is that if I return to something I wrote a year ago, I don't recognize myself in it. Oh, I'll likely remember what my basic argument was, but not my specific turns of phrase or my examples. The act itself happened quickly and without extensive consideration, so it didn't stick in my memory. The result is a text that I usually agree with, and that seems (to me) well-argued, but that somehow no longer has my "voice."

The problem, I suppose, is partly that I don't think very hard about what I'm saying/writing as I'm saying/writing it. If I met someone who talks in the same style as myself, I likely wouldn't notice the similarity. So seeing my own stream of consciousness fixed in place is an unusual experience for me.

A few times, I've managed to paint myself into a corner. In trying to write a persuasive memo, I started with the basic facts and began to build them toward a specific goal (in the media, it might be called "propaganda"), but about mid-way through, I realized that I'd managed to present those facts in precisely the wrong light, making it impossible for me to come to the conclusion I wanted. Naturally, I began fresh and had no problem the second time. But I could have avoided the problem in the first place by laying some groundwork.

Still, the next time you're having a conversation, try to detach a little part of yourself and have it pay attention to what you're saying and how you're saying it. Try to step back and see whether the logical links you think connect your phrases actually make sense. You might be surprised at how strange your speech is. People in movies don't talk like real people do - real people stutter, shift gears mid-sentence, make big logical jumps. In my case, my writing suffers from the same symptoms, though I don't really notice until later on.

So it goes. I'll probably have the same experience re-reading my earlier posts when I eventually decide to do so.

Monday, July 11, 2005

What Grades Mean

The mid-Nineties saw the rise of the "Ritalin kid," a tragic soul who, through no fault of his own, suffered a chemical imbalance that made him hyperactive and inattentive. Or, if you believe the other side of the coin, the mid-Nineties saw the rise of the "Ritalin kid," a spoiled brat overstimulated during development and driven to chemical imbalance by over-reliance on a new drug that allowed frazzled parents to spackle over their kids' defects rather than try to correct them.

It's not a trivial issue. Near-literal wars have been fought over whether Attention-Deficit Disorder is a real condition (it is), whether it is over-diagnosed in the U.S. (it is), and whether many children are receiving a higher dose of Ritalin than is justified by their condition (they are). ADD and its close cousin ADHD cut deep into long-standing intuitions about the distinction between a "condition" and a "personality." Classically, if a child was unable to attend to complex instructions or to retain a level of self-control, they were labeled "problem kids" and usually also declared stupid. Now, however, because ADD is a "condition" independent of intelligence or intentionality, it is suddenly being treated in a very different way, which initially made many people very uncomfortable.

Then came the parents. Across the country, parents went to bat for their hyperactive or otherwise attention-deficient kids. Like grizzlies defending their children (which, frankly, they were), parents attacked school administrators for being biased against their children for problems beyond the child's ability to control. Ironically, the thrust of these attacks was not that teachers should be more patient, as might be logical. It was that their kids should not be punished because of their condition, and therefore that they should be given special privileges so that their grades no longer reflected their condition. Dyslexic kids got more time on tests, they argued, so their kids, also stricken with a condition, should receive similar treatment. ADD and ADHD became "learning disabilities," qualities wherein a person was suddenly exempt from certain universal standards.

And it's precisely at this point that my line in the sand was crossed. Understand: I have sympathy for people who suffer from serious cases of ADD and ADHD. I've known a fair share, and many are terribly insecure about their propensity for distraction. In my experience, this group often has been forced to work harder than their peers to rise to the occasion, and their pride in their accomplishments is constantly undermined by the ease their peers have doing the same things. But just as often, I've met people for whom the condition is a status symbol, and who demand the pity and consideration of those around them. Instead of buckling down and trying to force their minds to order, this group has stopped trying to think, and acts with the expectation that the world will grant them favors because of their status.

This latter group is a problem because they make up a part of our society that undermines an overall ability to assess skill. Consider the following just-so story: I am a child with ADHD, and as a consequence, I receive double time to take the SATs. The magnitude of my condition isn't taken into account, as long as it is "bad enough." So I, taking my proper dose of medication, am able to concentrate enough that the doubled time actually gives me an advantage score-wise over the poor suckers who didn't get extra time. I now apply to colleges. Presumably, whether or not I am accepted to college X, Y, or Z will be influenced by my SAT score. The same goes for my grades - throughout high school, my parents consistently won me favors in test-taking circumstances, keeping my grades high. The result is a better college-bound resume than those of my peers of equal "intelligence," and I get into a better school.

Would this be fair? A person's grades, arguably, are going to be used as an assessment of their scholastic capability. If given no advantages once in college, the child described in the above paragraph would, I predict, be thrust into an environment in which they would be unable to excel (or possibly even succeed). If, on the other hand, they do get favors in college, what about when they enter the work force? The intuition I'm getting at is that grades should measure performance, and if a person is unable to perform for some reason, that should be reflected in their grades.

"But wait," says he public conscience, "what about the poor dyslexic kid you mentioned earlier? Should be get shafted as well?" After some thought, I've decided that it's a much tougher call for dyslexia than for ADD or ADHD. Here's why: in the case of a perception problem (e.g. dyslexia), the problem is confounding sensory inputs. A severely dyslexic kid would do as well as other kids if the test was read aloud to everyone. Learning disorders that stem from an inability to think clearly, however, are problems of confounding cognitive processing. It's not a matter of mud on the windshield, it's a matter of a wrench in the engine. And, I assert, that is as worthy an influence on grades as run-of-the-mill stupidity.

I think our society is totally ignorant of what it means to have ADD or ADHD. People take concentration for granted, and you can't just tell someone unable to filter out distractions to "focus harder." And they do deserve special consideration with respect to education: teachers need patience and genuine empathy to get through to them. I'm not opposed to having them take tests in environments that are less distracting. But giving them extra time on tests or grading their papers more leniently makes the problem worse. I've met people who gloat about their "special status," and how it's going to give them the edge they need to get the best grades in the class, and it's the kind of pride I can't stand. The mentality of "being disabled" becomes a crutch (in the ultimate irony) that the person is unable to operate without. It gives them an excuse not to try. Grades should indicate capability, and if a person's problems are bad enough that they cannot be overcome, their grades should reflect that.

I'm Loving It (Where "It" Means "Crazy")

Nearly a year ago, when McDonald's brought out its current line of "I'm Loving It" commercials, a friend of mine was watching TV and caught one of the very first. A man working in a car wash is seen eating McDonald's, and we hear his thoughts as he runs his eyes over an oblivious and attractive woman. The man is clearly under the delusion that, by virtue of eating McDonald's, he is somehow transformed into a sex magnet, and that the woman wants him badly. She then walks away, cementing the viewer's suspicions that the man is, in fact, delusional. My friend sort of blinked and thought to himself, "Wait. Am I to understand that McDonald's makes me crazy?"

So he paid increasing attention to McDonald's ads, and informed the rest of us of his hypothesis: the new ad campaign always carried the message "McDonald's causes insanity." And nearly a year later, he's batting 1.000. That's right: every McDonald's commercial for the last year has actually been telling you that eating there causes madness. Sometimes, it's quite subtle, only recognizable to someone overly familiar with the symptoms of various (often obscure) conditions.

A number of notable examples:
  • A man sitting in a McDonald's begins to have vivid psychotic hallucinations that he is playing the Aztec ball game. The ball happens to be on fire.
  • A man tells a woman friend that he just broke up with his girlfriend. The act of eating McDonald's breakfast appears to induce in this woman friend a Fatal Attraction-style fixation on the man.
  • A man seems to have instantaneous transitions between clearly discrete events (e.g. going from sitting on a couch to being in the middle of a basketball game), suggestive of the "lost time" phenomenon commonly described in alien abduction stories, and associated with entering a fugue state.
  • Exposure to a McDonald's poster suddenly causes a man to become fixated on an elderly woman standing in front of it. The commercial seems to have a sexual subtext. Which might appear to suggest that McDonald's creates an unnatural desire for the elderly. However, the truth is made clearer by the following example:
  • McDonald's famously received a lot of flak for their internet banner ads during this period. Specifically, an ad saying, "The New Big Mac: I'd Hit It." Which means that McDonald's actually gives you an unnatural attraction to foodstuffs.
  • A girl walking down the street begins to be pursued by a man composed entirely of graffiti, who always ceases to move when she turns to look. Given that she never looks directly at the man while he is moving, the commercial suggests a constant paranoia on her part.
  • A man becomes irrationally and aggressively protective of his chicken strips, shouting at others to stay away despite no one showing any apparent interest in taking them.
  • An old woman eating a McGriddle (or some other McFood) is attacked by ninjas and fends them off. I'm led to believe this is another example of immersive hallucination.
  • A woman eating chicken strips becomes totally lost in the experience, then dumps her boyfriend for a complete stranger for no reason: a clear case of dissociative identity disorder.
As you can imagine, I could go on. There have been dozens of these commercials. My point is that, with sufficient scrutiny, all McDonald's commercials seem to suggest that eating there blurs reality and might leave you mentally crippled.

It's not just McDonald's. A lot of commercials seem to be suggesting "our product causes you to become irrational." Beer commercials have always been this way, of course (in that they seem to cause men to believe they are surrounded by women attracted to them), but it seems to be spreading to other products. For example, a Wendy's commercial suggests that being given a choice in side dishes causes multiple personality disorder.

So the question is: why would people want to be crazy? They're still making these ads, so someone must think they're working. Maybe, working for an insanity-causing company, the marketing people have gone crackers? It's difficult to say. For my part, I'm waiting for a few disorders McDonald's hasn't tapped yet:
  • The Post-Traumatic Stress Disorder Commercial: A man hears the sound of someone biting into a hamburger and suddenly vividly re-experiences eating at McDonald's. He hears the slurping of a soda and relives drinking at McDonald's. Which causes him to mime leaning onto a counter and reflexively order McDonald's food.
  • The Schizophrenia Commercial: As a person eats some McFood, whispering voices get louder and louder, telling him how delicious the food is, until he suddenly shouts at them, causing everyone else in the room to look at him in surprise.
  • The Bipolar Commercial: With each bite, a man's background goes from dark, quiet, and still to loud, colorful, and frenetic. His last bite leaves things frenetic, and he leaps up to join the dancing crowds.
  • The Sociopath Commercial: A man emerges from his house, goes to a drivethrough, orders a lot of McDonald's food, and drives home, all the while twitching like mad and loathing society. The commercial closes with him eating his McFood and peering crazily through the blinds of his well-shuttered home.
Think about it: do any of these really seem weirder than what McDonald's is currently using?

Thursday, July 07, 2005

I (heart) Statistics

I just love 'em. People have a basic distrust of "statistics" because they think the term means graphs and charts. The truth is that "real" statistics are matrices of data, and it's up to an enterprising individual to figure out their relationships. Graphs can easily be made deceptive, but raw data less so. And I love data.

Electoral-Vote was a godsend during the election, because there was so much data! So I crunched the numbers on my own. As early as August, it was clear that Bush was going to win the popular vote (though possibly not the electoral vote). Why? Well, further analysis revealed that red states had gained population faster than blue states, so there was a red-state advantage in raw voter population.

There was also my short-lived jaunt into selling on eBay for profit. I sold cards from a new Collectible Card Game called "Call of Cthulhu," because I was lucky enough to find a wholesaler willing to sell to individuals (most sell only to retail shops), allowing me prices low enough to make a profit. But I couldn't be sure until I ran the numbers on eBay. So I grabbed the data on what cards were selling for, used a properly weighted random number generator to determine how many cards of various types I would get buying X boxes, put in my various expenses, and got a positive number! Plus, by keeping up this bookkeeping, I was able to anticipate when I wasn't going to be able to make a profit anymore (because statistical trends let me see the writing on the wall before I actually started to lose money), and got out while the getting was good. All thanks to math!
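The check itself was nothing fancy - essentially an expected-value simulation. Here's a rough Python sketch of that kind of estimate; every number in it (box cost, pack count, the price table, fees) is a made-up placeholder rather than the actual figures, and it only models rare-card resale for simplicity.

```python
import random

BOX_COST = 45.00        # hypothetical wholesale cost per box
PACKS_PER_BOX = 36      # hypothetical pack count
RARES_PER_PACK = 1

# Hypothetical resale-value distribution for a random rare,
# built from observed completed-auction prices: (price, probability).
RARE_PRICES = [(0.50, 0.50), (2.00, 0.30), (5.00, 0.15), (15.00, 0.05)]

def sample_rare_price():
    """Draw one rare's resale value from the weighted price table."""
    roll, cumulative = random.random(), 0.0
    for price, prob in RARE_PRICES:
        cumulative += prob
        if roll <= cumulative:
            return price
    return RARE_PRICES[-1][0]

def expected_profit(boxes, fees_per_sale=0.50, trials=1000):
    """Estimate the average profit from buying and reselling `boxes` boxes."""
    total = 0.0
    for _ in range(trials):
        revenue = sum(
            max(sample_rare_price() - fees_per_sale, 0.0)
            for _ in range(boxes * PACKS_PER_BOX * RARES_PER_PACK)
        )
        total += revenue - boxes * BOX_COST
    return total / trials

for n in (1, 5, 10):
    print(f"{n} boxes: expected profit is roughly ${expected_profit(n):.2f}")
```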

What I'd like to do is figure out a way to grab data from the Internet Movie Database, and use it to anticipate career trends. Who, for example, automatically guarantees a movie will suck? We all know certain major indicators of suckage (Rob Schneider, who is the epitome of tastelessness, and Charlie Sheen, who seems cursed to be in movies that suck despite having no glaringly negative qualities himself), but it's surely more interesting than that. So the idea is to grab the list of movies a person's been in, grab the IMDB user rating (accumulated from oodles of votes, giving a populist sense of its quality), and graph the results weighted by the person's casting prominence (i.e., if the movie sucked, but the actor had a cameo, it shouldn't count against them as much as if they had top billing). My guess is that, for example, Robert De Niro would show a marked dropoff in recent years, as compared to the glory days of Scorsese films. But I might be wrong. You could also, to a certain extent, rate the movies based on their score at Metacritic or Rotten Tomatoes, and get the "critic's choice" sense of it as well.
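If I ever do get the data, the analysis itself would be trivial. Here's a hypothetical Python sketch of the billing-weighted average I have in mind - the Credit structure, the 1/rank weighting, and the example numbers are all illustrative assumptions, not real IMDB data or any official weighting scheme.

```python
from dataclasses import dataclass

@dataclass
class Credit:
    year: int
    rating: float      # user rating on a 1-10 scale
    billing_rank: int  # 1 = top billing, larger = smaller role

def billing_weight(rank):
    """Weight a credit by prominence: a cameo counts less than a lead role."""
    return 1.0 / rank

def career_trend(credits):
    """Return a billing-weighted average rating for each year."""
    sums = {}
    for c in credits:
        w = billing_weight(c.billing_rank)
        total, weight = sums.get(c.year, (0.0, 0.0))
        sums[c.year] = (total + c.rating * w, weight + w)
    return {year: total / weight for year, (total, weight) in sums.items()}

# Made-up example data, for illustration only.
credits = [
    Credit(1990, 8.8, 1), Credit(1995, 8.2, 1),
    Credit(2002, 5.9, 1), Credit(2004, 5.1, 2),
]
for year, score in sorted(career_trend(credits).items()):
    print(year, round(score, 2))
```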

Why go to all this effort? Because I love statistics, and this is unanalyzed data! It's like a mountain waiting to be climbed! Which brings me to the annoying part about statistics: getting the data. I can't stress how important getting the data is, but it's usually boring gruntwork. I guess I'd say that collecting the data is like climbing the mountain, while analyzing it is like skiing down it. Only on skis made of math!

So lacking a webcrawling doohickey capable of fetching data for me, I'm left analyzing statistics at my job. For example, I once worked in a call center, and like most soulless environments, we were compelled to compete for no reason, and were given a list of everyone's performance on a variety of factors. Out of sheer boredom (did I mention the soullessness?), I figured out, with the rudimentary tools on hand (i.e. the Windows calculator), that there were certain subtle but significant correlations between a person's average call duration, overall break time taken, overtime, and the likelihood that that person would be evaluated by management. This seems like a trivial bunch of numbers, but people lived and died by whether they were evaluated. Some people wanted to be evaluated badly (so they could jockey for the tiny number of slots you could be promoted into), while others wanted to be evaluated as rarely as possible (because they had total disregard for management and wanted to slack off as much as possible). When I told my coworkers how to increase/decrease their chances of being evaluated, they looked at me like I was clairvoyant. The truth is that I knew (a) how to get an average from a list of numbers, (b) how to draw plot dots on a piece of grid paper, and (c) how to connect dots. Not rocket science.
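(For the record, the same analysis today is a few lines of code rather than a calculator and graph paper. Here's a small Python sketch of the correlation step - the performance numbers are invented for illustration, not the call center's actual figures.)

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical rows: (avg call minutes, break minutes/day,
# overtime hours/week, evaluations that quarter).
rows = [
    (4.2, 35, 0.0, 0), (7.8, 22, 3.5, 2), (6.1, 30, 1.0, 1),
    (9.0, 18, 5.0, 3), (5.0, 40, 0.5, 0), (8.3, 25, 4.0, 2),
]
duration, breaks, overtime, evaluations = map(list, zip(*rows))

print("duration vs evaluations:", round(pearson(duration, evaluations), 2))
print("breaks   vs evaluations:", round(pearson(breaks, evaluations), 2))
print("overtime vs evaluations:", round(pearson(overtime, evaluations), 2))
```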

The moral of the story is: take a statistics class while you're in school. If your instructor isn't a moron, and you can learn to use the free, powerful, and somewhat intimidating stat app called R, you, too, can amaze your friends, outwit your enemies, and waste huge amounts of time learning the subtle relationships between things!

Not all math is fun. But statistics is, because it gives you knowledge, and knowledge is power.

Wednesday, July 06, 2005

Multiple Personalities

How many email addresses have you had? How many are still active?

If you're like most people I know, the answer is "several." It's not just email either - in the electronic world, people are willing to create new identities at the drop of a hat. Personally, I can count a minimum of seven distinct nicknames that I've used registering for forums, getting email addresses, or otherwise proclaiming my identity digitally. And my only real reason for having accumulated this many is that I've been using the internet for about 14 years, as opposed to (say) trying to appear to be multiple people.

It's a question of effort. Most forums are "soft" when it comes to verifying identity - they only require an email address distinct from all others previously registered. But with free email accounts being easy to open, anyone who can be bothered to spend a few minutes filling out online forms can get themselves a new address.

This didn't use to matter. Time was, the worst a person could do with an alternate identity was troll for flames (a persistent and juvenile problem we will forever be plagued with). Now, however, is the Time of Blogs. Sure, they've existed for a long time, but it's only in the last two or three years that a name has existed for them, and that they are being taken seriously as a medium. So I could, in principle, create some drone accounts for this here blog, and start posting under multiple names. If I were interested in spreading opinions and wanted to appear to be a plurality, it would be effortless to do so - and I could reserve my really nasty posts for "disposable" personalities.

Fortunately, I have no particular desire to sway the masses.

My point remains - it's no longer a method for 14-year-olds to converse with themselves in a chat room. Take today's freshly jailed New York Times reporter, jailed for refusing to reveal the source that leaked the name of a CIA operative. If journalists feel threatened by legal action, what's to keep them from publishing under pseudonyms, sending prosecutors off on wild goose chases? It wouldn't be difficult to create a skeletal identity who publishes scandalous (or illegal) reporting, then defer the blame to someone who doesn't exist.

With information being so easily spread, and current events blogs being so ideologically diverse, it's only a matter of time before this sort of face-changing results in immoral activity, such as slander or libel. People should demand forthright honesty from those who scrawl words into the internet's yielding carapace.

But then again, I never did say what my other aliases were, did I?

This is a Beginning

I am neither an exhibitionist nor a voyeur by nature. I've never understood the need some people have developed to publish their train of thought, nor the compulsion others have to vicariously ride those trains. And yet, here I am.

I think, for me, it's like a headstone. Ever since I was young, I've wanted a big, sturdy marker left after I die. Something made of stone, ideally someplace where the erosion and redevelopment possibilities are minimal. It's about leaving a mark. The irony is that my desire to make my mark is driven by my atheism. Immortality, I proudly proclaimed years ago, exists only in the form of the influence you have on the people around you, and on the eternally-diminishing ripples that extend from that influence. It's not about being remembered, it's about having existed. It's proof. It's like scratching your name into the rafters, where none will look. A speck among specks that carries your signature.

So now I've got this virtual headstone. It's not made of stone (in fact, its chances of outliving me are virtually nil, in my estimation), but it might cause some ripples. And I can apparently scratch away, chiseling to my heart's content.

It's a rare opportunity, to dance upon one's own tombstone.