Another story emerged this week about a presidential hopeful getting stung in the campaign's misinformed effort to exploit the social networking phenomenon.
This one involves John McCain, whose staffers set up a MySpace page that not only borrowed the design of Newsvine CEO Mike Davidson without attribution, but also embedded a menu image directly from Davidson's server, thereby eating away at his bandwidth with every visitor to McCain's page.
Davidson responded with an "immaculate hack," replacing the image they were using with another whose text has McCain seemingly reversing his stance on gay marriage. "No server but my own was touched and no laws were broken," he points out.
To me, the most interesting point of this episode is contained in Davidson's comment that "I think the idea of politicians setting up MySpace pages and pretending to actually use them is a bit disingenuous." That's a polite way of putting it. I'd use the words "cynical" and "embarrassing" instead.
It's embarrassing to witness clueless politicians and their aides trying to "connect" with the MySpace generation this way, betraying themselves as the dinosaurs they truly are by trying to act cool in a medium they don't understand.
It's cynical, in some cases other than this McCain episode, for politicians to decry these sites one day and then campaign on them the next. The Shifted Librarian recently commented on the ironic positions of two other presidential candidates, "Duncan Hunter of California and Ron Paul of Texas, both Republican members of the U.S. House of Representatives," who offered a bill that would have blocked student access to MySpace and similar sites on school and library computers. Yet both candidates -- or rather, their aides -- set up their own campaign pages there.
Politicians who don't grasp the culture of MySpace should stay off it.
Friday, March 30, 2007
Wednesday, March 28, 2007
I've returned from a short spring break between quarters at Columbus State, a week I devoted entirely to editing a final working draft of my novel and putting it into the mail to several potential agents.
Writing this one has been an interesting experience. I consciously decided to create a traditional novel, one with a linear plotline. It was the structure needed for this particular story. I haven't taught any literature courses for almost a decade, having turned my efforts instead to technical communication and hypertext. Returning now to the craft of fiction, I realize how far I've wandered from the traditional plotline's Aristotelian assumptions about the nature of reality and perception.
In the world of the standard novel, experiences occur through chronology and causation. Characters hasten through a corridor of moments, constantly progressing forward. With each step, however, the corridor grows narrower, as one event or one decision limits the range of possibilities for all subsequent actions or choices. It's essentially a tragic perspective, although the novel often provides a comic alternative by revealing at the end that the characters' own perceptions were flawed or that a secret force has been at work to counter and alter what seemed to be an inevitable outcome.
So I was intrigued, one day after completing the manuscript, to find an article in the New York Times about anthropologist Mary Douglas' new analysis of the "ring composition" in narrative. Douglas concentrates on literary works and sacred texts characterized by their "lack of structure, repetition and episodic incoherence." The article's author, Edward Rothstein, mentions the Book of Numbers, Persian poetry, epics and unconventional novels like Tristram Shandy as classic examples. Douglas believes these works are organized according to a principle of experience very different from the Western sense of narrative, one closer to lived experience, in which we make sense of events only through dawning realizations:
At first one event follows another. We may not be entirely sure where it is going. Is there a point at all? Then, with declarative emphasis comes the turning, where, with a shock, we hear a first echo. We connect these different moments; a pattern begins to take shape. Then, step by step, other similarities are heard — they too take on meaning — moving backward from the most recent to the earliest in time, until we return to where we began. This kind of narrative needs to be heard again, for it is only in the retelling that the full nature of its order is revealed.
One point I derive from the article is that we have legitimate alternatives to the Aristotelian aesthetic of linear narrative, with its chronology and causation. The ring composition is one of those. Hypertext is another, since it operates through simultaneity and association, through an open space rather than a corridor, with options and alternatives opening with each act or decision rather than closing. Hypertext fiction is still in its infancy, hampered, I believe, by the absence of tools that would allow the full realization of its potential. But its comic aesthetic is promising, nonetheless.
Tuesday, March 20, 2007
The Internet is creating an entire generation of narcissists. That, at least, is the conclusion of a professor at San Diego State University. But is this science, or just another round of generational stereotyping?
Eric Gwinn of the Chicago Tribune reports on the findings of Jean Twenge, who maintains that "Young people born after 1982 are the most narcissistic generation in recent history." We know this because, among other transgressions, they're flaunting themselves shamelessly on MySpace and YouTube.
The bulk of Gwinn's piece concerns the dangers of young people divulging personal information without safeguarding themselves, which is a legitimate concern. But the scholarly commentary that frames the issue consists of a very old puritanical theme, that every new medium or new art form releases the latent vices of whatever generation embraces it. It's the same argument, dating back to the 1950s, that claimed rock music had made my generation shockingly libidinous, and that watching Howdy Doody on television turned us into fuzzy-thinking Marxists. It's the same argument that blamed MTV for spawning a generation of iconoclastic hedonists, the likes of which the world had never seen. It's the same argument, still current today, that claims video games are responsible for youth violence, in the schools and on the streets, which had never been a problem before. So why shouldn't MySpace turn the young people of today into narcissists?
Besides being puritanical, the argument simply misunderstands the medium it's criticizing. YouTube and MySpace, two of the social networking sites that Twenge singles out, are part of the Web 2.0 development of the Internet, which encourages user participation and user creation of original content. In each case, the "content" consists of text and images about the users themselves -- but in the construction of a social network, not as an isolated platform. An individual "all about me" website, unlinked to anything outside itself, would be narcissism: the narcissist alone in a room, holding a mirror up to the self. MySpace is an open party, where each guest arrives in an interesting outfit and strikes a pose to draw attention.
Social networking online is not much different from networking in person. It involves the conscious creation of a public image, a persona, selecting aspects of the personality, some heightened and others downplayed. The persona isn't the true, full personality, but rather a somewhat artificial projection. The healthy, integrated individual recognizes it as such, and doesn't confuse the projection with the core personality, the construct "out there" with reality.
In addition, educator Andy Carvin has pointed out how Twenge misconstrues the "ethos" involved:
She also makes too much of the fact that some of these tools have brand names that embrace the first-person, such as MySpace and YouTube. Twenge equates these tools with being “all about me.” They are about me, but not in the way she thinks they are. The vast majority of people who use social networking sites aren’t in on it to become famous and have hordes of adoring fans. Sure, some people are there for vanity or proto-celebrity purposes, but most people are there for us, not me. They’re communities where people come together to find each other and bond over likeminded interests. They’re communities where people reinforce interpersonal relationships through sharing and creating content. The names MySpace and YouTube are merely references to the fact that they’re an experience built around your interests and creative abilities - and the others who share those interests and abilities. Just as Time Magazine botched it when they declared “you” as person of the year, Twenge misunderstands the ethos of social media, not recognizing that users of social media do it because they care about the notion of “us” and want to be a part of something bigger than themselves.
Posted by Douglas Gray at 3:20 PM
Saturday, March 17, 2007
French postmodernist Jean Baudrillard, who among other achievements extended McLuhan's critique of media as a controller of perception, died March 7, 2007. Buried deep in evaluating final projects and giving exams at Columbus State, I wasn't able to pay tribute to him at the time of his passing.
I was and continue to be an admirer of his thought, although I found his actual writing to be almost impenetrable. In the mid-80s, a graduate student friend at Ohio State, very up-to-date in postmodern theory, lent me his translation of Baudrillard’s 1972 work For a Critique of the Political Economy of the Sign. I did my best to get through it, but found myself frustrated enough at one point to throw the book against a wall, so it could share my pain.
His 1988 work The Ecstasy of Communication brought me similar grief. But I understand enough in the abstract to recognize the importance of his thoughts on the seductive power of media to substitute a modeled hyperreality for physical reality.
Baudrillard used the term "simulacra," from Plato, to describe an historical progression of how art and media have created different types of "copies" of nature or reality: from representations of things, to idealizations, then on to mass-produced mechanical reproductions, ending in our own time of electronic, digital simulations of things. This is the age of hyperreality, of virtual reality creating what he called the "desert of the real."
A challenging, controversial figure. I have a dream of someday returning to his work, when I'm older and much more patient, when I have time to savor his depths and the intricacy of his style. That won't be any time soon, though.
For an excellent overview of Baudrillard's work, check this site from the University of Western Ontario.
Posted by Douglas Gray at 11:12 AM
Friday, March 16, 2007
Back in 2003, when the Chronicle of Higher Education wondered to itself whether blogging would go the way of the CB radio, the question was already a cliché. It's turned out to be a cliché with impressive staying power, since people are still posing that question online today, as if it were some sort of novel idea.
What irritates me about the question isn't its relevance, but its technological and cultural snob appeal. It assumes that CB was a perfectly serviceable tool, appropriately used by taxi drivers and truckers, until uneducated masses of enthusiasts swarmed upon it, like flies on the carcass of a wildebeest. Americans filled the airwaves with a babble of pointless, inane chatter, before suddenly abandoning all their radios in the dumpsters of interstate rest stops.
Blogging, the suggestion goes, might prove to be just another silly fad, like CB. Everyone's blogging now, but soon we'll all repent our foolishness and feel embarrassed about those blogs we stuffed with our inanities.
I'm disinclined to be dismissive. I suspect that the CB radio craze was a signature event in American culture of the 1970s. The fact that it was short-lived doesn't diminish its significance. Perhaps by attempting seriously to understand the appeal of the CB, we can better understand the current blogging phenomenon and make clearer predictions about its future.
One problem with the CB radio phenomenon is that it dates to the 1970s, a decade that was hard to take seriously even while we were living it. Citizens band channels had been available to the populace since the 1940s, but as late as the 1960s they were used only commercially, by radio dispatchers and cab companies. Transistor technology eventually made the hardware affordable, and the FCC opened additional channels to the public. But improvements in the technology didn't themselves create the sudden, unprecedented demand for these radios.
The root cause of the CB craze was, in fact, a political event: the passage of the universally despised national speed limit of 55 mph. In early 1974, an act of Congress turned every American motorist into a potential outlaw. The CB became the most essential weapon Americans had for combating the federal government's attempted curtailment of our right to drive as fast as we chose and squander as much gasoline as we could afford to.
Used at first by interstate truckers, for the practical purpose of outwitting the highway patrols, the CB became a romantic symbol of the era. Truckers themselves, usually liminal figures in the national mythology, enjoyed a short period of glory, replacing the cowboy as the symbol of independence and rebelliousness. Two major stars of the time, Clint Eastwood and Burt Reynolds, played truckers in some of the biggest movies of their careers. A 1975 song titled "Convoy" by C.W. McCall made its way to the top of the charts around the world.
The 1970s was the decade of the road. For those who weren't around at the time, it's hard to grasp the sense of discontent, restlessness and dislocation that characterized the era.
Hitting the road had become a central theme of popular culture even before the national speed limit. As early as 1971, Carole King was lamenting, "One more song about moving along the highway / Can't say much of anything that's new."
Maybe it was the national hangover we were suffering after the turbulent, assassination-filled 1960s. Maybe it was weariness over our mounting losses in Vietnam and disillusionment with the political process that eventually culminated in the Watergate scandal. Maybe it was a series of recessions and a sense that the country was losing its competitive edge internationally. Maybe it was the fact that, as McLuhan once noted, societies where social mobility is diminished just naturally become nomadic.
Whatever the reason (or, more likely, combination of reasons), we were suddenly a country of Jack Kerouacs, all of us on the road searching for something we couldn't define. The CB radio filled a growing void. We'd lost our old communal ties, and required a tool to build new social networks on the highways. We were lonely. But more than that, we were yearning to reconnect with a unique American character we felt we'd lost. The CB wasn't just a radio: it was a stage where people developed complex, exaggerated, often comic personas for themselves. The airwaves were all at once filled with distinctive characters, sometimes almost mythic figures, joining in a great Whitmanesque chorus of American voices.
And then the craze ended, as suddenly and (at least on the surface) as inexplicably as it had begun. But the end, I think, can also be linked to a political event: the 1980 presidential election, when Ronald Reagan succeeded Jimmy Carter in office.
Everyone remembers Jimmy Carter as the president of "malaise," even though he did not actually use the word in the infamous address when he attempted, more frankly than the American public was comfortable hearing, to diagnose the nation's ills, its failure of nerve and loss of self-confidence. The word has nevertheless attached itself to the Carter administration -- unfairly so, because the whole decade was a period of malaise, not simply the four years he served as President. Ronald Reagan was not a great President, whatever the conservatives' nostalgia for him would claim, but he was the cure for the malaise that ailed us at the time. He restored Americans' belief in themselves and in the future.
His voice unified the country, reunited us with our communities and with our sense of self. By the early 1980s the CB craze was already well over, though the national speed limit it arose to defeat wasn't fully repealed until 1995.
The CB served a social, political and emotional need of its time. Once those needs were satisfied or could be met in other ways, the craze ended. Blogging may also be a patch on some contemporary wound to our collective psyche, some ache that can only be soothed by spinning new social networks where we can proclaim and celebrate our individualities, while also solacing ourselves in the comfort of the crowd. History has shown that when we suffer the curse of living in "interesting" times, we develop interesting media to help us cope.
Tuesday, March 6, 2007
Here's a follow-up on comments earlier this week about the value of a paper trail: Diebold -- infamous for its paperless voting machines -- is lamenting the damage done to its image as a manufacturer of fine safes and teller machines.
As a citizen of a state (Ohio) that landed in the Bush column through voting fraud during the 2004 elections, my heart bleeds for a company whose chairman Wally O'Dell promised to deliver our Electoral College votes to the White House resident.
For more specifics, check this story at Black Box Voting.
Sunday, March 4, 2007
Consumer groups and state attorneys general are looking into allegations that Best Buy sales associates have sometimes used a corporate intranet site, a look-alike of their public website, to deceive customers looking for sales that were promoted online.
The news has generated a great deal of interesting discussion, much of it in support of Best Buy's ethics, at Slashdot and at TechDirt. My goal here is neither to criticize nor to defend Best Buy, but to look at a deeper issue: the unreliability of web documents.
If I were ever to shop at Best Buy (something I studiously avoid doing anyway) and bring a printed flier from the Sunday paper's supplement with me, it's unlikely the adolescent sales associate would whip out a rival flier listing the price of the plasma display at $995 instead of $855.
A printed document is an objective reality, a "thing" in the original sense of that word -- something two separate minds can meet over and whose existence they can agree on. The text on the page cannot be edited, censored or deleted without leaving a mark of the alteration.
In the incidents under investigation, the associates produced an alternative web page with different prices, a page with greater claim to accuracy because it was, supposedly, more recent than the one the shopper had seen at home. There was no physical record of a change, or even evidence that this wasn't the same page the shopper had read earlier.
Hypertext is mutable, quicksilver, evanescent. The short history of the Web is already full of cases where a corporation or government agency posted information online that later proved embarrassing to it and that was hastily deleted, with denials that the material had ever existed at all.
Print is tangible, permanent, agreed upon. It points an accusing finger at whoever dares tamper with its meaning or its existence. That's why, whenever important matters are at stake, we need the assurance of ink and paper, rather than hypertext, to defend the truth.
Back in 1973, a rumor spread through the little southern college town where I lived that the United States had only one week's supply of toilet paper remaining. The townspeople descended on the bewildered manager and staff of the local Jitney Jungle, exhausting the stock of four-ply in about half an hour.
It was a small, tight-knit community where reports -- especially false ones -- spread with amazing speed, and seemed to gain greater credibility as they grew more outlandish. The Internet has many of the social dynamics of a village, which makes it a vital breeding ground for rumors.
Red Orbit is reporting on the efforts by several corporations (including McDonald's, Procter & Gamble, Coca-Cola, and Starbucks) to address false reports about their products and policies, mostly by way of their own websites.
In most cases, it's impossible to trace a rumor back to its original source. Recently, though, Starbucks was able to identify an email from a Marine sergeant in Iraq as the cause of a widely-spread tale that Starbucks refuses to support American troops overseas.
Saturday, March 3, 2007
Tuesday, after the assassination attempt on Vice President Cheney in Afghanistan, a few anonymous writers posted comments on the Huffington Post lamenting that the attempt failed.
Site administrators spotted the abusive remarks and immediately deleted them.
Rude, inappropriate commentary is part of the price for the internet's freedom of speech, and Huffington Post acted correctly in removing it as soon as it appeared. However, the right-wing spin machine seized on the incident and characterized the episode as proof that all liberals are "people who would celebrate a successful attack on the life of the vice president."
As Arianna Huffington points out in her trenchant response to charges from Rush Limbaugh, Sean Hannity, Dean Barnett and others, hate speech occurs on both extremes of the political spectrum, and finds platforms where it can express itself all across the Internet.
For the right to characterize it as the exclusive expression of the left is especially hypocritical, in light of odious pundits like Ann Coulter, who has "humorously" called for the murder of Bill Clinton and Justice Stevens.
Talk on the Internet can be extraordinarily vulgar, violent and inexcusable. Talk on the mainstream media can be equally so. The difference is that the anonymous posters lack the infrastructure of major publishers and cable networks to legitimize their hatred.
Friday, March 2, 2007
Tim Berners-Lee testified yesterday before a U.S. House panel on "The Future of the World Wide Web," presenting a persuasive case for maintaining a "nondiscriminatory Internet." (See full account at C/Net News.)
While refusing to back any specific bill on Net neutrality, he clearly opposed the concept of Digital Rights Management prioritization, and sparred with Representative Mary Bono of California over the issue.
What would Sonny think?
Two bright lights of my world have been extinguished this winter. The first was Molly Ivins, who passed away in January. The second is Arthur Schlesinger, Jr., who died February 28.
Arianna Huffington has posted a tribute to him on the Huffington Post, recalling how Schlesinger was one of the first notables she approached to serve as a blogger on her new site:
"What is a blog?" he asked. "And what is blogging?"
So in this bastion of the Old Guard, I found myself explaining to a man who didn't do e-mail, and who considered his fax machine a revolutionary way to communicate, what blogging is. Of course, he got it instantly -- and almost as quickly agreed. With one proviso: "Can I fax you my blogs?" he said.
One of the qualities of the intellectual, and of the liberal mind, is this ability to grasp new forms of communication and this willingness to experiment with them.
Posted by Douglas Gray at 8:08 AM