I still can't bring myself to say "tweet" with a straight face.
Of all the emasculating words most grown men would prefer not to utter, "tweet" falls somewhere between "frolic" and "frisky." Or "twubs" and "twitturly," to name two cousins of tweet.
For anyone still confused, to tweet is to post a 140-character message via the Web site Twitter. It's best described as a brief text message that goes out to 10, 100 or 1,000 people at once. The range of content runs from, well, get a Twitter account and you'll see for yourself.
Some log their daily doings, from bathing Chihuahuas to fixing ramen noodles. Others post links to stories and videos. And a few put up nothing themselves but follow closely those who do.
For certain demographics — students, media, anyone under the age of 30 — tweeting is inescapable. My friends tweet and seem peeved if I don't keep up. Newspapers and TV stations, including the Columbia Missourian, now tweet their stories. My group project for a research class I'm taking is — you guessed it — analyzing the effects of tweets. A course in online video that I took earlier this summer required tweeting from day one.
A professor of mine last semester joked that he liked to tweet with himself. I used to think text messages were solely for 12-year-olds and that MySpace accounts were for indie bands and child molesters. Since becoming a student again, however, I've joined the tech-savvy masses who latch on to every computer application that can be morphed into a verb. As in, "Do you Facebook ... ?" "Do you bing ... ?" "Do you tweet ... ?"
Many celebrities tweet, from Ashton Kutcher to Margaret Cho, and I confess to following at least a few of them. President Obama tweets, as do other politicians, either directly or through their staffers. Tweeting is already included in many a job description.
Before hyping tweets as the next big breakthrough, however, it's worth noting that fewer than half of new users keep their accounts after the first month. The first few Twitter reports I heard leaned toward the negative. Hip-hop artist Kanye West threw a tantrum because someone was impersonating him with tweets; Holly Robinson Peete was recently criticized for tweeting flippant remarks about the death of former NFL player Steve McNair.
So far the most worthy tweets have come from Iran, of all places, where a disputed election spurred protesters to tweet as a means of circumventing state-controlled media; and from the death of Michael Jackson, when the rush to get out information meant many followers first learned the news via tweets.
Who could have seen those coming? At MU I hear a pep talk every so often lauding Twitter and other high-tech tools as the wave of the future, while making excuses for the fact that few have managed to make a dime from tweeting, blogging or anything else in the same food group. Or, to put it in grad-school speak, tweeting is not yet a "profit-driven model."
Twitter might eventually prove its staying power, or it might fade away in favor of a newer, spiffier fad. It doesn't take a schoolmarm to point out that you shouldn't tweet anything you wouldn't want splattered across the front page of a newspaper, since that's pretty much what tweeting is. Last week I tweeted a story I wrote. To my surprise, the one person who tweeted back to say she enjoyed it doesn't read newspapers, or much other media for that matter. But she tweets.
Brian Jarvis is a journalism graduate student at MU.