Where’s the Beef?

January 19, 2009

Mid-90’s hip-hop has seen more media play in the last two weeks than it has in the last ten years thanks to the release of Notorious, a biopic about the life of Notorious B.I.G.  Biggie’s short life makes for an incomparable script: a stratospheric rise from crack dealer to rap hero, and an instant collapse from glory when he was slain on the streets of Los Angeles.

Biggie was killed in March 1997, six months after Tupac Shakur was killed in Las Vegas.  The two waged an acrimonious war of lyrics and, it stands to reason, bullets, as their East and West Coast stables fought for rap supremacy, each side destroying the other in the most Pyrrhic of victories.

It could be the day that changed hip-hop forever.

Three months after Biggie was shot, Puff Daddy released “I’ll Be Missing You” and the rap world was forever viewed through a more saccharine lens.  Gone were Pac’s lines like “Lil’ Caesar, go ask ya homie how I leave ya / cut your young ass up, leave you in pieces, now be deceased.”

Instead, Sting performed live on a #1 rap single.  Sting.

The late 90’s were filled with broke-ass rappers like Ja Rule and DMX, and in the early 2000’s, Ludacris and Nelly entertained across race, age and gender, but didn’t really gall anyone.  Nobody’s going to shoot someone over “I’ve got hoes / in different area codes,” and Nelly’s hit about how hot it is in the club won’t offend anyone on the other coast.

Then, a few years later, everything changed.  Rap evolved much as the internet did: it became open-source and collaborative.

In contrast to the mid-90’s, when rap’s two biggest stars, Biggie and Tupac, wouldn’t whisper a positive word about each other, the five titans of hip-hop have spent the past two years doing nothing but working together, bragging about one another, boosting each other’s bankrolls and hustling to push hip-hop back into the mainstream after a decade off.

Kanye West, Lil’ Wayne, Jay-Z, T.I. and Young Jeezy.  Each has put out a chart-topping record in the last two or three years and worked with another on a hit or three.  Four of them even teamed up to release one (admittedly fucking terrible) song recently.

Moreover, each is developing an independent character.  Lil’ Wayne is the playboy, Kanye the emotive poet, Jay-Z the mogul, Jeezy the social philosopher and T.I. the felon.  The music is critically acclaimed.  Lil Wayne’s Tha Carter III is nominated for Album of the Year at the Grammys, next to … Radiohead and Robert Plant.

There’s still a little hardass here, though; T.I. is going to jail for a year come March for possession of unlicensed machine guns.  That’s pretty bad.  But it’s just screaming for the album of the decade to drop when he gets out.

But there are no more Congressional hearings about hip-hop.  In 1993, hip-hop’s most recognizable businessman was Suge Knight, the Jeff Skilling of hip-hop.  Now, it is Jay-Z, who owns a stake in the New Jersey Nets and, until the market collapsed, had plans to redevelop an entire neighborhood in Brooklyn.

Kanye West designs his own Louis Vuitton sneakers, and is generally seen as the best-dressed man in entertainment.

In the mid-90’s, rap’s two stars decided to kill each other.

P.S. Juicy J and DJ Paul of Three 6 Mafia won an Oscar.

Gaming and Google

January 6, 2009

The ad industry is a fun one.  We think creatively, dress casually, have inventive office spaces and our companies throw pretty good parties.  Like a lot of professions, we have our own gossip blog (AgencySpy).  Its writers are good at more than just industry smut (and lately, there has been some skin): they also nicely summarize interesting articles, and I often click through to the originals.  Most of the time I skip the links on a blog, but today’s was a good one.

Nielsen Report Shows Gaming Trends of 2008

Despite having over 20% higher household penetration, the Wii accounts for a smaller share of time spent gaming (13.4%) than the Xbox 360 (17.2%).  This should come as no surprise: the Wii is built with groups in mind, and its games are collaborative and interactive.  They are also played in bursts; Wii Sports is meant to be played in ten- to twenty-minute sessions.  The Xbox is arguably the superior single-player console, and its sales are driven by games like Halo and Call of Duty, all meant to be played alone or, if with other people, online.  These are immersive, creative experiences.  I don’t play video games save for Dr. Mario on my Game Boy, but I understand why enthusiasts are so loyal to their Xbox.
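To make the penetration-versus-usage point concrete, here’s a quick back-of-the-envelope calculation in Python.  Only the 13.4% and 17.2% shares come from the Nielsen report; the penetration figures below are illustrative assumptions I’ve picked to match the “over 20% higher” claim.

```python
# Sketch: higher household penetration plus a lower share of total gaming
# time implies less play per console.  The share figures are Nielsen's;
# the penetration figures are assumed purely for illustration.

wii_share = 0.134           # Wii's share of total time spent gaming (Nielsen)
xbox_share = 0.172          # Xbox 360's share (Nielsen)

wii_penetration = 0.120     # assumed fraction of households with a Wii
xbox_penetration = 0.098    # assumed; puts the Wii ~22% higher

# Gaming time per console-owning household, in arbitrary units:
wii_per_household = wii_share / wii_penetration       # ~1.12
xbox_per_household = xbox_share / xbox_penetration    # ~1.76

print(f"Wii:  {wii_per_household:.2f} units per owning household")
print(f"Xbox: {xbox_per_household:.2f} units per owning household")
# Under these assumptions the Xbox gets roughly 57% more play per owning
# household, which fits the burst-session Wii vs. immersive Xbox story.
```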

And it was straying from these enthusiasts and taking a risky business leap that brought Nintendo back to the head of the gaming class after arguably losing the last two console wars.  I’ve recently begun to think of companies like Nintendo, Apple and Google not as software companies but as engineering firms.  This is where the world’s smartest minds head: our brightest programmers and innovators.  In decades to come, the latter two will be known the way we came to know GE and IBM.

In particular, I think Google has the potential to redefine a number of categories that have seen little innovation of late.  Perhaps a gloater’s disclosure, but I have been considering investing in Google.  Like many, I’ve been looking to buy while the market is this far down and to make some long-term commitments.  Google has a near-monopoly on information.  It has replaced dictionaries, encyclopedias, maps, satellites and libraries.

Would it be any surprise if Google, in five years, created five private schools?  Or if, in ten, it developed the nation’s first new, remarkable university?  With its reign over the information marketplace, Google could change the world, and do it for the better.

Twitter: Ruining Communication

December 18, 2008

I have no data to support this, but four to six months ago, Twitter achieved critical mass.

For those unaware, Twitter is a service through which individuals can dispatch messages consisting of no more than 140 characters to users who follow their updates.  You can consider and interpret its purpose in many ways, but that idea is Twitter’s beating heart.
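As a minimal sketch of that one rule (the 140-character cap was Twitter’s real limit; the helper functions are hypothetical, not part of any Twitter API):

```python
TWEET_LIMIT = 140  # Twitter's per-message cap

def fits_in_a_tweet(message: str) -> bool:
    """True if the message respects the 140-character cap."""
    return len(message) <= TWEET_LIMIT

def clip_to_tweet(message: str) -> str:
    """Clip an over-long message, reserving one character for an ellipsis."""
    if fits_in_a_tweet(message):
        return message
    return message[:TWEET_LIMIT - 1] + "…"

print(fits_in_a_tweet("At the BU pub"))  # True, with 127 characters to spare
```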

Twitter launched sometime in 2006 and I was first introduced to it in early 2007 by a friend of mine who has always been quick to adopt technology and recognize valuable innovations.  My first interaction with Twitter was solely through SMS; my phone would sporadically chime when Ed sent a “tweet,” or an update about his whereabouts, disposition, whatever.  “At the BU pub,” it might read.

I assume Twitter had a web interface at that time, as well, but I didn’t use it.  I briefly followed a few people but quickly tired of my phone buzzing with information that I didn’t care about.  So, for the most part, I ignored the service until two months ago, when its call became too loud to ignore.  Turns out I wasn’t alone.  According to its Wikipedia entry, Twitter had “well over 5 million visitors in September 2008 which was a fivefold increase in a month.”

I signed up for Twitter as @YouIntern and began “tweeting” updates about our website (www.youintern.com).  Given that I work on a startup on the side and am employed full-time at a marketing communications agency, I naturally began following various technologists, marketing strategists, brand thinkers, and other people who classify themselves with ambiguous descriptions such as “thinking and breathing in the social brand sphere” and “uniting consumers and connecting brands to empower social media benevolence” or whatever the phrase of the day is.

And after a two or three week honeymoon period, I’ve realized that Twitter might be the most deleterious communication “innovation” in years.  Here’s why:

1.  Self-aggrandizement

Twitter is really all about one person: me.  It’s about what I have to say, what I want to say to you, and how much I can cram into 140 characters that will grab someone’s attention.  It is the most superficial of the social networks, and that’s saying something, given that we mainly use Facebook to prove how artsy or muscular or educated we are.

Almost all of my tweets are about my website.  When someone follows us, I hope they’re following YouIntern for real-time updates about internships for which they can apply.  But if they’re not, I want them to see how we’re progressing and hope they see success.

2.  There’s not much to say in 140 characters

One of my favorite Twitterers is Shaquille O’Neal.  Shaq mostly tweets funny quotes and, horribly misspelled as they may be, they’re usually entertaining.

But most of us can’t say much with so little, and truncating our opinions into tiny soundbites is another example of our devolution online.  I’m guilty of this.  I find myself skimming pages rather than absorbing them, because the web and its tsunami of information has trained me to pick up little pieces during my travels.  That was our reading; this is our writing.

3.  I don’t follow the right people, and the right people don’t follow me

Our objective on Twitter is to shout out updates and new jobs we’ve posted.  Our target audience should be college students looking for internships who want real-time updates when new ones are available.  Instead, we’re mostly followed by friends and techies.  I almost hope they don’t read our tweets, since the tweets are of no relevance to them; I don’t read much of what they write, and I follow most of them out of e-courtesy.

Certain Twitter users follow hundreds, even thousands, of other users.  I see messages on my feed from users who get back from meetings and are astonished at the 300+ tweets they need to read.  At a max of 140 characters each, that’s 42,000 characters.  By comparison, Thomas Friedman’s editorial today in the New York Times is just north of 5,000 characters.  Now, that might be the most pretentious sentence in the history of the internet, and no, I didn’t read the editorial; I read my Twitter page instead, which is something I should stop doing.  Dangerous, no?
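The arithmetic there, worked out (the 5,000-character figure is my rough count of the Friedman column mentioned above):

```python
# Worked version of the comparison above.
tweets_waiting = 300
chars_per_tweet = 140                        # the per-tweet maximum

worst_case_backlog = tweets_waiting * chars_per_tweet
print(worst_case_backlog)                    # 42000 characters

friedman_column = 5000                       # rough length of the editorial
print(worst_case_backlog / friedman_column)  # 8.4 columns' worth of tweets
```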

4.  Links

Of the 20 most recent tweets that my Twitter page displays, 11 have links they want me to visit.  That’s right: 55% of the updates on my page want to direct my attention elsewhere after I read their note.

One user I follow, an industry connection, sent seven consecutive tweets with her company’s URL within 15 minutes.

I don’t have anything prescient to say about this.  I just don’t get it.  Blogging and micro-blogging (as Twitter is known) have an unhealthy obsession with linking readers to other sources.  A blog full of links looks more credible than one without, but it probably features less opinion and more summary.

A service like Twitter is bound to evolve through its users, and it has become a massive link exchange.  I don’t know who has time to read all the links sent along by friends or followers.  I don’t click any of them and I sincerely doubt that most people do.  In fact, I think the people who include links in their tweets don’t click other users’ links either.

I think Twitter has endless possibility.  I think Yammer is a fantastic tool that can replace quick inter-office email and keep groups coordinated.  I like how brands such as JetBlue and WholeFoods seem human on Twitter.  And it’s worthwhile to connect with smart people, whether they be my friend Eugene or the CEO of Zappos.

But through social media, we’ve clipped our words and opinions.  These are things we ought to value; sometimes they are our only creative or provocative outlets.  As the social media Concorde flies ever faster, I think we should pause and consider.  Speed does not always equal progress.