
Like many journalists, I hang out on Twitter. At its essence, it’s a broadcast medium where everyone has his or her own TV show, and you can watch an almost endless stream of other people’s shows. Forget that Andy Warhol quote that “in the future everyone will be famous for 15 minutes.” He said that in the ’60s. Today is the future, and on Twitter everyone can be famous for 15 seconds, over and over again.

Is there a word yet for tweeting while watching TV? (Teletweeting? Tvitting?) If not, there should be, because it’s become a popular pastime. Twitter was the place to be during last year’s Democratic and Republican conventions, and the Oscars are more fun with the almost endless stream of snark your “tweeps” can provide. But since Twitter is all about who you follow and who follows you within your little viral cluster, it doesn’t necessarily reflect the world at large, despite what a recent study claims.

Nielsen, which has partnered with Twitter, conducted this study by analyzing more than 200 episodes of prime-time shows on major networks, and found that Twitter chatter significantly increased ratings nearly a third of the time. It worked the other way, too: a show’s popularity resulted in more tweets about half the time.

This seems as obvious as another first-of-its-kind study from more than 15 years ago. Just before the opening bell of the 1996 Mike Tyson-Frank Bruno fight, Showtime’s announcers offered a bonus to the pay-per-view audience: get on the web and score the fight along with the judges. Within 30 seconds, traffic to the site was so heavy that the server was knocked offline.

This was the era of dial-up modems and super-slow connections, when the Web was barely six years old. Showtime execs were astounded. They knew it would have been impossible to get on the web that fast if your computer was cold and had to be booted up, or if it was in another room.

This study found that nearly half of U.S. households that had both a TV and a personal computer kept them in the same room, and that 40 percent of PC households with Internet access regularly watched TV while they were online. The numbers were higher for premium subscribers – those who paid extra for channels like Showtime or HBO: 60 percent of them stashed their computers and TVs in the same room, and nearly half said they used their TVs to keep them company while they surfed the web.

The Showtime study, long since forgotten, was a much better indicator of the connection between TV and online behavior than this new Twitter study is, and the difference has to do with the social-networking aspect of Twitter. Because our Twitter experience is shaped by who we follow and who follows us, this social amalgam tends to reflect us. In other words, your Twitter experience may not reflect what’s happening in the real world.

Being on Twitter is a bit like living in a community. If you live in, say, the East Village of Manhattan, you may not know many Republicans. In fact, if you never left your liberal bubble, you might think no one would vote Republican. The same holds true, of course, if you live in a Texas town with a Tea Party congressman and mayor.

As it relates to TV ratings, this was borne out by the campy made-for-cable movie “Sharknado.” A few weeks ago, out of nowhere, my tweet stream was filled with references to sharks being flung through the air and gobbling up hapless victims. It was a torrential downpour of tweets, and it seemed like everyone was watching. Yet the Syfy movie registered only a slight bump over its usual ratings for a Thursday night. As Brian Stelter reported in The New York Times, “Nielsen’s quarter-hour ratings suggested that a few hundred thousand viewers turned on ‘Sharknado’ after it started at 9 p.m., perhaps because they saw friends or celebrities joking about the movie online.”

So which is it, Nielsen? Does Twitter activity increase TV ratings or not? It makes sense that it would, but we can also see that sometimes it doesn’t. Unlike most movies and TV shows, however, this story has no cut-and-dried ending.

Image via HLNtv