Every few years a new (sometimes old) phenomenon that adherents claim will save journalism achieves meme status.
I was reminded of this by the news that AOL’s Tim Armstrong was putting the hyperlocal network Patch on the back burner, which most commentators have taken as a death knell. Patch has lost a couple hundred million dollars over the years, maybe even more, according to The New York Times’ David Carr, who in a recent column cast Patch as Moby Dick and Armstrong as Captain Ahab. AOL has already laid off about 300 Patch staffers, although the company, in crisis PR mode, claims the hyperlocal enterprise is simply taking a break while it looks for potential partners.
Frankly, I never saw the appeal of hyperlocal as a sustainable business model. The idea, you see, was that readers would care more about the grand opening of a new bakery on their street, or a mugging or burglary on the next block, than they would about Iran's nuclear program or the Edward Snowden leaks. Location would trump reader interest, and hyperlocal would combine the best of what the Web offers, its powerful community building, with our innate curiosity about what our neighbors are doing. Except, as Armstrong and others found, there isn't enough money in this local ad play to sustain a business, and much of the content was pedestrian (although to be fair, in times of emergency, like Hurricane Sandy, some of it was quite good).
Hyperlocal has been but one meme over the past decade or so that was going to rescue the news business. Before hyperlocal, there was blogging, which was viewed in some circles as the most important innovation to hit news since Gutenberg's press. In 2004, The New York Times Magazine put Ana Marie Cox of Wonkette fame on its cover, heralding this new age of journalism.
But blogs, as we know, haven't taken over the so-called "mainstream" (or as Sarah Palin calls it, "lamestream") media. Musty old media companies like The New York Times simply co-opted blogging. Meanwhile, blogs became more like the old media companies they loved to criticize, trading their by-the-seat-of-the-pants look and feel for a more professional one, the better to attract advertisers.
As a corollary, remember about eight years back when reporters were going to tote video cameras, still cameras, and audio recorders to post 360-degree content on their own blogs? We'd all be one-man bands, writing our own breaking news and adding video, photos, and graphics on the fly with our trusty portable gadgets. Eventually we wouldn't need news organizations, since we'd become our own brands and operate our own ad servers and the like. It would be the ultimate disintermediation.
Alas, reality has a nasty habit of tossing cold water on such fantasies. Editing video, shooting quality photos, and creating powerful audio segments take special skills, on top of the already considerable demands of reporting and writing. It's difficult enough to report, write, and edit stories; requiring reporters to also become expert at video, still photography, and graphics failed to recognize how hard it is to become proficient at any one of them.
Around the same time, citizen journalism came to the fore. This meant that everyone could be a reporter. Why not, right? It's not like you need to pass a test to join the fraternity of journalists. The idea picked up steam with the Indian Ocean tsunami and the Madrid bombings in 2004, and the London terror attacks in 2005, where many of the most powerful images came from people who were there. Since everyone could be a news reporter, there would be little need for professionally trained information gatherers, who couldn't possibly be everywhere news breaks.
As social media platforms like Facebook and Twitter became ingrained in our lives, news would be amplified without the need for irritating filters like editors, so the thinking went. It culminated with CNN anchors reading tweets over the air from unknown Egyptians and Iranians in the midst of the Arab Spring. Unfortunately, there was no way to vet their veracity. The thinking also ignored the fact that many of us who practice journalism have attained hard-earned skills. Just because I've watched someone work as a bricklayer doesn't mean you'd want me building your house.
Recently, two other "save journalism" memes have gained popularity. One is that journalists (and everybody else) must learn to code. "It's the hottest skill on the job market, the modern-day language of creativity," and "it should be taught to kids in school instead of cursive writing, which no one needs." The other is the rise of data journalism, where reporters comb through huge troves of data in search of information that can lead to a story.
One problem with journalists learning how to code is that programming, like video and photography, requires great dedication and years of practice. What's more, which language or framework should you take up? Ruby on Rails? Django? HTML? CSS? All of them? By the time you become well versed in one, you might find a new one is now hot.
As for data-driven journalism, these kinds of stories take time, perhaps months, and only the big news organizations can afford to employ a staff of data journalists to publish an in-depth story every few months. Some data journalists will be able to write their own tickets, but I don't see how most news companies could afford even one on staff.
Each of these memes was going to transform traditional journalism: hyperlocal news, blogging, citizen journalism, reporters toting video and still cameras and becoming all-in-one media conglomerates, coding, and big data. In most cases, however, it's the memes themselves that have been transformed, co-opted into something that looks a lot like traditional journalism.
Image of Edward R. Murrow via Wikimedia.