
“Aggregation” Is Not A Villain

Citing the classic public-official-gone-bad investigative journalism that many of us worry about losing in these days of belt-tightening at newspapers, the Cleveland Plain Dealer’s Connie Schultz takes aim at “aggregators” as the source of newspapers’ online financial woes.

She compares “aggregation” at sites like The Daily Beast and Newser with a 1918 case between the Associated Press and International News Service. She writes:

[P]arasitic aggregators reprint or rewrite newspaper stories, making the originator redundant and drawing ad revenue away from newspapers at rates the publishers can’t match.

[…]

INS was copying or rewriting AP stories and transmitting them by telegraph and telephone to papers in western U.S. time zones.

The Supreme Court ruled that INS engaged in unfair competition that ultimately would drive AP out of business. It enjoined INS from reproducing the AP stories, but only for a brief period while AP’s dispatches had commercial value.

Thus, the financial solution, and the way to ensure that investigative journalism continues, Schultz believes, is to rewrite copyright law to force “parasitic aggregators” to reimburse content creators for using the “sweat of our brow to compete against us.” The words are from Daniel Marburger, a professor in the College of Business at Arkansas State University.

Their proposal is to put a 24-hour hold on copy that appears on a news site.

A 24-hour hold on links in the era of Twitter? You must be kidding me. Because, at its heart, this proposal questions the value of other people — or automated web sites — pointing potential readers at a story via a link.

What, pray tell, makes someone an ‘aggregator’?

The two sites that Schultz mentions do not fall into my mental model of an aggregator. Google News (normally the news industry’s whipping boy) is an aggregator; an algorithm “decides” what stories appear on the news page. The links reflect headlines and the first sentence (or two) from a news organization’s RSS feed.
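For readers who wonder what that looks like under the hood, here is a minimal sketch in Python of what feed-driven aggregation amounts to: grab each item’s headline, the first sentence or two of the summary, and a link back to the original story. The feed URL is a placeholder, and a real aggregator like Google News layers ranking and clustering on top of this; consider it an illustration, not anyone’s actual pipeline.

```python
# Toy feed-driven "aggregator": headline + first sentence(s) + link.
# Standard library only; the feed URL used below is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

def headlines(feed_url, max_items=5):
    """Yield (title, teaser, link) tuples from an RSS 2.0 feed."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.fromstring(resp.read())
    for item in root.findall("./channel/item")[:max_items]:
        title = item.findtext("title", default="")
        summary = item.findtext("description", default="")
        # Keep only the first sentence or two of the feed's summary.
        teaser = ". ".join(summary.split(". ")[:2])
        link = item.findtext("link", default="")
        yield title, teaser, link

# Example usage, with a placeholder feed URL:
for title, teaser, link in headlines("https://example.com/news/rss"):
    print(title, "|", teaser, "->", link)
```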

Conversely, I don’t think of either The Daily Beast or Newser (the two examples Schultz provides) as an “aggregator.” At each site, a human writes a compelling recap of a “print” (aka “text”) story and links to the original story. In my mental model, this is not aggregation. The writers are “adding value” for their readers, who may never have visited the originating web site. Moreover, this is exactly what TV and radio news stations have done, at least in local markets, for decades. One key difference: those on-the-air news reports have no mechanism for getting the audience to the original newspaper story.

At 9.30 pm Pacific, the top two stories at the “cheat sheet” on Tina Brown’s The Daily Beast were a Washington Post story about Obama and climate change, datestamped Sunday 5.54 pm Pacific, and a story about Honduras that linked to the Wall Street Journal, datestamped Sunday 4.35 pm Pacific. Both newspaper sites indicate that these are stories in their Monday printed edition, but neither timestamps its articles nor properly credits the day they were published online. (Why not? Because they are still thinking “fixed, once-a-day publishing”? Because they haven’t customized their print-to-web software to consider the unique characteristics of web publishing and the web audience?)

But the Cheat Sheet, which is not copy-and-paste but writing with an edge, isn’t the primary attraction of The Daily Beast. There are photo galleries like this one. And then there are the TV news clips, another use of news content. But The Daily Beast produces even more original content, much of it commentary. We can argue about the “proper” role of commentary versus reporting in the “ideal” daily news diet, but commentary is not ripping off “the sweat of our brow.”

On the other hand, Newser features more non-self-generated content on its home page (“on the grid”) than The Daily Beast. One of those top stories at 9.30 pm Sunday was about Michael Jackson. The Newsweek story it links to is datelined Saturday 27 June (no time, again); the Newser recap, Sunday at 6.27 pm Pacific. For this one, Schultz’s 24-hour “hold” doesn’t even come into play: the recap appeared more than a full day after the original.
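To make the arithmetic explicit: a recap is caught by the proposed hold only if it appears within 24 hours of the original. Here’s a quick illustrative check in Python; since Newsweek printed no time of day, the Saturday-noon figure below is my assumption, and the year is inferred from the Saturday 27 June dateline.

```python
from datetime import datetime, timedelta

HOLD = timedelta(hours=24)

def inside_hold(original, recap):
    """True if the recap falls within the proposed 24-hour hold window."""
    return recap - original < HOLD

# Times are Pacific. Newsweek gave no time of day, so noon is assumed.
newsweek_story = datetime(2009, 6, 27, 12, 0)   # Saturday 27 June (assumed noon)
newser_recap   = datetime(2009, 6, 28, 18, 27)  # Sunday, 6.27 pm

print(inside_hold(newsweek_story, newser_recap))  # False: roughly 30 hours later
```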

Interestingly, Newser also appears to be an Associated Press subscriber.

So what’s the bottom line?

Yes, news organizations need to figure out a business model. I don’t have a lot of sympathy for them; they spent decades reaping monopoly rents that they did not use to explore new markets.

But demanding money for links isn’t going to save the day. Either we’ll simply stop linking to you, and you’ll see your traffic drop accordingly; or there will be so many link forms (Twitter, blogs, search engines, etc.) that there will be no practical way to assess everyone; or one intrepid national organization will realize that if it doesn’t charge for links, it will get all the traffic (the free-rider effect).

See, like many other schemes that seek to make people pay for generic news, this one would work only if 100 percent of news sites agreed. And that would require an antitrust exemption, which ain’t gonna happen.


Tip: Jay Rosen

By Kathy E. Gill

