Infographic: Twitter and the art of media punditry


One great thing about our addiction to social media (and really, it may be the only one) is how easy it makes it to turn mushy real-world concepts such as influence into quantifiable, crunchable statistics. For those of us who are mathematically inclined, it’s almost as if everyone got their own baseball card and we can simply flip it over to see who’s really slugging and who’s getting walked. Unlike baseball, however, the real world of social influence hasn’t exactly been figured out by the data crunchers; that is, the relationship between being a real-world macher and a social media starlet isn’t a straight line, though the two are correlated.

In this infographic, media consultant Jose Rodriguez has mapped out the social reach of a very special subset of folks: journalism pundits (including yours truly and many of the individuals and sites we follow). What makes this more interesting than, say, a graphic about the reach of Hollywood actors is that this group is speaking to an audience who are themselves influencers of what others consume, so in some sense any reach you see here is magnified several-fold (that is, if you believe journalists and marketing folks still have influence on the content you ultimately consume).

What can this data teach us? Well, we’re flattered to be here, but we obviously have some catching up to do in terms of reach. I can see from Jose’s visualization that there’s a pretty obvious link between number of tweets and number of followers, though maybe those with more followers just tend to tweet more. Since it’s free to tweet, there’s no harm in upping the level of communication, although many of the top tweeters here have been at it longer than Ebyline has been in existence, which probably explains some of the gap.

As always, outliers are the most interesting data points. While nearly all of those with big followings themselves follow a relatively small number of feeds, there’s Stuart Elliott of The New York Times’ Media Decoder blog out there on his own (in hot pink, no less) hitting the Follow button like a retired mayor at the video poker table. What gives, Stuart?

 

[Infographic: TwitterJournoBlogoSphere, via infogr.am]

Hyperlocal journalism: you can only automate so much

#SFSNYC

Cutting costs: good. Streamlining operations: really good. Eliminating a human editor from the process of publishing news: not gonna happen.

That’s the message that Sun-Times Media’s editor-in-chief, Jim Kirk, sought to relay on the second day of the Street Fight Summit conference on hyperlocal publishing in New York City. Kirk was talking with Everyblock president Brian Addison and DataSphere veep Gary Cowan about automation and the use of raw data to complement—or replace—traditional, i.e. human, content creation and curation.

Kirk, who joined the Sun-Times in April from Crain’s, referred to the Journatic scandal, in which a freelance writer plagiarized an article for crosstown rival Tribune’s hyperlocal arm (the Sun-Times had a relationship with Journatic at the time), and said that human editors add cost but remain a necessity. Nevertheless, he added, the days of waiting for a new business model to arrive are over, and the Sun-Times and other publishers have to push for new ways to produce local content at a cost that’s sustainable. “It’s either move forward or die,” Kirk told the audience.

Everyblock’s Addison also had words of caution about relying too much on raw data over reporting that makes use of it. Everyblock, owned by NBC, combines aggregation of data and news with a ZIP-code-based social platform, relying on a combination of content scattered elsewhere and users who contribute messages and event listings. But Addison said raw data is only a starting point for providing hyperlocal content. Publishing unfiltered police reports, for example? “It’s dry and creepy,” said Addison. “It has niche appeal.”

Infographic: The wire service as journalism innovation timeline

Infographic: A history of the wire service


To help celebrate the launch of Ebyline’s News Desk custom wire service we wanted some way to visualize the timeline of journalism innovation. The more we researched the histories of the various big news agencies—AP, Reuters, AFP, UPI—the more apparent it was that the wires have done more to consistently pioneer new models, new technologies and new products than anyone else in the journalism biz.

They seized on the invention of the telegraph to replace steamships and homing pigeons (no kidding), were the first to use radio for news, developed the distribution of stock market quotes that we rely on today and have generally been much more open to improvisation and invention than their clients. Of course, there have been some famous gaffes, rivalries, bankruptcies and indignities.

We’re happy to include the good, the bad and the bizarre in our graphic history of the wires.

Fresh from the morgue: 2 out of 3 news librarians fired


Newspaper morgues used to house each publication’s institutional history, as well as the librarians who painstakingly clipped and indexed stories for posterity. Today? The libraries of most large newspapers and many magazines have been slashed by half or more and, in many cases, shuttered entirely. Yet hundreds of librarians have managed to adapt by focusing on computer-assisted reporting, data retrieval and the like, and a few libraries-within-newsrooms are even thriving. Some are looking to their morgues to produce revenue by licensing or syndicating the newspaper’s archived content or doing custom research.

First, kill all the librarians

A database maintained (and available publicly) by Michelle Quigley, a news researcher at The Palm Beach Post, shows the extent to which this small but essential corner of the newsroom has been decimated by cuts. According to Quigley’s numbers, which cover 81 daily newspapers (and a few magazines), two out of three news librarians have lost their jobs since 2006. That’s 303 layoffs, leaving just 167 positions behind. While the (slightly outdated) list covers only a fraction of the newspaper industry’s 1,500 dailies, not to mention the thousands of community weeklies, it includes the largest and most prominent newsrooms in the U.S. and points up how publications such as the Atlanta Journal Constitution, Businessweek and ABC News have cut their research headcount to zero. (Largest remaining library staff? Newsday, with 11, which is 10 more than the Wall Street Journal.)

It may seem obvious in hindsight that news libraries would suffer, but nowhere else in the newsroom was the advent of the internet cheered more loudly, says Mike Meiners, who recently left his post as director of news administration at the St. Louis Post-Dispatch after a 25-year career in news libraries. “Librarians led the pack” when the internet emerged as a research and communications tool in the nineties, notes Meiners. “The web exploded and public records became available en masse, and it really made our role [essential] in helping reporters find news tips and information.” He calls the late nineties and early aughts a “golden age” for news morgues.

“They really don’t need you anymore”

The internet produced two phenomena that rendered library staffs vulnerable, says James Matarazzo, a Special Libraries Association fellow who’s studied and written about corporate libraries, including those at media companies. First came the availability of software to do what news librarians used to spend a lot of their time on: archiving and indexing the publication’s own content, both for inclusion in public databases and for internal research whenever a reporter needed background for a story. “If all you did was prepare the paper for the database aggregator and index the newspaper, they really don’t need you anymore,” says Matarazzo. Second came the general collapse of advertising revenue and with it the funds to do the investigative reporting that relies heavily on research.

But just as newsroom training is evolving to cope with shrinking budgets and new priorities, so the advent of digital archives has altered, but not eliminated, the role of newspaper libraries and the people who continue to staff them. The Times-Picayune (which recently cut back to printing three days a week) reconsidered its decision to eliminate its entire library staff (it still fired two three-decade veterans). Tech- and data-savvy librarians and research professionals at several publications have found new ways to contribute and stay relevant. Meiners said the St. Louis staff has shrunk since that golden age, but those with computer-assisted reporting skills remain a valuable part of the newsroom (Quigley’s list says the staff went from 10 to 1).

Computer-assisted reporting and data journalism are also part of the job for Leigh Montgomery, who started as a librarian at The Christian Science Monitor in 1998. The Monitor retains a collection of reference books, but most reporters now use computer-based resources instead. Montgomery sits right in the newsroom so she can “actively hear and collaborate on what we’re working on.”

“We’re getting information from databases, coaching people on how to use those, how to find the best and most relevant information,” she says. “We might be updating a story several times throughout the day, putting together a story or data presentation. There’s definitely a role for an information professional in that.”

Shhh! I’m making you money

Teresa Leonard began as a research librarian in 1987 and now works as director of news research at The News & Observer in Raleigh-Durham, N.C. Her staff of four sits in the newsroom alongside other departments. Like her counterpart at the CS Monitor, Leonard and her staff help with research requests. “A lot of our research is public records-based so as public records have been opened up online, they’re more accessible,” she explains. Leonard is also involved in training initiatives such as writing workshops and public records workshops.

Despite digital technologies for archiving the paper, Leonard doesn’t see the archivist’s role disappearing entirely. “The electronic production of the paper doesn’t always produce as clean archives,” she says. “Where papers have someone who can pay attention to that and oversee that, then you’ve got a much more usable archive.”

In addition to their roles as archivists, trainers and data experts, librarians sometimes also help newspapers make money. Toby Pearlstein, also a Special Libraries Association fellow who collaborates with Matarazzo, sees librarians bringing in cash by selling images from photo archives or doing research for the public. “They’re striving to find new sources of revenue for the paper that the paper might not be familiar with,” she adds.

For her part, Montgomery says she’s helping the Monitor by “looking at new models and being fully aware of trends in the information industry and of content opportunities. Many librarians have been involved in doing research for scholars or the public, fulfilling photo requests and looking at new information products.”

Learning to query: Data journalism 101



So you want to be a data journalist. A number-crunching sleuth who’s surfing the crest of the very same digital wave poised to sweep aside the low-lying journalistic towns and villages in its fateful path. Our guest blogger, Liv Buli, is turning numbers into stories at Next Big Sound and lays out a few things you should know before claiming the moniker for yourself.

 

First off, data journalism isn’t new. Not even close. The Guardian, the same U.K. paper that has made a fetish of data one of the centerpieces of its widely acclaimed online strategy, started turning numbers into stories in its very first issue way back in 1821. (It was a list of schools by student enrollment and spending, so not exactly 19th-century linkbait.) What has revolutionized the field in recent years, and the reason you hear the term every time journos get together to discuss “What’s next?”, is the amount of data now available and the speed with which it is being generated and delivered. At Next Big Sound, the music analytics company at which I blog, we are gathering an average of 175 million data points each day. Facebook users have generated roughly 300 times as much data as is contained in the Library of Congress. That’s a lot of scatter charts.

It shows, too. Some of the best, and best-read, stories to hit the press this past year were based on data findings, such as The New York Times’s damning series on corruption and cruelty in horse racing, not to mention the Wall Street Journal’s uncovering in recent years of two of the biggest scandals to hit the business world: the LIBOR rate-fixing scandal (currently rocking the finance world) and the backdating of tech company stock options (which unseated a slew of high-profile Silicon Valley execs). What’s striking about this kind of technical journalism is that the data weren’t just a starting point for asking questions; they provided the answers, too. WSJ’s statistical analyses in both the LIBOR and stock options cases showed that the likelihood people were obeying the rules was remote, providing the circumstantial evidence long (really long) before regulators and law enforcement obtained proof.
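To give a feel for that kind of odds argument, here is a toy sketch, not the Journal’s actual methodology, of how you might estimate the chance that a suspicious pattern of option grants happens by luck. Every number in it is hypothetical.

```python
# Toy illustration (not the Journal's actual analysis) of an odds argument:
# if grant dates were picked without hindsight, how likely is it that most of a
# company's option grants land on the lowest closing price of their month?
from math import comb

trading_days = 21   # rough number of trading days in a month
grants = 6          # option grants examined (hypothetical)
lucky = 5           # grants that fell on the monthly low (hypothetical)

p_low = 1 / trading_days  # chance a randomly chosen date is the month's low

# Binomial tail: probability of at least `lucky` lows out of `grants` by chance.
p_chance = sum(
    comb(grants, k) * p_low**k * (1 - p_low) ** (grants - k)
    for k in range(lucky, grants + 1)
)
print(f"Probability of that pattern by chance alone: {p_chance:.2e}")
```

When the probability comes out vanishingly small, the pattern itself becomes the circumstantial evidence the story is built on.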

Data journalism can also come in different formats. The News Applications team at the Chicago Tribune, for example, is a group of programmers embedded in the newsroom, assisting journalists in uncovering data and creating cool apps to make use of it.

Interviewing the data

OK, you’re interested, you want to know “Where do I sign up?” and “Is it OK that I flunked every math class I ever took?” Start with this: a basic requirement for the role is the ability to query a database in order to extract information, and that requires some basic coding skills. It’s usually the second step in my reporting, after picking a topic for which the data seems promising. It’s akin to interviewing sources at the scene of a crime or compiling a list of experts to call for a science feature. I work mostly with music industry numbers, so if I’m looking at a database of artists and their followers on social networks, I may want to query only the artists with a certain range of followers on Twitter.
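To make that concrete, here is a rough sketch of what such a query might look like. The database file, table and column names are invented for illustration; they aren’t Next Big Sound’s actual schema.

```python
# Minimal sketch of "interviewing" a database with Python's built-in sqlite3.
# "music_metrics.db" and the "artists" table are hypothetical stand-ins.
import sqlite3

conn = sqlite3.connect("music_metrics.db")

# Pull only artists whose Twitter following falls within a given range.
rows = conn.execute(
    """
    SELECT name, twitter_followers
    FROM artists
    WHERE twitter_followers BETWEEN ? AND ?
    ORDER BY twitter_followers DESC
    """,
    (50000, 500000),
).fetchall()

for name, followers in rows:
    print(f"{name}: {followers:,} Twitter followers")

conn.close()
```

The same question could be asked of a spreadsheet filter; the point is knowing how to narrow a large dataset down to the slice that might contain a story.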

In the run-up to the MTV Video Music Awards, I wanted to see if our data would give any sort of indication as to who would be most likely to take home the Best New Artist award. Querying the Wikipedia, Facebook and Twitter data for the five nominees in the right time frame made it easy to compare the size of their fan bases and current popularity in order to accurately predict who would take home the title. Which brings us to essential skill number two: a level of number comprehension that allows the journalist to understand what story the data is telling.
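Here is a sketch of how that kind of comparison might be set up with pandas. The CSV file, its columns and the date window are made up for the example rather than taken from our real dataset.

```python
# Rough sketch: rank award nominees by their peak metrics within a time window.
# "nominee_metrics.csv" and its columns are hypothetical stand-ins.
import pandas as pd

df = pd.read_csv("nominee_metrics.csv", parse_dates=["date"])

# Restrict to the run-up window before the awards show (dates are illustrative).
window = df[(df["date"] >= "2012-08-01") & (df["date"] <= "2012-09-06")]

# Take each nominee's peak fan counts in the window, then rank by the total.
totals = (
    window.groupby("artist")[["facebook_fans", "twitter_followers", "wikipedia_views"]]
    .max()
    .assign(score=lambda t: t.sum(axis=1))
    .sort_values("score", ascending=False)
)
print(totals)
```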

Figure out what data to work with

So what equipment does the data journalist need? I use Excel at times, and there are some fancy stats programs such as SPSS out there. I’m lucky, though: I have at my fingertips a proprietary platform that allows me to easily graph the information in NBS’s database so I can see correlations and pull overview reports of relevant data. But all data journalists have to figure out what data they want to work with and find the right tools for analyzing that specific information set. Telling great stories then simply becomes a matter of figuring out the right questions to ask of the data and combining this with relevant reporting. And that part looks a lot like regular journalism: I stay up-to-date on the industry I am covering.
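If you don’t have a proprietary platform, a small script can stand in for one. The sketch below assumes a generic “weekly_metrics.csv” with invented column names; it simply checks how a few metrics move together and plots one pair.

```python
# Quick look at how a few metrics move together, using pandas and matplotlib.
# "weekly_metrics.csv" and its column names are invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("weekly_metrics.csv")

# Pairwise correlations between the metrics of interest.
print(df[["radio_spins", "twitter_mentions", "album_sales"]].corr())

# Scatter plot of one pair to see whether the relationship looks linear.
df.plot.scatter(x="radio_spins", y="album_sales")
plt.title("Radio spins vs. album sales (illustrative data)")
plt.show()
```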

As for where to do data journalism, think outside the newsroom. Because NBS is a data company, not a major news outlet, it’s more of a challenge to get our content widely read and I don’t have a bullpen of editors to bounce ideas off of, but there are some major advantages. Because we have historical data on hundreds of thousands of artists, I can often add insights to breaking news stories. I’m able to work with our in-house engineers to do analysis of major events such as festivals and award shows, to gauge which performers are growing the fastest in terms of views, plays and new fans. We have our own data science team with whom I am able to delve even deeper into certain topics, uncovering how social media metrics and radio spins correlate with album sales, or whether artists are purchasing fake Twitter followers.

 
About the Author: Liv Buli is the resident data journalist for music analytics company Next Big Sound. Buli is a graduate of New York University’s Arthur L. Carter Journalism Institute and her work has appeared in Newsweek Daily Beast, The New York Times Local East Village, Westchester Magazine and more.
