The History of Data Journalism


A historical take on every critical breakthrough from the 1950s until today

This article was written for the European Journalism Centre; you can read the full version on their website, DataJournalism.com.


It all started with trying to predict the outcome of a US presidential election.

Many practitioners date the beginning of computer-assisted reporting and data journalism to 1952, when the CBS network in the United States tried to use experts with a mainframe computer to predict the outcome of the presidential election.

That’s a bit of a stretch, or perhaps a false beginning, because the network never used the data for the story. It really wasn’t until 1967 that data analysis started to catch on.

In that year, Philip Meyer at The Detroit Free Press used a mainframe computer (known as big iron) to analyse a survey of Detroit residents in order to understand and explain the serious riots that erupted in the city that summer. Decades later, The Guardian in the United Kingdom used some of the same approaches to examine riots there and cited Meyer’s work.

Meyer went on to work in the 1970s with Philadelphia Inquirer reporters Donald Barlett and James Steele to analyse sentencing patterns in the local court system, and with Rich Morin at The Miami Herald to analyse property assessment records.

Meyer also wrote a book called Precision Journalism, which explained and advocated the use of database analysis and social research methods in reporting. Revised editions of the book, later titled The New Precision Journalism, have been published since then.

Still, only a few journalists used these techniques until the mid-1980s, when Elliot Jaspin in the U.S. received recognition at The Providence Journal-Bulletin for analysing databases for stories, including those on dangerous school bus drivers and a political scandal involving home loans.

Jaspin, who had won a Pulitzer Prize for traditional reporting on labour union corruption, had also taken a fellowship at Columbia University to learn how to use data. It was the same university where journalist and professor Steve Ross had been teaching data analysis techniques for years. By the late 1980s, about 50 other journalists across the U.S., often consulting with Meyer, Jaspin, or Steve Doig of The Miami Herald, had begun using data analysis for their stories.


Aiding the efforts of the data journalists of the 1980s were improved personal computers and a much-needed piece of software, Nine Track Express, which Jaspin and journalist-programmer Daniel Woods wrote to make it easier to transfer computer tapes (which stored data on nine “tracks”) to personal computers using a portable tape drive.

This was a remarkable breakthrough because it allowed journalists to circumvent the internal bureaucracies and delays involved in using mainframes at newspapers and universities, and instead do their work at their own desks.


In 1989, U.S. journalism recognised the value of computer-assisted reporting when a Pulitzer Prize went to The Atlanta Journal-Constitution for stories on racial disparities in home loans. The project was one of the first collaborations on data stories to involve an investigative reporter, a data reporter, and college professors.

During the same year, Jaspin established at the Missouri School of Journalism what is now known as the National Institute for Computer-Assisted Reporting (NICAR). Then, in 1990, Indiana University professor James Brown held the first computer-assisted reporting conference in Indianapolis, Indiana, and continued organising it for several years.

By 1996, word of the U.S. successes had reached other countries, and foreign journalists began attending the “boot camps” (intense, week-long seminars) at NICAR.

Read the full version of this story on DataJournalism.com
