Data journalism is an emerging sector, largely overlooked by media theorists, with the potential to make a tremendous impact either through online networks or by providing content to news media. It combines research and investigative work with the intensive use of databases, digital maps and software to analyse, narrate and visualize a story.
According to IBM, about 2.5 quintillion bytes of data are created every day, enough to fill roughly 78 million 32 GB iPads daily. The term "big data" describes this growing proliferation of data and our increasing ability to make productive use of it. For journalism, the difficulty often grows even greater in the move from print to digital content, giving way to "click whore" stories produced exclusively to attract more visitors to a website. Probably the first indicator of the quality of information lies in its ability to meet readers' need to recognize the actual reasons behind events and news items. In an age of information overload, and given the large presence of social media, quality news reporting means to understand rather than to expose, to explain rather than to report facts that most people are already aware of.
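The iPad comparison can be checked with a quick back-of-the-envelope calculation (treating 1 GB as 10^9 bytes, as storage vendors do):

```python
# Back-of-the-envelope check of IBM's daily data estimate.
bytes_per_day = 2.5e18        # 2.5 quintillion bytes created per day
ipad_capacity = 32e9          # a 32 GB iPad, taking 1 GB = 10**9 bytes
ipads_per_day = bytes_per_day / ipad_capacity
print(f"{ipads_per_day:,.0f} iPads filled per day")  # 78,125,000
```

Using the binary convention (1 GiB = 2^30 bytes) instead would lower the figure slightly, to about 73 million, but the order of magnitude is the same.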
According to Alan Rusbridger, editor-in-chief of «The Guardian», publishing is just an open door for reader involvement, rather than a process centred on product quality or the work of journalists.
Have a look at the Data Journalism Handbook, which was born at a 48-hour workshop led by the European Journalism Centre and offers interesting content describing how journalists can use data to improve the news.