What is data journalism?
Data journalism is simply journalism.
The former is a newer, trendier term, but ultimately it is just a way of describing journalism in the modern world.
As journalists, we don’t think of data as numbers on a spreadsheet. We see it as digitised information: an account of our communities.
Everyone produces data, whether voluntarily or not, as they use their mobile phones, search the internet, use social media or even walk down the street. Public bodies are the same: every day they produce data on their staff, their spending, their work and the impact of their choices on our public institutions.
The stories of our communities, big and small, are held in this data, and that is why it is crucial that journalists are able to navigate and interrogate it: to inform the public on the state of their communities and to unveil systemic wrongdoing where it exists.
Yet with the ever-growing digitisation of information, we journalists are inundated with more data than we can possibly make sense of. Hence the need for new journalistic techniques: techniques that embrace technology.
Technology allows journalists to unearth more information than any human could alone. It lets us dig deeper, get stories faster and reveal complexity in a way we never could before.
Just as in traditional reporting, the top line or lead from an insightful piece of data is just the start. These findings must be combined with local knowledge, investigation, research and reporting.
That is what data journalism is: journalism that uses technology to access, analyse and find stories in data, and then to tell the stories of the people that data represents.
We believe this approach is the future of journalism, which is why the Bureau Local aims to build the capacity for data journalism at the community level.
To give you an idea, here are some of the data journalism projects members of our team have worked on in the past.
Megan helped reveal the widespread use of blood doping in athletics after analysing 11 years’ worth of athlete blood tests. She also uncovered the billions of pounds in taxpayers’ money paid to consultancy firms after analysing five years of UK foreign aid budget spending.
Maeve compiled data from Freedom of Information requests to more than 140 local authorities to reveal an increase in the number of unaccompanied asylum-seeking children going missing from care, including children at risk of trafficking. She has also trawled open datasets to reveal the monopolisation of fishing rights in UK waters, and has produced groundbreaking data on the use of the controversial “law of joint enterprise”.
Gareth used data journalism techniques to analyse every London food hygiene inspection by the official watchdog, to plot every fire in Croydon over a three-year period, and to pore over thousands of lines of information to build a comprehensive picture of how much one local council was spending on legal fees and agency expenses. He used crowdsourced research methods to chart public attitudes towards crime and living standards, and used statistics to show the ‘brain drain’ from state to public schools in south London.
Charles has used data for regional stories across the UK, such as reporting on the most prevalent crime in each area, youth reoffending rates and the number of prisoners released in error. He also wrote code that pulls information from websites (known as scrapers) to create a more comprehensive database of teacher misconduct panel outcomes, and built automated pieces of code (known as bots) to write roadworks stories. His previous work also included analysing passenger data in Montreal to identify the most beneficial subway extension.
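To make the idea of a scraper concrete, here is a minimal sketch in Python using the widely available requests and BeautifulSoup libraries. The URL and the page layout it assumes (a simple HTML table of panel outcomes) are hypothetical placeholders for illustration, not the actual sites or datasets described above.

```python
# A minimal scraper sketch. The URL and the table structure it expects
# are hypothetical placeholders, not a real government site.
import requests
from bs4 import BeautifulSoup

URL = "https://example.gov.uk/misconduct-panel-outcomes"  # hypothetical


def scrape_outcomes(url):
    """Fetch a page and pull each row of an outcomes table into a dict."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()  # stop early if the page didn't load
    soup = BeautifulSoup(response.text, "html.parser")

    outcomes = []
    for row in soup.select("table tr")[1:]:  # skip the header row
        cells = [cell.get_text(strip=True) for cell in row.find_all("td")]
        if len(cells) >= 3:  # assumed columns: name, date, outcome
            outcomes.append({
                "name": cells[0],
                "date": cells[1],
                "outcome": cells[2],
            })
    return outcomes


if __name__ == "__main__":
    for record in scrape_outcomes(URL):
        print(record)
```

In practice a scraper like this is tailored to each site’s layout, run repeatedly to build up a database over time, and paired with checks for missing pages or changes to the underlying HTML.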