Guardian: Reading the Riots study to examine causes and effects of August unrest

"Reading the Riots is modelled on an acclaimed survey conducted in the aftermath of the Detroit riots in 1967. The findings of that study, the result of a groundbreaking collaboration between the Detroit Free Press newspaper and Michigan's Institute for Social Research, challenged prevailing assumptions about the cause of the unrest. Prof Phil Meyer, who co-ordinated the Detroit study more than four decades ago, will advise the research into the English riots."

WSJ Jet Tracker

"The Wall Street Journal filed several Freedom of Information Act requests with the Federal Aviation Administration for the entire Enhanced Traffic Management System database, which contains flight records for aircraft that flew in the U.S. under instrument flight rules. The Journal analyzed the flight data for non-commercial jet aircraft traffic for a four-year period, 2007 through 2010. ... The Journal has included in the flights database an estimated cost to operate each flight. The estimates are based on per-hour cost figures for each model of jet, provided by Conklin & de Decker Aviation Information, an industry consulting firm used by some public companies to provide aircraft-cost estimates for regulatory filings."
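The cost estimate the Journal describes is, in essence, a join of flight durations against a per-model hourly rate table. A minimal sketch of that calculation is below; the jet models and dollar figures are invented placeholders, not the actual Conklin & de Decker rates.

```python
# Per-hour operating costs by jet model, in USD.
# Illustrative figures only, NOT real consulting-firm data.
HOURLY_COST = {
    "Gulfstream G550": 4800,
    "Cessna Citation X": 3100,
}

def estimate_cost(model, hours):
    """Return an estimated operating cost for one flight,
    or None if the model has no published hourly rate."""
    rate = HOURLY_COST.get(model)
    return None if rate is None else round(rate * hours, 2)

# Hypothetical flight records: (jet model, flight hours).
flights = [
    ("Gulfstream G550", 2.5),
    ("Cessna Citation X", 1.2),
]

for model, hours in flights:
    print(model, estimate_cost(model, hours))
```

Applied across four years of Enhanced Traffic Management System records, the same lookup-and-multiply step yields the per-flight estimates shown in the Journal's database.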

Financial Times careers: Investigations and Special Project Editor

Deadline is 11 May: "He or she will have a strong background in investigative work, and Pulitzer-sized ambition. A strong background in computer-assisted and database reporting, a proven track record at some of the world's biggest news organisations, international frontline reporting experience, in-depth knowledge of multimedia and an interest in mentoring and coaching would ensure a successful application."

Scraperwiki Data Blog: Read all about it read all about it: “ScraperWiki gets on the Guardian front page…”

"James Ball’s [Guardian] story [on lobbyist influence in the UK Parliament] is helped and supported by a ScraperWiki script that took data from registers across parliament that is located on different servers and aggregates them into one source table that can be viewed in a spreadsheet or document."
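The aggregation step described above can be sketched as reading each register as CSV and stacking the rows into one table, tagged by source. The register names and contents here are invented examples; the real ScraperWiki script fetched the registers from different parliament servers.

```python
import csv
import io

# Hypothetical per-register CSV text, standing in for data that would
# in practice be scraped from separate parliamentary servers.
REGISTERS = {
    "commons": "name,interest\nA. Member,Consultancy",
    "lords": "name,interest\nB. Peer,Directorship",
}

def aggregate(registers):
    """Merge per-register CSV text into one list of row dicts,
    tagging each row with the register it came from."""
    rows = []
    for source, text in registers.items():
        for row in csv.DictReader(io.StringIO(text)):
            row["register"] = source
            rows.append(row)
    return rows

table = aggregate(REGISTERS)
```

The resulting single table is what can then be exported to a spreadsheet or document, as the post describes.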

FleetStreetBlues: The truth about ‘data journalism’: it’s still about the story, stupid

"[A]midst all [the 'data journalism'] hype, earnestness and spreadsheet-geekery, here's the truth about so-called 'data journalism'. It's still about the story, stupid. ... [S]urely what's shocking is how few stories journalists actually managed to uncover [from recent major data dumps] ... No doubt we'll get better at this. Over time, journalists will learn how to pick out the stories that matter from these huge data releases - and it will help hugely whenever a single news outlet has control of the data, as the Telegraph did with MPs' expenses, so that they can drip-feed the top lines one at a time rather than see the whole lot drown in the 24-hour news cycle."

Guardian: Journalists of the future need data skills, says Berners-Lee

Sir Tim Berners-Lee: "the responsibility needs to be with the press. Journalists need to be data-savvy. These are the people whose jobs are to interpret what government is doing to the people. So it used to be that you would get stories by chatting to people in bars, and it still might be that you'll do it that way some times. But now it's also going to be about poring over data and equipping yourself with the tools to analyse it and picking out what's interesting. And keeping it in perspective, helping people out by really seeing where it all fits together, and what's going on in the country."

Guardian: Wikileaks’ Afghanistan war logs: how our datajournalism operation worked

"The data came to us as a huge excel file – over 92,201 rows of data, some with nothing in at all or were the result of poor formatting. Anything over 60,000 rows or so brings excel down in dramatic fashion – saving takes a painfully long period of time (tip number one – turn automatic saving off in preferences…). It doesn't help reporters trying to trawl through the data for stories and it's too big to run meaningful reports on. Fortunately, after COINS, huge datasets hold no fear for us. ..."
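The first cleanup pass implied above, dropping rows that are empty or malformed before any reporting work, can be sketched once the spreadsheet is exported to CSV. The filename and expected column count below are assumptions for illustration, not details from the Guardian's workflow.

```python
import csv

def clean_rows(reader, n_cols):
    """Yield only rows that have the expected number of columns
    and at least one non-blank cell."""
    for row in reader:
        if len(row) == n_cols and any(cell.strip() for cell in row):
            yield row

# Hypothetical usage: filter a 32-column export of the war logs.
# with open("war_logs.csv", newline="") as f:
#     rows = list(clean_rows(csv.reader(f), n_cols=32))
```

Streaming the file row by row like this also sidesteps the Excel performance problems the piece mentions, since the full dataset never has to sit in a spreadsheet at once.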

Talking Points Memo: Internal AP Memo Thanks Writers For Successful ‘Literary Treasure Hunt’ In Finding Sarah Palin’s Book

AP memo on how reporters who found an accidentally pre-released copy of Sarah Palin's book produced a story in 40 minutes: "They bought a copy, ripped it from its spine and scanned it into the system so it could be read and electronically searched. A NewsNow moved within 40 minutes, followed quickly by multiple leads as details were gleaned from the 413-page manuscript."