Imagine a news story written and published within three minutes of the event happening. That’s a real scenario described by Emily Bell in her W T Stead Lecture at the British Library last week. I was intrigued by her title “Robot reporters” and went to hear more about “Journalism in the Age of Automation and Big Data.”

Bell, who formerly ran Guardian Unlimited and is now director of the Tow Center for Digital Journalism at Columbia University in New York, argued that journalists need to understand the technologies that help create and distribute their stories. These days that means working alongside software programmers and engineers, and understanding the algorithms behind services that mine big databases and surface news stories, such as Google News.

In particular, journalists need to know the biases of those algorithms, because they will certainly have some. Such biases are much harder to uncover than those of human informants and writers, particularly when the code is commercially confidential, as it is with the Googles and Facebooks of the world.
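How can a story appear within three minutes? The usual answer is template journalism: structured data arrives from a sensor or feed, and software slots the numbers into pre-written sentences. Here is a minimal sketch in Python; the field names and wording are hypothetical, invented for illustration rather than taken from any real newsroom system:

```python
from datetime import datetime, timezone

def write_report(event: dict) -> str:
    """Fill a fixed sentence template from structured event data.

    A toy illustration of template-driven 'robot reporting'; the
    dictionary keys are hypothetical, not a real wire format.
    """
    when = datetime.fromtimestamp(event["timestamp"], tz=timezone.utc)
    return (
        f"A magnitude {event['magnitude']:.1f} earthquake struck "
        f"{event['distance_km']} km from {event['place']} at "
        f"{when:%H:%M} UTC on {when:%d %B %Y}, according to automated "
        f"sensor data. This report was generated by an algorithm."
    )

# Example: a made-up event record, as it might arrive from a data feed.
report = write_report({
    "magnitude": 4.7,
    "place": "Los Angeles, California",
    "distance_km": 12,
    "timestamp": 1394000000,
})
print(report)
```

Notice that even this trivial generator embeds editorial choices: which fields count as newsworthy, what tone the template takes, and whether the story discloses its machine origin. Those choices are the “biases” Bell was talking about, baked in before any human editor sees the copy.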