Robot Reporters

We’ve focused on Robot News Anchors and Robot Educators. This week, we move to Robot Reporters, as many newspapers have turned to AI tools. Bloomberg uses Cyborg to help reporters with articles on company earnings reports. Robot reporters have produced articles on minor league baseball for The Associated Press, high school football for The Washington Post and earthquakes for The Los Angeles Times. Forbes is testing a tool called Bertie to provide reporters with rough drafts and story templates. The A.P. has been using Automated Insights since 2014 to produce thousands of machine-generated earnings stories a year. Thanks to Heliograf, The Post won in the category of Excellence in Use of Bots at the annual Global Biggies Awards, which recognize accomplishments in the use of big data and artificial intelligence.

But what does this mean for reporters’ jobs? Current automation tools primarily augment their work rather than take it over. In the future, A.I. journalism will be seen as a necessary tool, and reporters will be expected to learn how to use these tools as they emerge. Automation will relieve them of the busywork and some of the dull stories that many write today, freeing up that time for more creative, substantive work. Future journalists may spend more time making sense of the world than merely reporting what is happening.

Robot-generated stories still lack creativity, a skill that only humans can bring to the writing and editing process. Currently, robots cannot contextualize, piece together, or create “original, evidence-based conclusions” the way humans can. Robots are unable to conduct face-to-face interviews, respond intuitively with follow-up questions, or select a news angle from a follow-up conversation. That does not mean they won’t accomplish this in the future as A.I. and machine learning are fine-tuned and improved.

When Jeff Bezos bought The Post, it was with the intention of advancing this technology and making it so proficient at writing articles that there would be no discernible difference between an article written by a human author and one that was automatically generated.

And yes, robot reporters do make mistakes. In 2017, The Los Angeles Times reported on a magnitude 6.8 earthquake that struck Santa Barbara at 4:51 p.m. This came as a surprise to the people of Santa Barbara, who hadn’t felt anything. The big problem with the story? The earthquake had happened in 1925. The “reporter” who wrote the news article about the 6.8 quake was actually a robot. The L.A. Times deleted its automated tweet as well as the automatically published article and explained what happened in a subsequent tweet.

It is clear that A.I. will play a major role in the creation of news in the future, and it may even make journalism more human by strengthening the competitive intelligence of newspapers. We can be pretty sure it will make editors happier, since “A.I.-written articles have zero typos.”