Robo-journalists that write up ‘monotonous’ articles, from election results to sports scores, are becoming more popular in newsrooms – but will they ever replace traditional reporters?
- AI generated 40,000 stories on Switzerland’s 2018 elections in minutes
- The programme can produce customised news stories in multiple languages
- Similar programmes are increasingly being adopted in newsrooms globally
- Bots like Tobi are ‘quicker’ than journalists at generating and updating news
9 March 2019
‘Robojournalism’ is on the rise, according to a media academic who has been studying the use of AI to produce news stories.
Software that can do this has been available for nearly a decade but is becoming more prevalent in newsrooms, they say.
These bots are designed to free reporters from writing ‘monotonous’ stories that can be easily updated by simply inserting new figures – including sports and election results.
But some fear they may one day overtake the role of traditional newsroom writers and become an everyday feature of generating news content.
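The mechanism behind these bots is, at its core, template filling: a fixed sentence pattern is populated with fresh figures for each result. A minimal sketch in Python, assuming a hypothetical template and field names (the real systems are far more sophisticated):

```python
# Hypothetical sketch of template-based story generation, the kind of
# approach bots like Tobi rely on: one sentence pattern, filled in with
# new figures for each municipality.
TEMPLATE = (
    "In {town}, {yes_pct:.1f}% of voters backed the proposal, "
    "with turnout at {turnout:.1f}%."
)

def write_story(result: dict) -> str:
    """Fill the template with one municipality's results."""
    return TEMPLATE.format(**result)

# Illustrative data only; a real feed would supply thousands of rows.
results = [
    {"town": "Geneva", "yes_pct": 54.2, "turnout": 41.7},
    {"town": "Zurich", "yes_pct": 48.9, "turnout": 45.3},
]
for r in results:
    print(write_story(r))
```

Because each story is just a lookup-and-fill, generating tens of thousands of them in minutes – as Tobi did – is a matter of looping over a results feed, and swapping the template is all it takes to produce a second language.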
Artificial intelligence is gaining ground in newsrooms with computer-generated articles and programs to help sift through large data sets
More and more news organisations are using AI to produce stories, personalise news delivery and in some cases sift through data to find important news.
A text-generating ‘bot’ called Tobi generated nearly 40,000 news stories for Switzerland’s November 2018 elections.
It did this in just five minutes for the biggest media organisation in the country.
Tobi is also being used by a number of other national publications worldwide, and similar automated generators are helping newswire services to report data-driven news.
The programmes are not only ‘faster’, they are also ‘multilingual’ – Tobi wrote in both French and German.
Tobi wrote on vote results for each of Switzerland’s 2,222 municipalities, in both French and German, for the country’s largest media group, according to a paper presented at the Computation + Journalism conference in Miami.
Heliograf has enabled The Washington Post to cover some 500 election races, along with local sports and business, since 2014.
Damian Radcliffe, a University of Oregon professor who follows consumer trends and business models for journalism, said: ‘We’ve seen a greater acceptance of the potential for artificial intelligence, or robo-journalism, in newsrooms around the world.
‘These systems can offer speed and accuracy and potentially support the realities of smaller newsrooms and the time pressures of journalists.’
News organizations say the bots are not intended to displace human reporters or editors but rather to help free them from the most monotonous tasks, such as sports results and earnings reports.
Jeremy Gilbert, director of strategic initiatives at The Washington Post, said Heliograf was developed as a tool to help the newspaper’s editorial team.
‘The Post has an incredible team of reporters and editors and we didn’t want to replace them,’ Mr Gilbert told AFP.
Mr Gilbert said the bot can deliver and update stories more quickly as they develop, allowing reporters to concentrate on other tasks, and that reaction has been generally positive.
‘The surprise was that a lot of people came up and said, "I do this story every week; is this something we can automate?"’ Mr Gilbert said.
‘These weren’t stories that anyone wanted to do.’
Similar conversations are going on in newsrooms around the world. The Norwegian news agency NTB automated sports reports to get match results delivered within 30 seconds.
The Los Angeles Times developed a ‘quakebot’ that quickly distributes news articles on temblors in the region and also uses an automated system as part of its Homicide Report.
The Associated Press has been automating quarterly earnings reports for some 3,000 listed companies, allowing the news agency to expand from what had been just a few hundred, and this year announced plans with its partner Automated Insights to deliver computer-generated previews of college basketball games.
Rival news agency Reuters last year announced the launch of Lynx Insight, which uses automated data analysis to identify trends and anomalies and to suggest stories reporters should write.
Bloomberg’s computerised system called Cyborg ‘dissects a company’s earnings the moment they appear’ and produces within seconds a ‘mini-wrap with all the numbers and a lot of context,’ editor-in-chief John Micklethwait wrote last year, noting that one-fourth of the agency’s content ‘has some degree of automation.’
France’s Le Monde and its partner Syllabs deployed a computer program that generated 150,000 web pages covering 36,000 municipalities in the 2015 elections.
One advantage of using algorithmically generated stories is that they can also be ‘personalised,’ or delivered to the relevant localities, which can be useful for elections and sports coverage.
While news professionals acknowledge the limits of computer programs, they also note that automated systems can sometimes accomplish things humans can’t.
The Atlanta Journal-Constitution used a data journalism team to uncover 450 cases of doctors who were brought before medical regulators or courts for sexual misconduct, finding that nearly half remained licensed to practice medicine.
The newspaper used machine learning, an artificial intelligence tool, to analyse each case and assign a ‘probability rating’ on sexual misconduct, which was then reviewed by a team of journalists.
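The workflow described above – score every case automatically, then hand only the likely matches to human reviewers – can be illustrated with a deliberately simplified sketch. Everything here (the flag terms, function names, and threshold) is hypothetical; the AJC's actual system was a trained machine-learning classifier, not a keyword counter:

```python
# Simplified, hypothetical stand-in for the kind of scoring pipeline the
# AJC investigation described: each case file gets a probability-style
# rating, and only high-scoring cases are queued for journalist review.
# A real system would use a trained text classifier, not keyword counts.
FLAG_TERMS = {"misconduct", "sexual", "boundary", "complaint", "revoked"}

def probability_rating(case_text: str) -> float:
    """Crude score in [0, 1]: the share of flag terms present in the text."""
    words = set(case_text.lower().split())
    return len(FLAG_TERMS & words) / len(FLAG_TERMS)

def cases_for_review(cases: list[str], threshold: float = 0.4) -> list[str]:
    """Route only high-scoring cases to the human review team."""
    return [c for c in cases if probability_rating(c) >= threshold]
```

The design point the story illustrates is the division of labour: the machine narrows tens of thousands of documents down to a reviewable shortlist, and journalists make the final judgment on every flagged case.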
Studies appear to indicate consumers accept computer-generated stories, which are mostly labelled as such.
A report prepared by researcher Andreas Graefe for Columbia University’s Tow Center said one study of Dutch readers found that labelling a story as computer-generated ‘had no effect on people’s perceptions of quality.’
A second study of German readers, Professor Graefe said, found that ‘automated articles were rated as more credible,’ although human-written news scored higher for ‘readability.’
Even though journalists and robots appear to be helping each other, fears persist about artificial intelligence spinning out of control and costing journalists their jobs.
In February, researchers at the nonprofit centre OpenAI announced they had developed an automatic text generator so effective that they are keeping its details private for now.
The researchers said the program could be used for nefarious purposes, including generating fake news articles, impersonating others online, and automating fake content on social media.
But Meredith Broussard, a professor of data journalism at New York University, said she does not see any immediate threats of robots taking over newsrooms.
She said there are many positive applications of AI in the newsroom, but that for now, most programs handle ‘the most boring’ stories.
‘There are some jobs that are going to be automated, but overall, I’m not worried about the robot apocalypse in the newsroom,’ she said.
HALF OF CURRENT JOBS WILL BE LOST TO AI WITHIN 15 YEARS
Half of current jobs will be taken over by AI within 15 years, one of China’s leading AI experts has warned.
Kai-Fu Lee, the author of the bestselling book AI Superpowers: China, Silicon Valley, and the New World Order, told Dailymail.com the world of employment was facing a crisis ‘akin to that faced by farmers during the industrial revolution.’
‘People aren’t really fully aware of the effect AI will have on their jobs,’ he said.
Lee, a venture capitalist in China who once headed Google’s operations in the region, has over 30 years of experience in AI.
He is set to reiterate his views in a Scott Pelley report about AI on the next edition of 60 Minutes, Sunday, Jan. 13 at 7 p.m., ET/PT on CBS.
He believes it is imperative to ‘warn people there is displacement coming, and to tell them how they can start retraining.’
Luckily, he said all is not lost for humanity.
‘AI is powerful and adaptable, but it can’t do everything that humans do.’
Lee believes AI cannot create, conceptualize or do complex strategic planning, or undertake complex work that requires precise hand-eye coordination.
He also says it is poor at dealing with unknown and unstructured spaces.
Crucially, he says AI cannot interact with humans ‘exactly like humans’, with empathy, human-human connection, and compassion.