COLUMN 

AI: knowing its limitations

AI looks set to play a big role in all our futures, but there are some things it can’t do…

By James Evelegh

Speaking at the session on AI at the PPA Festival (L-R): Rebekah Billingsley, Olivia Solon, Dominic Young and Magda Woods.

AI is good for lots of things, but it lacks basic journalistic skills: it can’t take sources out to lunch, it can’t gauge their response to probing questions, it can’t do off-the-record briefings and it can’t pitch up to conferences. Humans still do all those things better.

So said Bloomberg’s Olivia Solon at a panel session at last week’s PPA Festival, entitled ‘Time for a chat? The truth about AI and publishing’.

AI is, she said, a “helpful tool” that frees up real journalists to do the more creative things.

AI should never be used for reporting, insight or analysis, for several reasons. It is not forward-looking, because its output is assembled from previously published material scraped from the internet. It has no concept of, or adherence to, the truth. And its output cannot be relied upon: what are its sources?

According to panellist Dominic Young, “you can’t trust it for news, because it makes stuff up”.

But there are some journalistic things that AI can do extremely well. For a number of years, Bloomberg has been using it to create content that is highly standardised, templated and repetitious, like quarterly earnings reports. Crucially, though, it’s always checked by humans.

Other publishers use AI to create weather and sports reports and other reporting that is data driven and formulaic. AI produces this type of content more quickly and more accurately than people can, though scrupulous publishers will still always have it checked by humans.

The problem, as always, is the unscrupulous ones.

In a report published this week by NewsGuard – Rise of the Newsbots: AI-Generated News Websites Proliferating Online – the authors found that “artificial intelligence tools are now being used to populate so-called content farms, referring to low-quality websites around the world that churn out vast amounts of clickbait articles to optimise advertising revenue.”

The problem this causes is twofold: firstly, it takes valuable advertising revenue from more deserving recipients (like us) and, secondly, it further undermines people’s trust in all media, which only benefits bad actors.

Dominic Young sees this as an opportunity for publishers to push back, to differentiate themselves and to frame the debate about how AI will develop: “This is a moment for us; we can learn from our expensive mistakes of twenty years ago. It’s a pivotal moment and we shouldn’t miss it.”


You can catch James Evelegh’s regular column in the InPubWeekly newsletter, which you can register to receive.