
Watchdog warns editors to be ‘vigilant’ in use of robots

Editors have been warned to be “vigilant” in their use of artificial intelligence as the press watchdog considers how to deal with the issue.

The Independent Press Standards Organisation has issued the warning while the industry experiments with the use of AI, confirming that editors “are accountable for their reporting and editorial standards, regardless of the technology they are using”.

It comes after an external review into IPSO’s work recommended that the watchdog develops industry standards on the use of robots in journalism.

The review by former senior civil servant Sir Bill Jeffrey warned of “risks” being posed by the use of AI-driven systems in reporting.

In a statement published on Twitter, IPSO has now confirmed it is specifically looking at how to address the issue.

Chief executive Charlotte Dewar, pictured, said: “We are all grappling with the potentially enormous implications of artificial intelligence for journalism.

“At IPSO we are developing our thinking on this issue and how it relates to our regulation, but we know that editorial responsibility will remain a core principle: editors are accountable for their reporting and editorial standards, regardless of the technology they are using.

“They need to be vigilant to ensure their content adheres to the Editors’ Code of Practice.”

As reported on Friday, Reading Today editor Phil Creighton issued a warning about the use of AI after a robot produced inaccurate information and cited “fictitious” sources in an experiment in which it was asked to write NIBs for his newspaper.

Publishing giant Reach announced earlier this year it was experimenting with AI technology, while Newsquest has recently created a new role with a specific remit to expand its use – including using it to create local content.