Wikipedia launches edit-checking artificial intelligence

The new AI will check to see if an edit is “damaging”

Wikipedia has launched a tool designed to automatically highlight low-quality edits to articles.

The Objective Revision Evaluation Service (ORES) software has been trained by Wikipedia editors to recognise the quality of an edit based on the language and context of the change.

There are about half a million changes to Wikipedia articles every day.

Editors and ordinary users will now be able to quickly check how likely it is that a proposed alteration is “damaging”.

“This allows editors to triage them from the torrent of new edits and review them with increased scrutiny,” the Wikimedia Foundation said in a blog post.
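ORES exposes its scores through a public web API, so a reviewer or a bot can ask for the “damaging” probability of a specific revision. The sketch below shows what such a query might look like; the endpoint URL, response layout and revision ID are assumptions for illustration, not details taken from the foundation’s announcement.

```python
# Minimal sketch: ask ORES how likely a given English Wikipedia revision is "damaging".
# Assumes the public scoring endpoint at https://ores.wikimedia.org and its v3 response
# shape; the revision ID is a hypothetical example.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"


def damaging_probability(rev_id: int) -> float:
    """Return the model's estimated probability that the revision is damaging."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    score = data["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]


if __name__ == "__main__":
    rev = 123456789  # hypothetical revision ID
    print(f"P(damaging) for revision {rev}: {damaging_probability(rev):.2f}")
```

An editor or patrolling bot could then sort incoming edits by this probability and give the highest-scoring ones closer scrutiny, which is the triage workflow the foundation describes.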

Quality control

Other projects to engage artificial intelligence (AI) in the task of evaluating Wikipedia edits have not always been well received.

Some, for instance, have automatically downgraded the input of new editors, which has been seen as problematic for well-intentioned newcomers.

ORES aims to get around this by judging an alteration purely on its content, rather than on who made it.

“The thing to note is it doesn’t judge whether the facts that people are adding are actually true, because fact-checking is immensely difficult, it’s looking at the quality,” said Prof John Carroll, a computational linguist at the University of Sussex.

“It should help a great deal with Wikipedia,” he added.

Prof Carroll’s own start-up, iLexir, provides software to automatically check the quality of written English in essays by foreign language students.