Wikipedia is often a go-to site for Internet research. But while many readers may take every statement on it as fact, it's important to remember that anyone can add to or edit the online encyclopedia, which means errors sometimes slip in.

To keep incorrect edits and fake information out of articles, Wikipedia recently started using machine learning software trained to tell the difference between an honest mistake and Internet vandalism.

Built by Aaron Halfaker, senior research scientist at the Wikimedia Foundation, the artificial intelligence engine looks for certain words, variants of those words, and particular keyboard patterns to detect cases of vandalism on the site.
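To make that concrete, here is a toy sketch (not Halfaker's actual model) of the kinds of surface signals such a system might extract from the text an edit adds. The word list and regular expressions below are invented for illustration; the real service relies on far richer features and a trained classifier.

```python
import re

# Toy feature extractor in the spirit of the signals described above.
# These patterns are invented for this sketch, not taken from ORES.
BAD_WORDS = re.compile(r"\b(?:stupid|dumb|sucks)\b", re.IGNORECASE)
REPEATED_CHARS = re.compile(r"(.)\1{4,}")            # e.g. "sooooo"
KEYBOARD_MASH = re.compile(r"\b[asdfghjkl]{6,}\b", re.IGNORECASE)

def vandalism_signals(added_text: str) -> dict:
    """Return simple boolean signals computed from text an edit added."""
    return {
        "bad_word": bool(BAD_WORDS.search(added_text)),
        "repeated_chars": bool(REPEATED_CHARS.search(added_text)),
        "keyboard_mash": bool(KEYBOARD_MASH.search(added_text)),
    }

print(vandalism_signals("this article is sooooo dumb asdfasdf"))
# {'bad_word': True, 'repeated_chars': True, 'keyboard_mash': True}
```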

Because the AI can tell when a writer has made a malicious edit, the site can crack down on those specific vandals without punishing or scaring away other editors.

This is important because the number of active contributors on Wikipedia's English version has fallen by 40 percent over the past eight years. A study by Halfaker and other researchers found that one of the main reasons writers shied away from contributing was the site's strict set of rules for newcomers, meant to prevent vandalism in articles, which made it hard for them to become regular editors.

While many people might worry that AI will replace the jobs of editors on the site (Wikipedia crowdsources its articles, after all), Halfaker's AI is intended to increase the number of contributing editors.

Dubbed the Objective Revision Evaluation Service (ORES), the AI will make Wikipedia friendlier to new editors by acting as an editor's assistant. Previously, editors would receive an automated message saying they were not allowed to make a certain change, or find that their contribution had been erased without explanation because they had unknowingly broken one of the site's rules.
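ORES is built as a web service that other tools can query, so a bot or editing interface can ask it how likely a given revision is to be damaging. As a rough sketch, the snippet below calls the public ORES scoring endpoint; the exact URL, model name ("damaging"), and response layout here follow ORES's documented v3 interface but should be treated as assumptions rather than guaranteed details.

```python
import requests

# Public ORES scoring endpoint for English Wikipedia (assumption:
# reflects the v3 interface as publicly documented).
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki"

def score_revision(rev_id: int) -> dict:
    """Ask ORES how likely a single revision is to be damaging."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    # The response nests as: wiki -> "scores" -> revid -> model -> "score".
    return resp.json()["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]

score = score_revision(123456)  # placeholder revision ID
print(score["prediction"], score["probability"])
# e.g.: False {'false': 0.93, 'true': 0.07}  -> likely a good-faith edit
```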

Now editors can use the AI service to review recent edits and undo any made in bad faith with a single click. The software can also help editors identify, and then address, mistakes both innocent and malicious. It also features a tool that, when an editor is about to undo a change, offers to send a message to the person who originally made it.
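Putting the pieces together, a patrolling tool in the spirit described above might pull recent edits from the standard MediaWiki API and surface those with a high ORES "damaging" probability, so a human reviews the riskiest few instead of everything. The endpoints below match public documentation, but the threshold and overall workflow are illustrative assumptions.

```python
import requests

MEDIAWIKI_API = "https://en.wikipedia.org/w/api.php"
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki"
THRESHOLD = 0.8  # arbitrary cutoff chosen for this sketch

# Fetch the latest edits via the standard MediaWiki API.
changes = requests.get(MEDIAWIKI_API, params={
    "action": "query",
    "list": "recentchanges",
    "rctype": "edit",
    "rcprop": "ids|title|user",
    "rclimit": 25,
    "format": "json",
}, timeout=10).json()["query"]["recentchanges"]

# Score all revisions in one batched ORES request (IDs joined by "|").
revids = "|".join(str(c["revid"]) for c in changes)
scores = requests.get(ORES_URL, params={
    "models": "damaging",
    "revids": revids,
}, timeout=10).json()["enwiki"]["scores"]

# Surface the edits the model considers likely damaging, so a human
# patroller reviews the riskiest ones first instead of every edit.
for change in changes:
    result = scores[str(change["revid"])]["damaging"].get("score")
    if result and result["probability"]["true"] > THRESHOLD:
        print(change["title"], change["user"], result["probability"]["true"])
```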

The goal of ORES is to get more editors contributing and to relax the rules on newbies, while still preventing vandalism on the site.

Source: Wired

Photo: Pedro Verdugo | Flickr
