Wikipedia isn’t replacing its human editors with artificial intelligence just yet — but it is giving them a bit of an AI boost. On Wednesday, the Wikimedia Foundation, the nonprofit that runs Wikipedia, announced that it is integrating generative AI into its editing process to help its largely unpaid volunteer community of moderators, editors, and patrollers reduce their workload and focus more on quality control.
In a statement, Chris Albon, the foundation’s Director of Machine Learning, emphasized that he did not want AI to replace human editors or end up generating Wikipedia’s content. Rather, AI would be used to “remove technical barriers” and “tedious tasks” that impede editors’ workflow, such as background research, translation, and onboarding new volunteers. The hope, he said, was to give editors the bandwidth to spend more time on deliberation and less on technical support. “We will take a human-centered approach and will prioritize human agency; we will prioritize using open-source or open-weight AI; we will prioritize transparency; and we will take a nuanced approach to multilinguality,” he wrote.
But the amount of information and content in the world is rapidly outpacing the number of active volunteers able to moderate it, and Wikipedia faces a future in which AI could, in effect, eat it alive. Earlier this month, the Wikimedia Foundation announced a new initiative to create an open-access dataset of “structured Wikipedia content” — that is, a copy of Wikipedia content optimized specifically for machine learning — with the aim of keeping the bots off the site meant for human browsing. In recent years, the number of AI bots scraping the site has grown so drastically that bot traffic has put a strain on Wikipedia’s servers and increased bandwidth consumption by 50 percent.