The Wikipedia Monument in Słubice, Poland. Credit: Joker/Karl-Heinz Hick/ullstein bild/Getty
Wikipedia is 25. The world’s largest encyclopedia celebrated its birthday on 15 January. That is a remarkable milestone: not just because of the website’s longevity and enduring relevance, but also because it has retained its founding values. These are worth reiterating: Wikipedia is free to use, is extremely participatory and aims for a high degree of transparency in its content. Its roughly 65 million (and counting) entries cite sources that users can check for themselves. “Everything about Wikipedia is a worship of expertise,” its co-founder Jimmy Wales said in an interview with Nature on 12 January (Nature 649, 549–551; 2026).
That commitment to evidence and transparency is worth celebrating and supporting in every way possible, especially given current threats to the integrity of knowledge. As social-media algorithms fuel extreme opinions and the Internet is flooded with low-quality information, increasingly generated with the aid of artificial-intelligence technologies, the world needs Wikipedia as much as it did at the time of the platform’s creation, if not more.
The start of the twenty-first century was a time of huge optimism in the web’s power and potential to share knowledge and bring information to vast numbers of people. For the first time in recorded history, a technology could bring the sum total of human learning to anyone — and an Internet connection was the only requirement to access it. People in businesses, homes, public libraries, schools and universities could see what was previously accessible to only a relative few. Search engines such as AltaVista, Yahoo and Google were some of the main routes to discovering knowledge. So was the technology company Amazon, as an online retailer of books. And there was Wikipedia.
As viewed on a screen, the site’s homepage still consists mainly of text. But that simplicity is deceptive. The page includes links to each of the more than 300 language editions. And, not unlike Google’s clutter-free homepage, a search bar takes you into a world of millions of article pages. The English version alone has some 500 new entries added every day. And that does not include Wiktionary (a multilingual dictionary and thesaurus), Wikidata (a source of open data) and Wikimedia Commons (a repository of free-to-use image, audio and video files).
The Wikimedia Foundation — a non-profit organization in San Francisco, California, that runs Wikipedia — employs around 700 people. However, most of the project’s nearly 280,000 editors (those who edit pages regularly) are volunteers. To accommodate such a collaborative effort, Wikipedia has had to develop and adapt its working practices, which are summarized in five ‘pillars’. Most famous is Wikipedia’s ‘neutral point of view’. This principle requires editors to be accurate and fair. The entry on climate change, for instance, states clearly that the modern increase in global temperatures is caused by human activities, but it also discusses the misinformation that informs climate denial. This pillar encompasses another important principle: verifiability. Each fact or claim should be attributed to reliable, published sources. Wikipedia itself cannot be a source.
Each entry also has an associated ‘talk’ page, on which users can read the discussion between editors and trace the evolution of the article’s content. This means that an article is part of an ongoing conversation, not the final word. Editors have diverse backgrounds and views, and there is plenty of disagreement; one page even describes criticisms of Wikipedia itself. Wikipedia insists that editors treat each other with respect and civility. According to one study, this approach to publishing produces high-quality information (F. Shi et al. Nature Hum. Behav. 3, 329–336; 2019).
Many of Wikipedia’s processes correspond to how the acquisition, refinement and communication of scientific knowledge should work. That’s why we encourage members of the research community who are not already involved to consider participating in the Wikipedia effort. Editors who are fair and civil, and who know how to weigh up evidence, are sorely needed as communication on social-media platforms and through other media outlets becomes increasingly polarized, and as evidence is presented with ever less transparency.
By placing AI-generated summaries above links to Wikipedia entries in their results, search engines are making the website less visible than it once was. This development could present an existential threat to the platform. Many of the AI models behind these search results have been trained on Wikipedia entries, generally without acknowledgement or compensation. Earlier this month, the Wikimedia Foundation announced that it is working with tech companies to ensure that its content is used responsibly, and that it is compensated. Wikipedia relies mainly on small donations from around eight million donors a year. “They’re not donating in order to subsidize these huge AI companies,” Wales has said. Wikipedia is, by any measure, a remarkable achievement. It deserves our support in all ways possible.
