Wikipedia Turns 25: How Is It Still Standing in the Age of AI?

On 15 January 2001, Wikipedia went live with a brief homepage note announcing the new site. Pew Research Center traces the earliest surviving edit on the homepage to that day. There was no launch event, and nobody knew what the site would become.

Twenty-five years on, Wikipedia feels ordinary precisely because of how often people use it. The Wikimedia Foundation says the site now attracts nearly 15 billion views every month, and Pew Research Center puts daily use at an average of about 508 million views across the past decade.

Wikipedia holds over 65 million articles in more than 300 languages, according to Wikimedia data. English accounts for around 7 million articles, with over 5 billion words in total. Pew Research Center estimates it would take one person 38 years to read them all.


Who Keeps Wikipedia Running Day After Day?


Wikipedia does not employ writers. The Wikimedia Foundation says nearly 250,000 volunteer editors make at least one edit each month. They write, revise and check entries under shared rules on sources and neutrality.

To commemorate the anniversary, the Foundation released a video docuseries following eight editors. One has spent decades documenting storms in California. Another, a medical doctor in India, shared public health information during the COVID-19 pandemic. A retired librarian in Tokyo works to strengthen Japanese language coverage.

Pew Research Center reports more than 600,000 active users across Wikipedia each month. About 45% contribute to English Wikipedia, with the rest spread across hundreds of languages.

Edits arrive constantly: Wikimedia statistics show an average of 324 edits per minute. Bots handle routine maintenance and anti-vandalism, accounting for about 40% of edits, while people handle judgement calls and sourcing.


What Changes When AI Starts Using Wikipedia?


Wikipedia now feeds machines as well as readers. The Wikimedia Foundation says its articles form one of the highest quality data sources used to train large language models.

Pew Research Center found that web crawlers and AI systems generated over 88 billion visits to Wikipedia during 2025. At the same time, the Foundation reported that human pageviews fell about 8% in October 2025 compared with the same month in 2024, linking the decline to AI summaries in search results.

Companies building AI tools rely heavily on Wikipedia content. Through Wikimedia Enterprise, organisations such as Google, Microsoft, Amazon and Meta access data in large volumes while supporting the non-profit that runs the site, according to the Foundation.

Wikipedia’s infrastructure has also changed. The Wikimedia Foundation points to new data centres, mobile apps, interface updates and dark mode as ways the site stays usable across regions and devices.

At 25, Wikipedia looks less like a project and more like shared public space. It runs quietly, corrected minute by minute, shaped by people whose names rarely appear on the pages they keep up to date.


Our Experts:


  • Peter Bilzerian, U.S. Managing Director, MPP Insights
  • Vin Mitty, PhD, Sr. Director of Data Science and AI, LegalShield/IDShield
  • Mike Wood, Founder, Legalmorning
  • Jonathan Satter, Chief Operating Officer, White Wolf Capital Group
  • Jonathan Schaeffer, CEO and Founder, Kind
  • Laurence Minsky, Professor, Columbia College Chicago
  • Babar Khan Javed, Principal, Aberrant
  • Laviet Joaquin, Head of Marketing, TP-Link
  • Josh Qian, COO and Co-Founder, LINQ Kitchen


Peter Bilzerian, U.S. Managing Director, MPP Insights

“Wikipedia is the encyclopedia of the internet; it’s peer-reviewed by humans and a consumer of data sources – it’s not a content creator itself. Large language models like ChatGPT or Gemini search Wikipedia for its content and its sources because they are credible.

“The threat isn’t that AI will replace Wikipedia – it’s that AI ‘slop’ will pollute the internet so badly that Wikipedia editors can’t rely on data integrity anymore. Imagine a hall of mirrors where AI cites a blog post that was written by another AI model – it’s just mirror images of itself.

“The existential threat to Wikipedia isn’t AI replacing it – it’s data integrity.”


Vin Mitty, PhD, Sr. Director of Data Science and AI, LegalShield/IDShield

“Wikipedia has survived the rise of AI because it builds trust out in the open.

“As we know, Wikipedia is an active network of sources, editors, discussions, corrections, and responsibility. Each statement should link to real evidence and every edit is recorded. If something is wrong, people notice and fix it. This social aspect is more important than ever.

“AI tools are quick and sure of themselves, but they don’t take responsibility. They mix up information and there is no accountability. Wikipedia is different because it makes its process visible. You can check who made changes, why something was taken out, and where the proof comes from. This openness matters a lot now that so much text is generated by machines.

“This is also why AI is a real challenge for Wikipedia. When people get answers straight from chatbots, fewer visit the original sources. With fewer readers, there may be fewer editors. That weakens the system that helps keep Wikipedia trustworthy.

“But there’s another side to this. AI actually makes Wikipedia more valuable as a reliable source. Large language models still need good, carefully checked, human-reviewed information to learn from and use. Wikipedia is still one of the best organized and most trustworthy knowledge bases online.

“Projects like Grok’s experimental “Grokipedia” hint at what may come next: AI tools that adopt Wikipedia’s ideas about sources, edit history, and community review could be useful instead of trying to take its place.

“In the end, I think Wikipedia lasts because it focuses on being responsible. In a time when AI sounds sure of itself, that focus on accountability may be its biggest strength.”


Mike Wood, Founder, Legalmorning

“I’m a digital marketer specialising in Wikipedia editing, and I see this play out constantly with clients. They’ll come to me saying, ‘ChatGPT is saying something inaccurate about me.’ When I look closer, most of the time the inaccuracy traces back to Wikipedia.

“This creates a feedback-loop problem. LLMs scrape Wikipedia to learn. Then people use those same LLMs to try editing Wikipedia. Wikipedia’s volunteer editors are now tagging thousands of articles for suspected AI-generated content because the quality is so poor: bad sourcing, hallucinated facts, promotional tone. Editors are actively at war with low-effort AI content.

“Even Grokipedia proves the point. It was launched as a ‘Wikipedia alternative,’ but what did it do? It scraped Wikipedia’s top million articles to use as the foundation for its initial content.

“The bottom line is that if Wikipedia isn’t accurate, AI won’t be accurate. That’s why the human editorial layer matters more now than it ever has. With that in mind, AI will never displace Wikipedia. Readers will still see Wikipedia content; they will just see it served through AI summaries rather than directly from the site.”

Jonathan Satter, Chief Operating Officer, White Wolf Capital Group

“I grew up with Encyclopedia Britannica as the gold standard for reliable information—it was expensive, gatekept, and updated infrequently. Wikipedia seemed radical when it launched: free, open-source, and editable by anyone. Today, it’s ubiquitous and has earned something even more valuable than Britannica ever had—dynamic trust.

“Its transparent editing processes, community oversight, and real-time updates have made it the world’s most trusted reference source. The question now is whether Wikipedia’s community-driven model can withstand the AI era, or if we’re about to see another generational shift in how humanity organizes and accesses knowledge.”


Jonathan Schaeffer, CEO and Founder, Kind

“Wikipedia is an incredible resource and an amazing accomplishment, achieved through the generous time contributions of thousands of people over several decades. But, like everything else that purports to collect, synthesize, and explain knowledge, the future of Wikipedia is under threat from AI (notably Grokipedia). In the short term, Wikipedia has little to fear; the long term is not as clear. Today’s LLMs (large language models, such as ChatGPT) have shortfalls, the two most significant of which are hallucinations and trust. LLMs make up “facts”; this isn’t a programming flaw but is fundamental to the core LLM algorithm.

“Whereas Wikipedia’s content has been carefully curated and reviewed, and there is an audit trail of all changes made to the content, the same is not true for an AI-generated equivalent. The AI scoops up data from the entire Internet. It does not understand trusted versus unreliable sources; it does not understand political sensitivities; it does not understand context, emotion, or objectivity. In response to a question, what an LLM says now may differ significantly from what it says two minutes later.

“An LLM-based equivalent of Wikipedia, today, would be rife with factual errors, insensitivities, and politically incorrect statements. Yes, Wikipedia has some errors in it. We accept and understand that humans sometimes make mistakes, but in this case there is an army of volunteers who are relentless in their pursuit of accuracy and objectivity. We hold AI to a different standard: we want the AI to be perfect. But AI’s mistakes are frequent and often high profile, and this has eroded public trust. Humans trust Wikipedia.

“It will be a long time before we will trust an AI-generated equivalent of Wikipedia. This is no different from what we see in other applications of AI. Do you trust an AI-controlled vehicle? Would you like to be treated by an AI doctor? AI is advancing at a rapid pace. Perhaps we are only a breakthrough or two away from AGI (artificial general intelligence) – or not. There is too much that is unknown right now to come up with a useful prediction for the long-term future of Wikipedia.”


Laurence Minsky, Professor, Columbia College Chicago

“At the very least, you need to ‘feed the beast.’ Just think: LLMs (large language models) constantly need to be updated, and one of the key sources for their training (i.e., updating) is Wikipedia. As long as this reliance continues, Wikipedia will stay relevant on the information-resource side, and people who care about (hopefully accurate) information will continue to update it. Then, on the other side, when Wikipedia is a source for the answer to an AI query, there’s a good chance people will click on the ‘citation’ link in what the LLMs generate when they want more information. As a result, I believe Wikipedia will not only continue to survive, but thrive.”


Babar Khan Javed, Principal, Aberrant

“The data speaks volumes: Ahrefs’ website traffic tracking tool shows Wikipedia averaging 1.3 million visitors a day, while Grokipedia draws about 45% less. Wikipedia persists like a steadfast lighthouse amid the swirling fog of algorithms and chatbots.

“The site is not just a repository of facts but a testament to the messy, collaborative essence of humanity itself. In an age where machines promise omniscience with a tap of the screen, why does this volunteer-driven encyclopedia endure? I think that, with time, the site has proven itself to be built on trust and human curation that machines have not replicated.

“Wikipedia has adapted by expanding its machine learning team after ChatGPT’s rise, allowing AI tools to assist but not dominate, and ensuring content meets rigorous quality standards through consensus and source citation. This resilience stems from a model that prizes debate over dictation, turning potential chaos into a self-correcting symphony of knowledge.

“The irreplaceable warmth of community involvement is what Wikipedia has going for it. We also need to recognise that the site is perceived to be bipartisan and grounded in truth, while LLMs can be influenced by the political leanings of the era. We know that the entire Mag-7 leans Trump and will ensure their LLMs repeat his agenda, while refusing to use links and news sources the POTUS despises.”


Laviet Joaquin, Head of Marketing, TP-Link

“Wikipedia has survived the age of AI because people still care where information comes from, not just how fast they get it. In everyday use, Wikipedia offers something AI answers don’t consistently provide: visible sources, community review, and accountability you can trace back to real humans, not black boxes.

“AI becomes a threat when speed is mistaken for truth, but it doesn’t replace Wikipedia’s role as a reference layer where facts are checked, debated, and corrected over time. AI-driven knowledge bases like Grokipedia generate answers; Wikipedia documents consensus, and those serve fundamentally different needs.

“In an AI world, trust isn’t built by confidence; it’s built by sources, transparency, and human responsibility.”


Josh Qian, COO and Co-Founder, LINQ Kitchen

“In today’s era of AI, Wikipedia is thriving as a source of valuable knowledge thanks to its strong base of community involvement and its focus on accuracy. While AI-generated content may contain limited amounts of data or be unreliable at times, the diversity of knowledgeable contributors is the backbone of Wikipedia.

“The community involvement aspect of Wikipedia creates an environment in which content is scrutinised by many people; this leads to higher quality and reliability in each article. Contributors with different backgrounds and perspectives provide context and references, and update the content so the reader receives the most accurate and up-to-date information possible.

“The continued success of Wikipedia is also attributed to its provision of detailed, complete narratives on the topic at hand, whereas AI usually provides only small pieces of information. Providing readers with a complete narrative of the subject matter enhances their learning experience and deepens their understanding.

“The consistent reading and writing style used on Wikipedia also makes the information easier for readers to understand. This structured approach to presenting information increases accessibility and encourages readers and contributors to engage with each other.

“However, the increasing use of AI presents some challenges. One major challenge posed by AI is its ability to rapidly generate articles and answers from large databases. This can cause users to turn away from Wikipedia in search of the information they need.

“As long as Wikipedia maintains its collaborative model and continues to prioritise the accuracy of the information it presents, it is expected to remain relevant and a trusted source for users in an increasingly digital society.”