Wikipedia is the go-to source for quick facts. Millions of people trust it every day. But what if the information you’re reading doesn’t reflect the real world? What if whole groups of people, ideas, and histories are missing, not because they’re unimportant, but because the people editing the site don’t represent them?
Who Writes Wikipedia?
The average Wikipedia editor is a man in his 30s or 40s, living in North America or Europe, with a college degree, and likely working in tech or academia. This isn’t a guess; it comes from Wikimedia Foundation surveys. In 2023, fewer than 20% of editors identified as women, and fewer than 10% lived in Africa, Latin America, or South Asia, even though those regions make up over half the world’s population.
This isn’t just about numbers. It’s about perspective. When most editors come from similar backgrounds, they naturally write about what they know. The life of a middle-class man in Berlin gets detailed coverage. The traditional healing practices of a rural community in Malawi? Often ignored, or summarized in a single paragraph with no sources.
What Gets Left Out?
Think about the articles you’ve read on Wikipedia. How many are about women scientists who published before 1950? How many cover Indigenous languages that aren’t taught in schools? How many detail the cultural impact of hip-hop in Senegal or the feminist movements in Nigeria?
Studies from the University of Oxford show that Wikipedia has significantly fewer biographies of women than of men, even when those women were pioneers in their fields. A 2022 analysis found that 87% of science-related biographies on Wikipedia were about men. The same study showed that articles on African history were, on average, 40% shorter than those on European history.
It’s not that these topics are unimportant. It’s that they’re not seen as important by the people who edit. And when something isn’t written down on Wikipedia, it’s easy to assume it never happened.
Neutral Isn’t Always Fair
Wikipedia’s neutral point of view policy sounds fair. But neutrality doesn’t mean equal representation. It means presenting all sides. What happens when one side has no voice?
Take the article on colonialism. It might mention British rule in India, French control in Algeria, and Belgian exploitation in the Congo. But if there’s no editor from those regions to add local perspectives, the article reads like a textbook written by the colonizers. Survivors’ stories, oral histories, and resistance movements often don’t make it in unless someone with access and time decides to fight for them.
Wikipedia doesn’t ban bias. It just assumes that debate will balance it out. But when one group dominates the conversation, the debate becomes one-sided by default.
How Bias Gets Locked In
Bias doesn’t disappear just because no one names it. Left unchallenged, it becomes the norm.
When a topic has few editors, it’s easier for a small group to control the narrative. One person with strong opinions can repeatedly revert edits from newcomers. New editors, especially those from marginalized groups, often face hostility: they’re told their edits are "not notable" or "original research," even when they cite local newspapers or academic journals from their own countries.
Wikipedia’s notability guidelines were designed to keep out spam. But they’re written in a way that favors Western institutions. A university professor in the U.S. publishing in a peer-reviewed journal? Notable. A community leader in Papua New Guinea publishing a book in a local language? Often not.
This creates a feedback loop: fewer editors from certain regions → fewer articles about them → less visibility → even fewer people feel encouraged to contribute.
Real Examples of Missing Stories
Here are a few real cases:
- Ada Lovelace has a detailed article. But Mary Jackson, one of NASA’s first Black female engineers, didn’t get her own page until 2016, after the movie Hidden Figures made her famous.
- Traditional Māori healing practices were removed from Wikipedia for years because editors called them "unverifiable." Only after Māori scholars cited peer-reviewed ethnographies did the content stay.
- The 1984 anti-Sikh riots in Delhi had minimal coverage for decades. When Indian editors tried to expand the article, their edits were flagged as "biased" by editors in the U.S. and U.K. who didn’t know the context.
These aren’t edge cases. They’re symptoms of a system that values certain types of knowledge over others.
Who’s Trying to Fix It?
Some groups are pushing back. Wikimedians in Africa, Latin America, and South Asia are running edit-a-thons to add content about local figures, events, and traditions. The Wikimedia Foundation has funded projects to train women editors in Nigeria, India, and Brazil. Organizations like Art + Feminism and Black Lunch Table host events to write about underrepresented artists and thinkers.
But these efforts are small compared to the scale of the problem. Most new editors drop off within a few months. The culture of Wikipedia can be intimidating. The rules are complex. The tone can be cold. And the people who’ve been there the longest often guard the gates.
One study from the University of Toronto found that women editors were twice as likely as men to have their first edit reverted, even when their edits were correct.
Why This Matters Beyond Wikipedia
Wikipedia doesn’t just report knowledge. It shapes what we think is true.
Google, Siri, and Alexa pull answers from Wikipedia. Students use it for school projects. Journalists cite it. Politicians reference it. If Wikipedia leaves out entire populations, those populations stay invisible in the digital world.
And when people realize Wikipedia doesn’t reflect them, they stop trusting it. Younger generations are already turning to TikTok, YouTube, and community blogs for stories that Wikipedia ignores. That’s not a failure of those platforms; it’s a failure of Wikipedia to evolve.
What Can You Do?
You don’t need to be a scholar or a tech expert to help fix this.
- Find a topic that’s missing (your grandmother’s recipe book, your town’s history, a local activist) and write a short article. Start with five sentences; that’s enough to get it started. If you want to check whether a topic already has an article before you begin, see the sketch after this list.
- Use reliable local sources. A university press, a newspaper archive, a community newsletter: these all count as citations.
- Don’t wait for permission. If your edit gets reverted, ask why. Most reverts are based on misunderstanding, not malice.
- Encourage others. Ask your friends, classmates, or coworkers to add one edit. One person. One article. One day.
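If you’re comfortable with a little code, here is a minimal sketch of that first step: checking whether a title already exists on English Wikipedia. It uses Python, the requests library, and Wikipedia’s public MediaWiki query API. The endpoint and parameters are real; the topic names in the example are placeholders, and you’d swap in whatever you want to check.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"
# Wikimedia asks API clients to identify themselves with a descriptive User-Agent.
HEADERS = {"User-Agent": "wiki-gap-check/0.1 (personal script)"}

def article_exists(title: str) -> bool:
    """Return True if English Wikipedia has an article at this title."""
    params = {
        "action": "query",     # standard page-info query
        "titles": title,
        "redirects": "1",      # follow redirects so alternate spellings still count
        "format": "json",
        "formatversion": "2",  # cleaner JSON: pages come back as a list
    }
    response = requests.get(API_URL, params=params, headers=HEADERS, timeout=10)
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    # Pages that don't exist are reported with a "missing" flag.
    return not pages[0].get("missing", False)

if __name__ == "__main__":
    # Placeholder titles; replace with the topics you care about.
    for topic in ["Ada Lovelace", "Mary Jackson (engineer)"]:
        status = "already exists" if article_exists(topic) else "is missing"
        print(f"'{topic}' {status} on English Wikipedia")
```

The same endpoint works for any language edition; swap en.wikipedia.org for sw.wikipedia.org, hi.wikipedia.org, and so on. A missing title in your language is exactly the kind of gap worth filling.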
Wikipedia isn’t broken. It’s unfinished. And it’s still editable.
What’s Next?
The future of Wikipedia depends on who gets to write it. If nothing changes, it will become a digital museum of 20th-century Western perspectives: detailed, polished, and deeply narrow.
If enough people from different backgrounds start editing, it can become something else: a living archive of global knowledge, shaped by the people who actually live it.
The question isn’t whether Wikipedia can be fixed. It’s whether enough of us will care to try.