Recognizing Implicit Bias in Wikipedia Image Selection

Wikipedia is supposed to be the free encyclopedia anyone can edit. But if you look closely at the images that show up next to articles, especially articles about women, people of color, or other marginalized groups, you’ll notice something off. Why do so many articles about Black scientists use photos of generic lab equipment instead of the scientist? Why do articles about female politicians often show them in formal portraits while men get action shots at conferences? These aren’t accidents. They’re the result of implicit bias in image selection.

How Images Shape What We Believe

People don’t just read Wikipedia; they look at it. A 2021 study from the University of California found that readers remember information better when it’s paired with a relevant image. But when the image is wrong, misleading, or missing, it doesn’t just confuse; it distorts. An article about Frida Kahlo that shows a generic Mexican folk art piece instead of one of her self-portraits doesn’t just miss the mark; it erases her identity. That’s not neutrality. That’s bias by omission.

Wikipedia’s image policy says images should be “relevant, representative, and non-promotional.” Sounds fair. But who decides what counts as representative? In practice, it’s the volunteers who upload and tag images. And those volunteers are mostly white, male, and from Western countries. A 2023 survey of active Wikipedia editors showed that over 85% identified as male, and nearly 70% lived in North America or Europe. That’s not a random sample. It’s a lens, and it colors every image you see.

The Hidden Patterns in Image Selection

Look at articles about women in STEM. A 2022 analysis by the WikiProject Women in Red found that only 18% of biographical articles about women scientists had a photo of the person. For men? It was 74%. The rest? Stock photos of microscopes, chalkboards, or generic lab coats. Those aren’t neutral. They’re stereotypes dressed up as documentation.

The same pattern shows up with race. Articles about Indigenous leaders often use historical photos from the 1800s, even when the person is alive today. Articles about Black activists rarely show them in modern settings. Meanwhile, white male politicians get recent, high-resolution photos of them speaking at podiums, shaking hands, or walking confidently down streets.

Even geography plays a role. Articles about cities in Africa or Southeast Asia often use images of poverty, markets, or wildlife rather than modern architecture, tech hubs, or cultural festivals. Why? Because those are the images that Western editors associate with those places. It’s not about what’s true. It’s about what’s familiar.

[Image: Volunteers in Nigeria photographing local writers to upload better images to Wikipedia.]

Why This Isn’t Just About Photos

Wikipedia doesn’t just reflect bias; it amplifies it. When someone searches for “first female president of Brazil” and the top result shows a photo of a man in a suit, that’s not just inaccurate. It’s harmful. It reinforces the idea that leadership is male. When you search for “Islamic architecture” and all the images are from the Middle East while photos of mosques in Indonesia, Senegal, or Bosnia are buried, you get a distorted view of global culture.

These aren’t fringe cases. They’re systemic. A 2024 study from the University of Oxford tracked 10,000 Wikipedia articles across 12 languages and found that articles about people of color were 3.2 times more likely to have no photo at all. When they did have one, it was 57% more likely to be outdated, low-quality, or unrelated.
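
You can spot-check this kind of gap yourself without any special tooling. The sketch below is a minimal example against the standard MediaWiki API; the function name and the sample titles are just illustrative. It uses prop=pageimages to ask English Wikipedia whether each article has a lead image at all.

```python
import requests

API_URL = "https://en.wikipedia.org/w/api.php"

def lead_image_report(titles):
    """Return {article title: lead image file name or None} for up to 50 titles."""
    params = {
        "action": "query",
        "prop": "pageimages",
        "piprop": "name",              # just the lead image's file name, if one is set
        "titles": "|".join(titles),
        "redirects": 1,
        "format": "json",
    }
    resp = requests.get(API_URL, params=params,
                        headers={"User-Agent": "image-gap-check/0.1 (example script)"})
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    # The response is keyed by page ID; a page with no lead image simply lacks "pageimage".
    return {p["title"]: p.get("pageimage") for p in pages.values()}

# Illustrative sample; any list of biography titles works.
for title, image in lead_image_report(["Katherine Johnson", "Alan Turing"]).items():
    print(f"{title}: {image or 'no lead image'}")
```

Looped over a larger sample of biographies, a check like this is enough to reproduce the kind of missing-image counts the studies above describe.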

This affects real-world perception. Teachers use Wikipedia in classrooms. Journalists cite it. Students trust it. When the visual record is skewed, it becomes part of the collective memory. And memory, once formed, is hard to change.

What’s Being Done About It?

There are efforts to fix this. WikiProject Women in Red has been adding biographies and photos of women since 2015. They’ve created over 100,000 articles, but photos lag behind. The WikiProject on Black Lives Matter has pushed for better representation of Black figures, but many of their suggested images get rejected by other editors as “not encyclopedic enough.”

Some editors use tools like WikiShootMe, a map-based tool that highlights nearby subjects missing photos so volunteers can photograph them and upload the results. Others tag images on Wikimedia Commons with categories and metadata such as “person of color,” “female,” or “modern.” But these tools aren’t widely known or used. Most editors still fall back on old public domain archives or whatever a quick web search turns up, and those sources tend to reflect outdated norms.

Wikipedia’s own guidelines don’t require editors to check for bias in images. There’s no checklist. No training. No penalty for using a 1950s photo of a living person. The system assumes neutrality if no one complains. But silence doesn’t mean fairness.


How You Can Help

You don’t need to be an expert to fix this. Here’s what you can do right now:

  1. When you edit an article about a person, check: Is there a photo? Is it of the actual person? Is it recent and high-quality?
  2. If the photo is missing or wrong, search Wikimedia Commons for a better one. Use filters like “person,” “portrait,” and “license: free to use” (see the search sketch after this list).
  3. If you find a good image but it’s not tagged correctly, add the right categories: “Female scientists,” “Black activists,” “Indigenous leaders.”
  4. Upload your own photos. If you’ve taken a picture of a local scientist, artist, or community leader, upload it to Commons with a clear description and license.
  5. When you see a questionable image, don’t just delete it; replace it. Leave a note on the talk page explaining why the change matters.
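
For step 2, the same public API that powers Wikipedia also powers Wikimedia Commons search, so you can look for candidate portraits programmatically rather than paging through the web interface. A minimal sketch, assuming the public Commons endpoint; the function name and search phrase are just placeholders.

```python
import requests

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def find_candidate_images(search_phrase, limit=10):
    """Search the Commons File: namespace and return matching file titles."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": search_phrase,   # e.g. a person's name plus "portrait"
        "srnamespace": 6,            # namespace 6 is File:, i.e. images and other media
        "srlimit": limit,
        "format": "json",
    }
    resp = requests.get(COMMONS_API, params=params,
                        headers={"User-Agent": "portrait-finder/0.1 (example script)"})
    resp.raise_for_status()
    return [hit["title"] for hit in resp.json()["query"]["search"]]

# Placeholder query: swap in the person whose article is missing a photo.
for file_title in find_candidate_images("Chimamanda Ngozi Adichie portrait"):
    print(file_title)
```

Everything on Commons is supposed to carry a free license, but it’s still worth opening each result to confirm the license and that the file actually depicts the right person before adding it to an article.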

Small actions add up. In 2023, a single editor in Nigeria uploaded 400 photos of contemporary African writers. Within months, those images started appearing in articles across multiple language versions of Wikipedia. That’s the power of one person correcting a bias.

The Bigger Picture

Wikipedia is one of the most visited websites in the world. It’s a mirror of human knowledge. But if the mirror is cracked, if it reflects only some faces, only some stories, only some versions of truth, then it doesn’t just misinform. It marginalizes.

Neutrality isn’t about avoiding opinions. It’s about including all perspectives. And that includes the visual ones. The next time you see a Wikipedia article with no photo of the person it’s about, ask yourself: Why? Who decided this image was good enough? And whose story is being left out?

Fixing image bias isn’t about political correctness. It’s about accuracy. It’s about justice. And it’s about making sure the world’s largest encyclopedia actually looks like the world.

How can I tell if a Wikipedia image is biased?

Look for patterns: Is the photo of the actual person, or a generic substitute? Is it outdated, low quality, or unrelated to their work? Are people of color or women more likely to have missing or stock photos compared to white men? If you notice the same image used across multiple articles about different people, that’s a red flag. Check the image’s upload date and source; many come from old public domain archives that reflect outdated stereotypes.
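
Checking an image’s upload date and source doesn’t require clicking through the file page by hand; the Commons API returns that provenance directly. A small sketch, again assuming the public endpoint; the file name below is a placeholder for whichever image you’re checking.

```python
import requests

COMMONS_API = "https://commons.wikimedia.org/w/api.php"

def image_provenance(file_title):
    """Return upload date, uploader, and license details for a Commons file."""
    params = {
        "action": "query",
        "titles": file_title,          # e.g. "File:Some portrait.jpg"
        "prop": "imageinfo",
        "iiprop": "timestamp|user|url|extmetadata",
        "format": "json",
    }
    resp = requests.get(COMMONS_API, params=params,
                        headers={"User-Agent": "provenance-check/0.1 (example script)"})
    resp.raise_for_status()
    page = next(iter(resp.json()["query"]["pages"].values()))
    if "imageinfo" not in page:        # file does not exist on Commons
        return None
    info = page["imageinfo"][0]
    meta = info.get("extmetadata", {})
    return {
        "uploaded": info["timestamp"],                               # when it reached Commons
        "uploader": info["user"],
        "license": meta.get("LicenseShortName", {}).get("value"),    # e.g. "CC BY-SA 4.0"
        "date_taken": meta.get("DateTimeOriginal", {}).get("value"), # often far older than the upload
    }

print(image_provenance("File:Example portrait.jpg"))  # placeholder file name
```

If the date taken is decades old for a living subject, or the source is an anonymous archive scan, that’s exactly the kind of image worth questioning and, where possible, replacing.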

Why doesn’t Wikipedia have a policy to prevent image bias?

Wikipedia has general policies on neutrality and reliable sourcing, but no specific rule for image selection bias. This is because the platform relies on community consensus, and many editors don’t recognize image bias as a problem. There’s no training on visual representation, and no automated tools flag potentially biased images. Changing this requires more editors to speak up and push for formal guidelines.

Can I upload my own photos to Wikipedia?

Yes, if you own the rights to the photo and release it under a free license like Creative Commons. You can upload through Wikimedia Commons, tag it with relevant categories, and link it to the article. Many editors are actively looking for new, high-quality images of underrepresented people. Your photo could be the one that finally gives a real face to a forgotten name.

Do other encyclopedias have the same problem?

Yes, but Wikipedia is the most visible because it’s free and open. Commercial encyclopedias like Britannica also struggle with representation, but their editorial teams are smaller and more centralized, so changes happen faster, though often less transparently. Wikipedia’s openness means the bias is visible, which is why it’s also easier to fix, with enough people paying attention.

What’s the most common excuse editors give for using biased images?

“There’s no good photo available.” That’s often untrue. It usually means the editor didn’t look hard enough, or didn’t know where to search. Many assume public domain means “old” or “generic,” but there are thousands of high-quality, freely licensed photos of modern people on Wikimedia Commons, just not always easy to find. A quick search using filters like “portrait,” “modern,” or “person of color” often turns up better options.