Think Tanks and Wikipedia: How Expert Groups Shape Public Knowledge
When you look up a policy, a scientist, or a global issue on Wikipedia, you’re not just seeing facts; you’re seeing the result of decades of influence from think tanks: organizations that research and promote specific policy ideas, often shaping public discourse and media narratives. Also known as policy institutes, these groups don’t edit Wikipedia directly, but their reports, funding, and public statements become the sources editors rely on to build articles. Think tanks like the Brookings Institution, the RAND Corporation, or the Heritage Foundation don’t just write reports; they create the foundation for what Wikipedia considers "reliable" and "notable." If a think tank publishes a study on climate policy, that study becomes a source for Wikipedia articles on global warming, carbon taxes, or renewable energy. But here’s the catch: not all think tanks are equal. Some are transparent, peer-reviewed, and nonpartisan. Others are funded by corporations or political groups with clear agendas. Wikipedia’s editors have to sort through this noise every single day.
That’s why Wikipedia’s reliance on written sources often mirrors the priorities of these institutions. If a think tank’s report gets picked up by major media outlets, it quickly becomes a go-to source for Wikipedia editors. This creates a feedback loop: think tanks influence media, media influences Wikipedia, and Wikipedia becomes the public’s default reference. Meanwhile, smaller, independent researchers and grassroots organizations often get left out because they lack the resources to get their work cited in high-profile outlets. This isn’t bias by design; it’s bias by structure. And it’s why Wikipedia’s coverage of topics like economic inequality, indigenous rights, or healthcare reform often reflects the interests of powerful institutions rather than the full range of human knowledge. The Wikimedia Foundation, the nonprofit that supports Wikipedia and its sister projects by funding infrastructure and advocating for free knowledge, has tried to address this by encouraging edits from underrepresented communities, but the system still favors well-funded, media-savvy voices.
Wikipedia’s editorial standards, like verifiability and no original research, were meant to protect accuracy. But they also lock out knowledge that doesn’t fit the mold: oral histories, community-led research, or local data that never makes it into academic journals. Meanwhile, media influence, the way news outlets amplify or ignore certain ideas, becomes a silent gatekeeper that often determines what ends up as Wikipedia content. When a think tank’s study gets a headline in The New York Times, it’s almost guaranteed to show up on Wikipedia within hours. When a community group publishes the same data on their blog? It might never be seen. This isn’t just about sources; it’s about power. And understanding that connection is the first step toward making Wikipedia more honest, more complete, and more fair.
Below, you’ll find a collection of articles that dig into how Wikipedia’s rules, tools, and community practices respond to this reality. From how edit filters protect news articles from manipulation, to how journalists and academics navigate Wikipedia’s policies, these posts show the real-world impact of who gets to define what’s true—and who gets left out.
How to Evaluate Think Tanks and Advocacy Groups as Sources on Wikipedia
Learn how to evaluate think tanks and advocacy groups as sources on Wikipedia. Discover red flags, verification methods, and when these sources are acceptable, and when they are not.