Wikinews isn’t run by a company. It doesn’t have editors in a newsroom. It doesn’t pay its writers. Yet every day, dozens of news stories get published: fact-checked, sourced, and polished by people who log in just because they care. How does that even work? The answer lies in its community governance: a quiet, structured system of volunteers who take on specific roles to keep the site honest, accurate, and alive.
Who Runs Wikinews?
Wikinews runs on volunteers. Not just anyone can publish a story. There’s a process. And that process is enforced by a small group of trusted users with special permissions. These aren’t hired staff. They’re regular people (teachers, students, retirees, freelancers) who’ve earned the trust of the community over time.
At the top are the administrators. They’re the ones who can delete pages, block users, and protect articles from edits. But here’s the catch: they don’t decide what gets published. Their job isn’t to censor or control content. It’s to handle disruptions. If someone keeps adding false claims to a breaking news story, an admin steps in to block them. If a page gets vandalized, they restore it. They’re the maintenance crew, not the editors.
The Reviewers: Gatekeepers of Accuracy
Behind the scenes, there’s another group: reviewers. These users have been granted the ability to mark articles as "reviewed." That’s a big deal. Until an article is reviewed, it won’t appear on the main page. It won’t be featured in newsletters. It won’t get picked up by other Wikimedia projects.
Reviewers don’t write the stories. They check them. They look for original reporting. They verify sources. They make sure claims are attributed properly. They check for neutrality. A reviewer might spend 20 minutes on a 300-word article just to confirm that every fact has a link to a reliable publication, such as a government press release, a major newspaper, or an official transcript.
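The review pass described above can be sketched as a simple checklist. This is purely illustrative: real Wikinews review is human judgment, not software, and every field and function name here is a hypothetical stand-in.

```python
# Illustrative sketch only. Wikinews review is a human judgment call;
# this just encodes the three checks described above. All field names
# ("sources", "claims", "flagged_for_bias") are hypothetical.

def passes_basic_review(article: dict) -> bool:
    """Return True if a draft clears the minimum checks a reviewer
    applies before marking it 'reviewed'."""
    # The article must cite at least one published source.
    has_sources = len(article.get("sources", [])) > 0
    # Every factual claim should point at a source of its own.
    all_claims_sourced = all(
        claim.get("source") is not None
        for claim in article.get("claims", [])
    )
    # Tone must be neutral (here, simply "not flagged for bias").
    neutral_tone = not article.get("flagged_for_bias", False)
    return has_sources and all_claims_sourced and neutral_tone

draft = {
    "sources": ["https://example.gov/press-release"],
    "claims": [
        {"text": "Senator X stated the law will reduce grant funding.",
         "source": "https://example.gov/press-release"},
    ],
    "flagged_for_bias": False,
}
print(passes_basic_review(draft))  # True
```

The point of the sketch is that each check is independent: a single unsourced claim or a single bias flag is enough to keep a draft off the main page.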
There’s no formal training. No checklist you memorize. But over time, reviewers develop a gut sense. They know when a quote is pulled out of context. They spot when a blog post is being passed off as journalism. They’ve seen the same mistakes made over and over (misattributed statements, unverified rumors, biased language) and they’ve learned how to fix them.
The Writers: Volunteers With Standards
Anyone can write for Wikinews. But if you want your story to get published, you have to follow the rules. The site has a strict policy: no original analysis. No opinion. No speculation. Only reporting based on published, verifiable sources.
That means if you want to write about a new law passed in Congress, you can’t say, "This law will hurt small businesses." You can say, "Senator X stated in a press conference that the law will reduce funding for small business grants." And you must link to the video or transcript of that statement.
Writers often start by submitting drafts to the "Drafts" namespace. There, other volunteers (sometimes reviewers, sometimes just experienced editors) leave comments. "This paragraph needs a source." "This headline is sensational." "The third quote is from a partisan blog. Replace it."
It’s not personal. It’s not mean. It’s just how the system works. You learn fast. Most new writers get their first article published within a week. The ones who stick around? They become reviewers. Then, sometimes, admins.
How Decisions Are Made
There’s no boss. No CEO. No board of directors. So how do they decide what’s important enough to cover? Through discussion.
Every major change, whether a new policy, a new guideline, or even how to handle a controversial topic, is debated on the project’s talk pages. These aren’t hidden forums. They’re public, archived, and searchable. Anyone can join. You don’t need special rights. Just a free Wikimedia account.
For example, in 2023, there was a months-long debate about whether to cover social media posts from public figures as primary sources. Some argued Twitter/X threads were breaking news. Others said they were unverified rumors. The community voted. The rule changed: social media posts can be cited, but only if confirmed by a traditional source.
That’s governance in action. No one has final say. But the people who show up, who read the discussions, who vote and argue and compromise, are the ones who shape the site.
What Happens When Things Go Wrong
It’s not perfect. Mistakes happen. In 2022, a false report about a political figure’s death was published. It was quickly retracted, but it spread to other sites. The community didn’t ignore it. They held a public review. They updated their guidelines. They added a new step: all obituary-style reports must be confirmed by at least two independent sources before review.
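A rule like the one the community added, requiring two independent sources for obituary-style reports, is easy to state precisely. The sketch below is a hypothetical encoding of that guideline, not anything Wikinews software actually enforces; judging whether two sources are truly "independent" remains a human call.

```python
# Hypothetical encoding of the post-2022 guideline described above:
# obituary-style reports need at least two independent confirmations
# before they are even eligible for review.

def ready_for_review(report_type: str, sources: list[str]) -> bool:
    # Crude de-duplication stands in for the human judgment of
    # whether two sources are genuinely independent of each other.
    independent = set(sources)
    if report_type == "obituary":
        return len(independent) >= 2
    return len(independent) >= 1

print(ready_for_review("obituary", ["reuters.example/a"]))
# False: one source is not enough for a death report
print(ready_for_review("obituary", ["reuters.example/a", "ap.example/b"]))
# True: two independent confirmations
```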
When a user breaks the rules-spamming, lying, harassing others-they get blocked. Not by one person. Usually, after three or more community members report the behavior, an admin reviews the logs. If it’s clear, the block goes through. If there’s doubt? The community discusses it. Sometimes, users are unbanned after a cooling-off period. Sometimes, they’re not.
The system isn’t fast. It’s not flashy. But it’s transparent. And that’s the point.
Why This Model Works
Most news sites rely on hierarchy: editors, reporters, fact-checkers, lawyers. Wikinews flips that. Everyone is equal. Everyone can contribute. But trust is earned. And trust is what keeps the system from collapsing.
Think of it like a neighborhood watch. No one gets paid. But people show up because they care about their block. If someone starts breaking windows, others notice. If someone starts spreading rumors, others correct them. The system works because it’s visible, accountable, and rooted in shared values.
Wikinews doesn’t have the reach of Reuters or the AP. But it has something they don’t: radical transparency. You can see every edit. Every debate. Every vote. You can trace a story from its first draft to its final published version, and see every change along the way.
How to Get Involved
Want to help? You don’t need to be a journalist. You don’t need a degree. You just need to care about truth.
- Start by reading a few recent articles. Notice how they’re written.
- Find an article that’s marked "Needs Review" and leave a comment. Did the source link work? Is the tone neutral?
- Write a short draft. Submit it. Don’t worry if it’s rejected. Learn from the feedback.
- Join a discussion on the talk page. Say something. Ask a question.
Most reviewers started as writers. Most admins started as reviewers. It’s not about titles. It’s about showing up, consistently, with care.
Can anyone become an admin on Wikinews?
Yes, but not easily. Anyone can nominate themselves, but the community votes. To be elected, you need to have been active for at least six months, have made over 100 meaningful edits, and demonstrate a clear understanding of Wikinews policies. You also need to show you can stay calm during conflict. Most admins are chosen because they’ve helped resolve disputes fairly, not because they’ve written the most stories.
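The eligibility thresholds mentioned above (six months of activity, over 100 meaningful edits) are concrete enough to state as code. This is an illustrative sketch of those community norms only; the function and its 182-day approximation of "six months" are assumptions, and real adminship also depends on the community vote, which no formula captures.

```python
from datetime import date, timedelta

# Illustrative only: these thresholds restate the community norms
# described above. Actual adminship is decided by a community vote,
# not by software.

def eligible_for_adminship(first_edit: date, edit_count: int,
                           today: date) -> bool:
    """Minimum bar before a self-nomination is taken seriously."""
    six_months = timedelta(days=182)  # rough stand-in for "six months"
    active_long_enough = (today - first_edit) >= six_months
    enough_edits = edit_count > 100
    return active_long_enough and enough_edits

print(eligible_for_adminship(date(2024, 1, 1), 250, date(2024, 9, 1)))  # True
print(eligible_for_adminship(date(2024, 7, 1), 250, date(2024, 9, 1)))  # False
```

Note the asymmetry: meeting the bar only makes a candidate viable. The qualities that actually win votes, staying calm in conflict and resolving disputes fairly, are exactly the ones a threshold can’t measure.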
Do Wikinews reviewers get paid?
No. All roles on Wikinews are voluntary. Reviewers, admins, and writers are all unpaid volunteers. The Wikimedia Foundation provides the platform and legal support, but no one receives a salary for their work on Wikinews. People participate because they believe in open, accurate journalism.
How is Wikinews different from Wikipedia?
Wikipedia writes summaries of events after they’ve happened. Wikinews reports on events as they happen, using either original reporting or synthesis of already-published sources. Wikipedia avoids current events unless they’re well-established. Wikinews thrives on breaking news, but only if it’s verifiable. Also, Wikinews requires a neutral tone and strict sourcing for every claim, while Wikipedia allows more interpretive summaries.
What happens if a reviewer makes a mistake?
Mistakes are corrected publicly. If a reviewer approves a story that later turns out to be inaccurate, another reviewer can flag it. The article is then re-reviewed, and the original reviewer is usually notified. There’s no punishment-just learning. The system relies on peer correction, not top-down enforcement.
Is Wikinews reliable?
It’s not perfect, but it’s more reliable than most social media. Because every claim must be sourced to a published, credible outlet, and every article must be reviewed before publication, the error rate is low. It’s not used by major media outlets as a primary source, but academics and journalists often use it to track how events were reported in real time by independent volunteers.
What Comes Next?
The biggest challenge for Wikinews isn’t technology. It’s attention. Fewer people are reading it. Fewer are writing. The community is aging. New volunteers are harder to find.
But the model still works. The rules still hold. The system still catches lies before they spread. And as long as there are people who care enough to check a source, to correct a headline, to argue in a comment thread about neutrality, Wikinews will keep going.
You don’t need to be a journalist to help. You just need to care about truth. And that’s something anyone can do.