Archive: 2026/03 - Page 7

Leona Whitcombe

Volunteer Moderation on Wikipedia: Workload, Burnout, and Support

Wikipedia relies on volunteers to moderate content, but the workload is overwhelming, burnout is common, and support is nearly nonexistent. This is the human cost behind one of the world’s most-used knowledge sources.

Leona Whitcombe

The Signpost and Multilingual Reach: How Wikipedia’s News Hub Connects Global Communities

The Signpost is Wikipedia's volunteer-run news source, reporting on community debates, policy changes, and global editing efforts across 300+ languages. It's not just news; it's the living record of how knowledge is built, challenged, and preserved.

Leona Whitcombe

Using Shortcuts and Custom Scripts to Speed Up Wikipedia Editing

Learn how to use Wikipedia's built-in shortcuts and custom scripts to cut editing time in half, reduce errors, and contribute more effectively to one of the world's largest encyclopedias.
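
Wikipedia's user scripts themselves are JavaScript run in the browser, but the same time-saving idea extends to any API-driven workflow. As a hypothetical sketch of that idea, the Python below fetches an article's current wikitext through the MediaWiki Action API (a real, public read endpoint) so it can be inspected or prepared for editing locally; the article title is just an example.

```python
import requests

# A minimal sketch of the scripted-editing idea: pull a page's current
# wikitext through the MediaWiki Action API so it can be checked or
# prepared for editing locally. Real Wikipedia user scripts are
# JavaScript run in the browser; this only illustrates the API workflow.
API_URL = "https://en.wikipedia.org/w/api.php"

def fetch_wikitext(title: str) -> str:
    """Return the current wikitext of the given page."""
    resp = requests.get(
        API_URL,
        params={
            "action": "query",
            "prop": "revisions",
            "rvprop": "content",
            "rvslots": "main",
            "titles": title,
            "formatversion": 2,
            "format": "json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    page = resp.json()["query"]["pages"][0]
    return page["revisions"][0]["slots"]["main"]["content"]

if __name__ == "__main__":
    text = fetch_wikitext("Wikipedia:The Signpost")  # example page title
    print(text[:200])  # preview the first 200 characters
```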

Leona Whitcombe

AI-Assisted Editing on Wikipedia: How Guardrails, Review, and Quality Control Keep It Reliable

AI-assisted editing on Wikipedia uses smart tools to flag vandalism, enforce neutrality, and suggest improvements, keeping the world's largest encyclopedia accurate and reliable. Human editors still have the final say, but AI makes their work faster and more effective.

Leona Whitcombe

Wikipedia's Response to Paid Editing Scandals

Wikipedia responded to paid editing scandals by enforcing transparency, requiring editors to disclose paid relationships. Volunteers and automated tools now flag suspicious edits, and companies like Google and Microsoft have adopted strict internal policies. Trust in Wikipedia remains intact because of its open, community-driven enforcement.

Leona Whitcombe

How to Handle Retractions and Corrections in Wikipedia References

Wikipedia relies on reliable sources, but when those sources are retracted or corrected, the article must change. Learn how to identify, document, and replace faulty citations to maintain trust and accuracy.
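
A practical first step when a source is retracted is finding every article that still cites it. The Python sketch below uses the MediaWiki search API's insource: operator to list English Wikipedia pages whose wikitext contains a given DOI; the DOI shown is a placeholder, not a known retraction.

```python
import requests

# A minimal sketch: use the MediaWiki search API to find English
# Wikipedia articles whose wikitext cites a given DOI. The DOI below
# is a placeholder, not a real retracted paper.
API_URL = "https://en.wikipedia.org/w/api.php"

def pages_citing_doi(doi: str) -> list[str]:
    """Return titles of articles whose wikitext contains the DOI."""
    resp = requests.get(
        API_URL,
        params={
            "action": "query",
            "list": "search",
            "srsearch": f'insource:"{doi}"',
            "srlimit": 50,
            "format": "json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [hit["title"] for hit in resp.json()["query"]["search"]]

if __name__ == "__main__":
    for title in pages_citing_doi("10.1000/example.doi"):  # placeholder DOI
        print(title)
```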

Leona Whitcombe

Using ORES and Machine Learning to Flag Risky Wikipedia Edits

ORES uses machine learning to detect vandalism on Wikipedia by analyzing edit patterns in real time. It helps human editors prioritize risky changes, reducing the time harmful content stays online. Trained on decades of edit history, it flags more than 80% of vandalism, far faster than human patrollers working alone.
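
As a concrete illustration, here is a minimal Python sketch of asking the public ORES REST API (v3) to score a single revision with the "damaging" model. The revision ID is a placeholder, and the response layout reflects the documented v3 format; note that Wikimedia has since been migrating these models to its Lift Wing service.

```python
import requests

# Query the ORES scoring service for one English Wikipedia revision.
# Endpoint and response layout follow the public ORES v3 API; the
# revision ID used below is a placeholder, not a real edit.
ORES_URL = "https://ores.wikimedia.org/v3/scores/enwiki/"

def damaging_probability(rev_id: int) -> float:
    """Return the model's probability that the revision is damaging."""
    resp = requests.get(
        ORES_URL,
        params={"models": "damaging", "revids": rev_id},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    score = data["enwiki"]["scores"][str(rev_id)]["damaging"]["score"]
    return score["probability"]["true"]

if __name__ == "__main__":
    rev_id = 123456789  # placeholder revision ID
    p = damaging_probability(rev_id)
    print(f"P(damaging) for revision {rev_id}: {p:.2f}")
```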

Leona Whitcombe

How Press Releases Influence Wikipedia Article Updates

Press releases don't directly update Wikipedia, but they can trigger changes when journalists turn them into credible news stories. Wikipedia relies on independent reporting, not corporate announcements, to verify and add information.

Leona Whitcombe

Comparative Journalism: Wikipedia vs Traditional Encyclopedias

Wikipedia and traditional encyclopedias approach knowledge in opposite ways: one open and dynamic, the other expert-driven and static. Which one should you trust? The answer isn't simple.

Leona Whitcombe

The Gender Gap in Wikipedia: What Research Shows

Research shows that fewer than 20% of Wikipedia editors are women, leading to significant gaps in coverage of women's history, achievements, and perspectives. This imbalance affects what information is preserved, and who gets remembered.

Leona Whitcombe

Wikipedia Oversight: How Suppression Requests Work and What Gets Hidden

Wikipedia oversight allows trusted editors to hide sensitive edits, such as personal data or threats, from public view. It's not censorship. It's a safety tool that protects real people behind the screen. Only a few hundred users worldwide can use it, and only under strict rules.

Leona Whitcombe

How Wikipedia's Pending Changes and Autopatrol Protect Edit Quality

Wikipedia's Pending Changes and Autopatrol features protect article quality by filtering out vandalism while letting trusted editors make instant updates. Learn how these tools keep the encyclopedia accurate and up to date.