Author: Leona Whitcombe


AI-Assisted Editing on Wikipedia: How Guardrails, Review, and Quality Control Keep It Reliable

AI-assisted editing on Wikipedia uses smart tools to flag vandalism, enforce neutrality, and suggest improvements, keeping the world's largest encyclopedia accurate and reliable. Human editors still have the final say, but AI makes their work faster and more effective.


Wikipedia's Response to Paid Editing Scandals

Wikipedia responded to paid editing scandals by enforcing transparency, requiring editors to disclose paid relationships. Volunteers and automated tools now flag suspicious edits, and companies like Google and Microsoft have adopted strict internal policies. Trust in Wikipedia remains intact because of its open, community-driven enforcement.


How to Handle Retractions and Corrections in Wikipedia References

Wikipedia relies on reliable sources, but when those sources are retracted or corrected, the article must change. Learn how to identify, document, and replace faulty citations to maintain trust and accuracy.


Using ORES and Machine Learning to Flag Risky Wikipedia Edits

ORES uses machine learning to detect vandalism on Wikipedia by analyzing edit patterns in real time. It helps human editors prioritize risky changes, reducing the time harmful content stays online. Trained on years of edit history, it flags the large majority of vandalism far faster than human patrollers could alone.
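The triage idea behind ORES-style scoring can be sketched in a few lines: each revision gets a probability that it is damaging, and reviewers see the riskiest edits first. This is an illustrative sketch only, not the ORES service itself; the revision IDs, scores, and the 0.7 threshold below are invented for demonstration.

```python
# Illustrative sketch of score-based edit triage, in the spirit of ORES:
# edits arrive with a machine-learning "damaging" probability, and we rank
# them so human reviewers look at the riskiest changes first.
# All IDs, scores, and the threshold are hypothetical.

REVIEW_THRESHOLD = 0.7  # hypothetical cutoff for "needs human review"

def triage(edits):
    """Sort edits riskiest-first and split into review / low-risk queues."""
    ranked = sorted(edits, key=lambda e: e["damaging_score"], reverse=True)
    needs_review = [e for e in ranked if e["damaging_score"] >= REVIEW_THRESHOLD]
    low_risk = [e for e in ranked if e["damaging_score"] < REVIEW_THRESHOLD]
    return needs_review, low_risk

# Example: three recent revisions with model scores (made up).
edits = [
    {"rev_id": 101, "damaging_score": 0.92},  # likely vandalism
    {"rev_id": 102, "damaging_score": 0.08},  # likely fine
    {"rev_id": 103, "damaging_score": 0.75},  # borderline, worth a look
]
needs_review, low_risk = triage(edits)
```

In this sketch, revisions 101 and 103 land in the review queue while 102 is left alone, which mirrors the teaser's point: the model doesn't decide, it prioritizes, and humans still make the final call.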


How Press Releases Influence Wikipedia Article Updates

Press releases don’t directly update Wikipedia, but they can trigger changes when journalists turn them into credible news stories. Wikipedia relies on independent reporting, not corporate announcements, to verify and add information.


Comparative Journalism: Wikipedia vs Traditional Encyclopedias

Wikipedia and traditional encyclopedias approach knowledge in opposite ways: one open and dynamic, the other expert-driven and static. Which one should you trust? The answer isn't simple.


The Gender Gap in Wikipedia: What Research Shows

Research shows that fewer than 20% of Wikipedia editors are women, leading to significant gaps in coverage of women's history, achievements, and perspectives. This imbalance affects what information is preserved, and who gets remembered.


Wikipedia Oversight: How Suppression Requests Work and What Gets Hidden

Wikipedia oversight allows trusted editors to hide sensitive edits, such as personal data or threats, from public view. It’s not censorship. It’s a safety tool that protects real people behind the screen. Only a few hundred users worldwide can use it, and only under strict rules.


How Wikipedia's Pending Changes and Autopatrol Protect Edit Quality

Wikipedia's Pending Changes and Autopatrol features protect article quality by filtering out vandalism while letting trusted editors make instant updates. Learn how these tools keep the encyclopedia accurate and up to date.


Training Modules for Students Editing Wikipedia: What to Include

Effective training modules for students editing Wikipedia must teach the Five Pillars, reliable sourcing, notability rules, and conflict navigation, not just editing tools. Real examples and structured practice turn beginners into confident contributors.


Wikimedia Foundation Challenges to Government Regulations

The Wikimedia Foundation is fighting government censorship worldwide to protect access to accurate, free knowledge. From Turkey to India, it refuses to remove factual content, even when governments demand it.


Notable Cases of Admin Abuse and How Communities Fought Back

When admins misuse power, communities don’t stay silent. From Wikipedia to Twitch, documented evidence and organized action have forced platforms to change. Here’s how real users fought back, and won.