Tag: community moderation

Leona Whitcombe

Wikipedia Universal Code of Conduct: How Rules and Enforcement Work

Explore the Wikipedia Universal Code of Conduct. Learn about the rules, how behavior is enforced by the community and foundation, and its impact on editor diversity.

Leona Whitcombe

How Wikipedia Editors Manage Breaking News Events

Discover how Wikipedia editors maintain accuracy and neutrality during breaking news events through verifiability, page protection, and community consensus.

Leona Whitcombe

How to Handle Wikipedia Appeals and Unblock Requests: A Guide to Case Evaluation

Learn how to evaluate Wikipedia unblock requests and appeals. This guide helps moderators distinguish sincere apologies from scripted responses, helping curb repeat vandalism.

Leona Whitcombe

How Wikipedia Manages Disruptive Editing Without Using Sanctions

Explore how Wikipedia uses social norms, consensus building, and technical filters to stop disruptive editing without relying on bans or sanctions.

Leona Whitcombe

How Wikipedia Stops Misinformation During Breaking News

Discover how Wikipedia uses a mix of human moderators, automated bots, and strict sourcing rules to stop misinformation during breaking news events.

Leona Whitcombe

How Arbitration Enforcement Areas Work on Wikipedia: Monitoring and Reports

Explore how Wikipedia's Arbitration Enforcement Areas combat vandalism through strict monitoring, ArbCom mandates, and detailed reporting to maintain encyclopedia neutrality.

Leona Whitcombe

Off-Wiki Canvassing and How It Undermines Wikipedia Consensus

Off-wiki canvassing undermines Wikipedia's consensus by letting outside groups influence edits through social media and other platforms. This violates the core principle of neutral, evidence-based collaboration and erodes trust in the encyclopedia.

Leona Whitcombe

Volunteer Moderation on Wikipedia: Workload, Burnout, and Support

Wikipedia relies on volunteers to moderate content, but the workload is overwhelming, burnout is common, and support is nearly nonexistent. This is the human cost behind one of the world’s most-used knowledge sources.

Leona Whitcombe

How Wikipedia's Arbitration Committee Makes Final Editorial Decisions

Wikipedia's Arbitration Committee handles the most serious editing disputes, making final, binding decisions based on community policies. Composed of elected volunteers, it enforces sanctions like topic bans and blocks when community mediation fails.

Leona Whitcombe

Case Study: German Wikipedia’s Quality and Policy Rigour

German Wikipedia stands out for its strict sourcing rules, trained editors, and policy-driven editing culture. Though smaller in article count, it contains far fewer errors, making it one of the most reliable encyclopedias in the world.

Leona Whitcombe

Where AI Fits in Wikipedia’s Editing Culture: Community Debates

Wikipedia's editing community is grappling with AI's growing role in content creation. From automated bots to AI-generated edits, the debate centers on trust, verification, and whether machines can uphold Wikipedia's human-centered values.