Big tech platforms like Reddit, Discord, Instagram, and Twitter have reporting systems that protect users from harm and the platform from regulation. Our self-governing communities on Wikipedia have devised various creative ways of moderating online social interactions, but there is no official, effective, and transparent reporting system that is accessible to all. For years the community has requested such a system, and the recently ratified Universal Code of Conduct promises an incident reporting tool as a means of enforcing policy.
More recently, the EU passed the Digital Services Act, which requires large online platforms, including Wikipedia, to have official systems in place for users to report policy violations.
In 2022, the Wikimedia community ratified the Universal Code of Conduct (UCoC), a community-led effort to create behavioral guidelines for all the wikis. As part of enforcing the UCoC, the UCoC committee and Wikimedia Trust and Safety proposed an “incident reporting tool” that would let contributors report UCoC violations to admins on Wikipedia. Our team was responsible for stewarding the design and implementation of this tool.
My first step was to understand the harassment ecosystem on Wikipedia. This involved a literature review of the Foundation’s prior research on harassment, exploratory interviews with targets of harassment on the wikis, and a comparative analysis (deck, Miro) of reporting systems on other tech platforms.
Key learnings from the exploratory research