by Aishwarya Vardhana, 2022-2023
A user on Korean Wikipedia flags the account ‘RomanPark’. By flagging potentially concerning accounts or content, the community is able to govern itself and keep the platform safe and inclusive. Each filed report is processed and acted upon by another community member. My team, Trust and Safety Tools, is the bridge that connects folks to one another. We build tools that help the community enforce Trust and Safety policy.
This screenshot is an exploration of what we might display when an account is reported. The dialog you see is the first step in the reporting process.
I joined the Trust and Safety Tools team in July 2022 to lead design for the private incident reporting system (PIRS), a large, complex, multi-year project.
The Trust and Safety policy team knew that harassment on Wikipedia has been an ongoing problem since 2005, and the community had been requesting a reporting system since 2015. In 2022, our team was tasked with building it. The first step was to understand the problem space, given very little data. Because Wikipedia does not collect data on its users and very little activity is monitored or tracked, we were operating mostly on assumptions and qualitative research ↗️.