by Aishwarya Vardhana, 2022-2023

Designing a safe and painless system for reporting harassment and abuse across all of Wikipedia.

A user on Korean Wikipedia flags the account ‘RomanPark’. By flagging potentially concerning accounts or content, the community is able to govern themselves and keep the platform safe and inclusive. Each report that is filed is processed and acted upon by another community member. My team, Trust and Safety Tools, is the bridge that connects folks to one another. We build tools that help the community enforce Trust and Safety policy.

This screenshot explores what we might display when an account is reported. The dialog shown is the first step in the reporting process.

⚠️ This document is structured in the following way ⚠️

  1. Discover and define
  2. Research and plan
  3. Design and develop
  4. Key collaborators
  5. Cultivating a design culture and process on the team
  6. Gaining complex domain knowledge
  7. Resources I use throughout my design process

I joined the Trust and Safety Tools team in July of 2022 to lead design for the private incident reporting system (PIRS), which would be a large, complex, multi-year project.

Discover and define

Understanding the problem space

The Trust and Safety policy team knew that harassment on Wikipedia was an ongoing problem, one documented since 2005, and the community had been requesting a reporting system since 2015. In 2022, our team was tasked with building it. The first step was to understand the problem space, which was difficult given how little data we had: Wikipedia does not collect data on its users, and very little activity is monitored or tracked, so we were operating mostly on assumptions and qualitative research ↗️.

I conducted a review of our existing research and published an extensive report titled “Harassment on Wikipedia” [PDF], in which I identified major themes, insights, and areas of concern. I then presented these findings to our product manager.