*In preparation for the time I almost went to work for the birds who work for the bourgeoisie, I built a study guide for interviewing at Twitter for an Information Integrity position. The guide consists of structure superimposed on research into how misinformation is built and propagated (mostly via my work at the Center for Humane Technology).

I've also included a list of experimental product proposals, most of which were inspired by research, experts, and folks who have been on the Twitter platform for the past decade. Since this piece was published in early October 2019, Twitter has implemented features similar to several of these proposals. I'll be documenting them below as they're released.*

Anatomy of misinformation campaigns


Social media provides three "discursive opportunity structures" — that is, opportunities for shaping discourse — that misinformation campaigns leverage in the following ways.

  1. Reach is achieved through a groundwork of controlled interactivity, such as bot accounts and disciplined message couriers, and a scaling strategy of volatile virality, such as third-party promoters. To whom is it being said?
  2. Emotional impact drives virality: messages that trigger a strong physiological response, such as anger or awe, are more likely to go viral. These responses can be positive or negative, but anger tends to travel faster. What is being said?
  3. Legitimacy is reinforced by authorial context (audience size, bio claims, and verification), message context (hashtag popularity), and audience context (confirmation bias). By whom is it being said?

Sources and Supporting Ideas

Social Media Mechanisms for Right-Wing Political Violence in the 21st Century: Discursive Opportunities, Group Dynamics, and Co-Ordination | Radicalisation Research

Discursive Opportunity Structure

Fig 1. Tweet surges are correlated to attacks

Fig 2. How reach is achieved

Opinion | First They Came for the Black Feminists

Taxonomy of solutions


  1. Refutation can disrupt legitimacy. In order for refutation to be effective, it must be done by a trusted party — if not, refutation can actually backfire. This is obviously a moving target, as trust in institutions is often damaged by misinformation.
  2. Exposing inauthenticity can disrupt legitimacy. This similarly must be done by a trusted party, or at least in a consistent and demonstrably unbiased way.
  3. Alternative narratives can disrupt emotional impact and legitimacy. To be distinct from refutation, this must be done in a way that is both organic and non-reflexive; otherwise it collapses back into refutation.
  4. Algorithmic filter manipulation can disrupt reach. This is arguably the most powerful approach: once something is identified as misinformation, explicitly limiting its reach while leaving it publicly accessible allows for control without censorship.
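
The reach-limiting idea in (4) can be sketched as a downranking step in a feed scorer. This is a toy illustration, not Twitter's actual system: the class, function names, scores, and the penalty factor are all hypothetical assumptions. The key property it demonstrates is that flagged posts are demoted, not deleted — they remain retrievable, they just surface less.

```python
# Hypothetical sketch of reach-limiting via downranking.
# All names, scores, and the penalty factor are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    base_score: float      # engagement-predicted ranking score
    flagged: bool = False  # marked as likely misinformation

MISINFO_PENALTY = 0.1  # flagged posts keep only 10% of their ranking weight

def ranked_feed(posts: list[Post]) -> list[Post]:
    """Rank posts for a feed. Flagged posts are downranked, not hidden:
    they stay publicly accessible but surface far less often."""
    def score(p: Post) -> float:
        return p.base_score * (MISINFO_PENALTY if p.flagged else 1.0)
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("a", base_score=0.9, flagged=True),  # highly engaging but flagged
    Post("b", base_score=0.5),
    Post("c", base_score=0.2),
]
feed = ranked_feed(posts)
print([p.post_id for p in feed])  # the flagged post sinks below organic posts
```

Note that the flagged post is still in the returned list — anyone who looks for it can find it — which is the "control without censorship" distinction the taxonomy draws.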