

Problem

Data bias can occur at multiple stages: collection, cleaning, parsing, and analysis. Acting on biased information has led to disproportionate sentencing and policing of Black and Latinx Americans, discriminatory hiring decisions, and racially targeted advertising. Most recently, Google and Facebook came under fire for targeting advertisements for mortgages, unemployment insurance, and credit assistance by demographics and geography.

There are three main data and privacy problems that exacerbate informational inequities and racism: 1) Targeted Ads / Adtech, 2) Big Data and Racist Algorithms / Algorithmic Bias, and 3) Surveillance / Facial Recognition.

Key Terms


Solutions

1. Anonymization

Existing solutions and players rely heavily on data anonymization, or “blind” data practices that discard demographic data entirely. While these approaches address concerns about demographics-based bias, they also propagate a “color-blind” approach that hinders holistic and intersectional analyses.
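To make the trade-off concrete, here is a minimal Python sketch (the field names and records are illustrative assumptions, not drawn from any real dataset) contrasting a “blind” practice that discards demographic fields outright with pseudonymization, which replaces direct identifiers but keeps demographic fields available so bias audits remain possible:

```python
import hashlib

# Hypothetical example records; field names are illustrative assumptions.
records = [
    {"name": "Ada", "zip": "60601", "race": "Black", "outcome": 1},
    {"name": "Ben", "zip": "60601", "race": "White", "outcome": 0},
]

def blind(record):
    """'Color-blind' practice: discard identifying AND demographic fields,
    which also removes the data needed for intersectional analysis."""
    return {k: v for k, v in record.items() if k not in {"name", "race"}}

def pseudonymize(record, salt="example-salt"):
    """Replace the direct identifier with a salted hash, but keep
    demographic fields so demographics-based bias can still be audited."""
    out = dict(record)
    out["name"] = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:12]
    return out
```

The point of the contrast: `blind` cannot support the holistic analyses the text calls for, while `pseudonymize` preserves them at the cost of requiring careful handling of the salt and hashed identifiers.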

Existing Solutions

What needs to be done?



2. Privacy Protected Data Sharing and Data Decentralization

Government agencies, healthcare providers, financial institutions, and technology companies hold massive amounts of individuals’ data, at times leading to misuse and manipulation. Increased user privacy and data decentralization could reduce the inequities exacerbated by data bias.
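One way privacy-protected data sharing can work is to publish only noisy aggregate statistics rather than individual records, in the style of differential privacy. This is a sketch of one possible technique, not something the text prescribes; the function name and parameters are assumptions for illustration:

```python
import math
import random

def dp_count(values, epsilon=1.0, rng=None):
    """Release a count with Laplace noise (sensitivity 1 for a counting
    query), so that no single individual's record can be confidently
    inferred from the published statistic."""
    rng = rng or random.Random(0)
    true_count = sum(values)
    # Sample Laplace(0, 1/epsilon) via the inverse-CDF method.
    u = rng.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

A data holder could answer queries like “how many users saw this ad?” with `dp_count` instead of releasing raw rows; smaller `epsilon` means more noise and stronger privacy.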