Picture a typical technologist: she watches how her users scroll, click, and engage with her product. Based on what she sees, she makes changes. She uses user behavior as her "source of truth" about whether her design is working.
While doing this, she shapes her product based on individual preferences (not social ones) and immediate preferences (not long-term ones).
The problems caused by tech—like clickbait, internet outrage, or the rise in teen depression—are social and they emerge over time. That's why tech founders, despite their optimism, keep making things worse. (And why those working on the "decentralized web" seem likely to travel a similar path.)
Of course, we can't ask our technologist to ignore her users' behavior. Instead, we must broaden her view into her users' lives. We must give her an alternative "source of truth" for judging her design's success: a way to design based on data, but data about how users want to live and whether her product helps, not just data about clicks.
Here's how to do that, in three quick parts.
To make things concrete, it helps to break down "how someone wants to live" into atoms, which I call values. For instance, someone might want to live
honestly, or courageously. A platform can make living by these values more difficult: for example, it may be harder to be honest on Instagram, if honest posts get fewer likes, or are harder to caption well.
The same goes for courage, creativity, and every other way a person wants to act or relate to others. A courageous statement on Twitter might lose you followers, or lead to a pile-on, which makes Twitter a worse place to practice courage.
Because we use platforms that aren't designed for our values, we act and socialize in ways we don’t believe in, and later regret: we procrastinate, avoid our feelings, pander to other people’s opinions, participate in hateful mobs reacting to the news, etc.
Meanwhile, the software looks like it's succeeding (in terms of engagement), but it sucks in terms of what it does to our lives.
One part of the answer is to design and test software according to values. To do this, we need to be more specific than vague words like "honest", "courageous", etc.
In our course, we write values in a special format, and test software against them:
Those are some of my values, written in this format. Do you recognize any that we share?
One big difference between values and preferences is this: preferences are expressed in the instant someone clicks, downloads, votes, or purchases something. Values are woven deeper into a life. Our exercise Hard Steps reveals these deeper patterns.
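To make the contrast concrete, here is a minimal, hypothetical sketch (the `Event` fields and both metrics are my own invention for illustration, not the course's actual format): an engagement metric just counts interactions, while a value metric asks how often a user's actions matched how they wanted to act.

```python
from dataclasses import dataclass

# Hypothetical sketch: a "preference" is visible in a single event;
# a "value" is a pattern across many events over time.

@dataclass
class Event:
    kind: str      # e.g. "post", "like", "click"
    honest: bool   # hypothetical self-report: did this action feel honest?

def engagement(events):
    """Preference-style metric: count interactions, nothing more."""
    return len(events)

def lived_value(events, predicate):
    """Value-style metric: fraction of actions consistent with a value."""
    if not events:
        return 0.0
    return sum(1 for e in events if predicate(e)) / len(events)

log = [
    Event("post", honest=True),
    Event("post", honest=False),
    Event("like", honest=True),
    Event("post", honest=True),
]

print(engagement(log))                       # 4
print(lived_value(log, lambda e: e.honest))  # 0.75
```

The point of the sketch: both numbers come from data, but only the second says anything about whether the product helped the user live the way they wanted to.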