Few manage to live by their values day-by-day, moment-by-moment. Which is to say, few live meaning-packed lives. Why's that?

I'll break it down, starting with large-scale, societal trends, but quickly zooming into problems that crop up in individual social spaces—the things which, as social designers, we must be aware of and avoid in our designs. I'll end by mentioning a few sources of inevitable meaninglessness—meaninglessness that can't be removed from life, even in a society with better social systems.

The Big Picture

Products mostly live or die within a larger social system we call "the market"; users mostly find products through marketing and word-of-mouth; and other non-market social spaces often live or die within a nest of social systems we call "democracy".

At these large scales, meta-systems like these don't support meaning very well. What thrives under the market or in various democratic systems are not the most meaningful social spaces but those that drive the most purchases, votes, or other engagements, or that serve the people who generate the most of them. I covered this in detail in Values, Preferences, & Meaning.

What we'd want, ideally, are systems that help people find the smaller systems that would best support what's meaningful to them—that help them discover new scenes, new practices to try with existing friends, and so on. A kind of values-based market and marketing.

In fact, we have a kind of opposite meta-system: one that works by promising people meaning, promising them adventure and love, and so on, while selling them things that make life even more meaningless. We promise adventure and sell blue jeans. We promise love and sell breath mints and Tinder. We promise impact and sell desk jobs.

Why do we have this bad system, and not the one that'd address meaninglessness? This is a bit of a puzzle. Here are three possibilities:

  1. Such a system would only make sense if there were more values-based designs to make a market out of.

    The more values-based designers there are, the more likely it is that a community or system will emerge to route people whose values are suppressed to designs that help.

  2. Such a system would require values-articulate consumers, and there are very few.

    We don't have such systems because people haven't been articulate about what's meaningful to them, which means they can't shop by it, or design around it, very well. Even articulacy about feelings is new, and articulacy about values is much newer. The kind of values-articulacy we developed in quest one is extremely new.

    Spreading values-articulacy would lead to values-articulate designers, and also to systems for shopping by values—systems which help people find where their values would be supported.

  3. It just hasn't been invented yet.

    Someone just needs to invent a new meta-system that responds to meaninglessness by helping people (1) identify suppressed values, and (2) find social spaces where those values are supported.

    I have some experiments in this direction; see About Meaning Supplies.

One deeper explanation that underlies all of these is that a design culture around values and meaning has not yet spread. We still live mainly in a culture designed around goals, preferences, roles, etc. I talk about the evolution of new design cultures over time in my essay Nothing to Be Done.

What to be aware of, as designers

From a design perspective, that big-picture view isn't so helpful. We need to zoom in and ask why individual social designs end up meaningless. The answer is surprisingly simple.

There are two ways our environments fail to support our values.

I'll define each of these and give examples, then show why these problems cannot be entirely avoided—why improving an environment for one value will often make it worse for another value.

Crowding Out

<aside>
📖 Debbi needs this promotion to pay for her son's private school. Normally she'd try to be honest with her boss, but in this meeting she's just trying to say everything right. To behave as expected. The stakes are too high.
</aside>