Jamie Wang

“What I propose, therefore, is very simple: it is nothing more than to think what we are doing.” - Arendt

“Man himself must first of all have become calculable, regular, necessary, even in his own image of himself, if he is able to stand security for his own future.” - Nietzsche

"To exit the trajectory of productive time, so that a single moment might open almost to infinity." - Jenny Odell

“If the entire history of landscape in the West is indeed just a mindless race toward a machine-driven universe, uncomplicated by myth, metaphor, and allegory, where measurement, not memory, is the absolute arbiter of value, where our ingenuity is our tragedy, then we are indeed trapped in the engine of our self-destruction.” - Simon Schama

Historically, the speed of technological development has outpaced ethical and regulatory development. In the digital world, at least, we’re building faster than ever, and we can safely predict that the law will take years to catch up. In the meantime, we can only resort to developing soft norms, starting with understanding our own ethics and values. We must emphasize the search for the Good, beyond instrumentalization and measurement.

Let’s call this gap between technological progress and moral progress “ethical debt”. Many people are working to pay down ethical debt: the policy-makers who work on regulating social media companies; the researchers who study the harmful effects of cryptocurrencies; the philosophers who study the ethics of self-driving cars; the journalists who investigate AI bias. But what about technologists themselves? Are we, the people who start companies and create things, doing enough to probe the moral frameworks and value systems that underlie our work?

At the moment, the answer is an unequivocal “no”. Silicon Valley is premised on an ethic of “move fast and break things”, of “fuck around and find out.” Many technologists do not see it as their responsibility to pause, reflect, and introspect before creating something of vast social consequence. Instead, Silicon Valley, for all its counterculture-inspired talk about radically reimagining the future, is ultimately about turning a profit, no matter where it can be found. Technology is about business, after all.

What makes Silicon Valley unique, relative to businesses in the rest of the United States, or indeed the world, is the merger of professional and personal values. Silicon Valley tech workers embody a personal ethic that glorifies quantification, improvement, and optimization. When you are fully immersed in the world of software, everything starts to feel like a technical problem.

For many tech workers, therefore, every pursuit must be done in the name of self-improvement, and even most kinds of “fun” must serve some instrumental goal. Books must be read with a note-taking system; exercise must be quantified; friendships are managed in personal CRMs; casual hangouts and dating apps are covert recruitment missions. Those who build instruments tend to instrumentalize: to treat things as means or resources for achieving some end.

This value system also fits neatly with the mental model of tech workers, who tend to analogize every system they see, from their own bodies to society at large, to a software system. Growing up, technologists usually interact more with systems than with other people; yet moral psychologists like Kohlberg and Piaget believed that most moral development occurs through social interaction. A formation spent among systems rather than people leaves its mark on how technologists reason morally.

For example, to get your food for the week, you could either go grocery shopping or use Instacart. Instacart is an efficient instrument. You swipe around the app, inspecting flattened images of produce, and check out in seconds. Better yet, you could automate your purchases so that you never need to think about groceries again. Like any path that technology makes easier, this one has compounding consequences as more people take it. The more people use Instacart, the less grocers will invest in physical stores that are pleasant to shop in. At the margin, urban planners may be disincentivized from planning a walkable density of grocery shops. In the name of grocery efficiency, we might be eroding the experience of in-person grocery shopping, which for many people is a staple of daily life.

Heidegger argued that instrumentalization is a natural consequence of organizing life under technology. The world, including oneself, ceases to have presence or value in its own right, and is regarded instead as a ‘standing-reserve’, that is, a resource to be extracted from and instrumentalized.

Frequently, this goal-directed behavior is incredibly useful. Technologists, and Silicon Valley writ large, are known for an optimistic, can-do attitude. They decide on goals and effectively marshal resources to achieve them.

But in the process, we frequently end up treating everything around us instrumentally, including ourselves. How many times have your friends told you they were burned out from work? How many new startups for productizing, flattening, and managing social relationships pop up every year?

We believe this is insufficient. Instrumentalization without understanding the pre-existing worlds we’re building within, and the worlds we are building towards, will not lead to a beautiful, sane, and deeply good future. What is appealing about the efficiency mindset is precisely what is dangerous about it: it is measurable and generalizable.

Jacques Ellul claimed that as technology spreads, it renders what it touches increasingly mechanical. Questions of morality and aesthetics are reduced to, or forgotten in favor of, questions of efficiency: “All that was good becomes data. All that was beautiful is now efficient.” Efficiency is valued above all else. Why shouldn’t it be?

Because efficiency frequently means sweeping hard questions neatly under the rug, among them the consideration of non-quantifiable externalities. It’s much easier to count the flattened interactions between two users of a platform than to theorize about the health of their relationship as compressed through such a system, or about the larger system around them. You can easily optimize the UX of completing a grocery purchase on Instacart and measure NPS scores before and after a change; it’s much harder to reason about how making Instacart easier to use affects the health of a community.