Keywords: Nervous System, Metrics, Platform Power, Abstraction, Extraction, Burnout, Embodied Limits
This essay arose from my experience designing and operating within digital systems. It reflects a growing tension between optimization logics and embodied limits. Rather than offering solutions, it holds a question about responsibility, limits, and the role of human intelligence in the systems we build.
Tech platforms often claim to be neutral agents. Yet whoever creates a digital space for people to interact in inherently holds responsibility for that space; neutrality is not possible from a position of power. In an age where digital platforms are coupled with human services – delivery, ride share, and gig work apps – bodies are treated as interchangeable units. Companies know little about who is fulfilling these services beyond a rating out of five stars or a daily efficiency score. Many of our most valuable assets – our time, energy, and attention – are abstracted into performance metrics. This is especially true of social media platforms, which are explicitly designed to retain attention and mark success by the amount of time spent on them. Our energetic state is treated like that of a lab rat, with excitation signaling dependency rather than alignment.
We live in a globalized age, with all its positives and negatives. We can now easily connect with people and cultures around the world, but in doing so we have effectively learned to ignore time zones and, in the process, significantly devalued our circadian rhythms. Immediate access to people at all hours has made constant availability the default, more often than not at the expense of our nervous systems. With content creation now a legitimate means of income generation, engagement metrics matter more to platforms and most creators than any meaningful felt experience for the audience. As a result, much of the content produced is hyper-palatable and attention-grabbing but lacks any real connection with its consumers. Markers of success are frequently tied to scale – often equated with profits and reach – overriding context in the process.
With these factors at play and in constant interaction with our lives, our nervous systems are consistently compromised and overstimulated. Yet burnout is often framed as an individual failure rather than a systemic one, despite the disproportionate number of people dealing with chronic anxiety or stress. When value is aligned with flat, numerical measures, forms of societal and relational worth – such as care work – become invisible. On social media platforms, virality is rewarded over dialogue, and inflammatory statements and reactions take algorithmic precedence. In this inversion of values, communities are fragmented. Trust erodes not only between individuals, but also in the systems that dictate the ways we communicate and interact.
What would it look like if systems listened to their end users? Many mainstream tech platforms reduce human interaction to a metric, but there may be other forms of feedback that not only better reflect human response but also reframe which kinds of interaction are valuable. What would it look like to design technology with an awareness of rhythms? In strictly linear systems, the non-linearities of daily life are often framed as errors. The design assumptions embedded in these systems produce flattening effects that no longer feel sustainable. What if systems treated human limits as intelligence, not failure? In an era of ‘smart’ technologies and artificial intelligence, we risk erasing the forms of felt intelligence contained in the body. I believe there is more to explore in how our uniquely human intelligence can support the responsiveness of the systems we operate under, rather than being erased as a flaw.
I am unsure whether it is possible to create such systems in a sustainable way, but I am deeply curious about exploring the possibility. It may be that we have to undo entire frameworks for understanding or defining technological systems in the first place, but part of me is also hopeful that there are elegant solutions available that could ripple through the way we engage with technology. Beyond curiosity, however, I also consider exploring these possibilities a responsibility as a systems thinker. It feels increasingly untenable to consider systems designers neutral given the impact these technologies have on our daily lives. The question is not whether systems will shape us – they already do. The question is whether we are willing to take responsibility for how they do.