MAPPING THE CHOREOGRAPHED SELF THROUGH ART, DATA, AND THE DIGITAL BODY: On being choreographed by the smartphone in your hand

I was frantically patting my butt-pockets, the tiny one in the front of my pants, my jacket, another jacket, my bag—inside, outside, and under it. My hands moved faster as my mind raced, eyes scanning from left to right. I retraced my steps to the bathroom and checked the red chair I perch my feet on to poop. Nothing. The sink? No. Above it? Still no. The cupboard? Empty. I marched to the kitchen where I’d just been: the top of the microwave, the tabletops, the chair I’d been sitting on, the kitchen table. Still nothing. My hands made another anxious pass over my pockets, and I headed to the living room, even though I knew I hadn’t been there. I checked the sofa, lifted the blanket, peeked under the cushions—and there it was. Relief. I had tucked my phone away under a blanket in the other room, to shield it from overhearing a conversation I didn’t want tied to my case at Migrationsverket. Especially now, with new laws eroding privacy under the guise of "safety." I picked it up, walked back to the toilet, and placed it face down on the red chair. A few seconds passed. Then my hand reached out instinctively, almost magnetically, and picked it up again.

Am I paranoid, or are they really watching? And if so, what exactly do they see?


0: Introduction
Imagine a world where your phone doesn’t just track you; it knows you better than you know yourself. In a world where our devices watch us more than we watch them, who’s really in control? In the digital age, an economic paradigm has emerged that prioritizes behavioural manipulation over traditional goods and services. Surveillance capitalism, as coined by Shoshana Zuboff, describes a system where personal data is continually extracted from individuals and commodified for profit. I aim to unpack how smartphones, increasingly viewed as extensions of ourselves, are key instruments in this process—subtly choreographing our actions and perceptions while fostering dependency through carefully designed elements that manipulate human behaviour.

This essay is guided by two critical questions:

1. How does persuasive design create an insidious choreography of human behaviour?
2. How can artistic practices reveal the underlying mechanisms of choreographed human behaviour by smartphones?


These questions bridge my artistic practice and theoretical inquiry, reflecting my intention to uncover how human-device interaction blurs the line between autonomy and algorithmic influence. My artistic work interrogates the embodied experience of using smartphones—the ways they choreograph gestures, behaviours, and emotions—and seeks to make visible the hidden infrastructures that shape these interactions, guided by the question that threads through this essay: Are they watching, and if so, what do they see?

The essay oscillates between macro and micro perspectives. It examines surveillance capitalism's vast systems while zooming in on the intimate dynamics of human-device relationships. This dual approach, grounded in transparency, parallels the intent of my artistic practice: to reveal the mechanisms of control embedded in everyday technology and invite critical reflection. At the heart of this inquiry is Zuboff’s theory of surveillance capitalism, which positions persuasive design as a powerful tool for influencing human behaviour: notifications, endless scrolling, and algorithmically tailored content are not neutral; they compel users to participate in an economy of attention meticulously mined for profit. This essay argues that such designs foster an economy of dependency, raising urgent questions about privacy, autonomy, and agency. To contextualize these themes, the essay draws on the work of artists such as Vladan Joler, Candela Capitan, and Eva and Franco Mattes. Each artist reveals the hidden choreography of technology, bridging abstract concepts like surveillance and persuasive design with tangible human experiences. Their practices reflect my artistic intentions: to critique these systems while exploring the role of the body as a site of both resistance and control.

The essay unfolds in three sections:

1. The first section introduces surveillance capitalism, exploring how smartphones function as tools for constant surveillance and data collection.
2. The second section examines the notion of being watched and the embodiment of smartphones.
3. The last section analyses how artists respond to these themes, using their work to expose the hidden infrastructures of surveillance capitalism and invite audiences to confront their role within this system.

While the scope of this essay cannot encompass the full complexity of these infrastructures and their consequences, it focuses on the choreographic: how the embodied experience of using technology offers the most tangible way to understand its effects. By investigating this intersection of behaviour, design, and artistic intervention, the essay seeks to illuminate how we are watched, shaped, and ultimately choreographed by the devices in our hands.1

1. Using “we” throughout this essay is not intended to imply a universally neutral user figure or disregard the intersectional factors—such as race, gender, and class—that shape individual experiences with technology. While these dimensions are crucial, this essay focuses on the collective experience of being a smartphone user. The plural “we” acknowledges participation in the system, and this choice simplifies the discussion while addressing collective patterns of influence.



1: Surveillance Capitalism and Data Collection
Surveillance capitalism, as defined by Dr. Shoshana Zuboff in her iconic book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, is an economic model that prioritizes behavioural manipulation over traditional production.2 This model transforms the digital landscape into an environment designed for continuous user monitoring and modification. Rather than focusing on meeting individuals’ needs or contributing to collective welfare, surveillance capitalism views humans as sources of data to be harvested, analysed, and ultimately commodified. According to Zuboff, companies extract behavioural data to make predictions about individuals’ future actions, which are then sold for profit. Zuboff describes how this model actively choreographs human behaviour by aligning digital infrastructures with mechanisms that continually nudge, capture, and modify user actions, often without their full awareness or consent.

Smartphones, as primary tools of surveillance capitalism, are designed to collect a constant stream of data by embedding surveillance into everyday interactions. For example, every swipe, tap, and pause on a smartphone screen provides valuable behavioural data, allowing companies to create increasingly accurate profiles of users.3 Through location tracking, data, metadata, app usage, and even battery levels, smartphones continuously gather insights that reveal not only what users are doing but also where and how they are doing it. Persuasive design, a practice within interaction design, UX design, and software engineering, motivates, guides, and reassures users into decisions that suit the designer's employer, drawing on established psychological research and often operating below conscious awareness.4 It plays a crucial role in this behavioural choreography, using techniques like infinite scrolling, real-time notifications, and tailored content to guide and prolong user engagement.5 Infinite scrolling, popularized by platforms like Instagram and Twitter, removes natural stopping points, encouraging users to keep scrolling indefinitely. This design feature takes advantage of our neurochemistry and psychological impulse to continue viewing new information, keeping users engaged far longer than they originally intended.6 Similarly, notifications are strategically timed to trigger dopamine releases, creating a feedback loop that encourages users to check their phones frequently. These seemingly minor design choices are not neutral; they are engineered with the purpose of maximizing data capture by ensuring users remain engaged and responsive.
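The mechanism described above can be sketched in a few lines of code: a feed with no terminal item, paired with a logger that turns every gesture into a behavioural record. This is an illustrative sketch under stated assumptions, not any platform's actual implementation; all names are hypothetical.

```python
import time
from dataclasses import dataclass, field
from itertools import count

@dataclass
class EngagementLog:
    """Collects one record per gesture: the raw material of behavioural data."""
    events: list = field(default_factory=list)

    def record(self, kind: str, **detail):
        # Every interaction is timestamped and stored, whether or not
        # the user intended to communicate anything.
        self.events.append({"kind": kind, "t": time.time(), **detail})

def infinite_feed():
    """A feed with no natural stopping point: it can always yield one more item."""
    for i in count():
        yield f"post-{i}"

# Simulate a short scrolling session: every swipe both fetches
# new content and emits a data point about the user.
log = EngagementLog()
feed = infinite_feed()
for _ in range(5):
    post = next(feed)
    log.record("scroll", post=post)

print(len(log.events))  # one data point per gesture: 5
```

The point of the sketch is structural: the feed never signals "you are done," and the logging happens as a side effect of simply looking.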

This complex infrastructure is hard to comprehend because it is embedded and invisible; so far, the most tangible ways I have found to understand it are present embodied experience, attentive noticing, and Vladan Joler’s practice. Joler’s work New Extractivism, an extensive visual map with an accompanying video, illustrates the intricate, hidden architectures of data extraction, revealing how the human body and mind are transformed into resources for the digital economy. Joler, a professor and leader of SHARE Lab, a research group focused on algorithmic transparency and digital labour exploitation, uses large-scale diagrams to map invisible infrastructures, making them visible and comprehensible.7

In New Extractivism, Joler unveils how smartphones are not merely communication devices but tools for behavioural control that extract data from every interaction, tracing in detail the path from macro to micro, back to the source where the line of work begins.8 His visual mappings offer a crucial perspective on how user behaviour is exploited, aligning with the notion that the phone is not a passive object but an active participant in manipulating user actions. Through art, Joler reveals the flow of data, control mechanisms, and user profiling in ways that standard analyses may not capture in an accessible way, bridging a gap between user experience and the hidden layers of surveillance capitalism.

The Steps, the Movements and the Choreography
Choreographing human behaviour through design can be understood as how smartphones, through notifications, vibrations, and interfaces, orchestrate human movements and interactions in real life. Like a choreographer arranging a dance, persuasive design ensures that users move their hands, eyes, and bodies in predictable patterns, drawing them into constant interaction with the device.9 Andreas Spahn differentiates persuasion from manipulation, highlighting how persuasive design falls closer to manipulation, as it leverages asymmetrical power dynamics and strategic rationality to nudge behaviour without informed consent.10 Choreography in this context is not just metaphorical. The gestures we perform when interacting with our phones, such as immediately reaching out when a notification pings or relentlessly scrolling on social media, are bodily responses to the design architecture of persuasive technology.11 Again, this choreography is far from accidental; it’s created by highly skilled designers who, trained at prestigious institutions, work to fine-tune these digital systems to maximize user time, attention, and ultimately, profit.12

Here, the concept of embodied experience becomes central to understanding how these interactions affect users on a physical and psychological level. Unlike traditional user experience design, which aims to create seamless and intuitive interactions, embodied experience captures the way these designs are integrated into users' physical actions and perceptions. As Shoshana Zuboff puts it, this is achieved “through methods that bypass human awareness, individual decision rights, and the entire complex of self-regulatory processes that we summarize with terms such as autonomy.”13 The phone, therefore, is not merely an extension of the hand but a device that seamlessly integrates itself into the rhythms and patterns of daily life, influencing behaviours in ways that are both unconscious and habitual.14


While smartphones facilitate this embodied choreography, they also alienate users from their own bodies.15 The device, often perceived as an extension of the self, draws individuals into a virtual realm where another time, space, and ‘you’ take shape.16 This duality—where the phone acts as both a tool of connection and a mechanism of control—reveals the tension between embodied presence and digital alienation. The smartphone, in this context, becomes a sophisticated instrument of behavioural control, capitalizing on every tracked data point. By linking persuasive design with behavioural manipulation, surveillance capitalism transforms human interaction with technology into a form of data-driven choreography, shaping actions to serve profit-driven motives while often bypassing user awareness.


In an interview, Dr. Jasmina Maric, a senior lecturer in Interaction Design and Software Engineering, suggested that the same persuasive design principles could, in theory, be used to enhance human well-being and improve user experience rather than merely extract profit.17 However, she highlights that the current economic system steers persuasive technology toward profit at the expense of human agency.

To conclude this section, I question what the issue is. Is it that surveillance capitalism as a model choreographs users’ movements, thoughts, and bodies entirely for capitalist profit, at all costs? Is it that these mechanisms operate largely unconsciously, without us even really noticing? Is it that we live in the mere beginning of the technological revolution and, like the car at the beginning of the industrial revolution, we need critical safety measures: seatbelts, airbags, proper brakes, and a functional urban infrastructure of roads?


2: Are They Watching? What Do They See?

Observation in the Digital Age: Redefining "Watching"
Imagine standing in a crowded room, aware of hundreds of eyes on you. Now replace those eyes with invisible sensors—your phone, observing not just what you do but how you do it. In this digital age, “watching” takes on a nuanced meaning. Unlike human observation, which is often passive, limited to the sense of sight, and bound by context, algorithmic "watching" is active, systematic, and pervasive. It doesn’t just see; it interprets, categorizes, and predicts.18


This idea of being “watched” is central to my artistic practice and forms the core of my enquiry: Are they watching? And if so, what do they see? Through my work, I explore the act of watching as it applies to the relationship between humans and the digital devices that observe them. In my installations and video works, I examine how our phones, as extensions of our bodies, track, record, and interpret our behaviours. By drawing attention to the phone’s perspective—the data it collects and the inferences it makes—I aim to extract and project how this intimate, yet invasive relationship reshapes a sense of agency and self-perception.19

The word watching traditionally implies intentionality—to observe attentively over a period of time.20 Smartphones, however, go beyond human notions of attention. They track everything from thumb movements to location changes, creating a continuous stream of behavioural data. As Zuboff highlights in The Age of Surveillance Capitalism, this is not mere observation but behavioural modification.21 While humans can "see" without really watching, algorithms attentively monitor, analysing even the minutest interactions. Moreover, this observation operates under the guise of convenience and comfort. When your phone suggests a route based on traffic or a playlist for your mood, it feels helpful, almost intimate. Receiving assurance and validation from social media when you need it, feels genuinely good.22

The comfort of these features is a byproduct of a surveillance ecosystem designed not for your benefit but for profit. The hook is powerful and an attachment forms.23 Users engage with their smartphones in deeply personal ways; for many, the phone has become their most intimate partner.24 It is the first thing they see in the morning and the last thing they put away at night—an intimate object that no one else is to touch.25 This attachment isn’t accidental. Big Tech companies, with the goal of keeping you engaged, manipulate your neurochemistry to make the screen irresistible so that, while you are choreographed, you produce data. TikTok exemplifies this with the unprecedented data collection enabled by its persuasive design. In an article in the Washington Post, Geoffrey A. Fowler writes that the app generates an "abnormal" number of network requests—210 in the first nine seconds of use—sending over 500 kilobytes of data (half a megabyte, or roughly 125 pages of typed text) to the Internet.26 This data includes details such as screen resolution and the Apple advertising identifier, which can be used to "fingerprint" your device, even if you’re not logged in. These methods underline how your phone "watches" not to assist but to chart, predict, nudge, and extract value from your behaviour.
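The fingerprinting technique Fowler describes can be illustrated with a toy sketch: hashing a handful of slow-changing device attributes yields a stable identifier that follows the device across sessions, no login required. The attribute names and the function below are hypothetical assumptions for illustration, not TikTok's or any vendor's actual method.

```python
import hashlib

def fingerprint(attributes: dict) -> str:
    """Derive a stable device 'fingerprint' by hashing attributes that rarely
    change. The same device maps to the same ID on every visit, even without
    a login. Attribute names here are illustrative, not a real schema."""
    # Canonical ordering ensures the hash is stable regardless of dict order.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "screen_resolution": "1170x2532",
    "os_version": "iOS 17.4",
    "timezone": "Europe/Stockholm",
    "language": "sv-SE",
}
print(fingerprint(device))  # identical attributes always yield the same ID
```

No single attribute identifies you, but the combination is often unique enough to do so; this is why fingerprinting sidesteps cookie deletion and logged-out browsing.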

It is important to note that while smartphones may capture and process vast amounts of information, they do not "see" in the literal sense. A recent report, based on a Cox Media Group presentation aimed at companies using its services, claims that its "Active Listening" tool can tailor advertising using device microphones to pick up conversations happening near the device—potentially revealing a covert feature. Most experts, however, argue that devices are not constantly "listening" but can process contextual data through algorithms that analyse metadata.27 This nuanced form of observation is central to surveillance capitalism, where users' interactions with their devices help build predictive models based on what can be inferred from aggregated data rather than direct observation. This leads to an intriguing parallel: humans can hear but not listen, see but not watch. While this is often our state in moments of distraction or inattention, the technology we use operates differently. It doesn’t see, but it most certainly watches (as per the definition of watching). And while it doesn’t hear (at least according to most experts), it most certainly attentively listens.



Data and Metadata: Building a Digital Twin
The relationship between the smartphone as a body extension and its role in data collection is inherently symbiotic. 28 As intimate prosthetics, smartphones blur the boundaries between human and machine.29 They shape behaviour through persuasive design, embedding themselves as indispensable extensions of the body and mind.30 Users knowingly surrender their data, unaware of the scale or implications of the exchange.

As data is collected through biometrics—like fingerprints, Face ID, or emotional engagement—these interactions generate a stream of data informing algorithms.31 Apple's TrueDepth camera, used to unlock an iPhone, for example, projects thousands of invisible dots and captures infrared images to create a depth map of your face, adapting to changes in appearance and working even in total darkness.32 This dynamic raises ethically significant questions about privacy, consent, and the extent to which individuals are aware of and can mitigate their digital footprints in a landscape dominated by surveillance capitalism.

Every interaction with your smartphone contributes to the creation of a digital trail, a footprint, or, to picture it, a digital twin: a detailed, data-driven portrait of you.33 This twin is not just a mirror; it’s a predictive model built to anticipate your desires, vulnerabilities, and actions. Biometric data like fingerprints and face scans combine with behavioural data—how you scroll, pause, or hesitate—to form an intricate map of who you are. Even emotional cues, such as lingering on a post or reacting to a notification, feed into this composite. Take, for instance, the way smartphones collect data during seemingly mundane activities. When you check your phone late at night, it records the time, your location, the apps you open, and the patterns of your interactions. This data, seemingly innocuous in isolation, becomes part of a vast dataset that allows companies to infer your habits, preferences, and even emotional states. Are you doom-scrolling out of stress? Seeking connection? Avoiding something? Your digital twin tells—and companies capitalize on this knowledge. The implications extend beyond mere personalization.
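How isolated data points aggregate into inferences, the seed of a "digital twin," can be sketched crudely: raw events go in, a behavioural profile comes out. The event fields and the late-night heuristic below are illustrative assumptions, not any company's real model.

```python
from collections import Counter
from datetime import datetime

def infer_habits(events):
    """Aggregate raw interaction events into crude behavioural inferences.
    Each event is innocuous alone; together they sketch a profile.
    Fields and thresholds here are illustrative, not a real schema."""
    hours = [datetime.fromisoformat(e["time"]).hour for e in events]
    apps = Counter(e["app"] for e in events)
    return {
        "most_used_app": apps.most_common(1)[0][0],
        # A naive heuristic: majority of activity between 23:00 and 05:00.
        "late_night_user": sum(h >= 23 or h < 5 for h in hours) / len(hours) > 0.5,
    }

# Three mundane interactions: checking the phone at night, then at midday.
events = [
    {"time": "2025-01-10T23:40:00", "app": "social"},
    {"time": "2025-01-11T00:15:00", "app": "social"},
    {"time": "2025-01-11T13:02:00", "app": "maps"},
]
print(infer_habits(events))
```

Real predictive models are vastly more complex, but the asymmetry is the same: the user sees three taps; the system sees a pattern.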

This is all stated in the privacy policies—if one were to actually read them—and, with this fiction of consent, tech companies legally exploit users through expansive data collection, building a digital twin to choreograph.

As Vladan Joler’s New Extractivism illustrates, this process is not random but systematic. Joler maps out the layers of invisible infrastructures: data pipelines, algorithms, and corporate systems that transform human actions into commodities. The phone, in this context, becomes more than a tool; it’s a node in a vast surveillance network, designed to feed the digital twin and, by extension, the mechanisms of surveillance capitalism. In Joler’s work Facebook Algorithmic Factory, the infrastructure for building your digital twin becomes evident.34 The chart reveals how interests, such as 'ethnic affinity,' expat status, or personal economy, are combined with data points from life events, preferred routes, or even political beliefs, to construct hyper-specific profiles tailored for targeted advertising and behavioural predictions, with alarming precision. The investigation in my practice has led me to search for a comprehensive list of what movements and non-movements are data points, to see how we are choreographed down to clear steps such as thumb pressure, scroll speed, pauses, and drafts never sent: subtle gestures that reveal as much as deliberate actions. Still, this information remains elusive, leaving me to question who actually has access to this knowledge.35 This opacity, paired with the detailed mapping Joler outlines, highlights how companies prioritize their control over user transparency. It forces us to ask: how much of our digital selves do we own? Can a device designed to serve also exploit? By embedding itself into our daily lives, the smartphone becomes more than a tool—it becomes a co-actor in the choreography of modern existence.



Who are "They" and What Do They See?
The question of "they" directs us to the hidden actors and infrastructures behind our devices.36 On the surface, "they" are the corporations, governments, and platforms that thrive on surveillance capitalism, profiting from the relentless extraction and commodification of data. A project by DISNOVATION.ORG highlights this dynamic and who “they” are by using advanced big data analytics to create detailed psychological, cultural, and political profiles of Big Tech companies themselves—mirroring the surveillance algorithms these corporations use to profile their users.37 Yet, "they" also encapsulate the algorithms that power these systems, learning from user behaviour to predict and manipulate thoughts and actions. While this essay critiques the smartphone, it is not the hardware itself but the software and platforms within it that act as the true agents. The smartphone, an intimate extension of the self, is a portal to vast systems of data extraction. As Nick Srnicek outlines in Platform Capitalism, platforms such as Google, Facebook, and TikTok operate not merely as tools but as dominant economic entities that derive value by capturing and holding user attention.38 Srnicek highlights that platforms thrive on network effects -the more users engage, the more valuable the platform becomes- creating a system that incentivizes the continuous monitoring and manipulation of human behaviour.

Persuasive design turns every gesture into a data point: how often you check who liked your post, your hesitation before deleting a photo, your retaking of the same photo again and again. These micro-behaviours, often unnoticed, are analysed and monetized, aligning with the logic of the Attention Economy. In this model, platforms competitively treat human attention as a finite resource to be extracted and commodified, employing persuasive techniques that nudge users into sustained interaction. Tiziana Terranova critiques this system for how it restructures not only labour but also the consumption of time and cognition, transforming users into participants in an economy where engagement is the product and attention is socially commodified.39

What "they" see is not just a reflection of user actions but an encoded and abstracted representation of behaviours and desires. The publication Reality Harvester: Nature after Data after Nature offers a metaphor for this process.40 Its content, presented entirely in code, reflects how platforms reduce human behaviours into inaccessible and abstract data streams, further reinforcing their power to manipulate attention and shape interactions.

The smartphone shapes habits, tracking biometric data, emotions, and location to create a "digital twin." While it may feel personal, its scale and opacity reveal how platforms blur the lines between autonomy and control, intimacy and exploitation. As Srnicek observes, this dynamic extends beyond individual screens, with platforms dictating behaviours through personalized content recommendations that manipulate user focus to maximize engagement and profit. This encoded view enables platforms to choreograph attention, shaping not only what users see but also how they interact, further entrenching their control over digital behaviour. The Attention Economy transforms even mundane gestures into opportunities for value extraction, choreographing behaviour to align with profit-driven objectives. Ultimately, the smartphone does more than observe—it choreographs.


3: Why Should We Care?

Why should we care? Short answer: it’s a threat to the fabric of human society.

Already in 2015, Facebook filed a patent for technology capable of analysing user emotions through the cameras on personal devices, demonstrating the extent of surveillance ambitions.41 A Facebook researcher acknowledged that while their platform fosters engagement, it often leaves users feeling divided and depressed. An internal Facebook study revealed in 2019 that its algorithms are programmed to amplify content that provokes strong reactions, resulting in the widespread virality of outrage and misinformation.42 Let’s entertain for a second what that means today, in 2025, in the middle of an information war speculated to be WW3, during which Big Tech knowingly insists on growing what it calls “engagement”.43 They have the tools to change, and they could, tomorrow, but seem to respond only to economic penalties, like the EU's €1.2 billion record fine against Meta.44 Without stronger legislation, their unchecked power continues to commodify our attention, emotions, and bodies for profit.

What does it mean when our attention, emotions, and bodies are commodified for profit? In the age of surveillance capitalism, this commodification is not a distant abstraction but a lived reality. From the moment we wake up to the moment we sleep, our devices collect data about our lives, feeding algorithms that shape not only what we see but how we act. The consequences of this system are profound. Technology, once a tool for connectivity and expression, now extracts our attention, drives addiction, and shapes our behaviours in ways we barely notice.45

The Center for Humane Technology, a non-profit organisation led by the Silicon Valley whistleblower Tristan Harris, identifies key areas of concern: the erosion of democratic functioning, the destruction of our information ecosystem, and the exploitation of vulnerable populations.46 These effects threaten the very fabric of human society, raising urgent questions about autonomy, agency, and ethics.47

The Mediators, Amplifiers, and Extractors: Artistic Response
When researching artists and artistic practices addressing surveillance capitalism, a clear pattern emerges: most non-male artists (read: those the algorithm showed me) are exploring identity, body, image, and digital aesthetics, rather than power dynamics, data, surveillance capitalism, persuasive design, and smartphone design. While these are not mutually exclusive but in fact deeply interconnected, my understanding is that identity, body, image, and digital aesthetics are a symptom of the infrastructure and power dynamics of surveillance capitalism, which, like capitalism itself, builds on a patriarchal, racist, capitalist political and economic system.48 This reflects the continued influence of technopatriarchy, a term coined by VNS MATRIX in A Cyberfeminist Manifesto for the 21st Century (1991), on technological systems and artistic practices.49

An artist who touches upon choreographing human behaviour, from a perspective of self-representation, self-curation, and a living embodied experience enabled and reproduced as part of the ecosystem of surveillance capitalism, is Candela Capitan. Her work SOLAS (2023) is a choreographic composition exploring the tension between hyper-visibility and objectification in the digital age.50 Focusing on sixteen female bodies, Capitan underscores how bodies are consumed, replicated, and commodified within digital networks. The choreography shifts between the physical stage and the virtual screen, mirroring the process of data extraction where bodies are reduced to analysable fragments for categorization and profit. While the female body remains a primary object of desire, shown here through its repetitive, erotic, almost mechanical movements, it is simultaneously abstracted into data points within the larger framework of surveillance capitalism.

By exposing the choreography of self-representation, Capitan highlights how visibility functions as both currency and trap. Aligning with broader themes of surveillance capitalism, SOLAS critiques the commodification of bodies in digital spaces, revealing how these processes are intertwined with data extraction and behavioural manipulation. Capitan’s practice raises questions about agency: Can we curate what we allow others—or algorithms—to see? Or are we perpetually performing for a digital audience, shaped by the very systems we engage with?

We curate our behaviour when we know others are watching, but how often do we forget that surveillance is already embedded in our devices?51 In Sell Your Phone (and all of your photos) for $1,000 (2020), Eva and Franco Mattes ask participants to sell their phones, revealing the deep attachment we have to these extensions of ourselves.52 The immediate concern of "What will they see?" highlights how self-awareness arises in response to an external gaze yet ignores the constant observation by surveillance systems already in place.53 The project highlights the illusion of control, questioning how surveillance capitalism thrives on our obliviousness to its pervasive gaze. In my own practice, emphasizing the body's role (enabled by awareness) as both a site of resistance and control, recognizes the platforms' reliance on embodied human behaviour to sustain their extractive economies.

In a landscape dominated by opaque algorithms and hidden infrastructures, artistic practices play a crucial role in revealing what remains unseen, explains Louise Wolthers in the research project, exhibition, and publication WATCHED! Surveillance, Art and Photography.54 By visualizing the mechanisms of surveillance capitalism in the macro, artists create a space for critical reflection and emotional engagement in the micro. Technologies, rooted in reductive, binary algorithms, reflect an underlying simplicity that mirrors the logic of persuasive design: you either click, or you don’t. This binary logic doesn’t just shape our digital interactions but choreographs our bodies in similarly reductive ways, prescribing a right-or-wrong path that flattens the complexity of human agency and movement. As Vladan Joler maps the invisible infrastructures of data extraction, artists like Candela Capitan and Eva and Franco Mattes extend this critique by exploring how these dynamics manifest in embodied experiences and personal relationships. Their work reminds us that technology’s binary logic spills into physical and emotional realms, choreographing our movements, decisions, and connections with others.

Artistic practitioners act as mediators, amplifiers, and extractors, uncovering obscured aspects of the digital landscape—surveillance, data privacy, and the meta-digital—to prompt critical reflection and inspire alternative imaginaries. This involves not only revealing the mechanisms of digital control but also envisioning speculative futures that challenge dominant narratives. This practice doesn’t aim for perfection or uniformity; rather, it embraces multiplicity, recognizing that many voices and perspectives are needed to cut through the noise and challenge the hegemony.



Conclusion
Imagining a different future requires confronting the deep-rooted issues of our present. Surveillance capitalism reflects a neoliberal fixation on the individual while obscuring the systemic forces at play.55 At its core, this imbalance begins with exploitation on the micro level: the labour behind smartphone production, often hidden under layers of supply chains and modern slavery, as Vladan Joler’s maps reveal. On the macro level, the environmental devastation caused by the tech industry adds another layer of harm, feeding into a grotesque concentration of wealth and power among a select few.56

These interconnected systems are not accidental; they are carefully choreographed. The algorithms shaping our interactions with technology create a controlled dance that blurs autonomy and agency. On the micro scale, smartphones mediate our daily lives, tracking our movements, pauses, and even hesitations. On the macro scale, these individual data points feed a vast infrastructure designed to commodify our behaviours for profit, seemingly at all costs.

But how do we resist such an all-encompassing system? Individual efforts, like managing screen time, are insufficient in a world where power lies disproportionately with corporations. As Zuboff argues, awareness is the first step toward reclaiming autonomy. Without awareness, we lose the ability to make meaningful choices—whether to resist, withdraw, or even understand the forces shaping our behaviours.

This is where art intervenes. Art operates at the intersection of the micro and macro, making the intangible systems of surveillance capitalism visible and relatable. By highlighting the embodied experience of interacting with technology—the subtle choreography of movements and gestures—art connects individual experiences to broader systemic critiques.

My practice focuses on this choreography because the micro level of embodied interaction is one of the most tangible ways to understand the effects of a surveillance capitalist ecosystem. Through performance, visual language, and experimental inquiry, I aim to contribute to bridging the vastness of macro-level systems with the intimacy of micro-level human-device interactions. This interplay between scales allows for both critical reflection and emotional engagement, making the abstract mechanisms of control more accessible.

Ultimately, this essay and research are only a starting point, yet they have thoroughly informed my practice, grounding the inquiry in the theory of the infrastructure behind the screen and bringing accuracy and depth to it. The vastness and complexity of this infrastructure mean that there is far more to critically explore and analyse. While it is impossible to encompass it all, focusing on the choreographic and embodied experience offers a pathway to understanding the interplay between individual agency and systemic control. Art plays a crucial role in imagining alternative futures, fostering collective awareness, and challenging the invincibility narrative perpetuated by Big Tech. By operating across the micro and macro scales, artistic practices can illuminate hidden systems, provoke critical engagement, and inspire collective action. No system of power is beyond challenge. With awareness and creativity, we can begin to dismantle these structures and imagine futures that prioritize human dignity over algorithmic control.


2. Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: PublicAffairs, 2019), 9.
3. Tanya Kant, “A History of the Data-Tracked User,” MIT Press Reader, October 8, 2021, https://thereader.mitpress.mit.edu/a-history-of-the-data-tracked-user/.
4. Christine Perfetti, “Guiding Users with Persuasive Design: An Interview with Andrew Chak,” UX Articles by Center Centre, March 1, 2003, https://articles.centercentre.com/chak_interview/.
5. Alejandro Mujica et al., “Addiction by Design: Some Dimensions and Challenges of Excessive Social Media Use,” Medical Research Archives 10 (2022): n.p., https://doi.org/10.18103/mra.v10i2.2677.
6. Hilary Andersson, “Social Media Apps Are ‘Deliberately’ Addictive to Users,” BBC News, July 3, 2018, https://www.bbc.com/news/technology-44640959.
7. “Vladan Joler,” SHARE Lab (blog), July 6, 2016, https://labs.rs/en/vladan-joler/.
8. Vladan Joler, “New Extractivism,” New Extractivism, accessed October 30, 2024, https://extractivism.online/.
9. Andreas Spahn, “And Lead Us (Not) into Persuasion...? Persuasive Technology and the Ethics of Communication,” Science and Engineering Ethics 18, no. 4 (2012): 633–50, https://doi.org/10.1007/s11948-011-9278-y.
10. Spahn, “And Lead Us (Not) into Persuasion…?”, 643.
11. The instinctive act of glancing at and tending to a smartphone, almost like caring for an infant, is a striking illustration of the deep psychological attachment and dependency with a bodily manifestation that drives my research.
12. Johann Hari, Stolen Focus: Why You Can’t Pay Attention (London: Bloomsbury Publishing, 2022).
13. Zuboff, The Age of Surveillance Capitalism, 307-308.
14. Russell W. Belk, “Extended Self in a Digital World,” Journal of Consumer Research 40, no. 3 (October 2013): 477–500, https://doi.org/10.1086/671052.
15. Justin Harmon and Lauren Duffy, “Alienation from Leisure: Smartphones and the Loss of Presence,” Leisure/Loisir 46, no. 1 (2022): 1–21, https://doi.org/10.1080/14927713.2021.1886870.
16. Michael Lynch, “Leave My iPhone Alone: Why Our Smartphones Are Extensions of Ourselves,” The Guardian, February 19, 2016, Technology section, https://www.theguardian.com/technology/2016/feb/19/iphone-apple-privacy-smartphones-extension-of-ourselves.
17. Jasmina Marić, interview by Irma Beširević, Beyond the Screen: A Dialogue on Data and Humanity, podcast audio, aired December 1, 2024, published by K103 Göteborgs Studentradio, https://open.spotify.com/show/5sGJndUrFBIoii4DBLPz4R?si=7d0c4de7ac8d496d.
18. David Lyon, "Surveillance Capitalism, Surveillance Culture, and Data Politics," in Data Politics, ed. Didier Bigo et al. (London: Routledge, 2019).
19. Extract refers to the process of uncovering or isolating the mechanisms of data collection and inference, making visible the typically hidden layers of surveillance capitalism. Project involves translating these findings into a tangible or visual form, allowing audiences to critically engage with how such systems influence agency and self-perception. Together, these terms underscore the combined approach of investigation, representation, and communication central to the artistic inquiry.
20. "Watch," Dictionary.com, accessed November 22, 2024, https://www.dictionary.com/browse/watch.
21. Zuboff, The Age of Surveillance Capitalism.
22. Sarah Diefenbach and Laura Anders, "The Psychology of Likes: Relevance of Feedback on Instagram and Relationship to Self-Esteem and Social Status," Psychology of Popular Media 11, no. 2 (2022): 196–207, https://doi.org/10.1037/ppm0000360.
23. Nir Eyal, Hooked: How to Build Habit-Forming Products (New York: Portfolio/Penguin, 2014).
24. Alexis Blue, "How Your Smartphone Is Affecting Your Relationship," ScienceDaily, materials provided by the University of Arizona, accessed October 22, 2024, https://www.sciencedaily.com/releases/2019/02/190211140046.htm.
See also: Emanuela S. Gritti, Robert F. Bornstein, and Baptiste Barbot, "The Smartphone as a ‘Significant Other’: Interpersonal Dependency and Attachment in Maladaptive Smartphone and Social Networks Use," BMC Psychology 11, no. 1 (2023): 296, https://doi.org/10.1186/s40359-023-01339-4.
25. David Sbarra, Julia Briskin, and Richard B. Slatcher, "Smartphones and Close Relationships: The Case for an Evolutionary Mismatch," OSF, November 5, 2018, https://doi.org/10.31234/osf.io/rqu6f.
26. Geoffrey A. Fowler, “Perspective | Is It Time to Delete TikTok? A Guide to the Rumours and the Real Privacy Risks,” Washington Post, July 13, 2020, https://www.washingtonpost.com/technology/2020/07/13/tiktok-privacy/.
27. Chase DiBenedetto, "Your Devices Might Actually Be Listening: Ad Company Claims to Use ‘Active Listening’ Tool," Mashable, December 15, 2023, https://mashable.com/article/your-devices-listening-ad-company-claims.
See also: Andrew Griffin, "Is Your iPhone Listening to You? Probably Not," The Independent, September 3, 2024, https://www.independent.co.uk/tech/is-my-phone-listening-to-me-ad-microphone-privacy-b2606445.html.
See also: Dana Rezazadegan, "Is Your Phone Really Listening to Your Conversations? Well, Turns Out It Doesn’t Have to," The Conversation, June 20, 2021, http://theconversation.com/is-your-phone-really-listening-to-your-conversations-well-turns-out-it-doesnt-have-to-162172.
28. Lydia J. Harkin and Daria Kuss, “‘My Smartphone Is an Extension of Myself’: A Holistic Qualitative Exploration of the Impact of Using a Smartphone,” Psychology of Popular Media 10, no. 1 (January 2021): 28–38, https://doi.org/10.1037/ppm0000278.
29. Thomas Nyrup, “Smart Phones as an Embodied Technology,” accessed March 14, 2024, https://www.academia.edu/24584176/Smart_phones_as_an_embodied_technology.
30. Yue Lin et al., “Smartphone Embodiment: The Effect of Smartphone Use on Body Representation,” Current Psychology 42, no. 30 (October 1, 2023): 26356–74, https://doi.org/10.1007/s12144-022-03740-5.
31. “Biometric Data - Definition, FAQs,” Innovatrics, accessed January 1, 2025, https://www.innovatrics.com/glossary/biometric-data/.
32. “About Face ID Advanced Technology,” Apple Support, accessed December 12, 2024, https://support.apple.com/en-us/102381.
33. Bernard Marr, “What Is Digital Twin Technology - And Why Is It So Important?” Forbes, accessed November 25, 2024, https://www.forbes.com/sites/bernardmarr/2017/03/06/what-is-digital-twin-technology-and-why-is-it-so-important/.
34. Vladan Joler et al., “Quantified Lives on Discount,” SHARE Lab (blog), August 19, 2016, https://labs.rs/en/quantified-lives/.
35. This term has brought me closer to accessing the knowledge and sources I seek, introducing terminology previously outside my scope. While I have yet to find a comprehensive overview, I am following a lead by examining persuasive design from the perspective of its creators and the capitalists who implement these strategies. Many of them regard these choreographies as innovative and beneficial, often lacking a critical stance on their broader implications.
36. “They” strongly evokes associations with conspiracy theories—a territory my artistic practice teeters on the edge of. The term carries connotations of misinformation and ambiguity. Yet, through my inquiry and the work of DISNOVATION.ORG, “they” are now, through the investigation underpinning this essay, distinctly situated within the context of my practice.
37. DISNOVATION.ORG, PROFILING THE PROFILERS, 2019, http://profilingtheprofilers.com/live.html.
38. Nick Srnicek, Platform Capitalism (Cambridge, UK: Polity Press, 2017).
39. Tiziana Terranova, “Attention, Economy, and the Brain,” Culture Machine 13 (2012).
40. Amy Boulton et al., Reality Harvester: Nature After Data After Nature (Skogen, 2020), https://www.skogen.pm/@johan/works/4RaJfT5DwFbePZXF2/www.skogen.pm.
41. Curtis Silver, "Patents Reveal How Facebook Wants To Capture Your Emotions, Facial Expressions and Mood," Forbes, accessed December 15, 2024, https://www.forbes.com/sites/curtissilver/2017/06/08/how-facebook-wants-to-capture-your-emotions-facial-expressions-and-mood/.
42. David Ingram, Olivia Solon, Brandy Zadrozny, and Cyrus Farivar, "The Facebook Papers: Documents Reveal Internal Fury and Dissent over Site’s Policies," NBC News, October 25, 2021, https://www.nbcnews.com/tech/tech-news/facebook-whistleblower-documents-detail-deep-look-facebook-rcna3580.
43. HASE Fiero, "World War 3 Is an Information War," Medium, July 27, 2024, https://informationwarfare.com/world-war-3-is-an-information-war-449ccdbad644.
See also: Tekla Berozashvili, “World War 3: The Ethical Dimensions of Information Warfare in the Digital Age,” ResearchGate, accessed December 14, 2024, https://www.researchgate.net/publication/380069837_World_War_3_The_Ethical_Dimensions_of_Information_Warfare_in_the_Digital_Age.
See also: Jarred Prier, “Commanding the Trend: Social Media as Information Warfare,” Strategic Studies Quarterly 11, no. 4 (2017): 50–85, https://www.jstor.org/stable/26271634.
44. "1.2 Billion Euro Fine for Facebook as a Result of EDPB Binding Decision," European Data Protection Board, accessed May 11, 2024, https://www.edpb.europa.eu/news/news/2023/12-billion-euro-fine-facebook-result-edpb-binding-decision_en.
45. Preston M. Torbert, “‘Because It Is Wrong’: An Essay on the Immorality and Illegality of the Online Service Contracts of Google and Facebook,” Case Western Reserve Journal of Law, Technology and the Internet 12 (2020–2021), https://heinonline.org/HOL/P?h=hein.journals/caswestres12&i=1.
See also: Christopher W. Chagnon and Sophia E. Hagolani-Albov, “Data Extractivism: Social Pollution and Real-World Costs,” in The European Digital Economy (Routledge, 2023).
See also: Alejandro Mujica et al., “Addiction by Design: Some Dimensions and Challenges of Excessive Social Media Use,” Medical Research Archives 10 (January 1, 2022), https://doi.org/10.18103/mra.v10i2.2677.
46. “Key Issues Overview - Center for Humane Technology,” accessed October 26, 2024, https://www.humanetech.com/key-issues.
47. Beatriz Botero Arcila and Rachel Griffin, “Social Media Platforms and Challenges for Democracy, Rule of Law and Fundamental Rights,” requested by the European Parliament's Committee on Civil Liberties, Justice and Home Affairs, completed April 2023, http://www.europarl.europa.eu/supporting-analyses.
48. Paul B. Preciado, “BAROQUE TECHNOPATRIARCHY: REPRODUCTION,” Artforum (blog), January 1, 2018, https://www.artforum.com/features/baroque-technopatriarchy-reproduction-237175/.
See also: Alyssa Adamson, “Capitalism and the Problem of Intersectionality,” Public Seminar, April 25, 2015, https://publicseminar.org/2015/04/capitalism-and-the-problem-of-intersectionality/.
See also: Deborah K. King, “Multiple Jeopardy, Multiple Consciousness: The Context of a Black Feminist Ideology,” Signs: Journal of Women in Culture and Society 14, no. 1 (October 1988): 42–72, https://doi.org/10.1086/494491.
49. VNS Matrix, “A Cyberfeminist Manifesto for the 21st Century / VNS Matrix,” January 16, 2018, https://vnsmatrix.net/projects/the-cyberfeminist-manifesto-for-the-21st-century.
50. Patuca Rodríguez, “‘SOLAS’ es una coreografía sobre la sobreexposición del cuerpo femenino” [“‘SOLAS’ Is a Choreography about the Overexposure of the Female Body”], Cultura Inquieta, July 23, 2024, https://culturainquieta.com/estilo-de-vida/solas-es-una-coreografia-sobre-la-sobreexposicion-del-cuerpo-femenino/.
51. John G. Adair, “The Hawthorne Effect: A Reconsideration of the Methodological Artifact,” Journal of Applied Psychology 69, no. 2 (May 1984): 334–45, https://doi.org/10.1037/0021-9010.69.2.334.
52. Eva & Franco Mattes, SELL YOUR PHONE (AND ALL OF YOUR PHOTOS) FOR $1,000, 2020, https://0100101110101101.org/sell-your-phone-and-all-of-your-photos-for-1000/.
53. Roland Toth and Tatiana Trifonova, “Somebody’s Watching Me: Smartphone Use Tracking and Reactivity,” Computers in Human Behavior Reports 4 (August 1, 2021): 100142, https://doi.org/10.1016/j.chbr.2021.100142.
54. Louise Wolthers, Dragana Vujanović Östlind, and Niclas Östlind, WATCHED! Surveillance, Art and Photography (Cologne: Verlag der Buchhandlung Walther König, 2016).
55. Similarly, this essay has deliberately adopted the same tactic: focusing on an individualistic perspective, reframed as a micro-perspective, in order to understand the macro.
56. Pinpointing the problem as “a grotesque concentration of wealth and power among a select few” leads to the logical conclusion that the core issue is capitalism itself.