
When the Observer Becomes the Observed

Project Overview
"When the Observer Becomes the Observed" is an interactive installation that delves into the intricate relationships between humans, technology, and their environments. At its core, the project emphasizes the concept of "sensing" as an active engagement, contrasting traditional sensors with human perception.
This work introduces the notion of situated sensing. Here, AI serves as a tool that takes fixed inputs and produces open-ended outputs, enabling a departure from one-to-one mappings in data interpretation. This underscores the dynamic and performative nature of data: understanding is not merely a passive display but an active, context-driven process.
Drawing on Karen Barad's theory of agential realism, the project reflects on the agency of measurement instruments: how tools such as environmental sensors shape what can be known about the observed object and, in turn, the actions of the observer. Traditional sensors function as exact measuring instruments that represent the physical world. Humans, by contrast, are subjective, inaccurate sensors whose measured quantities are often shaped by their perception of the environment.
In this project, we speculate on the possibility of a sensing system in which the human being becomes the sensor. We observe the human sensor through their movement, as a performative expression of the measurement. We explore this concept through site-specific embodied practices, gathering movement and textual expressions related to environmental sensing from professional choreographers. This data is then used to fine-tune a large language model that generates abstract words from the input movements. The model is integrated into a system as a critical alternative to the technology of sensing, in which sensing becomes a reflection of the observer's subjective bodily expression rather than a 1:1 mapping of the phenomenon.
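As a rough illustration of this movement-to-word step, the sketch below assumes the fine-tuned model can be loaded as an ordinary causal language model through Hugging Face Transformers; the checkpoint name, prompt format, and decoding settings are placeholders rather than the project's actual configuration.

```python
# Minimal sketch of the movement-to-word step. The checkpoint name is hypothetical;
# in practice it would point to the model fine-tuned on the workshop data.
from transformers import pipeline

generator = pipeline("text-generation", model="observer-observed/movement-lm")

# A textual descriptor of an observed movement. In the installation this would be
# derived from the overhead camera feed rather than typed by hand.
movement = "slow circular sweep of the arms at chest height, head tilted toward the floor"

prompt = f"Movement: {movement}\nAbstract words:"
result = generator(prompt, max_new_tokens=12, do_sample=True, temperature=1.1)

# The model answers with open-ended, abstract vocabulary rather than a numeric reading.
print(result[0]["generated_text"])
```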
Workshop 1
We explored how humans can act as sensors to measure the environment and how they express the collected data. In this workshop, we asked participants to use their senses (smell, touch, sound, and vision) to measure the environment, one sense at a time. They then expressed their perceptions through drawing, three-line stories, and performance. The figure on the right shows our classification and analysis of participants' behaviors and performances.


Workshops 2 & 3
In these two workshops, we mainly invited professional choreographers and physical-theater performers. Participants were asked to position themselves as sensors, using their sensory perceptions (smell, sound, vision, touch, temperature, and humidity) to measure the environments. They were then instructed to isolate different parts of their bodies (e.g., head, hands, core, knees, feet) and give a short description of the perception sensed by each body part. The collected data is then used to fine-tune a large language model that generates abstract words from the input movements. The model is integrated into a system as a critical alternative to conventional sensing technologies, in which sensing shifts toward reflecting the observer's subjective bodily expression rather than providing a direct 1:1 mapping of the phenomenon.
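The sketch below illustrates one way such workshop material might be turned into fine-tuning records, assuming each session is stored as per-body-part descriptions paired with the words the participants used; the file name and field names are hypothetical.

```python
# Sketch of converting workshop sessions into prompt/completion records for fine-tuning.
# The session structure and output file name are illustrative assumptions.
import json

sessions = [
    {
        "environment": "humid rooftop at dusk",
        "body_parts": {
            "head": "tilts away from the warm draft",
            "hands": "open slowly, testing the dampness of the air",
            "feet": "root into the gravel, weight shifting back",
        },
        "words": ["heaviness", "salt", "slow heat"],
    },
]

with open("movement_words.jsonl", "w", encoding="utf-8") as f:
    for s in sessions:
        # Flatten the per-body-part descriptions into a single movement descriptor.
        movement = "; ".join(f"{part}: {desc}" for part, desc in s["body_parts"].items())
        record = {
            "prompt": f"Movement: {movement}\nAbstract words:",
            "completion": " " + ", ".join(s["words"]),
        }
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```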


Installation
When the Observer Becomes the Observed is an interactive installation exploring the entangled agencies among humans, computational systems, and environments. It speculates on a sensing system in which humans themselves become active sensors. In this installation, the audience, the environment, and the AI system together form an apparatus of intra-action, offering a critical alternative to the conventional 1:1 model of data representation.
The work is set up within a semi-private zone framed by curved, semi-transparent fabric. A ceiling-mounted projector cycles through a series of projected environments, each accompanied by its own soundscape, creating a shifting atmospheric context. On the left side of the fabric, a pre-recorded performance video serves as a behavioral reference. On the right side, an interactive system responds to the audience in real time.
Participants are encouraged to act as human sensors, measuring and expressing environmental data through movements or everyday actions captured by an overhead camera. An integrated AI system translates these actions into dynamically generated words and narrative sequences, projecting them onto the fabric in real time and feeding them back to the observer for further reflection.
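The sketch below outlines what such a real-time loop could look like, assuming the overhead camera is read with OpenCV and movement is summarized as simple frame-difference energy; the generate_words() stub stands in for the fine-tuned language model, and the threshold and prompt mapping are illustrative only.

```python
# Sketch of a real-time camera-to-words loop. Motion is reduced to frame-difference
# energy; generate_words() is a stand-in for the fine-tuned model call above.
import cv2

def generate_words(descriptor: str) -> str:
    # Placeholder for the language-model call; returns fixed words for illustration.
    return "stillness, weight, slow heat" if "still" in descriptor else "scatter, pulse, lift"

cap = cv2.VideoCapture(0)  # overhead camera
ok, prev = cap.read()
if not ok:
    raise RuntimeError("camera not available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Coarse movement measure: mean absolute difference between consecutive frames.
    energy = cv2.absdiff(gray, prev_gray).mean()
    prev_gray = gray

    descriptor = "still, small gestures" if energy < 2.0 else "large, sweeping motion"
    words = generate_words(descriptor)

    # In the installation these words are projected onto the fabric; here they are
    # simply drawn onto a preview window.
    cv2.putText(frame, words, (30, 60), cv2.FONT_HERSHEY_SIMPLEX, 1.2, (255, 255, 255), 2)
    cv2.imshow("human sensor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```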
In response to the theme "When the Observer Becomes the Observed," the installation's projected environment extends beyond the physical boundaries of the space. As people move around the outer edges of the installation, they find themselves already within the projection, becoming part of the environment before they even enter it. When they peer through the translucent fabric into the interior, they are not only immersed in the projection but also become part of the environment, simultaneously observing and being observed by those within.




Art Paper
Traditional sensors function as exact measuring instruments that represent the physical world. Humans, by contrast, are subjective, inaccurate sensors whose measured quantities are often shaped by their perception of the environment. In this project, we speculate on the possibility of a sensing system in which the human being becomes the sensor. We observe the human sensor through their movement, as a performative expression of the measurement. We explore this concept through site-specific embodied practices, gathering movement and textual expressions related to environmental sensing from professional choreographers. This data is then used to fine-tune a large language model that generates abstract words from the input movements. The model is integrated into a system as a critical alternative to the technology of sensing, in which sensing becomes a reflection of the observer's subjective bodily expression rather than a 1:1 mapping of the phenomenon.
The art paper has been accepted and will be presented at SIGGRAPH Asia in Hong Kong this December.
Authors: Zhen Wu, Xiaomin Fan, Mika Shirahama, Tristan Braud


