
Inside Out
How might we foster a deeper emotional connection between people at a distance?
Challenge of Emotional Resonance
Long-distance relationships present significant challenges to emotional expression, understanding, and empathy, hindering deep connection. Despite today's advanced communication technologies, texts and images still cannot fully capture how people feel in many situations.
Research Through Design
To Develop a Deeper Understanding
The iterative creation and testing of tangible prototypes served as our primary method for actively exploring and generating new insights into how technology can mediate remote emotional connection.
Design Brief
A tangible artifact that lets users capture and translate their moods into unique sounds through physical manipulation, enabling close relationships (families, couples, friends) to connect from their own private spaces when far apart.
Mapping the Gestures × Moods × Sounds
We ran several generative workshops to observe how users intuitively express moods through gestures with everyday objects. By analyzing these interaction patterns and their musical preferences, we built a foundational map linking physical actions to specific audio-visual languages.
The core technical challenge was translating raw gesture data into meaningful, real-time music. This required a dual approach: we analyzed sound synthesis to identify auditory parameters that convey emotion, while simultaneously iterating on the technical implementation.
By centering the solution on an accelerometer and the Tone.js library, we created a stable system capable of instantly pairing music clips with recognized movements.
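As a rough illustration of this pairing, the gesture-to-sound mapping can be sketched as a lookup table. The gesture names, clip files, and colours below are illustrative placeholders, not the project's final set; on the device, each clip would be loaded into a Tone.js Player and started when its gesture is recognized.

```javascript
// Illustrative map from recognized gestures to audio-visual responses.
// In the real system, "clip" would back a Tone.js Player instance.
const moodMap = {
  shake:  { clip: "excited.mp3", light: "yellow" }, // energetic mood
  sway:   { clip: "calm.mp3",    light: "blue"   }, // calm mood
  thrust: { clip: "angry.mp3",   light: "red"    }, // angry mood
  lift:   { clip: "happy.mp3",   light: "green"  }, // happy mood
};

// Resolve a recognized gesture label to the clip and light it should
// trigger, or null if the label is unknown.
function resolveGesture(label) {
  return moodMap[label] ?? null;
}

console.log(resolveGesture("sway")); // { clip: "calm.mp3", light: "blue" }
```

Keeping the mapping in one table makes it easy to retune which gesture drives which sound as workshop findings evolve.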
User feedback led us to a two-part design: a base and a controller. We explored this concept through "traditional telephone" and "gift box" metaphors with low-fidelity mockups.
The chosen direction was then refined via an iterative cycle of 3D printing and testing, continuously improving ergonomics and hardware integration to arrive at the final form.
A Tangible Bridge for Moods
Inside Out is a device that transforms natural gestures into music to convey different moods, building emotional bridges with others through tangible interaction at a distance.
Shaped like a simple cube, it evokes the familiar form of a letter box.
Interaction States
The device is designed to work in pairs and at a distance. It enables bidirectional communication where a user can craft and send a musical expression of their mood, while also receiving and playing expressions sent from their partner.
1. Open the box
Inside is a controller that helps you express your mood.
2. Take the controller
Pick up the controller and express your mood by freely moving it.
3. Express your moods
The box plays different sounds and lights in real time, according to the mood-related movements you make.
4. Put back the controller
Return the controller to the box. After the green loading light finishes, close the box to send your mood to another user living far away.
1. Notice the signal
The box lights up to inform you that a new message has been received.
2. Open the box
Open it to start playing the sounds produced by another user.
3. Listen to the other’s sounds
The sounds play automatically until the message is finished.
4. Close the box
Close the box to stop the sounds playing and end the process.
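The two flows above can be summarized as a small state machine. The sketch below collapses the send and receive sides of the paired boxes into one sequence; the state and event names are our own shorthand, not taken from the firmware.

```javascript
// Simplified state machine for one full send/receive cycle across
// the paired boxes (names are illustrative).
const transitions = {
  idle:      { open: "composing" },            // sender opens the box
  composing: { putBack: "loading" },           // gestures are recorded
  loading:   { close: "sent" },                // green light, close to send
  sent:      { receive: "notified" },          // partner's box lights up
  notified:  { open: "playing" },              // opening starts playback
  playing:   { close: "idle", finish: "idle" } // closing or finishing ends it
};

function step(state, event) {
  const next = transitions[state]?.[event];
  if (!next) throw new Error(`invalid event "${event}" in state "${state}"`);
  return next;
}

// Walk a complete cycle: open, put back, close, receive, open, close.
let s = "idle";
for (const e of ["open", "putBack", "close", "receive", "open", "close"]) {
  s = step(s, e);
}
console.log(s); // "idle"
```

Modeling the interaction this way also makes out-of-order actions (e.g. closing mid-recording) explicit design questions rather than silent failures.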
Mood Gestures
The final design supports four moods, each mapping a specific gesture to a unique musical track and coloured light. Performing a gesture with the controller triggers the base to instantly play the corresponding sound and illuminate in response.
Squeeze to modulate the frequency
For an additional layer of expressive control, users can also squeeze the controller to raise the BPM of the melody, making the mood even stronger.
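One simple way to realize this squeeze-to-tempo mapping is a linear scaling of normalized pressure into a BPM range. The range below is an assumption for illustration; on the device, the result would be written to Tone.js's `Transport.bpm` value.

```javascript
// Map a normalized squeeze reading (0..1) onto a tempo range.
// BASE_BPM and MAX_BPM are illustrative values, not the final tuning.
const BASE_BPM = 90;
const MAX_BPM = 150;

function squeezeToBpm(pressure) {
  const p = Math.min(Math.max(pressure, 0), 1); // clamp noisy sensor input
  return BASE_BPM + p * (MAX_BPM - BASE_BPM);
}

console.log(squeezeToBpm(0));   // 90  — no squeeze, base tempo
console.log(squeezeToBpm(0.5)); // 120 — half squeeze
console.log(squeezeToBpm(1));   // 150 — full squeeze, strongest mood
```

Clamping the raw reading first keeps jittery or out-of-range sensor values from producing tempo jumps.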
Hardware and Components
The prototype consists of two core components: a base for output and a controller for input.
The base unit provides synchronized audio-visual feedback through a Bluetooth speaker and an RGB LED ring. A light sensor detects when it is opened.
The controller uses an Arduino Nano with a built-in IMU to read gestures, a pressure sensor for squeeze-based modulation, and an RFID reader to start and stop the interaction.
Machine Learning
The gesture-recognition model was trained on Edge Impulse using the gesture data collected during user interviews and testing sessions. We labeled each recording with a mood to classify it, so that recognized gestures trigger the corresponding melodies and light colours. The more training data the model receives, the more accurate its classification becomes.
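A classifier of this kind typically returns a confidence score per mood label, and a common pattern is to accept only the top label when it clears a threshold, so ambiguous movements don't trigger the wrong melody. The labels and threshold below are assumptions for illustration, not the project's actual values.

```javascript
// Illustrative handling of per-label confidence scores, as an Edge
// Impulse-style classifier might return them. Threshold is assumed.
const CONFIDENCE_THRESHOLD = 0.7;

function pickMood(scores) {
  // scores: e.g. { happy: 0.85, calm: 0.10, excited: 0.03, sad: 0.02 }
  let best = null;
  for (const [label, score] of Object.entries(scores)) {
    if (!best || score > best.score) best = { label, score };
  }
  // Reject low-confidence results so unclear gestures play nothing.
  return best && best.score >= CONFIDENCE_THRESHOLD ? best.label : null;
}

console.log(pickMood({ happy: 0.85, calm: 0.1, excited: 0.03, sad: 0.02 })); // "happy"
console.log(pickMood({ happy: 0.4, calm: 0.35, excited: 0.15, sad: 0.1 }));  // null
```

Tuning the threshold trades responsiveness against misfires: lower values react to looser gestures, higher values demand clearer movements.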