Inside Out

How might we foster a deeper emotional connection between people at a distance?

During a school project in the Embodied Interaction Design Studio, we wanted to explore a new way of building remote connections, allowing people to share and understand each other's moods beyond words. Inside Out is a tangible device that bridges emotional distance by translating gestures into music.

Embodied Interaction

Emotion & Mood

Music Therapy

Machine Learning

Year: 2024
Time: 5 Months
Team: 5 People
Role: UX, Industrial Design

PROBLEM

Challenge of Emotional Resonance

Long-distance relationships make emotional expression, understanding, and empathy significantly harder, hindering deep connection. Even with today's advanced communication technologies, text and images still cannot fully capture how people feel in many situations.

How could we facilitate more natural and meaningful emotional communication in an increasingly connected yet physically distant world?

APPROACH

Research Through Design

The iterative creation and testing of tangible prototypes served as our primary method for actively exploring and generating new insights into how technology can mediate remote emotional connection.

RTD approach: A methodology where insights are generated directly from the process of creating and testing the design artifact itself.

RESEARCH

To Develop a Deeper Understanding

To ground the design, we explored three research questions through user analysis, observations, and case studies.

Why is it crucial for people to feel emotionally connected?

User analysis

How can we foster empathy and help people express themselves naturally when apart?

Observations

What can we learn from other devices designed to promote emotional connection?

Case studies

Design Brief

A tangible artifact that lets users capture and translate their moods into unique sounds through physical manipulation, allowing close relationships (families, couples, friends) to connect from their own private spaces when far apart.

DEVELOPMENT

Mapping the Gestures × Moods × Sounds

I. Decoding the language of emotion and action

We ran several generative workshops to observe how users intuitively express moods through gestures with everyday objects. By analyzing these interaction patterns and their musical preferences, we built a foundational map linking physical actions to specific audio-visual languages.

Exploration experiment of tangible interactions

Questionnaires, participant interviews and responses

II. Learn by trying, from movement to music

The core technical challenge was translating raw gesture data into meaningful, real-time music. This required a dual approach: we analyzed sound synthesis to identify auditory parameters that convey emotion, while simultaneously iterating on the technical implementation.

By centering the solution on an accelerometer and the Tone.js library, we created a stable system capable of instantly pairing music clips with recognized movements.
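As a rough sketch of that pairing (the gesture labels and clip paths here are assumptions, not the project's actual assets or code), the base can keep a lookup from recognized movements to pre-loaded Tone.js players:

```typescript
import * as Tone from "tone";

// A rough sketch, not the project's actual code: clip paths and gesture
// labels are assumptions standing in for the real, pre-composed tracks.
const clipByGesture: Record<string, Tone.Player> = {
  bounce: new Tone.Player("clips/bright.mp3").toDestination(),
  sway:   new Tone.Player("clips/gentle.mp3").toDestination(),
  shake:  new Tone.Player("clips/restless.mp3").toDestination(),
  thrust: new Tone.Player("clips/intense.mp3").toDestination(),
};

let playing: Tone.Player | null = null;

// Called whenever the movement recognizer emits a new gesture label.
export async function onGesture(label: string): Promise<void> {
  const clip = clipByGesture[label];
  if (!clip || !clip.loaded) return; // unknown label, or buffer still loading
  await Tone.start();                // the audio context must be unlocked by a user action
  playing?.stop();                   // cut whatever was playing
  clip.start();                      // instantly pair the movement with its clip
  playing = clip;
}
```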

Analysis of the music database and sound synthesis

Testing of convertibility and technical feasibility of motion parameters

III. Shaping the physical form and interaction

User feedback led us to a two-part design: a base and a controller. We explored this concept through "traditional telephone" and "gift box" metaphors with low-fidelity mockups.

The chosen direction was then refined via an iterative cycle of 3D printing and testing, continuously improving ergonomics and hardware integration to arrive at the final form.

Physical interface testing

Iterations of prototyping and testing

FINAL DESIGN

A Tangible Bridge for Moods

Inside Out is a device that transforms natural gestures into music to convey different moods, building emotional bridges with others through tangible interaction at a distance.

Shaped like a simple cube, it embodies the essence of a letter box.

USER FLOW

Interaction States

The device is designed to work in pairs and at a distance. It enables bidirectional communication where a user can craft and send a musical expression of their mood, while also receiving and playing expressions sent from their partner.

Sending Mode

1. Open the box

You will see a controller that will help you express your mood.

2. Take the controller

Pick up the controller and express your mood by freely moving it.

3. Express your mood

The box plays different sounds and lights in real time, responding to the mood-related movements you make.

4. Put back the controller

Put the controller back; after the green loading light appears, close the box to send your mood to another user far away.

Receiving Mode

1. Notice the signal

The box will light up to inform you of the new message received.

2. Open the box

Open it to start playing the sounds produced by another user.

3. Listen to the other's sounds

The sounds play automatically until the message is finished.

4. Close the box

Close the box to stop the sounds playing and end the process.
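Taken together, the two modes can be summarized as a small state machine driven by the light sensor, the controller dock, and incoming messages. The state and event names below are illustrative assumptions, not the project's actual code:

```typescript
// Illustrative only: state and event names are assumptions, not the project's code.
type State = "idle" | "expressing" | "sending" | "incoming" | "playing";

type Event =
  | "boxOpened" | "boxClosed"                   // detected by the light sensor
  | "controllerLifted" | "controllerReturned"   // detected via the RFID reader
  | "messageReceived" | "playbackFinished";

function next(state: State, event: Event): State {
  switch (state) {
    case "idle":
      if (event === "messageReceived") return "incoming"; // the box lights up
      if (event === "boxOpened") return "expressing";     // sender opens the box
      break;
    case "expressing":
      if (event === "controllerReturned") return "sending"; // green loading light
      break;
    case "sending":
      if (event === "boxClosed") return "idle";              // mood is sent to the partner
      break;
    case "incoming":
      if (event === "boxOpened") return "playing";           // playback starts
      break;
    case "playing":
      if (event === "playbackFinished" || event === "boxClosed") return "idle";
      break;
  }
  return state; // ignore events that do not apply in the current state
}
```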

CONTROL

Mood Gestures

The final design supports four moods, each mapping a specific gesture to a unique musical track and colored light. Performing a gesture with the controller triggers the base to instantly play the corresponding sound and illuminate in response.

Happy

Calm

Stressed

Angry

Squeeze to modulate the tempo

For an additional layer of expressive control, users can also squeeze the controller to change the BPM of the melody, making the mood even stronger.
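As an illustrative sketch (the track files, light colours, and tempo range are assumptions, not the real assets), the four mood mappings and the squeeze-to-tempo control might look like this with Tone.js:

```typescript
import * as Tone from "tone";

// Illustrative mood table: each mood pairs one gesture with a track and a light colour.
// File names, colours, and the tempo range below are assumptions, not the real assets.
const moods = {
  happy:    { track: "tracks/happy.mp3",    color: "#FFC933" },
  calm:     { track: "tracks/calm.mp3",     color: "#4FC3F7" },
  stressed: { track: "tracks/stressed.mp3", color: "#AB47BC" },
  angry:    { track: "tracks/angry.mp3",    color: "#E53935" },
} as const;

export type Mood = keyof typeof moods;

// Squeeze control: map the controller's normalized pressure reading (0..1)
// onto a tempo range, so a harder squeeze makes the mood feel stronger.
const MIN_BPM = 70;
const MAX_BPM = 160;

export function onSqueeze(pressure: number): void {
  const bpm = MIN_BPM + pressure * (MAX_BPM - MIN_BPM);
  // Assumes the melody is sequenced on the Transport; ramp briefly to avoid jumps.
  Tone.Transport.bpm.rampTo(bpm, 0.1);
}
```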

STRUCTURE

Hardware and Components

The prototype consists of two core components: a base for output and a controller for input.

The base unit provides synchronized audio-visual feedback through a Bluetooth speaker and an RGB LED ring. A light sensor detects when it is opened.

The controller uses an Arduino Nano with a built-in IMU to read gestures, a pressure sensor for squeeze-based modulation, and an RFID reader to start and stop the interaction.
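As a sketch of the bridge between the two parts, the controller could stream frames like the following to the base; the field names and units are assumptions based on the components listed above:

```typescript
// Illustrative shape of what the controller streams to the base each frame.
// Field names and units are assumptions based on the components listed above.
export interface ControllerFrame {
  accel: { x: number; y: number; z: number };   // IMU acceleration, in g
  gyro: { x: number; y: number; z: number };    // IMU angular velocity, in deg/s
  pressure: number;                             // squeeze sensor, normalized 0..1
  controllerDocked: boolean;                    // derived from the RFID reader
  timestampMs: number;                          // milliseconds since the session started
}
```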

TECHNICS

Machine Learning

The model was trained on Edge Impulse with gesture data collected from user interviews and testing. We labeled the samples with the different moods so the classifier could trigger the corresponding melodies and light colours. The more data it is trained on, the more accurate it becomes.
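A hedged sketch of how the classifier output could drive the rest of the system, assuming a hypothetical classifyGesture wrapper around the deployed model and the onGesture playback hook sketched earlier (neither is the project's actual API):

```typescript
// Hypothetical glue code: classifyGesture stands in for the deployed Edge Impulse
// model, and onGesture for the playback sketch above; names are assumptions.
declare function classifyGesture(
  window: number[][]                       // a short window of IMU samples
): Promise<{ label: string; confidence: number }>;
declare function onGesture(label: string): Promise<void>;

const CONFIDENCE_THRESHOLD = 0.7;          // assumed cutoff, tuned in practice

// Classify each window of motion data and trigger the matching melody and light.
export async function handleMotionWindow(samples: number[][]): Promise<void> {
  const { label, confidence } = await classifyGesture(samples);
  if (confidence < CONFIDENCE_THRESHOLD) return; // ignore uncertain predictions
  await onGesture(label);
}
```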

Video

kaiyuan7liu@gmail.com

© KAIYUAN LIU PORTFOLIO 2025
