Project Overview



Inside Out



Inside Out is an embodied interaction device that transforms hand gestures into musical expressions. We wanted to explore a new way of building remote connections, allowing people to share and understand each other's moods beyond words.

Duration
12 Weeks, Spring 2024

Course
Embodied Interaction Design Studio
# Tangible and Embodied Interaction
# Research Through Design
# Arduino Prototyping
# Human-computer Interaction
# Machine Learning




In today's hyper-connected yet emotionally distant world, we explored how tangible interaction could help people communicate emotions with loved ones across distances. “Inside Out” is a gesture-based musical device designed to express and receive moods without words. This project combines emotional design, collaborative activity, and music interaction to propose a new way of staying close—emotionally—when physically apart.



Methodology


*Everyday things and objects can have intelligence and agency, becoming collaborative partners (Rozendaal et al., "Objects with Intent: Designing Everyday Things as Collaborative Partners," 2019).
*"RTD involves creating artifacts as a means to explore and reflect upon design practice and its implications and facilitates an iterative process where design serves both as a method of inquiry and a means of articulating insights" (P. Dalsgaard, "Research In and Through Design: An Interaction Design Research Approach," 2010).

Objects with Intent
Research Through Design

Our Strategy
  • Responsiveness according to what is happening
  • Adaptability to its owner
  • Exploring new alternatives





User Research

Research Question
How can we map the relationship between Mood x Gesture x Melody?



Research Methods

Method 01: Questionnaire
(quantitative data)

The questionnaire helps us collect demographic information about the users and understand the relationship between their moods and music.

It also enables us to identify:
– which moods users commonly experience in daily life
– which musical parameters most influence users' emotional states

Method 02: Participant Observation
+ Exploratory research approach (qualitative data)
+ Music database (quantitative data)

We aimed to understand:
– how mood  can be translated into visible and audible forms
– through which actions users naturally communicate emotions
– how different types of gestures correlate with emotional states (gesture–mood coupling)

By observing participants’ spontaneous interactions, we gained insights into embodied expressions of emotion and how users externalize internal moods through music-related actions.


User Feedback

1 How mood can be translated into something visible and audible




2 Through which intuitive actions users naturally communicate moods






Physical Interface Exploration



Concept
Our device consists of two main components: a handheld controller and a base unit.
The controller is held by the user and is responsible for detecting physical gestures and movements.
The base serves as the central unit for receiving and transmitting sound, while also housing essential internal hardware components.



Two Proposals

1/Classic telephone

We drew inspiration from the classic telephone. This device, like our concept, facilitates one-on-one emotional communication, supports long-distance connections, and serves as a tool for emotional exchange.
2/Gift box

In this concept, we draw a metaphor from the letterbox. The other party's expression of emotion is likened to a letter, and receiving and sending letters symbolize the emotional exchange. The act of opening the box carries a sense of anticipation and curiosity, akin to the excitement of discovering what's inside a letterbox.
Letters and gifts serve as emotional connections across physical distances, which aligns perfectly with our primary theme.

Conclusion (Classic telephone):

❌ Difficult to solve the technical problems

Conclusion (Gift box):

✅ More intuitive thanks to the product's form
✅ Easier to achieve at the technical level




User Test


Insights
Based on the results of our physical interaction user test, we decided to move forward with the gift box proposal and abandon the classic telephone metaphor.
We also redesigned the shape of the controller within the gift box concept, as users’ intuitive interactions with the original form did not align with its intended functionality.




Physical Interface Development



The device consists of two main components: a handheld controller and a base unit.

The controller is held by the user and detects hand gestures and hand movement using a pressure sensor and the built-in IMU of the Arduino Nano. It is connected to the base via a wired connection.

The base unit serves as the central system for processing and feedback. It integrates components such as the RFID tag, photoresistor, RGB ring LED, and speaker, enabling a multi-sensory interactive experience.
The RFID tag provides the signal for detecting whether the controller has been placed back in the base.
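As a rough illustration of how the base could sense these states, here is a minimal sketch. It assumes an MFRC522-style RFID reader on SPI and a photoresistor voltage divider; the pins, threshold, and behavior are illustrative, not the project's actual firmware.

```cpp
// Minimal base-unit sensing sketch (assumptions: MFRC522-style RFID reader on SPI,
// photoresistor divider on A0; pins and threshold are illustrative).
#include <SPI.h>
#include <MFRC522.h>

const int RFID_SS_PIN  = 10;            // assumed chip-select pin
const int RFID_RST_PIN = 9;             // assumed reset pin
const int LDR_PIN      = A0;            // photoresistor divider
const int LID_OPEN_THRESHOLD = 600;     // assumed: more light = lid open

MFRC522 rfid(RFID_SS_PIN, RFID_RST_PIN);

void setup() {
  Serial.begin(9600);
  SPI.begin();
  rfid.PCD_Init();
}

void loop() {
  // Lid state: the photoresistor sees more light when the box is open.
  bool lidOpen = analogRead(LDR_PIN) > LID_OPEN_THRESHOLD;

  // Controller placement (simplified): the controller's RFID tag is readable
  // only while it rests in the base.
  bool controllerSeated = rfid.PICC_IsNewCardPresent() && rfid.PICC_ReadCardSerial();

  Serial.print(lidOpen ? "lid open" : "lid closed");
  Serial.print(" / ");
  Serial.println(controllerSeated ? "controller seated" : "controller lifted");

  delay(200);
}
```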

On the software side, we trained a machine learning model to map user actions to commonly experienced emotional states and generate corresponding music feedback in real time.

Sensors
  • Built-in IMU: detects actions and identifies the corresponding mood through motion trajectories.
  • Pressure sensor: adjusts the BPM of the melody according to pressure.
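To show how the pressure reading could modulate tempo, here is a small sketch. It assumes an FSR-style pressure sensor wired as a voltage divider on A1; the tempo range around each mood's default is an assumption for illustration.

```cpp
// Pressure-to-tempo sketch (assumption: FSR-style sensor on A1; the +/-30 BPM
// range around the mood's default tempo is illustrative).
const int PRESSURE_PIN = A1;

int currentBpm(int defaultBpm) {
  int raw = analogRead(PRESSURE_PIN);                       // 0..1023
  // Harder squeeze -> faster melody, centred on the mood's default tempo.
  int bpm = map(raw, 0, 1023, defaultBpm - 30, defaultBpm + 30);
  return constrain(bpm, 60, 160);
}

void setup() { Serial.begin(9600); }

void loop() {
  Serial.println(currentBpm(120));  // e.g. the "Positive" track's default tempo
  delay(100);
}
```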







*Iterations of prototyping and testing






Machine Learning

Environment:
Hardware: Arduino Nano
Software: Edge Impulse





Procedure:
01. BUILD THE DATA SET
02. TRAIN THE MODEL
03. TEST THE MODEL

a. Collect movement data
  • Collect movement data such as shaking and swinging using the sensors, specifically the IMU's accelerometer.
  • Each sample includes accelerometer data on the x, y, and z axes.
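As a rough sketch of this collection step, the accelerometer values can be streamed over serial so a recording tool (for example, the Edge Impulse data forwarder) can capture labeled samples. This assumes an Arduino Nano 33 BLE Sense with the Arduino_LSM9DS1 library; the sampling rate is an assumption.

```cpp
// Data-collection sketch (assumptions: Arduino Nano 33 BLE Sense, Arduino_LSM9DS1
// library; one comma-separated x,y,z line per sample is streamed over serial).
#include <Arduino_LSM9DS1.h>

const unsigned long SAMPLE_INTERVAL_MS = 16;  // ~60 Hz, assumed sampling rate

void setup() {
  Serial.begin(115200);
  if (!IMU.begin()) {
    Serial.println("Failed to initialise IMU");
    while (true);
  }
}

void loop() {
  float ax, ay, az;
  if (IMU.accelerationAvailable()) {
    IMU.readAcceleration(ax, ay, az);
    // One sample per line: x, y, z acceleration in g.
    Serial.print(ax); Serial.print(',');
    Serial.print(ay); Serial.print(',');
    Serial.println(az);
  }
  delay(SAMPLE_INTERVAL_MS);
}
```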

b. Classify movement data by labeling mood
  • Different types of movement data samples are labeled with corresponding mood tags such as “Stress,” “Calm,” “Positive,” and “Anger.”

c. Train the model
  • Train a machine learning model on the extracted feature data to build a mood-movement classification model.
  • During training, the dataset is split into training and test sets (as shown in the images, 73% for training and 27% for testing).

d. Test the model
  • Evaluate the model’s performance to ensure it can accurately classify moods on unseen data.
  • Use the test set to check the classification accuracy across the different mood categories, as shown in the images with the correct and incorrect rates for each mood.
  • Identify misclassified samples for further improvement.
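Once trained, the model exported from Edge Impulse can run directly on the Arduino. The sketch below is only an outline: the include name "inside_out_inferencing.h" stands in for the project-specific library name Edge Impulse generates, and the IMU setup follows the data-collection sketch above.

```cpp
// On-device classification sketch using the Edge Impulse Arduino library
// (assumption: the library name and IMU choice are placeholders, not the
// project's actual code).
#include <inside_out_inferencing.h>
#include <Arduino_LSM9DS1.h>

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];

// Callback that hands slices of the feature buffer to the classifier.
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
  memcpy(out_ptr, features + offset, length * sizeof(float));
  return 0;
}

void setup() {
  Serial.begin(115200);
  IMU.begin();
}

void loop() {
  // Fill one window with x/y/z acceleration at the model's expected rate.
  for (size_t i = 0; i < EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE; i += 3) {
    float ax, ay, az;
    while (!IMU.accelerationAvailable()) {}
    IMU.readAcceleration(ax, ay, az);
    features[i] = ax; features[i + 1] = ay; features[i + 2] = az;
    delay(1000 / EI_CLASSIFIER_FREQUENCY);
  }

  signal_t signal;
  signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
  signal.get_data = &get_feature_data;

  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // Print the confidence for each mood label ("Stress", "Calm", ...).
  for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
    Serial.print(result.classification[ix].label);
    Serial.print(": ");
    Serial.println(result.classification[ix].value, 2);
  }
}
```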






Interaction Mapping



Music Parameters
We made several iterations of our music system.

In one user test, participants reported that the melodies were too short and lacked clear emotional cues, making them difficult to distinguish. Based on this feedback, we made improvements.

We extended the length and increased the complexity of the melodies, while still keeping them simple and easy to remember. We also aligned the rhythm of the melody more closely with the user’s hand movement frequency to enhance the sense of embodiment.

In addition, for each emotional track, we experimented with different instrument timbres to reflect the textures that users had visualized during earlier sessions. Through iterative development and user feedback, we refined our approach to creating emotionally evocative melodies.


Mapping
Each mood maps a gesture to a melody:

  • Positive: Marimba, default tempo 120 BPM
  • Calmness: Woodwind, default tempo 80 BPM
  • Nervousness: Synth, default tempo 120 BPM
  • Anger: Synth, default tempo 120 BPM
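In code, this table could be a simple lookup from the classified mood to its track and default tempo, which the pressure mapping then modulates. The struct and names below are illustrative only.

```cpp
// Mood-to-melody lookup from the mapping table above (illustrative names).
#include <string.h>

struct MoodTrack {
  const char *mood;
  const char *instrument;
  int defaultBpm;
};

const MoodTrack TRACKS[] = {
  { "Positive",    "Marimba",  120 },
  { "Calmness",    "Woodwind",  80 },
  { "Nervousness", "Synth",    120 },
  { "Anger",       "Synth",    120 },
};

// Returns the track for a classified mood, falling back to the first entry.
const MoodTrack &trackForMood(const char *mood) {
  for (const MoodTrack &t : TRACKS) {
    if (strcmp(t.mood, mood) == 0) return t;
  }
  return TRACKS[0];
}
```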




Interaction Flow

Receive Mode
/1 Notice the signals
The box will light up to inform you that a new message has been received.
/2 Open the box
Open it to start playing the sounds produced by the other user.
/3 Listen to other’s sounds
The sounds will play automatically until they finish.
/4 Close the box
Close the box to stop the sounds playing and end the process.




Sending Mode
/1 Open the box
You will see a controller that will help you express your mood.
/2 Take the controller
Pick up the controller and express your mood by moving it.
/3 Express your moods
The box will start to play sounds in real time according to your mood-related movements.
/4 Put back the controller
Put the controller back and close the box to send your mood to the other user far away.
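The sending flow can be read as a small state machine driven by the lid and controller-placement signals. The sketch below assumes lidOpen / controllerSeated values like those in the base-unit sketch earlier; the states and transitions are illustrative.

```cpp
// Sending-mode flow as a simple state machine (names and transitions are
// illustrative, not the project's actual firmware).
enum SendState { IDLE, BOX_OPEN, EXPRESSING, SENT };

SendState state = IDLE;

void updateSendMode(bool lidOpen, bool controllerSeated) {
  switch (state) {
    case IDLE:
      if (lidOpen) state = BOX_OPEN;                  // /1 open the box
      break;
    case BOX_OPEN:
      if (!controllerSeated) state = EXPRESSING;      // /2 take the controller
      break;
    case EXPRESSING:
      // /3 gestures are classified and played back in real time here
      if (controllerSeated && !lidOpen) state = SENT; // /4 put back and close
      break;
    case SENT:
      // transmit the recorded mood to the paired device, then reset
      state = IDLE;
      break;
  }
}
```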



Gallery 





Video 




Recognition

Interdependence x Milano Design Week, Milan, Italy

Team Members

Andrea Borsato
Lisa Buttaroni

Kaiyuan Liu
Zixin Mou
Chunhan Yi