Inside Out
12 Weeks, Spring 2024
Course
Embodied Interaction Design Studio
# Research Through Design
# Arduino Prototyping
# Human-Computer Interaction
# Machine Learning
Methodology
*Everyday things and objects given intelligence and agency so that they can become collaborative partners (Rozendaal et al., "Objects with Intent: Designing Everyday Things as Collaborative Partners," 2019).
Objects with Intent
Our Strategy
- Responsiveness to what is happening around it
- Adaptability to its owner
- Exploration of new alternatives
User Research
+ Questionnaire (quantitative data)
The questionnaire helps us collect demographic information about users and understand the relationship between their moods and music.
It also enables us to identify:
– which moods users commonly experience in daily life
– which musical parameters most influence users' emotional states
+ Exploratory Research approach (qualitative data)
+ Music Database (quantitative data)
We aimed to understand:
– how mood can be translated into visible and audible forms
– through which actions users naturally communicate emotions
– how different types of gestures correlate with emotional states (gesture–mood coupling)
By observing participants’ spontaneous interactions, we gained insights into embodied expressions of emotion and how users externalize internal moods through music-related actions.
User Feedback
1 How mood can be translated into something visible and audible
2 Which intuitive actions users naturally use to communicate moods
Physical Interface Exploration
The controller is held by the user and is responsible for detecting physical gestures and movements.
The base serves as the central unit for receiving and transmitting sound, while also housing essential internal hardware components.
Two Proposals
We drew inspiration from the classic telephone. This device, like our concept, facilitates one-on-one emotional communication, supports long-distance connections, and serves as a tool for emotional exchange.
In this concept, we draw on the metaphor of a letterbox. The expression of emotions from the other party is likened to a letter, and the processes of receiving and sending letters symbolize the emotional exchange. The act of opening the box carries a sense of anticipation and curiosity, akin to the excitement of discovering what’s inside a letterbox.
Letters and gifts serve as emotional connections across physical distances, which aligns perfectly with our primary theme.
❌ Technical problems are difficult to solve
✅ More intuitive thanks to the product’s form
✅ Easier to achieve on the technical level
User Test
We also redesigned the shape of the controller within the gift box concept, as users’ intuitive interactions with the original form did not align with its intended functionality.
Physical Interface Development
The device consists of two main components: a handheld controller and a base unit.
The controller is held by the user and detects hand gestures and movement using a pressure sensor and the built-in IMU of the Arduino Nano. It is connected to the base via a wired connection.
The base unit serves as the central system for processing and feedback. It integrates components such as the RFID tag, photoresistor, RGB ring LED, and speaker, enabling a multi-sensory interactive experience.
The RFID tag provides signals for detecting the placement of the controller.
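As a rough illustration of how the base might poll these sensors, here is a minimal Arduino sketch; the pin assignments and thresholds are assumptions for illustration, not the project’s actual wiring.

```cpp
// Minimal polling loop for the base unit's analog sensors.
// Pin numbers and thresholds are illustrative assumptions.
#include <Arduino.h>

const int PRESSURE_PIN   = A0;   // pressure sensor in the controller grip (assumed)
const int PHOTO_PIN      = A1;   // photoresistor watching the box lid (assumed)
const int LID_THRESHOLD  = 600;  // light level above which the lid counts as open
const int GRIP_THRESHOLD = 200;  // reading above which the controller counts as held

void setup() {
  Serial.begin(115200);
}

void loop() {
  int grip  = analogRead(PRESSURE_PIN);  // 0-1023: how firmly the controller is held
  int light = analogRead(PHOTO_PIN);     // rises when the box lid is opened

  bool lidOpen = light > LID_THRESHOLD;
  bool held    = grip  > GRIP_THRESHOLD;

  Serial.print("lidOpen=");
  Serial.print(lidOpen);
  Serial.print(" held=");
  Serial.println(held);

  delay(50);  // ~20 Hz polling
}
```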
On the software side, we trained a machine learning model to map user actions to commonly experienced emotional states and generate corresponding music feedback in real time.
Sensor
*Iterations of prototyping and testing
Machine Learning
Software: Edge Impulse
a. Collect movement data
- Collect movement data such as shaking and swinging, using the accelerometer of the MPU sensor.
- Each sample includes accelerometer data on the x, y, and z axes.
b. Classify movement data by labeling moods
- Different types of movement samples are labeled with corresponding mood tags such as “Stress,” “Calm,” “Positive,” and “Anger.”
c. Train the model
- Train a machine learning model on the extracted feature data to build a mood–movement classification model.
- During training, the dataset is split into training and test sets (as shown in the images, 73% for training and 27% for testing).
d. Evaluate the model
- Evaluate the model’s performance to ensure it can accurately classify moods on unseen data.
- Use the test set to check classification accuracy across the mood categories, as shown in the images with the correct and incorrect rates for each mood.
- Identify misclassified samples for further improvement.
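Edge Impulse exports the trained impulse as an Arduino library whose run_classifier() call returns a confidence score per mood label. The snippet below sketches that inference loop; the header name is an assumption, while the buffer sizes come from the exported library’s constants.

```cpp
// Minimal Edge Impulse inference loop (header name "inside-out_inferencing.h" is assumed).
#include <inside-out_inferencing.h>

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];  // x/y/z samples, interleaved

void setup() {
  Serial.begin(115200);
}

void loop() {
  // ...fill `features` with accelerometer readings at the trained sampling rate...

  signal_t signal;
  numpy::signal_from_buffer(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE, &signal);

  ei_impulse_result_t result;
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) return;

  // Pick the mood with the highest confidence ("Stress", "Calm", "Positive", "Anger").
  size_t best = 0;
  for (size_t i = 1; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
    if (result.classification[i].value > result.classification[best].value) best = i;
  }
  Serial.println(result.classification[best].label);
}
```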
Interaction Mapping
In one user test, participants reported that the melodies were too short and lacked clear emotional cues, making them difficult to distinguish. Based on this feedback, we made improvements.
We extended the length and increased the complexity of the melodies, while still keeping them simple and easy to remember. We also aligned the rhythm of the melody more closely with the user’s hand movement frequency to enhance the sense of embodiment.
In addition, for each emotional track, we experimented with different instrument timbres to reflect the textures that users had visualized during earlier sessions. Through iterative development and user feedback, we refined our approach to creating emotionally evocative melodies.
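One way to couple melody rhythm to hand-movement frequency is to estimate the shake rate from zero crossings of one accelerometer axis and map it to a tempo. The sketch below illustrates this idea; the sampling rate, window size, and BPM range are assumptions, not values from the project.

```cpp
// Hypothetical tempo-following helper: estimates shaking frequency from
// zero crossings of one accelerometer axis and maps it to a tempo (BPM).
#include <Arduino.h>

const int SAMPLE_HZ   = 50;   // assumed accelerometer sampling rate
const int WINDOW_SIZE = 100;  // 2-second analysis window
float window[WINDOW_SIZE];
int   idx = 0;

void addSample(float accelX) {
  window[idx] = accelX;
  idx = (idx + 1) % WINDOW_SIZE;
}

float estimateTempo() {
  // Count sign changes around the window mean.
  float mean = 0;
  for (int i = 0; i < WINDOW_SIZE; i++) mean += window[i];
  mean /= WINDOW_SIZE;

  int crossings = 0;
  for (int i = 1; i < WINDOW_SIZE; i++) {
    if ((window[i - 1] - mean) * (window[i] - mean) < 0) crossings++;
  }

  // Two crossings per oscillation; convert shakes/second to beats/minute.
  float shakeHz = (crossings / 2.0f) * SAMPLE_HZ / WINDOW_SIZE;
  float bpm = shakeHz * 60.0f;
  return constrain(bpm, 60.0f, 160.0f);  // clamp to a musically useful range
}

void setup() { Serial.begin(115200); }

void loop() {
  // ...read the accelerometer, call addSample(ax), then use estimateTempo()
  // to adjust the melody's playback tempo...
}
```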
Mapping
Instrument: Marimba
Default Tempo: 120
Instrument: Woodwind
Default Tempo: 80
Instrument: Synth
Default Tempo: 120
Instrument: Synth
Default Tempo: 120
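A straightforward way to wire the classifier output to these tracks is a small lookup table. The instrument/tempo pairs below come from the mapping above, but which mood gets which pair is an illustrative assumption.

```cpp
// Lookup table from classifier label to music-track parameters.
// The mood-to-track pairings shown here are assumptions.
#include <Arduino.h>
#include <string.h>

struct MoodTrack {
  const char* mood;        // label produced by the classifier
  const char* instrument;  // timbre of the track
  int defaultTempo;        // BPM before tempo-following adjusts it
};

const MoodTrack TRACKS[] = {
  {"Positive", "Marimba",  120},
  {"Calm",     "Woodwind",  80},
  {"Stress",   "Synth",    120},
  {"Anger",    "Synth",    120},
};

const MoodTrack* trackForMood(const char* mood) {
  for (const MoodTrack& t : TRACKS) {
    if (strcmp(t.mood, mood) == 0) return &t;
  }
  return &TRACKS[0];  // fall back to the first track
}
```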
Interaction Flow
Receive Mode
The box lights up to inform you that a new message has been received.
Open it to start playing the sounds produced by the other user.
The sounds play automatically until they finish.
Close the box to stop the playback and end the process.
Sending Mode
Inside, you will find a controller that helps you express your mood.
Pick up the controller and express your mood by moving it.
The box plays sounds in real time according to your mood-related movements.
Put the controller back and close the box to send your mood to another user living far away.
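The two modes suggest a simple state machine driven by the photoresistor (lid open or closed) and the RFID reading (controller docked or lifted). The sketch below is an assumed outline of that flow; all helper functions, pins, and thresholds are hypothetical stand-ins, not the project’s actual API.

```cpp
// Hypothetical interaction-flow state machine for the base unit.
// All helpers below are assumed stubs; replace with real sensor reads and playback calls.
#include <Arduino.h>

enum State { IDLE, RECEIVING, SENDING };
State state = IDLE;

bool lidOpen()          { return analogRead(A1) > 600; }   // photoresistor (assumed pin/threshold)
bool controllerDocked() { return digitalRead(2) == HIGH; } // RFID presence signal (assumed pin)
bool messageWaiting()   { return false; }                  // set when a remote mood arrives
void playIncoming()     { /* play the other user's mood track */ }
void playLive()         { /* sonify the live movement classification */ }
void sendMood()         { /* transmit the recorded mood to the remote box */ }

void setup() {
  pinMode(2, INPUT);
  Serial.begin(115200);
}

void loop() {
  switch (state) {
    case IDLE:
      if (messageWaiting() && lidOpen())         state = RECEIVING;  // Receive Mode
      else if (lidOpen() && !controllerDocked()) state = SENDING;    // Sending Mode
      break;
    case RECEIVING:
      playIncoming();
      if (!lidOpen()) state = IDLE;              // closing the box ends playback
      break;
    case SENDING:
      playLive();
      if (controllerDocked() && !lidOpen()) {    // put back + close = send
        sendMood();
        state = IDLE;
      }
      break;
  }
}
```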
Gallery
Video
Recognition
Team Members
Lisa Buttaroni
Kaiyuan Liu
Zixin Mou
Chunhan Yi