Music has power. It can heighten our joy when we are excited and calm us when we are nervous. Moreover, music connects people. Harmoniiize is a connected instrument that allows anyone to create music through gestures and express their creativity through musical notes.

IoT Scenario
Rippen Liu - https://youtu.be/WXr7-uwid3w

Introduction

Our current studio is a highly stressful environment. We come to the studio for meetings, individual work, and team projects, and we spend most of our working time in this space. Despite it being a highly productive space, we often burn out, and our productivity deteriorates significantly over time.

The studio doesn't have to be such a negative environment, and Harmoniiize can make that change. Harmoniiize allows anyone to express their feelings, identity, and creativity through improvising music, and to de-stress from all the work happening in the studio. In addition, Harmoniiize lets users collaborate on making music, allowing them to connect on another level through non-verbal, musical communication. Harmoniiize is a musical oasis in the studio that heals and recharges us.


How It Works

Similar to a theremin, Harmoniiize's gesture and distance sensor tracks hand position. Horizontal movement determines the note, and vertical movement determines the octave. Harmoniiize has five distinct instrument sounds: flute, guitar, piano, cello, and drum. The instrument is selected according to the collective mood of studio users as gauged by Feel-O-Meter, as noted below.
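As a sketch of this mapping in plain C++: an Arduino-style map() rescales the sensor's 0-240 readings into a note index (horizontal) and an octave index (vertical), then looks the frequency up in the note tables used by the firmware later on this page. The function names here are illustrative, not part of the firmware.

```cpp
#include <cassert>

// Arduino-style integer map(): rescale x from [inMin, inMax] to [outMin, outMax].
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

// Note frequency tables (Hz) across three octaves, taken from the firmware below.
const int tones1[] = { 523, 587, 659, 698, 784, 880, 988, 1047 };        // low octave
const int tones2[] = { 1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093 }; // middle octave
const int tones3[] = { 2093, 2349, 2637, 2794, 3136, 3520, 3951, 4186 }; // high octave

// Horizontal hand position (x) picks the note; vertical position (z) picks the octave.
// Sensor readings are assumed to span 0-240, as in the firmware.
int noteFor(int x, int z) {
    int noteIdx = mapRange(x, 0, 240, 0, 7); // eight notes across the horizontal axis
    int octave  = mapRange(z, 0, 240, 0, 2); // three octaves along the vertical axis
    if (octave <= 0) return tones1[noteIdx];
    if (octave == 1) return tones2[noteIdx];
    return tones3[noteIdx];
}
```

For example, a hand at mid-height and mid-width (x = z = 120) lands in the middle octave at 1397 Hz.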

Feel-O-Meter Indication    Harmoniiize Instrument
Very Happy                 Flute
Happy                      Guitar
Neutral                    Piano
Sad                        Cello
Very Sad                   Drum
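The table is effectively a five-way lookup. A minimal sketch, assuming the Feel-O-Meter scale arrives as an integer with 1 = Very Sad through 5 = Very Happy (the same ordering the firmware's instrumentChange() uses below); the out-of-range fallback is our own assumption:

```cpp
#include <cassert>
#include <string>

// Map the Feel-O-Meter's five-point mood scale to a Harmoniiize instrument.
// Assumption: 1 = Very Sad ... 5 = Very Happy, matching instrumentChange() in the firmware.
std::string instrumentFor(int mood) {
    switch (mood) {
        case 1: return "Drum";   // Very Sad
        case 2: return "Cello";  // Sad
        case 3: return "Piano";  // Neutral
        case 4: return "Guitar"; // Happy
        case 5: return "Flute";  // Very Happy
        default: return "Piano"; // out-of-range data falls back to neutral (assumption)
    }
}
```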


Once a user finishes making music, they press the button to mark the end of the session, which triggers Harmoniiize to upload the music file to the communal MIIPS SoundCloud, opening possibilities for remixing and collaboration.
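One plausible way to wire this up (a sketch, not the implementation we built) is a Particle webhook that forwards the end-of-session event to a small server, which then performs the actual SoundCloud upload through SoundCloud's API. The endpoint URL below is a placeholder and the event name is illustrative:

```json
{
  "event": "diot/2019/smartstudio/harmoniiize/session-complete",
  "url": "https://example.com/harmoniiize/upload",
  "requestType": "POST",
  "mydevices": true,
  "json": {
    "device": "{{{PARTICLE_DEVICE_ID}}}",
    "published_at": "{{{PARTICLE_PUBLISHED_AT}}}"
  }
}
```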


Process

Ideation  

The ideation process started with a brainstorming session. The brainstorming session included a discussion about problems of the studio space which each team member discovered through the previous individual discovery process. 

The problem we identified is that the studio is an interactive learning space where students come to study, chat, and learn, but workload and deadlines can make it stressful.

How might we help MIIPS students feel re-energized during times of stress?

We thought about the idea of an interactive music station. The initial idea was a music station situated in the kitchen (a relaxing area for MIIPS students) where we could play music freely with body and hand gestures. Playing music helps release emotions, and we want everyone, regardless of musical experience or knowledge, to be able to create music individually or collaboratively with friends.

Inspirations & Precedents

Video of a K-POP musician making music - "[I Live Alone] Henry Singing 'Uptown Funk'"

Theremin: https://www.youtube.com/watch?v=-QgTF8p-284


Iteration 

After the initial dry run of the presentation, we received valuable feedback from the judges, which helped us push our ideas further and re-evaluate the key elements and value propositions of Harmoniiize. We thought carefully about how to leverage data to provide meaningful value to users, and came up with the idea of connecting Harmoniiize with the collective mood data of MIIPS students.

Why music?

During our first ideation/brainstorming session, our team reached a consensus that music is the most effective and accessible method of stress relief for ourselves. We believed that listening to and creating music provide a therapeutic effect during times of mental distress. We confirmed our experience and hypothesis through secondary research on music's effect on us, and we found the concept of musical "catharsis" particularly appealing. Catharsis refers to "the process of releasing, and thereby providing relief from, strong or repressed emotions" (Cambridge Dictionary). Music provides us with a channel to express the emotions and thoughts that cannot be conveyed through words, and allows us to feel the "vicarious emotions" associated with the music. By listening to or creating music that suits our mood and state, we find freedom and relaxation.

One of the judges at the final presentation pointed us to research on improving productivity: creating intentional interruptions in the workflow and engaging in other creative activities have been shown to enhance performance.

Why connected? 

We observed the behavior of students in the MIIPS cohort around musical experience. We noticed that many students, as well as non-MIIPS peers, share the music they are listening to or enjoying on social media. Many of those who play musical instruments post videos of themselves practicing or performing and share them with friends. At a larger scale, SoundCloud is the most prominent online platform for anyone with musical interests and talent to share their work and collaborate with other musicians. However, filming or recording, editing, and sharing is currently a messy process. We saw this as an opportunity for Harmoniiize: it automatically uploads recordings of music improvised through gestures to a communal SoundCloud that anyone can access for listening, sharing on social media, or collaborating.

In addition, we realized that there is an interesting intersection between Harmoniiize and two other projects in the IoT class ecosystem, Feel-O-Meter and Bump. Connecting with these two projects would potentially enhance the connected experience of the students in the studio as a whole. Further explanation is provided in the next section.


Storyboard



Connect

Feel-O-Meter: Feel-O-Meter is an interactive device for feeling and feedback expression. 

How we interact

Feel-O-Meter displays the average feedback or feeling of the students exiting the studio via five different icons and matching color LED lights. Harmoniiize receives the feedback data from Feel-O-Meter and translates them into five different types of musical instruments. 

Why we connect

The common ground between Feel-O-Meter and Harmoniiize is that both projects aim to provide students with an outlet to express their feelings. Harmoniiize would provide Feel-O-Meter with an additional channel for students to be informed on the feeling/feedback, as Harmoniiize would embody and reflect the student's feelings at the time.


Bump: Bump is a connected treasure hunt project which transforms the entire studio space into an interactive interface.

How we interact

When Harmoniiize is activated by a user, Bump receives a signal that the Interactive Music Station is on and turns on its lights around the studio. The lights act as messengers, telling users in the studio that someone is de-stressing with Harmoniiize and encouraging them to participate and socialize.

Why we connect

We saw that the Bump and Harmoniiize projects share a common vision of transforming today's dull, boring studio space into a fun, positive one. Through this collaboration, we expect to create a synergy that helps both projects reach their goals effectively and successfully.


Challenges

1. A primary application challenge was configuring and operating the FeatherWing Music Maker Shield, due to the lack of libraries for the Particle Argon. As a result, the speaker produces a single timbre rather than different instruments according to the mood data. This also restricted us to the Argon's built-in PWM function, tone(), which turns the Argon into a low-fidelity DAC.

2. Because only a single I2C communication channel was available, an off-the-shelf amplifier could not be connected to amplify the produced notes. Instead, we had to build our own low-fidelity amplifier using an NPN transistor, a capacitor, a resistor, and an external power supply.
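A side note on timing: with tone() and a fixed per-note delay, the tempo is fully determined by the delay. The firmware below plays each note for 250 ms, so the implied tempo is 240 notes per minute (four per second). A quick sanity check:

```cpp
#include <cassert>

// Tempo implied by a fixed per-note delay: notes per minute = 60000 / delay_ms.
int notesPerMinute(int delayMs) {
    return 60000 / delayMs; // integer division; delayMs assumed > 0
}
```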


Future Applications & Vision

In the near future, we envision a truly interactive music experience for people to relax and have fun in the studio. 

Specifically, we envision the following functionalities:

1. Gesture recognition not limited to hands or a confined area. Ideally, a person can use their whole body to perform dance-like movements to create music. This requires a movement sensor similar to Microsoft's Kinect, capable of identifying movements over a large area.

2. Gesture recognition that tracks multiple people's movements simultaneously for melody generation. Currently, the station cannot multitask: due to the limitations of the distance sensor, only one point of input can be recognized. In the future vision, multiple people's gestures are recognized, and social bonding becomes the key element of the station, as a group of participants needs rapport to achieve a harmonious outcome.

3. Leverage the music data stored in the database so friends can collaborate on, edit, and retrieve tracks.

Interactive Music Station - IoT
Rippen Liu - https://www.youtube.com/watch?v=GrKsi0Cfcy0

Bill of Materials

Particle Argon     x 1

SparkFun ZX gesture and distance sensor     x 1

4 Ω, 3 W speaker     x 1

External power supply     x 1

Potentiometer     x 1

NPN transistor     x 1

22 µF capacitor     x 1

2.2 kΩ resistor     x 1

LED     x 1



Code

#include "ZX_Sensor.h" //Library for gesture and distance sensor

#define speakerPin D3 // Speaker
#define led D4 //LED

int note; //Note frequency value
int x; //X axis distance of hand
int z; //Z axis distance of hand
int xVal; //Note value
int yVal; //Octave value

int melody[] = {1319,1319,1319,1047,1319,1397,784}; //Start melody
int duration[] = {500,500,500,500,500,250,500}; //Note duration
String instrument; //MIDI instrument

//String notes[]={c    d    e    f    g    a    b    C};
int tones1[] = { 523, 587, 659, 698, 784, 880, 988, 1047 }; //Lowest Octave Notes
int tones2[] = { 1047, 1175, 1319, 1397, 1568, 1760, 1976, 2093 }; //Middle Octave Notes
int tones3[] = { 2093, 2349, 2637, 2794, 3136, 3520, 3951, 4186 }; //Highest Octave Notes

ZX_Sensor   sensor = ZX_Sensor( 0x10 ); //Setting up the I2C communications for ZX Sensor

void setup() {
  
  Particle.subscribe ("diot/2019/smartstudio/",handleSharedEvent); //Subscribing studio events
  sensor.init (); //Initializing ZX sensor
  pinMode (speakerPin, OUTPUT); //Setting speaker to output
  pinMode (led, OUTPUT); //Setting LED to output
  digitalWrite(led,LOW); //Setting LED as off
  Particle.publish("diot/2019/smartstudio/harmoniiize/","Music is Playing"); //Informing start of music
  playStart(); //Playing start melody
}

void loop() {
    
    // Deciding instrument
    
    if (instrument == "Drum") {
        //set MIDI to Drum
    } else if (instrument == "Cello") {
        //set MIDI to Cello
    } else if (instrument == "Piano") {
        //set MIDI to Piano
    } else if (instrument == "Guitar") {
        //set MIDI to Guitar
    } else if (instrument == "Flute") {
        //set MIDI to Flute
    }
    
    //Reading sensor values
    x = sensor.readX();
    z = sensor.readZ();
    
    //Mapping note index
    xVal = map(x,0,240,0,7);
    yVal = map(z,0,240,0,2);
    
    // Selecting the note
    if (yVal==0) {
        note = tones1[xVal];
    } else if (yVal==1) {
        note = tones2[xVal];
    } else if (yVal==2) {
        note = tones3[xVal];
    }
    tone(speakerPin,note,250); //Playing the selected note
    delay(250); // 240 notes per minute (=60000/delay in ms)
}

//Playing Melody
void playStart() {
    for (int i=0;i<7;i++) {
        tone(speakerPin,melody[i],duration[i]);
        delay(duration[i]);
    }
}

//Catching published event
void handleSharedEvent(const char *event, const char *data) {
    
    String eventName = String(event);
    String dataValue = String(data);
    
    
    if( eventName.indexOf( "feelometer_avg" ) != -1 ){
      instrumentChange(dataValue); //Instrument selection from Feel-o-meter data
    } else if (eventName.indexOf( "Shhh" ) != -1) {
        redLight(); //Displaying Shhh command
    }
}

//Instrument Selection
void instrumentChange(String Val) {
    if (Val == "1.000000") {
        instrument = "Drum";
    } else if (Val == "2.000000") {
        instrument = "Cello";
    } else if (Val == "3.000000") {
        instrument = "Piano";
    } else if (Val == "4.000000") {
        instrument = "Guitar";
    } else if (Val == "5.000000") {
        instrument = "Flute";
    }
}

//Turning ON Shhh signalling
void redLight() {
    digitalWrite(led,HIGH);
}

Feedback:

1. Some of the feedback we received concerned the high-pitched sound, which might itself cause stress. This is due to the lack of an amplifier library, and we intend to solve it in the next stage; the vision is that the amplifier will simulate different instruments' sounds.

2. Nearly everyone we showed the demo to thought it was fake and staged. We were surprised ourselves at how smooth and easy the creation process is. This brings us back to our core idea: making it fun and easy for everyone in the studio to pick up and de-stress.

3. Think about how to make the connection more meaningful and engaging, and how to foster creativity after people upload their tunes online.

4. What if there were multiple Harmoniiize stations, each representing one instrument, so that people could create diverse tunes simultaneously?


Courses

49713 Designing for the Internet of Things


A hands-on introductory course exploring the Internet of Things and connected product experiences.



Created

March 5th, 2019