A decorative object that brings users back to their home country through sound

Problem Statement:    

Students living abroad miss the sensorial experience (sight, sound, smell) of the weather in their home countries. They need ways to stay connected to the homes and families that are far away from them.


In this part of the ecosystem, our goal is for students to feel connected to their hometown’s environment through the familiar sounds they used to hear all the time back home. Sound can be delivered non-intrusively and carries valuable memories. Our goal is to design an ambient device that combines an aesthetic appearance with comforting sounds to "bring" students back to their homes.


Concept Description: 

A house-shaped device containing a mini speaker that plays nature sounds according to weather data. We use weather webhooks to send the current conditions (sunny, cloudy, clear sky, rainy) to the Photon and trigger different sound scenarios. For example, when it is a sunny day in the student’s hometown, the speaker plays pre-recorded sparrow chirps; when it is rainy, the student hears toads and frogs croaking; and during winter, the user hears the sound of crickets.
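The condition-to-track mapping described above can be sketched as a small lookup function. This is our own illustration, not the project firmware: the function name and track numbers are assumptions, since the real ordering depends on how the audio files are stored on the SD card.

```cpp
#include <string>

// Hypothetical mapping from a weather condition to a track number on the
// speaker's SD card. Track numbers here are illustrative only.
int trackForWeather(const std::string& condition) {
    if (condition == "Sunny")  return 1;  // sparrow chirps
    if (condition == "Rainy")  return 2;  // toads and frogs croaking
    if (condition == "Winter") return 3;  // crickets
    return 0;                             // unknown condition: stay silent
}
```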

[Image: Prototype of circuit]
[Image: Sound scenarios]

After talking to the other two teams in @home, we brainstormed how to convey sounds that make our user feel at home. Together with the other two teams, we discussed what medium each of us would use to create an environment the user is familiar with. When we interviewed Manchit, our advisor, he expressed how much he missed the weather in Mumbai while being here in Pittsburgh; on snowy days, he always misses how nice the weather was back home. With the two other teams, we decided to use weather data from Mumbai to bring our user home. As the weather in Mumbai changes, the sight team projects different icons on the wall, and the smell team diffuses different scents according to the weather as well. For us, the sound team, it is more straightforward to reveal the weather through sound, for example, the sound of rain. We wanted to incorporate not only the sound of rain or wind but also the sounds of animals such as crows and crickets to further picture home away from home. We then prepared seven audio tracks with different combinations indicating season and weather so that they can correspond to real-time data.

For the physical prototype, we first came up with the idea of a house-shaped box and decided to make it by laser cutting. We then thought about incorporating the skyline of Mumbai into the prototype: by tracing a picture of the skyline, we were able to laser-cut a house with Mumbai’s skyline on it. We also decided to turn the rooftop of the house into a modular piece that can be altered according to users’ needs.
For the coding prototype, we first implemented a button to turn the speaker on and off. Since the “sound of home” should not play when no one is in the room, we attached a motion sensor so that the system only functions when it senses movement. We used webhooks to read the chance and intensity of precipitation from the weather API and trigger the player to play different audio tracks accordingly. Initially, each of the three objects in team @home did this individually, but after getting feedback, we decided it would be more intuitive to let the three objects talk to each other rather than work separately. We then updated the code so that the “sound of home” publishes events that the other two objects can subscribe to and change their outputs accordingly.
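The webhook response in the firmware arrives as a single "~"-delimited string. The parsing logic can be exercised off-device with a standalone helper like the one below; the `Forecast` struct and `parseForecast` name are our own desktop stand-ins, not Particle API, and the example payload is invented.

```cpp
#include <sstream>
#include <string>
#include <vector>

// A desktop stand-in for the on-device parsing of the webhook response,
// which is formatted like "icon~temperature~precipProbability~precipIntensity".
struct Forecast {
    std::string icon;
    double temperature;
    double precipProbability;
    double precipIntensity;
};

Forecast parseForecast(const std::string& payload) {
    // Split the payload on "~" into its four fields.
    std::vector<std::string> parts;
    std::stringstream ss(payload);
    std::string field;
    while (std::getline(ss, field, '~')) parts.push_back(field);

    Forecast f;
    f.icon = parts.at(0);
    f.temperature = std::stod(parts.at(1));
    f.precipProbability = std::stod(parts.at(2));
    f.precipIntensity = std::stod(parts.at(3));
    return f;
}
```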


Both the coding and the physical prototype came out as we expected. The physical prototype is a house-shaped wooden box, 6 in x 6 in with a height of 5 in. A modular, triangular rooftop was also created, with cut-out spaces for the speaker and the motion sensor. The sound team is now the only team connecting to weather data through webhooks; it sends out events for the other two objects to subscribe to and act on accordingly. The following is a summary of the bill of materials, circuit diagram, and iterated code used during the process:
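On the subscriber side, each of the other two objects only needs to switch on the short payload we publish ("clear", "rain", "wind", or "cloud"). The handler below is a hypothetical sketch of that routing, not the other teams' actual code; the function name and return strings are ours.

```cpp
#include <string>

// Hypothetical subscriber-side routing: map the published weather payload
// to an output scene. Unknown payloads leave the device idle.
std::string reactToWeatherEvent(const std::string& payload) {
    if (payload == "clear") return "sunny scene";
    if (payload == "rain")  return "rain scene";
    if (payload == "wind")  return "wind scene";
    if (payload == "cloud") return "cloudy scene";
    return "idle";
}
```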

[Image: Final prototype]
// This #include statement was automatically added by the Particle IDE.
#include <DFRobotDFPlayerMini.h>

// button variables
int buttonPin1 = D0;
bool buttonPressed;
int state = 0;

// sensor variables
int pirPin = D2;
int pirState = LOW;
int motionReading = 0;
int calibrateTime = 1000;

// weather variables
String weatherIcon;
double temperature;
double precipProbability;
double precipIntensity;

// player variables
DFRobotDFPlayerMini myDFPlayer;

// time variables
unsigned long startTime;
unsigned long waitLength;
int duration = 30000;

// set up the photon
void setup() {
    Serial.begin(9600);    // USB serial for debugging
    Serial1.begin(9600);   // hardware serial to the DFPlayer Mini

    Particle.publish("forecast", "");
    Particle.subscribe("hook-response/forecast", handleForecastReceived, MY_DEVICES);

    pinMode(buttonPin1, INPUT_PULLUP);
    pinMode(pirPin, INPUT);

    Particle.function("PlayTrack", playSound);

    Serial.println("Serial Monitor Initialized");
    Serial.println(F("Initializing DFPlayer ... (May take 3~5 seconds)"));

    if (!myDFPlayer.begin(Serial1)) {
        Serial.println(F("Unable to begin:"));
        Serial.println(F("1.Please recheck the connection!"));
        Serial.println(F("2.Please insert the SD card!"));
        return;
    }
    Serial.println("DFPlayer Initialized");
}


void loop() {
    // refresh the forecast once every "duration" milliseconds
    if (millis() > waitLength) {
        startTime = millis();
        waitLength = startTime + duration;
        getData();
    }

    // the push button toggles the whole system on and off
    buttonPressed = digitalRead(buttonPin1);
    if (buttonPressed == LOW) {
        state = (state + 1) % 2;
        Serial.print("State is: ");
        Serial.println(state);
        delay(250); // crude debounce for the button
    }

    switch (state) {
        case 0:
            break;              // off: stay silent
        case 1:
            reportTheData();    // on: react to motion and weather
            break;
    }
}

// get weather data for a specific location (Mumbai: 19.0896, 72.8656)
void getData() {
    // Publish an event to trigger the webhook
    Particle.publish("forecast", "19.0896,72.8656", PRIVATE);
}

// respond to the weather by playing 1 of 4 sound tracks
// send the weather info to other devices by publishing it to the Particle cloud
void reportTheData() {
    motionReading = digitalRead(pirPin);
    if (motionReading == HIGH) {
        if (pirState == LOW) {
            Particle.publish("Motion", "Detected");
            pirState = HIGH;
            Particle.publish("Weather", weatherIcon + ", " + String(temperature) + ", " + String(precipProbability) + ", " + String(precipIntensity));
            // group the Dark Sky icons into 4 categories and publish the
            // matching event; track numbers follow the SD card ordering
            if (weatherIcon == "clear-day" || weatherIcon == "clear-night") {
                Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "clear");
                myDFPlayer.play(1);
            } else if (weatherIcon == "rain" || weatherIcon == "thunderstorm") {
                Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "rain");
                myDFPlayer.play(2);
            } else if (weatherIcon == "wind" || weatherIcon == "tornado") {
                Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "wind");
                myDFPlayer.play(3);
            } else if (weatherIcon == "fog" || weatherIcon == "cloudy" || weatherIcon == "partly-cloudy-day" || weatherIcon == "partly-cloudy-night" || weatherIcon == "sleet") {
                Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "cloud");
                myDFPlayer.play(4);
            }
        }
    } else {
        if (pirState == HIGH) {
            Particle.publish("Motion", "LOW");
            pirState = LOW;
        }
    }
}

// manually control the output for all three devices
int playSound(String command) {
    if (command == "1") {
        Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "clear");
    } else if (command == "2") {
        Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "rain");
    } else if (command == "3") {
        Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "cloud");
    } else if (command == "4") {
        Particle.publish("diot/2018/connected/atHome" + System.deviceID(), "wind");
    } else if (command == "0") {
        myDFPlayer.pause();    // "0" silences the speaker
    }
    return 1;
}

// parse the data received into readable text
// the webhook response arrives as "icon~temperature~precipProbability~precipIntensity~"
void handleForecastReceived(const char *event, const char *data) {
    // Handle the integration response
    String receivedStr = String(data);
    Particle.publish("data", receivedStr);

    int loc1 = receivedStr.indexOf("~");
    weatherIcon = receivedStr.substring(0, loc1);

    int loc2 = receivedStr.indexOf("~", loc1 + 1);
    temperature = (double) receivedStr.substring(loc1 + 1, loc2).toFloat();

    int loc3 = receivedStr.indexOf("~", loc2 + 1);
    precipProbability = (double) receivedStr.substring(loc2 + 1, loc3).toFloat();

    int loc4 = receivedStr.indexOf("~", loc3 + 1);
    precipIntensity = (double) receivedStr.substring(loc3 + 1, loc4).toFloat();
}

Bill of Materials

Photon x 1

5 Ohm Speaker x 1

DFPlayer Mini x 1

Micro SD card x 1

Breadboard x 1

Motion Sensor x 1

Jumper wires

Casing materials (laser-cut wood)

Power Supply


The process of making this project was memorable. We learned that aligning on a goal and direction for approaching the problem is important. At the beginning of the project, the ten of us met a few times as a big team to brainstorm and agree on use cases. By doing so, we found that there was less duplicated work when we kept the conversation open, not only within the small group but also within the big group. As a result, both the coding and physical prototypes turned out to be what we expected as a big group.

Something we learned:

  • Use of the DFPlayer Mini and mini speaker
  • Using a motion sensor as a switch in the system
  • Use of webhooks so that we can get data from an API
  • How to connect other devices in order to make a holistic IoT system

Challenges we faced:

  • When we started implementing the mini player, we could not figure out why no audio was playing even though there were no errors in the code or the circuit. After searching the internet for a solution, we finally found that a hidden file, which becomes the first file on the SD card, was preventing us from playing the right audio. We then avoided this by using a PC when storing audio files on the SD card.
  • When we tried to use the weather data, we faced so much information that we had no idea how to use only some of it. We then grouped the conditions into categories, for example, putting “wind” and “tornado” into the same group, making the weather data easier to navigate.

Future Iterations:

  • Adding more audio tracks, such as familiar songs played during festivals, or particular songs for family members when their birthdays approach
  • Two-way communication between users and their families, for instance: when the weather changes and the three objects respond to it, a text is sent to the user’s family saying something like “Hey mom, it's getting cold, stay warm!”


“Weather.” Dark Sky, darksky.net/forecast/18.9322,72.8308/us12/en.


49713 Designing for the Internet of Things


A hands-on introductory course exploring the Internet of Things and connected product experiences.


February 24th, 2018