Brainstorming:
After talking with the other two teams in @home, we brainstormed how we could convey sounds that make our user feel at home through familiar audio. Together with the other teams, we discussed what message to use to create an environment the user is familiar with. When we interviewed Manchit, our advisor, he expressed how much he missed the weather in Mumbai while living in Pittsburgh; on snowy days he especially missed how pleasant Mumbai’s weather was. With the two other teams, we decided to use live weather data from Mumbai to bring our user home: as the weather in Mumbai changes, the sight team projects different icons on the wall, and the smell team diffuses different scents to match. For us, the sound team, the mapping is the most direct: we reveal the current weather through sound, for example the sound of rain. We wanted to incorporate not only the sound of rain and wind but also the sounds of animals such as crows and crickets to further picture home away from home. We then prepared seven audio tracks, each a different combination indicating season and weather, so that they could correspond to real-time data.
Prototype:
For the physical prototype, we first came up with the idea of a house-shaped box and figured it could be made by laser cutting. We then thought of putting the skyline of Mumbai into the prototype: by tracing a picture of the skyline, we were able to laser cut a house with Mumbai’s skyline on it. We also decided to make the rooftop of the house a modular piece that can be swapped according to users’ needs.
For the coding prototype, we first implemented a button to turn the speaker on and off. We figured that the “sound of home” should not play if no one is in the room, so we attached a motion sensor so that the system only functions when it senses movement. We used webhooks to read the chance of precipitation and its intensity from a weather API, which triggers the player to play the matching audio. At first, each of the three objects in team @home did this individually, but after getting feedback we decided it would be more intuitive for the three objects to talk to each other rather than work separately. We then updated the code so that the “sound of home” publishes events that the other two objects can subscribe to, changing their outputs accordingly.
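To make that flow concrete, here is a minimal firmware sketch of the logic described above, assuming Particle devices (which match the webhook and publish/subscribe vocabulary we used). The pin assignment, the event names get_weather and athome/sound-state, the precipitation thresholds, the track mapping, and the playAudio() helper are all illustrative assumptions, not the exact project code.

```cpp
#include "Particle.h"

int motionPin = D2;         // PIR motion sensor input (assumed pin)
bool someoneHome = false;

void playAudio(int trackId) {
    // Placeholder: the real project triggered one of seven prepared
    // audio tracks here (e.g., via an external audio module).
}

void handleWeather(const char *event, const char *data) {
    // Webhook response handler: `data` is assumed to carry the
    // precipitation probability returned by the weather API.
    if (!someoneHome) return;          // only play when the room is occupied

    float chance = atof(data);
    int track;
    if (chance > 0.7)      track = 3;  // heavy rain + crickets (assumed mapping)
    else if (chance > 0.3) track = 2;  // light rain
    else                   track = 1;  // clear day + crows

    playAudio(track);

    // Publish the chosen state so the sight and smell objects can
    // subscribe and update their projection/diffusion accordingly.
    Particle.publish("athome/sound-state", String(track), PRIVATE);
}

void setup() {
    pinMode(motionPin, INPUT);
    // "get_weather" is an assumed webhook name; its responses arrive
    // as events prefixed with "hook-response/get_weather".
    Particle.subscribe("hook-response/get_weather", handleWeather, MY_DEVICES);
}

void loop() {
    someoneHome = (digitalRead(motionPin) == HIGH);

    // Poll the weather webhook periodically (every 10 minutes here).
    static unsigned long last = 0;
    if (millis() - last > 10 * 60 * 1000UL) {
        last = millis();
        Particle.publish("get_weather", "Mumbai", PRIVATE);
    }
}
```

Publishing the chosen track as an event is what lets the three objects talk to each other: the sight and smell objects only need to subscribe to a single event rather than each querying the weather API on their own.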