Gesture Piano

The Interaction of Sensors & Music

For an NYU Shanghai Interaction Lab class, my team set out to use the powerful p5.js library to create an interactive prototype combining camera recognition and music.


I was inspired by an earlier project of mine, a playful toilet seat cover that used sensors to play music while you do your business on it. Building on that idea, we created the gesture piano: a piano that uses p5.js to sense each individual finger and its position relative to the camera.
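
To give a rough, hypothetical sketch of the idea (this is not the project's actual code), the example below divides the webcam image into vertical strips, treats motion inside each strip as a "finger press", and plays the matching note with p5.sound. The number of keys, the C-major scale, the motion threshold, and the frame-differencing detection are all assumptions for illustration; the prototype's real recognition worked differently.

```javascript
// Hypothetical gesture-piano sketch: motion in vertical camera strips triggers notes.
// All names, thresholds, and the note scale are illustrative assumptions.

let video;
let prevFrame;                // pixels from the previous frame, for motion detection
const NUM_KEYS = 8;           // one "key" per vertical strip of the camera image
const NOTES = [60, 62, 64, 65, 67, 69, 71, 72]; // C-major scale as MIDI note numbers
const MOTION_THRESHOLD = 30;  // average pixel difference needed to trigger a key
let oscillators = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();

  // One oscillator per key so several notes can sound at once
  for (let i = 0; i < NUM_KEYS; i++) {
    const osc = new p5.Oscillator('sine');
    osc.freq(midiToFreq(NOTES[i]));
    osc.amp(0);
    osc.start();
    oscillators.push(osc);
  }
}

function draw() {
  image(video, 0, 0, width, height);
  video.loadPixels();

  if (prevFrame && prevFrame.length === video.pixels.length) {
    const keyWidth = width / NUM_KEYS;
    for (let k = 0; k < NUM_KEYS; k++) {
      // Sum brightness changes (red channel) inside this key's vertical strip
      let diff = 0;
      let count = 0;
      for (let x = Math.floor(k * keyWidth); x < Math.floor((k + 1) * keyWidth); x += 4) {
        for (let y = 0; y < video.height; y += 4) {
          const idx = 4 * (y * video.width + x);
          diff += Math.abs(video.pixels[idx] - prevFrame[idx]);
          count++;
        }
      }
      const avgDiff = diff / count;

      // Fade the note in when a finger (motion) enters the strip, out otherwise
      if (avgDiff > MOTION_THRESHOLD) {
        oscillators[k].amp(0.3, 0.05);
        fill(255, 255, 0, 80);
        rect(k * keyWidth, 0, keyWidth, height); // highlight the active key
      } else {
        oscillators[k].amp(0, 0.2);
      }
    }
  }
  prevFrame = video.pixels.slice(); // remember this frame for the next comparison
}

function mousePressed() {
  userStartAudio(); // browsers require a user gesture before audio can play
}
```

Run in the p5.js web editor with the p5.sound library loaded: click the canvas once to enable audio, then wave a finger over a strip to sound its note.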


By the end of the project timeframe, our prototype was not as polished as we had hoped, because the machine learning library needed for robust finger recognition was not allowed for the class project, but it still worked delightfully. Please view the video below.

My main contribution to this project was the physical side: the wiring, the cutting and assembly of the piano fingers, the laser cutting of the housing, and the overall physical design. SketchUp, Photoshop, Illustrator & Figma were used in this project.

[Video: IMG_0788.MOV]