Finguitar

Finguitar("Finger-guitar") is an online instrument which you can play with just your fingers. The idea is to then give the user the ability to combine these gestures to make a tune out of it.

The problem Finguitar solves

Ever thought of playing music with your gestures? Finguitar ("Finger-guitar") does exactly that.

Finguitar is a web-app that uses Computer Vision to build an online instrument you can play using just your fingers. You hold up a gesture with your hand, the web-app captures it through your webcam, and it plays a musical note for each gesture.
The idea is to then let the user combine these gestures into a tune. So it's like playing a guitar... but with Computer Vision!

Currently we recognize 10 gestures that play different sounds (notes of a guitar):

  • Palm
  • Fist
  • Thumb
  • Index
  • Middle 2 fingers
  • Pinky
  • Call
  • Rock
  • L-shape (finger + thumb)
  • Ok

Each of these gestures maps to a unique guitar note. The user can first practice the gestures to get familiar with the note each one produces, and then apply their own musical creativity to make music.
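
As a rough illustration, the lookup can be as simple as a dictionary from gesture label to note. This is only a sketch: the note assignments and the audio asset path below are assumptions for illustration, not the app's actual mapping.

```ts
// Minimal sketch of a gesture-to-note lookup. The gesture labels follow the
// list above; the concrete note names and sound file path are hypothetical.
type Gesture =
  | "palm" | "fist" | "thumb" | "index" | "middle2"
  | "pinky" | "call" | "rock" | "lshape" | "ok";

const GESTURE_TO_NOTE: Record<Gesture, string> = {
  palm: "E2",
  fist: "A2",
  thumb: "D3",
  index: "G3",
  middle2: "B3",
  pinky: "E4",
  call: "C3",
  rock: "F3",
  lshape: "A3",
  ok: "D4",
};

// Play the note for a recognized gesture, e.g. one audio sample per note.
function playNoteFor(gesture: Gesture): void {
  const note = GESTURE_TO_NOTE[gesture];
  void new Audio(`/sounds/${note}.mp3`).play(); // hypothetical asset path
}
```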

Gamification could also be introduced to improve the user's experience and feedback. Another aspect is customizability: we could let the user remap any gesture to whatever tune they want. The user could also run light background music as a backing track and play over it, and use the other hand to make secondary gestures that control effects of the generated music, such as volume, bass, and tempo.
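
To make the secondary-hand idea concrete, here is a hypothetical sketch of what such effect controls could look like. The gesture names, effect parameters, and step sizes are all made up for illustration; this is a proposal, not a feature of the current app.

```ts
// Hypothetical mapping from a secondary-hand gesture to an effect adjustment.
type Effect = "volume" | "bass" | "tempo";

interface EffectState {
  volume: number; // 0.0 to 1.0
  bass: number;   // dB boost or cut
  tempo: number;  // BPM of the backing track
}

const SECONDARY_GESTURE_TO_EFFECT: Record<string, { effect: Effect; delta: number }> = {
  thumb: { effect: "volume", delta: +0.1 },
  pinky: { effect: "volume", delta: -0.1 },
  palm:  { effect: "tempo",  delta: +5 },
  fist:  { effect: "tempo",  delta: -5 },
  rock:  { effect: "bass",   delta: +2 },
};

// Apply the effect change for a gesture made with the secondary hand.
function applySecondaryGesture(gesture: string, state: EffectState): EffectState {
  const mapping = SECONDARY_GESTURE_TO_EFFECT[gesture];
  if (!mapping) return state;
  const next = { ...state };
  next[mapping.effect] += mapping.delta;
  return next;
}
```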

Challenges we ran into

The major challenge we faced was the communication side of the project: capturing frames from the webcam in the browser, encoding them, and sending them to the server (which feeds them to the model). We wanted minimal delay between the gesture and the response, and no latency issues. We solved this using WebSockets. A WebSocket is a two-way communication channel on which the client and the server can send messages to each other over the same connection. This reduced the overhead tremendously, and encoding the frames as base64 strings made the communication even more efficient than we had initially expected.
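
For concreteness, here is a minimal sketch of the browser side of that pipeline: grab a frame from the webcam, encode it as a base64 JPEG, and push it to the server over a WebSocket. The endpoint URL, message shape, and frame rate are assumptions for illustration, not the project's actual values.

```ts
// Hypothetical gesture-recognition endpoint.
const socket = new WebSocket("wss://example.com/gestures");

async function streamFrames(fps = 10): Promise<void> {
  // Open the webcam and attach it to an off-screen <video> element.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // A canvas is used to serialize each frame as a JPEG.
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  const ctx = canvas.getContext("2d")!;

  setInterval(() => {
    // Draw the current frame and encode it as a base64 string.
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const dataUrl = canvas.toDataURL("image/jpeg", 0.6);
    const base64 = dataUrl.split(",")[1]; // strip the "data:image/jpeg;base64," prefix
    if (socket.readyState === WebSocket.OPEN) {
      socket.send(JSON.stringify({ frame: base64 })); // hypothetical message shape
    }
  }, 1000 / fps);
}

// The server replies on the same socket with the recognized gesture,
// which can then be fed to the note lookup sketched earlier.
socket.onmessage = (event) => {
  const { gesture } = JSON.parse(event.data); // hypothetical response shape
  console.log("recognized gesture:", gesture);
};
```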

Discussion