Our app lets users play air guitar: they set some reference keys, then imitate chord shapes and strumming patterns in front of the camera, and the app uses computer vision to analyse their hand movements and produce the corresponding sound.
We ran into challenges at every step while working with Python libraries that are not especially beginner friendly: figuring out hand landmarks and how they correlate with chord shapes, identifying and analysing each hand separately, detecting the user's strumming patterns, and reducing the lag between input and output to deliver a smooth real-time experience.
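To illustrate the kind of logic involved, here is a minimal sketch of two of the steps above: matching hand landmarks against stored chord-shape templates, and detecting strums from vertical wrist motion. It assumes the 21-point hand model used by libraries such as MediaPipe Hands (0 = wrist, 4/8/12/16/20 = fingertips); the function names, templates, and thresholds are illustrative assumptions, not the app's actual implementation.

```python
import math

# Fingertip indices in the 21-point hand-landmark model
# (an assumption based on common libraries, e.g. MediaPipe Hands).
FINGERTIPS = [4, 8, 12, 16, 20]

def normalise(landmarks):
    """Translate so the wrist (index 0) is the origin and scale by the
    wrist-to-middle-fingertip distance, making the pose size-invariant."""
    wx, wy = landmarks[0]
    scale = math.dist(landmarks[0], landmarks[12]) or 1.0
    return [((x - wx) / scale, (y - wy) / scale) for x, y in landmarks]

def pose_distance(landmarks, template):
    """Mean Euclidean distance between the fingertips of two poses."""
    a, b = normalise(landmarks), normalise(template)
    return sum(math.dist(a[i], b[i]) for i in FINGERTIPS) / len(FINGERTIPS)

def classify_chord(landmarks, templates, threshold=0.25):
    """Return the name of the closest chord-shape template,
    or None if no template is within the (illustrative) threshold."""
    name, best = min(
        ((n, pose_distance(landmarks, t)) for n, t in templates.items()),
        key=lambda item: item[1],
    )
    return name if best <= threshold else None

def strum_events(wrist_ys, min_swing=0.05):
    """Detect strums as reversals in vertical wrist motion that exceed a
    minimum swing. In image coordinates y grows downward, so increasing
    y is treated as a down-strum."""
    events, start, direction = [], 0, 0
    for i in range(1, len(wrist_ys)):
        d = wrist_ys[i] - wrist_ys[i - 1]
        new_dir = (d > 0) - (d < 0)
        if new_dir and new_dir != direction:
            if direction and abs(wrist_ys[i - 1] - wrist_ys[start]) >= min_swing:
                events.append("down" if direction > 0 else "up")
            start, direction = i - 1, new_dir
    if direction and abs(wrist_ys[-1] - wrist_ys[start]) >= min_swing:
        events.append("down" if direction > 0 else "up")
    return events
```

In a real pipeline the templates would be recorded per user during calibration, and the raw landmark stream would need smoothing before classification to avoid the jitter and lag issues mentioned above.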
Technologies used
Discussion