This is my first interactive computer vision project: a virtual concertina built with Python and MediaPipe for real-time hand tracking and gesture recognition. The program draws the instrument on screen and responds dynamically to hand movements: recognized hand gestures trigger notes, while the distance between the hands controls volume and pitch, creating an immersive audio experience.
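
The gesture-to-sound mapping described above can be sketched with a few pure helper functions. This is a minimal illustration, not the project's actual code: the function names (`is_pinch`, `distance_to_volume`) and thresholds are hypothetical, and it assumes landmark coordinates normalized to [0, 1] as MediaPipe Hands reports them.

```python
import math

# Hypothetical helpers for turning MediaPipe hand landmarks into sound
# parameters. Coordinates are assumed normalized to [0, 1], matching
# the (x, y) values MediaPipe Hands reports per landmark.

def landmark_distance(a, b):
    """Euclidean distance between two (x, y) landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def is_pinch(thumb_tip, index_tip, threshold=0.05):
    """Treat a thumb tip and index tip closer than `threshold` as a
    note-trigger gesture (threshold chosen for illustration only)."""
    return landmark_distance(thumb_tip, index_tip) < threshold

def distance_to_volume(d, d_min=0.1, d_max=0.6):
    """Linearly map the distance between the two hands (the 'bellows'
    spread) to a volume level in [0.0, 1.0], clamped at both ends."""
    t = (d - d_min) / (d_max - d_min)
    return max(0.0, min(1.0, t))
```

In the real application these functions would be fed each frame from the landmark lists MediaPipe returns, with the resulting volume and pitch values passed to the audio backend.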