Let's begin!
We (my friends and I) have been preparing this project since the start of June. Now what we need to do is implement the system we have already planned!
The overall system design
The system consists of two main parts: the gloves and the central computer. The central computer captures a full image of the piano and analyzes the positions of the user's hands relative to the keys to figure out which key the user is currently on. It drives the practice by playing the piano music and sending the hand location and gesture information to the gloves, which respond with vibration feedback. In the other direction, tactile information about whether the user pressed the right key is sent back to the central computer to advance the practice. Overall, the system requires visual analysis of the user's motion and communication between the central computer and the gloves.
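To make the central-computer-to-glove flow concrete, here is a minimal sketch of what a feedback message might look like. The fields, sizes, and checksum scheme are all assumptions for illustration, not the final protocol:

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Hypothetical feedback message from the central computer to a glove:
// which finger should move, which key it should aim for, and whether the
// last played key was correct. Field names and layout are illustrative.
struct GloveFeedback {
    uint8_t finger;      // 0..4, thumb to pinky
    uint8_t target_key;  // MIDI-style key number, e.g. 60 = middle C
    uint8_t correct;     // 1 if the last key played was right, 0 otherwise
};

// Pack the message into bytes with a simple XOR checksum for the radio link.
std::vector<uint8_t> encode(const GloveFeedback& m) {
    std::vector<uint8_t> out{m.finger, m.target_key, m.correct};
    out.push_back(static_cast<uint8_t>(out[0] ^ out[1] ^ out[2]));  // checksum
    return out;
}

// Returns true and fills `m` only if the length and checksum are valid.
bool decode(const std::vector<uint8_t>& buf, GloveFeedback& m) {
    if (buf.size() != 4 || (buf[0] ^ buf[1] ^ buf[2]) != buf[3]) return false;
    m = GloveFeedback{buf[0], buf[1], buf[2]};
    return true;
}
```

A fixed-size frame plus a one-byte checksum keeps the glove-side parsing trivial, which matters on a small microcontroller.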
We will be using the Jetson Nano, an embedded board focused on AI computation, as the central computer. For the gloves, we will be using the Arduino Nano 33 BLE, as it is capable of Bluetooth communication.
An HC-06 module will be attached to the Jetson Nano, since the board lacks a built-in Bluetooth module.
A. Visual Parts of the System
The visual processing consists of the following things to implement:
- Recognizing the piano from the image
We need to recognize the piano from the image (or the frame). Ideally, we would recognize the piano and mask its area so that only the piano part of the image is left.
- Detecting the keys of the piano
Once the area of the piano is found, all of the individual keys are detected within it.
- Detecting the hand
With the piano and key locations known, the locations and gestures of the hands are detected and quantified.
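Once the key regions and a fingertip position are both available, deciding which key the user is on reduces to a lookup. A minimal sketch, assuming each white key is summarized by the x-coordinate of its left edge in the piano-masked image (the edge values in the test are made up):

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Given the sorted x-coordinates of each key's left edge and a fingertip x
// position from the hand detector, find the index of the key under the
// finger with a binary search. Returns -1 if the finger is left of the
// keyboard entirely.
int key_under_finger(const std::vector<int>& key_left_edges, int finger_x) {
    if (key_left_edges.empty() || finger_x < key_left_edges.front()) return -1;
    auto it = std::upper_bound(key_left_edges.begin(), key_left_edges.end(),
                               finger_x);
    return static_cast<int>(it - key_left_edges.begin()) - 1;  // key index
}
```

The real version would also need the key widths and the black-key rows, but the same interval-lookup idea applies.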
This is the most fundamental part of the entire system. If this fails, the project itself fails.
B. Communication Parts of the System
- Communication with the gloves
We need to communicate with the gloves to get the user's tactile feedback. We'll use the Bluetooth protocol for this.
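Since an HC-06 exposes the Bluetooth link as a plain byte stream, messages from the gloves can arrive split or merged, so the receiver needs some framing. A minimal sketch, assuming a made-up frame of a 0xAA start byte, a key number, and a pressed flag:

```cpp
#include <cassert>
#include <cstdint>
#include <optional>
#include <utility>
#include <vector>

// Incremental parser for a hypothetical 3-byte frame: [0xAA, key, pressed].
// Bytes are fed one at a time as they come off the serial link.
class FrameParser {
public:
    // Returns a (key, pressed) pair once a full frame has been assembled,
    // otherwise std::nullopt. Garbage before the start byte is discarded.
    std::optional<std::pair<uint8_t, uint8_t>> feed(uint8_t byte) {
        buf_.push_back(byte);
        if (buf_[0] != 0xAA) {  // not a start byte: resynchronize
            buf_.clear();
            return std::nullopt;
        }
        if (buf_.size() == 3) {
            std::pair<uint8_t, uint8_t> frame{buf_[1], buf_[2]};
            buf_.clear();
            return frame;
        }
        return std::nullopt;
    }

private:
    std::vector<uint8_t> buf_;
};
```

The real protocol would likely add a checksum, but the resynchronize-on-bad-start-byte pattern is the important part for a noisy serial link.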
- Communication with the application
We also thought of an application that can change the environment settings of the system, so we need to be able to communicate with the app (probably on a mobile platform) to customize it. We will (of course) be using Bluetooth for that, too.
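One simple way the app could push settings over the same Bluetooth link is as short text commands. The format and the setting names below are placeholders; only the parsing pattern is the point:

```cpp
#include <cassert>
#include <map>
#include <sstream>
#include <string>

// Parse a hypothetical settings string like "volume=7;tempo=90" into a map.
// Malformed entries (no '=') are skipped rather than treated as errors.
std::map<std::string, std::string> parse_settings(const std::string& msg) {
    std::map<std::string, std::string> settings;
    std::stringstream ss(msg);
    std::string pair;
    while (std::getline(ss, pair, ';')) {
        auto eq = pair.find('=');
        if (eq == std::string::npos) continue;  // skip malformed entry
        settings[pair.substr(0, eq)] = pair.substr(eq + 1);
    }
    return settings;
}
```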
C. Other parts
We need to design the overall practice routine for the user. Thankfully, we (my friends and I) already have a rough outline of the course of practice. For now, I will focus on implementing the specific features (because that's the important part!).
Now, what we need to do is get right into the implementation! For the central system, I will be using C++, and I will use OpenCV to process the images!
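As a starting point, the central computer's per-frame loop could be shaped roughly like this. The stage functions are stubs standing in for the OpenCV pipeline to come; names and return values are placeholders of my own:

```cpp
#include <cassert>

// Placeholder for a captured camera frame; the real version would wrap a
// cv::Mat from OpenCV.
struct Frame {};

// Stub for the whole vision pipeline (mask piano, find keys, find hand).
// Here it pretends the hand is always over key 60 (middle C).
int detect_key_under_hand(const Frame&) { return 60; }

bool key_matches_score(int key, int expected) { return key == expected; }

// One iteration of the practice loop: analyze the frame and compute the
// correct/incorrect flag that would be sent to the gloves as feedback.
bool process_frame(const Frame& frame, int expected_key) {
    int key = detect_key_under_hand(frame);  // vision pipeline goes here
    return key_matches_score(key, expected_key);
}
```

Keeping each stage behind its own function should make it easier to swap in the real OpenCV implementations one at a time.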