This week I want to build on last week's project by swapping the eye-distance control for handPose, using hand gestures to control the pitch with Tone.js.
Begin by restructuring the existing code
This is very helpful since I can reuse it if I want to develop the project further in the future.
Add handPose
I define three states: open, fist, and pinch. The details are below.
Open and fist
I achieve this by first defining the palm center, then calculating the average distance from the five fingertips to that center, divided by the palm width. I console.log the result to pick a cutoff value.
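The check described above can be sketched roughly like this. This is a sketch under assumptions: the keypoint indices follow the MediaPipe hand layout that ml5's handPose uses (wrist = 0, fingertips = 4/8/12/16/20, knuckles = 5/9/13/17), and the 0.9 cutoff is an illustrative value, not the one I actually picked:

```javascript
// Fingertip and knuckle indices, assuming the MediaPipe hand layout.
const TIPS = [4, 8, 12, 16, 20];
const KNUCKLES = [5, 9, 13, 17];

function dist2d(a, b) {
  return Math.hypot(a.x - b.x, a.y - b.y);
}

function palmCenter(kp) {
  // Average the wrist and the four knuckles as the palm center.
  const pts = [kp[0], ...KNUCKLES.map((i) => kp[i])];
  const cx = pts.reduce((s, p) => s + p.x, 0) / pts.length;
  const cy = pts.reduce((s, p) => s + p.y, 0) / pts.length;
  return { x: cx, y: cy };
}

function openness(kp) {
  const center = palmCenter(kp);
  const palmWidth = dist2d(kp[5], kp[17]); // index-to-pinky knuckle span
  // Average tip-to-center distance, normalized by palm width so the
  // measure doesn't change with the hand's distance from the camera.
  const avgTip = TIPS.reduce((s, i) => s + dist2d(kp[i], center), 0) / TIPS.length;
  return avgTip / palmWidth;
}

function handState(kp, cutoff = 0.9) {
  // Example cutoff; in practice it comes from console.logging real hands.
  return openness(kp) > cutoff ? 'open' : 'fist';
}
```

Normalizing by palm width is what makes the console.logged values comparable across frames, so a single cutoff works.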
I use print to find the index of each keypoint. (Though they're mirrored, because the whole canvas is translated.)
This dictionary is defined to make sure each hand has its own id, so the two hands don't conflict with each other.
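The per-hand bookkeeping might look like the sketch below. This is hypothetical: I'm using the handedness label the detector reports as the dictionary key, and the field names are made up for illustration:

```javascript
// Hypothetical per-hand state store. Keyed by the handedness label
// ("Left" / "Right") so two detected hands never overwrite each other.
const handStates = {};

function updateHandState(hand, newState) {
  const id = hand.handedness; // assumed label from the detector
  if (!(id in handStates)) {
    handStates[id] = { state: null, changed: false };
  }
  const entry = handStates[id];
  entry.changed = entry.state !== newState; // flag transitions only
  entry.state = newState;
  return entry;
}
```

Flagging transitions (rather than reacting every frame) is what keeps a held fist from retriggering the same action over and over.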
Pinch and the melody
I originally planned to do this with Tone.js, but I ran into trouble after including the library and couldn't fix it even after asking three AIs, so I fell back to p5.sound. The reason I wanted Tone.js was that it lets me write a more natural keyboard sound, but I think p5.sound conveys my idea well too. I'll try to solve the Tone.js issues in the future.
That's why you can still see a lot of traces of Tone.js in my editor.
I map the height of the pinch point to the frequency, and also to the color of the point, for extra user feedback. (No sound in the video because macOS doesn't support built-in sound recording TT)
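The frequency mapping is essentially p5's map() applied to the pinch point's y position; here it is as a standalone function. The 200–800 Hz range is an example for illustration, not necessarily the range I used:

```javascript
// Linear map, same formula as p5's map(). A higher pinch point
// (smaller y) gives a higher pitch; 200-800 Hz is an example range.
function pinchToFreq(y, canvasHeight, lowHz = 200, highHz = 800) {
  const t = 1 - y / canvasHeight; // invert: top of canvas = high pitch
  return lowHz + t * (highHz - lowHz);
}
```

In the sketch, this value would be fed to the oscillator each frame, along the lines of `osc.freq(pinchToFreq(pinchY, height))`.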
Main logic stuff
I defined a variable called 'started' as the main switch for all the states.
Once the game(?) is activated, the state returned from the bodyCtrl class is the second most important control.
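The overall shape of that control flow might look something like this. It's a placeholder sketch: the gesture names match the three states defined earlier, but what each one triggers here is invented for illustration, not my actual mapping:

```javascript
// Hypothetical main control flow: `started` gates everything, and the
// state reported by the bodyCtrl class drives what happens next.
let started = false;

function step(bodyCtrl) {
  if (!started) return 'idle'; // nothing runs until the piece starts
  switch (bodyCtrl.state) {
    case 'open':  return 'play';   // placeholder actions, not the
    case 'fist':  return 'mute';   // real ones from the sketch
    case 'pinch': return 'melody';
    default:      return 'wait';
  }
}
```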
Another function called muteAlltracks is used because my loaded tracks are all long, and some bugs showed up during my tests, so I added a brute-force way to silence everything at once.
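A brute-force mute like the one described might look like this. I'm assuming the tracks sit in an array of p5.SoundFile-style objects with a stop() method; the array name is illustrative:

```javascript
// Brute-force reset: stop every loaded track regardless of its state.
// Assumes `tracks` holds objects with a stop() method, like p5.SoundFile.
function muteAlltracks(tracks) {
  for (const t of tracks) {
    t.stop(); // p5.SoundFile's stop() is safe even if it isn't playing
  }
}
```

Stopping everything unconditionally sidesteps whatever per-track state the bugs left behind, which is exactly the point of a brute-force reset.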