Project goal at this point

"How can physical artifacts and gestures create meaningful and playful physical/digital experiences?"

To make full use of the tools at my disposal, I first have to understand the ways the physical can be interfaced with the digital, and vice versa.

This week, I have been working on:

Setup

I've tried using the ofxWarp addon to map the screen output onto the target area, but I still need to work out how to get the objects drawn in the draw() function into a projected texture, i.e. probably capturing each frame (grabScreen()) and storing it in an ofTexture or ofImage.
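As a rough sketch of that idea (assuming a plain openFrameworks app, not yet wired into ofxWarp), everything drawn in draw() can either be grabbed back into an ofImage with grabScreen(), or rendered into an ofFbo whose texture is then handed on to the warping stage:

```cpp
// ofApp.cpp (sketch) -- assumes a standard openFrameworks project;
// the ofxWarp hookup itself is left out.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofFbo   canvasFbo;   // offscreen buffer holding each frame's drawing
    ofImage screenGrab;  // alternative: CPU-side copy of the screen

    void setup() override {
        canvasFbo.allocate(ofGetWidth(), ofGetHeight(), GL_RGBA);
    }

    void draw() override {
        // Option A: draw into the FBO, then pass its texture to the warp.
        canvasFbo.begin();
        ofClear(0, 0, 0, 255);
        ofDrawCircle(ofGetWidth() * 0.5f, ofGetHeight() * 0.5f, 100); // placeholder content
        canvasFbo.end();

        canvasFbo.draw(0, 0);  // preview on screen
        // canvasFbo.getTexture() is what the mapping stage would consume.

        // Option B: draw directly, then capture the whole screen each frame.
        // screenGrab.grabScreen(0, 0, ofGetWidth(), ofGetHeight());
    }
};
```

The FBO route avoids the per-frame GPU-to-CPU readback that grabScreen() implies, so it is likely the cheaper option for real-time projection.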

Program

Getting the area, which is unique to each shape at this point…

…which allows me to detect the shape…

…allowing me to overlay the custom wheel.
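To make that area → shape → wheel chain concrete, here is a minimal sketch assuming the detection runs through the ofxOpenCv contour finder (the actual pipeline, threshold value, and wheel drawing are placeholders, not the final implementation):

```cpp
// Sketch only: assumes ofxOpenCv for blob detection and draws a plain
// circle where the custom wheel graphic would go.
#include "ofMain.h"
#include "ofxOpenCv.h"
#include <cmath>

class ShapeOverlay {
public:
    ofxCvColorImage     colorImg;
    ofxCvGrayscaleImage grayImg;
    ofxCvContourFinder  contourFinder;

    void setup(int w, int h) {
        colorImg.allocate(w, h);
        grayImg.allocate(w, h);
    }

    void update(ofPixels& cameraPixels) {
        colorImg.setFromPixels(cameraPixels);
        grayImg = colorImg;        // convert to grayscale
        grayImg.threshold(80);     // threshold is a guess, tune per setup
        // up to 5 blobs, between 1000 px and a third of the frame in area
        contourFinder.findContours(grayImg, 1000,
                                   (int)(grayImg.getWidth() * grayImg.getHeight()) / 3,
                                   5, false);
    }

    void draw() {
        for (auto& blob : contourFinder.blobs) {
            // blob.area is the value used to tell the shapes apart;
            // blob.centroid anchors the overlay.
            ofNoFill();
            ofDrawCircle(blob.centroid.x, blob.centroid.y,
                         std::sqrt(blob.area / PI)); // stand-in for the wheel
        }
    }
};
```

The app would call setup() once with the camera dimensions, update() on each new camera frame, and draw() from its own draw() so the overlay lands on top of the detected shape.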