World to canvas position

Hey. Is there a way to translate the 3D position of the face tracker to the canvas?
I want head movement to control 2D elements on the screen.
Please link me to the existing topic if this question has already been asked.

@monogon has a nice script (WITH AN EXPLANATION!) about how to do this

I don’t think that works, since the spaces are different: world coordinates and screen coordinates are separate realms. We have to use a projection matrix to project the world position onto the screen, which then gets translated into coordinate points. Simply removing Z won’t automatically match the 2D screen position. Fortunately, we can do that! Unfortunately, for now that feature is only accessible from script, so you read it there and then pass the value to the patch editor, like @josh_beckwith said.
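For anyone wondering what the projection step actually does, here's a minimal sketch of the math in plain JavaScript. Everything in it (the `perspective()` helper, the FOV, aspect ratio, and resolution values) is an illustrative assumption, not Spark AR's actual API or camera values; in a real effect the engine supplies the projection. The point is just to show why dropping Z isn't enough: you need the full matrix multiply plus the perspective divide.

```javascript
// Apply a 4x4 row-major matrix to a homogeneous [x, y, z, w] vector.
function applyMat4(m, v) {
  const out = [0, 0, 0, 0];
  for (let r = 0; r < 4; r++) {
    for (let c = 0; c < 4; c++) out[r] += m[r][c] * v[c];
  }
  return out;
}

// A standard perspective projection matrix (vertical FOV, aspect, near/far).
// Illustrative only -- the engine's real camera values will differ.
function perspective(fovY, aspect, near, far) {
  const f = 1 / Math.tan(fovY / 2);
  return [
    [f / aspect, 0, 0, 0],
    [0, f, 0, 0],
    [0, 0, (far + near) / (near - far), (2 * far * near) / (near - far)],
    [0, 0, -1, 0],
  ];
}

// Camera-space 3D point -> 2D screen pixels.
function worldToScreen(point, proj, width, height) {
  const clip = applyMat4(proj, [point[0], point[1], point[2], 1]);
  const ndcX = clip[0] / clip[3]; // perspective divide: this is the step
  const ndcY = clip[1] / clip[3]; // that "removing Z" skips
  return {
    x: (ndcX * 0.5 + 0.5) * width,          // NDC [-1, 1] -> pixels
    y: (1 - (ndcY * 0.5 + 0.5)) * height,   // flip Y: screen Y grows downward
  };
}

const proj = perspective(Math.PI / 3, 9 / 16, 0.1, 100);
const p = worldToScreen([0, 0, -2], proj, 1080, 1920);
console.log(p); // → { x: 540, y: 960 } (an on-axis point maps to screen center)
```

Note how the same world-space X maps to different pixel positions depending on Z; that depth dependence is exactly what the projection matrix encodes.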

There’s another option, which is the method I usually use: the FaceTracking2D module. It’s pretty straightforward.

I have a script utility to handle this (it’s script only):

Awesome! Thank you for sharing :smiley:
Does it have a feature to send values or coordinates from the patch editor, process them inside the script, and then return the result back to the patch editor? That way, layman/beginner creators like me who aren’t so familiar with scripting could use it easily, without having to manually change anything inside the script. Or is that even possible? It would become as easy as drag and drop: just feed in some values from the patch editor by connecting nodes. We could build our own patch assets to send values to the script, and patch assets that receive values back from the script and pass them on to whatever we want to control in the scene.
I think it would be so amazing and super helpful to thousands of creators, and millions to come in the future, if it had that.
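On the "send values from the patch editor into the script and back" idea: that round trip is what Spark AR's Patches module is for (the exact call names vary a little between Studio versions, so treat the ones in the comments as assumptions). The sketch below mocks the Patches bridge in plain JavaScript so the pattern is runnable outside Spark AR; in a real project you would declare the variables in the script's properties panel instead.

```javascript
// Hedged sketch of the patch <-> script round trip described above.
// A real Spark AR script would use the Patches module, roughly:
//   const v = await Patches.outputs.getScalar('sensitivity'); // from patches
//   Patches.inputs.setScalar('screenX', processed);           // back to patches
// Here Patches is a tiny mock so the pattern runs anywhere.
const Patches = {
  _toScript: new Map(),   // values the patch editor sends to the script
  _fromScript: new Map(), // values the script sends back to the patch editor
  outputs: { getScalar: (name) => Promise.resolve(Patches._toScript.get(name)) },
  inputs: { setScalar: (name, value) => Patches._fromScript.set(name, value) },
};

// Pretend the creator wired a "sensitivity" value into the script via a node.
Patches._toScript.set('sensitivity', 2.5);

async function main() {
  // 1. Receive a value the creator set in the patch editor.
  const sensitivity = await Patches.outputs.getScalar('sensitivity');
  // 2. Do the processing that's awkward to express with patches alone.
  const headX = 0.2; // stand-in for a face-tracking signal
  // 3. Send the result back for the patch graph to consume.
  Patches.inputs.setScalar('screenX', headX * sensitivity);
  console.log(Patches._fromScript.get('screenX')); // → 0.5
}
main();
```

So the script stays a black box for the creator: they only touch the named inputs and outputs in the patch editor, exactly the drag-and-drop workflow described above.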

I have other repos used to send positions to patches and to set an object moving with the head. I think it’s a good idea to combine their usage:

I’ll try this idea when I’m free.