SceneObject.transform.rotation and tracked planes - how do they interact? (scripting and/or patches)

Good evening :slight_smile:
I have just spent several days fighting all possible quaternion properties and methods, banging my head against making a sprite on a tracked plane always face the user (think “look at”).
I have tried to work on my trauma by posting to SO here:

The short version is, I am simply binding the transform.rotation of the sprite to the worldTransform.rotation of the camera. This works fine as long as the sprite is outside the trackedPlane hierarchy (its position is also bound to that of trackedPlane0), and it looks alright at first.

However, there is also the usual user-interaction logic tied to our tracked plane, as in “tap to set the tracker for the tracked plane”, pinch to scale/zoom, etc. So I also connect this logic to the relevant transform parameters of the sprite using patches, and I can zoom and move it. Inelegant, but so far so good.
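
The pinch logic described here boils down to multiplying a stored base scale by the live pinch factor and committing the result when the gesture ends (the job the multiply patch does in the graph). A minimal plain-JS sketch of that idea - the function and callback names are hypothetical, not Spark AR API:

```javascript
// Sketch of pinch-to-scale with a multiply step.
// "committed" plays the role of the stored base scale; update() is what the
// multiply patch computes each frame while a pinch is active.
function makePinchScaler(baseScale) {
  let committed = baseScale;
  return {
    // live scale while the fingers are still on the screen
    update(pinchFactor) {
      return committed * pinchFactor;
    },
    // bake the factor in when the gesture ends, so pinches accumulate
    end(pinchFactor) {
      committed *= pinchFactor;
      return committed;
    },
  };
}

const scaler = makePinchScaler(1.0);
scaler.update(1.5); // live preview: 1.5
scaler.end(1.5);    // committed base is now 1.5
scaler.update(2.0); // next pinch starts from the committed base: 3.0
```

If the committed base is never updated, every new pinch would snap the object back to the original scale - which is one plausible source of the "jumping" described below.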

Now when testing this on the device, there is a problem which never occurred while the sprite was inside the planeTracker0: its scale sometimes jumps or jitters, while everything inside the tracker works fine. The only thing connected to that scale parameter, though, is the pinch logic (with a multiply patch in between).

So now I’m thinking, why don’t I go back to binding the sprite’s transform.rotation to the camera’s worldTransform.rotation and leave it inside the trackedPlane0? All I’m doing is dropping the sprite inside the hierarchy again, but that gives me some super strange rotational behaviour. This is probably somehow related to the order in which the quaternion rotations are applied to my object when it inherits something from the tracked plane?

Why does this work before the sprite (or any other object for that matter) is inside the planeTracker, but not after? How does the trackedPlane screw with these rotations?

Thank you!

Man, I gave up on that after way too many days. My goal was to take snapshots in space and leave textured planes floating around. After I finally solved the problem of counter-rotation, scale, etc… it didn’t work at all on device. The measurements were totally out of whack, and not in a consistent way.

I never did find a solution to that, so I took another approach. I ended up abusing the particle system’s world space feature and it works pretty well. I think I put a target plane in the focal distance object and used its world transform to align the particles when snapshots were taken.


Hey Josh,
again, thanks for your feedback and the idea regarding the emitter… I’m just glad someone else is feeling my pain!
I haven’t figured out your solution yet, but I actually got mine to work (at 4 am this morning) using quaternion multiplication while keeping the look-at object inside the trackedPlane hierarchy. For software that is supposed to be so accessible, I feel like I had to watch too many clips on higher mathematics…

In code, the solution seems simple, of course. I had to get some data from patches and some from script, which is also kinda icky, so if anyone has ideas on how to get a trackedPlane’s transform into script without patches, that’d be cool!
Here it goes, sorry for the wall of text.

A big part of the complexity and confusion for me also stems from the fact that rotations are expressed or modelled in several different ways here and have to be made consistent with each other, on top of implicitly rotated local transforms, e.g. that of the trackedPlane.
By different ways I mean axis rotations in degrees from the patches, axis rotations in radians in the transform.rotationX etc. properties, and then the quaternion components from rotation.x… good fun.
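
To make those representations concrete, here is a plain-JS sketch (just numbers, no Spark AR API) of the conversions involved: degrees to radians, and Euler angles to a quaternion by composing one quaternion per axis. The X-then-Y-then-Z application order is an assumption for illustration; Spark AR's own convention may differ.

```javascript
// Quaternions as [w, x, y, z]; Hamilton product.
function qmul(a, b) {
  const [aw, ax, ay, az] = a;
  const [bw, bx, by, bz] = b;
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

const degToRad = (deg) => (deg * Math.PI) / 180;

// Euler angles (radians) to a quaternion, composed as qz * qy * qx,
// i.e. the X rotation is applied first. The order is an assumption here.
function quatFromEuler(x, y, z) {
  const qx = [Math.cos(x / 2), Math.sin(x / 2), 0, 0];
  const qy = [Math.cos(y / 2), 0, Math.sin(y / 2), 0];
  const qz = [Math.cos(z / 2), 0, 0, Math.sin(z / 2)];
  return qmul(qz, qmul(qy, qx));
}

// 90 degrees about Y, via degrees -> radians -> quaternion:
const q = quatFromEuler(0, degToRad(90), 0);
// q ≈ [cos(45°), 0, sin(45°), 0] ≈ [0.7071, 0, 0.7071, 0]
```

Note how the quaternion's x/y/z components are sines of half-angles, not the Euler angles themselves - which is exactly why reading rotation.x as if it were an angle causes confusion.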

Like I said before, I’m binding the look-at object’s transform.rotation to the DeviceMotion. However, in this filter the user is able to move the tracked plane’s origin by tapping, which seems to leave the original scene coordinates in place and just move the tracked plane on the point cloud (I’m guessing?) and rotate it towards the device. Being inside the tracked tree, the look-at object inherits this orientation… on TOP of the device motion it’s bound to. So I have to account for that by creating a quaternion counter-rotation around the plane’s… Z axis! Because it’s a plane! Lying on its back!
And “stacking” quaternion rotations, as I found out, works by multiplying them in the right order.
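
The "right order" matters because quaternion multiplication is not commutative. A plain-JS sketch (not the Spark AR API) that demonstrates this by rotating a vector with two 90° rotations composed in both orders:

```javascript
// Quaternions as [w, x, y, z]; Hamilton product.
function qmul(a, b) {
  const [aw, ax, ay, az] = a;
  const [bw, bx, by, bz] = b;
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

const qconj = ([w, x, y, z]) => [w, -x, -y, -z];

// Rotate a vector by a unit quaternion: v' = q * (0, v) * conj(q).
function rotate(q, [vx, vy, vz]) {
  const [, x, y, z] = qmul(qmul(q, [0, vx, vy, vz]), qconj(q));
  return [x, y, z];
}

const c = Math.SQRT1_2;
const rotX90 = [c, c, 0, 0]; // 90° about X
const rotZ90 = [c, 0, 0, c]; // 90° about Z

// "X first, then Z" is qmul(rotZ90, rotX90); swapping the factors applies
// them in the opposite order and lands the x-axis somewhere else entirely:
rotate(qmul(rotZ90, rotX90), [1, 0, 0]); // ≈ [0, 1, 0]
rotate(qmul(rotX90, rotZ90), [1, 0, 0]); // ≈ [0, 0, 1]
```

This is the same ordering question as the counter-rotation below: the plane's counter-rotation has to be multiplied onto the device rotation on the correct side.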

I hope this makes more sense with the rough code for it, cleaned of customer-data:

const S = require('Scene');
const R = require('Reactive');
const D = require('Diagnostics');
const DeviceMotion = require('DeviceMotion');
const Patches = require('Patches');

// object that needs to look at the user here
const objectName = 'orienter';

(async function () {
    D.log("Script initialized");

    const [object] = await Promise.all([
        S.root.findFirst(objectName)
    ]).catch((error) => D.log(error));

    // rotations as quaternions
    let deviceRotation = DeviceMotion.worldTransform.rotation;

    // trackedPlane rotation vector from patches (comes in degrees)
    let trackedPlaneRotation = await Patches.outputs.getVector('trackedPlaneRotation');
    // the plane lies on its back, so its z rotation becomes the counter-rotation's y
    let trackedPlaneRotationY = degToRad(trackedPlaneRotation.z.neg());
    let counterRotation = R.quaternionFromEuler(0, trackedPlaneRotationY, 0);

    object.transform.rotation = counterRotation.mul(deviceRotation);

    // For fun: desperate debugging (along with the "Axis" block from the Spark AR library)
    const objectRotation = object.transform.rotation;
    let numberFormat = "{:.2f}";
    let debugString = "";
    debugString = R.concat(debugString, "DeviceMotion (x,y,z):");
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(deviceRotation.eulerAngles.x).format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(deviceRotation.eulerAngles.y).format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(deviceRotation.eulerAngles.z).format(numberFormat));

    debugString = debugString.concat("\n");
    debugString = debugString.concat("Tracked plane rotation:");
    debugString = debugString.concat("\n");
    debugString = debugString.concat(trackedPlaneRotation.x.format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(trackedPlaneRotation.y.format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(trackedPlaneRotation.z.format(numberFormat));

    debugString = debugString.concat("\n");
    debugString = debugString.concat("Object rotation:");
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(objectRotation.eulerAngles.x).format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(objectRotation.eulerAngles.y).format(numberFormat));
    debugString = debugString.concat("\n");
    debugString = debugString.concat(radToDeg(objectRotation.eulerAngles.z).format(numberFormat));
    await Patches.inputs.setString('debugText', debugString);
    D.log("End of Log");
})();

function degToRad(degrees) {
    return degrees.mul(Math.PI / 180);
}

function radToDeg(radians) {
    return radians.mul(180 / Math.PI);
}

Also, @josh_beckwith, I’d be super interested in your emitter abuse! Sounds super hacky aka fun :smiley:


What you are looking for is called a LookAt function; that link points to an update for v82 & Promises. There is one on Reactive, but it doesn’t work outside the focal distance. I tried to work out how to make a 3D look-at function for my math library, but it’s too complicated, and since I didn’t really have a project to use it on, I sadly gave up.

Hey Tomas,
thanks for your reply! I tried all the look-at functions a week ago, and I actually started out with your GitHub, so thank you for that. I needed it to work outside the focal distance, and now it does. I posted my code in this thread; it uses both script and patches, which can probably be improved… Also, cool library, I’m sure that will come in handy, thank you!


Oh thank you so much, I should have read the whole thread. I’m glad you found it useful, now that you found a solution I’ll try to incorporate it into the library - which has a few things I need to update anyways.

Unfortunately, it’s not a general look at function… it just looks at the user while on a tracked plane, which is what I needed. But maybe the things I learned on the way can help to build one.

Re: emitters… I basically just copied the focal plane’s world transform onto the individual emitters when the user taps the screen. The emitters themselves are set to always show a single particle - birthrate and lifespan are complementary to each other. It seems a bit inefficient, but it got the job done.

This was all done before the quaternions came into the mix, so maybe they fixed some issues with that feature (I know gimbal lock was a problem before, at least).


Looking into this to do an update to my Math module.

“Unfortunately it wouldn’t update once in the scene. After some debugging I found out that DeviceMotion.worldTransform ACTUALLY ONLY CONTAINS A ROTATION, but doesn’t throw an error when its positional coordinates are accessed. I find this package EXCESSIVELY frustrating to work with.” Yes, absolutely frustrating. After asking many people, I found that the camera position is accessed by checking the position of the Scene “Camera” object, not the DeviceMotion module - answered by Lasse.

Nice explanations, very detailed :clap:t4:

Hey @Tomas_Pietravallo,
our project is finished, and the sprite I got to look at the user got replaced by a 3D object that needs no orienting :smiley:
Still, I wanted to leave this look-at thing in a better state than it was in, so I got rid of the patches. The code looks different here, but the necessary bits are these:

const S = require('Scene');
const R = require('Reactive');
const D = require('Diagnostics');
const DeviceMotion = require('DeviceMotion');

// The planeTracker0 in my scene doesn't have any exposed properties like "worldTransform"
// (I verified this by looking at its keys - they are undefined).
// So I need to introduce an empty object into the tracked plane, which I call 'placementRef'.
// It does nothing but give me the tracked plane's transform.

const objectName = 'orienter';
const placementRefName = 'placementRef';

(async function () {
    D.log("Script initialized");

    const [object, placementRef] = await Promise.all([
        S.root.findFirst(objectName),
        S.root.findFirst(placementRefName)
    ]).catch((error) => D.log(error));
    D.log(getKeys(placementRef));

    // This (overlong) line contains the whole logic:
    // placementRef.worldTransform.rotation.conjugate() is the counter-rotation
    // that the object needs to perform in addition to the device motion,
    // so the two are multiplied with each other.
    object.transform.rotation = placementRef.worldTransform.rotation.conjugate()
                                   .mul(DeviceMotion.worldTransform.rotation);

    D.log("End of Log");
})();
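
The reason the conjugate works as the counter-rotation: for a unit quaternion, the conjugate is also the inverse, so multiplying a rotation by its conjugate cancels to the identity. A plain-JS sketch of that fact (not the Spark AR API):

```javascript
// Quaternions as [w, x, y, z]; Hamilton product.
function qmul(a, b) {
  const [aw, ax, ay, az] = a;
  const [bw, bx, by, bz] = b;
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

// The conjugate negates the vector part; for unit quaternions it is the inverse.
const qconj = ([w, x, y, z]) => [w, -x, -y, -z];

// Some unit quaternion standing in for the plane's rotation (90° about Y):
const c = Math.SQRT1_2;
const planeRotation = [c, 0, c, 0];

// conjugate * rotation cancels to the identity [1, 0, 0, 0], which is why
// conjugate().mul(deviceRotation) leaves only the device part acting on the object.
const identity = qmul(qconj(planeRotation), planeRotation);
// identity ≈ [1, 0, 0, 0]
```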

Hi! I tried this one, but it gives me an error:


Hey @Jorik_Rosa, sorry, just delete the line with the getKeys in it from the code. I left it by accident, will edit my post.
EDIT: I can’t seem to edit my previous post anymore, so here is the code without the line and comments:

const S = require('Scene');
const R = require('Reactive');
const D = require('Diagnostics');
const DeviceMotion = require('DeviceMotion');

const objectName = 'orienter';
const placementRefName = 'placementRef';

(async function () {
    D.log("Script initialized");

    const [object, placementRef] = await Promise.all([
        S.root.findFirst(objectName),
        S.root.findFirst(placementRefName)
    ]).catch((error) => D.log(error));

    object.transform.rotation = placementRef.worldTransform.rotation.conjugate()
                                   .mul(DeviceMotion.worldTransform.rotation);

    D.log("End of Log");
})();

Thank you! It works now! But how can I restrict it to the Y and Z axes? I need my object to look at the camera only left-right and up-down - is that possible?

Hey @Jorik_Rosa, sorry for the late reply - I’m terribly busy at the moment, so I can’t always react quickly.
I did work on your request though, making this a little more flexible and hopefully making the function a little clearer.
The most work-intensive part for me was finding out how the trackedPlane’s worldTransform rotation maps to the global coordinate system. The Axis block from the Spark AR library helped a lot with this, once I stumbled upon it.

const S = require('Scene');
const R = require('Reactive');
const D = require('Diagnostics');
const DeviceMotion = require('DeviceMotion');

const objectName = 'looker';
const placementRefName = 'placementRef';
const pi = Math.PI;

(async function () {
    D.log("Script initialized");

    const [object, placementRef] = await Promise.all([
        S.root.findFirst(objectName),
        S.root.findFirst(placementRefName)
    ]).catch((error) => D.log(error));

    /*
    Construct a quaternion to neutralize the tracked plane's rotation.
    The axes don't directly map to the world coordinate system,
    but these offsets and switches work for me.
    To leave out one of the axes, just use "0" instead of the
    placementRef rotation for that axis, for example to keep your object
    from following the plane's z rotation.
    */
    const trackedPlaneRotation = R.quaternionFromEuler(
        placementRef.worldTransform.rotationX.sub(pi/2),
        placementRef.worldTransform.rotationZ.neg(),
        placementRef.worldTransform.rotationY
    )

    /*
    Construct a quaternion from the worldTransform angles of the DeviceMotion
    so the object follows the device's rotation.
    To leave out one of the axes, just use "0" instead of the DeviceMotion rotation for that axis,
    for example to keep your object from following the device's z rotation like this:

    const deviceRotation = R.quaternionFromEuler(
        DeviceMotion.worldTransform.rotationX,
        DeviceMotion.worldTransform.rotationY,
        0
    )
    */
    const deviceRotation = R.quaternionFromEuler(
        DeviceMotion.worldTransform.rotationX,
        DeviceMotion.worldTransform.rotationY,
        DeviceMotion.worldTransform.rotationZ
    )

    object.transform.rotation = trackedPlaneRotation.mul(deviceRotation);

    /*
    If you want to neutralize all axes of the tracked plane, you can use the "conjugate"
    of its rotation instead of building the counter-rotation from the Euler angles,
    and if you want the object to completely follow the device's rotation, you can just use:
    object.transform.rotation = placementRef.worldTransform.rotation.conjugate().mul(DeviceMotion.worldTransform.rotation);
    */

    D.log("End of Log");
})();

I also uploaded a sample project to https://github.com/johannesrave/spark-ar-utilities to hopefully give further clarification.
Best regards from Berlin


You are my hero :smiley: Thaaaaank you so very much :slight_smile: It works perfectly!


Hi, first of all thank you for this research, it is extremely helpful. I have encountered a small problem when trying the script with z set to 0: when moving around the scene and looking at the object from behind, the object flips on the x axis and is displayed upside down.
I think for now I will just use the one-liner with conjugate to counter all axes.

Thanks for letting me know! I’ll look into it when I’m ready to go back to that dark place… staring at virtual objects facing every way except the one you expect, for days on end.
