New features in v102

Wow, did you guys check the new changelog yet?
This stuff is crazy!

  • Dynamically instantiate scene objects, materials and blocks in an effect at runtime by using the new create() method exposed by the SceneModule and MaterialsModule APIs. Use dynamic mode to visualize these dynamic objects in the scene.
    https://sparkar.facebook.com/ar-studio/learn/scripting/dynamic-instantiation

  • Write custom shaders in Spark AR Studio.

  • Extract a 3D object’s global and local transform values to position objects in your scene.

  • Optimize imported texture sequences in the Import Texture Animation window before adding them to your project. Click Add Asset > Import > Texture Animation.
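For anyone who hasn't opened the docs link yet, the new API is promise-based: `Scene.create()` builds the object, then you parent it with `addChild()`. Since the `Scene` module only exists inside the Spark runtime, here's a sketch with a tiny stand-in stub so the async pattern itself runs anywhere; in Studio you'd use `require('Scene')` instead, and the `'Focal Distance'` parent name is just my assumption for a typical scene:

```javascript
// Minimal stand-in for Spark's Scene module so this sketch runs outside
// Spark AR Studio. In a real project replace this whole stub with:
//   const Scene = require('Scene');
const Scene = {
  create: (className, props) =>
    Promise.resolve({ className, name: props.name, children: [] }),
  root: {
    findFirst: (name) =>
      Promise.resolve({ name, children: [], addChild(c) { this.children.push(c); } }),
  },
};

// The v102 pattern: create a dynamic object, then parent it into the scene.
async function spawnPlane() {
  const [plane, parent] = await Promise.all([
    Scene.create('Plane', { name: 'dynamicPlane' }), // new in v102
    Scene.root.findFirst('Focal Distance'),          // assumed parent name
  ]);
  parent.addChild(plane);
  return parent;
}
```

Remember to enable dynamic mode in the capabilities, per the article above, or the created objects won't show up.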

But… but… what about all these elaborate workarounds to get the position of a goddamn sphere in the scene? What will we do with all this time freed up by not having to jump through a hundred hoops every project?

Having read through the dynamic instantiation article, it appears that only SOME objects can be dynamically instantiated this way. A model loaded into Assets, for example, wouldn't work - unless it's inside a prepared block, if I understand this right.

I always thought they didn't have instantiation to avoid allocating memory at runtime and to enforce something like "built-in pooling" that avoids garbage collection. Does anyone know whether that train of thought applies at all? Cheers!

2 Likes

Yes, Davide confirmed that models can be instantiated if you place them inside a Block (and instantiate the block). No idea on the memory thing :face_with_monocle:, but I feel like they must have avoided dynamic instantiation all this time for that reason…

Edit: YESSS THE UPDATES ARE CRAZY!! I’m enjoying 102 haha

1 Like

I am super excited about all of this! We can make much more complex shaders, and they are still just as simple to use in patches! In case you are itching to get into this, Adam Ferris made a nice little repo of examples.

And yeah, I’m super happy about the global transform patch. There were way too many of my projects that had scripts just for this reason.

4 Likes

Thaaanks! I’ve been making some terrible little shaders lately and would love some material on that! Aaaaaaand fork’d!

That really is a great repo, and an important update. I'd been meaning to learn GLSL for Unity for quite some time, and now I can get into it while doing Spark… very cool. "Learn concepts, not languages!"

1 Like

It's looking great…
but there is no way to ASSIGN a global transform.
So there's no `mySphere.globalTrans = myPyramid.globalTrans` when the two are in different hierarchies.

So far I could grab the global transform of an object inside the camera tree and apply it successfully, as the input local transform, to an object that sat outside of any hierarchy (not even a child of "device").
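That unparented case works because with no parent, local == world. For a target that does have a parent, you'd have to undo the parent's world transform yourself: local = inverse(parentWorld) × desiredWorld. Here's a translation-only sketch of that idea in plain JS (no Spark APIs, and real scenes would also need rotation and scale folded into a full matrix inverse):

```javascript
// Translation-only sketch: to place a child at a desired *world* position,
// express that position in the parent's space by subtracting the parent's
// world position. (With rotation/scale you'd invert the parent's full
// world matrix instead of just subtracting.)
function worldToLocalPosition(parentWorldPos, desiredWorldPos) {
  return {
    x: desiredWorldPos.x - parentWorldPos.x,
    y: desiredWorldPos.y - parentWorldPos.y,
    z: desiredWorldPos.z - parentWorldPos.z,
  };
}

// Example: pyramid sits at world (1, 2, 3); the sphere's parent sits at (1, 0, 0).
const local = worldToLocalPosition({ x: 1, y: 0, z: 0 }, { x: 1, y: 2, z: 3 });
// Setting the sphere's local position to `local` puts it at the pyramid's world position.
```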

On the other hand…
if I have a picker UI, the whole patch system just dies. Is this new to 102? No idea - it's the first time I've ever needed the picker UI.

Some new findings: my textures had to be uncompressed - and even then it didn't work at first, because I couldn't find a way to clear a texture slot once I'd picked something. Damn, this picker is hard. Apparently I had to delete the previous picker and the compressed textures (I'd wanted single-color in the cheapest format); once everything was deleted and redone from scratch, it took =) Victory!

PS: how do I turn a "color" into a texture? I usually have zero problems with this for shaders, but as an input for the picker, no option I fumbled with was accepted.
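For reference, this is roughly the shape the picker configuration takes from script, as far as I can tell from the docs: each item wants an actual texture asset for its icon, which is why a color signal gets rejected. The `NativeUI`/`Textures` modules only exist inside the Spark runtime, so this sketch uses tiny stand-in stubs (and a made-up `'pickerIcon'` asset name) just to show the pattern; in Studio you'd `require('NativeUI')` and `require('Textures')` instead:

```javascript
// Stand-ins for Spark's NativeUI and Textures modules so this runs outside
// Studio. In a real project replace these stubs with:
//   const NativeUI = require('NativeUI');
//   const Textures = require('Textures');
const NativeUI = {
  picker: {
    visible: false,
    config: null,
    configure(config) { this.config = config; },
  },
};
const Textures = {
  // In a real project this texture must be an uncompressed asset.
  findFirst: (name) => Promise.resolve({ name }),
};

// The picker's items take texture objects (not color signals) as icons.
async function showPicker() {
  const icon = await Textures.findFirst('pickerIcon'); // hypothetical asset name
  NativeUI.picker.configure({
    selectedIndex: 0,
    items: [{ image_texture: icon }],
  });
  NativeUI.picker.visible = true;
  return NativeUI.picker.config;
}
```

So the workaround for a flat color would be baking it into a small uncompressed texture asset and referencing that by name.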

Thanks for sharing the repo Josh! The README alone is already full of great info about SparkSL. I can’t wait to look into the examples.

1 Like

I think the picker UI needs uncompressed textures because they're consumed outside of the Spark runtime. That would also explain why color inputs don't work: the external UI doesn't know what to do with a color signal.

Makes sense. It would be nice to have that in the patch info, though… it wouldn't throw that particular error with the patch - it only appeared when I tried the script route.