Gallery Texture

Hi everyone,
I have some questions about the gallery texture.
I just watched #sparkarquicktipsv122, and it left me with these questions:

  1. Can we now use a video for the gallery texture?
  2. I always use a sender called “camtex” to feed all further processing. I want people to be able to use my effect on media from their gallery. My guess is the logic would be something like this, but how do I make the trigger for it?
    (Why not use the UI picker, tap, or slider? Because I often use those for other things.)
  3. Is it possible to make the face tracker track the face from the gallery texture instead of the camera texture?
  4. Same question as number 3, but for segmentation instead of the face tracker.
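To make question 2 concrete, here is a toy sketch of the switching logic in plain JavaScript. This is not the Spark AR API (the episode suggests gallery-texture control lands in scripting, but the exact calls weren't shown); `selectSource`, `galleryReady`, and the texture names are hypothetical stand-ins.

```javascript
// Hypothetical model of the camera-vs-gallery source switch.
// None of these names are real Spark AR API identifiers.
function selectSource(galleryReady, galleryTex, camTex) {
  // Feed downstream processing from the gallery texture once the user
  // has actually picked media; otherwise fall back to the camera sender.
  return galleryReady ? galleryTex : camTex;
}

// The "trigger" is then whatever flips galleryReady — e.g. an event
// fired when the user's gallery selection finishes loading, so the
// UI picker, taps, and sliders stay free for other interactions.
let galleryReady = false;
let source = selectSource(galleryReady, "galleryTex", "camtex"); // camera feed

galleryReady = true; // trigger fired after the user picks media
source = selectSource(galleryReady, "galleryTex", "camtex"); // gallery feed
```

The point is that the trigger doesn't need to be a user gesture at all: a "media selected / loaded" event can drive the switch on its own.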

Just re-watched the episode to be sure, but yeah:
1° Video is possible for the gallery texture.
2° He’s mainly saying the change will be in scripting; it doesn’t seem like we have access to it via patches. Basically, they’re adding the ability to choose a time clip from the user’s uploaded content. Not sure whether they give us runtime controls to edit those clips, though.
3° Not sure, will test it out though
4° same

You can’t use any of the ML processing on gallery videos, but I think you could do this with the “existing media” option for post-processing pre-recorded videos.

I haven’t used this option yet, so I’m not sure what the limitations are


And AFAIK you can’t defer the ML processing to further along in the render pipeline either, right? (E.g. track the face generated by my effect.) But I can confirm that I frequently apply segmentation-based effects to existing media (from within IG on a phone), so that is indeed possible… (Strangely, while person segmentation worked, hair and skin segmentation were not yet compatible.) Other limitations for existing-media effects: no interactive gestures (screen taps, native UI, etc.), no audio, and no target or plane trackers.
