Spark AR person segmentation extrusion

Hi, I want to understand how the extrusion effect is done in this filter by holographmusic.

Filter link:

Thanks a lot!!


Hi, there are multiple ways to approach this effect. Since Spark can't read depth information from the device (even a device with multiple cameras for capturing depth), we can try to fake the depth using the luminosity, or brightness, value of the image.

The concept is simply to distort a texture that has a bunch of lines in it with "something", and that "something", as you guessed, is the luminosity or brightness value of the image. The idea is to use that value to feed, or drive, how much distortion you apply to the image. In a black and white texture, black represents 0 (or below) and white represents 1 (or above).
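To make that driver value concrete, here is a minimal sketch in Python with NumPy, standing in for what a luminance computation would do per pixel in a shader. The tiny frame and the Rec. 709 weights are my own assumptions for illustration, not something taken from the actual filter:

```python
import numpy as np

# Hypothetical 2x2 RGB "camera frame", channel values in [0, 1].
frame = np.array([
    [[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]],   # black pixel, white pixel
    [[0.5, 0.5, 0.5], [0.2, 0.7, 0.1]],   # grey pixel, greenish pixel
])

# Rec. 709 luma weights -- one common way to define "brightness".
weights = np.array([0.2126, 0.7152, 0.0722])

# Per-pixel luminance: black -> 0.0, white -> 1.0.
luminance = frame @ weights
```

Black lands at 0 and white at 1, exactly the 0-to-1 range you want for driving distortion strength.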

With that said, you can experiment with different black and white textures to feed the distortion.
You can even try to fake a normal map to drive it.

If you don't know how distortion works, I suggest you take a look at it first and understand how texture coordinates, or UVs, work in shaders.

The basic idea of distortion is that every image inherits a default UV, and to distort means to add, subtract, multiply, or apply any other math to shift the original UV with "something". And yes, if that sounds familiar, it's the same "something" from earlier. It could be any texture, but in this case that "something" is the luminosity or brightness value.
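Here's a sketch of that shift-the-UV idea on the CPU, in Python. The 1D stripe texture, the nearest-neighbour sampler, and the brightness ramp are all made up for illustration; a shader does the same thing per pixel:

```python
import numpy as np

# A tiny 1D "lines" texture: alternating dark and bright stripes.
texture = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0, 1.0])

def sample(tex, u):
    """Nearest-neighbour sample at normalized coordinate u, clamped to [0, 1]."""
    i = int(round(np.clip(u, 0.0, 1.0) * (len(tex) - 1)))
    return tex[i]

# Default UVs: one per output pixel, evenly spread over [0, 1].
uvs = np.linspace(0.0, 1.0, len(texture))

# The driver ("something") -- here a made-up brightness ramp.
brightness = np.linspace(0.0, 0.4, len(texture))

# Undistorted: sampling at the default UVs just reproduces the texture.
plain = [sample(texture, u) for u in uvs]

# Distorted: shift each UV by the driver before sampling.
shifted = [sample(texture, u + b) for u, b in zip(uvs, brightness)]
```

Where the driver is 0 the stripes stay put; where it's bright the sampling point slides along the texture, which is all a distortion patch really does.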

In the Spark AR UV world, green means the Y axis and red means the X axis. So if you want the image distorted up and down, what you need to do is translate that black and white into a black and green value, where fully bright green means distorted down and black means distorted up (because in Spark's coordinate system, (0,0) is the top left and (1,1) is the bottom right).
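A tiny sketch of that y-down convention, again in Python with a toy sampler of my own (the stripe texture and the 0.2 green value are arbitrary; which direction "bright" ends up pushing things depends on the sign you wire in):

```python
import numpy as np

# Horizontal stripes: row 0 dark, row 1 bright, row 2 dark, ...
h, w = 6, 4
tex = np.array([[float(y % 2)] * w for y in range(h)])

def sample(tex, u, v):
    """Nearest sample; (0,0) is the top left, (1,1) the bottom right, v grows DOWN."""
    hh, ww = tex.shape
    x = int(round(min(max(u, 0.0), 1.0) * (ww - 1)))
    y = int(round(min(max(v, 0.0), 1.0) * (hh - 1)))
    return tex[y, x]

green = 0.2  # pretend distortion value stored in the green channel

top_plain = sample(tex, 0.0, 0.0)            # top edge: dark row
top_shifted = sample(tex, 0.0, 0.0 + green)  # adding to v samples further DOWN
```

Adding the green value to v pulls the sample from lower in the texture, so content below shows up at the current pixel.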

If you made it this far, cool. As you can see in the image, the distortion is not that powerful, and no lines cross or overlap each other, which could mean the creator toned down the range. To do that, you could use a Divide patch or even a From Range / To Range combination.
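The From Range / To Range combination is just a linear remap. Here's a sketch in Python (the patch names are Spark's; the 0.05 ceiling is an arbitrary number I picked for the example):

```python
def from_range(x, lo, hi):
    """Map x from [lo, hi] down to [0, 1] -- what Spark's From Range patch does."""
    return (x - lo) / (hi - lo)

def to_range(t, lo, hi):
    """Map t from [0, 1] up to [lo, hi] -- what Spark's To Range patch does."""
    return lo + t * (hi - lo)

def tame(brightness):
    """Tone a 0..1 brightness down so it shifts UVs by at most 0.05."""
    return to_range(from_range(brightness, 0.0, 1.0), 0.0, 0.05)
```

Even a fully white pixel then only nudges the UV by 0.05, which keeps the lines from crossing each other.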

You may also notice that the distortion only happens significantly around the edges, so there is a good chance the creator used a Sobel filter, made it black and white, and maybe translated it to a normal map to drive the distortion. There is also a good chance the creator downsampled the texture using a shader render pass and then applied a tasteful amount of blur before feeding it to the distortion patch, because the distortion looks smooth, with no sharp edges.
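If you want to see what a Sobel filter actually gives you, here's a plain NumPy sketch of the gradient magnitude. A real effect would do this per pixel in a render pass; the flat and edge test images below are made up:

```python
import numpy as np

# 3x3 Sobel kernels for horizontal and vertical gradients.
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def sobel_magnitude(img):
    """Gradient magnitude of a 2D grayscale image (valid region only)."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            patch = img[y:y + 3, x:x + 3]
            gx = (patch * KX).sum()
            gy = (patch * KY).sum()
            out[y, x] = np.hypot(gx, gy)
    return out

# Flat regions produce 0 (no distortion); a hard edge lights up.
flat = np.ones((5, 5))
edge = np.zeros((5, 5)); edge[:, 3:] = 1.0
```

Flat areas come out black (zero distortion) and edges come out bright, which matches the "only moves near the edges" look of the filter.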

Also, just for fun, you can multiply the amount of distortion by something like sound: louder sound, more distortion.

Well, that's it. Of course there will be some further tweaks and fine tuning to find a good balance to your liking, but that's my "guesstimation" of how it works. I could be wrong, but I think it can still guide you in the right direction. Who knows, maybe those lines aren't a texture at all but a bunch of 3D objects with flat shaders, distorting the polygons instead.

I hope you find this helpful. Feel free to come back if you run into problems, or to show us your final result.
Good luck!