Creating "flood effect" with camera depth and occlusion (render output) (LiDAR)

I’m doing some studies to better understand what’s possible when creating Spark AR filters that use LiDAR.
I tried to replicate the “flood” effect, which I thought was cool, but the edges are coming out too jagged. Do you know what I could do to make them look more natural, like in the first example?

You could try blurring the depth texture before using it. The “DepthColorOverlay” template does this, so you can take a look at it there.
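If you want to do the blur in a shader instead of with patches, something like the sketch below could work. This is only a rough illustration, not the template’s actual implementation — the parameter name `depthTex` and the hard-coded texture size are assumptions you’d replace with your own setup.

```glsl
// Sketch: simple 3x3 box blur over the depth texture before it drives the effect.
// `depthTex` is a placeholder name; wire the camera depth texture into it.
precision highp float;

vec4 main(std::Texture2d depthTex) {
  vec2 uv = fragment(std::getVertexTexCoord());
  vec2 texel = vec2(1.0) / vec2(1024.0); // assumed texture resolution
  vec4 sum = vec4(0.0);
  // Average the 9 neighboring samples to soften hard depth edges.
  for (int x = -1; x <= 1; x++) {
    for (int y = -1; y <= 1; y++) {
      sum += depthTex.sample(uv + vec2(float(x), float(y)) * texel);
    }
  }
  return sum / 9.0;
}
```

A wider kernel (or a two-pass separable blur) softens the edges more, at some performance cost on device.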

These jagged lines show up a lot in effects that use the runtime to drive shaders. The default patch shaders use low precision, so you get jagged edges as time progresses and the time value grows large. One way around this is to use the progress value from a Loop Animation patch, since it only ever goes from 0 to 1 and stays small. Another is to add `precision highp float;` to any SparkSL shaders that use `std::getTime();`.
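As a minimal sketch of the second approach, the snippet below forces high precision and also wraps the time value into the 0–1 range (the same idea as the Loop Animation patch’s progress output), so the float never grows large enough to lose fractional accuracy. The output here is just a grayscale pulse for illustration.

```glsl
// Sketch: high-precision SparkSL shader animated by time.
// Without `precision highp float;`, the default lower precision causes
// stepped/jagged results as std::getTime() grows over the session.
precision highp float;

vec4 main() {
  // Wrap time into 0-1 so the value stays small, like a loop's progress output.
  float t = fract(std::getTime());
  return vec4(vec3(t), 1.0);
}
```

Of the two fixes, keeping the animated value in 0–1 is the more robust one, because it removes the large-number problem entirely rather than just delaying it with extra precision bits.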