I’m having some depth/z-indexing issues when trying to put UI elements on a layer that isn’t the UI layer.
If you’re wondering why I’m not putting it on the UI layer, it’s because I believe the UI layer renders in hierarchy order (top-down), and I need it to render depth-wise instead, since things in the front should hide things in the back. (Though there’s a chance I’m doing something wrong here as well.)
The problem I’m running into is that when I put UI elements onto layers other than the UI layer, things pop in and out (despite my having physically placed some in front of others).
I’m not sure if it’s z-fighting, or if at some camera rotations the engine assumes the element is off-screen and culls it, but I wanted to ask if anyone has run into this or solved it before… or if I’m just doing something wrong and no one has had the issue… because that’s always possible.
I tried setting it to 0.01, as well as throwing on a couple of extra zeroes to see if that would help, but the element still vanishes. My guess is it isn’t getting culled by the near or far clip planes; the depth calculation must just be a little funky.
Well, quick update: if I set the UI elements to use materials instead of a color or texture, the problem seems to go away. I’m guessing writing to the depth buffer is tied to having an actual material. Not sure if this will work permanently, but I guess I’m going to find out, since that’s what I’m going with.
Unfortunately I am facing the same issue. It also persists at runtime.
Editor:
Runtime:
Even though the elements are on a separate layer, under the world layer in the editor panel, the lower part of the red square still disappears into the world-layer plane.
Why is
- the text disappearing based on the angle you look at it?
- the red square not showing over the world-layer plane?
TL;DR: since both elements are on the same transparent render layer, the engine tries to work out what to render first based on distance from the camera. Depending on the camera angle, it may calculate that the text element is further away than the image element, because it’s very difficult to work out what should be rendered first.
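To make the distance-based sort concrete, here is a minimal sketch of back-to-front transparent sorting in plain JavaScript. The function and object names are illustrative, not engine API; the point is that moving the camera flips which element is considered “nearer” and therefore drawn last (on top).

```javascript
// Sort transparent drawables farthest-first by squared distance from the
// camera, so nearer elements are drawn last and end up on top.
function sortBackToFront(camera, drawables) {
  const dist2 = (p) =>
    (p.x - camera.x) ** 2 + (p.y - camera.y) ** 2 + (p.z - camera.z) ** 2;
  return [...drawables].sort((a, b) => dist2(b.position) - dist2(a.position));
}

const text  = { name: "text",  position: { x: 0, y: 0, z: 0.0 } };
const image = { name: "image", position: { x: 0, y: 0, z: 0.1 } };

// From one side of the elements, the image is nearer and draws on top...
console.log(sortBackToFront({ x: 0, y: 0, z: 5 }, [text, image]).map((d) => d.name));
// → ["text", "image"]

// ...but view them from the other side and the draw order flips.
console.log(sortBackToFront({ x: 0, y: 0, z: -5 }, [text, image]).map((d) => d.name));
// → ["image", "text"]
```

Since the two elements sit at nearly the same depth, small camera movements are enough to flip this ordering, which is the popping you see.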
Why do you expect it to? It’s on a layer that doesn’t clear the depth buffer before rendering, and the depth buffer isn’t cleared between rendering the world-layer plane and this layer either.
There are a few ways to work around this, but it depends on what else you have in your scene, e.g. whether you will have models with transparent materials, what this will be used for (e.g. nametags), etc.
To answer your earlier question, I guess I expected it to work like when I replace the layer with the UI layer. But you already explained why that wouldn’t work.
As for your latest question about the use case: I have a 3D world with a VR setup. There are some screens in 3D space which will show messages, a pause menu, and other things. These screens are parented to the camera offset entity so they always appear in front of the user, independent of the user’s location.
The problem is that they clip through walls in 3D space. That’s why I thought of putting these UI elements on a new layer that renders above the world layer.
Back to your original problem of elements not rendering in front of each other the way you expect: given that you want the World UI to render over everything and it’s on its own layer, you don’t have to worry about other transparent objects in the world or on the same layer.
As long as it’s all on a 3D screen UI and you are using UI elements, I believe you can set the render order of the World UI layer to ‘manual’ sorting, as the UI system will then define the render order based on the scene hierarchy.
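As a config-style sketch, assuming a PlayCanvas-style layer API, that setup might look like the following; `app` and the layer name `"World UI"` are placeholders for your own application instance and layer:

```javascript
// Look up the custom layer by the name you gave it in the editor (placeholder name).
const worldUILayer = app.scene.layers.getLayerByName("World UI");

// Let the UI system's hierarchy order decide draw order for transparent
// elements, instead of camera-distance sorting.
worldUILayer.transparentSortMode = pc.SORTMODE_MANUAL;

// Clear the depth buffer before this layer renders, so it draws over the
// world geometry the same way the built-in UI layer does.
worldUILayer.clearDepthBuffer = true;
```

Treat this as a starting point to verify against your engine version’s layer documentation rather than a drop-in fix.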
It’s not obvious in the Editor, but the UI layer provided by the engine/editor is set to clear the depth buffer before rendering, which is why it renders over the top of everything that came before it.
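The effect of that depth clear can be shown with a toy one-pixel “framebuffer” in plain JavaScript (illustrative only, not engine code): a UI panel that is physically behind a wall fails the depth test, but clearing only the depth values before the UI layer lets it draw over the wall while keeping the wall’s color underneath everywhere else.

```javascript
// One pixel of a framebuffer: a color value plus a depth value.
function makeFramebuffer() {
  return { color: null, depth: Infinity };
}

// Standard depth test: keep the incoming fragment only if it is nearer.
function draw(fb, name, depth) {
  if (depth < fb.depth) {
    fb.color = name;
    fb.depth = depth;
  }
}

const fb = makeFramebuffer();
draw(fb, "wall", 2.0);     // world geometry 2 units away
draw(fb, "ui-panel", 5.0); // UI panel behind the wall: fails the depth test
console.log(fb.color);     // → "wall"

// Clear only the depth buffer (not the color) before the UI layer...
fb.depth = Infinity;
draw(fb, "ui-panel", 5.0); // ...and now the panel draws over the wall.
console.log(fb.color);     // → "ui-panel"
```

This is exactly the clipping-through-walls trade-off: with the depth cleared, the panel always wins, regardless of where it sits in 3D space.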