So, I have an intuition as to what is going wrong here - but not a solution. It looks as though:
both objects have alpha enabled on their materials
object A is larger than object B
object B is behind object A’s center point, but is partially within its volume
object B either disappears, or partially disappears
Why is this a problem? Well, almost everything in a 3D game relies on scenarios like this - explosions, weapon effects, status FX, AoE FX, etc. So almost everything I try to implement in game runs into this scenario at some point.
Is this a bug? Or something I can configure around?
It’s a case of the renderer not knowing, or it being impossible to know, which order to render the meshes in. When two transparent objects intersect, there is no single correct order.
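To see why no order works, here is a minimal sketch of how per-object transparent sorting (painter's algorithm) typically picks an order - by distance from the camera to each object's center. All names here are illustrative, not PlayCanvas API:

```javascript
// Painter's algorithm sketch: sort transparent objects back-to-front
// by the distance from the camera to each object's CENTER.
function distance(p, q) {
  return Math.hypot(p[0] - q[0], p[1] - q[1], p[2] - q[2]);
}

function sortBackToFront(objects, camera) {
  return objects.slice().sort((a, b) =>
    distance(b.center, camera) - distance(a.center, camera)
  );
}

const camera = [0, 0, 0];
// Object A: large, center at z = 10. Object B: smaller, center at z = 12,
// but part of B's volume pokes inside A (so parts of B are in FRONT of parts of A).
const A = { name: 'A', center: [0, 0, 10] };
const B = { name: 'B', center: [0, 0, 12] };

const order = sortBackToFront([A, B], camera).map(o => o.name);
console.log(order); // ['B', 'A']
```

Because the sort only looks at one point per object, the whole of B is drawn before the whole of A - including the piece of B that is actually closer to the camera, which then blends incorrectly (or vanishes behind A's depth writes). No per-object ordering can fix this; only per-fragment techniques like depth peeling or order-independent transparency can, and those are expensive.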
Not writing to the z-buffer creates its own artifacts - so that’s not going to solve it either. I think the layer system could work. This is an extreme case, but the actual problem occurs when doing things like particle effects (e.g. a shotgun blast) where the quads are too close together. The renderer gets confused about render order and you start getting weird artifacts.
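The layer-system idea amounts to taking the decision away from distance sorting and assigning an explicit order by hand. A rough sketch of the concept (the `drawOrder` field here is illustrative; in PlayCanvas the analogous knob would be a per-mesh-instance draw order or a manually sorted layer):

```javascript
// Sketch: bypass distance-based sorting with an explicit, hand-assigned order.
// Stable from frame to frame, so near-coplanar quads stop flickering -
// at the cost of you choosing the order yourself.
const quads = [
  { name: 'muzzleFlash', drawOrder: 2 },
  { name: 'smoke',       drawOrder: 1 },
  { name: 'sparks',      drawOrder: 3 }
];

const renderQueue = quads
  .slice()
  .sort((a, b) => a.drawOrder - b.drawOrder)
  .map(q => q.name);

console.log(renderQueue); // ['smoke', 'muzzleFlash', 'sparks']
```

The caveat is the usual one: a fixed order is only correct from some viewpoints, so it works best for effects that are always seen from roughly the same angle.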
Pretty sure this is an engine implementation issue - it’s a common problem in other engines (Unity for instance suffers from this, or at least used to). There are ways to avoid it - but each method has caveats.
Not a dumb question - maybe not obvious if you don’t do a lot of particle FX.
For instance, if you have a charge effect that lasts for 2 seconds, say for a laser rifle, the charge starts in one location, and then stays there. It does not follow the gun.
This happens with all particle effects, some you notice less than others (e.g. smoke from a shotgun blast). But essentially, any effect that you want to actually follow the point of emission as it moves through space breaks in PlayCanvas.
It’s particularly noticeable for things like rocket plume effects, rocket jets, particles that have to ‘charge’ etc.
Ooooh… You are doing charge effects, that makes sense now. If I’m reading that right, you would like particles to be ‘attracted’ to the emitter in a sense?
So it’s fairly simple to do in Unity et al - usually just a checkbox to use local coordinates. But unfortunately it doesn’t exist in PlayCanvas (that I can see). It's a bit frustrating, because it’s a hugely important part of getting certain game FX to look ‘correct’.
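For anyone unclear on what that checkbox changes: in world-space emission, a particle's position is baked into world coordinates at spawn time; in local-space emission, it is stored relative to the emitter and transformed by the emitter's current position every frame, so the whole effect follows the gun. A toy sketch (illustrative only, not engine API - positions are 1D for brevity):

```javascript
// World-space vs local-space emission for one particle spawned at t=0
// at the emitter, with zero velocity of its own.
function simulate(emitterPath, localSpace) {
  const spawnPos = emitterPath[0]; // emitter position when the particle spawned
  return emitterPath.map(emitterPos =>
    localSpace
      ? emitterPos   // position stored relative to the emitter: follows it
      : spawnPos     // position baked into world space at spawn: stays put
  );
}

// The emitter (e.g. a charging laser rifle) moves from x=0 to x=3 over 4 frames.
const emitterPath = [0, 1, 2, 3];

const worldSpace = simulate(emitterPath, false);
const localSpace = simulate(emitterPath, true);
console.log(worldSpace); // [0, 0, 0, 0] - the charge is left hanging in the air
console.log(localSpace); // [0, 1, 2, 3] - the charge tracks the gun
```

The world-space result is exactly the broken charge effect described above: the particle stays where the gun *was*.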
So in our game there are a number of situations where the particle effect must run for a few seconds while the emitter is moving. Because the particles are emitted into world space, it creates some pretty buggy-looking results. So far we’ve either had to create them procedurally or re-design… but it’s definitely diminishing the quality as a result.