As far as we can tell, this affects most iPads running an iOS version below 15.
To complicate things, it’s extremely difficult to differentiate between a MacBook and an iPad running iOS >= 13, so even if we can come up with an alternative lighting scheme, I’m not sure exactly how I’m going to detect affected devices.
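For what it’s worth, the usual heuristic for this relies on the fact that iPadOS 13+ masquerades as desktop Safari but still exposes multi-touch. A minimal sketch (the helper name is mine, and Apple could change this behaviour in a future release):

```javascript
// Heuristic iPad detection (hypothetical helper, not from the project above).
// iPadOS 13+ reports platform "MacIntel" like a real Mac, but real Macs
// report navigator.maxTouchPoints of 0, while iPads report > 1.
function isProbablyIPad(userAgent, platform, maxTouchPoints) {
    if (/iPad/.test(userAgent)) {
        return true; // pre-iPadOS-13 iPads identify themselves directly
    }
    return platform === 'MacIntel' && maxTouchPoints > 1;
}
```

In the browser you’d call it with the real values: `isProbablyIPad(navigator.userAgent, navigator.platform, navigator.maxTouchPoints)`.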
One other thought just to add to this… Are you using BrowserStack by chance? That looks like the BrowserStack UI in the screenshots. My company uses BrowserStack for general web development, but when we started doing WebGL work we tried using it as we always had and noticed a lot of inconsistencies coming from it that cannot be replicated on real phones. For example, on BrowserStack about half the time I use a Galaxy S20+ I get odd artifacting that isn’t real. The bug list goes on and on, so if that is what you’re using, make sure you can reproduce the issue on a physical device.
Are you sure this is iOS related and not model related? 9th generation iPads have an sRGB Retina display while the iPad Pro 11 uses a DCI-P3 Liquid Retina display, so maybe it’s just down to the different color gamuts of the models.
The main changes to the lighting were increasing the Shadow Distance and VSM Blur Size on both our main and fill directional lights so the shadows weren’t as harsh.
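In case anyone is reproducing this from script rather than the Editor, the equivalent light component properties look roughly like this (the entity names and exact values here are placeholders, not our real settings):

```javascript
// Rough script-side equivalent of the Editor settings described above.
// mainLight / fillLight are placeholder entity references.
[mainLight, fillLight].forEach((entity) => {
    const light = entity.light;
    light.shadowType = pc.SHADOW_VSM16;  // variance shadow maps
    light.shadowDistance = 40;           // pushed out so shadows cover the scene
    light.vsmBlurSize = 8;               // softens harsh edges (and seems to trigger the bug)
});
```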
In testing, if we decreased the blur size on the fill light we lost some visual quality but the artifacts seemed to go away; if we dropped it too much, though, we got the artifacts on desktop. Currently desktop looks fine.
At this point, I would look at creating a smaller project to repro this issue, which would make it a lot easier to see the differences across platforms. Is that something you would be able to provide so we can try to track down the commit?
I’m not sure I understand the scene … all I see is what seems to be a white background image with lines, and some windows on top. What is casting and what is receiving shadows here?
I generate a 3D garage door based on some input. This is composed of a bunch of small pieces that I then stretch as appropriate to get the correct size.
I take a screenshot through that ortho cam and save the result to a texture. The texture is then applied to a dynamic mesh plane which is then placed in front of another plane with a picture of a house on it:
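The capture step is just standard render-to-texture; a minimal PlayCanvas sketch of what I’m doing (variable names like `captureCam` and `doorPlane` are placeholders, and the texture size is illustrative) looks something like:

```javascript
// Render the ortho capture camera into a texture, then apply that texture
// to the plane sitting in front of the house photo.
// captureCam / doorPlane are placeholder entity references.
const colorBuffer = new pc.Texture(app.graphicsDevice, {
    width: 1024,
    height: 1024,
    format: pc.PIXELFORMAT_RGBA8,
    mipmaps: false
});
const rt = new pc.RenderTarget({ colorBuffer: colorBuffer, depth: true });

// The ortho camera now renders into the texture instead of the screen
captureCam.camera.renderTarget = rt;

// Use the captured texture as the diffuse map on the dynamic plane
const material = doorPlane.render.meshInstances[0].material;
material.diffuseMap = colorBuffer;
material.update();
```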
The thing that keyed us into the issue was that the client was getting these artifacts on the final scene view. This has happened before on old iDevices (and @yaustar helped me resolve it; I believe the issue in that case was using overly large, non-power-of-two textures), but this time it appears that the distortion is occurring in the live view from the capture cam rather than just in the resulting textures.
I was immediately able to get shadow banding of this type on iPad iOS 14 just by copying our lighting rig over to a blank project and pointing an ortho cam at a white plane. Here is the project.
I’ve tried tweaking basically all of the settings with BrowserStack open on my second monitor, and I do get varying results… but nothing is really quite right. Some settings also make the effect more or less pronounced in the editor. For example, setting VSM Blur Size to 8 makes it extremely apparent:
At this point I’m doubtful that there is any lighting configuration that will actually eliminate this problem on all devices. It seems to be a severe bug in the way VSM blur is applied.
I’d probably just suggest using PCF5 shadows instead of VSM, as VSM is a bit harder to get working consistently. At least until we have time to investigate and improve it.
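For anyone else hitting this, the switch is either the Shadow Type dropdown on each light in the Editor or a one-liner per light in script; a sketch of doing it across a whole scene:

```javascript
// Swap every light in the scene from VSM to 5x5 PCF shadows.
app.root.findComponents('light').forEach((light) => {
    light.shadowType = pc.SHADOW_PCF5;
});
```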