So I’ve been mapping HLS/M3U8 streams onto a 360 sphere for my application, and it’s all been working great… until I upgraded to iOS 14. Now I just get a black sphere with the audio of the stream still playing. No issues on PC/Android since those support hls.js.
This seems to specifically be an issue with iOS devices on Chrome/Safari (not sure about other browsers). I’ve been using the plain video element on iOS, since iOS supports HLS natively through it.
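For reference, the playback-path split described above can be sketched as a small decision helper. This is my own sketch, not code from the app: `canPlay` is assumed to be the string returned by `video.canPlayType('application/vnd.apple.mpegurl')`, and `hlsJsSupported` the result of `Hls.isSupported()` from hls.js.

```javascript
// Decide which HLS playback path to use, given feature-detection results.
// 'native'  -> iOS Safari: set video.src to the .m3u8 URL directly
// 'hls.js'  -> PC/Android: attach the stream via Media Source Extensions
function chooseHlsPath(canPlay, hlsJsSupported) {
  if (canPlay === 'probably' || canPlay === 'maybe') {
    return 'native';
  }
  if (hlsJsSupported) {
    return 'hls.js';
  }
  return 'unsupported';
}
```

Note that `canPlayType` deliberately returns the vague `'maybe'`/`'probably'` strings rather than a boolean, so both truthy values have to be accepted.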
I’m pretty close to app launch and this just popped up so I’m trying my best to narrow this down. I’ll detail my findings so far below.
Wondering if texture.setSource is no longer being fed a supported type? Not sure how that could be, though, since it’s still being passed the video element.
Would love some insight into how to fix this issue, not sure if there is something I’m missing or overlooking. Let me know if more info is needed. Thanks!
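For context, here’s a minimal sketch of the video-texture path in question. The PlayCanvas calls in the comments are the standard API names, but the wiring is my assumption about the app, not its actual code; the one runnable piece is the readyState guard (HAVE_CURRENT_DATA, value 2, is the minimum readyState at which the element has a decodable frame).

```javascript
// A frame can only be uploaded to a texture once the video element
// has decoded at least the current frame (readyState >= HAVE_CURRENT_DATA).
function videoFrameReady(readyState) {
  return readyState >= 2; // 2 === HTMLMediaElement.HAVE_CURRENT_DATA
}

// In the app this would look roughly like (needs a browser + PlayCanvas,
// so it's left as comments here):
//
// const texture = new pc.Texture(app.graphicsDevice, { mipmaps: false });
// app.on('update', () => {
//   if (videoFrameReady(video.readyState)) {
//     texture.setSource(video); // re-upload the current video frame
//   }
// });
```

The per-frame re-upload is the step that appears to silently fail on iOS 14: the element plays (audio is audible), but the uploaded texture comes out black.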
Yep, I see the video playing on iOS/Safari on those TV screens.
EDIT: Just forked the repo and added the bunny stream as the src for the video being fed, and saw black screens on all the TVs. Didn’t expect success here, but it was worth a try.
If it’s not playing with an external domain URL but is playing with a video in the assets, it might be Safari protecting against cross-domain issues? It’d be worth checking what the console logs say when you try to run it.
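If cross-domain protection is the suspect, the usual checklist is on the video element itself. Here’s a hedged sketch of the attributes iOS Safari typically needs before a cross-origin stream can be sampled as a WebGL texture (the helper name is mine, not from the project):

```javascript
// Configure a <video> element so a cross-origin stream can be used
// as a WebGL texture source on iOS Safari.
function configureVideoForCors(video) {
  video.crossOrigin = 'anonymous'; // request a CORS-enabled fetch so WebGL may read the pixels
  video.muted = true;              // iOS blocks unmuted autoplay
  video.playsInline = true;        // keep playback inline instead of forcing fullscreen
  video.setAttribute('playsinline', ''); // attribute form, read by older iOS versions
  return video;
}
```

The server hosting the .m3u8 (and its segments) also has to send `Access-Control-Allow-Origin` headers, or the fetch fails regardless of these attributes.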
Not seeing any CORS issues in the tutorial build, the tutorial build with the bunny stream, or my test 360 project with the bunny stream on iOS/Safari.
I actually ran into this earlier and solved it by using this bunny stream (the one used with the hls.js test player).
The only error I saw, on all the test builds mentioned above, was:
Failed to load resource: The certificate for this server is invalid. You might be connecting to a server that is pretending to be “cdn.webglstats.com” which could put your confidential information at risk.
Doesn’t look related though, especially since it also happened with the official working TV tutorial, where the videos played successfully.
Sorry to bring up an old thread, but have there been any discoveries on this now that iOS 14 has been out for a bit longer?
I’ve refreshed myself on the issue, and the stream is definitely playing: I can hear the video’s audio even though the sphere is black, and playing an m3u8 in a video element on a non-PlayCanvas HTML page does display the video content.
There are no CORS issues on my end either, so I’m back at the same place: trying to figure out why texture.setSource doesn’t apply the video texture, even though a video element is an accepted source type for that function and the element is clearly playing the stream.
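To narrow the symptom down, a quick diagnostic is to check whether the element is actually decoding video frames, or only audio. This is my own sketch: if `videoWidth`/`videoHeight` report a real size while the sphere stays black, the failure is in the texture upload step (consistent with a WebKit bug), not in the stream itself.

```javascript
// Classify a playing-but-black video element by what it reports.
// 'no-decoded-frames'   -> element has no frame size yet (audio-only, or not decoding)
// 'decoded-but-paused'  -> frames exist but playback is paused
// 'decoding'            -> frames exist and playback is running; a black
//                          texture here points at the upload, not the stream
function describeVideoState(video) {
  if (video.videoWidth === 0 || video.videoHeight === 0) {
    return 'no-decoded-frames';
  }
  return video.paused ? 'decoded-but-paused' : 'decoding';
}
```

Logging this alongside the texture.setSource call each frame would show whether WebKit is handing over real frames that then fail to upload.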
Which fix are you referring to? If you mean the ticket being resolved, that’s not something PlayCanvas can implement, as it’s an issue with WebKit. We’ll have to wait for the fix to land in an iOS release.