Currently I’m trying to integrate my PlayCanvas app into one portion of a React app. The PlayCanvas app has 360° video viewing, and to get 360 Cardboard working in PlayCanvas I have to prompt the user on iOS for permission to access device motion and device orientation (otherwise the camera is frozen at its current angle).
So I have a button that does this successfully in a PlayCanvas build, but when I tried it in an embedded iframe on a different site, the prompt didn’t come up at all.
I got it working by exporting the build from PlayCanvas, but I was hoping to use an iframe to keep PlayCanvas contained.
Is there something I’m overlooking here? An iframe setting, perhaps? Let me know if anyone else has run into this.
I think the issue is a new attribute required in iframes.
Check this post:
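For reference, the attribute in question is probably the Permissions Policy `allow` attribute on the iframe, which delegates sensor access to the embedded page. A minimal sketch (the `src` URL is a placeholder, and note that iOS Safari has historically been stricter than Android about motion events in cross-origin iframes even with this set):

```html
<!-- Sketch: delegate motion-sensor access to the embedded PlayCanvas build.
     The src URL is a placeholder for your own build. -->
<iframe
  src="https://playcanv.as/p/YOUR_BUILD_ID/"
  allow="accelerometer; gyroscope; magnetometer; xr-spatial-tracking"
  allowfullscreen
></iframe>
```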
Looks like it works on Android, but on iOS I still have the issue of not being able to look around via device motion and orientation.
I understand that iOS requires user interaction and an explicit permission grant (and I’m fine with having the prompt); it just seems weird that the same user interaction that was prompting the user no longer works because of the iframe.
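For anyone hitting this, a minimal sketch of the permission flow being described (function names are placeholders; iOS 13+ only exposes `requestPermission` on `DeviceMotionEvent`/`DeviceOrientationEvent`, and the call must happen inside a user-gesture handler such as a click):

```javascript
// Detect whether this browser gates motion data behind a permission prompt
// (true on iOS 13+ Safari, false elsewhere).
function needsMotionPermission() {
  return typeof DeviceMotionEvent !== 'undefined' &&
         typeof DeviceMotionEvent.requestPermission === 'function';
}

// Ask for permission (if needed) and then start the motion-driven camera.
// onGranted is a placeholder for whatever starts the look-around behavior.
async function enableMotion(onGranted) {
  if (needsMotionPermission()) {
    // Must be called from within a user gesture or Safari rejects it.
    const state = await DeviceMotionEvent.requestPermission();
    if (state !== 'granted') return false;
  }
  onGranted();
  return true;
}

// Browser-only wiring, e.g.:
// document.getElementById('enter-vr')
//   .addEventListener('click', () => enableMotion(startLookAround));
```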
Was wondering if there is an archive of the WebVR API? I’m trying the method from the link you sent above: sending device orientation data via postMessage (which I’ve set up) to PlayCanvas, then setting the camera rotation to get the same kind of look-around movement as the WebVR Starter Kit.
I’m able to read the device orientation (alpha, beta, gamma) in PlayCanvas, but I’m unsure how it’s being applied to the look-camera in the VR Starter Kit. Do you have any insight into how those deviceorientation values are set on the camera transform? I was just calling this.vrCam.setLocalEulerAngles(phone.beta, phone.gamma, 0).
Hopefully this makes sense; I’m just hoping to avoid ditching the iframe embed and having to implement the raw HTML files in my React app.
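For context, the postMessage relay mentioned above can be sketched like this (element IDs and message shape are assumptions; a `DeviceOrientationEvent` can’t be structured-cloned directly, so only the three angles are copied out):

```javascript
// Pull just the Euler readings out of a deviceorientation event so the
// object can be sent across postMessage.
function packOrientation(e) {
  return { alpha: e.alpha, beta: e.beta, gamma: e.gamma };
}

// Browser-only wiring (commented so the helper above stays self-contained):
//
// Parent page (outside the iframe):
// window.addEventListener('deviceorientation', (e) => {
//   const frame = document.getElementById('playcanvas-frame');
//   frame.contentWindow.postMessage(
//     { type: 'orientation', data: packOrientation(e) }, '*');
// });
//
// Inside the PlayCanvas app:
// window.addEventListener('message', (msg) => {
//   if (msg.data && msg.data.type === 'orientation') phone = msg.data.data;
// });
```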
I think you can override the pitch/yaw in the look-camera.js script, something like this:
this.pitch = phone.beta;
this.yaw = phone.gamma;
But I may be wrong; I don’t have much experience with that VR template. @yaustar may be able to offer more insight on this.
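Building on that idea, here’s a rough sketch of mapping the raw deviceorientation angles onto look-camera-style pitch/yaw. The offsets and clamping are assumptions (portrait orientation, phone held upright), and a production mapping would also account for screen orientation:

```javascript
// Map deviceorientation readings onto pitch/yaw. beta is ~90 when the
// phone is held upright facing you, so offset it to make "upright" mean
// pitch 0, then clamp to the usual look range.
function orientationToPitchYaw(phone) {
  const pitch = Math.max(-90, Math.min(90, phone.beta - 90));
  // alpha (0..360) gives the compass heading; it is a steadier yaw source
  // than gamma, which flips sign as the device pitches past vertical.
  const yaw = phone.alpha;
  return { pitch, yaw };
}

// Then, inside look-camera.js's update loop, something like:
// const { pitch, yaw } = orientationToPitchYaw(phone);
// this.pitch = pitch;
// this.yaw = yaw;
```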
It’s done via the WebVR polyfill/support in the browser. I believe we just get the camera’s transform back from it.