A new camera setup

Hello guys, some days ago I saw on TV the presentation of a headset where you can plug in the new Samsung phone and use it as a virtual reality device. Since PlayCanvas allows you to create games for mobile phones, I thought you could add a new preset camera function that enables 3D visualization: calculating the distance of objects from the camera and creating the offset red/green (anaglyph) images. It would be useful for PC apps too; just by using red/green lenses one could experience awesome 3D effects. What do you think?

I think you saw the Samsung Gear VR device. To target that device with PlayCanvas, you can set up stereoscopic rendering in your app. We already have a 360 Video Player example project that does this, although it doesn't specify any eye separation. This is because it's not 3D video, so eye separation wouldn't have any effect. Anyway, if you have a Cardboard VR device, it's worth having a play with that project.

To target VR devices like the Oculus Rift and HTC Vive, you can use the WebVR integration scripts available on GitHub. WebVR provides low-latency head tracking and also handles the fish-eye distortion effect for each eye.
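For reference, here's a minimal sketch of what those integration scripts wrap: the (now legacy) WebVR 1.1 browser API. The canvas argument and the render callback are placeholders you'd supply yourself; this isn't the PlayCanvas script itself.

// Hedged sketch of the legacy WebVR 1.1 browser API, assuming the browser exposes navigator.getVRDisplays.
function startWebVR(canvas, render) {
    if (!navigator.getVRDisplays) {
        console.warn('WebVR is not available in this browser');
        return;
    }
    navigator.getVRDisplays().then(function (displays) {
        if (displays.length === 0) return;
        var display = displays[0];
        // Presenting must usually be triggered from a user gesture (tap/click).
        display.requestPresent([{ source: canvas }]).then(function () {
            var frameData = new VRFrameData();
            function onFrame() {
                display.requestAnimationFrame(onFrame);
                display.getFrameData(frameData); // head pose + per-eye view/projection matrices
                render(display, frameData);      // draw the left/right eye views here
                display.submitFrame();           // push the rendered frame to the headset
            }
            display.requestAnimationFrame(onFrame);
        });
    });
}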

1 Like

Sorry, it's not clear to me if there is a script or setting for VR, specifically for Cardboard via the mobile web. It seems that the 360 Video Player is just for video. How about non-video scenes?

The 360 Video Player scene works for any kind of scene.

In the case of the video player we're rendering video to a sphere. You could also render a still image.

Or you could remove the sphere entirely and the camera will work for a 3D scene.
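As a rough illustration of the still-image variant, something like the following PlayCanvas script could swap the video texture for a 360 photo. The script name, attribute name, and the assumption that the sphere entity has a model component with a preloaded texture asset are all mine, not part of the example project:

var PanoSphere = pc.createScript('panoSphere');

// Equirectangular still image to show on the inside of the sphere (assumed attribute name).
PanoSphere.attributes.add('panoTexture', { type: 'asset', assetType: 'texture' });

PanoSphere.prototype.initialize = function () {
    // Assumes this entity has a model component (the sphere from the example project).
    var material = this.entity.model.meshInstances[0].material;
    // Use the still image instead of the video texture.
    material.emissiveMap = this.panoTexture.resource;
    material.update();
};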

I was playing with the 360 Video scene that Dave is talking about, and made some modifications for it to work with Google Cardboard, particularly on iOS devices.

Feel free to give it a try:

http://playcanv.as/e/p/ATuuvNEK/

The project can be forked here, in case anybody else finds it useful:
http://playcanvas.com/project/396111/overview/google-cardboard-prototype

1 Like

I cannot get the phone (iPhone 6+) to respond gyroscopically. Does it work for you?

1 Like

Hmm… you're right. I'm testing on my iPhone 5, and I'm getting the same issue, but when I launch the project through the editor, the accelerometer rotations do work:

http://playcanvas.com/editor/scene/428762/launch

I'll look into it tonight when I get home; I'm not sure why the published version isn't working.

Ok, found the problem. In order to detect orientation changes, I need to add a deviceorientation event listener. When I publish the project, the build gets saved on an Amazon server, and is delivered via an iframe to the PlayCanvas site. This, paired with the deviceorientation listener, creates a security issue due to the same-origin policy:

Blocked attempt add device motion or orientation listener from child frame that wasn't the same security origin as the main page.

@max, do you have any suggestions to circumvent this issue?

1 Like

Safari on iOS doesn't support acceleration on insecure (non-https) or cross-domain origins.

Make sure you use the https link (replace http with https: e.g. https://playcanvas.com/editor/scene/428762/launch)

And for published projects, you currently need to use the "embed" link. So for a published app with the URL https://playcanv.as/p/123456, use https://playcanv.as/e/p/123456

2 Likes

Excellent! Thanks, Dave. I didn't know you guys had the "e/" embed shortcut.

@creg this link should work on iPhone 6:
http://playcanv.as/e/p/ATuuvNEK/

1 Like

Oh, the link redirects… Yay, that worked! Thanks!
So is there a script you dropped in that handles the split view?

I see that you have a dual camera system set up in the editor. Do you know if it's possible to clone a public project? Or would I need to set it up manually as you have it?

Yes, as long as it's public, you should be able to make clones:

Visit the project overview, click "Fork", and give it a name. The project will be cloned to your projects section.

And yes, for stereoscopic view, you need to set up two cameras side by side. Here's a quick explanation:

First, you set up two cameras so each one takes up half of the viewport:
Camera Left viewport settings: x: 0, y: 0, w: 0.5, h: 1
Camera Right viewport settings: x: 0.5, y: 0, w: 0.5, h: 1

Then, you space them out a bit to help provide the illusion of depth:
Camera Left position: x: -0.1, y: 0, z: 0
Camera Right position: x: 0.1, y: 0, z: 0
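If you'd rather build the same rig from a script instead of the editor, here's a rough sketch using the values above; the script name and entity names are just assumptions for illustration:

var StereoRig = pc.createScript('stereoRig');

StereoRig.prototype.initialize = function () {
    // Left eye: left half of the viewport, offset slightly to the left.
    var left = new pc.Entity('Camera Left');
    left.addComponent('camera', {
        rect: new pc.Vec4(0, 0, 0.5, 1) // x, y, width, height (normalized)
    });
    left.setLocalPosition(-0.1, 0, 0);

    // Right eye: right half of the viewport, offset slightly to the right.
    var right = new pc.Entity('Camera Right');
    right.addComponent('camera', {
        rect: new pc.Vec4(0.5, 0, 0.5, 1)
    });
    right.setLocalPosition(0.1, 0, 0);

    // Parent both eyes to this entity so head rotation moves them together.
    this.entity.addChild(left);
    this.entity.addChild(right);
};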

1 Like

I was able to get my demo working, camera-wise, but now it's stuttering along (on the iPhone 6+) and I'm not sure why:

EDIT: I had way too many spot lights in play. Once I removed redundant lights, the speed issue went away.

Thanks again for all of your help @marquizzo and @dave !

1 Like

ZuiHou.prototype.update = function(dt) {
    // alpha/beta/gamma are device orientation angles (presumably set by a
    // 'deviceorientation' listener elsewhere in the script).
    if (this.alpha && this.beta && this.gamma) {
        // Reset, then apply the offset rotations in Z-X-Y order.
        this.entity.setEulerAngles(-90, 0, 0);
        this.entity.rotate(0, 0, -this.gamma);
        this.entity.rotate(this.beta, 0, 0);
        this.entity.rotate(0, this.alpha, 0);
        // Compensate for the screen orientation (portrait/landscape).
        this.entity.rotateLocal(0, 0, this.orientation);
    }
};
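For context, an update function like this usually gets its alpha/beta/gamma values from a deviceorientation listener. Here's a hedged sketch of what the matching initialize code might look like; the field names mirror the update function above, but this is an assumption about the rest of the "zuihou" script, not a copy of it:

ZuiHou.prototype.initialize = function() {
    this.alpha = 0;
    this.beta = 0;
    this.gamma = 0;
    // Screen orientation in degrees (0, 90, -90, 180), used by update().
    this.orientation = window.orientation || 0;

    var self = this;
    // Device orientation angles from the gyroscope/accelerometer.
    window.addEventListener('deviceorientation', function (event) {
        self.alpha = event.alpha;
        self.beta = event.beta;
        self.gamma = event.gamma;
    }, false);

    window.addEventListener('orientationchange', function () {
        self.orientation = window.orientation || 0;
    }, false);
};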

Does anyone know how to add touch input (VR + touch)? Please! Like this (please use your mobile phone to open it): http://changan.itqiche.com/api.php/WebVR/index/id/20171012191031Osrn9

My project: https://playcanvas.com/editor/scene/557880

Code: https://playcanvas.com/editor/code/509976?tabs=9555526

You are best off creating a new thread about your problem as I don't think it relates to the VR camera talk here.

Don't forget to include a link to the project if you can.

Thank you! The project links have now been added :grinning:

It looks like you are using some custom HMD VR code/polyfill, so I'm assuming that the HMD that you are using has some sort of 'touch' input similar to the Google Cardboard.

These tutorials/code samples should help: https://developer.playcanvas.com/en/tutorials/?tags=input

Oh, no no no. I'm not using an HMD; I'm only using the "zuihou" script!

Even then, the same code for touch input in the tutorials/code samples applies.
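For example, a basic touch handler in a PlayCanvas script looks roughly like this; it's a minimal sketch, and the script name and what you do inside onTouchStart are placeholders:

var TapInput = pc.createScript('tapInput');

TapInput.prototype.initialize = function () {
    // Only register the handler if the device actually reports touch support.
    if (this.app.touch) {
        this.app.touch.on(pc.EVENT_TOUCHSTART, this.onTouchStart, this);
    }
};

TapInput.prototype.onTouchStart = function (event) {
    var touch = event.touches[0];
    // React to the tap here, e.g. trigger a gaze/tap selection in your VR scene.
    console.log('Touched at', touch.x, touch.y);
    // Prevent the browser from also emitting mouse events for this touch.
    event.event.preventDefault();
};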