Render the entire 3D project in my own code?

I have published my completed PlayCanvas project to my server. Now, how do I render the entire 3D project in my own code?

Hi @winni,

Could you elaborate on what you mean by "render the entire 3D project in your own code"?

Hi!
To put it simply, I developed a mobile phone application. This application has only two elements: a button and a canvas. When you tap the button on the screen, the 3D project on the server should be rendered to the canvas and shown on the phone screen. But I don't know how to package the PlayCanvas engine into my plugin. I'm stuck now.
Can you give me some suggestions?
Thanks a lot!

Can you use an iframe and open the PlayCanvas app in there?

After loading the scene, I need to add some events to the pc instance to interact with the background, so I can't load it directly in an iframe.

I can see two main options here:

Hi, thanks for your comments!
I have already uploaded my PlayCanvas project to a personal server, so I have a domain, and I have an AR project on my phone. I want to use this exported PlayCanvas project to implement the AR function, for example rendering the 3D project over the camera feed, like in the following picture:


How can I do that?
Thank you

You either have to use WebXR, which is only supported on Android phones, or use an external library like 8th Wall.

If you don't need any tracking of the floor and you just want to overlay a 3D world on top of the camera feed, no matter where you are pointing, you can have the camera feed in a DOM element and PlayCanvas with a transparent canvas on top.
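Roughly like this (an untested sketch for a normal browser page; a WeChat applet would use its own camera API instead of getUserMedia, and 'application-canvas' is just the default canvas id in an exported build):

    // Fullscreen <video> element with the camera feed, sitting behind everything else
    const video = document.createElement('video');
    video.autoplay = true;
    video.playsInline = true;
    video.style.cssText = 'position:fixed; top:0; left:0; width:100%; height:100%; object-fit:cover; z-index:0;';
    document.body.appendChild(video);

    navigator.mediaDevices.getUserMedia({ video: { facingMode: 'environment' } })
        .then((stream) => { video.srcObject = stream; });

    // The PlayCanvas canvas goes on top; it needs an alpha channel and a camera
    // clear colour with alpha 0 so the video shows through
    const canvas = document.getElementById('application-canvas');
    canvas.style.cssText = 'position:fixed; top:0; left:0; z-index:1;';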

I'm still not sure what you're trying to do here.

Can you share the technology stack and software you are using? E.g. what is this AR app on your phone? Did you write it yourself?

I'm not sure if you have heard of WeChat applets. I'm going to overlay a 3D world on top of the camera feed in a WeChat applet, like in the following picture.

I want to write my own code to replace this XXXPlugin. I don't have the code of this plugin, so I don't know how to rewrite these two functions: loadProject and loadScene.

// XXXPlugin is the plugin I need to replace
const { pcSetup, loadProject, loadScene, rotationFromMotion } = requirePlugin('XXXPlugin');
const { pc, app, canvas } = pcSetup(_canvas);
this.app = app;
this.canvas = canvas;
this.pc = pc;
app.start();

loadProject(app, project_path, {
}).then(() => {
    console.log('resourceLoaded');
    return loadScene(app, sceneFileName);
})
.then(scene => {
    console.log('sceneLoaded');
    const camera = app.root.findByName('Camera');
    camera.camera.clearColor = new pc.Color(0, 0, 0, 0);
    camera.camera.clearColorBuffer = true;

    var mat = new pc.Mat4();
    var initRot = new pc.Quat().setFromMat4(mat);
    // ...
});

That looks like marker-based AR tracking.

The closest project we have to that is: https://playcanvas.com/project/481413/overview/ar-starter-kit

Which uses this: https://github.com/playcanvas/playcanvas-ar

You can see from the source how it gets the video stream from the camera and adds it as a DOM element in the background of the page, with the canvas overlaid on top.

https://playcanvas.com/editor/code/481413?tabs=8792451

Hi! Thank you for all your comments.
Actually, I know how to get the video stream from the camera and add it to the background in a WeChat applet. However, I have no idea how to overlay the PlayCanvas project from the server onto the canvas created in the WeChat applet.
The real problem is how to implement the two functions loadProject and loadScene. To be more precise, what is the detailed logic of these two functions?

In a nutshell, you can't overlay an external build of a PlayCanvas app on top of an existing webpage and communicate with it directly.

You HAVE to change the PlayCanvas app and/or the exported build's HTML to do this.

You can either:

  • Add the camera video DOM element as part of the PlayCanvas project
  • Modify the HTML to have the necessary DOM elements and extra buttons needed for the WeChat applet. If you want to load the PlayCanvas app into a specific canvas, look at the exported build's __start__.js code to see how it loads the PlayCanvas application (see the sketch after this list).
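
To give you an idea of what loadProject and loadScene could look like if you write them yourself, here is a rough, untested sketch against a standard exported build (a config.json plus scene .json files), using only the public pc.Application API. This is not the actual code of XXXPlugin, and projectPath / sceneUrl are just placeholder names:

    function loadProject(app, projectPath) {
        return new Promise((resolve, reject) => {
            // config.json of the exported build lists the assets, scripts and scenes
            app.configure(projectPath + '/config.json', (err) => {
                if (err) return reject(err);
                // download everything flagged as "preload" in the Editor
                app.preload(() => resolve());
            });
        });
    }

    function loadScene(app, sceneUrl) {
        return new Promise((resolve, reject) => {
            // load the scene settings (lighting, skybox, etc.) first, then the entity hierarchy
            app.loadSceneSettings(sceneUrl, (err) => {
                if (err) return reject(err);
                app.loadSceneHierarchy(sceneUrl, (err, root) => {
                    if (err) return reject(err);
                    resolve(root);
                });
            });
        });
    }

The exported build's __settings__.js and __start__.js show the actual URLs and options the engine is configured with, so use those as the reference for what to pass in.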

Both of these require transparent canvas to be enabled in the project settings and a transparent clear colour for the camera.
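
If you go with the first option and do it inside the PlayCanvas project, one way to handle the clear colour is a small script on the camera entity (a sketch; the transparent canvas itself is the 'Transparent Canvas' option in the project's rendering settings):

    // Makes the camera's clear colour fully transparent so the DOM behind the canvas is visible
    var TransparentCamera = pc.createScript('transparentCamera');

    TransparentCamera.prototype.initialize = function () {
        this.entity.camera.clearColor = new pc.Color(0, 0, 0, 0); // alpha 0 = see-through
    };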

The PlayCanvas AR Starter Kit that I linked above shows how to do this.

I recommend importing the WeChat library into the PlayCanvas project and doing everything in there.

Is there documentation for this plugin that you are using?

@yaustar, that's the problem: I don't have any documentation or code for that plugin, and now I need to rewrite it. Thank you for the advice, I'll study it first. If I have any questions later, I would like to ask you again. Thank you so much, take care!