Hi Team,
We are going to implement a multiplayer 3D game project, and we have a couple of things to clarify at the production level.
->What is the best and most reliable way to implement a multiplayer 3D game?
-So far we have come across the approach of hosting a server on a third-party service and
communicating with it over a socket connection. Real Time Multiplayer | Learn PlayCanvas
Is this the best approach available so far, or are there other, better ways to do this?
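For reference, the hosted-server-plus-sockets approach linked above usually boils down to clients sending their transform each tick and the server broadcasting a merged snapshot that clients interpolate between. A minimal sketch of that message layer, kept transport-agnostic so it works over WebSockets or anything else (the message shapes and the `lerpState` helper here are assumptions for illustration, not part of any PlayCanvas API):

```javascript
// Minimal snapshot message layer for a socket-based multiplayer game.
// Clients send their transform; the server merges updates into a world
// snapshot and broadcasts it; clients interpolate between snapshots.

// Client -> server: one player's transform for this tick.
function encodeUpdate(playerId, pos, rotY) {
  return JSON.stringify({ t: 'update', id: playerId, pos, rotY });
}

// Server side: merge an incoming update message into the world snapshot.
function applyUpdate(snapshot, msg) {
  const m = JSON.parse(msg);
  if (m.t === 'update') snapshot[m.id] = { pos: m.pos, rotY: m.rotY };
  return snapshot;
}

// Client side: interpolate between two states to hide network jitter.
function lerpState(prev, next, alpha) {
  return {
    pos: prev.pos.map((p, i) => p + (next.pos[i] - p) * alpha),
    rotY: prev.rotY + (next.rotY - prev.rotY) * alpha,
  };
}

// Example round trip: two updates for player "p1", then render halfway.
const snapshot = {};
applyUpdate(snapshot, encodeUpdate('p1', [0, 0, 0], 0));
applyUpdate(snapshot, encodeUpdate('p1', [2, 0, 0], 90));
const mid = lerpState({ pos: [0, 0, 0], rotY: 0 }, snapshot['p1'], 0.5);
console.log(mid.pos, mid.rotY); // [1, 0, 0] 45
```

In a real project the transport (raw WebSockets, Photon, Colyseus, etc.) replaces the plain JSON strings, but the shape of the flow stays the same.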
->How can we load avatars at runtime, and what would be the best platform for loading them into our scene?
-So far we are able to load avatars from Ready Player Me. PlayCanvas | HTML5 Game Engine
Is there any other, better service we could use?
->How can we implement voice chat for avatars in terms of lip-sync, hand-movement animations, etc.?
->If we go with avatars and lip-sync in our project, will our scene get heavy, and will it still run smoothly?
So please provide your input on this multiplayer project at the production level. Beyond the queries above, any other suggestions and input are appreciated.
How can we load avatars at runtime, and what would be the best platform for loading them into our scene?
At the moment, Ready Player Me is the best service available for that. You might be able to use Snapchat’s Bitmojis, but that is a fairly restricted service.
How can we implement voice chat for avatars in terms of lip-sync, hand-movement animations, etc.?
To add on to @Leonidas’s answer, I’ve written a short starter guide on Photon (and how to integrate it into PlayCanvas projects) here. Hope it helps!
I haven’t done anything with voice chat and lip-syncing there, but ultimately it was just animating morph targets on the lips based on an external factor. In my case here: https://twitter.com/yaustar/status/1496918844874575874
That was based on marker positions from face-tracking technology.
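For what it’s worth, a crude version of “morph targets driven by an external factor” can be driven by nothing more than the voice signal’s loudness: compute the RMS of each audio frame and map it to a single mouth-open morph weight. A rough sketch with plain numbers (the noise floor, gain, and smoothing constants are guesses you would tune by ear, and wiring the resulting weight into the engine’s morph target is left out):

```javascript
// Map a frame of PCM samples (floats in [-1, 1]) to a 0..1 "mouth open"
// weight for a morph target, with smoothing so the jaw doesn't flicker
// from one audio frame to the next.

function frameRms(samples) {
  let sum = 0;
  for (const s of samples) sum += s * s;
  return Math.sqrt(sum / samples.length);
}

function mouthOpenWeight(samples, prevWeight, opts = {}) {
  const { noiseFloor = 0.02, gain = 8, smoothing = 0.6 } = opts;
  const rms = frameRms(samples);
  // Ignore background noise, then scale speech loudness into 0..1.
  const target = Math.min(1, Math.max(0, (rms - noiseFloor) * gain));
  // Exponential smoothing between frames.
  return prevWeight * smoothing + target * (1 - smoothing);
}

// Example: silence keeps the mouth closed, a loud frame opens it.
const silence = new Array(512).fill(0);
const loud = new Array(512).fill(0.5);
let w = 0;
w = mouthOpenWeight(silence, w); // stays at 0
w = mouthOpenWeight(loud, w);    // rises toward 1
console.log(w.toFixed(2)); // "0.40"
```

In a browser, the per-frame samples would come from the Web Audio API (e.g. an `AnalyserNode` on the WebRTC stream). This only gives jaw flapping, not real visemes, but it is cheap and works in real time.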
Hi Team,
We are able to load and use Ready Player Me avatars. For voice chat, we implemented WebRTC, but we still need the avatar animations (lip-sync, facial expressions, etc.) to stay in sync with each player’s voice. We are currently exploring this requirement, so can anyone suggest how we can keep the avatar animations in sync with players’ voice chat?
Doing this from scratch is going to be REALLY difficult.
You effectively have to map voice/audio to expressions and mouth shapes on the RPM avatars. You may be able to find a library that can do that, but even then, doing it in real time is going to be a tall order.
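To give the mapping a concrete shape: lip-sync SDKs in the Oculus style emit a per-frame array of viseme weights, and Ready Player Me avatars (at the time of writing) expose matching Oculus-style viseme morph targets such as viseme_aa, which is worth verifying against your own avatar’s mesh. Assuming such a weight array exists, applying it is mostly bookkeeping; the hard part, as noted above, is producing those weights from live audio. A sketch of the bookkeeping (the target names and the dominant-viseme helper are assumptions for illustration):

```javascript
// Given per-frame viseme weights (e.g. from a lip-sync library), build
// the morph-target weight map an engine would apply to the avatar's
// face mesh, and pick the dominant viseme for debugging.

const VISEMES = [
  'viseme_sil', 'viseme_PP', 'viseme_FF', 'viseme_TH', 'viseme_DD',
  'viseme_kk', 'viseme_CH', 'viseme_SS', 'viseme_nn', 'viseme_RR',
  'viseme_aa', 'viseme_E', 'viseme_I', 'viseme_O', 'viseme_U',
];

// weights: array of 15 numbers in [0, 1], one per viseme above.
function visemeWeightMap(weights) {
  const map = {};
  VISEMES.forEach((name, i) => { map[name] = weights[i] ?? 0; });
  return map;
}

function dominantViseme(weights) {
  let best = 0;
  for (let i = 1; i < weights.length; i++) {
    if (weights[i] > weights[best]) best = i;
  }
  return VISEMES[best];
}

// Example: a frame where the "aa" mouth shape dominates.
const frame = new Array(15).fill(0);
frame[10] = 0.9; // viseme_aa
frame[0] = 0.1;  // a little silence blended in
console.log(dominantViseme(frame)); // "viseme_aa"
```

Each entry in the resulting map would be written to the corresponding morph target weight on the avatar’s face mesh every frame.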
Hi @yaustar ,
Thanks for the reply.
From what we have checked with Unity, it can use a Photon server for voice chat along with the Oculus OVR SDK to keep avatar animations in sync with the player’s voice. Is there anything similar for PlayCanvas, or any WebGL-supported tools or SDKs, to do that?
You will have to find an external library; PlayCanvas won’t provide this out of the box. I don’t know of any offhand, but yes, something like the OVR SDK is the type of thing you need: https://www.youtube.com/watch?v=4JGxN8q0BIw