The exact FPS-conversion from Blender to PlayCanvas?


#1

Without any illusions about where it might end, I am (maybe naïvely) trying to make a lip-synced storytelling project (FBX-based, with sound bites that fire and adhere to the animation). Within that realm, knowing the exact FPS conversion between my FBX builder (Blender) and PlayCanvas becomes pivotal for reaching the highest possible synchronization. (By the way, I have already googled alternatives to FBX + sound heavily.)
As it stands right now, I have come to these sub-conclusions:

  1. The consensus maximum execution framerate in PlayCanvas stands at 60 FPS.
  2. Beneath that maximum, I have found that a strip/action extent of 400 frames results in an FPS conversion of approximately 28 FPS:
    -> put this alongside the imported duration reported by PlayCanvas (14.1667 seconds):
    Conversion result: 400 frames / 14.1667 seconds = 28.235 FPS as the functional export rate
  3. Also beneath that maximum, I believe there is an alternative export-setup conversion, where the 14.1667-second duration relies only on the actual extent of the keyframes in the Blender ‘Dope sheet summary’/‘NLA strip’:
    Conversion result: 340 frames / 14.1667 seconds = 23.999 FPS as the functional export rate

Conclusive question: Am I right to think that the conversion is based on 3) and not 2)?
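
For reference, here is a small arithmetic sketch of the two candidate conversions, using only the frame counts and the 14.1667-second imported duration quoted above (the speed factor at the end is only meaningful if interpretation 3 holds):

```ts
// Candidate conversions for the imported clip (numbers taken from the post above).
const importedDuration = 14.1667;             // seconds reported by PlayCanvas

// 2) Using the full strip/action extent of 400 frames:
const fpsFromStrip = 400 / importedDuration;  // ≈ 28.235 FPS

// 3) Using only the 340 keyframed frames (Dope sheet summary / NLA strip):
const fpsFromKeys = 340 / importedDuration;   // ≈ 24.0 FPS

// If 3) is right, the clip was effectively exported at Blender's 24 FPS, and the
// speed multiplier relative to the 60 FPS maximum would be:
const speedFactor = fpsFromKeys / 60;         // ≈ 0.4

console.log(fpsFromStrip.toFixed(3), fpsFromKeys.toFixed(3), speedFactor.toFixed(3));
```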


#2

Blender will have its own internal ‘FPS’ setting, so I wouldn’t use frames as a measure of time. I would have thought that the animation timeline in Blender would have actual time units (seconds) to work with.


#3

Ok, I am currently doing some testing alongside this, where two computers (PC and laptop) execute the same custom-made script that uses the 0.4 conversion (FPS: 24/60). Maybe I will post some of the results. Not so alarming an issue, so it is kind of solved… I wanted some extra info from the backend, but thx anyway :slight_smile:
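
For anyone following along, a minimal sketch of what such a conversion script could look like, assuming the entity uses the animation component and its speed property (the entity name 'Character' is hypothetical):

```ts
import * as pc from 'playcanvas';

// Slow down a clip that was authored at 24 FPS in Blender but plays back too
// fast in a 60 FPS runtime, using the 24/60 = 0.4 factor discussed in this thread.
// Assumes `app` is an initialized pc.Application.
function applyFpsConversion(app: pc.Application): void {
    const character = app.root.findByName('Character') as pc.Entity;
    if (character && character.animation) {
        character.animation.speed = 24 / 60; // 0.4: perceived-speed correction
    }
}
```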


#4

IIRC Blender defaults to 24 or 25 frames per second for animation


#5

Right. Animation systems like Blender typically work in units of “frames” and then let the user specify how many frames they want to play per second. I would assume that an FBX export would respect the FPS setting of the particular animation timeline in question, but a person would have to test that to be sure.

From observation, it seems like PlayCanvas tries to render a frame every 1/60 of a second; the dt interval is typically around 0.01667. I think I would try to monitor dt in some way if I were trying to keep things in perfect sync.
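
A minimal sketch of what monitoring dt could look like, assuming `app` is the running pc.Application (the once-per-second logging interval is arbitrary):

```ts
import * as pc from 'playcanvas';

// Log the average frame time roughly once per second so drift away from the
// nominal 1/60 s (≈ 0.01667 s) is easy to spot.
function monitorDt(app: pc.Application): void {
    let accumulated = 0;
    let frames = 0;

    app.on('update', (dt: number) => {
        accumulated += dt;
        frames++;
        if (accumulated >= 1) {
            const avgMs = (accumulated / frames) * 1000;
            console.log(`avg dt: ${avgMs.toFixed(2)} ms over ${frames} frames`);
            accumulated = 0;
            frames = 0;
        }
    });
}
```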


#6

Now we are ‘nerding’ somewhere :slight_smile:

yaustar: consider your Gangnam-dancing robot from the animation example, which was made in Blender. You would then have to change the executed animation speed in the PlayCanvas Editor to 0.4 (the conversion) in order to slow it down to the perceived-correct speed.

wturber: in the PlayCanvas FPS example, the number in the top corner can be switched from FPS to MS -> this ms value is exactly the dt update time, which changes in relation to ‘cost’ (system performance [Profiler: GPU, triangles, shaders etc.]).


#7

IIRC, PlayCanvas targets vsync rate and runs as fast as it can.

I still don’t understand why we are talking about frames and not time. If it’s understood that each frame in Blender is 1/24th of a second, then animate as such? Am I missing a workflow issue here?

Edit: to put it more clearly, the PlayCanvas framerate is variable and frame counts should not be used as a measure of time. Use dt in the update loop to keep track of how much time has elapsed.
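
To illustrate, a small sketch of using accumulated dt as the clock instead of a frame counter (the 14.1667-second target is just the clip duration from post #1; the callback is hypothetical):

```ts
import * as pc from 'playcanvas';

// Track elapsed time by summing dt instead of counting frames, so the
// measurement stays correct even when the framerate varies.
function trackElapsed(app: pc.Application, onReached: () => void): void {
    let elapsed = 0;           // real seconds since we started
    const target = 14.1667;    // e.g. the imported clip duration from post #1

    app.on('update', (dt: number) => {
        elapsed += dt;
        if (elapsed >= target) {
            onReached();       // fire the next sound bite, switch clips, etc.
            elapsed -= target;
        }
    });
}
```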


#8

On those areas I totally agree, yaustar… and I will update my knowledge according to what you are saying in regards to vsync (https://www.vsynctester.com/detect.html) - that is super info.

The main focus of this conversion post is getting to a future place where FBX better mimics a video file (rendering follows sound during execution):

  1. Autodesk has only allowed sound to work inside an FBX file in their own MotionBuilder: https://forums.autodesk.com/t5/fbx-forum/can-fbx-contain-sound-track/td-p/4258886
  2. One cannot stretch or vary the sound execution time in PlayCanvas (which is good and ok within this sync issue), so all the fuss in this post concerns my own experiment of making a scripted structure that comes close to a workable render-sound synchronization.
  • I hope to help myself along with others, since 1) can’t be used inside PlayCanvas.

->> Furthermore, in regards to the vsync area: I am already using dt in my scripting to go beyond and above vsync rates of 60.
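
Since the sound can’t be time-stretched (point 2 above), one possible shape for such a scripted structure is to treat the audio as the master clock and gently steer the animation speed toward it. A rough sketch only, assuming the animation component’s speed property and that pc.SoundInstance.currentTime reports the playback position in seconds; the entity name, slot name, and the 0.1 correction gain are all hypothetical:

```ts
import * as pc from 'playcanvas';

// Keep a 24 FPS-authored clip roughly in step with its sound bite by treating
// the audio as the master clock and nudging the animation speed toward it.
function syncClipToSound(app: pc.Application): void {
    const narrator = app.root.findByName('Narrator') as pc.Entity;
    if (!narrator || !narrator.sound || !narrator.animation) return;

    const voice = narrator.sound.play('line01');   // slot name is hypothetical
    if (!voice) return;

    const baseSpeed = 24 / 60;                     // the 0.4 conversion from above
    narrator.animation.speed = baseSpeed;

    // How far the motion has advanced, expressed in real seconds at its
    // intended (base) pace, so it can be compared against the audio clock.
    let progress = 0;

    app.on('update', (dt: number) => {
        progress += dt * (narrator.animation!.speed / baseSpeed);
        const drift = voice.currentTime - progress; // > 0: motion lags the audio

        // Lean the playback speed toward the audio instead of jumping, so the
        // correction is not visible as a pop in the motion.
        narrator.animation!.speed = baseSpeed * (1 + drift * 0.1);
    });
}
```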


#9

And in relation to yaustar’s last remark: this post should have been called “A vsync FPS-conversion from Blender to PlayCanvas”.