Only 7 models in FBX format, from 2 MB to 6 MB each (values checked inside the PlayCanvas Editor, from the model sources).
The unzipped archive is 610 MB (!), so yeah, it seems to be 610 MB of textures plus the 637 MB of checkpoints.
Any way to manage that?
I mean, the project is very small; the unzipped web build is only 30 MB.
Wow, I just found two 256 MB DDS files inside the assets folder of my unzipped project archive.
Is that normal for a 4096×4096 lightmap? The sources are two EXR files of 5.65 MB and 18.4 MB, which generated two RGBM PNGs of only 1.52 MB and 3.67 MB.
FBX files get converted to a JSON format by the PlayCanvas Editor service. These files are large compared to the original binary but compress well in an archive and on GZIP-enabled servers (if you are self-hosting, make sure you have GZIP compression enabled).
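For self-hosted builds, enabling GZIP usually comes down to a few server directives. A minimal sketch for nginx (the directives and values here are illustrative defaults, not a PlayCanvas-specific recommendation):

```nginx
# Hypothetical nginx http/server block: compress text-based assets on the fly.
gzip on;
gzip_types application/json text/plain;  # converted JSON models benefit the most
gzip_min_length 1024;                    # skip files too small to be worth compressing
gzip_comp_level 6;                       # reasonable CPU/size trade-off
```

JSON model files typically shrink dramatically under GZIP because of their repetitive structure, which is why the on-disk size is a poor predictor of transfer size.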
Not sure about the DDS files; 256 MB does seem pretty large even for a 4K texture. I'm surprised they are not part of the web build, though.
I just checked my other projects' archives; it seems all my EXR sources generate a huge DDS: 64 MB for a 2048×2048 EXR and 256 MB for a 4096×4096 EXR.
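Those sizes happen to match completely uncompressed pixel data at 16 bytes per pixel (e.g. 4-channel 32-bit float), which would suggest the DDS files are stored raw rather than block-compressed. That is an inference from the numbers, not a confirmed detail of the format; a quick check:

```python
# Sketch: verify that the observed DDS sizes match uncompressed texture data
# at 16 bytes per pixel (an assumption, e.g. RGBA with 32-bit float channels).

def uncompressed_size_mib(width: int, height: int, bytes_per_pixel: int = 16) -> float:
    """Size in MiB of uncompressed pixel data at the given resolution."""
    return width * height * bytes_per_pixel / (1024 ** 2)

print(uncompressed_size_mib(2048, 2048))  # 64.0  -> matches the 64 MB DDS
print(uncompressed_size_mib(4096, 4096))  # 256.0 -> matches the 256 MB DDS
```

The exact 4× jump from 2048² to 4096² is what you'd expect from raw pixel data, since doubling each dimension quadruples the pixel count.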
Is that hopefully a formatting bug in the EXR (-> RGBM PNG) -> DDS conversion?
By the way, the DDS files stay in the archive even if I completely remove the EXR from the project before archiving and keep only the generated RGBM PNG, so if it's a bug, it could be in the RGBM PNG -> DDS step.
We know why this is happening, but it's actually quite difficult to fix. Essentially, the DDS is generated on the backend and used as input for prefiltered cubemap generation (if you happen to use the HDR/EXR file as a face of a cubemap). The right thing to do is to generate the DDS files only when the prefilter operation is executed and delete them when it's done. This is logged in our bug database, but I can't promise an ETA for a fix at this point.
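The fix described above (generate the DDS only for the duration of the prefilter step, then delete it) is essentially a temporary-intermediate-file pattern. A hypothetical sketch of that pattern, not PlayCanvas's actual backend code:

```python
import os
import tempfile

def prefilter_cubemap(face_paths):
    """Hypothetical backend step: create intermediate DDS files only for
    the duration of the prefilter operation, then delete them so they
    never end up in the project archive."""
    dds_paths = []
    try:
        for face in face_paths:
            # Stand-in for the real EXR/PNG -> DDS conversion.
            fd, dds = tempfile.mkstemp(suffix=".dds")
            os.close(fd)
            dds_paths.append(dds)
        # ... run the actual cubemap prefilter on dds_paths here ...
    finally:
        # Clean up the intermediates even if prefiltering fails.
        for dds in dds_paths:
            if os.path.exists(dds):
                os.remove(dds)
    return dds_paths

leftovers = [p for p in prefilter_cubemap([f"face{i}.exr" for i in range(6)])
             if os.path.exists(p)]
print(leftovers)  # [] -- no intermediate DDS files left behind
```

The `try`/`finally` is the important part: the intermediates are removed on both the success and failure paths, which is what keeps them out of the archive.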