[SOLVED] Huge project size in my projects manager

Hello,

I just forked a big project in order to simplify it, but the result is still more than 1 GB on my organization account.

I suspect that compressed textures are still stored inside the project even though I removed 90% of them.

When I download the build or archive the project, I get something like 50–100 MB, but I still see 1.28 GB on the project page.

Any ideas?

Web download with all scenes: 19.1 MB zip
Exported project: 92.8 MB zip
Project size on my projects page: 1.28 GB!

As I pay $50/month for 10 GB and some of it is mysteriously being wasted, could I have a solid answer to this, please?

Do you have any checkpoints?
How many builds do you have hosted on the PlayCanvas servers?


As you can see, this project shows as 1.28 GB on my screen, but it should only be around 100 MB (the size of the archive if I export it).

3 projects in total
I did not publish any builds at all
Where do I check the checkpoint settings? I only see one overall checkpoint size: 637.83 MB

How big is the archive after unzipping? It's possible that the checkpoints plus the uncompressed project size roughly add up to that figure, especially if there are lots of models.

Only 7 models, in FBX format, from 2 MB to 6 MB each (values checked inside the PlayCanvas Editor, from the model sources).
The unzipped archive is 610 MB (!), so yeah, it seems to be 610 MB of textures plus the 637 MB of checkpoints.
Any way to manage that?
I mean, the project is very small; the unzipped web build is only 30 MB.

Wow, I just found two 256 MB DDS files inside the assets folder of my unzipped project archive.

Is that normal for a 4096×4096 lightmap? The sources are two EXR files of 5.65 MB and 18.4 MB, which generated two RGBM PNGs of only 1.52 MB and 3.67 MB.

FBX files get converted to a JSON format by the PlayCanvas Editor service. These files are large in comparison to the original binary but compress well in an archive and on GZIP-enabled servers (if you are self-hosting, make sure you have GZIP compression enabled).
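If you are self-hosting and want to verify that the JSON assets actually come back compressed, a quick check from Python does the job; the URL below is only a placeholder for one of your hosted model files:

    # Rough check that a self-hosted JSON asset is served gzip-compressed.
    # The URL is a placeholder -- point it at one of your hosted model .json files.
    import requests

    url = "https://example.com/assets/model.json"
    resp = requests.get(url, headers={"Accept-Encoding": "gzip"})
    print("Content-Encoding:", resp.headers.get("Content-Encoding"))  # expect "gzip"

If that header comes back empty, the server is sending the JSON uncompressed and the transferred size will be close to the raw file size.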

Not sure about the DDS files. 256 MB does seem pretty large, even for a 4K texture file. I’m surprised they are not part of the web build though :thinking:

They are 100% the problem, and no, the web build only contains the PNG version.
I will try to delete them and reimport them.
I’ve attached the EXR, DDS and PNG versions here if you want to check:

https://www.ld3d-livrable.com/Tests/T2-vide-lm0.exr

https://www.ld3d-livrable.com/Tests/T2-vide-lm0.png

https://www.ld3d-livrable.com/Tests/T2-vide-lm0.dds

The EXR is my source, the PNG is generated in RGBM mode when I import the source, and the DDS is generated somehow inside the project and is only visible in the archive.


Removing and reimporting the EXR did not resolve the issue; I still have a 256 MB DDS present in the archive.

I just checked my other projects’ archives; it seems like all my EXR sources generate a huge DDS: 64 MB for a 2048×2048 EXR and 256 MB for a 4096×4096 EXR.
Is that, hopefully, a formatting bug in the EXR -> PNG (RGBM) -> DDS conversion?
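For what it’s worth, both sizes line up exactly with an uncompressed 32-bit-float RGBA texture (16 bytes per pixel, no mipmaps, no block compression), which is just my guess at what the generated DDS contains:

    # quick size check, assuming 16 bytes per pixel (RGBA, 32-bit float) and no mipmaps
    for side in (2048, 4096):
        print(f"{side}x{side}: {side * side * 16 / 1024 ** 2:.0f} MB")
    # 2048x2048: 64 MB
    # 4096x4096: 256 MB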


By the way, the DDS stays in the archive even if I completely remove the EXR from the project before archiving and keep only the generated RGBM PNG, so if it’s a bug, it could be in the PNG (RGBM) -> DDS part.

I just added you to an empty project where I simply imported my 4096×4096 EXR, waited for the PNG generation to finish, then removed the EXR and archived the project.

Result: a 256 MB DDS in the archive

https://playcanvas.com/editor/project/579706


Interesting fact: the “test dds” project only shows 1.52 MB,
but if I export an archive of it and reimport it, it shows 539.94 MB.

I don’t know why the DDS gets created. It seems odd given it’s a completely blank project. @will, any ideas?

I added @will as well.

Keep this in mind:

No information about this issue?

We know why this is happening but it’s actually quite difficult to fix. Essentially, the DDS is generated on the backend and used as input for prefiltered cubemap generation (if you happen to use the HDR/EXR file as a face on a cubemap). The right thing to do is generate the DDS files if the prefilter operation is executed and delete them when it’s done. This is logged in our bug database. I can’t promise an ETA for a fix at this point though.


Thank you for letting me know.

By the way, I managed to reduce my project size from 1.25 GB to 156.95 MB by doing this:

  • exporting the “too big” project
  • removing the faulty DDS files from the archive and editing the hashes.json file to delete the references to those files (see the sketch after this list)
  • reimporting the project from the corrected archive
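In case anyone wants to script step 2, here is a rough Python sketch of what I did. It assumes the export is a regular zip and that hashes.json is a flat JSON object whose keys or values mention the asset file names; the archive names are placeholders, so check your own export before relying on it.

    # Rough sketch: strip the generated .dds files from an exported PlayCanvas
    # archive and drop their entries from hashes.json. SRC/DST are placeholder
    # names, and the hashes.json layout is an assumption -- adjust as needed.
    import json
    import zipfile

    SRC = "project-export.zip"
    DST = "project-export-clean.zip"

    with zipfile.ZipFile(SRC) as src, zipfile.ZipFile(DST, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            if item.filename.lower().endswith(".dds"):
                continue  # skip the oversized generated DDS files entirely
            data = src.read(item.filename)
            if item.filename.endswith("hashes.json"):
                hashes = json.loads(data)
                # drop any entry that still references a .dds file
                hashes = {k: v for k, v in hashes.items()
                          if ".dds" not in (k + json.dumps(v)).lower()}
                data = json.dumps(hashes).encode("utf-8")
            dst.writestr(item, data)

Reimporting from the cleaned archive is then step 3 as usual.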

Nice trick @Exanova! I’m glad you managed to find a workaround.

Hopefully it won’t take too long before we can address this properly.


Hi, I just want to add my own workaround for this bug. Although it’s not efficient, I managed to shrink my project size from 1.46 GB to 354 MB.

I just copied everything from my old project to a new project.

Downside: I have to set every attribute reference again.

Sorry for my English.
