I have an issue with texture compression and downloaded size.
There is a mismatch between the sizes reported in the Editor and the sizes of the textures on disk.
I downloaded the project to my drive and checked the texture Parquet_01_d.jpg:
Parquet_01_d.jpg = 3.2 MB
Parquet_01_d-dxt.jpg = 2.8 MB
Parquet_01_d-etc1.jpg = 2.8 MB
Parquet_01_d-pvr.jpg = 2.8 MB
In the Editor, the situation is different:
Parquet_01_d.jpg = 3.2 MB
Parquet_01_d-dxt.jpg = 1.76 MB | VRAM 2.8 MB
Parquet_01_d-etc1.jpg = 1.94 MB | VRAM 2.8 MB
Parquet_01_d-pvr.jpg = 1.75 MB | VRAM 2.8 MB
I think you might just be misunderstanding the meaning of the two numbers. The first is the download size (i.e. the size of the texture when gzipped). The second is the non-gzipped size.
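For what it's worth, you can see the same effect locally. A minimal sketch with stand-in bytes (not an actual DDS texture):

```python
import gzip

# Quick illustration with sample data: the smaller "size on disk" number
# is just what the bytes compress to with gzip; the VRAM number is what
# the GPU holds after the browser inflates the download.
raw = bytes(range(256)) * 4096          # ~1 MB of stand-in data
packed = gzip.compress(raw)
print(f"uncompressed: {len(raw)} bytes, gzipped: {len(packed)} bytes")
```

How much gzip saves depends entirely on the content; compressed texture formats like DXT often still gzip reasonably well.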
Note that our server (https://playcanv.as) is configured to serve DDS files gzipped.
We do if you publish to our playcanv.as domain. If you self-host, you’ll need to configure Apache (or your web server of choice) to serve DDS files gzipped.
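As a sketch, if you self-host on Apache with mod_deflate available, something like the following in your vhost or .htaccess should gzip DDS responses on the fly (the `image/dds` MIME type is an assumption; use whatever type your server actually sends for `.dds` files):

```apache
# Serve .dds files with a known MIME type (image/dds is assumed here)
AddType image/dds .dds

# Gzip DDS responses on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE image/dds
</IfModule>
```

The equivalent on nginx would be `gzip_types` with the same MIME type.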
Hi Will,
Reaching out again about the DDS issues, since they’re a huge stumbling block for us right now.
If we need to save to a database structure, does your S3 bucket allow external uploading tools and code to access those DDS files the way we typically would with PNGs and JPEGs?
Our apps need the DDS or texture map files to be loaded from instructions sent by our backend code, which then loads them into the PlayCanvas-built app.
When we export the DDS files compressed in the Engine to our S3 buckets and CDN, the custom header is causing a lot of errors.
If we use externally encoded DDS files, we lose a lot of quality.
Could you point us to DDS encoding settings that would be similar or compatible to the ones you use in PlayCanvas, or tell us how we’d go about making this error with PlayCanvas DDS files go away?
When the DDS files are self-contained in the exported build hosted on our own servers, it works; we have our server configured to serve DDS files, no problem.
The issue occurs when we move the DDS files to our Amazon server
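One common cause of exactly this symptom, offered as a guess: the DDS files in the build are stored already gzipped, but S3 won’t add a `Content-Encoding: gzip` header for you, so the browser receives gzip bytes it never inflates and the texture loader fails on what looks like a garbled header. A minimal sketch of building the upload metadata, assuming boto3 and a bucket of your own (the actual upload call is illustrative only and commented out):

```python
def make_upload_args(path: str) -> dict:
    """Build S3 ExtraArgs for a DDS file, adding Content-Encoding only
    if the bytes on disk are actually gzipped (gzip magic: 0x1f 0x8b)."""
    with open(path, "rb") as f:
        head = f.read(2)
    args = {"ContentType": "image/dds"}  # MIME type is an assumption; match your server's
    if head == b"\x1f\x8b":
        args["ContentEncoding"] = "gzip"
    return args

# Illustrative upload (requires boto3 and AWS credentials; bucket/key are hypothetical):
# import boto3
# s3 = boto3.client("s3")
# s3.upload_file("Parquet_01_d-dxt.dds", "my-bucket",
#                "textures/Parquet_01_d-dxt.dds",
#                ExtraArgs=make_upload_args("Parquet_01_d-dxt.dds"))
```

With `Content-Encoding: gzip` set as object metadata, S3 and CloudFront pass the header through and the browser inflates the file transparently before the engine parses it.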
I’ve reached out to PlayCanvas for help on this, offering to talk about a paid solution, but they haven’t responded to a single email or a single forum post, and we don’t know why.
It should; it’s when we break the DDS files out of the build that we run into problems.
We have hosted our builds on a few different internal and external servers.
In the build, the DDS files work fine; take them out and, apparently because of the custom header, they cause an error. We assume metadata associated with the DDS files goes missing because of the custom header.
The issue this causes for us is that we need to add textures from clients all the time, and we can’t do a build, download it, and install it every time we add one or two.
We haven’t had a chance to test encoding a DDS after a build and then inserting it into the build, but that would cause havoc with our database structure: how can an external database know the structure of the build, where the DDS files are, and whether they even exist? You have to be able to point to them, and the way PlayCanvas stores all the DDS files in their own folder makes that tough. We’d also need to know which metadata files to update inside the build.
I don’t know why PlayCanvas won’t talk to us about this; we’re willing to talk about a paid solution, but they won’t respond.