i’m really struggling with this one because i’m a total server noob. when i put my website out, i realized that the weight of the files transferred (in the network tab) when launching from playcanvas is significantly smaller than when it’s deployed to my webserver. i figured this may be due to some content-types not being recognized correctly, resulting in them not being gzipped (i possibly got that wrong though :>).
however, @will recommended AWS for hosting to solve the problem. i had quick success deploying something through amplify, which gave me some of these:
this actually happens because playcanvas wants ‘image/dds’ as the content-type, not ‘image/vnd.ms-dds’. i was able to change these by hosting a static website using an S3 bucket instead of amplify. this version is now even bigger than the one giving the warnings. so the results are this:
so the question is: what am i getting wrong here? i even ask myself if this is a problem at all, considering that the transfer time is roughly the same for all three approaches.
while aws was a good tip since it has a lot of advantages, i must say that the last few days felt like this to me: https://youtu.be/ZHRGjfEQpy4
so anyone, any opinions, recommendations, solutions?
thanks … cloudfront is where i stopped and hoped someone on the forum might say ‘leave it be’ … i created a distribution and checked the compression, but i couldn’t find the url to the result… i’ll give it another try then!
the url was right in front of me, of course. dpbeec200egew.cloudfront.net says gzip is active. yay. saves 1mb … the problem now is that cloudfront seemingly gzips only boring file formats like fonts and text-based stuff… i’m pretty sure that it must be customizable. are you familiar enough with aws to give me another clue?
heh … thanks for the suggestion… to be more specific about my problem: i’m using amazon s3 for static website hosting, and i want more exotic formats like .glb or even .dds to be compressed, like when i press play in playcanvas.
so, as yaustar has pointed out, gzip needs no further configuration in cloudfront, and as i stated, it only applies to text/ascii or non-binary files. the lambda function, as i understand it, would then process and gzip all the files before they are put into the s3 bucket. this made me realize that i can upload gzipped files directly to s3. i guess as a pro dev you’d use the CLI (command line interface) for this. i found a tool that has a gui to set mime types and do compression. feels kinda lame, but i guess it uses the same interface after all… i’ll post all the steps it took when i’m done.
so, tldr: gzipping the files upfront, BEFORE uploading them into an s3 bucket, does the trick.
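in case anyone wants to script it instead of using a gui tool, here’s a rough sketch of the same idea with python/boto3 — the bucket name, file list and registered mime types below are just placeholders, not my actual setup:

```python
# rough sketch: gzip files locally, then upload them to s3 with the matching
# content-type and a content-encoding of gzip so browsers unpack them
# automatically. bucket name and file paths are placeholders.
import gzip
import mimetypes
import boto3

bucket = "my-example-bucket"                          # placeholder bucket name
files = ["scene.json", "model.glb", "texture.dds"]    # example files

# mimetypes doesn't know the more exotic formats, so register them first
mimetypes.add_type("model/gltf-binary", ".glb")
mimetypes.add_type("image/dds", ".dds")

s3 = boto3.client("s3")

for path in files:
    content_type = mimetypes.guess_type(path)[0] or "application/octet-stream"
    with open(path, "rb") as f:
        body = gzip.compress(f.read())                # compress before upload
    s3.put_object(
        Bucket=bucket,
        Key=path,
        Body=body,
        ContentType=content_type,
        ContentEncoding="gzip",                       # tells the browser the payload is gzipped
    )
```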
I am now at a point in a current project where I am running into the same type of problems with S3-compatible buckets and file compression. I would be extremely appreciative if you could post the tool you used and give some insights into the steps you took.
as i understand it, cloudfront needs this so it knows which files to gzip. also make sure that everything after ‘Configuring CloudFront to compress objects’ is configured as they point out.
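if some files are already sitting in the bucket with the wrong content-type, something like this should rewrite the metadata in place — again just a sketch with placeholder names, and i assume the gui tool does something similar under the hood:

```python
# sketch: fix the content-type on an object that's already in the bucket by
# copying it onto itself with replaced metadata. names are placeholders.
import boto3

s3 = boto3.client("s3")
bucket = "my-example-bucket"   # placeholder
key = "texture.dds"            # placeholder

s3.copy_object(
    Bucket=bucket,
    Key=key,
    CopySource={"Bucket": bucket, "Key": key},
    ContentType="image/dds",          # the type playcanvas expects
    ContentEncoding="gzip",           # keep this if the file was uploaded pre-gzipped
    MetadataDirective="REPLACE",      # required so the new headers actually stick
)
```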
Thank you very much for the information. I actually completed the purchase of S3 Browser a couple of minutes before your reply! It really is a great tool. I set the headers, compression, and object tag rules to gzip the required file extensions as well as upload them correctly.
Testing my site with https://www.webpagetest.org/ showed that the content was both compressed and coming from a known CDN. I’m using Digital Ocean for my S3 bucket and it comes with a CDN, so luckily, there was no extra configuration there.
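For a quick sanity check outside of webpagetest, a tiny script like the following also shows whether the compressed version is actually being served — the URL is just a placeholder standing in for one of my real assets:

```python
# quick check: request one asset and print the headers that matter for compression.
import requests

url = "https://example.cdn.digitaloceanspaces.com/model.glb"  # placeholder URL

resp = requests.get(url, headers={"Accept-Encoding": "gzip"})
print("Content-Type:     ", resp.headers.get("Content-Type"))
print("Content-Encoding: ", resp.headers.get("Content-Encoding"))
print("Content-Length:   ", resp.headers.get("Content-Length"))
```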
Thank you again for the information! I’m sure this will be really helpful (especially S3 Browser) for others in the throes of the compression vs. CDN nightmare.