Website optimizations to save the planet (probably not though)

2020/07/18

Tags: meta tech

I’ve said a few times in the past that this website is pretty optimized. It’s completely static, meaning it has zero backend apart from a standard HTTP server (I use Apache because I vaguely knew how to configure Apache). The content is generated by a static site generator called Hugo on my desktop system and then transferred to my VPS using rsync.
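For reference, the whole deploy boils down to roughly two commands (the host and paths here are placeholders, not my actual setup):

    # Build the site into public/, Hugo's default output directory.
    hugo
    # Mirror the output to the web root on the VPS; --delete removes
    # files that no longer exist locally.
    rsync -avz --delete public/ user@example.com:/var/www/html/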

The other day my friend Liam (the admin and owner of GamingOnLinux) messaged me that he had enabled a piece of web tracking software called Plausible for his site. Basically, it’s a traffic tracker that estimates the number of visits you’ve received, where those visits originated from, how long a visit lasts on average and so on. The cool thing about Plausible is that it’s A. open source and B. lightweight. And they make a pretty big thing about that second point, even talking about the CO2 emissions of transferring website content.

Now, I don’t really care about Plausible itself that much and I’m not planning on including it. I used to make a bunch of content on YouTube and developed a pretty toxic relationship with arbitrary numbers, so I’ve tried to minimize numbers in my online content creation as much as is reasonable. I know my website doesn’t receive much traffic, so tracking it would be unnecessary and would probably just make me self-conscious.

But the CO2 part got me interested. Plausible linked to https://www.websitecarbon.com/, which attempts to calculate the carbon efficiency of your website based on a URL. It’s basically a website speed/optimization test, kind of like Google PageSpeed Insights and similar tools, except it assigns a CO2 footprint to your site.

It turns out that while my website is indeed among the tiniest in terms of carbon footprint (the only really heavy thing about it is images, since everything else is text), there were still a few things I could improve.

Optimizing image size and loading

The biggest optimization was to make those images smaller. A modern approach would be to serve each image in multiple formats, like WebP, to capitalize on efficient compression algorithms, and in multiple resolutions, but I don’t have a way to convert them dynamically and I don’t want to store duplicate files on my server. So I did the next best thing and ran the JPEGs and PNGs through jpegoptim and optipng respectively.
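The commands are roughly the following (assuming the images live in Hugo’s static/ directory; the exact flags are a matter of taste):

    # Recompress PNGs losslessly; higher -o levels try more strategies
    # and take longer.
    find static -name '*.png' -exec optipng -o5 {} \;
    # Optimize JPEGs losslessly and strip metadata such as EXIF data.
    find static -name '*.jpg' -exec jpegoptim --strip-all {} \;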

I saved a fair bit of storage and transfer on that alone, since my images so far have been pretty unoptimized. Many of them have just been screenshots I took, cropped and then uploaded.

The second thing I decided to do was enable lazy image loading. Modern web browsers can check whether an image is currently visible and defer loading images that aren’t. All I had to do to take advantage of that was add the loading="lazy" attribute to all <img> elements.
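If your HTML comes out of templates, that’s a one-word change in the template. Just for illustration, a crude one-off pass over already-generated output could look like this (it naively assumes no <img> tag already carries the attribute, and that the output sits in Hugo’s default public/ directory):

    # Bolt loading="lazy" onto every <img> tag in the generated HTML.
    # Naive string match: rerunning it would duplicate the attribute,
    # so treat this as a sketch, not a build step.
    find public -name '*.html' -exec sed -i 's/<img /<img loading="lazy" /g' {} +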

Better cache expiry

The next optimization was discovered by running a few of those website speed tests. One mark I did particularly poorly on was cache expiry information. So, I enabled the expires module in Apache and added cache expiration headers to all content. I’m hoping that it doesn’t bite me in the ankle, and I might need to tune it down for certain types of content, but it made the website tests happy.
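On a Debian-style Apache install that looks roughly like this (the expiry times below are just example values, not a recommendation):

    # Enable mod_expires and reload Apache.
    sudo a2enmod expires
    sudo systemctl reload apache2

    # Then, in the vhost config or an .htaccess file:
    #   ExpiresActive On
    #   ExpiresDefault "access plus 1 week"
    #   ExpiresByType image/jpeg "access plus 1 month"
    #   ExpiresByType image/png "access plus 1 month"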

New and improved, ready to save the planet

So, now the website is a bit speedier. It’s probably not noticeable unless you’re on a particularly slow network connection and for some reason haven’t disabled automatic image loading. The only visible change is that you might now see images load in as you scroll pages that have them.

There are still some things I could technically do, such as using a CDN to cache content, but considering how often Cloudflare’s outages seem to take down half of the Internet, I don’t feel like it. And like I mentioned earlier, I could also serve alternative file formats for images to save bandwidth, but I’d either need to serve them alongside the normal JPEGs and PNGs or sacrifice compatibility with certain browsers. Serving scaled-down images would also bring savings, but I would once again either need to duplicate files or bring in a backend to handle it dynamically.
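To be clear, the conversion side of that duplicate-file approach would be trivial; it’s the extra copies and the fallback markup I’d rather avoid. Something like this would do the batch conversion (cwebp ships with libwebp; the quality value is an arbitrary example):

    # Create a WebP copy next to every PNG; -q sets the lossy quality.
    find static -name '*.png' | while read -r f; do
        cwebp -q 80 "$f" -o "${f%.png}.webp"
    done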

So, did I just single-handedly save our planet from climate change? Nah, not really. All this has basically been an academic exercise since for a website of this scale there’s barely any effect either way. Also, we’re starting to reach a point where individual action won’t achieve much and change would need to be systemic.

But hey, maybe you, the reader, happen to have a website, or maybe you’re planning to make one. And maybe it’s a bigger site than mine. And maybe you got inspired to make it more optimized. I’m not sure that’s going to save the planet, but it probably can’t hurt, and at least some of your users might end up a bit happier.

Oh, and to add to the bloat on this page, here’s a badge thingy:

I had to figure out how to get Hugo to allow me to add arbitrary HTML into my posts for that badge; hopefully it was worth it.
