The Stack Cycle

The cyclic nature of web development technologies is a chore to keep up with and sometimes just makes me laugh. When I started doing web work professionally back in the early 2000s, most dynamic websites ran an interpreted language like Perl or PHP. PHP has come a long way, but it wasn't very performant then. High-traffic sites with a lot of content faced the scaling issues that microservices and caching strategies would solve today. But at the time, virtual private servers weren't a thing. You had shared hosting, or you could colocate or lease a physical server in a datacenter. Scaling up was certainly possible, but it quickly got expensive: you couldn't scale in small increments, and nothing about it was automatic.

Online publications were really popular during this time, even regionally, because there wasn't nearly as much data aggregation. People actually went to their local newspaper's site instead of Google News. Even a moderately sized newspaper market would see significant traffic, and a lot of vendors were serving every request through an interpreted language, usually pulling content from a database. I don't know whether the following solution was more prevalent in the publication industry or whether it just seemed that way because it was all I knew at the time, but what content management vendors started doing was similar to what you see in the JAMStack world today.

The newspaper I worked for at the time signed on with a vendor doing static site generation. Of course, it wasn't called that yet. The product was called NewSys, and the company did pretty well selling what was essentially an SSG to newspapers across the country.

“The Newsys platform was maintained/developed by 2 people, and served over 400 newspaper websites as their primary CMS.” (source)

It was a fantastic solution at the time. It made a lot of sense, particularly for publications where you push all your content once a day and you're done. There weren't constant updates to hold users' attention all day long; it was one-and-done. That made a batch process of generating static pages a good fit. Generating the static files was slow by today's standards, but the end result was a fast-loading site that didn't require much horsepower to serve. Sounds familiar, eh? I remember laughing when a banner ad - the only dynamic content on the site - would prevent the page from loading during peak traffic. All that work just to get stiff-armed by a banner ad. But seriously, NewSys was ahead of its time, and whenever I see a static site generator or JAMStack, I still think of NewSys.
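
I never saw NewSys's internals, so purely to illustrate the pattern rather than their actual implementation: here's a minimal sketch in Go of that once-a-day batch job, pulling content and rendering it to flat HTML files. The Article fields, the template, and the output directory are all made up for the example.

```go
package main

import (
	"fmt"
	"html/template"
	"os"
	"path/filepath"
)

// Article stands in for a story pulled from the CMS database;
// the fields are hypothetical.
type Article struct {
	Slug, Headline, Body string
}

var page = template.Must(template.New("page").Parse(
	`<html><head><title>{{.Headline}}</title></head>
<body><h1>{{.Headline}}</h1><p>{{.Body}}</p></body></html>`))

func main() {
	// In the real system this would be a database query.
	articles := []Article{
		{"city-council", "Council Approves Budget", "The vote passed 5-2..."},
		{"prep-sports", "Local Team Advances", "A late rally sealed the win..."},
	}

	if err := os.MkdirAll("public", 0o755); err != nil {
		panic(err)
	}

	// The batch step: render every article to a static file once,
	// so the web server only ever hands out flat HTML.
	for _, a := range articles {
		out := filepath.Join("public", a.Slug+".html")
		f, err := os.Create(out)
		if err != nil {
			panic(err)
		}
		if err := page.Execute(f, a); err != nil {
			panic(err)
		}
		f.Close()
		fmt.Println("wrote", out)
	}
}
```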

The JAMStack movement brought static site generation back, but probably for different reasons. We now have robust server-side caching options and load-balancing reverse proxies to take the heat off the backend language and database, and yet we still see a lot of SSG love. Security and CI/CD are probably two big reasons for it, along with the prevalence of APIs and other tools that help static and dynamic content coexist (e.g. serverless functions). The expectation of instant response is higher than ever, with search algorithms punishing slow-loading sites. CDNs and SSGs go together like peanut butter and jelly. With modern tools it's much easier to decouple your frontend from your backend by generating the static parts at build time, which makes a lot of sense from both a security and a performance perspective.
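
As a rough illustration of the reverse-proxy half of that (not any particular product's behavior), here's a naive round-robin load balancer built from Go's standard library; the backend addresses are made up:

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
	"sync/atomic"
)

func main() {
	// Hypothetical app servers sitting behind the proxy.
	var backends []*httputil.ReverseProxy
	for _, raw := range []string{"http://10.0.0.1:8080", "http://10.0.0.2:8080"} {
		u, err := url.Parse(raw)
		if err != nil {
			log.Fatal(err)
		}
		backends = append(backends, httputil.NewSingleHostReverseProxy(u))
	}

	// Naive round-robin: rotate through the backends per request so no
	// single app server takes all the heat.
	var next uint64
	http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		i := atomic.AddUint64(&next, 1) % uint64(len(backends))
		backends[i].ServeHTTP(w, r)
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

In practice you'd reach for nginx, HAProxy, or a cloud load balancer, but the mechanics are the same: terminate the request up front and fan it out.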

I still find it a bit funny that we've come full circle back to favoring static content. In some ways it feels like we realized we'd stopped using HTTP the way it was intended, so we've gone back to basics a bit. Why were we rendering every byte of every page on the server when maybe 10% of the page is dynamic? I think that's the crux of how we ended up with SSGs like Hugo and tools like React, where we either massively reduce the overhead of rendering the HTML on the server or push that job off to the client. So while it is a little funny, and sometimes the topic of contentious debates about its worth, isn't the static site generation we see today a natural evolution toward a more optimized web? It doesn't seem forced, and people aren't using it just to do things differently; they're doing it because it solves problems and makes sense.

So what will the next evolution of the stack cycle be? Will we push the onus further onto the client's browser? Browsers today are effectively virtual machines for web applications. They have local storage and background services like notifications. WebAssembly is literally a VM for web apps. I get the sense that we'll continue down this path as devices and the browsers within them become more capable of handling the load. People are also on the go more than ever before, and doing things in the browser is more resilient to lapses in connectivity. Another trend I expect to keep gaining ground, one that pairs really well with static site generation, is compiled static binaries from languages like Go and Rust, at least until WASM gains more traction. In a cloud-dominated industry it makes sense to compile distro-agnostic binaries. They can be stuffed into a minimal container and paired with static content, creating an easily scaled application with few dependencies. This is the path I've been going down as a developer, and I very much prefer it.
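
To make that concrete, here's a minimal sketch of the binary-plus-static-content pattern using Go's standard embed package (Go 1.16+); the public directory is assumed to be your SSG's output:

```go
package main

import (
	"embed"
	"io/fs"
	"log"
	"net/http"
)

// Bake the SSG output into the binary at compile time, so the
// container needs nothing but this one executable.
//
//go:embed public
var public embed.FS

func main() {
	// Strip the "public/" prefix so that / maps to public/index.html.
	site, err := fs.Sub(public, "public")
	if err != nil {
		log.Fatal(err)
	}

	http.Handle("/", http.FileServer(http.FS(site)))
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

Build it with CGO_ENABLED=0 go build and you get a self-contained static binary that will happily run in a scratch container.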
