When I was working my first “real” job in web development in the mid-2000s, the rules were hammered into us: build tiny websites (no more than 100KB per page), use JavaScript only for special effects, and make sure everything, from images to Flash content, had a fallback so that JavaScript features progressively enhanced the experience. If there was no JavaScript, the site still worked 100%, just not as fancy.

The reason for this advice was simple: in the early days of the web, everyone was limited to a dial-up Internet connection, which was really slow. Shoot, it took a few minutes just to connect to the Internet, let alone access a website. You’d hit the “Connect” button and go make some coffee or grab a smoke because it was literally going to take that long. So, in that sense, it’s perfectly reasonable that the key principles of building a good website in the mid-2000s were keeping the site small and providing solid fallbacks for content.

I miss those days a lot. In my opinion, those constraints made the web better. But as time wore on and service provider technology in the U.S. improved with broadband (and eventually fiber), those constraints were no longer viewed as an issue.

Today, the rules of web development are completely different. It’s more about the developer experience and less about the user experience: build processes, choosing a framework and tech stack, and figuring out where the site lands in Google search results. Sadly, gatekeeping (i.e. you’re not a real “x” if you don’t “y”) and framework battles have replaced the “how to make this cool thing without JavaScript” conversations. I really don’t pay attention to this stuff because, at the end of the day, it all renders down to HTML, CSS, and JavaScript in the browser; use whatever works for you.

[Image: a Simpsons-style newspaper clipping with the headline “Old Woman Yells At Cloud,” picturing a wild-haired woman raising a fist while holding two nervous-looking cats.]

Source: @aexm

What does bother me, though, is that huge sites requiring JavaScript just to render have become the accepted norm. Current stats show websites weighing in at an average of (big gulp) 2MB per page?! And if you do not have JavaScript enabled, be prepared for the blizzard (my name for sites that display nothing but a white screen).

JavaScript has become the third pillar of the web. I know it’s super useful, so why am I picking on it? Because a lot of sites can’t even load if the JavaScript isn’t there. Since when did rendering HTML and CSS depend on JavaScript?

I recently moved to a rural area on the outskirts of a major city, and I’m reminded of those early web dev days because, once again, I have a horrible Internet connection on my mobile device. The broadband connection is okay, but when I’m outside or the power goes out, I’m left with the same slow experience of the dial-up days. When I’m browsing the web on my phone, I live for Reader Mode, and I have to turn off things like JavaScript (and, with it, most images, compliments of JavaScript-based lazy loading) because there is simply too much to download and run. But just by switching off JavaScript, a lot of sites won’t load at all. That white screen of death isn’t my phone dying; it’s my Internet access. And it appears on site after site, hence the blizzard.

[Image: six mobile screenshots in a three-by-two grid, each either blank or showing a notice asking the user to enable JavaScript to view the website.]

My all-too-familiar experience browsing the web on a mobile device.

For the most part, I can get by using the WiFi in the house for necessary Internet things like working, shopping, and paying bills. But late this summer, my electricity went out, so I went to the electric company’s website on my phone to see when service might come back on; the entire house runs on electricity, and I needed to know whether to make arrangements for things like food and water. But the electric company’s mobile website is huge (3MB transferred and 8.6MB in resources, ouch) and wouldn’t load, even with JavaScript enabled.

Angry, I went to Twitter and posted my outrage.

I got some pretty awesome responses showing me that some sites actually do handle constraints really well, like traintimes.org.uk and the text-only version of NPR. YES! Why doesn’t the electric company provide a text-only version of its outage page, so that when people actually need it, they can use it even when conditions are at their worst?

My power-outage scenario is a one-off type of situation, but there are people across the U.S. and the entire world who live with unreliable Internet because no other option is available to them. NPR published an article covering the struggles of trying to teach students during the pandemic in communities with unreliable Internet, and it’s hard to believe that this is the current state of Internet availability in any part of the U.S. But the fact remains that it is.

It’s clear a lot of people would benefit from the constraints of the early days of web development. The beauty of websites built before broadband networks were available (like the infamous Space Jam site from 1996) is that they can load on almost any device, from anywhere, under almost any circumstances, because they were built with those constraints in mind. Hopefully, developers will start employing adaptive loading strategies that provide solutions for users on slow networks or low-end devices.
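Adaptive loading can start as simply as checking the browser’s Network Information API (`navigator.connection`, which is widely but not universally supported) and the user’s Save-Data preference before pulling in heavy, non-essential assets. A minimal sketch, where the decision logic is factored into a plain function and the `enhancements.js` module name is purely hypothetical:

```javascript
// Decide whether heavy, non-essential assets are worth loading.
// `conn` mirrors the shape of navigator.connection (Network Information API).
function shouldLoadHeavyAssets(conn) {
  if (!conn) return true;                    // API unsupported: default to loading
  if (conn.saveData) return false;           // user explicitly asked for less data
  const slow = ["slow-2g", "2g"];
  return !slow.includes(conn.effectiveType); // skip extras on 2G-class links
}

// In a browser, this would gate the optional bundle, e.g.:
// if (shouldLoadHeavyAssets(navigator.connection)) {
//   import("./enhancements.js"); // hypothetical non-essential extras
// }

console.log(shouldLoadHeavyAssets({ effectiveType: "4g", saveData: false })); // true
console.log(shouldLoadHeavyAssets({ effectiveType: "2g", saveData: false })); // false
console.log(shouldLoadHeavyAssets({ effectiveType: "4g", saveData: true }));  // false
```

The important design choice is the default: when the API is missing or the connection is fast, the page gets the fancy version, but the baseline HTML and CSS never depend on any of this running at all.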
