figure out what elements need to be delivered first, and what parts can follow a bit later with a lag time users won’t notice.
Caching involves storing the data that appears first on a website so that it can be served again right away. A news website like The New York Times, for example, might cache everything "above the fold": all the stories and links that appear in your browser when you first visit the site, before you start scrolling. While the lower parts of the page are still loading, the user never feels those crucial few seconds of lag that can determine whether they stay on your site or go somewhere else.
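The idea can be sketched in a few lines. The snippet below is a minimal, illustrative in-memory cache with a time-to-live; the cache key `"above-the-fold"` and the `render_fn` callback are hypothetical stand-ins for however your site actually renders that top-of-page fragment (real sites typically use a dedicated cache like Redis or a reverse proxy such as Varnish).

```python
import time

class TTLCache:
    """Minimal in-memory cache: serve stored fragments until they expire."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() > expires_at:  # stale: evict and report a miss
            del self._store[key]
            return None
        return value

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

def render_above_the_fold(cache, render_fn):
    """Serve the cached fragment if present; otherwise render it once and cache it."""
    html = cache.get("above-the-fold")
    if html is None:
        html = render_fn()  # the slow path: database queries, templating, etc.
        cache.set("above-the-fold", html)
    return html
```

The second visitor to arrive within the TTL window gets the stored fragment immediately, without touching the slow rendering path at all.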
Load balancing traffic across multiple servers is another vital tool. The load balancing process involves distributing incoming website requests from users across multiple servers so that the backend of your website (where all the data is coming from) doesn't get overloaded and slow down the frontend of your site (the homepage, user interface elements, and other site pages people are actually clicking on). Load balancing is a dynamic process: the balancer routes each request based on current traffic and on how busy each server is, so no single machine becomes a bottleneck. To come back to the highway analogy, a load balancer might take the form of a traffic cop directing cars into different lanes to keep traffic moving.
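The simplest distribution strategy, round-robin, can be sketched as below. This is an illustrative toy, not a production balancer (real deployments use dedicated software such as NGINX or HAProxy), and the server names are hypothetical.

```python
import itertools

class RoundRobinBalancer:
    """Toy load balancer: hand each incoming request to the next server in rotation."""

    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)  # endless rotation over the pool

    def route(self, request):
        # Every request goes to the next server in line, spreading load evenly.
        return next(self._cycle)

balancer = RoundRobinBalancer(["app-1", "app-2", "app-3"])
```

Six consecutive requests would be routed `app-1, app-2, app-3, app-1, app-2, app-3`; smarter strategies (least-connections, weighted) pick the server with the most spare capacity instead of strictly rotating.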
The last important tool to keep in mind is what’s called a content delivery network (CDN). CDNs are especially important for a website that regularly receives visitors from across the country or around the world. The way they work is that instead of having your website supported by only a single data center in New York or San Francisco, for example, your site will have data storage stations all along the internet highway to help send data back and forth depending on where users are in the world. A CDN is made up of a distributed network of data centers and proxy servers (computers that serve as intermediaries relaying data) to ensure that content loads quickly no matter where you are.
CDNs aren’t easy to build out (just ask Netflix), but once that network is in place, your site has the global infrastructure to weather fluctuating internet speeds. Even on a web without net neutrality, a CDN ensures there’s always a nearby data center or server, so data doesn’t have to travel far along that traffic-jammed highway.
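The core routing decision a CDN makes, sending each user to the edge location that can answer them fastest, can be sketched with a toy latency table. The edge locations, regions, and millisecond figures below are invented for illustration; real CDNs decide this with DNS-based geolocation or anycast routing rather than a hard-coded table.

```python
# Hypothetical edge nodes with rough round-trip latencies (ms) to a few user regions.
EDGE_LATENCY_MS = {
    "nyc":       {"us-east": 10, "us-west": 70,  "europe": 80},
    "sf":        {"us-east": 70, "us-west": 10,  "europe": 150},
    "frankfurt": {"us-east": 90, "us-west": 150, "europe": 10},
}

def pick_edge(user_region):
    """Route the user to whichever edge node answers fastest for their region."""
    return min(EDGE_LATENCY_MS, key=lambda edge: EDGE_LATENCY_MS[edge][user_region])
```

A visitor in Europe is served from the Frankfurt node instead of crossing the Atlantic, which is exactly the "nearby data center" guarantee described above.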
One last caveat to keep in mind is encryption. Whether you’re talking about caching, load balancing, or CDNs, if you’re encrypting traffic to protect customer data and comply with GDPR (the European Union’s strict personal data privacy regulation), it’s particularly important to keep a persistent encrypted connection open. Protecting customer data is a vital aspect of any website, but there’s no need for it to slow down performance. Keeping a persistent encrypted channel open ensures that when a user does need to enter sensitive personally identifiable information (PII), or requests it from your site, you won’t have to pay for a fresh handshake or divert the interaction from an unencrypted channel to an encrypted one. That helps reduce the all-important lag and loading time across your website.
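The cost being avoided is the TLS handshake, the extra round trips spent negotiating encryption before any data flows. The toy model below makes that visible by counting handshakes; it is purely illustrative (in practice this is what HTTP keep-alive, connection pooling, and TLS session resumption do for you, e.g. reusing one `requests.Session` in Python rather than opening a new connection per request).

```python
class TLSChannel:
    """Toy model: opening a channel costs one handshake; reusing it costs none."""

    handshakes = 0  # global tally of expensive handshakes performed

    def __init__(self):
        TLSChannel.handshakes += 1  # the costly negotiation round trips happen here
        self.open = True

    def request(self, path):
        return f"encrypted response for {path}"

def fetch_with_new_channels(paths):
    """Anti-pattern: a fresh encrypted channel (and handshake) per request."""
    return [TLSChannel().request(p) for p in paths]

def fetch_with_persistent_channel(paths):
    """One handshake up front, then the channel is kept open and reused."""
    channel = TLSChannel()
    return [channel.request(p) for p in paths]
```

For three requests, the first approach performs three handshakes while the second performs only one, which is the lag savings the paragraph above describes.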
These practices are widely used across the web. But in a world without net neutrality, application performance matters more than it ever has.
To come back to the six-lane highway analogy, your website should deliver traffic not as a car or a truck but like a motorcycle: lighter weight, weaving between lanes, arriving faster. With smaller payloads and CDN stops all along the way, you don’t need a bulky vehicle to lug all that cargo for the entire journey. If your app isn’t built on a scalable framework that can cache content, balance load across servers, and dynamically allocate bandwidth and storage as needed, you may end up like an 18-wheeler stuck in a single lane of endless traffic.