Why is the web performance of internal systems important, and why does it need optimizing?

One day I had an interesting conversation with the support team at the Department for International Trade (DIT). They wanted to improve the performance of one of their web applications. Conversations like this are what I enjoy most in my current position: I get to talk about topics that interest me, meet new people, and tell them about opportunities they may not be aware of, such as ways to improve the UX of an application.



To be honest, I was a little disappointed when I was told that this service is for internal use only and is not publicly available. It meant that my usual go-to tools for evaluating web performance could not be used.





This gave me an idea: how do you test internal services and products? In this article I will try to find out.



Why internal systems need to be fast



As we all know, the world has changed significantly because of the global pandemic. With widespread lockdowns, everyone who could began working from home. This has had a significant impact on Internet performance worldwide, resulting in higher traffic and increased load times. Because of this, all the tools employees need to get their jobs done must load quickly and be interactive.



Previously, there were no such performance issues, because internal users worked from the office over a local area network (LAN). The problem is compounded by the fact that many employees live in small towns with slow and unstable Internet connections, or, conversely, in cities with a high contention ratio (many households sharing the same bandwidth). And since we are talking about internal systems, employees usually need a VPN to reach the office network, which brings its own inconvenience: a VPN often reduces Internet speed.



It's important to remember that employees are users too. So make sure you optimize your internal systems as well: poor web application performance directly affects employees' work. But what tools can you use if internal systems are private? That is what the rest of this article is about.



WebPageTest



The first thing to clarify: when testing internal systems, you will not be able to use the public instance of WebPageTest. But you can set up a private instance and use it on your internal network.







Once it is up and running, you will have access to a huge amount of data about all of your internal systems. Be sure to read the documentation to get the most out of WebPageTest. Setting up your own instance is not as difficult as it sounds; there are some great tutorials on how to set it up on AWS in a couple of minutes:





A private instance also lets you run WebPageTest automatically. This can be done in several ways:





This way, you can track the performance impact (positive or negative) of new system changes.
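As a sketch of that automation, a private instance exposes the same API as the public one, so you can drive it from a CI script. This example assumes the `webpagetest` npm package is installed; the instance URL and the page under test are placeholders:

```javascript
// Sketch: kicking off a test against a private WebPageTest instance.
// Assumes the 'webpagetest' npm package; both URLs are placeholders.
const WebPageTest = require('webpagetest');

const wpt = new WebPageTest('https://wpt.internal.example.com');

wpt.runTest(
  'https://intranet.example.com/dashboard',
  { connectivity: 'Cable', runs: 3, pollResults: 5 },
  (err, result) => {
    if (err) throw err;
    // Median first-view metrics for the test run
    const median = result.data.median.firstView;
    console.log(`TTFB: ${median.TTFB} ms, Speed Index: ${median.SpeedIndex}`);
  }
);
```

Running a script like this on every deployment is one way to spot regressions as soon as they land.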



Lighthouse



The next go-to tool you probably already have is Google's Lighthouse. If you have Google Chrome installed on your computer, using Lighthouse to audit your internal system is very simple:







Open DevTools, switch to the "Lighthouse" tab, and click "Generate report".







About a minute after launching it, the audit will return results like those shown in the picture above. I highly recommend setting up a separate Chrome profile specifically for Lighthouse testing, as browser extensions can negatively impact performance (depending on what exactly they do on the page).



But Lighthouse isn't just an audit dashboard. Here are some other ways you can use this tool:



  • Run Lighthouse from the command line interface (CLI).
  • Easily run Lighthouse on all pages of your site.
  • Compare before-and-after performance with Lighthouse CI Diff.
  • Automatically run Lighthouse at regular intervals across multiple sites.
  • Add your own custom audits to monitor specific parts of the site.
  • Share results via a GitHub Gist and the Lighthouse Report Viewer.
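To sketch the scripted option, Lighthouse can also be driven from Node. This assumes the `lighthouse` and `chrome-launcher` npm packages (recent Lighthouse releases are ESM-only, so you may need `import` instead of `require`); the internal URL is a placeholder:

```javascript
// Sketch: running a Lighthouse audit programmatically.
// Assumes the 'lighthouse' and 'chrome-launcher' npm packages;
// the internal URL is a placeholder.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function audit(url) {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, { port: chrome.port, output: 'html' });
  await chrome.kill();
  // The performance category score is reported on a 0-1 scale
  console.log(`Performance: ${result.lhr.categories.performance.score * 100}`);
  return result.report; // the HTML report as a string
}

audit('https://intranet.example.com/');
```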



Sitespeed.io



I put Sitespeed.io near the top of this list because I find it underrated by many. It is a wonderful set of tools for improving website performance, and it can be set up quickly and easily with a simple Docker command (`docker run sitespeedio/sitespeed.io <url>`) or via npm (`npm install -g sitespeed.io`). Sitespeed.io is also easy to run on your local machine for quick testing.







With Sitespeed.io, you can continuously track as many pages as you need by sending data to Graphite/Grafana dashboards. At its core, Sitespeed.io is made up of the following tools:



  • Coach is an automated tool that gives you advice on how to speed up page loading.
  • Browsertime is Sitespeed.io's main tool. It drives the test browsers (e.g. Chrome, Firefox, Safari on iOS) and collects performance metrics, images, videos, and more.
  • PageXray is used to convert HTTP Archive (HAR) files to JSON for easier reading and use. (Read more about HAR files below.)
  • Throttle is a command line tool for throttling the network connection during performance testing.


Note: Throttle slows down the Internet connection of the entire computer, so remember to turn it off when you finish testing.



  • Compare is an online tool for quickly and easily comparing HAR files (e.g. before and after a change).


DevTools in Chrome



All modern browsers have built-in developer tools, and we have come a long way since Firebug in Firefox. Most importantly, DevTools become more powerful with each new browser version. This is convenient for developers and users alike, because it should mean fewer bugs on sites, right? Ha!







The image above shows the detailed information a web page performance audit can provide (the Performance tab). But beyond the performance tab, Chrome DevTools has many other features:





And if you're looking for non-Google articles on this topic, check out these:





DevTools in Firefox



Other browsers can also help you track down performance issues with their DevTools. I am a Firefox user, so I regularly use that browser's tools. Firefox also has a full set of tabs you can use to find problems on your site:







You can use these tools to:





Libraries for extending analytics data



This may not work for all internal tools, but if you already track usage with analytics (such as Google Analytics, Fathom, or Matomo), you can extend the collected data to include more complete web performance information.







Several libraries you can use:





Perfume.js stands out among these tools for the amount of Real User Monitoring (RUM) data it collects. It can be fully customized to gather as much as you need. Here are a couple of guides on how to do that:
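As a minimal sketch of that kind of customization, assuming the `perfume.js` npm package is installed (`sendToAnalytics` is a hypothetical function standing in for your analytics library):

```javascript
// Sketch: forwarding Perfume.js metrics to an analytics backend.
// Assumes the 'perfume.js' npm package; 'sendToAnalytics' is a
// hypothetical stand-in for your own analytics call.
import Perfume from 'perfume.js';

new Perfume({
  analyticsTracker: ({ metricName, data }) => {
    // metricName takes values such as 'fcp', 'lcp' or 'cls'
    sendToAnalytics(metricName, data);
  },
});
```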





JavaScript analysis



This article does not cover JavaScript in detail, although I believe that minimizing its use improves web performance and overall stability. But if you do use JavaScript, try to optimize it as much as possible.







Fortunately, there are many tools for this:



  • bundle-wizard - a CLI tool that builds a visualization of a JavaScript bundle (see image above) so that you can remove anything you don't need.
  • Bundle Phobia - find out the cost of adding an npm package to your bundle.
  • Webpack Bundle Analyzer - visualize the contents of webpack output files with an interactive zoomable treemap.
  • source-map-explorer - use source maps to analyze JavaScript bloat (this tool also works with Sass and LESS for analyzing CSS).


There are also framework-specific analysis tools and articles available:



  • reactopt - a CLI performance optimization tool for React that detects unnecessary page re-rendering.
  • TracerBench - a controlled performance testing tool for web applications that provides clear, actionable, and usable analysis of performance differences.
  • React Performance App - DebugBear


CSS analysis



Besides the Coverage tab in Chrome DevTools mentioned earlier, there are also tools you can run from the command line interface (CLI). They analyze your CSS for complexity and identify unused selectors across an entire website:



  • analyze-css is a CSS selector complexity and performance analyzer that runs from the command line interface.
  • uCSS crawls an entire site looking for unused CSS selectors, which can then be removed.


Server performance measurement



The golden rule of performance states that 80-90% of end-user response time is spent on the front-end.



Even so, it's a good idea to make sure the back-end/server is optimized too. After all, "Time to First Byte matters".







It is also important to make sure the server can keep running under heavy load, should it ever occur. There are a number of tools that can help you with both of these tasks:



  • httpstat is a small Python script that visualizes the connection timing data returned by curl (see image above).
  • h2load is a benchmarking tool for HTTP/2 and HTTP/1.1, run from the command line interface.
  • Hey is an HTTP load generator for stress-testing an endpoint.
  • k6 is a load testing tool whose test scripts are written in JavaScript; it provides both an API and a CLI.
  • Server-Timing is a response header that lets your server report back-end timing metrics (for example, database or cache timings) to the browser, where they can be inspected in DevTools.
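To illustrate the load-testing side, a minimal k6 script looks like this (run with `k6 run script.js`; the URL and the user/duration settings are placeholders to adjust for your own system):

```javascript
// Sketch: a minimal k6 load test. k6 scripts are written in JavaScript
// but run inside the k6 runtime, not Node. The URL is a placeholder.
import http from 'k6/http';
import { check, sleep } from 'k6';

// 10 virtual users hammering the endpoint for 30 seconds
export const options = { vus: 10, duration: '30s' };

export default function () {
  const res = http.get('https://intranet.example.com/');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1);
}
```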


Puppeteer



Puppeteer is a Node library that provides a high-level API for controlling Chrome or Chromium via the DevTools Protocol. Most things you do manually in a browser can be automated with Puppeteer. How can this be used for web performance testing? Addy Osmani wrote a blog post about using Puppeteer for web performance testing and also shared the code on GitHub. These tests can easily be run from the CLI against both internal and external websites:
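For instance, here is a small sketch that loads a page and pulls out Navigation Timing data (assumes the `puppeteer` npm package is installed; the internal URL is a placeholder):

```javascript
// Sketch: collecting basic load metrics with Puppeteer.
// Assumes the 'puppeteer' npm package; the URL is a placeholder.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://intranet.example.com/', { waitUntil: 'networkidle0' });

  // Read the Navigation Timing entry from inside the page
  const nav = await page.evaluate(() => {
    const [entry] = performance.getEntriesByType('navigation');
    return { ttfb: entry.responseStart, load: entry.loadEventEnd };
  });

  console.log(`TTFB: ${nav.ttfb} ms, load event: ${nav.load} ms`);
  await browser.close();
})();
```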





Browser extensions



There are many browser extensions that can be used to measure web performance. I would recommend using a separate profile with a minimum of extensions enabled when you run them, because some extensions interact with the page and can degrade performance, leading to inaccurate results. You can try the following extensions:



  • sloth is an extension that throttles the CPU and network in the browser, making it easy to simulate page performance on slower devices.
  • Perfmap - with this extension, the browser draws a heatmap of the resources loaded on the page and their individual performance impact, based on the Resource Timing API.
  • Web Vitals Chrome Extension is a Chrome extension that displays the Core Web Vitals metrics (LCP, CLS, FID) for any page you visit. Note: this functionality will soon be built into Chrome DevTools and is already available in Canary.
  • perf-diagnostics.css is not really a browser extension, but rather a set of CSS snippets you can add to your page to highlight common performance issues: a simple and effective way to find images without width/height attributes, among other things.


There are also extensions designed specifically to improve performance when using certain JavaScript frameworks:







Network Throttling



Network throttling is a way of deliberately slowing down your network connection. It is important to understand that many users will not have the fast, stable broadband connection of a big city; others may be in rural areas with poor broadband and very weak mobile signal strength. By throttling your own network connection, you gain insight into the site's performance for users in those network conditions.



You might ask: "Why would I throttle my network with an external tool when throttling is built into Chrome DevTools?" It is important to understand that not all throttling methods work the same way. Chrome DevTools throttling applies a browser-level delay to each response. Lighthouse runs its test at full speed and then simulates the connection speed, sacrificing accuracy for quicker results. The tools below are much more accurate: they throttle the network at the operating-system level, much lower in the stack.



Note: the tools below throttle the connection for the entire computer, so close unnecessary applications while testing, and make sure you disable throttling when you are done!





I always use throttle because it's pretty simple:



# Enable 3G Slow
throttle 3gslow

# Use a custom connection profile
throttle --up 1600 --down 780 --rtt 250

# Disable throttle
throttle --stop
      
      





Analyzing HAR files



I mentioned HTTP Archive (HAR) files earlier in this article. These files record a web browser's network interaction with a website. The great thing about them is that you can create one for any site you can reach in a browser, internal or external. They are easy to export from the Firefox and Chrome developer tools:



Firefox





Chrome





There are also other tools you can use to view and analyze HAR files:
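Since a HAR file is just JSON, you can also pull quick numbers out of one with a few lines of plain Node, no extra tooling required. The `sample` object below is a tiny hand-made stand-in for a real recording, which you would load with `JSON.parse(fs.readFileSync('recording.har', 'utf8'))`:

```javascript
// Sketch: summarising a HAR recording with plain Node, no dependencies.
// 'sample' is a minimal stand-in for a real exported HAR file.
const sample = {
  log: {
    entries: [
      { request: { url: 'https://a.example/app.js' },
        time: 120,
        response: { content: { size: 50000 } } },
      { request: { url: 'https://a.example/style.css' },
        time: 40,
        response: { content: { size: 10000 } } },
    ],
  },
};

// Count requests, total up response bytes, and find the slowest entry
function summariseHar(har) {
  const entries = har.log.entries;
  const totalBytes = entries.reduce((sum, e) => sum + e.response.content.size, 0);
  const slowest = entries.reduce((a, b) => (a.time > b.time ? a : b));
  return { requests: entries.length, totalBytes, slowestUrl: slowest.request.url };
}

console.log(summariseHar(sample));
// → { requests: 2, totalBytes: 60000, slowestUrl: 'https://a.example/app.js' }
```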





Web APIs



Back to the browser's native capabilities: there are several APIs you can use to measure site performance yourself, without any libraries.



  • performance.now() - This method of the browser's Performance interface returns a high-resolution timestamp measured from the time the method was called, which makes it very easy to measure the time between two calls. Add one before and one after a specific piece of code to measure and optimize it.
  • Navigation Timing - This API allows developers to collect timing data associated with document navigation .
  • Resource Timing - This API allows developers to collect complete timing information for the resources loaded by the document.
  • Assessing Loading Performance in Real Life with Navigation and Resource Timing - Jeremy Wagner - A very detailed article on how the above two APIs can be used to measure web page loading performance.
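As a sketch of the performance.now() approach, where `busyWork()` is just a stand-in for whatever code you actually want to measure:

```javascript
// Sketch: timing a piece of code with performance.now().
// Works in the browser; also runs in Node.js (v16+), where
// 'performance' is available as a global. busyWork() is a stand-in
// for the code you actually want to measure.
function busyWork() {
  let total = 0;
  for (let i = 0; i < 1e6; i++) total += Math.sqrt(i);
  return total;
}

const start = performance.now();
busyWork();
const elapsed = performance.now() - start;
console.log(`busyWork took ${elapsed.toFixed(2)} ms`);
```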


Conclusion



Hopefully, among the tools listed in this article, you will find some that help you improve your internal systems. Just because a service or website is only used by employees inside the company does not mean it should not be optimized. Even though many people work remotely these days, not everyone has a fast and stable connection. So remember: you and your colleagues are users too!


