For Page Speed Work, Start by Testing the XML Sitemap
Page speed plays a critical role in the overall health and efficiency of any website. From a conversion standpoint, users expect fast-loading pages, and a delay of just a few seconds can significantly increase exit rates. Slow-loading pages frustrate users and increase the chance that they abandon the site before completing a desired action. A study from Amazon highlights the importance of fast load times: they found that for every 100ms of added load time, they lost 1% of sales.
In addition to impacting conversion rates, page speed plays a role in how Google evaluates page experience. Search engines like Google prioritize user experience, and as a general rule, faster-loading pages are rewarded with higher search rankings. Pages that load quickly not only provide a better user experience but also contribute to improved crawlability and indexation, which ultimately leads to better visibility and higher organic traffic.
Web Marketers Can Only Do So Much
A significant number of factors can impact page speed, spanning the infrastructure layer (i.e. the server), the application layer (i.e. the CMS), and the content layer (i.e. text, media, and front-end code). As a general rule, web marketers are limited to the content layer, where they can optimize areas such as:
Rich media including images, animations, and videos
Third-party scripts
Front-end code like JavaScript and CSS
Before web marketing teams set out to improve these areas, I think it’s a good idea to first establish a baseline to better understand where to focus efforts.
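As a starting point, a quick audit can show how much content-layer work there actually is on a page. Here’s a minimal sketch in TypeScript you could run (transpiled) from the browser console; the heuristics it checks, lazy loading on images and `defer`/`async` on scripts, are my own assumptions, not an exhaustive checklist:

```ts
// Minimal audit sketch: flag common content-layer issues on the current page.
// The checks below are rough heuristics, not a complete performance audit.

// Images that aren't lazy-loaded or lack explicit dimensions are common
// content-layer offenders.
const heavyImages = [...document.querySelectorAll<HTMLImageElement>("img")].filter(
  (img) =>
    img.loading !== "lazy" ||
    !img.getAttribute("width") ||
    !img.getAttribute("height")
);

// Synchronous external scripts block HTML parsing; third-party ones are prime
// candidates for removal or for a `defer`/`async` attribute. Module scripts
// are deferred by default, so they're excluded.
const blockingScripts = [...document.querySelectorAll<HTMLScriptElement>("script[src]")].filter(
  (s) => !s.defer && !s.async && s.type !== "module"
);

console.table(heavyImages.map((img) => ({ src: img.src })));
console.table(blockingScripts.map((s) => ({ src: s.src })));
```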
Understanding the Improvement Delta
When I was at Targus, we ran page speed tests on our ecommerce site and found that while the results weren’t awful, our pages were slower than they should have been. We kicked off projects to optimize our images and to identify and remove third-party script calls we were no longer using, and I asked our front-end developer to improve how we were handling JS and CSS code. It took a few months, but once we completed our work, we were excited to run page speed tests and bask in the glory of faster-loading pages. While we did see performance improvements, they weren’t nearly to the degree we had expected.

How could this be? To see what was going on, we published a blank page (nav/footer and nothing else) and ran a speed test on it. It loaded faster than the pages we had optimized, but not by a significant amount. It turned out that the majority of our slow page speeds were the result of issues at the infrastructure and application layers, where inefficiencies and unnecessary processes were running. Had we established a page speed baseline first, we would have seen that the delta between that baseline and the pages we intended to improve was too small to justify all that optimization work.
Establishing Your Page Speed Baseline
Before I embark on any work to improve page speeds, I follow two basic rules. The first is to pick one page speed tool and use it for all of the pages you intend to optimize. I’ll use PageSpeed Insights, GTmetrix, Pingdom, or any other tool, but I stick to a single one so the same method for measuring speed is applied across all of my pages.
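One way to keep the tool consistent is to run every URL through it programmatically. The sketch below assumes Google’s public PageSpeed Insights v5 API and the Lighthouse performance score field it currently returns; check the API docs before depending on either:

```ts
// Minimal sketch (Node 18+ for global fetch). Assumes the public PageSpeed
// Insights v5 endpoint; an API key is optional for light usage.
const PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed";

async function performanceScore(pageUrl: string): Promise<number> {
  const url = `${PSI_ENDPOINT}?url=${encodeURIComponent(pageUrl)}&strategy=mobile`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`PSI request failed: ${res.status}`);
  const data = await res.json();
  // Lighthouse reports the performance category score on a 0-1 scale.
  return data.lighthouseResult.categories.performance.score as number;
}

// Usage (hypothetical URL): const score = await performanceScore("https://example.com/");
```

Because every page goes through the same function, the measurement method stays identical across the whole set, which is the entire point of rule one.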
The second rule is to figure out my page speed baseline and then compare it to the pages I plan to optimize. You can get your baseline by running speed tests on pages such as:
XML Sitemap (has no images, no third-party scripts, and no JS/CSS)
Privacy Policy (has no images but does have third-party scripts and JS/CSS)
Or just a blank page (same profile as the privacy policy)
See what those page speeds look like, then compare them to core website pages like the homepage, a product page, and so on. If there’s a significant difference, you can probably go ahead and get started on your speed optimization project. If there isn’t, you might want to reach out to your operations and engineering teams and have them take a look first.
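To make the comparison concrete, here’s a rough sketch that times raw server responses for baseline pages against core pages. It only captures network and server time, not rendering, but that is exactly the layer the baseline is meant to expose. All URLs are hypothetical placeholders:

```ts
// Minimal sketch (Node 18+, run as an ES module for top-level await).
// Measures raw response time only: a proxy for infrastructure/application
// cost, not a full render-speed test.
const baselinePages = [
  "https://example.com/sitemap.xml",
  "https://example.com/privacy-policy",
];
const corePages = [
  "https://example.com/",
  "https://example.com/products/sample-product",
];

async function responseMs(url: string): Promise<number> {
  const start = performance.now();
  const res = await fetch(url);
  await res.arrayBuffer(); // drain the body so the full download is timed
  return performance.now() - start;
}

async function averageMs(urls: string[]): Promise<number> {
  const times = await Promise.all(urls.map(responseMs));
  return times.reduce((sum, t) => sum + t, 0) / times.length;
}

const baseline = await averageMs(baselinePages);
const core = await averageMs(corePages);
console.log(
  `baseline: ${baseline.toFixed(0)}ms, core: ${core.toFixed(0)}ms, ` +
  `delta: ${(core - baseline).toFixed(0)}ms`
);
```

A fuller version would run each URL through the same speed tool (as in the earlier sketch) rather than raw fetch timing, but even this rough delta tells you whether content-layer work is likely to pay off.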