Earlier this year the Chrome team announced plans to support lazy loading natively in the browser. The plan was to add a loading attribute to both <img> and <iframe> elements. Chrome 75 included it behind a feature flag so that developers could test it out. Last week, with the release of Chrome 76, this feature became generally available. I was surprised to see that it is already in use on more than 1,000 sites. Lazy loading is an easy web performance win, so you may want to try it out on your sites.
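Opting in is a one-attribute change. A minimal sketch (the URLs and dimensions here are placeholders, not from any specific site):

```html
<!-- Defer offscreen images until the user scrolls near them -->
<img src="photo.jpg" loading="lazy" alt="A photo" width="400" height="300">

<!-- The same attribute works on iframes -->
<iframe src="https://example.com/embed" loading="lazy"></iframe>
```

Including explicit width and height helps the browser reserve space before the deferred image arrives, avoiding layout shifts.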
When we talk about the cacheability of web content, the discussion often centers on content that site operators control (i.e., first-party content). But what about third-party content? How much of that is cacheable? I was chatting with @yoav about this on Friday, since it could be useful for understanding the benefits of signed exchanges in accelerating third-party content. Is it worth delivering cross-origin resources on a site’s HTTP/2 connection, avoiding the need to establish a new connection and eliminating bandwidth contention between third-party resources and first-party ones? To answer that, we need to understand how many third-party resources are delivered without credentials, and therefore can be signed. We will use a resource’s public cacheability as a proxy for that, and try to understand how common such third-party resources are.
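The "publicly cacheable" proxy can be approximated from a response's Cache-Control header. Here is a rough Python sketch of that heuristic; the function name and directive handling are my own simplification, not the exact logic used in the analysis:

```python
def is_publicly_cacheable(cache_control: str) -> bool:
    """Rough heuristic: does this Cache-Control header allow shared
    caches to store the response? Used as a proxy for resources that
    are served without credentials."""
    directives = [d.strip().lower() for d in cache_control.split(",") if d.strip()]
    # Keep only the directive names, dropping any "=value" parts
    names = {d.split("=", 1)[0] for d in directives}
    # Directives that forbid shared caching
    if {"no-store", "no-cache", "private"} & names:
        return False
    # Positive signals: explicitly public, or a freshness lifetime
    return bool({"public", "max-age", "s-maxage"} & names)
```

For example, `is_publicly_cacheable("public, max-age=31536000")` is true, while `"private, max-age=600"` and `"no-store"` are not.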
Lighthouse is an amazing tool that you can use to quickly audit a web page and learn how it stacks up on performance, accessibility, best practices, PWA support and more. You can run it from Chrome DevTools, run it as part of a WebPageTest measurement, or analyze results in bulk here! For every page measured in the HTTP Archive, a Lighthouse audit is run, and the results are stored in the lighthouse tables.
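To give a flavor of what querying those tables looks like, here is a BigQuery standard SQL sketch that pulls the performance score out of the stored Lighthouse JSON. The table name reflects one specific crawl date, and the `report` column holds the raw Lighthouse report; adjust both to match the archive's current schema:

```sql
SELECT
  url,
  CAST(JSON_EXTRACT_SCALAR(report, '$.categories.performance.score') AS FLOAT64) AS perf_score
FROM `httparchive.lighthouse.2019_07_01_mobile`
ORDER BY perf_score
LIMIT 10
```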
When we talk about web performance measurement, there is a long list of metrics to choose from. As an industry we are converging on metrics that gauge user experience, such as “Time to Interactive” and “Time to Visually Ready”. Other metrics, such as onLoad and First Contentful Paint, are also widely used and available in most browsers via APIs such as Navigation Timing and Paint Timing. And then there are Speed Index, Start Render, Fully Loaded time and many others, including protocol times (DNS/TCP/TLS) and backend times (TTFB). You are optimizing your sites and have all these measurements at your disposal, so which do you use to evaluate your changes?
Over the past few years we’ve seen tremendous growth in mobile traffic on the web. Because of this, many of the most successful websites have invested in optimizing the experience for users on whatever device they use and however they connect to the internet. With mobile traffic now exceeding desktop, serving a quality mobile experience is more important than ever. During the recent holiday weekend, I was wondering how much retail traffic came from mobile versus desktop devices. Was there a large shift towards mobile during peak times on Black Friday and Cyber Monday? Did mobile usage spike on specific days, or at specific times of day? And when users connect from mobile, are they connecting over cellular networks or WiFi?
The HTTP Archive is an open source project that tracks how the web is built. Twice a month it crawls 1.3 million web pages on desktop and emulated mobile devices, and collects technical information about each of the web pages. That information is then aggregated and made available in curated reports. The raw data is also made available via Google BigQuery, which makes answering interesting questions about the web accessible to anyone with some knowledge of SQL as well as the curiosity to dig in.
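As a taste of what such a query looks like, here is a BigQuery standard SQL sketch computing the median page weight from one crawl's summary table. The table and column names follow the archive's published schema at the time of writing; swap the date suffix for other crawls:

```sql
SELECT
  APPROX_QUANTILES(bytesTotal, 1000)[OFFSET(500)] / 1024 AS median_page_kbytes
FROM `httparchive.summary_pages.2019_07_01_mobile`
```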
The Application Cache has been deprecated and removed from the web standards. While some browsers still support it, that support is going away. For example, starting with Firefox 44, a console warning advised developers to use Service Workers instead. In Chrome 68, when an HTTP page loads with AppCache configured, the browser warns that Chrome 69 will restrict AppCache to secure contexts only.