Mobile Trends during the US Holiday Weekend

Over the past few years we’ve seen tremendous growth in mobile traffic on the web. Because of this, many of the most successful websites have invested in optimizing the experience for users on whatever device they use and however they connect to the internet. With mobile traffic now exceeding desktop, serving a quality mobile experience is more important than ever. During the recent holiday weekend, I wondered how much retail traffic occurred via mobile versus desktop devices. Was there a large shift towards mobile during peak times on Black Friday and Cyber Monday? Did mobile usage spike on specific days or times of day? And when users connected from mobile devices, were they on cellular networks or WiFi?

During normal day-to-day traffic we see shifts in device usage on the weekends. For example, in a recent study I learned that the percentage of mobile traffic globally is 43% during the week but increases to 53% over the weekend. Tablet usage also increases marginally over the weekend. In the graph below you can see this trend for October 2018.

In previous blog posts and talks I’ve shared some insights using data from Akamai mPulse. The data I’m using for this analysis is a subset of overall mPulse traffic – specifically, US traffic from more than 50 retail websites. To avoid skewing the stats toward a few larger sites, I’ve also ensured that no single site in this dataset accounts for more than a few percentage points of the total. (Note: non-US retail traffic during the US holiday weekend is a topic I may explore in a future analysis.)

The graph below illustrates the distribution of pages from Desktop, Mobile and Tablet form factors between Thanksgiving and Cyber Monday 2018. There are a few interesting peaks:

  • Thanksgiving traffic started to increase around 5pm EST and peaked at 9pm.
  • Black Friday traffic was intense from 9am to 10pm EST.
  • Sunday evening traffic spiked between 8pm and 10pm EST.
  • Cyber Monday traffic was as high as Black Friday’s for most of the day, and then burst 30% higher than Black Friday’s peak during the evening.

The fluctuations in mobile traffic were particularly interesting to me, so I decided to graph each day on a 24 hour axis. The graph below shows the percentage of mobile traffic per hour for each day. Mobile accounted for upwards of 60% of traffic during the early mornings, leveling off at around 53% during the day. The percentage of mobile traffic during the evening hours of Thanksgiving spiked to 58%. Meanwhile, on Cyber Monday desktop traffic dominated for most of the day. Since device usage fluctuates by time of day, it’s important for retailers to focus on providing an optimal experience to all users regardless of how they connect.

Now that we know what types of devices people used, let’s explore how they connected to the web. The graph below illustrates the distribution of Desktop pages loaded over Cellular, Corporate and other Non-Mobile networks. The percentage of desktop traffic from corporate networks increased significantly on Cyber Monday and, to a smaller extent, on Black Friday. This indicates that a fair amount of online shopping was done by people while they were at work. We typically see spikes like this in other industries as well, especially when major events (such as live streams) are occurring during business hours.

When we look at the same data for Mobile devices, we can see an interesting pattern in connectivity. On each day, cellular networks accounted for roughly 40% of mobile traffic between the hours of 12pm and 2pm EST.

The Cyber Monday traffic patterns were quite interesting, so I decided to look at them at a per-minute level. The graph below shows the relative page views for Desktop, Mobile and Tablet traffic. As we saw earlier, Desktop traffic was strongest during the Cyber Monday business day and then started to decline after 5pm EST as the US east coast business day ended. Mobile traffic quickly took its place. In the evening we saw an increase in both Desktop and Mobile traffic, which resulted in the impressive Cyber Monday evening peak. The periodic drops in traffic are likely due to the performance struggles that some retailers faced under the load.

Many of these stats are similar to what I’ve seen in previous years, but this is the first time we’re able to see an aggregate view of the holiday traffic patterns like this. I’m interested to see how this changes next year.

One important thing to note: when looking at traffic for specific sites, some of the bursts were much more intense than what we see in aggregate. For example, one retailer I worked with ran a timed event on Saturday evening, one of the lowest-traffic days of the weekend, and managed to double their traffic within 4 minutes. I was particularly impressed with that retailer, because their response times also improved during this window – likely a result of excellent holiday preparations!

Based on these stats, the 2018 holiday shopping season seems to be off to a great start, and we’re seeing some impressive amounts of traffic – both per site and in aggregate across the retail industry. As in most years, mobile traffic continues to grow and spikes at certain times. However, just as important as which devices are accessing your sites is how they are connecting. Mobile traffic is mostly split between cellular and WiFi, but that distribution varies based on both the time of day and the day of the week. In general we are seeing spikes in utilization across all form factors, which highlights the importance of delivering optimal experiences to every device, regardless of how a user connects to your site.

On Becoming a Contributor to the HTTP Archive

The HTTP Archive is an open source project that tracks how the web is built. Twice a month it crawls 1.3 million web pages on desktop and emulated mobile devices, and collects technical information about each page. That information is then aggregated and made available in curated reports. The raw data is also made available via Google BigQuery, which makes answering interesting questions about the web accessible to anyone with some knowledge of SQL and the curiosity to dig in.
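
If you’ve never tried it, querying the archive is straightforward. Here’s a minimal sketch using the BigQuery Python client; the crawl date in the table name is just an example, since the summary tables follow the archive’s date-based naming convention:

```python
# A minimal sketch, assuming the google-cloud-bigquery package is installed
# and your environment is authenticated against a Google Cloud project.
# The crawl date in the table name is an example; pick one that exists.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
      COUNT(0) AS pages,
      ROUND(APPROX_QUANTILES(bytesTotal, 100)[OFFSET(50)] / 1024, 1) AS median_kb
    FROM `httparchive.summary_pages.2018_11_15_desktop`
"""

for row in client.query(query).result():
    print(f"{row.pages:,} pages crawled; median page weight: {row.median_kb} KB")
```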

When Steve Souders created the project back in 2010, it included far fewer pages – but it was immensely valuable to the community. As sponsorship increased, so did the infrastructure and the ability to do more with it. Over time more and more information was added to the archive – including HAR files, Lighthouse reports and even response bodies.

In 2017 Ilya Grigorik, Patrick Meenan and Rick Viscomi started maintaining the project. They have done some amazing work overhauling the website, creating new and useful reports, and continuing to push the envelope on what the HTTP Archive can provide to the web community. As of last week I’ve joined Ilya, Pat and Rick as a co-maintainer of the HTTP Archive, and I couldn’t be more excited!

So how have I been using the HTTP Archive?

Rarely does a week go by without someone asking a question, or sharing a news article, that provokes a question the archive can answer. I love diving deep into questions about the web, and many of my colleagues joke about “nerd sniping Paul”. Fortunately, no Pauls have been injured using BigQuery :).

Source: https://xkcd.com/356/

Over the past few months, I’ve been sharing some of my research on the HTTP Archive discussion forums. One recent example: a few days ago the Blink-Dev team announced that the Application Cache is being deprecated in Chrome. It only took a few minutes to write a SQL query identifying sites that still use this feature. After doing some analysis on the data, I shared my research with the Blink team so that they could track it. Since I work at Akamai, I’m also planning to give a proactive heads-up to the customers whose sites will be affected. Being able to quickly notify numerous websites of an important change that might impact their business is a truly priceless use case.
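
For the curious, here’s a rough sketch of that kind of query (not necessarily the exact one I used). It scans the crawled response bodies for an AppCache manifest attribute on each page’s <html> tag:

```python
# A rough sketch via the BigQuery Python client. The crawl date is an
# example, and the response_bodies tables are very large, so expect the
# query to process a lot of data.
from google.cloud import bigquery

query = """
    SELECT page
    FROM `httparchive.response_bodies.2018_11_15_desktop`
    WHERE url = page  -- only look at each page's main HTML document
      AND REGEXP_CONTAINS(body, r'<html[^>]*\\smanifest\\s*=')
    GROUP BY page
"""

for row in bigquery.Client().query(query).result():
    print(row.page)
```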

At the 2018 Fluent Conference in San Jose, CA this past June, I shared a few additional examples of how I’ve used the HTTP Archive at Akamai. You can see the slides here, where I talk about how I used the archive to help improve configuration defaults, assist in product research and even send out security notifications.

I’m truly grateful that Akamai is both sponsoring the HTTP Archive and allowing me to spend some of my time supporting it. The project provides a significant benefit to the web community, and it’s just so much fun to work with. I’m really looking forward to working with Ilya, Pat and Rick on this – and can’t wait to see what comes next!

Brotli Compression – How Much Will It Reduce Your Content?

A few years ago, Brotli compression entered the webperf spotlight with impressive gains of up to 25% over gzip. The algorithm was created by Google, which initially introduced it as a way to compress web fonts via the WOFF2 format. Later, in 2015, it was released as a compression library to optimize the delivery of web content. Despite Brotli being a completely different format from gzip, it was quickly supported by most modern web browsers.
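
If you’re wondering what Brotli could save on your own content, here’s a minimal Python sketch using Google’s open source brotli bindings (the input file is a placeholder for any text asset you serve):

```python
# A minimal sketch; requires Google's brotli bindings: pip install brotli
# "index.html" is a placeholder - results vary widely with the input, and
# text-heavy assets tend to benefit the most.
import gzip
import brotli

with open("index.html", "rb") as f:
    data = f.read()

gz = gzip.compress(data, compresslevel=9)  # gzip at its highest setting
br = brotli.compress(data, quality=11)     # brotli at its highest setting

print(f"original:  {len(data):,} bytes")
print(f"gzip -9:   {len(gz):,} bytes")
print(f"brotli 11: {len(br):,} bytes ({1 - len(br) / len(gz):.1%} smaller than gzip)")
```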

Continue reading

HTTP Heuristic Caching (Missing Cache-Control and Expires Headers) Explained

Have you ever wondered why WebPageTest can sometimes show that a repeat view loaded with fewer bytes downloaded, while also triggering warnings related to browser caching? It can seem like the test is reporting an issue that does not exist, but in fact it’s often a sign of a more serious issue that should be investigated. Often the problem is not a lack of caching, but rather a lack of control over how your content is cached.
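
The short version: when a response has no explicit freshness headers, browsers fall back to a heuristic – commonly 10% of the time since Last-Modified, as suggested by RFC 7234. Here’s a small sketch of that calculation, with hypothetical header values:

```python
# A sketch of the common heuristic: with no Cache-Control or Expires header
# but a Last-Modified present, the freshness lifetime is estimated at 10%
# of the time since the resource was last modified. The header values
# below are hypothetical.
from email.utils import parsedate_to_datetime

date = parsedate_to_datetime("Mon, 26 Nov 2018 12:00:00 GMT")           # response Date
last_modified = parsedate_to_datetime("Fri, 16 Nov 2018 12:00:00 GMT")  # Last-Modified

freshness_lifetime = (date - last_modified) * 0.1
print(f"heuristic freshness lifetime: {freshness_lifetime}")  # 1 day, 0:00:00
```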

If you have not run into this issue before, then examine the screenshot below to see an example:

Continue reading

Adoption of HTTP Security Headers on the Web

Over the past few weeks, the topic of security-related HTTP headers has come up in numerous discussions – both with customers I work with and with colleagues who are trying to help improve the security posture of their customers. I’ve often felt that these headers were underutilized, and a quick test on Scott Helme’s excellent securityheaders.io usually proves this to be true. I decided to take a deeper look at how these headers are being used at scale.

Looking at this data through the lens of the HTTP Archive, I thought it would be interesting to see if we could give the web a scorecard for security headers. I’ll dive deeper into how each of these headers is implemented below, but let’s start off by looking at the percentage of sites using them. As I suspected, adoption is quite low. It also seems that adoption is marginally higher for some of the most popular sites – but not by much.
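
As a quick way to check a single site, here’s a minimal sketch in the spirit of securityheaders.io. The header list is a common subset I chose for illustration, not an official scorecard, and the URL is a placeholder:

```python
# A minimal sketch for checking one site's security headers.
import requests

CHECKS = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "X-XSS-Protection",
    "Referrer-Policy",
]

resp = requests.get("https://example.com/", timeout=10)

for header in CHECKS:
    # resp.headers is case-insensitive, so this matches any capitalization
    status = "present" if header in resp.headers else "missing"
    print(f"{header}: {status}")
```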

Continue reading

Cache Control Immutable – A Year Later

In January 2017, Facebook wrote about a new Cache-Control directive – immutable – which was designed to tell supporting browsers not to attempt to revalidate an object on a normal reload during its freshness lifetime. Firefox 49 implemented it, while Chrome went ahead with a different approach by changing the behavior of the reload button. Additionally, it seems that WebKit has also implemented the immutable directive since then.
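
For reference, the directive simply rides along in the regular Cache-Control header, e.g. Cache-Control: max-age=31536000, immutable. Here’s a quick, hypothetical check for it on a single asset (the URL is a placeholder):

```python
# A quick sketch that checks whether one asset is served with the
# immutable directive; the URL is a placeholder.
import requests

url = "https://example.com/static/app.abc123.js"
cache_control = requests.get(url, timeout=10).headers.get("Cache-Control", "")

directives = [d.strip().lower() for d in cache_control.split(",")]
print(f"{url}")
print(f"  Cache-Control: {cache_control!r}")
print("  immutable:", "yes" if "immutable" in directives else "no")
```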

So it’s been a year – let’s see where Cache-Control immutable is being used in the wild!

Continue reading

Measuring the Performance of Firefox Quantum with RUM

On Nov 14th, Mozilla released Firefox Quantum. On launch day, I personally felt that the new version was rendering pages faster and I heard anecdotal reports indicating the same. There have also been a few benchmarks which seem to show that this latest Firefox version is getting content to screens faster than its predecessor. But I wanted to try a different approach to measurement.

Given the vast amount of performance information that we collect at Akamai, I thought it would be interesting to benchmark the performance of Firefox Quantum with a large set of real end-user performance data. The results were dramatic: the new browser improved DOM Content Loaded time by an extremely impressive 24%. Let’s take a look at how those results were achieved.
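
To give a flavor of the approach (this is not the actual mPulse pipeline; the beacon export and its columns are assumed purely for illustration), the comparison boils down to grouping RUM beacons by browser version and comparing medians:

```python
# Not the actual mPulse pipeline - the CSV export and its columns
# ("browser", "browser_version", "dom_content_loaded_ms") are assumed
# purely for illustration.
import pandas as pd

beacons = pd.read_csv("rum_beacons.csv")
firefox = beacons[beacons["browser"] == "Firefox"]

medians = firefox.groupby("browser_version")["dom_content_loaded_ms"].median()
v56, v57 = medians[56], medians[57]  # Firefox 57 == Quantum

print(f"median DCL: v56={v56:.0f} ms, v57={v57:.0f} ms "
      f"({1 - v57 / v56:.1%} improvement)")
```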

Continue reading