
State of Web Analytics 2020

Insight from 2000+ pros on the good, the bad, and the ambiguous of traditional analytics in 2020. Brought to you by CXL and Hotjar.


Last updated

29 Apr 2022

One in two web analytics users struggles to understand what customers do on a website. One in five is overwhelmed by data. Most wish that web analytics tools would magically reveal why people leave a website without converting.

If any (or all) of this applies to you, keep reading → in August 2020 we partnered up with CXL Agency to run an industry-wide survey and got insight from 2000+ pros on the good, the bad, and the ambiguous of traditional analytics. Below, we report on what the best-in-class do (or don’t do) to really understand website visitors and customers, and leave you with some essential power-ups to help you get more from your analytics data.

Insert coin to play. Let’s go!

What is web analytics? The standard (somewhat boring) definition

Web analytics is the process of collecting, analyzing, and reporting on website data to understand user behavior and optimize performance.

The data is collected through analytics tools and platforms (for example, Google Analytics) that keep track of what happens on a website: how many people get there, where they came from, where they go after landing, how many make a purchase, how much they spend, and so on.
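To make the "keep track of what happens" part concrete, here is a minimal sketch of the kind of pageview event a tracking script might assemble before sending it to a collection endpoint. The field names are illustrative, not any specific vendor's schema:

```javascript
// Minimal sketch of a pageview event, as a tracking script might build it.
// Field names are illustrative, not any specific analytics tool's schema.
function buildPageviewEvent(url, referrer, sessionId) {
  return {
    type: 'pageview',
    url,                             // where the visitor landed
    referrer: referrer || 'direct',  // where they came from
    sessionId,                       // ties events together into one visit
    timestamp: Date.now(),           // when it happened
  };
}

// A real tool would queue events like this and ship them to its servers,
// e.g. via navigator.sendBeacon('/collect', JSON.stringify(event)) in a browser.
const event = buildPageviewEvent(
  'https://example.com/pricing',
  'https://google.com',
  'abc123'
);
```

Aggregating thousands of events like this one is what produces the traffic, source, and conversion reports you see in the tool's dashboard.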

What the standard definition won’t tell you… 🤨

A lot is written about the potential of analytics data, but one thing often goes unsaid: numbers and graphs alone can’t reveal the full picture of what is happening on a website—and they can’t be used to explain how people are really experiencing it, either.

For example: your analytics tools reveal that you have a page with lots of traffic but very few conversions. Clearly, there is some work to do; but how do you make sense of this information?

  • Are people leaving because something on the page is broken?

  • Are they leaving because they are looking for something different altogether?

  • Is it because something on the page doesn’t fill them with confidence?

  • Or is it because a crucial bit of information is missing?

These are just a few of the many potential reasons behind your website visitors’ actions, but without knowing how they experience a website or why they do what they do, your chances of optimizing and growing the business are limited. We’ve been saying this for a while, but now we have some data to prove others agree with us.

State of web analytics [2020]: an overview

In August 2020, we asked 2000+ global in-house professionals, consultants, and agencies a set of questions about the way they use traditional website analytics, behavior analytics (think: heatmaps and session recordings), and feedback/voice of the customer tools. If you’re interested in the behind-the-scenes of how we ran the survey, check out the methodology notes at the bottom of the page.

* * *

Maturity categories

To get meaningful insight into the survey results, we asked respondents to rate their approach to data collection/utilization and used their answers to categorize them into 5 main groups:

  • Ignore: don’t collect or report on analytics data

  • Basic: use data to measure WHAT is happening

  • Intermediate: use data to measure what is happening + determine WHY

  • Advanced: use data to measure what is happening + determine why + make ONE-OFF data-informed changes

  • Elite: use data to measure what is happening + determine why + make ONGOING data-informed changes
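The bucketing above works as a simple decision ladder. As an illustrative sketch (the boolean flags are stand-ins for survey answers, not the survey's actual wording):

```javascript
// Sketch of the maturity bucketing described above. The boolean flags are
// illustrative stand-ins for survey answers, not the actual survey questions.
function maturityGroup({ collectsData, determinesWhy, makesChanges, changesAreOngoing }) {
  if (!collectsData) return 'Ignore';        // doesn't collect or report on data
  if (!determinesWhy) return 'Basic';        // measures WHAT is happening
  if (!makesChanges) return 'Intermediate';  // also determines WHY
  if (!changesAreOngoing) return 'Advanced'; // makes ONE-OFF data-informed changes
  return 'Elite';                            // makes ONGOING data-informed changes
}
```

Each rung assumes the previous ones: a company can only qualify as elite if it measures, explains, and acts on its data continuously.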

Revenue brackets

We also asked in-house teams to select their annual revenue, and noticed an initial correlation between data maturity and revenue:

  • Of the companies that don’t collect or report on data, 8 in 10 make less than $1M/year

  • Of the companies that qualify as advanced and elite, 6 in 10 make more than $1M/year—and 1 in 6 in this latter group makes more than $500M/year

The BIGGEST pain point with traditional analytics tools is understanding why customers do what they do 💡

When we asked people what the #1 challenge with traditional analytics tools was, the most reported answer across every level of maturity was that traditional analytics tools are not helpful in explaining why customers behave the way they do on a website.

How we read this result:

Traditional analytics tools may track traffic fluctuations down to the exact decimal point or help you build the ultimate traffic attribution model, but by themselves, they cannot reveal why customers behave the way they do, including:

  • What needs or goals drove potential customers to a website

  • Why visitors leave from specific pages

  • What they are thinking as they browse through each page

  • Whether users have questions that remain unanswered

  • What kind of information is missing from a page

  • Whether visitors found what they came for and left happily vs. left in frustration after getting stuck or being unable to accomplish what they came to the site for

And this is why it matters: without knowing why customers do (and don’t do) what they do on your website, you’re left guessing and making changes based on what you think they do and/or what you (or your team, or your stakeholders) think needs to change.

💡 1 in 5 people find traditional analytics data overwhelming

A second challenge with traditional website analytics tools is that they are overwhelming—both in terms of the options they offer and the amount of data they can provide. Across the board, around 1 in 5 respondents picked this answer as their main challenge with traditional analytics tools (which doesn’t mean the remaining respondents don’t also feel it as a pain point).

Pro tip: overcoming data blindness

The main reason why people working with data often feel overwhelmed is a lack of context. You’ve got the numbers, and you can turn them into complex reports and visualizations, but what does it all mean, and how do you make it actionable?

Lack of context is usually the byproduct of data silos: a situation where many tools collect data but don’t share it with each other. It’s why different teams within a business report different results on a similar KPI, and why making decisions feels overwhelming: you don’t have the full picture.

Let’s imagine you’ve been using session recordings in Hotjar and found a usability issue you’d like to eliminate. The logical solution is to work on a few hypotheses and test each against the control, using an A/B testing tool. If any of your variants produce a better result, great! You’ve most likely eliminated the issue you discovered in your initial research. But you still lack real context here; what exactly changed in the behavior of users seeing your variant?

A better approach would be to integrate your testing tool with Hotjar before you start your experiment. This way, you can create a heatmap for each of your test variants and tag the session recordings depending on what the user saw, helping you understand which changes in the user experience result in different user actions.

The same is true for all marketing, analytics, and user behavior analysis tools: the more context you can send or pull into each of them, the more effectively you can use the data. The ‘holy grail’ of breaking data silos involves either a data warehouse or a data lake: a central location for data from all tools and platforms, so you’ll always have data with context to make better decisions.
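The variant-tagging idea above can be sketched in a few lines. This assumes Hotjar’s JavaScript Events API (`hj('event', name)`) is loaded on the page; the `send` parameter stands in for that call so the logic is testable, and `getVariant` is a hypothetical helper for whatever your testing tool exposes:

```javascript
// Sketch: tag sessions with the A/B test variant a visitor saw, so recordings
// and heatmaps can be filtered per variant. `send` stands in for the tracking
// call (e.g. window.hj, if Hotjar's Events API is loaded on the page).
function tagVariant(experimentId, variant, send) {
  const eventName = `${experimentId}_${variant}`; // e.g. 'checkout_test_variant_b'
  send('event', eventName); // fires a named event the tool can filter on
  return eventName;
}

// In the browser this might look like (getVariant is hypothetical, supplied
// by your testing tool):
//   tagVariant('checkout_test', getVariant('checkout_test'), window.hj);
```

Once every session carries its variant as an event, comparing behavior between variants becomes a filter rather than a guess.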

High-revenue companies struggle with time- and resource-intensive tools

Things look almost the same when the data is split by revenue instead of maturity, with one significant exception: companies that report $500M+ in annual revenue rank the time- and resource-intensive work of making sense of analytics tools and data as their second-biggest challenge.

Editor’s note → we did not include privacy/GDPR and accuracy/tracking as options in the survey, and only retroactively added both after they appeared in the ‘other (please specify)’ field. This is something to keep in mind if/when we run a follow-up edition of this survey: we suspect that more people might have picked them (and privacy in particular) had they been given the option.

💡 Conversion metrics are more important than traffic as a measure of success

In the survey, in-house practitioners and teams were asked for the main website metric they currently use to measure success. We grouped all metrics (both the ones we provided and the ones that resulted from the ‘other (please specify)’ option) into 5 main groups:

  • Conversion metrics: conversion rate, cost per conversion

  • On-page metrics: average time on page, bounce rate

  • Revenue metrics: value per session, revenue/AOV, leads generated

  • Sentiment metrics: Feedback/Net Promoter Score

  • Traffic metrics: traffic/pageviews, session duration, total number of sessions
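As an illustrative sketch (not tied to any specific tool's export format), the conversion metrics in the first group are simple to compute from the session, conversion, and spend totals any traditional analytics tool reports:

```javascript
// Conversion rate and cost per conversion, computed from basic totals
// any traditional analytics tool can export.
function conversionRate(conversions, sessions) {
  return sessions === 0 ? 0 : conversions / sessions;
}

function costPerConversion(adSpend, conversions) {
  return conversions === 0 ? Infinity : adSpend / conversions;
}

// e.g. 120 conversions over 4,000 sessions with $600 of ad spend:
// conversionRate(120, 4000)   -> 0.03 (a 3% conversion rate)
// costPerConversion(600, 120) -> 5    ($5 per conversion)
```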

The main insight: as a company grows in data and analytics expertise, so does its reliance on conversion data points to identify success. Or, put another way: the more a company grows in analytics maturity, the less it uses traffic and on-page metrics as markers of success.

How we read this result:

When you are a small company, your primary objective is to raise awareness: you want people to come to your website, and traffic volume is a crucial indicator of whether you’re successful or not. As you grow and get more established, however, you probably have more regular and consistent traffic coming in—and that’s when you start focusing less on quantity and more on quality.

Other notable trends:

  • For companies in the basic group, traffic and on-page metrics define success half of the time, compared to elite companies who use the same metrics 20% of the time

  • Elite companies use conversion metrics to define success ⅔ of the time, while basic companies use them 43% of the time

  • The focus on revenue as a metric also grows from 3% in the case of basic companies to above 10% for advanced and elite ones*

Editor’s note: we did not include ‘revenue’ as one of our options in the survey, and only retroactively added it after it appeared in the ‘other (please specify)’ field. We suspect that the ‘conversion rate’ result, while directionally correct across all four levels of maturity, might be over-inflated by the lack of a specific revenue option.

💡 Virtually all of the highest-revenue companies use website feedback

Companies with $500M+ in revenue are the most likely to use website feedback: only 12% of this group chose ‘none’ as an option when asked to select the feedback tool(s) they use the most.

Over 70% of advanced and elite companies use website feedback, compared with 62% of companies at a basic level of analytics maturity (sidenote: this data point goes hand-in-hand with a previous finding that customer feedback is the #1 driver of successful customer experience strategies).

💡 The highest-revenue companies are the most reliant on usability testing

If we break down the data even further, we see that companies with an annual revenue of $500M+ also report the highest adoption of usability and user testing methods (15% of total methods), 2.5 times higher than companies with a revenue of less than $1M (6%).

How we read this result:

When you don’t have a lot of revenue, you are likely to focus on the cheaper and easier ways of sourcing insight. The highest-revenue companies have both more money and more in-house/supporting resources, so they can afford to invest in more expensive and sophisticated methods such as in-person usability testing, which requires additional knowledge and specialization (not everyone can run a productive usability session, but pretty much anyone can interpret a website heatmap).

Other notable trends:

  • Companies with an annual revenue of $500M+ are the least reliant on internal feedback

  • Traditional analytics as the main method for understanding performance is used the most by companies with a revenue of less than $1M (37%), and the least by companies with revenue above $100M (24%)

💡 Behavior analytics users are more aware of the limitations of traditional analytics data

In addition to slicing the data by maturity or revenue, we found that another useful way to look at it is to separate respondents into two groups: those who rely on traditional analytics tools only vs. those who complement them with behavior analytics tools.

A quick definition of behavior analytics

Behavior analytics tools help answer behavioral questions about website visitors and customers that traditional web analytics can’t, including:

  • Where on a page people get stuck and struggle before dropping off

  • How people interact with individual page elements and sections

  • What they are interested in or ignoring

Examples of behavior analytics tools include heatmaps (which aggregate behavior on a page and display what users interact with, scroll past, or ignore), session recordings (which show how visitors navigate through a website), and feedback widgets (which let users share their feedback and opinions on a specific website page).

When we look at the #1 challenge with website analytics tools, both groups reply in line with the overall trend seen above (‘They don’t help us understand why customers do what they do’). But the split is revealing, because behavior analytics users feel this problem more than their counterparts.

How we read this result:

The most likely explanation for this discrepancy is problem awareness: people who rely only on traditional analytics may be less aware of those tools’ limitations than behavior analytics users, who, by virtue of having already adopted a complementary behavioral tool, were more aware of the shortcomings of traditional analytics in the first place.

💡 For behavior analytics users, conversion metrics > traffic metrics

Behavior analytics users are also more likely to define success through conversion metrics (conversion rate, cost per conversion), compared to a larger focus on traffic metrics reported by non-behavior analytics users.

💡 Agencies & consultants invest in methods that give easy access to insight (e.g. market research, usability testing)

Another way to slice the data is by separating respondents into two groups: in-house vs. agency/consultants. With this approach, one major difference appears when respondents are asked to pick up to 3 methods they rely on the MOST to understand how the website performs. The difference may be down to the fact that agencies/freelancers need results fast, and therefore rely on methods that can get them insight even when they don’t necessarily have access to all of a client’s data.

Notable trends:

  • Agencies and consultants rely on usability/user testing far more heavily than in-house teams do (15% for agencies versus 8% for in-house teams).

  • Agencies and consultants also lean on broader market knowledge and competitor insight around 20% of the time, compared to 11% of the time for in-house teams.

Website analytics tools and methods

Top traditional analytics tools

Using a multiple-choice question, we asked: “Which traditional web analytics tools is the company currently using?” The vast majority of respondents picked Google Analytics as their traditional analytics tool of choice, confirming its leading market position—according to BuiltWith, at least 30 million websites have the GA script installed (but the number is likely to be significantly under-reported).

In addition to Google Analytics, respondents selected or added:

  • Adobe Analytics

  • Mixpanel

  • Matomo

  • StatCounter

  • Amplitude

  • Baidu Analytics

  • Yandex Metrica

  • HubSpot

Top behavior analytics tools

Using a multiple-choice question, we asked “Which behavior analytics tools is the company currently using?” The majority of respondents picked Hotjar as their tool of choice, confirming its leading market position. Here, however, we want to be transparent: since we also shared the survey with Hotjar customers, the number of ‘Hotjar’ responses is likely over-inflated.

With the above caveat, other notable inclusions are:

  • Crazy Egg

  • FullStory

  • Lucky Orange

  • Smartlook

  • VWO

  • Inspectlet

  • Mouseflow

  • Clicktale

  • Contentsquare

  • PageSense

  • SessionCam

Top customer feedback tools

Using a multiple-choice question, we asked “Which website feedback/voice of the customer tools is the company currently using?” As in the previous case, a majority of respondents picked Hotjar as their tool of choice.

In addition, respondents also selected or added:

  • SurveyMonkey

  • Qualtrics

  • Mopinion

  • Survicate

  • Qualaroo

  • UserTesting

  • Intercom

  • Google Forms

  • Typeform

Note: it’s worth highlighting that we recorded over 400 ‘none’ answers, compared to 75 for the website analytics question and 179 for the behavior analytics one. This clearly indicates that feedback tools are a comparatively under-utilized category. But as we learned above, there is a correlation between the use of feedback and revenue → something to think about if you are not using this method yet 😉

Website analysis methods

Using a multiple-choice question, we investigated the top methods our respondents rely on the most to understand how a website performs. The available answers can be categorized into 5 groups:

  • Traditional website analytics

  • Behavior analytics: website heatmaps, session recordings

  • Customer feedback: website feedback, interviews and focus groups, offsite surveys, support tickets and live chat

  • Usability testing: user testing, usability audits

  • Non-customer focused methods: internal/client feedback, best practices, competitor analysis, market trends

What people wish analytics tools would reveal 🔮

There was a final open-ended question in the survey: “If you could wave a magic analytics wand, what do you wish you knew about your website that you currently don’t?” We like asking this question because it reveals the deep pain points and desires of people better than any yes/no question ever will.

By far, the most recurring answers were some form of Why do people leave/bounce? and Why do/don’t people convert?.

Here is a sample of the hundreds of answers we got—do any of these apply to you?

  • Why users are scrolling for 6 to 10 minutes and not converting

  • Why they leave the checkout process 75% of the time... Nobody does that in real stores!

  • Why they decided not to buy - what was the phrase, image, whatever - which put them off.

  • Why they came, what they wanted to do, did they manage it. If not, why not?

  • Why people don’t buy from us - price? Stock availability? Lead times? Delivery times?

  • Which magical section will get our executives to make more productive data-driven changes instead of knee-jerk reaction changes

  • I'd like to know which pages ACTUALLY sold our product. Like, not the last page the visitors saw before converting, but the page that got them to think 'this software seems to be the sh*t..I'll try it out.'

  • Do people get it? (Is the messaging having the desired effect? Do visitors understand what we offer? Do they know where to go to find what they need?)

  • How the end user feels, talks and swears while browsing the site

  • What is the single most impactful change could we do to improve our customer experience, and therefore, conversion rate?

  • What makes users successful - most of our team focuses on why users don't convert and try to solve that (which is fair enough) but if it was more obvious why certain users/common groups of users convert over others… that'd be grand ;P

  • The magic wand would have some sort of website greeter (to guide) the visitor to what THEY WANT so we can give DIRECT EACH VISITOR to what they want. If we don't have it but should, we can then create it so that each visitor gets exactly what they need and our clients get the conversions they want.

  • Visitor intent. There's a big difference between how a competitor, an employee, and a potential customer interact with the site, and being able to magically filter on that would be amazing.

  • Magic cohort analysis :)

A couple of these issues will definitely require a magic wand—but the good news is that, for most of them, there’s a lot you can already do to make progress using existing tools and resources.

Free power-ups: a 3-step framework and a 5-lesson course

Start here → check out this 3-step framework that will help you understand visitor behavior—so you can optimize your website. Using it, you will find:

  • The DRIVERS that bring people to your website

  • The BARRIERS that might stop them or make them leave

  • The HOOKS that persuade them to convert

Level up → when you are ready for the next step, try the free “How to give your customers a great experience (when you don’t have a lot of time or budget)” course below. We distilled our knowledge and that of other UX and CRO practitioners into 5 lessons of about 5 minutes each, where you’ll get:

  • 📦 Out-of-the-box solutions to common problems

  • 💡 Ideas for experiments to test now

  • ♻️ A step-by-step demonstration of how to use the 3-step framework

  • 🔎 Clear areas of the website you should focus on (and which ones you should ignore)

How to give your customers a great experience (when you don’t have a lot of time or budget)

5 days. 5 lessons. 5 minutes each

Methodology

The goal of this research was to understand how in-house practitioners, agencies, and consultants/freelancers use analytics tools to learn about website visitors and improve website experience.

To do so, we built a survey in partnership with CXL; using conditional logic, the survey presented respondents with slightly different but overlapping question paths depending on their answer to the initial prompt: What best describes where you work?

We launched the survey on August 4th, 2020 and received over 2500 submissions over the following two weeks. After downloading and cleaning the raw data, we performed the analysis in Google Sheets based on a dataset of 2028 respondents.

The main limitation of this research is sample bias. We shared this survey with people outside and inside the Hotjar network, and around half of the respondents came from an email we sent to a segment of our active customers. This might have inflated results where Hotjar and/or its tools (e.g. heatmaps or session recordings) were one of the available tool options. We made a point of transparently reporting this possibility in the analysis above.

Credits

This survey was planned, written, and edited by Fio Dossetto (Senior Editor) and Louis Grenier (Senior Marketing Strategist), and art directed by Denis Constantinou (Digital Designer) from the Hotjar team in August-September 2020. Thanks for their collaboration to Alex Atkins (Head of Growth) and Katie Kelly (Content Strategist) from CXL Agency.