Google Analytics: The Right Tool for Some Jobs … Not All Jobs
It’s commonly understood that having an analytics tool like Google Analytics on your website is a must. Less commonly understood is what this type of tool is good for and what it isn’t. To understand user behavior, something like Google Analytics should be only one of the tools in your toolbox. Let’s discuss what we should understand about our users and which tools are best for each job.
Understanding What Users Do
Tools like Google Analytics are great for understanding what users do. They can highlight problem areas of your website, such as pages with high bounce rates or stages of your conversion funnel in which more users exit.
Some of my favorite ways to find problem areas in Google Analytics are:
- Look at the exit rates for the top pages on desktop, then compare them with the exit rates for those same pages on mobile. A page with a dramatically higher exit rate on one device type might highlight an area for improvement.
- Set up goals. Google Analytics lets you define a funnel for each goal (i.e., the set of steps a user must take to reach it). The “Funnel Visualization” report then shows how many users make it from step to step of the funnel. A dramatic drop-off between stages might highlight another area for improvement.
- Look at the Landing Page report. Scanning the bounce rates of the top landing pages will sometimes highlight pages that could be optimized. Keep in mind that sometimes a user finds exactly what they were looking for on a page and then leaves, which can actually be a good experience for them. So it’s important to think critically about the kind of content on these pages and decide whether or not we want the user to do something after their visit.
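The desktop-vs-mobile exit-rate comparison from the first bullet can be sketched in a few lines of Python. The data here is hypothetical; in practice you would export the page-level report (page path, device category, exits, pageviews) from your analytics tool and feed it in.

```python
def exit_rate_gaps(rows, threshold=0.15):
    """Flag pages whose desktop and mobile exit rates differ by more
    than `threshold` (an absolute fraction)."""
    # rows: (page path, device category, exits, pageviews)
    rates = {}
    for page, device, exits, pageviews in rows:
        rates.setdefault(page, {})[device] = exits / pageviews
    flagged = []
    for page, by_device in rates.items():
        if "desktop" in by_device and "mobile" in by_device:
            gap = abs(by_device["desktop"] - by_device["mobile"])
            if gap > threshold:
                flagged.append((page, round(gap, 2)))
    return flagged

# Hypothetical exported rows for illustration only.
sample = [
    ("/pricing", "desktop", 120, 1000),  # 12% exit rate
    ("/pricing", "mobile", 450, 1000),   # 45% exit rate -> flagged
    ("/blog", "desktop", 300, 1000),
    ("/blog", "mobile", 330, 1000),
]
print(exit_rate_gaps(sample))  # [('/pricing', 0.33)]
```

The threshold is a judgment call; start loose, then tighten it once you have a feel for the normal device-to-device variation on your site.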
These reports are great at highlighting symptoms at a high level, but they usually don’t offer enough detail to diagnose the true cause of a problem. For that, we need to dig into how users are behaving on the page.
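The funnel check from the goals bullet above can be sketched the same way: given the user counts at each step (the numbers a “Funnel Visualization” report shows), find the transition with the largest drop-off. The funnel and its counts here are hypothetical.

```python
def biggest_dropoff(steps):
    """Return the step transition with the largest fractional drop-off.

    steps: list of (step name, users reaching that step) in funnel order.
    """
    worst = None
    for (name_a, n_a), (name_b, n_b) in zip(steps, steps[1:]):
        drop = 1 - n_b / n_a  # fraction of users lost at this transition
        if worst is None or drop > worst[2]:
            worst = (name_a, name_b, drop)
    return worst

# Hypothetical checkout funnel for illustration only.
funnel = [
    ("View cart", 1000),
    ("Enter shipping", 620),
    ("Enter payment", 580),
    ("Confirm order", 240),
]

step_from, step_to, drop = biggest_dropoff(funnel)
print(f"{step_from} -> {step_to}: {drop:.0%} drop-off")
```

Here the payment-to-confirmation step loses the most users, so that page is where I’d start digging.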
Understanding How Users Behave
Knowing that a page has a problem is only useful if we can identify how we might fix it. A user’s behavior on the page offers more detail about what specifically might need improvement. Tools such as heatmaps, visitor recordings, and scroll maps all offer more information about a user’s behavior within the page, and there are many good services that provide them; a few of our favorites are VWO, Hotjar, and Crazy Egg. Once we know which pages might need optimization, these tools let us see the most commonly clicked elements of a page, how far users scroll down, and where users move their mouse. All of this information helps us evaluate whether the user’s actions match what we would like them to do on the page. But sometimes even this isn’t enough to understand what the problems might be. Sometimes we need to know “why” users do what they do.
Understanding Why Users Do What They Do
The data gathered through the tools discussed so far does a good job of showing the user’s “when,” “what,” and “how,” but it rarely paints a clear enough picture of “why” users do what they do. To understand that, we need tools such as surveys and user testing. Both get at the why, but they do so in very different ways.
Surveys are a great way to collect opinions and feedback from large groups of people with very little personal involvement. However, because there’s no way to dig into the answers given, it can be difficult to get feedback on more complex user experiences.
User testing relies on a more personal approach to collecting a user’s “why.” By having users perform a series of tasks and then asking why they did what they did, we can gain a deeper understanding of their motivations. Unlike surveys, which require users to self-report on why they might do something, user testing lets individuals answer questions about tasks they actually performed. The data gathered in these interviews will be smaller in volume than survey data, but it will likely be richer.
All in all, my approach to making sense of analytics starts with looking for symptoms of a problem at a high level with tools such as Google Analytics. With symptoms in mind, I dig deeper through visual analytics such as heatmaps and scroll maps to better understand on-page user behavior. Often that behavioral data calls for an investigation into the “why.” For that, I rely on surveys and user testing. This type of research needn’t be complicated or drawn out, but it provides a solid foundation to base our solutions on, likely resulting in a higher rate of success.