Analytics and marketing professionals are faced with the challenge of making data-driven decisions in order to drive their businesses forward. Too often we see mistakes made as companies wrestle with this challenge. Here is a list of the top seven pitfalls of measuring and managing data-driven user experiences.
Incomplete Data
Until recently, the biggest challenge with digital analytics solutions has been that they rely on page tags to track visitor actions. If you neglect to tag a page or event, you won’t have the data you need until you re-instrument your code. Not having access to all the data can be very costly.
Similarly, most session replay technologies only offer session sampling and can’t or won’t offer 100% data capture in a cost-effective way that can scale.
Having all the data at your fingertips when you need it most is incredibly valuable. You won’t know that you have a mission-critical blind spot until you encounter it, and by then the damage done by not having the data can be massive. Auryc automatically captures and indexes every customer interaction without requiring additional tags or code instrumentation, thus eliminating data blind spots.
Ignoring the Voice of the Customer
Without VoC data, behavioral data can be very misleading. You’ll be forced to make educated guesses or assumptions about whether certain behaviors are good or bad. For example, if someone abandons a shopping cart, many analytics professionals would assume that site visit resulted in failure. But what if the customer had no intention of buying during that visit? What if they were just looking for information today and are going to return tomorrow to buy online or offline? The customer accomplished exactly what they came to the site to accomplish. The visit was a success. But you’ll never know that unless you ask for the customer’s perspective and are able to link those perspectives to behaviors.
Failure to Act
Companies are investing significant amounts of money in digital analytics technologies. So why do so many companies fail to act on the insights that come from those tools? We would suggest that there are two fundamental reasons why people don’t act:
- The insights are too vague to translate into clear action, or
- The people in charge don’t have enough evidence of the impact these issues are having on business performance.
Auryc is unique because the insights generated by our technology are incredibly granular. Granularity is important because it enables the sizing of site issues (when did the problem start, and how many people does it impact each month), which leads directly to performance outcomes. When you can link replay technology and Voice of Customer data to extremely granular behavioral analytics, the total user experience is revealed and the specific remedy becomes clear.
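To make the notion of “sizing” concrete, here is a minimal sketch in Python, using entirely hypothetical session counts, conversion rates, and order values, of how an analyst might translate a granular finding (how many sessions an issue touches each month and how much it depresses conversion) into a rough monthly revenue figure:

```python
# Minimal sketch of sizing a site issue, using hypothetical numbers.

def monthly_revenue_impact(affected_sessions, baseline_conversion,
                           affected_conversion, average_order_value):
    """Estimate monthly revenue lost to an issue by comparing the
    conversion rate of affected sessions against the site baseline."""
    lost_conversions = affected_sessions * (baseline_conversion - affected_conversion)
    return lost_conversions * average_order_value

# Hypothetical example: 40,000 sessions hit the issue each month,
# converting at 1.8% instead of the 3.0% baseline, with a $75 AOV.
impact = monthly_revenue_impact(40_000, 0.030, 0.018, 75.0)
print(f"Estimated monthly revenue at risk: ${impact:,.0f}")  # $36,000
```

A figure like this is what turns a vague observation into a prioritized, fundable fix.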
Over-Reliance on Anecdotal Information
Too often a lab-based usability approach is used as a substitute for analyzing larger volumes of data. Is a usability study made up of 5 individuals representative of your larger customer base? How about 10 people? 15? Do visitors who are being told what to do, and who know that they are being observed, behave differently from customers who know what they want to do and are “in the wild”?
Similarly, if one customer complains about an online issue to your CEO, does that issue take precedence over all other issues? Analytics professionals must be able to distinguish between rare one-off issues and the more common issues that are materially affecting business performance, and they must be able to make a credible case for how and why limited budgets and resources are being spent on some issues before others.
Making Decisions Without Context
Your analytics data shows that 10 critical pages on your mobile site are taking 4 seconds to load. Is that good or bad? Sometimes customers will tell you that page load times are a problem if you proactively ask them. Other times they’ll simply abandon the site. Fortunately, there is a significant amount of credible, publicly available research on the impact of certain site performance issues.
For example, Google has published research showing that for every second of mobile page load time, conversions fall by 12% on average, and that 53% of visits are abandoned if a mobile site takes longer than 3 seconds to load.
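As a back-of-the-envelope illustration (the traffic, conversion rate, and order value below are hypothetical, not from any client), here is how that 12%-per-second figure can be turned into a rough dollar estimate:

```python
# Rough sketch applying the 12%-per-second figure cited above.
# All traffic and conversion numbers here are hypothetical.

monthly_mobile_sessions = 100_000   # assumed mobile traffic
baseline_conversion_rate = 0.025    # assumed 2.5% conversion rate
average_order_value = 60.0          # assumed average order value ($)
extra_load_seconds = 1.0            # one extra second of page load time

# Each additional second of load time cuts conversions by roughly 12%.
conversion_hit = 0.12 * extra_load_seconds
lost_orders = monthly_mobile_sessions * baseline_conversion_rate * conversion_hit
lost_revenue = lost_orders * average_order_value

print(f"Orders lost per month: {lost_orders:,.0f}")      # 300
print(f"Revenue lost per month: ${lost_revenue:,.0f}")   # $18,000
```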
At Auryc, we are constantly on the lookout for the latest, greatest statistics that will be relevant for our clients as they (and we) analyze their data, and we share this information along with our own internal benchmarks as part of our data reviews to provide that much needed context.
Unnecessarily Relying on Multiple Vendors
Sharing data across platforms and solutions will often enhance and enrich the value of each of the solutions being used. However, integrations almost always come with a cost, whether that’s in the form of fees required by the vendors being integrated or the time required to make disparate systems work properly together.
Whenever possible, look for solutions that reduce the need for integrations across multiple vendors. For example, Auryc combines a behavioral analytics solution that captures 100% of interactions on a site, session replay, and an online survey solution for listening to the customer. These solutions have been built from the ground up to be fully integrated with each other, and while you may choose to integrate with other solutions already embedded in your organization, minimizing those integrations can be a significant value-add.
Lacking Confidence in the Data
One thing that keeps managers up at night is making decisions based on bad data. As they say, “Bad data is worse than no data at all.” Analytics professionals must have more than a basic understanding of how the technology they are employing works, and must be able to get acceptable answers from their vendor partners if there are concerns about data accuracy or integrity.
At Auryc, we encourage those types of questions and provide our clients with a dedicated channel to communicate with our support team to help them answer the tough questions as quickly and effectively as possible.
At the end of the day, the goal of measuring and managing customer experience comes down to being able to answer three fundamental questions:
- How are we doing? From a data perspective and from the customer’s perspective, how are we doing?
- What should we do? Where should we spend our limited budgets and resources for improvement?
- Why should we do it? Can we quantify the impact these issues are having, or will have, on our business before we invest in the improvement efforts?
An effective data-driven user experience program should be able to help you answer these three questions.