In web analytics, the quality of the collected data matters. To do good analysis, you first have to trust the data you are analysing. The way to avoid data quality issues is to enforce the right processes and methodologies and to have proper Quality Assurance in place.
In this post, I will go over Quality Assurance and testing from an analytics perspective and explore the different options available.
What is Quality Assurance?
In software development, quality assurance (QA) is a process that ensures developed software meets defined or standardized quality specifications. It is an ongoing process within the software development life cycle (SDLC) that routinely checks the software against the desired quality measures. Analytics implementations similarly require quality checks: they are not static, and since data collection depends on the site's code base, continuous website changes inevitably affect them. The way to apply standards and measures for data quality is to constantly monitor and test the analytics implementation.
A common methodology for this is regression testing. By definition, regression testing is the re-running of functional and non-functional tests to ensure that previously developed and tested software still performs after a change. The aim of regression testing in analytics is to ensure that changes to the website (including its tagging) have not affected data quality. For example, a redesign of the checkout flow should not silently stop the purchase-confirmation event from firing.
Let’s have a look at some testing methods.
Manual Testing
Even though the word “manual” does not sound sexy in the context of quality assurance and testing, it can often be the more cost-effective method. Performing comprehensive checks of an analytics implementation is an understandably tedious task, but it is necessary to ensure data accuracy. Not all setups are complex, and website changes can be infrequent. In these cases, having a well-defined list of things to check and executing the pre-defined tests manually makes more sense than spending a lot of time building an automated solution.
To properly do manual QA, all you need is a well-defined test plan and a tester with great attention to detail. When it comes to creating a test plan, LunaMetrics has a great article that covers everything you need – Creating a Test Plan for Google Analytics Implementations. It focuses on Google Analytics but can easily be translated to other platforms. A simple example of what such a plan might contain is sketched below.
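As an illustration (the pages, events, and values here are made up, not taken from the LunaMetrics article), a manual test plan entry can be as simple as a page, an action, and the expected hit:

- Homepage load – one pageview hit with page path "/"
- Newsletter sign-up – event with category "Newsletter" and action "Sign Up"
- "Add to basket" click on a product page – event with category "Ecommerce" and action "Add to Basket"

The tester opens each page, performs the action, and verifies the resulting hit in the browser's developer tools or a debugging extension.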
When to use: Small-scale analytics implementations with a low tendency of website changes.
Pros: Cost-effective; no programming required
Cons: Takes man-hours; prone to human error
Automated Testing
Following on from the above, you often get into a situation where the website or platform the analytics implementation is built on has frequent version releases with new features, and the analytics setup is more complex, tracking various user actions. In such situations, a manually executed test plan is simply not enough. This is where automation comes in, and where an investment in technology and development is required.
Various tools generally used in software development can also be applied to web analytics: Selenium, HTMLUnit and PhantomJS, to name a few.
Here are two GitHub repos utilising the tools mentioned above: WAUnit and Site Infrastructure Tests.
Additionally, check out this great step-by-step article on using PhantomJS (with CasperJS) to monitor GA implementations.
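To make this concrete, here is a minimal sketch of what such an automated check can look like, using Selenium's Python bindings against a site tagged with Google Tag Manager. The URL and the expected dataLayer event are hypothetical placeholders, not taken from any of the tools or repos above; the same idea (load a page, inspect what the tagging pushed) applies whichever tool you pick.

```python
# Minimal sketch of an automated analytics check, assuming Selenium's
# Python bindings and a site tagged via Google Tag Manager. The URL and
# the expected dataLayer contents are hypothetical placeholders.
from selenium import webdriver

driver = webdriver.Chrome()  # requires a local chromedriver install
try:
    driver.get("https://www.example.com/")  # hypothetical page under test

    # Read the GTM dataLayer out of the page; an empty list is returned
    # if the variable is missing, which is itself a failed check.
    data_layer = driver.execute_script("return window.dataLayer || [];")

    events = [e.get("event") for e in data_layer if isinstance(e, dict)]
    # "gtm.js" is the event GTM pushes when the container loads.
    assert "gtm.js" in events, f"GTM container did not load; events: {events}"
    print("dataLayer check passed:", events)
finally:
    driver.quit()
```

A real suite would wrap dozens of checks like this in a test runner and execute them on every release – which is exactly the regression testing described earlier.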
When to use: Larger-scale analytics implementations with frequent website changes.
Pros: Thorough, automated checks with fewer errors. Plenty of resources and solutions available to get started.
Cons: Requires development which, depending on the implementation, can be as complex as the setup itself.
Automated Testing – SaaS Solutions
Automation in large-scale organizations, however, often requires compliance with numerous policies as well as collaboration with development teams. In such cases an enterprise-level solution is required, and this is where Software-as-a-Service platforms come into play. These generally utilise the technologies mentioned in the Automated Testing section, but the product is more refined and easier to customize, and a degree of support is also available. Two of the major players in web analytics quality assurance are ObservePoint and Hub Scan. The two platforms have similar offerings, with ObservePoint holding a larger market share and a more mature platform. The choice between them (or any other SaaS platform) depends on multiple factors, but in my experience ObservePoint leads the pack with the better product.
When to use: Larger-scale analytics implementations with frequent website changes, particularly when the business operates at enterprise level.
Pros: Thorough, automated checks with fewer errors. Less programming required, with support provided.
Cons: Paid solution. Less flexibility in how checks are built.
Conclusion
Quality assurance is an often overlooked process, but when it comes to analytics it is essential to make sure data collection is right. A bad setup and incorrect data collection result in inaccurate reports, which in turn lead to wrong decisions. Hopefully this breakdown of the quality assurance and testing options available provides a good overview and a starting point when considering which approach to take.
Taking no approach at all should never be an option!