Report Suite Inconsistency [Adobe Analytics]
In my last post about Virtual Report Suites, I discussed some of the pros and cons of consolidating an Adobe Analytics implementation with multiple report suites into one combined report suite and using Virtual Report Suites. However, one of the reasons your organization might not be able to combine its report suites and leverage Virtual Report Suites is the pervasive problem of report suite inconsistency. This is a topic I have ranted about periodically, most recently in this post about whether you should start over when re-implementing Adobe Analytics. In this post, I will review why report suite inconsistency matters, especially as you consider moving to an implementation with fewer report suites and more Virtual Report Suites.
Why Are Report Suites Inconsistent?
Most organizations implementing Adobe Analytics have the best intentions at the start. They want to implement one site and track the most important items. But after a while, things start to go downhill. A second site is implemented and it has some different needs, so different variables are used. Then maybe a different team implements a mobile app and yet another set of variables is used. This process continues until the organization has 5-10 report suites and very little is common amongst them. You know you have a problem when you open the Administration Console (with all of your report suites selected) and see “multiple” listed for variable after variable.
It is so easy to fall into this trap, so I don’t mean to blame you if it has happened to your organization. Oftentimes, it was done by your predecessors over a long timeframe. Unless you have strict policies and procedures to prevent this type of inconsistency, it will happen more often than not.
Of course, there are specific cases where you want different report suites to be inconsistent and for which seeing a “multiple” above is expected. For example, you may decide that each report suite will have 5-10 variables that are unique to that suite, and each site's team can collect whatever data it needs in those slots. I have many clients who designate 20 eVars, 20 sProps and 50 Success Events as “local” variables that are purposely not consistent across report suites. That is a valid approach, but it requires discipline and management to enforce. The report suite inconsistency I am talking about here is the unintentional kind that creeps into many Adobe Analytics implementations. This is what I hope to help you avoid.
Why Is Report Suite Inconsistency Bad?
There are several reasons why not having report suite consistency can hurt you. Here are some of the ones that I encounter the most:
Data in the Global Data Set Can Be Wrong
If you have different data points feeding into the same variable in different report suites, those different values get rolled up together when you combine the data. For example, if you track Cities in eVar5 for one suite and Zip Codes in eVar5 for another, the shared data set will show a mixture of Cities and Zip Codes. This is even worse when you think about Success Events. If you are tracking Leads in event1 in one suite and Onsite Searches in event1 in another suite and roll the data up, you will see a sum of Leads and Onsite Searches in the shared data set and have no way to know which is which! That can get you in a lot of trouble, especially if you label event1 as Leads in the shared data set and many of the numbers actually represent Onsite Searches!
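To make the collision concrete, here is a minimal sketch (hypothetical values, written as plain TypeScript objects rather than a real AppMeasurement call) of what the two sites effectively send into the same slots. In a live implementation these values would be set on the tracking object before the beacon fires, but the outcome is the same: once the data rolls up, nothing distinguishes a City from a Zip Code or a Lead from an Onsite Search.

```typescript
// Minimal sketch: the same variable slots holding different data on two sites.
// Values and sites are hypothetical; in a real implementation these would be
// set on the AppMeasurement `s` object before the beacon is sent.

interface BeaconVars {
  eVar5?: string;  // what this slot "means" differs by site
  events?: string; // comma-separated Success Events
}

// Site A uses eVar5 for City and event1 for Leads
const siteABeacon: BeaconVars = {
  eVar5: "Chicago",
  events: "event1", // a Lead was submitted
};

// Site B uses eVar5 for Zip Code and event1 for Onsite Searches
const siteBBeacon: BeaconVars = {
  eVar5: "60601",
  events: "event1", // an onsite search was performed
};

// Rolled up into a shared data set, eVar5 mixes Cities and Zip Codes, and
// event1 is the sum of Leads and Onsite Searches with no way to split them.
console.log(siteABeacon, siteBBeacon);
```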
Can’t Use Virtual Report Suites
As mentioned in my previous post, if you want to save money on secondary server calls and consolidate your report suites into one master suite (using Virtual Report Suites), you need to make your report suites consistent. This is because having one master report suite means having just one set of variable definitions.
Can’t Re-Use Reporting Templates
One of the greatest benefits of having consistent report suites is the re-use of reports and reporting templates. If you use the same variables across multiple report suites, you can easily jump from one Adobe Analytics report to the same report in another suite by simply changing the suite in the top-right dropdown. Let’s say that you have configured a great report in Adobe Analytics with a dimension and a few metrics. With one click you can change the report suite and see the same report for the second report suite without any re-work. The same applies if you use dashboards or reporting templates in Adobe ReportBuilder. Adobe ReportBuilder is where report suite consistency pays off the most, since you may spend a lot of time getting your Excel reports/dashboards working and formatted properly. But that time can be leveraged for multiple report suites by tying the report suite ID to a cell in Microsoft Excel and refreshing the data for a different suite. If your report suites aren’t consistent, you have to build separate data blocks for each report suite and lose out on one of the best features of Adobe ReportBuilder.
Can’t See Aggregated Pathing
If you have Pathing turned on for sProps, you can see paths before and after specific items, but only for paths within the site for which the report suite is configured. If you send data to a global (shared) report suite, you can see paths across multiple web properties, as long as both properties use the same sProp with Pathing enabled. For example, let’s say that you have a search phrase sProp20 with Pathing enabled in your Brand A report suite, but for Brand B, you have the search phrase in sProp15. In each of these report suites you can see the Pathing of search phrases, but if the same person visits both brand sites in the same session, you might want to see search phrase paths across both sites. Even if you have a global (shared) report suite, you cannot see this, since the data is being stored in two different sProps. But if you had used the same sProp in both suites, you could see all search phrase Pathing in the global (shared) report suite for the entire session.
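For illustration, here is a minimal sketch of the classic multi-suite tagging pattern that makes this work, assuming each page sends its beacon to both its brand suite and a global suite via a comma-separated list of report suite IDs; the suite IDs, values, and the choice of sProp20 are hypothetical placeholders.

```typescript
// Minimal sketch: both brands send to a shared global suite and use the SAME
// sProp for search phrases, so cross-site pathing is possible in the global
// suite. Suite IDs, values, and the prop number are hypothetical.

interface BeaconVars {
  account: string; // report suite ID(s) this beacon is sent to
  prop20?: string; // search phrase, with Pathing enabled
}

// Brand A page view
const brandABeacon: BeaconVars = {
  account: "companybranda,companyglobal", // brand suite + global suite
  prop20: "running shoes",
};

// Brand B page view, same visitor, same session
const brandBBeacon: BeaconVars = {
  account: "companybrandb,companyglobal",
  prop20: "trail running shoes", // lands in the same prop20
};

// Because both brands use prop20, the global suite can show the path
// "running shoes" -> "trail running shoes" across the two sites.
console.log(brandABeacon, brandBBeacon);
```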
Can’t Re-Use Training and End-User Documentation
I always like to provide good end-user documentation and training for implementations I work on. This means having some sort of file or presentation that explains each business requirement, how it is tagged, what data is collected and how it can enable analysis. I also like to provide training on how to use Adobe Analytics and the key reports/dashboards that have been pre-built for end-users. When you have a consistent implementation across multiple sites, you can build these deliverables once and re-use them for all sites. But if you have an inconsistent implementation, you have to create these deliverables multiple times, which can use up a lot of unnecessary bandwidth.
Can’t Use Consistent Tagging/JS File/Tag Management Setup
Last, but certainly not least, having inconsistent variable definitions means that each site has to be implemented slightly differently. Instead of always passing search phrases to sProp20 (as in the preceding example), your developers have to remember that for Brand B, that data goes in sProp15. Even if you use a tag management system and a data layer, you still have to configure your TMS differently for each report suite, which increases your odds of mistakes and data quality issues. In addition, documentation of your implementation becomes much more difficult and time-consuming.
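Here is a rough sketch of what that extra branching looks like in practice; the data layer shape, brand names, and mapping functions are hypothetical stand-ins for whatever your TMS rules or plugin code actually do.

```typescript
// Minimal sketch of the per-brand logic an inconsistent setup forces into your
// tag management rules. Data layer shape and brand names are hypothetical.

interface DataLayer {
  brand: "brandA" | "brandB";
  searchPhrase: string;
}

// Inconsistent suites: every rule has to know which prop each brand uses.
function mapSearchPhraseInconsistent(dl: DataLayer): Record<string, string> {
  return dl.brand === "brandA"
    ? { prop20: dl.searchPhrase }  // Brand A stores search phrases in sProp20
    : { prop15: dl.searchPhrase }; // Brand B stores them in sProp15
}

// Consistent suites: one rule, no per-brand logic to maintain or get wrong.
function mapSearchPhraseConsistent(dl: DataLayer): Record<string, string> {
  return { prop20: dl.searchPhrase };
}

console.log(mapSearchPhraseInconsistent({ brand: "brandB", searchPhrase: "shoes" }));
console.log(mapSearchPhraseConsistent({ brand: "brandB", searchPhrase: "shoes" }));
```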
How Do You Avoid Report Suite Inconsistency?
So, how do you avoid report suite inconsistency? That is often the million-dollar question, and unfortunately there is no perfect answer. In my experience, it comes down to process and coordination. When I ran the Adobe Analytics implementation at Salesforce.com, I ruled it with an iron fist. I was the only one with Admin access, so no one could add any variables to any report suites without going through me. But since that approach might not be practical at larger organizations, I recommend that you maintain a shared solution design document that is kept up to date and always in line with the settings in the Adobe Analytics Administration Console. You can do this by comparing the two at least once a month and by using the Administration Console to compare the variables across your report suites. I also recommend that you drive your analytics program by business requirements instead of variables, so that you only add variables when new business requirements arise. I explain more about that process in my Adobe white paper.
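If you want to script part of that monthly check, here is a minimal sketch, assuming you can export the variable definitions from your solution design document and from the Administration Console into simple name-to-description maps; the function and the data below are hypothetical, not an Adobe API.

```typescript
// Minimal sketch of a monthly consistency check between a solution design
// document (SDR) and the Administration Console settings, each exported as a
// simple map of variable slot -> friendly name. All data here is hypothetical.

type VariableMap = Record<string, string>; // e.g. { eVar5: "City" }

function findMismatches(sdr: VariableMap, adminConsole: VariableMap): string[] {
  const issues: string[] = [];
  const slots = new Set([...Object.keys(sdr), ...Object.keys(adminConsole)]);
  for (const slot of slots) {
    if (!(slot in sdr)) {
      issues.push(`${slot} is configured in the admin console but missing from the SDR`);
    } else if (!(slot in adminConsole)) {
      issues.push(`${slot} is documented in the SDR but not configured`);
    } else if (sdr[slot] !== adminConsole[slot]) {
      issues.push(`${slot}: SDR says "${sdr[slot]}", admin console says "${adminConsole[slot]}"`);
    }
  }
  return issues;
}

// Example: flag a slot whose meaning has drifted
console.log(
  findMismatches(
    { eVar5: "City", event1: "Leads" },
    { eVar5: "Zip Code", event1: "Leads" }
  )
);
// -> [ 'eVar5: SDR says "City", admin console says "Zip Code"' ]
```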
Final Thoughts
Having consistency in your analytics implementation is difficult, but a goal worth striving for (in my opinion). I hope this post helps you see why it is advantageous and why I encourage my clients to pursue this goal. While it may take a bit more planning and forethought in the beginning of the process, it definitely pays dividends down the road. If you have any thoughts, questions or comments, please let me know. Thanks!