
Different Flavors of Success Events (Part 2)

Last week, I covered some of the cool new Success Event allocation features available in Adobe Analytics. These new allocations allow you to create different flavors of Success Events for Last Touch, Linear, Participation, etc. In this post, I will build on last week’s post and cover one of my favorite allocation additions – Reporting Window Participation. If you haven’t read the previous post, I recommend you do that first.

Expanding the Participation Window

In the last post, I demonstrated how you could create Visit-based Participation versions of any Success Event in your implementation. However, one of the six new allocation options is one that I can't resist talking about because it is something I have been eagerly awaiting for years: "Reporting Window Participation." While the Participation feature has been around for over a decade, it has always been limited to the session (Visit). This means that if you wanted to see which pages led to Orders, you could use Participation, but credit would be constrained to pages viewed within the same visit as the Order. So if a visitor viewed ten pages, then came back the next day, viewed five more pages, and completed an Order, only the last five pages would get credit, which can be very misleading.

As you will notice, one of the new allocation options in the Calculated Metric Builder is called Reporting Window Participation, and this allows you to see which items, across the entire date range you are looking at, led to the Success Event. So if you created an Orders Participation metric based upon the Reporting Window, all fifteen pages in the preceding example would get credit for the Order. This makes reporting more accurate and interesting.
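
If it helps to see the difference in code, here is a rough Python sketch of how the two scopes would credit pages in that ten-plus-five example. This is only an illustration of the concept, not how Adobe actually processes the data, and the data structure and function are made up for this post:

# Toy illustration (not Adobe's actual processing) of how Visit Participation
# vs. Reporting Window Participation credit pages for a single Order.
# Each hit is (visit_id, page_name, order_placed_on_this_hit).
hits = (
    [("visit_1", f"Post {n}", False) for n in range(1, 11)]     # 10 pages, no order
    + [("visit_2", f"Post {n}", False) for n in range(11, 15)]  # 4 more pages the next day
    + [("visit_2", "Post 15", True)]                            # 15th page, Order placed
)

def pages_credited(hits, scope):
    """Return the set of pages that receive Participation credit for the Order."""
    order_visits = {v for v, _, ordered in hits if ordered}
    if scope == "visit":
        # Only pages viewed in the same visit as the Order get credit.
        return {page for v, page, _ in hits if v in order_visits}
    # Reporting Window: every page viewed within the date range gets credit.
    return {page for _, page, _ in hits}

print(len(pages_credited(hits, "visit")))             # 5 pages credited
print(len(pages_credited(hits, "reporting_window")))  # 15 pages credited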

Another great use for this is marketing campaigns. In the past, if you wanted to see which Orders or Leads were generated by each campaign code, your options were basically First Touch or Last Touch. But if you create a Reporting Window Participation metric and view it in the campaign tracking code report, you can see which campaign codes, across multiple visits, contributed to success. While this is still not true attribution (which divides credit as you desire), it does provide additional insight into the cross-visit effectiveness of campaign codes.

To illustrate how the Reporting Window Participation feature works, let's build upon the blog post example from my previous post. In this case, I want to do a similar analysis, but remove the Visit constraint. To do this, I repeat the steps from the previous post to create a new Participation metric for Blog Post Views, but this time change the allocation from Visit Participation to Reporting Window Participation like this:

[Screenshot: Calculated Metric Builder with the Blog Post Views Participation metric set to Reporting Window Participation]

When this is added to the report (I am using a longer reporting period of several months), I can now see the difference between Visit and Reporting Window Participation:

[Screenshot: Report comparing Visit Participation and Reporting Window Participation for blog posts]

As you can see, Participation for the Reporting Window is much greater than for the Visit. This means that visitors [who don't delete cookies] are coming back and viewing multiple posts, just not always in the same session. If you want, you can create another calculated metric that divides the Reporting Window Participation by the original metric to see which posts get people to view the most other posts within the longer reporting window timeframe:

[Screenshot: Report showing blog post pull-through for both Visit and Reporting Window Participation]

In this report, you can see blog post pull-through for the visit or the reporting window and do some analysis to see how each post does in each scenario.
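
To make the pull-through math concrete, here is a quick pandas sketch of the same calculation. The column names and numbers are made up for illustration; in practice you would use your own exported report data:

import pandas as pd

# Hypothetical export of the report above; the figures are illustrative only.
df = pd.DataFrame({
    "blog_post": ["Post A", "Post B", "Post C"],
    "blog_post_views": [1200, 450, 300],
    "reporting_window_participation": [4800, 2700, 600],
})

# Pull-through ratio: how much Participation each post generates within the
# reporting window relative to its own views.
df["pull_through"] = df["reporting_window_participation"] / df["blog_post_views"]
print(df.sort_values("pull_through", ascending=False))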

Finally, if you read my post on using Scatter Plots in Analysis Workspace, you can compare Blog Post Views and Participation (pull-through) to see which posts have the most pull-through, but lower amounts of views:

[Screenshot: Analysis Workspace scatter plot comparing Blog Post Views and Participation (pull-through)]

Here you can see that I have some blog posts with very high pull-through, but low views, in the top-left quadrant. These may be ones that I want to publicize more since they seem to get people to read other posts afterwards, whether in the same visit or a subsequent visit. Keep in mind that this example uses blog posts, but the same type of analysis can be done to see which products people view that lead them to view other products, which categories lead to other categories, which videos lead to other videos, and so on.
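
If you want to approximate that scatter plot outside of Analysis Workspace, here is a small matplotlib sketch with made-up numbers; the median guide lines are simply one way to split the chart into quadrants:

from statistics import median
import matplotlib.pyplot as plt

# Illustrative numbers only; swap in your own views and pull-through figures.
posts = ["Post A", "Post B", "Post C", "Post D"]
views = [1200, 450, 300, 900]
pull_through = [4.0, 6.0, 2.0, 3.5]

fig, ax = plt.subplots()
ax.scatter(views, pull_through)
for name, x, y in zip(posts, views, pull_through):
    ax.annotate(name, (x, y))

# Median guide lines divide the chart into the four quadrants discussed above.
ax.axvline(median(views), linestyle="--")
ax.axhline(median(pull_through), linestyle="--")
ax.set_xlabel("Blog Post Views")
ax.set_ylabel("Pull-through (Reporting Window Participation / Views)")
plt.show()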

One other note that has come to my attention is that Reporting Window Participation is, at times, based upon full months, such that selecting a mid-month date range might include data from the beginning of the month. You can learn more about that in this knowledge base article.

So between these two posts, you have a quick tutorial on how to find and use some of the new Success Event allocation options in Adobe Analytics. For more information, check out the Adobe documentation, and there is also a video from Ben Gaines that you can view here.
