Adobe Analytics, Tag Management, Technical/Implementation, Testing and Optimization

Adobe Target + Analytics = Better Together

Last week I wrote about an Adobe Launch extension I built to familiarize myself with the extension development process. This extension can be used to integrate Adobe Analytics and Target in the same way that was possible prior to the A4T integration. For the first several years after Omniture acquired Offermatica (and Adobe acquired Omniture), the integration between the two products was rather simple but quite powerful. Using a built-in list variable called s.tnt (which did not count against the three list variables available to all Adobe customers), Target would pass a list of all activities and experiences in which a visitor was a participant. This enabled reporting in Analytics that showed the performance of each activity, and allowed for deep-dive analysis using all the reports available in Analytics (Target's own reporting is powerful but limited). When Target Standard was released, this integration became more difficult to use, because choosing Analytics for Target (A4T) reporting invalidates the plugins required to make the old integration work. Luckily, there is a way around it, and I’d like to describe it today.

Changes in Analytics

To re-create the old s.tnt integration, you’ll need to dedicate one of your three list variables to it. Choose the one you want, along with a delimiter and an expiration (the s.tnt expiration was two weeks).
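
For example, if you pick list1 with a comma delimiter, the variable will eventually carry a delimited set of activity/experience pairs on each server call. The exact format is up to you; this is just an illustration with made-up IDs:

	// illustrative only: list1, comma delimiter, "activityId:experienceId" pairs
	s.list1 = "123456:0,234567:2";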

Changes in Target

The changes you need to make in Target are nearly as simple. Log into Target, go to “Setup” in the top menu, and then click “Response Tokens” in the left menu. You’ll see a list of tokens (pieces of data that exist within Target) that can be exposed on the page. Make sure that activity.id, experience.id, activity.name, and experience.name are all toggled on in the “Status” column. That’s it!
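
For reference, once these tokens are toggled on, each mbox response carries them as an array of plain key/value objects (e.detail.responseTokens in the code further down). The values here are made up for illustration:

	[{
		"activity.id": "123456",
		"activity.name": "Homepage Hero Test",
		"experience.id": "2",
		"experience.name": "Experience B"
	}]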

Changes in Your TMS

What we did in Analytics and Target made an integration possible – we now have a list variable ready to store Target experience data, and Target will expose that data on every mbox call. Next, we need to connect the two tools and get data from Target to Analytics.

Because Target is synchronous, the first block of code we need to execute must also run synchronously. This might cause problems if you’re using Signal or GTM, as those tools don’t offer any great options for synchronous loading, but in other TMSs you could do it in any of the following ways:

  • Use the “All Pages – Blocking (Synchronous)” condition in Ensighten
  • Put the code into the utag.sync.js template in Tealium
  • Use a “Top of Page” (DTM) or “Library Loaded” rule (Launch)

The code we need to add synchronously attaches an event listener that will respond any time Target returns an mbox response. The response tokens are inside this response, so we listen for the mbox response and then write that data somewhere it can be accessed by other tags. Here’s the code:

	if (window.adobe && adobe.target) {
		document.addEventListener(adobe.target.event.REQUEST_SUCCEEDED, function(e) {
			if (e.detail.responseTokens) {
				var tokens = e.detail.responseTokens;
				// initialize once, so activities tracked earlier on this page persist
				window.targetExperiences = window.targetExperiences || [];
				for (var i = 0; i < tokens.length; i++) {
					// skip any token set that doesn't carry activity data
					if (!tokens[i]['activity.id']) {
						continue;
					}
					// avoid inflating instances: record each activity only once per page
					var inList = false;
					for (var j = 0; j < targetExperiences.length; j++) {
						if (targetExperiences[j].activityId == tokens[i]['activity.id']) {
							inList = true;
							break;
						}
					}

					if (!inList) {
						targetExperiences.push({
							activityId: tokens[i]['activity.id'],
							activityName: tokens[i]['activity.name'],
							experienceId: tokens[i]['experience.id'],
							experienceName: tokens[i]['experience.name']
						});
					}
				}
			}

			if (window.targetLoaded) {
				// TODO: respond with an event tracking call
			} else {
				// first mbox response on this page: flag it so later
				// responses fire events instead of another page view
				window.targetLoaded = true;
				// TODO: respond with a page tracking call
			}
		});
	}

	// set failsafe in case Target doesn't load
	setTimeout(function() {
		if (!window.targetLoaded) {
			window.targetLoaded = true; // prevent a late Target load from double-counting
			// TODO: respond with a page tracking call
		}
	}, 5000);

So what does this code do? It starts by adding an event listener that waits for Target to send out an mbox request and get a response back. Because of what we did earlier, that response will now carry at least a few tokens. If any of those tokens indicate the visitor has been placed within an activity, the code checks to make sure we haven’t already tracked that activity on the current page (to avoid inflating instances), then adds the activity and experience IDs and names to a global array called “targetExperiences” (though you could push them to your data layer or anywhere else you want). On the first response, we also set a flag called “targetLoaded” to true, which lets us fire a page tracking call the first time and an event tracking call on subsequent responses, avoiding inflated page view counts. Finally, a failsafe is in place so that if for some reason Target does not load, we can initiate some error handling and avoid delaying tracking.
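
From there, any tag can read that global array and format it for the list variable you set up earlier. Here’s a minimal sketch, assuming your AppMeasurement object is s and you chose list1 with a comma delimiter:

	// minimal sketch: map the global array into the list variable chosen earlier
	var exps = window.targetExperiences || [];
	var pairs = [];
	for (var i = 0; i < exps.length; i++) {
		pairs.push(exps[i].activityId + ':' + exps[i].experienceId);
	}
	s.list1 = pairs.join(','); // e.g. "123456:0,234567:2"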

You’ll notice the word “TODO” in that code snippet a few times, because what you do with this event is really up to you. This is the point where things get a little tricky: Target is synchronous, but the events it registers are not, so there is no guarantee that this event will be triggered before the DOM ready event, when your TMS likely starts firing most tags. You have to decide how you want to handle the event. Here are some options:

  • My code above is written in a way that allows you to track a page view on the very first mbox load, and a custom link/event tracking call on all subsequent mbox updates. You could do this with utag.view and utag.link calls (Tealium), a Bootstrapper event (Ensighten), or a direct call rule (DTM). If you do this, you’ll need to make sure you configure the TMS not to fire the Adobe server call on DOM ready (if you’re using DTM, this is a huge pain; luckily, it’s much easier with Launch), or you’ll double-count every page.
  • You could just configure the TMS to fire a custom link call every time, which will probably increase your server calls dramatically. It may also make it difficult to analyze experiences that begin on page load.

What my Launch extension does is fire one direct call rule on the first mbox call, and a different direct call rule for all subsequent mbox calls. You can then configure the Adobe Analytics tag to fire an s.t() call (page view) for that initial direct call rule, and an s.tl() call for all the others. If you’re doing this with Tealium, make sure to configure your implementation to wait for your utag.view() call rather than allowing the automatic one to track on DOM ready. This is the closest behavior to how the original Target-Analytics integration worked.
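
For instance, in DTM or Launch, one way to fill in those TODOs is with two differently named direct call rules via _satellite.track (the rule names below are hypothetical; use whatever matches your own rule setup):

	// one way to fill in the TODOs above (rule names are hypothetical)
	if (window.targetLoaded) {
		_satellite.track('target-mbox-update'); // rule configured to fire s.tl()
	} else {
		window.targetLoaded = true;
		_satellite.track('target-page-view'); // rule configured to fire s.t()
	}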

I’d also recommend not limiting yourself to using response tokens in just this one way. You’ll notice that there are tokens available for geographic data (based on an IP lookup) and many other things; geographic data, for instance, could be extremely useful in achieving GDPR compliance. While the old integration was simple and straightforward and this new approach is a little more cumbersome, it’s far more powerful and gives you many more options. I’d love to hear what new ways you find to take advantage of response tokens in Adobe Target!
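
As a quick sketch of that geo idea, assuming tokens named geo.country and geo.city are toggled on in your Response Tokens list (verify the exact token names in your own account), the same listener could stash them for other tags:

	// inside the REQUEST_SUCCEEDED handler: stash geo tokens for other tags
	// (token names here are assumptions; check your Response Tokens list)
	if (tokens.length && tokens[0]['geo.country']) {
		window.targetGeo = {
			country: tokens[0]['geo.country'],
			city: tokens[0]['geo.city']
		};
	}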

Photo Credit: M Liao (Flickr)

Analysis, Reporting

Reporting: You Can't Analyze or Optimize without It

Three separate observations from three separate co-workers over the past two weeks all resonated with me when it comes to the fundamentals of effective analytics:

  • As we discussed an internal “Analytics 101” class — the bulk of the class focuses on the ins and outs of establishing clear objectives and valid KPIs — a senior executive observed: “The class may be mislabeled. The subject is really more about effective client service delivery — the students may see this as ‘something analysts do,’ when it’s really a key component of doing great work by making sure we are 100% aligned with our clients as to what it is we’re trying to achieve.”
  • A note added by another co-worker to the latest update to the material for that very course said: “If you don’t set targets for success up front, someone else will set them for you after the fact.”
  • Finally, a third co-worker, while working on a client project and grappling with extremely fuzzy objectives, observed: “If you’ve got really loose objectives, you actually have subjectives, and those are damn tough to measure.”

It struck me that these comments were three sides to the same coin, and it got me to thinking about how often I find myself talking about performance measurement as a fundamental building block for conducting meaningful analysis.

“Reporting” is starting to be a dirty word in our industry, which is unfortunate. Reporting in and of itself is extremely valuable, and even necessary, if it is done right.

Before singing the praises of reporting, let’s review some common reporting approaches that give the practice a bad name:

  • Being a “report monkey” (or “reporting squirrel” if you’re an Avinash devotee) — just taking data requests willy-nilly, pulling the numbers, and returning them to the requestor
  • Providing “all the data” — exercises of listing out every possible permutation/slicing of a data set, and then providing a many-worksheeted spreadsheet to end users so that they can “get any data they want”
  • Believing that, if a report costs nothing to generate, then there is no harm in sending it — automation is a double-edged sword, because it can make it very easy to just set up a bad report and have it hit users’ inboxes again and again without adding value (while destroying the analyst’s credibility as a value-adding member of the organization)

None of these, though, are reasons to simply toss reporting aside altogether. My claim?

If you don’t have a useful performance measurement report, you have stacked the deck against yourself when it comes to delivering useful analyses.

Let’s walk through a logic model:

  1. Optimization and analysis are ways to test, learn, and drive better results in the future than you drove in the past
  2. In order to compare the past to the future (an A/B test is a “past vs. future” comparison because the incumbent experience represents the “past” and both the incumbent and the challenger represent “potential futures”), you have to be able to quantify “better results”
  3. Quantifying “better results” means establishing clear and meaningful measures for those results
  4. In order for measures to be meaningful, they have to be linked to meaningful objectives
  5. If you have meaningful objectives and meaningful measures, then you have established a framework for meaningfully monitoring performance over time
  6. In order for the organization to align and stay aligned, it’s incredibly helpful to actually report performance over time using that framework, quod erat demonstrandum (or, Q.E.D., if you want to use the common abbreviation — how in the hell the actual Latin words, including the correct spelling, were not only something I picked up in high school geometry in Sour Lake, TX, but that has actually stuck with me for over two decades is just one of those mysteries of the brain…)

So, let’s not just bash reporting out of hand, okay? Entirely too many marketing organizations, initiatives, and campaigns lack truly crystallized objectives. Without clear objectives, there really can’t be effective measurement. Without effective measurement, there cannot be meaningful analysis. Effective measurement, at its best, is a succinct, well-structured, well-visualized report.

Photo: Greymatterindia

Analytics Strategy

Columbus WAW Recap: Don't "Antisappoint" Visitors

We had a fantastic Web Analytics Wednesday last week in Columbus, sponsored by (Adobe) Omniture, with just under 50 attendees! Darren “DJ” Johnson was the presenter, and he spoke about web site optimization (kicking off with a riff on how “optimization” is an over-used word!). I, unfortunately, forgot my “good” camera, which means my photojournalism duties were poorly, poorly performed (DJ is neither 8′ tall, nor was he ignoring his entire audience):

Columbus Web Analytics Wednesday -- March 2010

One of the anecdotes that stuck with me was when DJ explained a personal experience he had clicking through on a banner ad (“I NEVER click on banner ads!” he exclaimed) and then having the landing page experience totally under-deliver on the promise of the ad. He used the term “antisappointment” (or “anticappointment?”) to describe the experience. It’s a handy word that works better orally than written down, but I’ll be shocked at myself if I don’t start using it!

I’ve been spending more and more time thinking about and working on optimization strategies of late, and DJ’s presentation really brought it all together. This post isn’t going to be a lengthy explanation of optimization and testing…because I’m really not qualified to expound on the subject (yet). But, I will drop down a few takeaways from DJ’s presentation that hit home the most with me:

  • Testing (and targeting) doesn’t typically deliver dramatic step function improvements, so don’t expect it to — it delivers incremental improvements over time that can add up to significant gains
  • (Because of the above) Testing isn’t a project; it’s a process — it’s not enough to plan out a test, run it, and evaluate the results; rather, it’s important to develop the organizational capabilities to always be testing
  • “Testing” without “targeting” is going to deliver limited results — while initial tests may be on “all visitors to the site,” it’s important to start segmenting traffic and testing different content at the segment level as quickly as possible

Good stuff.

In other news, I’ve got a few additional bullet points:

  • Our next Web Analytics Wednesday is tentatively slated to be a happy hour only (unsponsored or with a limited sponsor) on a Tuesday. If you don’t already get e-mail reminders and you’d like to, just drop me a note and I’ll add you to our list (tim at this domain)
  • The Ohio Interactive Awards are fast approaching! This event, started up by Teambuilder Search, huber+co. interactive, and 247Interactive, is shaping up to be a great one, happening April 29th at the Arena Grand Movie Theater (Resource Interactive is sponsoring the event happy hour)
  • The TechLife Columbus meetup.com group continues to grow and thrive, with over 1,500 members now — it’s free, and it’s a great way to find meetups and people who are involved in high tech and digital in central Ohio

It’s been a lot of fun to watch social media get put to use in central Ohio and make it so easy to find interesting people with shared interests. I’ve certainly gotten to know some great people over the past couple of years with a relatively low investment of my time and energy, and I’m a better person for it!