How to enhance Google Adwords and Analytics Integration

What is the objective of this blog post –

In this post, you will learn how to customize your Google AdWords and Analytics integration to link your campaign information to areas of your site beyond the landing page and conversion.

How it makes life easier –

With the standard integration, you can analyze the performance of your AdWords campaigns along with the landing URL in terms of cost, conversion and revenue. But what if there are other variables in the user journey that you are testing apart from the lander?

By using this customization technique, you will be able to analyze how combinations of your AdWords campaigns and other areas of your site affect conversion, and see the complete picture.

Real life scenario –

Suppose an eCommerce company is using Google Optimize to test its landing pages as well as its product details pages.

The company wants to see which ad group, lander and product page combinations deliver the best results in terms of order conversions.

For this, they create a Google Sheets dashboard to automate the report for analysis through the Google Analytics Add-on, like this:

Now, with the standard integration, the company will not be able to see this entire journey. They will miss out on critical information and will instead see this ugly error:

But they don’t need to lose hope. DataVinci to the rescue. Let’s get cracking.

What is the recipe?

Ingredients:
  • 2 custom dimensions set at session scope (there can be more, depending on the number of variables you are playing with)
  • Custom parameters in the landing URL of your Google AdWords campaign
  • A 3rd custom dimension to capture the Google AdWords custom parameter, again set at session scope
Method:

First, enable the custom dimensions from the Google Analytics admin. Name them appropriately and note down their index numbers. Make sure to set their scope at session level.

Next, customize your Google AdWords campaign parameters by passing in any campaign-related information that you want to test. Let’s assume this information is the ad group and you are passing it in an “_adgroup” parameter, e.g. a final URL like https://www.example.com/landing?_adgroup={adgroupid} (the domain here is a placeholder; {adgroupid} is an AdWords ValueTrack parameter).

This video tutorial provides the steps to update the custom parameters in Google AdWords:

Now, customize your Google Analytics implementation to capture the data in the respective dimensions.

This video tutorial provides the steps to set up Google Analytics custom dimensions through Google Tag Manager:

https://www.youtube.com/watch?v=so3_bKY0mnM
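As a rough sketch, the Custom JavaScript variable that the tutorial sets up in Tag Manager might look like the following. The function name `getAdGroupParam` is our own illustration, and the `_adgroup` parameter name matches the example above; adapt it to whatever you used in AdWords.

```javascript
// Sketch of a GTM Custom JavaScript variable body: read the
// "_adgroup" query parameter out of a URL. In Tag Manager you would
// call this against the built-in {{Page URL}} variable.
function getAdGroupParam(url) {
  var query = url.split('?')[1] || '';
  var pairs = query.split('&');
  for (var i = 0; i < pairs.length; i++) {
    var parts = pairs[i].split('=');
    if (decodeURIComponent(parts[0]) === '_adgroup') {
      return decodeURIComponent(parts[1] || '');
    }
  }
  return undefined; // parameter absent: dimension stays unset
}
```

The returned value is what you would feed into the 3rd custom dimension.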

Now what?

OK, so when you set a custom dimension to session scope, the last value passed into it during a session gets associated with all the hits of that session. So, visualize this scenario in your head:

A visitor enters your site from your Google AdWords campaign. You have very smartly passed in custom information through custom URL parameters in your landing URL. Your Google Analytics account captures this custom parameter information and stores it in a custom dimension, and since this custom dimension is set at session scope, all the hits in your visitor’s session can be viewed against this value.

Also, on this landing page, we are capturing the landing URL in another custom dimension set at session scope. That means this information will also be available to use alongside other data points captured throughout the session.

Next, if the user navigates to the product page, we capture the version of the product page URL as well, again in a session-scoped dimension.
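Putting these pieces together, the on-page tagging can be sketched with a small helper. The function name and the dimension indexes 1–3 are assumptions for illustration; use the index numbers you noted from the GA admin.

```javascript
// Sketch: queue session-scoped custom dimensions before the hit is
// sent. The `ga` command queue is passed in so the helper is
// testable; dimension indexes are placeholders for your property.
function setCampaignDimensions(ga, landingVersion, productVersion, adGroup) {
  if (landingVersion) ga('set', 'dimension1', landingVersion); // lander version
  if (productVersion) ga('set', 'dimension2', productVersion); // product page version
  if (adGroup)        ga('set', 'dimension3', adGroup);        // AdWords ad group
}
```

On the landing page you would call it with the lander version and the ad group; on the product page, with the product page version. Because all three dimensions are session scoped, GA associates the last value seen with every hit in the session.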

Now we have the three important dimensions that we want to see together, broken down by the conversion metrics, to make a decision. And this is how the report will look now:

Through the above minimal setup, the eCommerce company can clearly analyze which combinations work better than others and push the winners to the broader audience.

Sweet.

Hope you liked this post on customizing your Google AdWords and Analytics integration.

How can we help?

Google Analytics is a very powerful tool when set up correctly. It can be customized to a great extent and provide sensational insights to optimize your digital assets.

If you need help with this, then we are a crazy team of Google and Adobe Certified Analytics Experts who eat, breathe, sleep and dream Analytics. And we’ve made it our purpose and passion to electrify your business’s potential through Analytics.

Contact us here.

Found it informative? Leave a comment! You can also give us a thumbs up by sharing it with your community. Also did you know that you can light up our day by subscribing to our blog? Subscribe here –

What is an affiliate website

Have you been researching affiliate marketing? Maybe you’ve heard it’s a great way to make money online and are interested in learning more about what an affiliate website is.

Or maybe you want to know how you can make your own income producing affiliate website today.

People want to know about affiliate marketing and how it works.

Why is this a hot topic?

Because affiliate websites can make you a ton of money.

And the majority of affiliate marketing programs require that you have your own website in order to join their affiliate network.

Another major perk to creating an affiliate website is that it is a very inexpensive business to start up.

You can look at the cost difference between an affiliate business and a traditional “brick and mortar” business right here.

You’ll see two major differences between these two business models.

Most notably, the cost of start-up and the implied risk.

We’re talking under $500 to $1,000 for the first year of an affiliate website business versus $10,000 to $100,000 for a traditional business.

With an affiliate website business, your risk is low and your earning potential is quite high.

Now there are a good amount of factors that go into making a successful affiliate website (and I’ll discuss this later below) but if you can imagine the potential in affiliate marketing, then you’re halfway there.

In today’s post, we’ll discuss:

  • The benefits to owning your own affiliate website.
  • What it takes to create a successful online business.
  • And give you a step-by-step guide to create your own successful affiliate website.

What Is An Affiliate Website Exactly?


An affiliate website is any website or blog that utilizes affiliate marketing techniques.

So what does that mean?


Well, any website with advertisement banners is a form of affiliate advertising.

Also, text links.

Whenever you see a link that takes you to a website where you can purchase a product or service, that, again, is another style of affiliate marketing.

Affiliate marketing is performance based advertising.

In other words, an affiliate website will only receive a commission if the visitor they send goes on and makes a purchase.

So for example, you click on a link from one of your favorite blogs or websites and that link sends you off to say…Amazon.

Or any eCommerce website for that matter where you can purchase…

  • Products
  • Services
  • Educational Classes, etc.

Then if you buy something from Amazon, the affiliate site would earn a commission from whatever products you purchase.

In affiliate marketing, there are four players.

1). The Merchant – The person providing the product or service. In our example above, this would be Amazon.

2). The Affiliate Network – Contains the products from the merchant or a series of merchants.

They also handle the payments and sales commissions. Again in our example, this would be a subsidiary company of Amazon, called Amazon Associates Program.

3). The Publisher – This would be the website owner who is publishing the content to market the product. If you look at my article, Does Amazon Sell Fake Products, and click on my Amazon link for the most popular computers, then go to Amazon and buy a computer, I would be the Publisher in this example who receives the Amazon commission.

4). The Customer – This would be the website visitor that clicks on the Publisher advertisement and then is taken to the sales page of the Merchant.

If you clicked my link above and bought something from Amazon, then you’d be the customer.

Affiliate marketing websites use tracking links and banners throughout their websites to market products.

If the website sees a lot of traffic, that means more potential customers which equates to higher potential sales and increased revenue for you.

Building affiliate websites is an art form. It’s far more than just throwing up a bunch of banner ads and affiliate links throughout your website.

Hope this was helpful.

Subscribe to our blog to receive regular content like this.


Analytics with React-Redux SPAs and Google Tag Manager

React-Redux has become a hugely popular web development combo, but there aren’t too many guides out there on how to sprinkle in analytics. Most implementations require some modification to your app’s code, often with analytics specific business logic.

The most common pattern seems to be with redux middleware, which definitely is a step in the right direction. The redux-analytics package encompasses this pattern nicely. Every redux action becomes a place where insights can be extracted, simply by appending some analytics information to the action metadata.

const action = {
  type: 'MY_ACTION',
  meta: {
    analytics: {
      type: 'my-analytics-event',
      payload: {
        some: 'data',
        more: 'stuff'
      }
    }
  }
};

This is a great start, and I had many of these analytics payloads throughout the codebase for a while, and it worked great. The problem was that changing pretty much anything required a redeployment. Plus, you’ll often have less tech-savvy users wanting to add their own insights.

We already had an integration with Google Tag Manager (gtm.js), so I was a little biased towards this implementation. This went double for the other departments who were already familiar with gtm.js, and we are now reaping the benefits: less development overhead when adding analytics insights.

Anyway, let’s get started on a basic Redux integration with gtm.js and my personal analytics platform of choice — Mixpanel.


Getting Started

If you’re not already familiar with gtm.js, you can simply inject its JavaScript snippet into your app and get going. All of the configuration is driven through the GTM web UI, which has come a long way over the years.

Now on the app side, the Redux middleware approach is still the way to go here:

const analytics = () => next => action => {
  // Mirror every redux action onto the GTM dataLayer
  window.dataLayer = window.dataLayer || [];
  window.dataLayer.push({
    event: action.type,      // GTM matches triggers on the `event` key
    payload: action.payload
  });
  return next(action);
};
// Add in the analytics middleware
let store = createStore(
  todoApp,
  applyMiddleware(
    analytics,
    thunk,
  )
);

Instead of dispatching analytics events from the application, it’s now firing everything to the gtm.js dataLayer. Each dataLayer event needs an event attribute to denote the type of event, but other than that you can structure your data format in any way that suits your application.

Now that’s pretty much it for the initial setup, assuming you already have the gtm.js snippet embedded in your application somewhere. Everything else can now be driven by the Tag Manager UI. I’ve started storing tags/triggers/variables in their own respective folders, but these can be changed at any time.


Creating the first event

To get started, let’s set up the beloved page-load events that management always seems to want. A typical React SPA usually has some form of client-side routing, so there needs to be a way to track both the initial page view (landing) and route transitions. To capture both of these, 2 triggers are required.

Create the trigger in some folder of your choice

First, create the tag for the page load. I used the Window Loaded trigger here, and named it Global.pageLoad for use later.

Create the first pageLoad event

Next, create the history change event, which will capture route transitions from your SPA router (e.g. react-router). This is similar to the Window Loaded event above, but with the History Change trigger selected instead.

Create a new tag Page View that triggers on either of these. I’ll be using Mixpanel throughout, but the same can apply to Google Analytics or your platform of choice.


Tracking authentication

The place where Mixpanel shines is tracking arbitrary events, with arbitrary (and sometimes changing) event attributes. This is the perfect behavior for a dynamic web application, and especially for the range of redux events that are fired.

In many applications, there’ll be some kind of authentication event fired. In my current app it’s structured as follows:

const authenticateAction = {
  type: 'AUTHENTICATE',
  payload: {
    user,
    token
  }
}

1. Create the trigger

This event is now available to use in Tag Manager as a custom event. Create a new trigger referencing this authenticate action:

The Event name should match the string type field in the redux action

2. Access the data

To access variables within your redux events, you need to create a Tag Manager variable for each primitive you want to access. Unfortunately there is no object dot notation access (yet).

Access the user id variable within the redux action

3. Send the analytics event

The complete authentication tag

Now that we have the trigger, and the data, we can send an analytics event. For user identity, this often varies per analytics-platform.

Create a new tag that uses the previous AUTHENTICATE event, along with the User.id variable. Inside a Custom HTML tag, the variable can be accessed using the {{VARIABLE}} notation.
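The body of that Custom HTML tag might look like the sketch below. This assumes the Mixpanel snippet is already loaded on the page; Tag Manager interpolates {{User.id}} with the dataLayer value before the tag fires.

```html
<script>
  // {{User.id}} is replaced by Tag Manager at fire time
  mixpanel.identify({{User.id}});
  mixpanel.track('Authenticated');
</script>
```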

Conclusion

That’s all there is to it to get started. Now try logging in to your application and you’ll see the identification action get triggered and sent to your analytics. Your analytics platform can now grow as your application grows, without littering the code base with metadata tags.

It’s just as easy to add other actions and variables, and create triggers that fire conditionally based on the value of a variable — all within Tag Manager.

Hope you liked this post.
You may subscribe to us for more good content, for free.
Best,

Non-Interaction Events in Google Analytics

So what are Non-Interaction Events in Google Analytics?

You already know about event tracking in Google Analytics and using it for everything from downloads to video plays. Maybe you’re using jQuery or Google Tag Manager to capture events.

One thing to note about events is that, by default, events affect the bounce rate. That is, if a user lands on a page and an event is triggered, they are not a bounce (even if they don’t view any subsequent pages). In many cases, that’s what you want: after all, if someone engages with the page in some way, you probably don’t want to count them as a bounce any more.

However, you have control over whether those events affect bounce rate. There’s a parameter you can send with the event data to decide this called the “non-interaction” parameter. In a case where a video auto-plays when someone lands on the page, for example, we might want to set the non-interaction parameter so that the bounce rate of that page isn’t zero.

Flagging Non-Interaction Events

The code for a non-interaction event is just a single parameter you set along with the event data.

For Classic GA:
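With ga.js, the non-interaction flag is the final boolean argument to `_trackEvent`. The category, action and label values below are illustrative:

```javascript
// ga.js command queue; the trailing `true` marks the event as
// non-interaction so it won't affect bounce rate.
var _gaq = _gaq || [];
_gaq.push(['_trackEvent', 'Videos', 'Autoplay', 'Home Page Video', 0, true]);
```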

For Universal Analytics:
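With analytics.js, the flag goes in the fields object as `nonInteraction`. The stub on the first line just queues commands so the snippet is self-contained before analytics.js loads; the event names are illustrative:

```javascript
// analytics.js: queue commands even before the library arrives.
var ga = ga || function () { (ga.q = ga.q || []).push(arguments); };
ga('send', 'event', 'Videos', 'autoplay', 'Home Page Video', { nonInteraction: true });
```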

Using Google Tag Manager, you can set the Non-Interaction Hit field to True in the event tag’s configuration.

Effect on Metrics

Non-interaction events are mostly referenced in regard to bounce rate, but they actually affect several metrics. Setting the non-interaction parameter has the following effects:

  • Bounce rate and time metrics (session duration and time on page) are no longer affected by the event.
  • The number of total events, unique events, sessions, users, etc. are counted normally.

Give us a shout if you need any help with Analytics by filling in the form below

Check out our Google Analytics solutions here

Check out our Adobe Analytics solutions here

Firebase Analytics vs. Google Analytics

Why Firebase Analytics?

Because Firebase Analytics was created exclusively for apps.

There is a reason why a lot of app development companies invest in creating in-house analytics tools rather than using vendor solutions.

Most digital analytics tools, including Google Analytics, were created in a “pre-mobile-app” era. A majority still cater primarily to website architectures. Mobile applications differ greatly from websites. Websites are click based.

Other than forms where text is typed, a click is expected to change the page being viewed. Events like video views or document downloads are afterthoughts bolted onto the basic page-based architecture. Apps, on the other hand, have several elements and types of interaction across one or multiple screens. Users can also trigger actions by tapping, swiping, or pinching in and out with multiple fingers. Users can thus interact with elements that trigger content without changing the screen they are on.

There are many apps out there whose interface is just a single screen; the traditional page of websites does not exist. Firebase Analytics has also solved a big question Google Analytics is still struggling with: how to uniquely identify a user who may access your product from multiple devices or browsers, or after long time gaps. For mobile apps, there is no need to force yourself into the website-centric language of Google Analytics.

Your analytics tool has to adapt to your business reality and not the other way around. Pure play app creators need a tool that understands users and events. Firebase Analytics is that solution.

Free unlimited event reporting in Firebase Analytics unlike Google Analytics

Another consequence of page- and session-based tools like Google Analytics is that events are an afterthought for them. Thus there are limits to how far one can go when analyzing events. Even paid solutions like Google Analytics Premium have a limit on how many events you can report on and analyze. Firebase, however, provides unlimited reporting for up to 500 distinct events.

Did we mention that Firebase is also free?

Funnel analysis makes much more sense in Firebase Analytics than in Google Analytics

The traditional page flow analysis involves analyzing the sequence of pages visited in a session prior to the desired outcome. This is not very useful, because visitors don’t “follow” the path that we want them to follow.

The correct way to analyze user behavior flow is to identify the critical actions taken at every step of the conversion process. Since Firebase is based on events and not on screen views, it allows you to create funnels based on events which give much more value than the page-view based funnels Google Analytics has.

You can connect Firebase Analytics to Google Analytics

Imagine tomorrow your organization decides to use websites in addition to the apps. Alternatively, let us say you still have stakeholders who understand only the language of Google Analytics. Firebase has a tight integration with Google Analytics. Connect your Firebase data to Google Analytics and see your Firebase analytics reports without leaving the Google Analytics user interface.

Unlike Google Analytics, Firebase Analytics is much more than an app analytics tool

Firstly, Firebase is a mobile and web application platform. Its original product was a real time database. Along with the famous Firebase Analytics, it also has services and infrastructure designed to help developers build high-quality apps.

Firebase features can be mixed and matched by developers to fit their needs. After Firebase was acquired by Google in October 2014, it expanded into a full suite for app development, with many Google products like AdMob also integrated into it. You can take a look at what Firebase has apart from Analytics here. Firebase as a backend service is one of the fastest-growing businesses in the Android market.

Unlike Google Analytics, Audiences can be used through the rest of the Firebase Analytics platform

Firebase audiences are like segments in Google Analytics. Additionally, Firebase enables audience-specific push notifications and app configuration changes to be sent out without having to collate that information separately. You can identify custom audiences in the Firebase console based on device data, custom events, or user properties. These audiences can be used with any of the other Firebase features mentioned above.

By investing in Firebase, you will be investing in many more tools from Google that help with app development and monetization. With the purchase of the Fabric developer platform from Twitter last month, Firebase has again come into the spotlight. Fabric’s reach of 580,000 developers will grow the user base of Firebase. If your digital strategy is app-driven, Firebase is the right analytics tool for you.

You might like this video on Firebase :

Give us a shout if you need any help with mobile App Analytics by filling in the form below

Check out our Google Analytics solutions here

Check out our Adobe Analytics solutions here

Bucket Testing

What Is Bucket Testing?

Bucket testing (sometimes referred to as A/B testing or split testing) is a method of testing two versions of a website against one another to see which one performs better on specified key metrics (such as clicks, downloads or purchases).

There are at least two variations in each test, a Variation A and a Variation B. Metrics from each page variation are measured, and visitors are randomly placed into respective ‘buckets’ where the data can be recorded and analyzed to determine which performs best.
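In practice, that random placement is usually made deterministic per visitor, so the same person always sees the same variation. A toy sketch of such bucketing logic — the `assignBucket` helper and its hash are illustrative, not what any particular testing tool uses:

```javascript
// Deterministically assign a visitor to a bucket by hashing their id,
// so repeat visits land in the same variation.
function assignBucket(visitorId, variations) {
  var hash = 0;
  for (var i = 0; i < visitorId.length; i++) {
    hash = (hash * 31 + visitorId.charCodeAt(i)) >>> 0; // unsigned 32-bit
  }
  return variations[hash % variations.length];
}
```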

Companies that market and sell products or services online rely on bucket testing to help them maximize revenue by optimizing their websites and landing pages for conversions.

How It Works: An Example

Let’s look at a hypothetical example. Each bucket test begins with a hypothesis that a certain variation on a landing page will perform better than the control. Say you have an existing landing page for a free nutrition eBook, Eat Raw Foods and Live Longer.

The button on the bottom of your landing page’s sign-up form says ‘Submit,’ but your hypothesis is that changing the text to ‘Get Your Free Copy’ will result in more form conversions. The existing page with the ‘Submit’ button is the control, or Variation A. The page with ‘Get Your Free Copy’ on the button is Variation B. The key metric you will measure is the percentage of visitors who fill out the form, or a form completion.

Because you have an ad campaign driving several thousand visitors a day to your landing page, it only takes a few days to get the results from your bucket test. It turns out that ‘Get Your Free Copy’ has a significantly higher click rate than ‘Submit,’ but the form completion rate is basically the same. Since the form completion rate is the key metric, you decide to try something different.

Bucket Tests & Conversion Optimization

Bucket testing plays a big role in conversion rate optimization. Running a bucket test allows you to test any hypothesis that can improve a page’s conversions. You can continue to try higher-converting button text for Eat Raw Foods and Live Longer or you can go on to test other hypotheses, such as bolder headline copy, more colorful imagery or arrows pointing to the sign-up button that will get more people to convert.

Companies spend millions of dollars to drive traffic to landing pages and websites that promote their product or service. With simple variations to page copy, imagery, and layouts, you can conduct a series of bucket tests to gather data and to iterate towards your highest-performing version of the page. You simply create variations of the page, changing one element at a time and measuring key metrics, then collect the results until reaching statistically significant results for each experiment.

Bucket testing can make a significant impact on conversions per page, resulting in revenue increases on your highest-trafficked pages.

Bucket testing can also help to eliminate subjective opinions as deciding factors in a page’s design or layout. The author of Eat Raw Foods and Live Longer may think that her photo will drive more customer demand – or she may insist on a rainbow palette of colors.

With bucket testing, there is no need for debate on what design or page elements will work best to convert a customer. The quantitative data will speak for itself, and drive the decision for you.

Tests should be prioritized to run on your most highly trafficked pages, since you may need hundreds or thousands of visitors to each variation to gather statistically significant data. The more traffic a page receives, the quicker you will be able to declare a winner.

Common Page Elements To Test:

  • Headlines and sub-headlines: varying the length, size, font and specific word combinations
  • Images: varying the number of images, placement, type of imagery (photography vs. illustration) and subject matter of imagery
  • Text: varying the number of words, style, font, size and placement
  • Call-to-action (CTA) buttons: varying common ones such as ‘Buy Now,’ ‘Sign Up,’ ‘Submit,’ ‘Get Started,’ or ‘Subscribe’ and varying sizes, colors and page placement
  • Logos of customers or third party sites: build credibility and convey trustworthiness (could include Better Business Bureau, TRUSTe or VeriSign logos as well as customer logos)

Give us a shout if you need any help with A/B testing by filling in the form below

Check out our Google Analytics solutions here

Check out our Adobe Analytics solutions here

Split Testing

Split Testing Simplified

Split testing (also referred to as A/B testing or multivariate testing) is a method of conducting controlled, randomized experiments with the goal of improving a website metric, such as clicks, form completions, or purchases. Incoming traffic to the website is distributed between the original (control) and the different variations without any of the visitors knowing that they are part of an experiment. The tester waits for a statistically significant difference in behavior to emerge. The results from each variation are compared to determine which version showed the greatest improvement.
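“Statistically significant” here usually means something like a two-proportion z-test on the conversion rates. A hedged sketch of the arithmetic (real testing tools use more careful statistics, e.g. corrections for peeking at results early):

```javascript
// Two-proportion z-score: how many standard errors apart the
// variation's conversion rate is from the control's.
function zScore(convControl, nControl, convVariation, nVariation) {
  var pC = convControl / nControl;
  var pV = convVariation / nVariation;
  var pPooled = (convControl + convVariation) / (nControl + nVariation);
  var se = Math.sqrt(pPooled * (1 - pPooled) * (1 / nControl + 1 / nVariation));
  return (pV - pC) / se;
}
// |z| > 1.96 corresponds to roughly 95% confidence.
```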

This marketing methodology is frequently used to test changes to signup forms, registration pages, calls to action, or any other parts of a website where a measurable goal can be improved. For example, testing changes to an online checkout flow would help to determine what factors increase conversions from one page to the next and will lead to increased orders for the website owner.

Seemingly subjective choices about web design can be made objective using split testing, since the data collected from experiments will either support or undermine a hypothesis on which design will work best. Demonstrating ROI (return on investment) for a testing platform can be measured easily because tests are created with a quantifiable goal in mind.

Split testing results

Split testing tools allow for variations to be targeted at specific groups of visitors, delivering a more tailored and personalized experience. The web experience of these visitors is improved through testing, as indicated by the increased likelihood that they will complete a certain action on the site.

Within webpages, nearly every element can be changed for a split test. Marketers and web developers may try testing:

  • Visual elements: pictures, videos, and colors
  • Text: headlines, calls to action, and descriptions
  • Layout: arrangement and size of buttons, menus, and forms
  • Visitor flow: how a website user gets from point A to B

Some split testing best practices include:

  • Elimination: fewer page elements create fewer distractions from the conversion goal
  • Focus on the call to action: text resonates differently depending on the audience
  • Aim for the global maximum: test with the overarching goal of the website in mind, not the goals of individual pages
  • Provide symmetric and consistent experiences: make testing changes consistent throughout the visitor flow to improve conversions at every step of the process

Habitual testing for a website owner or business helps to build a culture of data-informed decision-making that takes into account audience preferences. Each click on a website is a data point from a potential customer. Conflicting opinions can be put to the test with split testing methodology, and the visitors to the website will inform the final decision on the “best” design.

Split testing Process

Split testing is equivalent to performing a controlled experiment, a methodology that can be applied to more than just web pages. The concept of split testing originated with direct mail and print advertising campaigns, which were tracked with a different phone number for each version. Currently, you can split test banner and text ads, television commercials, email subject lines, and web products.

Hope you liked the post.

Give us a shout if you need any help with A/B testing by filling in the form below

Check out our Google Analytics solutions here

Check out our Adobe Analytics solutions here

Google Tag Manager (GTM) for mobile apps

Google Tag Manager (GTM) for Mobile Apps was first announced in August this year and has some great implications for app developers.

Perhaps most notably, the product has the potential to overcome one of the most critical challenges in the business: pushing updates to the user base without having to publish a new version on the app marketplace.

Typically, from the moment an app is shipped it is frozen, and from that point onwards the developer can only make changes to how the app behaves if the user accepts an update. By shipping an app with GTM implemented, configurations and values may be continuously updated by publishing new container versions through the web-based GTM interface.

In this post, we will cover how to get started with GTM for mobile apps and how to implement Universal Analytics tags using the GTM SDK for Android. As a heads up, this will occasionally get pretty technical; however, I believe it is important to understand the product from its fundamentals.

Initial Set Up

Before we get started, some initial configuration steps need to be completed. More detailed instructions on these are available in the Google Developers Getting Started guide, but in a nutshell they include:

  • Downloading and adding the GTM library to our app project
  • Ensuring our app can access the internet and the network state
  • Adding a Default container to our app project

We will hold back on that last part, adding a Default container, until we have created some basic tags and are ready to publish. We will revisit the Default container later in this post.

Create an App Container

We need to start off by creating a new container in Google Tag Manager, selecting Mobile Apps as the type. Typically, we will have one container for each app we manage, where the container name is descriptive of the app itself (e.g. “Scrabble App”). Take note of the container ID at the top of the interface (in the format “GTM-XXXX”) as we will need it later in our implementation.

App container for mobile app

Opening a Container

Assuming we have completed the basic steps of adding the Google Tag Manager library to our project, the first thing we need to do before we start using its methods is to open our container.

Similarly to how we would load the GTM javascript on a webpage to access a container and its tags, in an app we need to open a container in some main app entry point before any tags can be executed or configuration values retrieved from GTM. Below is the easiest way of achieving this, as outlined on the Google Developers site:

ContainerOpener.openContainer(
    mTagManager,                   // TagManager instance.
    "GTM-XXXX",                    // Tag Manager container ID.
    OpenType.PREFER_NON_DEFAULT,   // Prefer not to get the default container, but stale is OK.
    null,                          // Timeout period. Default is 2000ms.
    new ContainerOpener.Notifier() {  // Called when container loads.
      @Override
      public void containerAvailable(Container container) {
        // Handle assignment in callback to avoid blocking main thread.
        mContainer = container;
      }
    }
);

Before we talk about what this code does, let’s hash out the different container types to avoid some confusion:

  • Container from network: Container with the most recent tags and configurations as currently published in the GTM interface
  • Saved container: Container saved locally on the device
  • Fresh vs. Stale container: Saved container that is less vs. greater than 12 hours old
  • Default container: Container file with default configuration values manually added to the app project prior to shipping

We will talk more about the Default container later on. Back to the code. In this implementation, the ContainerOpener will return the first non-default container available. This means that we prefer to use a container from the network or a saved container, whichever is loaded first, because they are more likely to hold our most updated values. Even if the returned container is Stale it will be used, but an asynchronous network request is also made for a Fresh one. The timeout period, set as the default (2 seconds) above, specifies how long to wait before we abandon a request for a non-Default container and fall back on the Default container instead.

We may change the open type from PREFER_NON_DEFAULT to PREFER_FRESH, which means Google Tag Manager will try to retrieve a Fresh container either from the network or disk. The main difference is hence that a Stale container will not be used with PREFER_FRESH unless no other container is available or the timeout period is exceeded. We may also adjust the timeout period for both PREFER_NON_DEFAULT and PREFER_FRESH, but we should think carefully about whether longer request times negatively affect the user experience before doing so.

Tag Example: Universal Analytics Tags

We have completed the initial set up and know how to access our Google Tag Manager container. Let’s go through a simple example of how to track App Views (screens) within our app using Universal Analytics tags.

Step 1: Push Values to the DataLayer Map

The DataLayer map is used to communicate runtime information from the app to GTM, in which we can set up rules based on key-value pairs pushed into the DataLayer. Users of GTM for websites will recognize the terminology. In our example, we want to push an event whenever a screen becomes visible to a user (In Android, the onStart method is suitable for this). Let’s give this event the value ‘screenVisible’. If we want to push several key-value pairs, we may utilize the mapOf() helper method as demonstrated below. In this case, since we will be tracking various screens, it makes sense to also push a value for the screen name.

public class ExampleActivity extends Activity {

  private static final String SCREEN_NAME = "example screen";
  private DataLayer mDataLayer;

  @Override
  public void onStart() {
    super.onStart();
    mDataLayer = TagManager.getInstance(this).getDataLayer();
    mDataLayer.push(DataLayer.mapOf("event", "screenVisible",
                                    "screenName", SCREEN_NAME));
  }
  //..the rest of our activity code
}

We may then simply paste this code into every activity we want to track as a screen, replacing the SCREEN_NAME string value with the relevant name for each activity (“second screen”, “third screen”, etc.).

Note: the container must be open by the time we push values into the DataLayer or GTM will not be able to evaluate them.

Step 2: Set Up Macros In Google Tag Manager

Simply put, macros are the building blocks that tell GTM where to find certain types of information. Some macros come pre-defined in GTM, such as device language or screen resolution, but we may also create our own. First of all, we want to create a Data Layer Variable macro called screenName: this matches the key of the screen name value we push along with the event, as demonstrated above.

GTM will then be able to evaluate the screenName macro, which can consequently be used in our tags. If we have not done so already, we may also create a Constant String representing our Analytics property ID at this point. These macros are now at our disposal in all container tags.

Macros for Mobile Apps

Step 3: Configure an App View Tag

Let’s set up our Universal Analytics App View tag. Our configurations are visible in the screenshot below (note the use of our newly created macros). The screen name field value of the App View will be automatically populated and corresponds to what we push to the DataLayer as the value of the screenName macro. The gaProperty macro value specifies which Google Analytics property data should be sent to (by reusing it throughout our container, for every Universal Analytics tag, we can both save time and prevent some critical typos).

Tag Manager app view tag

Step 4: Configure a Firing Rule For Our Tag

Finally, we need to set up the conditions under which the tag should execute. Since we are pushing an event with the value "screenVisible" every time an activity becomes visible, this is the condition under which our tag should fire, as demonstrated below.

Tag Manager firing rule

Step 5: Save and Publish

We can continue to create other tags at this point. It may be beneficial, for example, to create some Google Analytics Event tags to fire on certain interactions within our app. We should apply the same logic in these instances: We need to push various event values to the DataLayer as interactions occur, and then repeat the steps above to create the appropriate Universal Analytics tags. When we’re happy, all that’s left to do is to create a new version of the container and Publish.

Tag Manager version

As we ship our app with Google Tag Manager implemented, requests will be made to the GTM system to retrieve our tags and configuration values as we discussed earlier.

Hold on, there was one more thing: the Default container!

Default Containers

When we are finished with our initial Google Tag Manager implementation and feel happy with the tags we have created, we are almost ready to ship our app. One question should remain with us at this point: what do we do if our users are not connected to the internet and hence unable to retrieve our tags and configurations from the network? Enter the Default container.

Let’s back up a little bit. In the GTM world, tag creation, configuration, settings, etc. are primarily handled in the web-based GTM interface. The power of this is obvious: we no longer need to rely on our development teams to push code for every change we want to make. Instead, we make changes in the GTM interface, publish them, and our tags and values are updated accordingly for our user base. This of course relies on the ability of our websites or applications to reach the GTM servers so that the updates can take effect. Things get trickier here for mobile apps, which partly live offline, than for websites.

To ensure that at least some container version is always available to our app, we may add a container file holding our configuration values to the project. This can be a .json file or a binary file, the latter being the required type to evaluate macros at runtime through GTM rules. We may access the binary file of our container through the GTM user interface by going to the Versions section. Here, we should download the binary file for our latest published container version and add it to our project.

create tag manager version

The binary file must be placed in a /assets/tagmanager folder, and its filename must be our container ID (in the format "GTM-XXXX"). At this point, we should have both the JAR file and the binary file added to our project as shown below.

Mobile app tag manager files

Once this is done, we are ready to ship the app with our Google Tag Manager implementation. As described earlier, Fresh containers will be requested continuously by the library. This ensures that, as we create new versions of our container and publish them in the web-based GTM interface, our user base will be updated accordingly. As a back-up, without any access to a container from either the network or disk, we still have the Default container stored in a binary file to fall back on.

Summary

Let’s summarize what we have done:

  1. After completing some initial configuration steps, we created a new app container in the web-based GTM interface
  2. We figured out how to open our container as users launch our app, choosing the most suitable opening type and timeout value (taking into consideration user experience and performance)
  3. We then implemented code to push an event to the Data Layer as various screens become visible to our users, setting up a Universal Analytics App View tag in GTM to fire every time this happens
  4. We downloaded the binary file of our container and added it to our app project to be used as a Default container
  5. Lastly, we created and published our container in GTM

We are now ready to ship our application with GTM implemented!

Closing Thoughts

Google Tag Manager for mobile apps can be an incredibly powerful tool. This basic example shows how to implement Universal Analytics using this system but barely scratches the surface of what is possible with highly configurable apps that are no longer frozen. Simply put, getting started with GTM for mobile apps today sets businesses up for success in the future, and I recommend trying it out as soon as possible.

I would love to hear your thoughts around Google Tag Manager for mobile apps. What are your plans for (or how are you currently) using it?


Digital Analytics on Permanent Roommates

Yo,

Permanent Roommates is a web series created by TVF and produced by CommonFloor.com.

Let's start by understanding the business objective of CommonFloor.com, which is an online real estate lead generation portal. At the time of writing this post, the online real estate market in India is nearly Rs. 250 crore and is expected to grow at 50-100% CAGR.

Now, who are the customers of CommonFloor.com? People looking for properties? Nope. It's the brokers and real estate developers.

These portals work on a service-based model: brokers and developers subscribe to their packages and in return get leads from the portals.

So, what is the fundamental key performance indicator for these websites? Yep, no brainer – the leads: the trend of leads, the total leads generated during various time frames, and so on.

So, the more leads they generate the more they can attract the brokers and developers.

And the more property seekers they can attract to the website, the more leads they can generate. Sounds simple, but as usual it is not.

The reasons –

1. There are presently 9 players in the online real estate market – so, obviously, the property seeker has no dearth of options

2. There are presently 9 players in the online real estate market 🙂 The brokers and developers (OK, only the brokers; the developers are, you know, pretty happy) will spend their money only after considerable contemplation before subscribing to the services of any of these portals.

So now, what are these portals to the property seekers? Basically, they act as a research tool. So, essentially, the portals need to offer more variety, more choices, an easy-to-use interface and authentic listings. Cool.

But, how to attract the users?

Generating more brand awareness and understanding the property seekers' journey and experience on the portal. Now, my website is my shop, and as things get more and more digital, the human contact loses out. So, if a user comes to one website, pukes and leaves for another website, there is no one to ask them, "What happened? Didn't you like the colour?" And here, my friend, you need people like me 🙂 the analytics guys who will evaluate the entire customer journey, research it and give the answer to

“Mirror Mirror on the Wall, Who Is the Fairest of Them All?”

Ummm.. OK, I am finding this interesting. Let's make an analytics map for CommonFloor.com.

So, what is a regular customer journey?

To generate awareness, of course, they have to communicate across various customer touchpoints. The collaboration with TVF on YouTube is one such touchpoint.

As far as my understanding goes, they are trying to do two things: one, generate brand awareness, and two, promote their app downloads.

Hmm, so let's make a layered cake of descriptive analytics.

But first, let's set the goals. I strongly believe in doing this. Goals are essentially the motivation behind doing something, and the more focused we are with the goals, the better the returns on our efforts.

Again, what is the ultimate goal of CommonFloor.com? ……Lead Generation (macro goal)

What would be one step behind this goal conversion?……Getting the visitor to the property listing page (micro goal 1)

What would be one step behind this?……Getting the visitor to search for a category (micro goal 2)

What would be one step behind this?……Acquiring the visitor (micro goal 3)

So, my analytics layered cake should satisfy the taste buds with the above four goal flavors.

Let's start with micro goal #3. Aha!

In all the digital analytics tools, we have the birthright to set any goal completion as a metric. So, to analyse the completion of the above goals, I will configure them in my analytics tool as metrics.

Since we are analyzing a marketing campaign, the analysis will revolve around that.

I should add here that TVF has shared the commonfloor.com URL in their description but has not appended any campaign code to the URL. Explaining the consequences of that is beyond the scope of this particular entry.

OK, in layer one of my analytics cake, I want to cook reports that tell me the total number of brand impressions through the TVF videos.

So, the brand mentions are made once at the start, once somewhere in the middle and once at the end.

As CommonFloor's marketing manager, I will ask TVF to share their YouTube Analytics data with me to see the percentage of video completes and the corresponding numbers: how many people saw 50% of the video, how many saw 75%, how many 100%.

So my total impressions will be: (number of views where less than 30% of the video was seen) + 2 × (number of views where more than 30% but less than 100% was seen) + 3 × (number of views where the complete video was seen).

I used 30% here because, in the last video, the brand was mentioned somewhere near the 30% mark; this will of course vary from video to video.
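The weighting above can be sketched as a tiny piece of JavaScript. The view counts below are hypothetical, and the weights assume viewers saw one, two or three brand mentions respectively:

```javascript
// Weighted brand impressions: viewers who watched less than 30% saw one
// mention, viewers between 30% and 100% saw two, and complete views saw
// all three mentions.
function totalBrandImpressions(underThirty, thirtyToHundred, complete) {
  return underThirty * 1 + thirtyToHundred * 2 + complete * 3;
}

// Hypothetical numbers from a YouTube Analytics export:
var impressions = totalBrandImpressions(1000, 600, 400); // 1000 + 1200 + 1200 = 3400
```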

So, this will give me total impressions. Layer one baked.

For the second layer we need to take a certain leap of faith. But, yeah it won’t taste bad either.

I would like to see the rise in traffic on my website after the launch of the videos.

Now, I would look at the average traffic volume over certain periods (daily, weekly, monthly) before the first video was launched. Fine.

Now, any rise in traffic volume above the volume I have been regularly experiencing can be attributed to the video launch. Again, as I said, we have to take a leap of faith here.

Now, the traffic can come to my site in four ways.

One, after clicking on the commonfloor’s link provided in the description of the TVF videos.

Two, by typing the commonFloor’s address in the browser.

Three, sometime later after watching the video, maybe through a different device.

Four, they have not seen the TVF video, but someone who has seen it recommends the site to them, like Tanu does with Mikesh.

So, we need to consider the above four cases to give TVF a justified credit.

For case one, all the web analytics tools have referral and campaign reports, which tell you about the traffic brought by a particular referrer or campaign. The tools also provide attribution reports, which credit the success of a goal (the metric we set) to the channels through which the visitor came to my site.

First, as mentioned above, I will set a metric to count my micro goal #3, that is, the number of times my landing page was loaded.

Next, I can open the Acquisition > All Traffic > Source/Medium report to see the number of visitors YouTube has brought to my site and what percentage of those visitors are new, that is, coming to my site for the first time. Isn't this great?

For cases two, three and four, the leap of faith comes into the picture: this is the traffic that comes directly to my site without clicking on the link provided in the TVF video.

All the web analytics tools will consider this traffic as "Direct". Thus you need to add a temporal context to your analysis and compare the direct traffic you have been getting pre and post video launch.

You can use the attribution tools for deeper analysis.

So, with this I have my two layers. Layer one the brand impressions through the communication and layer two the traffic that those brand impressions brought.

Micro goal #3 delivered. Let's set sail for micro goal #2.

My micro goal #2 is getting the visitor to search for a category.

Now, on CommonFloor.com, the following is the space where the visitor submits the requirement –

The above is on the home/landing page.

The next is on the listing page –

First, I will work with my developer to set an event that sends data to the analytics server every time a user submits a query, be it on the homepage or the listing page.

Then I will set a goal metric to track the number of times a query was submitted by a user.

I will again open my source/medium reports to see how the users coming from YouTube are completing this particular goal.

Again, I will compare my direct traffic's goal completions pre and post the video launches.

The above is a screenshot of the report I am trying to explain. As you can see, you can choose your goal from the drop-down, so I can set it to micro goal #2 for this case.

Layer #3 baked..Tada!

Now, layer 4 for my micro goal #1 – getting the visitor to my listing page.

You could ask why I did not simply drop goal #2, since I am anyway tracking goal #1, which cannot happen without goal #2.

The reason is that at times the listing page might not load, or users may submit a query my site cannot satisfy.

So, I want to see the conversion from users submitting a query to my site satisfying it.

Layer 4 is pretty simple. Set a goal metric to count the number of times the listing page was loaded, open the source/medium report, set the goal to micro goal #1 and see the performance.

And finally the top layer. The platform for my insightful bride and groom.

Have a look below at how a user submits a lead –

So, I will work with my developer to set an analytics event every time the submit button is clicked, and I will set up a goal metric to count the number of times a submission was made. Again, the procedure is the same: open the source/medium reports and check out the performance.

Alright, now my choice of a layered cake has a reason. Look at the structure of a layer cake –

The base has the largest area, and it gradually reduces as we go up, which is also the case with the journey of customers who come to our site. To find the effectiveness of the campaign, we should consider the ratios of each of these steps.

  • % Micro goal #3/Impressions = % (total visits to my site with the association)/(total impressions in the video)
  • % Macro goal/Micro goal #3 = % (total leads generated by the campaign)/(total visits by the campaign)
  • % Macro goal/Micro goal #1 = % (total leads generated)/(total number of times the listing page was viewed)
    • This tells me the effectiveness of my listing page and the quality of my listings.
    • I would also watch the trend of this ratio: how is it trending as my site becomes bigger?
  • % Micro goal #1/Micro goal #2 = % (listing page loads)/(queries submitted)
  • % Micro goal #1/Micro goal #3 = % (listing page loads)/(landing page loads)
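Once the goal totals are exported from the analytics tool, the ratios above are straightforward to compute. A small sketch with hypothetical numbers:

```javascript
// Helper: express a funnel step as a percentage string.
function pct(numerator, denominator) {
  return (100 * numerator / denominator).toFixed(1) + "%";
}

// Hypothetical goal totals pulled from the analytics tool:
var impressions = 50000; // layer 1: brand impressions
var visits      = 4000;  // micro goal #3: landing page loads
var queries     = 2500;  // micro goal #2: queries submitted
var listings    = 2200;  // micro goal #1: listing page loads
var leads       = 300;   // macro goal: leads generated

console.log(pct(visits, impressions)); // visits per impression: "8.0%"
console.log(pct(leads, visits));       // leads per visit: "7.5%"
console.log(pct(leads, listings));     // listing page effectiveness
console.log(pct(listings, queries));   // queries satisfied by the listing page
```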

All the web analytics tools have some sort of goal flow report. I would definitely look at it to see the goal completion journey of my visitors.

I would also like to track a few other metrics, like the bounce rate and time spent per visit.

I would also like to track the location of my users to see in which geographies this campaign has been popular.

Alright, I am done 🙂

Hope you liked reading this one as much as I liked writing it. I would sincerely love it if you could give me feedback.

I would be extremely grateful if you can share this post.

Till then 🙂

Adobe Analytics Configuration Variables

So, first of all, what are configuration variables? Let's Google "SiteCatalyst Configuration Variables".

Webmetric.org has provided pretty exhaustive content. Much appreciated. Usually there is just copy-paste from the SiteCatalyst PDFs, but this one is helpful.

Another good write-up can be found at AnalyticCafe.

So, one general understanding after skimming through the content is that the configuration variables are the fundamental settings for the Analytics JavaScript library. In most cases you will not change them at a page level (exception: link tracking). Secondly, they don't directly collect data but rather affect data collection. You will not find a report for report suite IDs or currencies by default in SiteCatalyst.

Let's have a look at the list of configuration variables –

  1. s.account
  2. currencyCode
  3. charSet
  4. trackDownloadLinks
  5. trackExternalLinks
  6. trackInlineStats
  7. linkDownloadFileTypes
  8. linkInternalFilters
  9. linkExternalFilters
  10. linkLeaveQueryString
  11. linkTrackVars
  12. linkTrackEvents
  13. cookieDomainPeriod
  14. fpCookieDomainPeriod
  15. doPlugins
  16. usePlugins
  17. dynamicAccountSelection
  18. dynamicAccountMatch
  19. dynamicAccountList

So, that brings the total to 19 configuration variables over which we have control.

Tomorrow I will tick them off one by one from this list.

Ok, Aug 21st – Time flies!

So, it's 10:35 AM, I am back from the gym and now it's time for some analytics heavy lifting. I start my daily learning with some blogs, alternating between Google Analytics and SiteCatalyst. Today it's Google Analytics.

So let's see what Justin Cutroni has for us today.

I read Advanced Content Tracking with Universal Analytics. Go check it out. He has written JavaScript to track events for how content is being read by readers on a site. Pretty sweet and simple. I also need to learn how to write code like that.

Now, coming back to Configuration Variables –

1. s.account

The report suite is the most fundamental level of classification you apply to analytics data. Each report suite has a unique ID. s.account is nothing but a way to tell the Analytics data collection server which report suite the data should be sent to, and we do that through the unique ID. Keep in mind that the report suite name is different from the report suite ID: the scope of the report suite name extends only to your account, while the scope of the report suite ID extends to the Adobe data collection servers.

For a single suite tagging, here is an example –

s.account = "AwesomeAnalytics"

For multi suite tagging use a comma separator –

s.account = "AwesomeAnalytics,SuperAwesomeAnalytics"

Other things to keep in mind –

  • Each ID has a total length of only 40 bytes
  • Don't use spaces while declaring multiple IDs
  • A unique ID should contain only alphanumeric characters, the only exception being the hyphen "-"

Well, that's pretty much it about s.account; let's look at the next variable – currencyCode

Now, this is one variable which should ideally be set in the s_code. Why am I saying this? Coz it's non-persistent, so you gotta send this information with every image request. Now, in most cases the currency of the website is kinda constant, and even if it changes you can write a small piece of code with certain if-else statements and assign the currency accordingly. Thus, try to declare it in the s_code itself.
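That small if-else could look something like this. The domain-to-currency mapping here is entirely hypothetical; the point is just that the value is derived once, in the s_code, rather than on every page:

```javascript
// Pick a currency code from the hostname so s.currencyCode can be set once
// in the s_code. The domains and mapping below are made-up examples.
function pickCurrencyCode(hostname) {
  if (hostname.indexOf(".jp") !== -1) return "JPY"; // Japanese site
  if (hostname.indexOf(".se") !== -1) return "SEK"; // Swedish site
  return "USD"; // default / US site
}

// In the s_code: s.currencyCode = pickCurrencyCode(window.location.hostname);
```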

But yes, why is it used? It is basically used to facilitate currency conversions. In case you were not aware, SiteCatalyst offers a base currency for each report suite, which can be different from the currency in which an item is sold on the website. Now, if the currencies are different, the reported revenue needs converting; that is, we need something to facilitate conversions. currencyCode does just that. Example –

You have a global website, but your head office is in the US. So, to track your performance, you will prefer reports in US dollars. But across the world you are not selling in dollars. Let's suppose you are making sales in Japan. On the Japanese website you will set s.currencyCode = "JPY", and you can have the base currency of your report suite as dollars. Now, SiteCatalyst will automatically convert the revenue from yen to dollars in your reports. That's really sweet.

Few things to keep in mind –

There are certain currencies, like the Swedish krona, that don't use the period "." but use the comma "," as the decimal separator. But periods and commas have different meanings to SiteCatalyst. So, only use periods, and not commas, while feeding revenue data to any variable.

The default value for this is USD. If you don't require any conversion, you can simply ignore this.

The debugger parameter is CC

Ok, the next one – charSet

This one is only to be used when you are going to feed data containing non-ASCII characters into variables. Here is a link to the ASCII table – I AM ASCII TABLE. So, just cross-check what you are feeding to the variables.

SiteCatalyst uses two character sets – UTF-8 and ISO-8859-1

This one plays a very crucial role for tagging international websites whose character set is beyond the standard ASCII characters.

The charSet declaration is used by the Analytics servers to convert the incoming data into UTF-8.

The value in s.charSet should match the value of the charset declared in the meta tag or HTTP header.

As we know, the reports in SiteCatalyst can be populated in multiple languages. When we choose any non-English language, SiteCatalyst uses UTF-8 encoding.

Each Analytics variable has a defined length limit expressed in bytes. For standard report suites, each character is represented by a single byte; therefore, a variable with a limit of 100 bytes also has a limit of 100 characters. However, multi-byte report suites store data as UTF-8, which expresses each character with one to four bytes of data. This effectively limits some variables to as little as 25 characters in languages such as Japanese and Chinese that commonly use between two and four bytes per character.

The JS file must define the charSet variable. (All page views and traffic are assumed to be standard 7-bit ASCII unless otherwise specified.) Setting the charSet variable tells the Analytics engine what language should be translated into UTF-8. Some language identifiers used in meta tags or JavaScript variables do not match up with the Analytics conversion filter.

I will take ahead from here tomorrow.

Okay, Aug 22nd, 11:17 AM

The one thing that I believe matters most in becoming successful is consistency.

First, time for some blog read. SiteCatalyst today.

I found this post by Adam Greco really awesome –

http://adam.webanalyticsdemystified.com/2010/05/17/crm-integration-2-passing-crm-data-to-web-analytics/

Aug 24th, 6.30 PM, Sunday

I have been a bit pressed for time for the last two days. Actually, I was pressed only on Friday; yesterday I did not do anything that productive except the gym. It was the weekend, so I could train guilt-free. By the time I left the gym last night, I was so tired I could have slept in the gym itself 😛

Anyway the next configuration variable – trackDownloadLinks

This one is pretty simple and sweet. It does two things –

  1. As the name suggests, provides the provision to track download links
  2. Affects the clickMap data for the downloadable links

The implementation is pretty simple. Its value is either true or false –

s.trackDownloadLinks = true

I wonder why anyone would keep it false. So, if it's set to true, any time someone clicks on a downloadable link, data is sent to the Analytics engine. What data? The name of the link and the click instance.

When it's combined with clickMap, the data will be shown along with the page on which the link was clicked. If you set this variable to false, the clickMap data will be affected.

That’s pretty much it about this one. So, the next one – trackExternalLinks

I will not spend much time on it; it's pretty straightforward. If it's set to true, it tracks clicks on external links; otherwise it does not. All links whose domain does not match the ones you define in linkInternalFilters are external links. This one also affects the clickMap data.

So, the next – trackInlineStats

This one is important. Again, its value is either true or false. clickMap data is recorded only if it is set to true; it populates the clickMap reports.

Next – linkDownloadFileTypes

This one works in tandem with trackDownloadLinks. It is simply a comma-separated list of extensions for the various file types that exist on your site. If a particular extension is not part of this list, the corresponding downloadable file will not be tracked. One thing to keep in mind is that Analytics can only track left-click downloads, not right-click save-as downloads: the left click is under the scope of the browser, while right-click save-as is beyond the browser's scope and under the scope of the operating system. Example –

s.linkDownloadFileTypes="exe,zip,wav,mp3,mov,mpg,avi,wmv,doc,pdf,xls,xml"
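To see how the filter behaves, here is an illustrative sketch of the extension check. This is not Adobe's actual implementation, just the idea:

```javascript
// The list of tracked download extensions, as set above.
var linkDownloadFileTypes = "exe,zip,wav,mp3,mov,mpg,avi,wmv,doc,pdf,xls,xml";

// A clicked link is treated as a download only if its file extension
// appears in the comma-separated list.
function isTrackedDownload(url) {
  var extensions = linkDownloadFileTypes.split(",");
  var path = url.split("?")[0]; // ignore any query string
  var ext = path.substring(path.lastIndexOf(".") + 1).toLowerCase();
  return extensions.indexOf(ext) !== -1;
}
```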

Tomorrow I will start from linkInternalFilters

OK, it's August 25th, 10:30 AM

As usual, I will start my day with some blog reading. Google Analytics today. Here I come, Mr. Cutroni.

I read – http://cutroni.com/blog/2014/01/09/set-google-analytics-content-grouping/

Good as usual

Now, the next variable – linkInternalFilters

This variable is used in conjunction with trackExternalLinks. When we define linkInternalFilters, we basically tell SiteCatalyst (the Analytics engine) which links or domains we don't want tracked as external links; in other words, we are saying, "these are my internal links." So you also need to include the domain on which the analytics is implemented. Let's look at an example –

s.trackExternalLinks=true

s.linkInternalFilters="javascript:,example.com,example1.com,exampleAffiliate.com"

In the above example, I have told the Analytics engine that links to example.com, example1.com and exampleAffiliate.com should not be considered external.

Again, notice that I am not using any spaces in the syntax.
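Conceptually, each filter entry acts as a substring match against the clicked link's URL. A rough sketch of that decision (illustrative only, not Adobe's actual code):

```javascript
var linkInternalFilters = "javascript:,example.com,example1.com,exampleAffiliate.com";

// A link counts as internal if its URL contains any of the filter entries.
function isInternalLink(url) {
  var filters = linkInternalFilters.split(",");
  for (var i = 0; i < filters.length; i++) {
    if (url.toLowerCase().indexOf(filters[i].toLowerCase()) !== -1) return true;
  }
  return false;
}
```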

Cool, let's move ahead – linkExternalFilters

This one is used when you are very particular about the exit links you want to track; in other words, there are only certain exit links you want to track, not all of them. Using this configuration variable, you provide that list of exit links to the Analytics engine. For example, I have many exit links on my site, but I want to track only BadaBoom.com –

s.trackExternalLinks=true //yep bro track ’em

s.linkInternalFilters="javascript:,example.com"

s.linkExternalFilters="BadaBoom.com"

Now, what we have done is provide two filters to the Analytics engine. The first tells it what my internal links are; this filters out all the links that I don't consider internal. Next, I put a filter on the external links, listing the ones I want tracked. The rest of the external links won't be displayed in the exit links report. If you don't want the second filter, simply leave it blank –

s.linkExternalFilters=""
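Putting the two filters together, the decision for a clicked link looks roughly like this (hypothetical helper names; the real logic lives inside s_code):

```javascript
// Does an href contain any entry of a comma separated filter list?
function matchesFilter(href, filterList) {
  if (!filterList) return false;
  var parts = filterList.split(",");
  for (var i = 0; i < parts.length; i++) {
    if (href.toLowerCase().indexOf(parts[i].toLowerCase()) !== -1) return true;
  }
  return false;
}

// A click is reported as an exit link only if it fails the internal
// filter AND, when an external filter is set, matches that filter.
function shouldTrackExit(href, internalFilters, externalFilters) {
  if (matchesFilter(href, internalFilters)) return false; // internal, never an exit
  if (externalFilters === "") return true;                // blank filter: track all exits
  return matchesFilter(href, externalFilters);            // otherwise only the listed ones
}

console.log(shouldTrackExit("http://BadaBoom.com/x", "javascript:,example.com", "BadaBoom.com")); // true
console.log(shouldTrackExit("http://other.com/x", "javascript:,example.com", "BadaBoom.com"));    // false
```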

Now, the next one – linkLeaveQueryString. By the way we are half way done 🙂

I find the name of this dude very confusing – it says leave Query String, but if you set it to true, it does not leave the query string alone. So, to deal with this one: tell it what you don't want and it will give you what you want. Another thing: all the configuration variables that have "link" in them will affect umm.. yeah.. link tracking. So, while tracking the links, do you want to consider the query string parameters or not? Just remember that example.com and example.com?query=badaboom are essentially the same page, but when you set s.linkLeaveQueryString=true, the two will be tracked as separate and different links. So, if you want only those links that lead to example.com, irrespective of whether it was example.com?query=Badaboom or example.com?query=BadaBadaBoom, set s.linkLeaveQueryString=false.
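In code, the effect on the URL that gets recorded for a link click boils down to this (my own sketch of the behaviour, not the library's implementation):

```javascript
// Sketch: what linkLeaveQueryString effectively controls – whether the
// recorded link URL keeps or drops its query string.
function recordedLinkUrl(href, linkLeaveQueryString) {
  if (linkLeaveQueryString) return href; // true: keep ?query=... -> variants tracked separately
  return href.split("?")[0];             // false: strip it -> variants collapse into one link
}

console.log(recordedLinkUrl("http://example.com?query=Badaboom", false)); // "http://example.com"
console.log(recordedLinkUrl("http://example.com?query=Badaboom", true));  // unchanged
```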

So, next one – linkTrackVars

This one is very very very important, as any goof up with it will drastically affect your server calls and the data being captured by other variables. So, let's dance.

This one is used to send data on exit, download and custom links. It's very dumb and freaks out if no instructions are given to it. You have to tell it which custom variables the data needs to be sent to upon the click events. If you don't tell it, it panics and resends data for all the variables that have a value at the time of a click to the analytics engine, and hence duplicates the data in the other reports, causing erroneous inflation. So, never leave this one blank. I repeat, never leave this one blank!

In the Analytics s_code, define it as "None" –

s.linkTrackVars="None"

Ideally, this should be set inside the onClick events. Example –

<a href="index.html" onClick="
var s=s_gi('rsid');
s.linkTrackVars='prop1,prop2,events';
s.linkTrackEvents='event1';
s.prop1='Custom Property of Link';
s.events='event1';
s.tl(this,'o','Link Name');
">My Page</a>

I copied the above example from the SiteCatalyst Implementation Guide. In this example we are doing custom link tracking. The var s=s_gi('rsid') call provides the report suite ID, which is rsid in this case. We want to send data to prop1, prop2 and event1 on click of this link.

s.tl() is used for custom link tracking

Please take notice that we don't provide values to linkTrackVars as s.prop1 or s.prop2 or s.events. We simply write prop1,prop2,events. Again, no spaces. For tracking events we use linkTrackEvents, which I will be covering next. Ok, so tomorrow I will take it from linkTrackEvents. I want you to smile right now. May the rest of your day be super amazing. Cheers!

August 26th, 12:07 pm. Yes, got a bit delayed today. Was working on my other project –

livAwesome, a self-help project I am working on. You can check that out.

Ok, lets start with a blog. SiteCatalyst today.

Nice, so the next variable – linkTrackEvents

So, whenever you are doing custom tagging for a link, and on specific onClick events you want to fire an event as well, apart from a prop and an eVar, linkTrackEvents is used there. While defining this, you first need to include "events" (notice the "s") in linkTrackVars, and in the following statement specify the name of the event in linkTrackEvents. Again, you will not be specifying the values as s.event1 or s.event2; you will simply write event1,event2,event3…

The declaration of the event in s.events=eventX is still required. Please refer to the example above for better clarity.

Ok, now lets look at the next one – cookieDomainPeriod

Any mistake with this one can screw up your data, particularly the count of unique visitors. Let me explain what it does. The setting of this variable directly affects whether the browser accepts or rejects the Analytics cookies, and hence their functionality.

If you find this variable difficult to understand, then don't take a lot of load. Simply put the number of periods in your DOMAIN as the input to this variable. Put only the number of periods in the domain and NOT THE SUBDOMAIN.

By default, the value of this variable is 2.

If your domain is AwesomeAnalytics.com then the cookie domain period is 2 if it is AwesomeAnalytics.co.in then the cookie domain period is 3.

Now, if the page has the URL www.badaboom.AwesomeAnalytics.com, then still the cookie domain period is 2 and not 3, because we are providing the name of the domain and not the subdomain.

But, what happens if my cookieDomainPeriod is 2 and my domain has three periods? For example, AwesomeAnalytics.co.in. In this case the s_code will attempt to set a cookie for the co.in domain and not AwesomeAnalytics.co.in. Now, since co.in is not a valid domain, the browser will get pissed off, ask the s_code to bugger off and reject the cookies. This will terribly affect the count of unique visitors and the functionality of many of the plugins.

Another example: you have a subdomain badaboom.AwesomeAnalytics.com. Here the cookieDomainPeriod is 2, but you give it the value 3. Analytics will assume that the domain is badaboom.AwesomeAnalytics.com and not AwesomeAnalytics.com. Now, if you have another subdomain NotBadaBoom.AwesomeAnalytics.com, it will be a separate website altogether, and so would AwesomeAnalytics.com. So, this will affect the functionality of cookies, and since linkInternalFilters has the name of your domain and not the subdomain, things will get screwed up.

One more thing – enter the value as a string, that is, 2 is "2".

If it's still not clear, just set the value to the number of periods in your domain, please.
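The rule of thumb above can be sketched as code: keep the last N labels of the hostname as the cookie domain, where N is the cookieDomainPeriod. The helper is mine, for illustration only, but it reproduces both the happy path and the co.in failure case:

```javascript
// Sketch: which domain the cookie would be set on, given a
// cookieDomainPeriod value (passed as a string, per the text above).
function cookieDomainFor(hostname, cookieDomainPeriod) {
  var n = parseInt(cookieDomainPeriod, 10);
  var labels = hostname.split(".");
  return labels.slice(-n).join("."); // keep only the last N labels
}

console.log(cookieDomainFor("www.badaboom.AwesomeAnalytics.com", "2")); // "AwesomeAnalytics.com"
console.log(cookieDomainFor("AwesomeAnalytics.co.in", "2"));            // "co.in" -> invalid, cookie rejected
console.log(cookieDomainFor("AwesomeAnalytics.co.in", "3"));            // "AwesomeAnalytics.co.in"
```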

Now, next fpCookieDomainPeriod.

The fp stands for first party. This is used by Analytics to set up first party cookies OTHER THAN the visitor ID cookie (s_vi); that is, it does not control s_vi. The visitor click map cookie s_sq and the cookie checker s_cc, plus cookies set up by a few plugins like getValOnce, are affected by this variable.

Again, simply put the number of periods in your domain into this variable.

Tomorrow, I will take it up from doPlugins. Cheers!

August 28th. If you did not notice, I did not blog yesterday; was really busy with some other work. It's been two days since I have been to the gym.

Ok, so getting down to business – doPlugins

So, you might be aware that there are various plugins you can use to enhance the functionality of your Analytics implementation. To use plugins, apart from adding the plugin code to the AppMeasurement.js library or the s_code, you need to configure three other things –

usePlugins

doPlugins

s_doPlugins function

First, set s.usePlugins=true

Second, define the s_doPlugins function. Third, feed the output of the s_doPlugins function to the s.doPlugins variable –

Example, we want to feed a default value to prop1 by using some plugin. Add this code to s_code –

s.usePlugins=true

function s_doPlugins(s){

s.prop1=”Plugin output”

}

s.doPlugins=s_doPlugins

Just a piece of information – when a visitor comes to your website, the JavaScript files get cached in the visitor's browser. Analytics is driven by that cached version until it gets updated. So, if you have made any updates to your AppMeasurement library or s_code and the change is not getting reflected, that is because the visitor's browser is using the cached version. This can happen to you during testing as well. You might have made the correct update, but it won't get reflected because your browser is not deploying the latest library. So, it's good practice to clear your cache before testing any updates.

September 3rd. Blogging after, let me count – 5 days! I went home actually, and post my return I have not been able to manage my time well. But anyway, the essence lies in moving forward. *Fingers popping*

Just three left – dynamicAccountSelection, dynamicAccountMatch and dynamicAccountList. All three are used in conjunction with each other. As their name suggests, they are used to dynamically select the report suite to which the data is to be sent.

Let's start with dynamicAccountSelection. The input is just true or false. If you want dynamic selection, simply set it to true.

Now, dynamicAccountList. In the syntax of dynamicAccountList we specify two things. One, the report suite Ids and two, the URLs.

We basically tell the Javascript to select particular report suite IDs based on the URL of the webpage. So, lets have a look at the syntax.

s.dynamicAccountList="reportsuiteId1,reportsuiteId4=AwesomeAnalytics.com;reportsuiteId5,reportsuiteId3=BadaBoom.com"

Key characters to keep in mind here are the comma, the semicolon and the equals sign. Look at the way I have put them here. I so wish SiteCatalyst had been consistent with this. Like for products, different product declarations are separated by commas and individual properties by semicolons. Here the case is the opposite. So, keep these things in mind.
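To see those delimiters at work, here is how the dynamicAccountList string could be resolved against the current host. The real selection happens inside the library; this little parser is my own sketch:

```javascript
// Sketch: pick report suite IDs from a dynamicAccountList string by
// matching the current host. Illustration only, not library code.
function selectReportSuites(dynamicAccountList, host, defaultRsid) {
  var rules = dynamicAccountList.split(";");  // rules separated by semicolons
  for (var i = 0; i < rules.length; i++) {
    var pair = rules[i].split("=");           // "rsid1,rsid4=AwesomeAnalytics.com"
    if (host.toLowerCase().indexOf(pair[1].toLowerCase()) !== -1) {
      return pair[0];                         // comma separated report suite IDs
    }
  }
  return defaultRsid;                         // no rule matched: fall back to s.account
}

var list = "reportsuiteId1,reportsuiteId4=AwesomeAnalytics.com;reportsuiteId5,reportsuiteId3=BadaBoom.com";
console.log(selectReportSuites(list, "www.badaboom.com", "defaultRsid")); // "reportsuiteId5,reportsuiteId3"
```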

Now, dynamicAccountMatch. This is kinda complex for people who are not that geeky. So, you first enable dynamicAccountSelection, then you provide the criteria in the dynamicAccountList and then, you tell the Javascript where to implement this criteria, and this is done through dynamicAccountMatch.

The values you feed to dynamicAccountMatch are the DOM objects on which you want to apply the criteria mentioned in dynamicAccountList. Following are the DOM objects to which you can apply dynamicAccountSelection –

window.location.host – to apply it to the domain; this is also the default

window.location.pathname – to apply it to the path in the URL

(window.location.search?window.location.search:"?") – to apply it to a query string

window.location.host+window.location.pathname – Host name and Path

window.location.pathname+(window.location.search?window.location.search:”?”) – Path + Query String

window.location.href – Entire URL

Keep in mind that dynamic account selection is not supported by the latest AppMeasurement for JavaScript.
And voila! We are done! So, how many days did it take me? 15 days! Jeez. But the point is, this blog now exists in the world and we both learned something. Consistency matters.
Please give me feedback if you have some time.
Till then, Happy Analytics!
Cheers!