A little over a year ago, I wrote a blog post about why you shouldn’t freak out about people opting out of being tracked in Google Analytics. Yesterday, Google announced that it is using SSL to encrypt search queries and responses for people who are logged into their Google accounts while using Google.com. As a result of this change, Google Analytics and other clickstream measurement tools will no longer be able to determine the keywords used in a Google organic search that brought a user to the studied site. The visit will still be reported as referred by Google organic search.
Although this has widespread implications for both Search Engine Optimization and site optimization activities, I’d encourage you not to freak out about this change, for many of the same reasons that I outlined last year.
Why I’m Not Freaking Out – And You Shouldn’t Either
I have no idea what to expect with this change – there is no way for me to predict how many of my sites’ visitors are going to be coming from logged-in Google account holders. I do know the current impact of Google organic search to my portfolio of sites – it is the single biggest driver of organic (as opposed to paid) search traffic, providing, on average, 75% of visits. According to both Comscore and Hitwise, Google had a 66% market share of U.S. searches in September 2011.
This traffic is important to the overall goals of all of our sites. We spend a lot of time and effort on building and maintaining our sites’ search traffic and this change has serious implications for both the quality and quantity of data that we use to drive these efforts. Despite this, I’m still not freaking out.
Using the same thought exercise as my previous post, imagine if 50% of Google’s organic traffic had its keywords obfuscated because those visitors were logged into their Google accounts when they performed the search that ultimately brought them to your site:
It (Still) Isn’t about the Individual Visit (or in This Case, Search) – Search data, like all other clickstream data, is useful in aggregate. By looking at the keywords searched and grouping them by theme, we can calculate the value of certain types of keywords vs. others (for example, brand terms vs. product terms). Losing 50% of this data will reduce the number of long-tail keywords that get reported and may affect the reporting of narrow traffic segments that have few reported keywords, but it will not radically change our conclusions about the aggregate “search intent” of our visitors.
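The grouping-by-theme analysis described above can be sketched in a few lines. This is a minimal illustration with invented keywords and visit counts (the brand name “acme”, the figures, and the classifier rule are all hypothetical), not an export from any real Google Analytics report:

```python
from collections import defaultdict

# Hypothetical keyword report: (keyword, visits) - all figures invented
keyword_visits = [
    ("acme widgets", 1200),        # brand term
    ("acme", 800),                 # brand term
    ("blue widget reviews", 150),  # product term
    ("cheap widgets", 90),         # product term
]

def theme(keyword):
    # Toy classifier: anything mentioning the brand is a "brand" term
    return "brand" if "acme" in keyword else "product"

# Roll individual keywords up into aggregate theme totals
totals = defaultdict(int)
for kw, visits in keyword_visits:
    totals[theme(kw)] += visits

print(dict(totals))  # {'brand': 2000, 'product': 240}
```

The point of the sketch is that the conclusion ("brand terms drive most of our organic visits") survives even if a large slice of the individual rows goes missing, as long as the missing rows are not systematically different.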
Aggregate Data Is More about Precision than Accuracy – With this thought exercise, we are again losing a bit of Accuracy without losing Precision. There is no reason to suspect that individuals’ search and post-search behavior will change just because they happen to be logged into their Google accounts. The assumption that these individuals, as a group, do not behave significantly differently from all Google organic visitors is easily tested within Google Analytics by comparing the segment with reported keywords against the segment whose search terms are obfuscated. Because of this (testable) assumption, we can conclude that the aggregate “search intent” of the missing 50% will be similar to that of the fully reported 50%.
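The testable assumption above can be sketched with two hypothetical segment totals: if the reported-keyword segment and the obfuscated segment convert at similar rates, treating the two groups as interchangeable in aggregate is reasonable. All numbers here are invented for illustration:

```python
# Hypothetical segment totals (all figures invented)
reported = {"visits": 5000, "conversions": 150}    # keyword reported
obfuscated = {"visits": 5100, "conversions": 149}  # keyword obfuscated

def conv_rate(segment):
    return segment["conversions"] / segment["visits"]

# If the two rates are close, the obfuscated group's aggregate
# "search intent" can reasonably be assumed to mirror the reported one.
diff = abs(conv_rate(reported) - conv_rate(obfuscated))
print(f"reported: {conv_rate(reported):.2%}, obfuscated: {conv_rate(obfuscated):.2%}")
assert diff < 0.005  # within half a percentage point in this toy example
```

A real comparison would use proper significance testing rather than an eyeballed threshold, but the shape of the check is the same: compare the behavior of the two segments before assuming they are alike.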
Perhaps most importantly, from a privacy standpoint, this is the right thing to do. I spend a lot of time connected to Wi-Fi in other offices, coffee shops, hotels, at conferences, and other places where a nefarious system administrator could easily snoop on my search queries and other non-encrypted web usage data. Google’s new two-factor authentication keeps me secure when accessing Google products (including Google Analytics!) over potentially sketchy Wi-Fi. Now I have the same level of comfort while using Google search in potentially unfriendly places.
The data will show the impact of the change over the next few days as it is rolled out to everyone. Regardless of the scope of the data that is affected, I hope this post has made a strong argument for not freaking out about it.
I was inspired to write this post this evening after looking at my LinkedIn social graph and seeing the recent career arcs of some of my former colleagues (more on this later).
This post is to thank and highlight a video from a person who has had a tremendous amount of influence on my career: Avinash Kaushik. This video was recorded back in 2007, and talks about a topic that is near and dear to my heart:
In the wrong context or to the wrong people, talking about “culture” makes eyes glaze over. Thanks to my own experience, I am no longer one of those people. My tale of developing a data-driven culture in a large corporation follows:
In 2006, I accepted a position as a Senior Manager of Web Analytics for a large business services firm. As the product manager of the organization’s enterprise-wide web analytics software and data collection framework, I had my hands full developing a data capture and reporting framework as part of a complete web reboot by the company. Although implementing an enterprise clickstream tool, as well as a framework for integrating web data into the company’s data warehouse, was a technically complex task, it was fairly straightforward once the requirements were determined.
What was not straightforward, however, was developing a data-driven culture around how the company used its web data.
The organization had nine quasi-independent business units with about thirty-five different web sites. Nobody in the business units was focused on web analytics; however, each business unit had a web team that managed the content on the sites. My goal was to transform those positions from content managers into data-driven product managers of their web sites.
So how did I attempt to accomplish this? I empowered the web managers with their own data. I trained the web teams on both our clickstream and data warehouse tools and gave them the ability to independently develop actionable insight about their clients’ web usage.
Less than a year later, these managers could look at both individual and aggregate customer data and determine how specific web-based activity affected their business units’ bottom line. They had total visibility into all of the company’s marketing data, allowing them to explore the data and develop objective arguments for action.
Looking at my social graph on LinkedIn, I see that, three years later, some have moved into different roles either within or outside of the firm, but at least four of those former web managers have moved on to be web analysts, two with a top-tier web analytics consulting firm.
My approach here was directly influenced by Avinash’s first book, blog, and talks that he was giving at the time. His guidance was, and continues to be, useful and inspirational for the entire online marketing community.
This post is in praise of a simple tool: the QR code. QR codes are graphics that encode text strings, typically website URLs. Using their cameras and QR scanning software, smartphone users can scan QR codes to launch specific URLs in their mobile browser.
Although the QR code is only one type of two-dimensional code (other common ones: Aztec Code, MaxiCode), the term “QR Code” has been extended to encompass any two-dimensional code that is readable by scanner software on mobile devices. The QR Code standard is open and license-free, so the platforms for both consumption and generation are interchangeable.
So what’s to like about QR codes?
QR codes are easy to consume. All the major mobile platforms either support the QR Code standard natively or have free QR scanning applications readily available. To consume a QR code, a smartphone user simply needs to “take a picture” of it with their phone.
QR codes are easy to create. Since the QR Code is an open standard, a number of web services will produce them from URL input. I’ve been using Kaywa’s generator, but URL shorteners such as goo.gl and bit.ly now also generate them along with their shortened URLs.
QR codes are easy to track. Much like a shortened URL or a vanity URL, there is an opportunity to tag the destination URLs so that traffic generated by QR code scans can be tracked. This is a key practice for measuring the use of, and return from, QR codes.
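One common way to do the tagging above is to append Google Analytics campaign (utm_*) parameters to the URL before generating the QR code. A minimal sketch, assuming a hypothetical landing page and campaign names (the domain, campaign values, and parameter choices are placeholders, not a prescription):

```python
from urllib.parse import urlencode

# Hypothetical destination and campaign values
base_url = "http://www.example.com/landing"
params = {
    "utm_source": "qr",                      # where the visitor came from
    "utm_medium": "print",                   # the channel the code appeared in
    "utm_campaign": "bus_shelter_fall2011",  # the specific placement/campaign
}

# Build the tagged URL; this is the string to encode in the QR code
tagged_url = base_url + "?" + urlencode(params)
print(tagged_url)
# http://www.example.com/landing?utm_source=qr&utm_medium=print&utm_campaign=bus_shelter_fall2011
```

Because each placement gets its own campaign value, scans from the bus shelter can be reported separately from scans of, say, a magazine ad, even though both land on the same page.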
Where are they useful? With the explosion in advanced smartphone usage, there is increasing opportunity to embed these codes in a wide variety of applications. I have personally seen QR codes used in billboards, magazine advertisements, bus shelters, bus wraps, business cards, conference badges, and, oddly enough, men’s rooms.
So why aren’t they everywhere? The sad truth is that they aren’t. They are still rare enough “in the wild” that I am surprised whenever I see one, even in situations with obvious utility.
What are some other uses? I’d like to see QR codes anywhere a web resource could be useful. I’d like one on my appliances or in my car that points me to product information. I’d like one at Starbucks and Chipotle that lets me order and pay while standing in line. I’d like to see them on TV, letting me connect with shows and their advertisers in addition to vanity URLs. The potential applications are legion.