A Theory on Google’s Not Provided
According to (Not Provided) Count, “not provided” data in Google Analytics will hit 100% on December 16th, 2013. We’ve watched not provided climb from the roughly 10% originally reported (and promised) to 40, 50, 60, and 70+% over the last few months. Recently, Google announced that it would be securing all searches, pushing “not provided” to 100%, much to the dismay and frustration of site owners and SEO professionals everywhere.
Plenty of SEO experts have offered advice on how to work around not provided, and some have come up with really clever alternatives. Their workarounds may require a bit more work on your part, but some of the data can be salvaged if you are willing to put in the time and effort.
Google says the reason it is encrypting search is to better protect the privacy of its users. (The fact that the same data is still available in AdWords is a debate for another day…) There could be many reasons for this decision, and we can all speculate about them until the cows come home, but the other day a thought occurred to me. What if one of the reasons Google is pushing “not provided” out is that they want to make it harder to do black hat SEO?
This is just a theory, so I am sure there are plenty of holes in it at this point, but hear me out…
One of the cornerstones of black hat SEO is picking a few main keywords and hammering away at them. Build thousands of links with keyword-rich anchor text and over-optimize pages of content around 2 or 3 keywords, and you can rank better for those terms! But if you don’t know which keywords are driving (and converting) the most and best traffic on your site, your black hat SEO program becomes a shot in the dark rather than a calculated move.
For instance, if I knew that “business intelligence tools” drove a lot of high-quality traffic and produced the most conversions on my site, I could focus my SEO campaign solely on that one keyword and basically ignore the rest. Why bother with keywords that don’t work as well? This kind of pigeonhole approach to SEO often leads to black hat and web spam tactics like keyword stuffing, link exchanges, forum spamming, and so forth. Site owners become more willing to bend the rules to get to the top. And for a long time those tactics worked! The number of links was the only thing that mattered, so building a ton of links for one keyword would help a site that didn’t always deserve it get to the top. But as Google and the search algorithm got smarter, especially in the last two years, those loopholes to the top of the SERPs were closed. Tricking your way to number one became more and more difficult to do, and even harder to sustain.
By replacing keyword data with “not provided,” Google is taking away site owners’ hard-and-fast knowledge that a certain keyword is performing better than others. Webmaster Tools might show you impression and click-through data, but that’s not the whole story. In doing so, Google is making it harder for a site owner to knowingly pick one keyword and build an entire SEO campaign on it. Now, if you want to succeed, you have to build your online presence naturally, almost holistically. You can’t rely on one or two keywords for success because you can’t track just one or two keywords as easily as you used to. Your SEO now has to be about building the brand as a whole and improving the quality of your entire website, not just garnering more links for a short list of keywords.
As I said before, this is just a theory: I am sure Google has plenty of reasons for making “not provided” the default for keyword data, and I doubt they will be telling us the real reasons any time soon. But this was just a thought that struck me the other day, and as crazy as it may seem, why couldn’t it be one more reason for Google to go “not provided”?