Could Links Be On Their Way Out of SEO?
Why does Google bother using links as a ranking factor? When Google originally launched, the algorithm was built to function much like academic citations: a well-researched and respected paper is eventually linked to and cited by subsequent papers in its field, further boosting the original paper’s value. Google took the same approach with links; the more a site was linked to, the more valuable it must be, right? This approach worked, and worked really well, until site owners realized that more links meant better rankings. Then the great link scramble began!
Site owners went after every link they could find with little thought for quality, relevancy, or context. And that worked, too, until Google got smart and pushed the first Penguin update live, which took down sites that were playing fast and loose with their link building. Sites weren’t just hit; they were completely buried in the SERPs until they cleaned up their link profiles. We have one client who admitted they were playing the game (albeit trying to be smarter than their competition), and when they finally got caught it took them almost two years to recover!
Since Penguin first hit the Google algorithm (and there have been 25+ iterations since then), site owners have been running scared of link building. They even approach great, totally legitimate links with trepidation, because it often feels like Google has set an invisible and ever-moving target for what counts as a “natural” link profile. No one wants to come into work one day and see that their traffic has vanished overnight. And if site owners are afraid to build, or even naturally earn, new links, then how reliable a ranking factor can links be?
Well, if what SearchEngineLand.com reported in early March comes to pass, links may not be long for the world of SEO.
As New Scientist recently reported, a team of research scientists at Google has published a paper (PDF) explaining the idea of Knowledge-Based Trust (KBT), an alternate way of determining the quality of web pages by looking at how accurate they are.
The quality of web sources has been traditionally evaluated using exogenous signals such as the hyperlink structure of the graph. We propose a new approach that relies on endogenous signals, namely, the correctness of factual information provided by the source. A source that has few false facts is considered to be trustworthy…
…Along those lines, the authors say this way of measuring trustworthiness “provides an additional signal for evaluating the quality of a website,” and could be used “in conjunction with existing signals such as PageRank” — not necessarily as a replacement.
Google has gotten a lot better at understanding the contextual relevancy of a page and no longer needs things like keyword tags or keyword-rich anchor text to understand what a page is about. The content of your site is what matters most when determining which search terms a page should rank for. Google’s Knowledge Graph is already pulling millions, if not billions, of facts right into the SERPs, so there is no denying that Google knows the “truth.” At the same time, with no barrier to entry, anyone can publish anything and call it fact, and even falsehoods can go mainstream if they pick up enough steam.
So will Google get rid of links, or somehow downplay the value of links as a ranking factor? Only time will tell.