Evolution of SEO – From Link Building to Trustworthiness

In recent years, Google has made a series of changes to its search algorithm to tackle black hat link building. These changes have already altered the way digital marketing is performed, and we expect more to come. The biggest shift has been the downgrading of web links as a ranking factor and the corresponding rise in the importance of quality content.

In the early days of Google, the most important ranking mechanism was PageRank, the system developed by Larry Page, one of Google’s founders, to estimate a website’s importance from the number and quality of links pointing towards it. If you wish to learn more about how Google built its first search engine, read The Anatomy of a Large-Scale Hypertextual Web Search Engine by Sergey Brin and Lawrence Page, a 1998 paper from Stanford University that describes the prototype of a large-scale search engine.
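To give a feel for how PageRank works, here is a minimal sketch of the iterative calculation described in that 1998 paper. The link graph and the damping factor of 0.85 are illustrative assumptions; Google’s production system was, of course, far more sophisticated.

```python
# A minimal PageRank sketch based on the formula in the 1998 paper:
# PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T linking to A,
# where C(T) is the number of outbound links on T and d is a damping factor.

def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    pr = {page: 1.0 for page in pages}  # start every page with an equal score
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum the share of PageRank passed on by every page linking here.
            incoming = sum(
                pr[other] / len(targets)
                for other, targets in links.items()
                if page in targets
            )
            new_pr[page] = (1 - d) + d * incoming
        pr = new_pr
    return pr

# Toy link graph, purely illustrative: three pages linking to one another.
example = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
print(pagerank(example))
```

The key point is that a page’s score depends on the scores of the pages linking to it, so a link from an important page is worth far more than a link from an obscure one.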

In 1998, Larry Page noted that “the citation (link) graph of the web is an important resource that has largely gone unused in existing web search engines.” Google changed that. The system made perfect sense at the time: before search engines, the only way to discover a website was by following a link from another page, which is why web directories became so popular during the 1990s.

For many years this system was an effective way to rank pages, so much so that the majority of Internet users switched from Yahoo!, AltaVista, Excite and Lycos to Google. However, website owners and marketers soon realised that the system could be gamed. Black hat SEO was born.

Google ranked websites on both the number and the quality of their links. Although a single low-quality link passed very little PageRank, creating thousands of low-quality links had a similar effect to earning one high-quality link. Automated link building was born.
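To see why sheer volume could substitute for quality, consider a toy calculation. The “link value” figures below are invented purely for illustration and are not real PageRank values.

```python
# Illustrative only: made-up link values showing why volume could stand in
# for quality under a purely additive view of link equity.
one_high_quality_link = 50.0   # hypothetical value passed by a strong, trusted page
one_low_quality_link = 0.05    # hypothetical value passed by a spammy page

links_needed = one_high_quality_link / one_low_quality_link
print(links_needed)            # 1000.0 low-quality links add up to one strong link
```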


Google’s first major attempt to tackle this problem was the launch of the Panda algorithm in February 2011, which applies an automated penalty to websites with low-quality, repetitive or spammy pages. It was Google’s first attempt to rank sites on the quality of their content rather than on links alone.

However, this was not as effective as Google had hoped, so in April 2012 Google launched the Penguin update, which penalises websites with many low-quality links. There is still some debate over whether Penguin is really a penalty at all; some SEOs believe that the Penguin updates (there have now been six, the most recent in October 2014) simply strip the PageRank from many links, so that sites which relied on low-quality links stopped ranking.

The age of quality

So, where are we now? To date, as far as we are aware, Google has not yet started actively promoting websites with good quality content; Panda only demoted sites with low quality content.

It is expected that the next generation of Google updates will start to look at the quality of content, such as the factual accuracy of statements within articles. In February 2015, Hal Hodson reported in New Scientist that a Google research team is currently working on a new model to determine the trustworthiness of a page.

“Instead of counting incoming links, the system – which is not yet live – counts the number of incorrect facts within a page. ‘A source that has few false facts is considered to be trustworthy,’ says the team (read Knowledge-Based Trust: Estimating the Trustworthiness of Web Sources). The score they compute for each page is its Knowledge-Based Trust score.”
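As a rough illustration of the idea (and emphatically not Google’s actual extraction pipeline), a page’s trust score can be thought of as the share of its checkable facts that agree with a trusted knowledge base. The facts and the knowledge base in this sketch are hypothetical examples.

```python
# A heavily simplified illustration of the Knowledge-Based Trust idea:
# score a page by the share of its extracted (subject, predicate, object)
# facts that agree with a trusted knowledge base.

KNOWLEDGE_BASE = {
    ("google", "founded_in"): "1998",
    ("panda_update", "launched_in"): "2011",
}

def knowledge_based_trust(extracted_facts):
    """extracted_facts: list of ((subject, predicate), object) triples from a page."""
    checkable = [fact for fact in extracted_facts if fact[0] in KNOWLEDGE_BASE]
    if not checkable:
        return None  # nothing on the page can be verified
    correct = sum(1 for key, value in checkable if KNOWLEDGE_BASE[key] == value)
    return correct / len(checkable)

page_facts = [
    (("google", "founded_in"), "1998"),         # matches the knowledge base
    (("panda_update", "launched_in"), "2013"),  # contradicts the knowledge base
]
print(knowledge_based_trust(page_facts))  # 0.5: half the checkable facts are right
```

The real research extracts facts automatically from page text and weighs the reliability of the extraction itself, which is a far harder problem than this toy lookup suggests.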


For now, Google will still be penalising sites for bad content rather than promoting sites with good content. However, this research is a sign that Google is developing a far more advanced search algorithm, one that can analyse the content of a page or website in minute detail and judge how good the site really is.

Of course, we have no idea how Google will measure a page that presents opposing arguments, or one that discusses past mistakes and how they were resolved; many excellent resources are full of inaccurate information, and examining that information is precisely what makes them excellent resources!

Your SEO is Safe with FSE

The cornerstone of our services is content marketing. We use highly skilled copywriters who carefully research every story before producing informative, factually accurate articles. This practice should ensure that our SEO services not only continue to produce good results but also add further value to a website in the future.

We are slowly moving towards a search environment in which the quality of a website’s content matters more than the number of links pointing at it. We are certainly not there yet, but if Google is working on this, we can be sure that it will not be long before quality content becomes the number one ranking signal.
