How To Assess Website Quality Now That Toolbar PageRank is Dead

Google recently switched off Toolbar PageRank (TBPR) completely, much to the shock of the SEO community. For a long time TBPR was the quickest way to gauge the quality of a website, at least in SEO terms. Now that it is gone, what can replace it?

A Short History of PageRank

PageRank (PR) was developed in 1996 by Larry Page and Sergey Brin, who went on to found Google. They developed the idea as part of a research project – PageRank built on earlier work on how items of information could be mathematically sorted and ranked according to their popularity and the relationships between them. It is a little-known fact that it was not named PageRank because it ranks web pages, but because it was developed largely by Larry Page. You can read the original research paper here.

PageRank provided a system for ranking web pages based on “link popularity” and scored pages on a logarithmic scale; TBPR displayed this score as a whole number from 0 to 10. Unfortunately, as soon as professional SEOs understood PageRank – which happened very quickly, because Google openly shared most of the details of how it worked – people started to abuse the system to game Google.

How PageRank Helped SEOs

No other search engine gave SEOs an easy way to determine how valuable a page or website was, so before PageRank, SEO was largely guesswork. Then Google released Toolbar PageRank, displayed in the Google Toolbar and later in various browser plugins, which allowed SEOs to quickly gauge the value of a website.

Google also decided to provide the information via an API, which meant that every SEO tool developer included it in their data. TBPR mattered because SEOs could focus their campaigns only on websites that Google already valued.

How Does PageRank Work?

The PageRank system is still alive within Google, but it is now just a small part of the overall picture. The basic principle of PageRank is that all links to a web page carry some value, but links from more valuable pages, i.e. those with a higher PageRank, carry more value.

For example, a link from the BBC homepage (one of the relatively few pages with a PageRank of 10) carried a massive amount of weight and could quickly help propel your website to the top of Google search for the keywords you were targeting. However, as we know, getting a link on such a high-value site is near impossible for small businesses. This is where SEOs spotted a new opportunity: many smaller websites had good PageRank too, and it was relatively easy to obtain links on websites with a PageRank of between 3 and 6, sometimes higher. So SEOs developed strategies for gaining links from websites with an average PageRank.

That was not all. SEOs also realised that, because PageRank works on a logarithmic scale, every site carried some value. So rather than getting one link on a PR 10 site, you could get 10 links on PR 5 sites, or 100 links on PR 1 sites, for example. This led to the craze of automated link building – creating thousands, sometimes millions, of links on low-quality websites purely to improve search positions. Black Hat SEO was born, and people were soon selling links by the million (on Fiverr you can buy 100,000 links for just $5).
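To make the mechanics concrete, here is a minimal sketch of the PageRank calculation in Python. The four-site link graph, the damping factor and the iteration count are illustrative assumptions only – Google’s real system is vastly larger and more refined, and the raw scores were then roughly mapped onto the 0 to 10 toolbar scale.

```python
# A toy PageRank calculation. Each page shares its current score equally
# among the pages it links to; the damping factor models a surfer who
# occasionally jumps to a random page. The graph below is purely hypothetical.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    scores = {page: 1.0 / n for page in pages}

    for _ in range(iterations):
        new_scores = {}
        for page in pages:
            # Sum the share of score passed on by every page that links here.
            incoming = sum(
                scores[other] / len(outgoing)
                for other, outgoing in links.items()
                if page in outgoing
            )
            new_scores[page] = (1 - damping) / n + damping * incoming
        scores = new_scores
    return scores

# Hypothetical link graph: each key maps to the pages it links out to.
links = {
    "bighomepage.example": ["smallblog.example"],
    "smallblog.example": ["bighomepage.example", "shop.example"],
    "shop.example": ["bighomepage.example"],
    "spamsite.example": ["shop.example"],
}

for page, score in sorted(pagerank(links).items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```

Pages that attract links from already well-scored pages end up with the highest scores, which is exactly why SEOs chased links from high-PR sites.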


The Death of Toolbar PageRank

Google never updated TBPR on a regular basis – it happened every few months or so – and in October 2013 Google announced that it would not be updating the visible TBPR again. The last known update came in December 2013, so for the past 2 ½ years there has been no change, although that did not stop SEOs from using it.

In April 2016, Google turned off TBPR. Overnight, all the data was wiped and TBPR has since shown zero for every web page. Toolbar PageRank is officially dead.

How To Rate a Website Now?

The closure of TBPR has led many people to ask how we can rate web pages now. Fortunately there are many methods available – most have been around for years, and some were developed specifically as alternatives to TBPR that aim to closely mimic what Google showed.

Moz Domain and Page Authority

Two of the most popular metrics today are Moz’s Domain Authority (DA) and Page Authority (PA). These aim to replicate Google’s TBPR in that they provide a score based on the number and quality of links held within Moz’s own link database. The data on hand is far smaller than Google’s, but it is always growing and is considered a good guide.

A site with a Domain Authority of 30 or more (on Moz’s 100-point scale) is considered by many to be worthwhile, and sites with a DA of over 60 are thought to be very valuable in SEO terms. While DA provides a value for an entire website, Page Authority helps to determine how good a single page is. Sometimes poor site navigation means that a page on an otherwise strong website carries little value – PA is a good way to spot this.

Moz has developed a whole suite of SEO tools over the years, such as Open Site Explorer, Keyword Explorer and the MozBar.

Open Site Explorer shows Domain Authority and Page Authority, as well as new and established links to a site. A site with no new links could be considered a “dead” site: while it may have been popular in the past, it is no longer growing. Link growth over time is thought to be one of Google’s many ranking factors.

Majestic Backlinks

Majestic (formerly Majestic SEO) provides a wide range of data on a website, but SEOs mostly use it to examine the backlinks in place. Because PageRank counts backlinks, this is a simple way to gauge how valuable a site may be. The number of links alone does not determine a site’s value, though, so Majestic provides a couple of additional metrics to help with this.

Majestic’s own site explorer provides the Trust Flow and Citation Flow metrics. Trust Flow reflects the quality of a site’s links – broadly, how closely it is connected to a set of manually reviewed, trusted seed sites – so links from genuinely trustworthy pages carry more weight. Citation Flow is more like raw PageRank, in that it is based on the sheer number of links.
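A common rule of thumb among SEOs is to compare the two numbers: a Trust Flow far below the Citation Flow suggests plenty of links but little quality. A minimal sketch, with an entirely illustrative threshold (this is not Majestic’s own guidance):

```python
# Rule-of-thumb check on Majestic's two metrics: a Trust Flow that is much
# lower than the Citation Flow often points to lots of links from low-quality
# sources. The 0.5 threshold is an illustrative assumption.

def looks_spammy(trust_flow, citation_flow, min_ratio=0.5):
    if citation_flow == 0:
        return False  # no link data to judge
    return (trust_flow / citation_flow) < min_ratio

print(looks_spammy(trust_flow=35, citation_flow=40))  # False: healthy profile
print(looks_spammy(trust_flow=8, citation_flow=45))   # True: quantity without trust
```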

Majestic also provides a breakdown of links by anchor text keywords and by the types of websites that are linking. Seeing the anchor text helps to weed out sites that have been heavily spammed. While the spam may not always harm the site, it will give the impression that the site is more popular than it really is.

Another known Google ranking factor is getting links on relevant websites. For instance, a health-based website that mostly has links from other health websites should rank better than one with links from less relevant sites. This again helps Google to reduce the impact of web spam, where links are placed wherever it is easiest to get them, regardless of the theme of the site.


SEMrush

Finally, there is SEMrush. What SEMrush attempts to do that other web analytics tools do not is predict the actual organic search traffic a website receives. If a site ranks well in Google it should be receiving traffic, and SEMrush attempts to show this. It does so by determining a site’s positions in Google search for its most popular keyword terms and predicting the number of clicks. For example, if a website ranks first for a keyword phrase that is searched 1,000 times a month, we would expect it to receive around 300 visits from that search. This makes it a great tool for spotting when a site has been penalised and is possibly less valuable than other metrics, such as DA and total backlinks, suggest.
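As a rough illustration of that arithmetic, the sketch below multiplies each keyword’s monthly search volume by an assumed click-through rate for its ranking position. The CTR table and the keyword data are illustrative assumptions, not SEMrush’s actual model:

```python
# Rough organic traffic estimate: monthly searches multiplied by an assumed
# click-through rate for the ranking position. The CTR table is an
# illustrative assumption only.

ASSUMED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_monthly_visits(keywords):
    """keywords: list of (phrase, monthly_searches, ranking_position) tuples."""
    total = 0.0
    for phrase, searches, position in keywords:
        ctr = ASSUMED_CTR.get(position, 0.02)  # small default beyond position 5
        visits = searches * ctr
        print(f"{phrase!r} at position {position}: ~{visits:.0f} visits/month")
        total += visits
    return total

# Hypothetical keyword data for one site.
keywords = [
    ("garden sheds", 1000, 1),        # ~300 visits, as in the example above
    ("cheap garden sheds", 500, 3),
    ("shed assembly guide", 200, 8),
]
print(f"Estimated total: ~{estimate_monthly_visits(keywords):.0f} visits/month")
```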

Holistic Checks

Holistic checks are not grounded in any science, but they do provide a simple way to quickly decide whether or not to pursue a link opportunity. Of course, much of what we check by eye is probably assessed algorithmically by Google too. Let’s run through a few simple checks.

Web Design

Does the website look like it has been well designed, or just thrown together from a simple template? If a site provides value to its owner, they are more likely to make the effort to make it look great. If people are not reading it, the owner will not invest in a great design.

Writing Quality

Is the written content of a consistently high quality? Many English-language sites are written by people whose first language is not English. These sites are often made specifically to look good to SEOs (i.e. they have good DA and PA) but do not really provide value to visitors. Google can probably spot this more quickly than you can, so take the time to read a few pages and be sure that the website is a positive environment for your brand.

Google Indexing

OK, this is not strictly holistic, but there is no tool for it. Find a blog post or any other dated page on a website and check whether recent entries are indexed in Google – the simplest way is to search for the page’s URL with the site: operator (for example, site:example-blog.com/latest-post). The more popular a website is, the more often Google crawls and indexes its pages. Very popular websites will have new pages indexed within a few minutes of publication, but less favoured websites may not see their new content indexed for days, or even weeks.

If you are looking to write a guest post and the site’s previous posts were added a week ago but are still not indexed, this is a sign that Google has either never rated the site well or has a penalty in place.
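If you want to make the check repeatable, a tiny sketch like the one below builds the site: query for you. The page URL is hypothetical, and the printed query should be pasted into a browser rather than fetched from a script, since automated querying is against Google’s terms of service:

```python
# Build a Google "site:" query to check whether a specific page is indexed.
# Paste the printed URL into a browser; the page URL below is hypothetical.
from urllib.parse import quote_plus

def site_query(page_url):
    return "https://www.google.com/search?q=" + quote_plus(f"site:{page_url}")

print(site_query("example-blog.com/2016/04/latest-guest-post"))
# No result for the query means the page has not been indexed yet.
```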

Warning – Beware of Google Penalties!

Many websites are designed to sell links – this is a fact of the Internet. A common “trick” is to purchase a domain that has good metrics, such as a strong Domain Authority and plenty of links listed in Majestic, and create a blog where guest post placements are sold. However, one thing those metrics do not highlight is whether a penalty is in place.

Google may have applied a manual penalty to a website for selling links, or devalued it algorithmically; either way, all the ranking metrics become meaningless. This is why taking a holistic approach is so important today.

Don’t Rely On One Metric

It is tempting to rely on a single method to rate a page, but this is never a good idea. While some tools provide a handy metric, those numbers can be misleading on their own. It is far better to combine the free metrics with some more holistic analysis of a website, as discussed above, to be sure it is of a high enough quality to support your SEO campaign. Google uses around 200 factors to rank pages, so don’t expect to get the full picture from a single statistic.
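Pulling the above together, here is a simple screening sketch that combines several of the signals discussed in this article. Every threshold is an illustrative assumption and should be tuned to your own niche:

```python
# A simple screening checklist combining the signals discussed in this article.
# All thresholds are illustrative assumptions, not industry standards.

def worth_pursuing(domain_authority, trust_flow, citation_flow,
                   days_since_last_indexed_post, looks_well_designed):
    link_quality_ok = citation_flow > 0 and trust_flow / citation_flow >= 0.5
    checks = [
        domain_authority >= 30,              # Moz DA rule of thumb
        link_quality_ok,                     # quality vs quantity of links (Majestic)
        days_since_last_indexed_post <= 7,   # Google is still crawling the site
        looks_well_designed,                 # the holistic judgement call
    ]
    return all(checks)

print(worth_pursuing(domain_authority=42, trust_flow=30, citation_flow=38,
                     days_since_last_indexed_post=2, looks_well_designed=True))
```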
