Curamando

Google Makes Changes to How They Sort Websites for Searchers

27 September 2019

At 2 p.m. Eastern Time on September 24, 2019, Google began rolling out a Broad Core Algorithm Update. The update will need about a week to take full effect in all markets, and webmasters, as well as businesses that rely on organic traffic, should expect shifts in how they perform (for better and for worse).

This is the third Broad Core Algorithm Update so far in 2019. The first arrived in March, the second in June, and the third is the one rolling out over the coming week.

What is a Broad Core Algorithm Update?

Tweet from Google SearchLiaison about the Core Update

As of 2019, Google has begun announcing large updates before rolling them out, so webmasters around the world have a better understanding of why their rankings may suddenly change.

A Broad Core Algorithm Update is when Google changes the algorithm it uses to calculate the order in which webpages are shown when someone types (or speaks) a query into its search engine. To answer this more satisfactorily, though, we need a short recap of what that algorithm is:

The ranking algorithm is a mathematical equation that multiplies more than 200 aspects found on, or connected to, an individual webpage. The result is a “bid” that determines where that webpage ranks for the performed query.

To simplify (a bit), the algorithm is kind of like this:

Bid = [f1] * [f2] * [f3] * [f4] * … * [f200]

(f = factor – and each factor looks at different aspects)
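As a rough illustration of this multiplicative model, the bid could be sketched as below. The factor names and values are invented for the example; Google's real factors and their scales are not public.

```python
# Hypothetical sketch of the multiplicative "bid" described above.
# Factor names and scores are invented for illustration only.

def compute_bid(factors):
    """Multiply all factor scores together into a single bid."""
    bid = 1.0
    for score in factors.values():
        bid *= score
    return bid

page_factors = {
    "f1_content_relevance": 0.9,
    "f2_page_speed": 0.8,
    "f3_backlink_quality": 1.2,
    # ... and so on, up to f200 in the article's model
}

print(compute_bid(page_factors))  # ≈ 0.864
```

In a model like this, a single factor dropping toward zero drags the whole bid down, which matches the article's point that one changed factor can have an outsized effect on a page's ranking.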

So, a Broad Core Algorithm Update changes this mathematical equation. Maybe factors swap places (f134 moves into the f28 slot), maybe new factors are added (f201 and f202), or maybe a factor is removed entirely.

These changes strongly influence how a webpage’s bid turns out, which means they can have a massive impact on specific industries – or on certain practices used.

Google makes several small changes daily

Google’s ranking algorithm gets its inputs from subroutines that work on the Google Index (Google’s database of every webpage it has found, plus all the intel connected to each page). A subroutine might look something like the following (let’s call this one: “Is this a safe site to send a user to?”):

  1. Is the webpage served over HTTPS?
    Yes = 5 points
    No = 0 points
  2. History of having been hacked?
    Yes = -4 points
    No = 0 points
  3. Are all resources served over HTTPS?
    Yes = 3 points
    No = 0 points
  4. Does the webpage ask for personal details that can be abused?
    If https + never been hacked = 2 points
    If http + never been hacked = 1 point
    If been hacked + http = 0 points
  5. Sum up the points

The above is just a suggestion of how a subroutine might score the aspects it is set to evaluate (it almost certainly isn’t exactly like that – this is only an example).
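The five steps above can be sketched in code. This mirrors the article's invented point values only; the real subroutine (if one exists in this form) is not public, and the function and parameter names are made up for the example.

```python
# Hypothetical sketch of the "Is this a safe site?" subroutine above.
# Checks and point values mirror the article's example, nothing more.

def safety_score(https, ever_hacked, all_resources_https, asks_personal_details):
    points = 0
    # 1. Is the webpage served over HTTPS? Yes = 5 points
    points += 5 if https else 0
    # 2. History of having been hacked? Yes = -4 points
    points += -4 if ever_hacked else 0
    # 3. Are all resources served over HTTPS? Yes = 3 points
    points += 3 if all_resources_https else 0
    # 4. Does the page ask for personal details that can be abused?
    if asks_personal_details:
        if https and not ever_hacked:
            points += 2      # https + never hacked
        elif not https and not ever_hacked:
            points += 1      # http + never hacked
        # been hacked + http = 0 points
    # 5. Sum up the points
    return points

# A fully HTTPS page, never hacked, that collects personal details:
print(safety_score(True, False, True, True))  # 5 + 3 + 2 = 10
```

Re-weighting a subroutine like this – say, raising the HTTPS bonus from 5 to 7 – is the kind of small daily tweak the next section describes, as opposed to a Broad Core Update, which reshuffles the equation itself.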

Google often changes how subroutines like these are weighted – for instance, increasing the weight of the first check at the expense of the third or fourth, or vice versa. Considering that Sundar Pichai (Google’s CEO) told the US Congress that the algorithm looks at around 200 factors (subroutines), such changes occur more often than once a day. Taken individually, these changes have a fairly minimal impact on the order in which webpages are presented across Google’s results as a whole (though any one of them can be devastating, or very rewarding, for a single business).

Why does Google make these changes?

Only a select few at Google can see all the aspects weighted within the algorithm, and where in the algorithm each aspect feeds in. The reason is simple: this algorithm is at the core of how Google produces better results for queries than other search engines (so every potential competitor would love to know 100% of it).

However, with enough time and a static algorithm, it wouldn’t be impossible to reverse engineer it. This is likely one of the reasons Google keeps the algorithm in constant flux (to preserve what sets it apart from competitors).

The main goal of such updates, however, is to give Google users a better Search Engine Results Page (SERP) when they search for something. And Google doesn’t always get this right. The March 2019 Core Update had devastating effects on many businesses in the health industry, so in the June 2019 Core Update, Google either reverted some of those changes – or added something to compensate for them.

What does this mean?

When Google rolls out this Core Algorithm Update (over about a week), there will be some turmoil. Pages are expected to jump up and down in the Search Engine Results Pages (SERPs) until things settle. Some industries won’t notice anything, while others will notice a lot.

So, if you’re experiencing a surge or a loss of traffic, you should wait until the Core Algorithm Update has finished rolling out (around October 2nd) before starting to evaluate whether this is something that needs to be addressed.

Businesses that lose visibility might feel like they are being penalized, while others will feel they have finally been rewarded for their efforts. Google won’t say which industries the update was meant to affect, or which practices it rewards or punishes. The only thing website owners can do, according to Google, is to focus on their users: provide better user experiences, better answers, and simpler visits. Even if Google doesn’t reward it just yet, your website visitors will appreciate it until Google does.

PS! Considering the timing of this blog entry from Google, the September 2019 Core Algorithm Update most likely includes support for the new preview options. You can find more on this here: Google Webmasters Central Blog.