Google’s Algorithms

Google is the most popular search engine and the one that indexes the most content in real time. To keep delivering the quality results that maintain that position, Google continually implements improvements in the way websites are crawled, ranked, and indexed in its SERPs. This is where Google’s algorithms come in.

Google updates its algorithms more than 500 times a year, roughly one update every 17 hours. Most of these updates are of little relevance and do not usually change the search results or their positions much; they only correct certain errors or improve the efficiency of the crawler.

However, there are bigger updates that can affect SERP rankings, whether through changes to existing algorithms or the integration of new ones.

What is a Google algorithm?

Algorithms are highly complex computer programs created by Google that crawl and index websites to add them to results pages. Their mission is to emulate “human” moderation so that, through their different facets, they can automatically rank the most relevant websites in the most sensible and fair way, based on the criteria used by their users.

How does a Google algorithm work?

A Google algorithm’s function is to decide the positions in which a website will appear. Each algorithm is entrusted with a task and an area that it must analyze. Depending on how well we have met the requirements of the algorithm in question, it will decide the position in the SERP at which the website is displayed.

Each algorithm analyzes part of the website itself (on page) and what happens outside it (off page), following requirements and variables that are modified or improved with each update Google makes to them.

The most popular Google algorithms:


Mobile First Index

Google is getting more rigorous with searches on mobile devices. On its official blog it warned that “Mobile-first is not going to change the way a website’s ranking is calculated, but rather the way the information on that website is processed” (for now).

Likewise, Google reported that the most suitable mobile versions are responsive ones rather than separate mobile-specific versions. This is because it wants websites to show the same content on all devices. In addition, pages not adapted for mobile, or with slow loading speeds, will have less visibility in the search engine.

The explanation of why this algorithm is called Mobile First is straightforward.

Previously, the Google robot added pages to its index based on the classification the algorithm gave them by analyzing the desktop version. With the new update, the index shows results as Google crawls and rates the mobile version of the site.

What does this mean for our website?

Well, this update affects us differently depending on the state of our website’s mobile version.

  • If we only have a desktop version (it is not responsive)

You should redesign the website as soon as possible. This type of website greatly increases the bounce rate because of the large number of mobile users today, and the site will keep losing positions as a result. In a short time its ranking will be affected even more by this algorithm.

  • If you have a mobile version on an independent URL

The site will begin to lose positions: now that Google crawls websites from their mobile version, any content missing from the mobile version compared with the desktop version will never be detected.

  • If your website is responsive

You have nothing to worry about; these are the web quality standards that Google wants in its results.
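A responsive page normally declares a viewport meta tag in its `<head>`. As a rough sketch of how such a check could work (this is only a heuristic, not Google’s actual mobile-first test; the function name and sample markup are invented for illustration):

```python
import re

def has_responsive_viewport(html: str) -> bool:
    """Heuristic: does the page declare a responsive viewport meta tag?

    This is a simplified illustration, not Google's real mobile-first check.
    """
    # Look for a <meta name="viewport" ...> tag anywhere in the markup.
    pattern = re.compile(r'<meta[^>]+name=["\']viewport["\'][^>]*>', re.IGNORECASE)
    match = pattern.search(html)
    # Responsive designs normally set width=device-width inside that tag.
    return bool(match and "width=device-width" in match.group(0))

responsive = '<head><meta name="viewport" content="width=device-width, initial-scale=1"></head>'
desktop_only = "<head><title>Old desktop site</title></head>"
print(has_responsive_viewport(responsive))    # True
print(has_responsive_viewport(desktop_only))  # False
```

A real audit would also look at media queries, tap-target sizes, and font scaling, but the viewport tag is the quickest first signal.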

Update “Brackets”

The changes in results were noticed at the beginning of March 2018. Google had not confirmed it yet, but many SEO experts began to hear rumors and see website rankings fall. Later, a Google employee on Twitter named it Brackets and commented that it was not an update intended to punish sites. Rather, it was a routine update of Google’s core, which happens a couple of times a year and compensates websites that had so far been little rewarded.

This update boosted the ranking of websites that had not previously benefited despite their quality content and hard work to climb the SERPs, so the quality of the content a website offers has prevailed. Websites with rich cards configured have also benefited, which results in a better CTR. All of this is what Google appears to have taken into account when updating the core, since it has not confirmed what the update was about.
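Rich cards are driven by structured data embedded in the page, usually as JSON-LD inside a `<script type="application/ld+json">` tag. As an illustration only (all the metadata values below are invented), a minimal Article snippet could be generated like this:

```python
import json

# Hypothetical article metadata -- every value here is an invented example.
article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Google's Algorithms",
    "datePublished": "2018-03-01",
    "author": {"@type": "Person", "name": "Example Author"},
}

# This JSON is what would be embedded in the page's <script> tag.
json_ld = json.dumps(article, indent=2)
print(json_ld)
```

Pages marked up this way give the search engine explicit fields (headline, date, author) instead of forcing it to guess them from the HTML.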

What are the Google algorithms?

Google Penguin and Google Panda

Due to the importance of these algorithms, we have dedicated a chapter just to explaining how the Google Panda and Google Penguin algorithms work.

Google Fred

In 2017, an update unconfirmed by Google came to light, although some Google employees referenced it on Twitter, joking that the update could be called “Fred”. It can be considered similar to Google Panda: Google Fred crawls and rates content taking into account the quality and quantity of backlinks, the abusive use of advertising (placed with the sole objective of making money without contributing anything useful to the user), sites with poor usability or structure, and scarce, low-quality content, all of which influences the website’s positions in search results. The goal is clearly to penalize content-monetization networks.

Google Owl

The objective of this algorithm, launched in April 2017, is to include only real news and articles in the SERPs, excluding false and unpleasant content that may contain violence, explicit sexual content, or other offensive material. Alongside the algorithm update, Google also added, in the autocomplete section of its search bar, a link that opens a form for reporting inappropriate prediction suggestions.

Mobile Friendly

Given the growing number of mobile device users, and the fact that more than 60% of Google searches are performed on mobile devices or tablets, Google decided to take action by launching a new algorithm update in 2015 called Mobile Friendly.

The importance of a good functioning on mobile devices for Google is reflected in the large number of positioning factors that revolve around them.

We have decided to bring them all together in this section.

  • A good responsive design that allows comfortable navigation (easy reading, large buttons, clean aesthetics)
  • Light weight (if the site is loaded over a mobile network such as 3G or 4G, it may take longer than necessary)
  • Images and elements that fit within the frame
  • As little CSS as possible, and no Flash.
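On the “light weight” point, note that most web servers compress HTML before sending it. As a quick sketch (the sample markup is invented), gzip, the compression scheme most servers apply, can shrink repetitive HTML dramatically, so the bytes that actually travel over a 3G/4G connection are far fewer than the raw file size suggests:

```python
import gzip

# Invented sample page: highly repetitive markup, as real HTML often is.
html = b"<html>" + b"<p>content</p>" * 500 + b"</html>"

raw_kb = len(html) / 1024
gz_kb = len(gzip.compress(html)) / 1024
print(f"raw: {raw_kb:.1f} KB, gzipped: {gz_kb:.1f} KB")
```

This is why enabling server-side compression is one of the cheapest wins for mobile loading speed.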

We can know if Google interprets our website as “Mobile Friendly” or not from Google Search Console.

The fact that our website has good organic rankings on desktop does not imply that it has them on mobile. This is because the two are processed separately, so the user gets the best experience for each device.

If our website is dedicated to publishing news or articles, we can also consider implementing AMP. This standard eliminates almost all JavaScript and most CSS and compresses the HTML, making loading speed very high in exchange for a flatter visual style.
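AMP pages are recognizable by two visible markers: an `amp` (or `⚡`) attribute on the `<html>` tag and the AMP runtime script loaded from `https://cdn.ampproject.org/v0.js`. As a rough sketch only (a real AMP validator checks many more rules; the function name and sample page are invented), those markers can be detected like this:

```python
def looks_like_amp(html: str) -> bool:
    """Rough check for the two most visible AMP markers.

    A real AMP validator enforces many more rules; this is only a sketch.
    """
    # AMP documents declare the amp attribute (or the ⚡ emoji) on <html>.
    has_amp_attr = "<html amp" in html or "<html \u26a1" in html
    # They must also load the AMP runtime from the AMP CDN.
    has_runtime = "https://cdn.ampproject.org/v0.js" in html
    return has_amp_attr and has_runtime

page = (
    '<!doctype html><html amp lang="en"><head>'
    '<script async src="https://cdn.ampproject.org/v0.js"></script>'
    "</head><body></body></html>"
)
print(looks_like_amp(page))  # True
```

For publication, the official AMP validator should be used rather than a heuristic like this.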

Google Pigeon

In 2014, Google launched this algorithm to crawl and rate online businesses practicing local SEO. Google Pigeon was launched with the intention of benefiting both users and local web directories, with the idea of showing users more accurate results for their searches.

Due to complaints from Yelp, one of the best-known and most important web directories, Google decided to launch this algorithm and thus improve the results offered to users: the changes it brought affected not only organic search results but also Google Maps and Google My Business results.

Google Pigeon crawls and rates sites based on the quality of their content, the speed of the web, and the usability and structure of the site; it checks whether all SEO practices are White Hat, whether there is duplicate content, and whether external links provide quality and value to the website.

Google Payday Loan

With the arrival of this algorithm in 2013, Google intended to clean and filter search results prone to spam, illegal SEO techniques, pornography, drugs and the like.

This algorithm was first launched in the United States, where Google revealed that only 0.3% of results were affected. In the following months, as the algorithm rolled out to the rest of the world, it was found that 4% of search queries made from Turkey were affected, since it is a market with more fraudulent content. That is why this update did not affect all locations equally: it depended on the market of each location.

Google Knowledge Graph

This update, released in May 2012, gave Google more insight, removing the limitation of treating queries as simple terms. With this update, for each query a user makes about a celebrity, location, event, and so on, Google shows not only organic results but also semantic information that it has found and collected from different websites, such as Wikipedia.

Google Venice

A change in the Google Venice algorithm was presented in February 2012. With it, Google wanted to facilitate local searches in organic results, so that local businesses would benefit from the new update. To understand how this algorithm made searching easier: to find yoga classes in their city, users previously had to type “yoga classes in Madrid”; after this update, they only had to search for “yoga classes” and Google would return results based on their location.

Google Page Layout

This algorithm was officially revealed by Matt Cutts two months after its launch in 2012. The purpose of Google Page Layout was to avoid showing users websites whose top sections were filled with ads, forcing visitors to scroll endlessly before reaching the content and consequently offering a poor user experience.

Google Hummingbird

This algorithm, launched in 2013, is somewhat more advanced than the others. Its goal is to correctly understand complex searches by classifying them according to context, offering users just what they are looking for.

To use this algorithm to our advantage, we must apply a good long-tail technique, thereby achieving better positions in search results where more complex keywords have been used.

This algorithm update was designed to cover a gap in Google Panda: understanding that it should not show the same results for different questions or different uses of so-called long-tail keyword chains. These keywords are increasingly common due to mobile use and voice search. This gives Google Hummingbird the ability to respond differently to questions such as “where?” or “what?”, which is known as “the user’s moments”.

Google Caffeine

In August 2009, Google launched this algorithm to improve the crawling speed of its spiders, achieving its goal of showing users results with current content more quickly. As a benefit for webmasters, this sped up the page-indexing process.

More than an algorithm update that altered rankings, it was a change in the way the spiders indexed, in order to show users information more relevant to their queries.

Google Vince

This was one of the first algorithm updates (2009), launched to favor brands well known in the offline world that used good SEO practices. As Matt Cutts said, “Brands are the solution, not the problem, and they have a fundamental value for humans.”

With this, brands such as Vodafone and Sony benefited in the rankings for searches with highly competitive terms such as “mobile”.

What is Google Everflux?

At this point we already know the whole Google core, the number of updates it receives, and how this affects the movement of websites’ positions.

The fluctuations or variations continuously perceived in the SERPs are known as Google Everflux (formerly “Google Dance”). They range from the minimal variations noticeable after a routine update to the more notable variations that can occur with a major update and change to the core.