The Seekers

Internet search engines


Search engines have become the main personal assistant for Internet users.

In this chapter, we are going to take a closer look at them and at how they work.

What is a search engine?

The basic answer would be to define it as a computer system that searches the files stored on web servers for content related to a query. We could put it another way:

“A search engine is a system that offers answers to a question.”

We type the question into the search box, and the engine is in charge of offering us the best answer.

Search engines are classified into four types according to how they work:

  • Hierarchical search engines.
  • Directories.
  • Metasearch engines.
  • Vertical search engines.

Let’s get to know each one of them:

Hierarchical search engines: the best known and most used today, such as Google and Bing. They index the content of websites and display it on their results pages in a hierarchical order established by their algorithms.


Directories: the first Internet search engines, characterized by organizing information by category, theme, location, etc. Today they are still an active and influential part of SEO, as we will see later. Yahoo, Terra, Dmoz… do they sound familiar?


Metasearch engines: a metasearch engine combines the results of several hierarchical search engines into a single results page. MetaCrawler is one of the best known.



Vertical search engines: search engines that focus on a specific sector. This allows them to analyze the information in greater depth and offer the user advanced search tools. For example, the search engine Nestoria specializes in the real estate sector.


Once we have chosen the hierarchical search engine on which we will execute our SEO strategy, it is important to understand how it works, because several factors that are decisive for SEO positioning depend on it.

How does a hierarchical search engine work?

Search engines basically work on two screens. One is the screen we find when accessing the engine, containing the search box. The other is the results page, which we will refer to from now on as the SERP (Search Engine Results Page).

To show us their results, search engines need to store a huge volume of information from the web. For this they use applications called "robots" or "spiders". These applications read the content of the billions of web pages on the Internet and store it in the engine's database, from which the results are later served. The process of reading and storing that information is called the "indexing process".

How does a search engine see our website?

The spiders or search engine robots read and interpret our web in the same way that we find and read a book in a library:

  • The first thing we look at is the cover. Likewise, the cover of our website (its home page) is essential in the eyes of search engines. On the cover, the most relevant element is the title, and as you can see, web pages also display a title. The images and content of our page define, in the eyes of the search engine, the relevance of our content for the keyword phrases we want to rank for.
  • Once we open the book, we find another determining factor: its index which, applied to our website, would be the main menu. Bear in mind that the order of the main menu signals the relevance of the content: elements placed further to the left are treated as more relevant by the search engine. For this reason, on the website I show you, we have placed the term "SEO positioning" in the first position of the menu.
  • As we continue reading, we find paragraph headings, bold text, italics… in short, books give us visual cues that mark highlighted and relevant content. We can do the same on our website by using HTML markup and style sheets. But beware! You have to know the limits to avoid penalties and, in this course, you will discover them.
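The analogy above can be made concrete. The following sketch, using only Python's standard library, pulls out the "cover and headings" signals a spider would look at: the page title, headings, and emphasized text. The class name and the sample HTML are hypothetical, purely for illustration.

```python
from html.parser import HTMLParser

# Minimal sketch: collect the relevance signals discussed above --
# the <title>, headings, and emphasis tags -- in document order,
# much as a search engine spider would when reading a page.
class SignalParser(HTMLParser):
    RELEVANT = {"title", "h1", "h2", "h3", "strong", "em"}

    def __init__(self):
        super().__init__()
        self.signals = []      # (tag, text) pairs in document order
        self._current = None   # relevant tag we are currently inside

    def handle_starttag(self, tag, attrs):
        if tag in self.RELEVANT:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current and data.strip():
            self.signals.append((self._current, data.strip()))

# Hypothetical page: a cover title, a heading, and a bolded keyword.
html = ("<html><head><title>SEO positioning</title></head>"
        "<body><h1>Our services</h1>"
        "<p>We offer <strong>SEO</strong> audits.</p></body></html>")
parser = SignalParser()
parser.feed(html)
print(parser.signals)
# [('title', 'SEO positioning'), ('h1', 'Our services'), ('strong', 'SEO')]
```

Note how the plain paragraph text is ignored while the marked-up fragments are kept: this is why the title, headings, and emphasis carry disproportionate weight.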


How search engine robots read a website also depends on the user experience that the site offers.

What is user experience (UX)?

User experience is the perception a user has while performing a task on an on-screen interface, in this case a website. It refers to what the user feels when entering our website and encompasses the design of the site, the impression of the brand the user takes away, the trust it generates, and even the feelings it evokes.


UX and SEO: How does user experience affect SEO?

Currently, SEO and user experience are closely connected, because the main objective of search engines is for the user to find an answer to what they are looking for.

If the user experience of your website is adequate, users will find the information they need and navigate through the different sections, and with this you will be sending a clear message to the search engine: "this website has quality content related to the search I have made." This has many advantages for your website, which we will discuss later.

On the other hand, a bad user experience can cause the loss of conversions on the website and visibility in the SERPs, among other consequences.

How to improve the user experience?

We can improve the user experience of our website by enhancing different factors on our page, such as:

Increased loading speed

A key ranking signal for Google since its 2018 Speed Update. Users today do not have much time, so if they have to waste it waiting for a website to load, they will choose another result that offers a better experience.

By offering fast loading, we increase the chances that the user decides to stay on our website. But be careful: the user experience does not depend on loading speed alone. (Later we will look in more depth at loading speed and how it influences SEO.)


Quality content = better SEO

When the user makes a query in a search engine such as Google, they expect the result they click on to give them a complete, effective answer to what they were looking for.

If our content is not related to their search, that is, if we have not chosen the keywords well or have not developed the content successfully, the user will return to the SERP, with all the consequences this entails: a drop in CTR, an increased bounce rate, fewer conversions, and more users going to our competition. All of this has a negative impact on our visibility and SEO positioning.


Responsive design: Mobile adaptation

Nowadays, the percentage of users who access the Internet from a smartphone far exceeds those who access it from a desktop, so offering a good user experience on these devices is vital if you want to maintain the flow of visits to your website. A website that does not adapt to a mobile screen, that is, one without a responsive design, will rank worse than one that does.

In addition, we must take into account Google's shift to mobile-first indexing.

Clean and clear design

Good site design strikes a balance between clarity and elegance. When the design makes the site unusable, unintuitive, and unclear, the user ends up confused and gives up the battle of finding what they were looking for. The consequence is exactly what we do not want: handing that user over to another result in the SERP, to our competition.

Advantages of having a good user experience

There are multiple advantages to, and reasons for, providing a good user experience on our website if we do not already do so.

CTR improvement: What is CTR?

The CTR (Click Through Rate) is the percentage of clicks that a search result receives compared to the times it has been shown on the results page.
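The definition above is a simple ratio, sketched here in Python; the figures in the example are hypothetical.

```python
# CTR as defined above: clicks divided by impressions (the number of
# times the result was shown in the SERP), expressed as a percentage.
def ctr(clicks: int, impressions: int) -> float:
    if impressions == 0:
        return 0.0          # no impressions yet: CTR is undefined, report 0
    return 100.0 * clicks / impressions

# A result shown 2,000 times and clicked 90 times:
print(f"{ctr(90, 2000):.1f}%")   # 4.5%
```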

Among other things, a satisfactory user experience improves our CTR, since the user's Google search ends on our page instead of bouncing back to the results.

Decreased bounce rate

A direct consequence of an increased CTR is a lower bounce rate, since the user's search ends on our website.

Conversions increase

A high CTR combined with a low bounce rate has a very positive consequence: more conversions. This is because the user got a clear and quick answer to exactly what they were looking for. By providing that solution, we reduce the chance that the user leaves without converting, whether the conversion is sending a form, making a purchase, visiting a page, etc.

Improved positioning

A good site structure can get Google to show your result in the SERPs with additional links (sitelinks) to your site, as we can see in the following image.


In addition, all the above advantages will cause the Google robot to consider your site as a relevant website for the user, improving the positions in the SERP.

We have previously discussed how a spider or robot sees our website. But what exactly is a spider and how does it work?

What is a spider or Google robot?

The Google robot, which in a display of creativity was dubbed "Googlebot", is responsible for crawling new or updated pages and indexing them for inclusion in Google's index. It is considered one of the most powerful crawling and indexing tools ever created.

How does Googlebot work?

Googlebot begins its crawling task by accessing a list of page URLs generated during previous crawls. This list is complemented by the sitemaps that we provide through Google Search Console. The crawl is governed by algorithms that tell the robot which pages to crawl, how deep, and how often. These crawls are done at high speed.
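The scheduling just described can be sketched as a simple frontier-based crawl. This is a toy model, not Googlebot's actual algorithm: `fetch_links` is a hypothetical stand-in for downloading a page and extracting its outgoing URLs, and the link graph is invented.

```python
from collections import deque

# Simplified sketch of the crawl described above: the bot starts from a
# frontier of known URLs (previous crawls plus sitemap entries) and
# works through it, queueing newly discovered links as it goes.
def crawl(seed_urls, sitemap_urls, fetch_links, max_pages=100):
    frontier = deque(dict.fromkeys(list(seed_urls) + list(sitemap_urls)))
    seen = set(frontier)
    indexed = []
    while frontier and len(indexed) < max_pages:
        url = frontier.popleft()
        indexed.append(url)               # "index" the fetched page
        for link in fetch_links(url):     # discover its outgoing links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return indexed

# Hypothetical link graph standing in for the web; "/contact" is known
# only from the sitemap, which is why providing one matters.
graph = {"/": ["/seo", "/blog"], "/seo": ["/"], "/blog": ["/seo"], "/contact": []}
pages = crawl(["/"], ["/contact"], lambda u: graph.get(u, []))
print(pages)   # ['/', '/contact', '/seo', '/blog']
```

Note that the sitemap URL is crawled even though no page links to it; pages not linked from anywhere and absent from the sitemap would never enter the frontier.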

Crawling and indexing are what make our website appear in the search results, provided the robot finds relevant content on our site that answers users' questions. This is how we give our website visibility.

To this end, we must make the Google spider's job as easy as possible: indicating which content it should access so that it can crawl it and display it in the SERP and, conversely, blocking access to content that we do not want recorded in the index.
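The standard mechanism for this blocking is the site's robots.txt file. The sketch below uses Python's standard-library `urllib.robotparser` to check which URLs a crawler is allowed to fetch under a given set of rules; the domain and paths are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: let Googlebot crawl everything except /private/.
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler asks before fetching each URL:
print(rp.can_fetch("Googlebot", "https://example.com/seo-guide"))   # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

Keep in mind that robots.txt only asks crawlers not to fetch a page; to keep an already-known URL out of the index, Google also relies on signals such as the noindex directive.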

For the robot to include our pages in the SERP, we must offer it unique, quality content that complies with Google's guidelines; otherwise, the spider may decide that the content is not relevant enough to show to users. We will talk in more detail later about how to create quality content and optimize it for Google.

Google robots or spiders

  • APIs-Google: used by Google APIs to deliver push notification messages.
  • AdSense: visits the page to understand its content and serve relevant ads.
  • AdsBot Mobile Web: performs quality checks on ads placed on mobile web pages. There is one robot for Android and another for iPhone.
  • AdsBot: checks the quality of ads displayed on desktop web pages.
  • AdsBot-Google-Mobile-Apps: checks the quality of ads placed in Android applications, following the same rules as AdsBot.
  • Googlebot-Image: crawls images for Google Images.
  • Googlebot-News: crawls articles for Google News.
  • Googlebot-Video: crawls video content.
  • Googlebot: the main crawler for web pages.

These are the robots that the Google search engine uses to crawl websites today.