SEO Dictionary of the Royal Digital Language

We present a complete SEO dictionary with all the terms related to the subject, so you can clear up your digital doubts.

Within SEO there are many terms and concepts that we need to be clear about when optimizing and positioning our site or project in the best possible way.

SEO dictionary with all the necessary terms to optimise and position your website.
The correct interpretation of a term is fundamental. That’s why in this section we will be adding new terms and “words”, all related to SEO so that you are always up to date.

A

Google Algorithm


The Google Algorithm is the way in which the search engine ranks pages in response to a search; in other words, it is what decides whether you appear first, second or on the second page.

This algorithm changes about 500 times a year and it is difficult to keep track of it. That’s why it’s best to be aware of important changes such as Panda and Penguin, how they affect SEO and how to recover from them.

Anchor text


Anchor text is the text visible in a link or hyperlink that provides information about the content to which we want to direct the user and search engines.

Search engines have improved over time and use more and more factors to build their rankings. One of these metrics or factors is link relevance. The relevance of a link depends both on the authority of the page the link comes from and on its visible anchor text. Of course, the link must always be as natural as possible, or Google will read it as a bad practice.

We can classify anchor texts into the following types:

Naked or without anchor text. Only a URL is shown. For example: www.40defiebre.com
Generic. Includes words such as: “this blog”, “click here”, “this page”.
Keyword. Depending on whether we are interested in positioning one or another keyword, we use a different anchor text and choose the terms we want to highlight, for example “Link Building”.
Brand name. When it consists of text different from the previous types and the objective is to link to a brand, a website, etc. The link would read: “40defiebre”.
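
As a minimal HTML sketch of the four types (the URLs and paths are placeholders, not real pages):

<!-- Naked: the URL itself is the visible text -->
<a href="https://www.40defiebre.com">www.40defiebre.com</a>

<!-- Generic: a non-descriptive call to action -->
<a href="https://www.40defiebre.com/guia-seo">click here</a>

<!-- Keyword: the term we want to rank for -->
<a href="https://www.40defiebre.com/link-building">Link Building</a>

<!-- Brand name -->
<a href="https://www.40defiebre.com">40defiebre</a>
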
B
Backlinks
Backlinks are the incoming links that point from other pages to your own. The number of backlinks to your site matters because the more relevant the sites linking to you, the more visibility your site gains in the eyes of Google. Make sure they are natural, appropriate links: always quality before quantity.

Black Hat SEO or negative SEO
In SEO, Black Hat is the attempt to improve the search engine ranking of a website by using unethical techniques or techniques that contradict Google’s guidelines, “cheating”. These practices are increasingly penalised by Google. Some examples of Black Hat SEO are:

Cloaking
SPAM in forums and blog comments
Keyword Stuffing
C
Keyword cannibalisation
Keyword cannibalisation occurs when several pages on a website compete for the same keywords. This confuses the search engine, which cannot tell which page is the most relevant for that keyword, and causes a loss of rankings.

How do you solve this? The easiest way is to focus each page on one or two keywords at most. When that cannot be avoided, create a main product page from which the pages for the different formats can be accessed, and include on each of them a canonical tag pointing to the main product page (see the canonical tag entry below for the exact markup).

Cloaking
Cloaking is a widely used Black Hat SEO technique that consists of displaying different content depending on whether it is a user or a search engine robot that reads it.

Google is very hard on this practice, and although it may have worked years ago, forget it: it goes against what search engines are pursuing with their updates, a more natural, ethical and user-focused SEO.

Duplicate content
Duplicate content occurs when the same content appears at multiple URLs. In principle this is not a reason for penalisation, unless a high percentage of your website is duplicated. Having a few duplicate pages will not make Google angry with us, but avoiding them gives Google clues that we are on the right track.

Although it does not imply a penalty, it can generate a loss of positioning potential due to search engines not knowing which are the most relevant pages for a given search.

CTR
The CTR (Click Through Rate) is the number of clicks a link gets in relation to its number of impressions. It is always calculated as a percentage, and is a metric that is commonly used to measure the impact a digital campaign has had.

How to calculate the CTR?

As we said before, CTR is calculated as a percentage. It is obtained by dividing the number of clicks a link has received by the number of times it has been seen by users (impressions), then multiplying by 100.

Let’s see an example: imagine we have a result in Google that has been seen 2,000 times and has received 30 clicks. Our CTR would be calculated as follows:

CTR = (Clicks / Impressions) × 100 = (30 / 2,000) × 100 = 1.5%
D
Keyword Density
Keyword density is the percentage of times a word (or series of words) appears in the text as a whole compared to the total number of words.
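
For example, if a keyword appears 10 times in a 500-word text, its density is (10 / 500) × 100 = 2%.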

A few years ago, keyword density was one of the most important factors in SEO positioning, as it was the method used by search engines (Google, Yahoo, Bing) to identify the main topic of a page.

However, SEO has changed: Google’s guidelines now recommend writing as naturally as possible, i.e. writing for the user rather than for the search engine.

Although some people still recommend not to exceed 3% keyword density, there is no ideal percentage.

E
Canonical tag
The Canonical tag was introduced by Google, Yahoo! and Bing in 2009 to solve the problem of duplicate or similar content in SEO.

If there is no canonical tag in your code on a set of pages with duplicate or similar content, search engines will have to decide which URL best matches what the user is specifically looking for. However, if we introduce this tag, we are the ones telling Google and other search engines which is our preferred page. This will improve the process of indexing and positioning of our website in SERPs.

Example of a canonical tag, placed inside <head> (the href is a placeholder for your preferred URL): <link rel="canonical" href="https://www.example.com/preferred-page/" />

Let’s see an example: if our website is the platform from which we sell flats in the Chueca neighbourhood of Madrid and we have several pages with very similar content, we must choose as canonical the URL we want to rank for. This can be the one that has brought us the most traffic or the one that brings us the most profit.

To use the canonical tag effectively in SEO, just follow these steps:

Choose which is the canonical or main page.
Decide which are your secondary pages that can compete with the main one.
Add the canonical tag on the secondary pages, pointing to the main page, between <head> and </head>.
Put a canonical tag on the main page pointing to itself, also between <head> and </head>.
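
A minimal sketch of the result, assuming a hypothetical main page /flats-chueca/ on example.com:

<!-- On each secondary page, inside <head>: point to the main page -->
<link rel="canonical" href="https://www.example.com/flats-chueca/" />

<!-- On the main page itself, inside <head>: a self-referencing canonical -->
<link rel="canonical" href="https://www.example.com/flats-chueca/" />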

Robots tag
The Meta Robots tag is an HTML tag that is used to tell search engines how to treat a URL.

This tag is necessary if we do not want our website to be indexed or positioned in search engines.

This function can also be performed through the Robots.txt file of the page.

The difference between using the Meta Robots tag and the Robots.txt file is as follows:

With the Meta Robots tag, we tell Google that we do not want to index certain pages, but that we are interested in having them crawled by the bots.
However, if we use the Robots.txt file, we tell the bots not to bother to enter and crawl certain pages.
This difference is important to keep in mind. You will understand it better with an example:

Imagine you have two URLs that you don’t want to appear in Google’s index.

URL 1: blocked by the robots.txt file

This URL will not be crawled and will not be indexed (a priori; never trust Google 100% :-P).

URL 2: blocked with the meta robots tag

This URL will not be indexed, but it will be crawled by search engines: all of its content will be analysed, and the bots will be able to follow its links to other pages.
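
A sketch of both options (the path is a placeholder):

In robots.txt, at the root of the domain, for URL 1:

User-agent: *
Disallow: /url-1/

In the <head> of URL 2: do not index it, but do follow its links:

<meta name="robots" content="noindex, follow" />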

G
Google Panda
Google Panda is a change in Google’s algorithm that was released in the United States in February 2011 and in Europe in April 2011. When it came out, it affected more than 12% of all search results.

The maxim with Panda, to avoid being penalised, is to make sure that your content is totally original and provides great value to your user, that you keep the page updated, and that you even look for new formats to enrich your contribution to the user. The most actionable metrics in this case are the bounce rate, the CTR of your search results, dwell time and the number of page views.

Google Penguin
Penguin is the official name for Google’s algorithm update designed to fight webspam. The update was launched in April 2012.

It focuses on the off-site factors of a website, rewarding those sites that have a link profile with links from high quality, non-manipulated domains, and trying to punish those pages that have violated Google’s guidelines, have unnatural link profiles, too many links on low-quality sites, etc.

It was Google who, from the beginning, decided that links pointing to a website were a sign that its content was relevant. That’s why everyone started generating links on a massive scale. However, with Google Penguin, Google has gone back on its own word.

Improvements to the algorithm include better detection of low value links, purchased links, links in article networks, directories and basically any dynamic that involves trying to modify the link profile of your website. The best way to make sure Penguin doesn’t penalise you is to respect Google’s guidelines and passively generate links through your content.

How does SEO change with Google Penguin?

Natural links: those generated passively or through real value. Article syndication, spinning, hidden links, directories (free or paid), promotions that result in links, etc. are prohibited.
Anchor text variety: it no longer makes sense to generate links whose anchor text is exactly the keyword you want to rank for. If Google detects a pattern it does not consider natural, it can penalise you.
Search in your niche: the most valuable links are those from domains and pages in your niche or that cover related topics.
Quality, not quantity: It is preferable to generate few quality links than many of little value.

K
Keyword
Keyword refers to the term or terms for which we want to attract traffic to our website through search engines. You must take into account some factors associated with keywords (abbreviated KW), such as competition, search volume, conversion, or even their potential as a branding tool.

The choice of one keyword or another will condition the strategy, the content of a page, the appearance of that keyword in texts and tags, and other SEO positioning factors.

Keyword stuffing
Keyword Stuffing is a Black Hat SEO technique that consists of the excessive use of keywords within a text with the misguided aim of giving more relevance to this word. Google very often penalises this type of over-optimisation.

To avoid any kind of negative action by Google, texts should always be written to provide value to the user, and in the way that best suits your audience profile. If the text manages to provide useful, original and well-synthesised information, this will be a better indicator for Google than any variation in the number of keywords in the text.

There is no percentage that defines a perfect keyword density and Google recommends naturalness above all.
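
A quick, invented illustration of the difference:

Stuffed: “Cheap running shoes: our cheap running shoes are the best cheap running shoes for buying cheap running shoes online.”
Natural: “Looking for affordable running shoes? Here’s how to choose a pair that fits your budget and your training.”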

L
Link baiting
Technique of attracting links organically by creating high-value content. One of the essential factors for search engine positioning is the number of links to a given page.

Link Baiting aims to get a large number of users to link to content on our site. To do this we must create original, relevant and innovative content, such as articles, videos or infographics that attract the attention of users.

Link building
Link building is one of the fundamentals of SEO, which seeks to increase the authority of a page as much as possible by generating links to it.

The algorithms of most search engines, such as Google or Bing, are based on on-site and off-site SEO factors; the latter rest on the relevance of a website, whose main indicator is the links pointing to it, or backlinks. Other factors also count, such as the anchor text of the link, whether the link is follow or nofollow, brand mentions, or links generated on social media.

It is important to bear in mind that good content tends to be linked to naturally, so getting links happens organically and with less effort than through other means.

Link juice
This is the authority that a page transmits through a link. Google positions web pages according to their authority and relevance; this authority is transferred from one page to another through links, and this transmitted authority is what we call link juice.

To understand it, think of a web page as a big glass of juice with several holes (links) at the base. A glass with one hole will transmit all its link juice through that single hole. If it has 10 holes, each hole will pass 10% of the total link juice, and so on.

Long tail
The long tail is a statistical term that refers to the distribution of a population.

Let’s say your website attracts traffic through 100,000 keywords, and you focus on the 100 with the most visits. Imagine that around 20% of total traffic (depending on the nature of your website) corresponds to these terms; the remaining 80% will correspond to terms with a very small number of searches each. So the vast majority of the traffic your website attracts comes from terms you are not analysing and may not even know about.

This is what we call long tail, the searches with more specific terms that individually generate very little traffic, but as a whole are the largest source of visits to the web. The term is applicable to other realities apart from online marketing; it was popularised by Chris Anderson in an article in Wired, giving examples of companies that have succeeded thanks to the business generated by their long tails, such as Amazon, Netflix or Apple.

M
Meta Tags
Meta tags are pieces of information included in web pages that the user does not see directly. They are used to provide information to browsers and search engines, helping them interpret the page better, and they are written in HTML within the web document itself.

Meta tags have proven to be important for SEO because of their ability to affect the behaviour of search engines, providing information about which pages a website should rank for, giving a description of the website or blocking access or indexing of the website by search engine robots.
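
A short sketch of common meta tags inside the <head> of a page (all values are placeholders; the <title> tag is shown alongside them because it plays a similar role):

<head>
  <title>SEO Dictionary | Example Site</title>
  <meta name="description" content="All the SEO terms you need, explained simply." />
  <meta name="robots" content="index, follow" />
</head>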

Microformats
Microformats are a simple form of code that gives semantic meaning to content so that machines can read it and understand our products or services.

If we add microformats to our website, Google can read them and display that information in search results. This information can include user ratings, a photo and author name, video, audio, etc.
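
A minimal sketch using the classic hReview microformat (the names and values are invented):

<div class="hreview">
  <span class="item"><span class="fn">Example Restaurant</span></span>
  Rating: <span class="rating">4.5</span> out of 5,
  by <span class="reviewer">Jane Doe</span>
</div>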

N
Not provided
The term “not provided” is used in Google Analytics to identify all “secure” traffic within Google, i.e. all traffic that comes from users who are logged in to their Google account. For these visits, Google does not reveal the search term the user typed.

What happens with this data, and what do I do with it? There are different ways to interpret it and work around it.

O
Off-site SEO
This is the part of SEO work that focuses on factors external to the web page we are working on but that still affect our site, including external links, social signals, mentions and other metrics that reinforce the page’s authority.

One of the most important off-site SEO tasks is link building: generating links on external websites that point to your page, which signals greater relevance to Google.

On-site SEO
On-site SEO or On-page SEO is a set of internal factors that influence the positioning of a website. These are those aspects that we can change ourselves on our page such as:

The meta information, such as the title or the meta description.
The URL
The content
The alt attribute on images
The web structure
Internal linking
HTML code
On-site SEO optimisation is an essential process that every website must take care of if it wants to appear in search results.
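
A small HTML fragment illustrating two of the on-page elements from the list above (paths and filenames are placeholders):

<!-- Descriptive alt text on an image -->
<img src="/images/seo-dictionary.png" alt="Cover of the SEO dictionary" />

<!-- Internal link with a descriptive anchor -->
<a href="/blog/link-building/">our link building guide</a>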

P
Pagerank
Page Rank is the way in which Google measures the importance of a website; the search engine rates the value of pages on a scale of 0 to 10.

When a page links to another website, it transmits a value, and this value depends on the Page Rank of the linking page itself.

Google has now stopped updating Page Rank publicly, so no one outside Google can see how a website is rated by the search engine.

However, although Google still uses it internally to establish its search results, it carries less and less weight within the algorithm as a whole.

The Page Rank is given by factors such as the number of links and domains pointing to the website, the quality of these, the age of the domain, etc.

Q
Query
The English term “query” means question or request. In databases, a query or query string is a request for data stored in the database, although generically it can refer to any interaction. In search engines, a query is the term we type into Google, a search that will then produce a SERP.

R
Search Engine Ranking
Search Engine Ranking is the position your website occupies on a search results page; that is, the position in which you appear in Google, Yahoo, Bing… when a user performs a search.

To improve our positioning we must use strategies and tools that help us to optimise our website, increasing accessibility, usability and content.

S
Schema Markup
Schema Markup is the specific vocabulary of tags (or microdata) that you can add to the HTML code of your website to provide more relevant information. This way, search engines understand your content better and deliver better results. It also improves the way your page is displayed, with rich snippets that appear below the page title.

Schema.org is the reference website for this type of strategy, where you will find all kinds of hierarchies and ways to organise your content. But wait, what can I structure? Hundreds of things! Nowadays there is a wide variety of tags to structure, and surely more will come in the future: places, events, films, books, recipes, people, etc. Also, to make it much easier, Google created its Structured Data Markup Helper. Very useful.
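
A minimal sketch of schema.org vocabulary expressed as JSON-LD, one of the formats Google reads (all values are invented placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Potato omelette",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.7", "reviewCount": "152" }
}
</script>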

SERP
(Search Engine Results Page) refers to the results page of a search engine, such as Google or Bing.

It is the page that appears after a search, where the results are displayed in order.

The more a website is optimised according to the quality criteria of search engines, the more likely it is to rank better in the SERPs.

Sitemap
A sitemap is an XML document that is submitted to search engines. This document gives search engines a complete list of the pages that make up a website, so that they can index pages their robots could not otherwise reach, because no direct links point to them, because they sit behind a form, etc.
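
A minimal sitemap sketch (the domain, paths and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-dictionary/</loc>
  </url>
</urlset>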

Spinning
Spinning is a Black Hat SEO technique that refers to the creation of an article by reusing different original texts.

In this way, content generation is accelerated in a simple way. It can be carried out with software that automates the modification of the content, or manually, making texts appear different through synonyms or changes in word order.

Although this technique has been widely used, doing it automatically is one of Google’s penalisation triggers. Since its now famous Penguin update, Google has been detecting these practices more often.

W
White Hat SEO
White Hat SEO refers to the ethically correct techniques that comply with the guidelines set by search engines to position a website.

Its aim is to make a page more relevant to search engines through practices that the search engines themselves recommend.

White Hat SEO is the most beneficial way to optimise the positioning of a website in the medium-long term.
