5 Googley SEO Terms: Do They Mean What You Think They Mean?

The search industry uses a lot of specialized terms. These words have very specific meanings when it comes to search engine optimization (SEO) and how you implement your strategy, yet many of them are routinely used incorrectly.

Sometimes this misuse causes no issues. After all, a rose by any other name and all.

At other times, though, misunderstanding these terms can lead to mistakes that cost you traffic, position, and conversions in the long run.

So let's look at some of the most commonly misunderstood terms in Google search.

First up is robots.txt. Most people think this file is used to block content from the search engines.

This isn't how robots.txt works.

A robots.txt file stops a page or a section of a site from being crawled, but it does not keep the URL itself out of the index. That's why site owners wind up with bare, description-less listings in Google's search results.

Webmasters and site owners will erroneously add pages and folders to their robots.txt file, thinking this keeps those pages out of the index. Then they see the URLs in Google's SERPs and wonder how they got there.
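For reference, here is a minimal sketch of the kind of entry owners typically add; the wildcard user-agent and the /private/ and /drafts/ paths are hypothetical examples, not anything from a real site:

```
User-agent: *
Disallow: /private/
Disallow: /drafts/old-page.html
```

Those Disallow lines tell crawlers not to fetch the matching URLs; they say nothing about whether the URLs may appear in search results.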

Robots.txt only keeps the page content from being crawled; it won't prevent the URL from being indexed. If Google knows about the page from a link or some other source, it can still index the URL and show it with the infamous note that no description is available because of the site's robots.txt.
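To make the crawl-versus-index distinction concrete, here is a short sketch using Python's standard-library robots.txt parser; the rules, domain, and paths are made up for illustration:

```python
# Minimal sketch: how a crawler interprets robots.txt, using Python's
# standard-library parser. The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# robots.txt answers exactly one question: may this user agent fetch this URL?
print(parser.can_fetch("Googlebot", "https://example.com/private/report.html"))
# -> False: the page content will not be crawled.

# It says nothing about indexing. If the URL is discovered through a link
# elsewhere, the bare URL can still end up in the index; there is simply no
# crawled content available to build a proper snippet from.
```

If the goal really is to keep a page out of the index, the usual approach is a noindex directive (a robots meta tag or X-Robots-Tag header) on a page crawlers are allowed to fetch, though that goes beyond what this excerpt covers.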

Read the rest here:
5 Googley SEO Terms: Do They Mean What You Think They Mean?
