
Audit Robots.txt | Robots Exclusion Checker

Short description: Audit Robots.txt

Robots Exclusion Checker Highlights
Robots Exclusion Checker is an extension that detects robots exclusions that prevent a web page from being crawled and indexed.

Long description: Audit Robots.txt

Description Robots Exclusion Checker

Robots Exclusion Checker is a Google Chrome extension that detects the robots meta tags, robots.txt exclusions and x-robots-tags present on a given URL. Specifically, the extension was designed to help SEO specialists identify exclusions that would prevent a web page from being crawled and indexed by search engine robots.

What is Robots Exclusion Checker? 

Robots Exclusion Checker allows you to check whether a robots exclusion affects the crawling and indexing of a given URL. Thanks to simple color indicators, the extension lets you see at a glance the indexability status of a web page by search engines (source: Robots Exclusion Checker):
  • Red means the URL has exclusions that block the search engine crawling process.
  • Yellow signifies a warning: elements are present that can potentially prevent the page from being crawled by search engines.
  • Green means crawling of the page by search engine crawlers is allowed, so it can be indexed and ranked in search results.
When an indexing problem is detected, the extension also displays the cause of the blocking to help you solve the problem as quickly as possible. This saves you time and ensures that all your important pages are accessible to search engines.
Robots Exclusion Checker is practically the first extension to bring together, in a single interface, all the elements that can prevent a URL from being crawled and indexed by search engine robots, all in a simple, easy-to-digest format. The extension can be downloaded from the Chrome Web Store and is particularly suitable for digital marketers in charge of e-commerce stores or large, complex sites.

Robots Exclusion Checker: How to install and use the extension? 

Using the extension is straightforward:
  • Click the “Add to Chrome” button to install the extension in your browser.
  • Once the installation is finished, go to the page you want to analyze.
  • Open the list of extensions installed in your browser, then click on the Robots Exclusion Checker logo.
A pop-up window will automatically appear, presenting the information collected by the extension.

Robots Exclusion Checker: The SEO elements that are checked by the extension 

Here is the list of elements the Robots Exclusion Checker extension checks to detect potential indexing problems on your site:

1. Robots.txt Status

Robots Exclusion Checker crawls your pages and reports their robots.txt status as either “Allow” or “Disallow”. The tool also shows the matching rules found for the page, with a copy-to-clipboard feature. You also have the option to view your robots.txt file directly from the extension’s interface, with the matching rules highlighted. If no rule has been detected, there is no problem to report on the robots.txt side.
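To illustrate the kind of check involved, here is a minimal sketch using Python’s standard urllib.robotparser module; the URL and user agent are illustrative placeholders, not the extension’s own code:

```python
# A minimal sketch of a robots.txt "Allow"/"Disallow" check using Python's
# standard urllib.robotparser module. The URL and user agent below are
# illustrative placeholders, not the extension's own code.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt file

url = "https://www.example.com/some-page/"
if parser.can_fetch("Googlebot", url):
    print("Allow: Googlebot may crawl", url)
else:
    print("Disallow: a robots.txt rule blocks Googlebot for", url)
```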

2. Meta Robots Tag Status 

For the meta robots tag status of your pages, the extension displays icons in different colors depending on whether the meta robots tags direct crawlers to “index”, “noindex”, “follow” or “nofollow”. Directives that do not prevent the page from being indexed by search engines, such as “nosnippet” and “noodp”, are also displayed, but classified separately from the alerts. Robots Exclusion Checker makes it easier to review these directives by letting you see the full HTML robots meta tags from the source code directly in the extension’s interface.
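As a rough illustration of how robots meta tags can be detected in a page’s HTML, here is a short sketch using Python’s built-in HTMLParser; the sample markup is hypothetical:

```python
# A rough illustration of detecting robots meta tags in a page's HTML with
# Python's built-in HTMLParser. The sample markup is hypothetical.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # <meta name="robots" content="..."> carries the indexing directives
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives += [d.strip().lower()
                                for d in (attrs.get("content") or "").split(",")]

html = '<head><meta name="robots" content="noindex, follow"></head>'
finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)               # ['noindex', 'follow']
print("noindex" in finder.directives)  # True: the page is excluded from the index
```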

3. X-Robots Tag Status

Detecting robots directives in the HTTP header can be difficult. With the Robots Exclusion Checker extension, however, you can easily identify all exclusions and view the entire HTTP header.
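For illustration, here is a minimal sketch of reading the X-Robots-Tag header with Python’s standard library; the URL is a placeholder:

```python
# A minimal sketch of reading X-Robots-Tag directives from the HTTP header
# with Python's standard library. The URL is a placeholder.
from urllib.request import urlopen

with urlopen("https://www.example.com/") as response:
    # get_all() returns every occurrence, since X-Robots-Tag may be sent
    # several times with different directives (or per user agent)
    tags = response.headers.get_all("X-Robots-Tag") or []

if tags:
    print("X-Robots-Tag directives:", tags)
else:
    print("No X-Robots-Tag header: no exclusions at the HTTP level")
```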

4. Canonical Tag Status 

It is true that canonical tags do not directly influence the indexing of web pages, but they play an important role in how your URLs behave in search results. With Robots Exclusion Checker, you can identify pages that are indexable by search engines but present a canonical inconsistency. These pages are marked with an orange icon in the “Canonical Tag” section of the extension.
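To make the idea of a canonical inconsistency concrete, here is an illustrative sketch that compares a page’s URL to the canonical URL declared in its HTML; both URLs and the markup are hypothetical:

```python
# An illustrative check for the canonical inconsistency described above:
# the page is indexable, yet its canonical tag points to a different URL.
# Both URLs and the markup are hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

page_url = "https://www.example.com/page?ref=newsletter"
html = '<link rel="canonical" href="https://www.example.com/page">'
finder = CanonicalFinder()
finder.feed(html)
if finder.canonical and finder.canonical != page_url:
    # the kind of mismatch the extension flags with an orange icon
    print("Canonical mismatch:", page_url, "->", finder.canonical)
```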

5. Nofollow Tag Status

This is a new feature of the Robots Exclusion Checker extension that allows you to identify all links that use a “nofollow”, “ugc” or “sponsored” rel attribute value. 
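As an illustration, here is a small sketch that flags links whose rel attribute contains one of these three values; the sample markup is hypothetical:

```python
# A small sketch that flags links whose rel attribute contains "nofollow",
# "ugc" or "sponsored", the three values the extension reports.
# The sample markup is hypothetical.
from html.parser import HTMLParser

FLAGGED = {"nofollow", "ugc", "sponsored"}

class LinkRelFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            rels = set((attrs.get("rel") or "").lower().split())
            if rels & FLAGGED:
                print(attrs.get("href"), "->", ", ".join(sorted(rels & FLAGGED)))

html = (
    '<a href="/partner" rel="sponsored nofollow">Partner</a>'
    '<a href="/blog-comment" rel="ugc">Comment link</a>'
    '<a href="/about">About</a>'
)
LinkRelFinder().feed(html)
# /partner -> nofollow, sponsored
# /blog-comment -> ugc
```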

Robots Exclusion Checker: Pricing

Free Extension


Company: Audit Robots.txt

Company description: Robots Exclusion Checker

Robots Exclusion Checker is a tool developed by Sam Gipson, an active SEO specialist for 15 years. Currently an SEO consultant, he has set himself the task of helping digital professionals and companies improve the visibility of their web pages, which is one of the reasons he developed this SEO tool. Specifically, Robots Exclusion Checker is a Chrome extension that alerts you to search engine indexing or crawl restrictions. The extension has a simple interface and brings together all the elements that influence the indexing and crawling of websites by search engine robots. Downloadable for free, Robots Exclusion Checker is an asset for e-commerce site owners. Through its different features, this extension checks:
  • robots.txt;
  • robots meta tags;
  • x-robots tags;
  • canonical tags;
  • nofollow tags.


Alexandre MAROTEL

Founder of the SEO agency Twaino, Alexandre Marotel is passionate about SEO and generating traffic on the internet. He is the author of numerous publications and has a YouTube channel that aims to help entrepreneurs create their websites and rank better in Google.