Page cannot be displayed due to robots.txt

Last updated: December 6, 2021

A page that cannot be displayed due to robots.txt is one of the most common errors users face in Google Search Console. The error message usually reads as follows: “pages cannot be crawled due to robots.txt restriction” or “pages cannot be displayed due to robots.txt”. This is not a very difficult error to resolve, so continue reading to find out how to do just that.

Why does this happen?

This happens because of a few lines in the robots.txt file that read:

User-agent: *
Disallow: /

This tells crawlers that they shouldn’t crawl your site, and the * indicates that the rule applies to any and every bot. If a site isn’t crawled then it isn’t indexed, and if it isn’t indexed, it won’t show up in search results. Alternatively, if the text reads:

User-agent: Googlebot
Disallow: /

the page will still show up in the results, but with an incorrect title and a meta description like “No information is available for this page.”
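If you are unsure how a given robots.txt file treats a particular crawler, you can check it programmatically. The sketch below uses Python’s standard urllib.robotparser module; the example.com domain and page path are placeholders for your own site.

from urllib.robotparser import RobotFileParser

# Placeholder domain: substitute your own site here.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for agent in ("*", "Googlebot"):
    allowed = parser.can_fetch(agent, "https://www.example.com/example-page/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")

Running this against your own domain shows at a glance whether the rules above are blocking every bot or only Googlebot.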

 

How to resolve this error

As mentioned, the fix for this is rather easy and only requires you to edit your robots.txt file. You have two options for fixing this error: allow all robots to crawl your page, or allow only Googlebot to crawl it.

To allow any robot to crawl the page, you should edit the robots.txt file to read:

User-agent: *
Allow: /

If, on the other hand, you only want Googlebot to crawl your page, then the text should read:

User-agent: Googlebot
Allow: /

And if for some reason you only want Googlebot to crawl a specific page, it should read:

User-agent: Googlebot
Allow: /example-page/

The same pattern applies to disallowing Googlebot (or any other bot) from crawling a specific page, which would then read:

User-agent: Googlebot
Disallow: /example-page/
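After editing robots.txt, it is worth sanity-checking the new rules before uploading the file. One way, sketched below with Python’s urllib.robotparser, is to parse the edited lines locally and query them; the domain and page paths are placeholders.

from urllib.robotparser import RobotFileParser

# The rules you intend to deploy, pasted as a string for local testing.
rules = """\
User-agent: Googlebot
Disallow: /example-page/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # parse locally, without fetching anything

# Googlebot should be blocked from the disallowed page but free elsewhere.
print(parser.can_fetch("Googlebot", "https://www.example.com/example-page/"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/other-page/"))    # True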

