Robots.txt and Google

Google’s recent announcement makes robots.txt all the more important. John Mueller from Google explained that if Googlebot requests the robots.txt file for your website and finds it unreachable, Google prefers to stay on the safe side and will not index the website.

By “unreachable” he meant that the server times out or fails to send any proper response. It does not mean that you are required to have a robots.txt file. If Google requests one and receives a 404 response, Google will still index the site. Only when the server fails to return a proper response to the robots.txt request will Google refrain from indexing the site.
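If you want to see which category your own site falls into, you can check the HTTP status and reachability of your robots.txt directly. Below is a minimal sketch using Python's standard library; the example.com URL is a placeholder for your own domain, and the 10-second timeout is an arbitrary assumption, not a threshold published by Google.

    import urllib.error
    import urllib.request

    # Placeholder URL -- replace with your own domain.
    ROBOTS_URL = "https://www.example.com/robots.txt"

    def check_robots_txt(url, timeout=10):
        """Report whether robots.txt returns a clean response, a 404, or is unreachable."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                print(f"{url} returned HTTP {resp.status}: the file is readable.")
        except urllib.error.HTTPError as e:
            if e.code == 404:
                print(f"{url} returned 404: no robots.txt, Google will still index the site.")
            else:
                print(f"{url} returned HTTP {e.code}: a server error may keep Google from indexing.")
        except (urllib.error.URLError, TimeoutError) as e:
            print(f"{url} is unreachable ({e}): Google would hold off on indexing.")

    check_robots_txt(ROBOTS_URL)

Running it once from the command line is enough to tell a clean 200, a harmless 404, and a timeout or connection failure apart.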

Keeping in mind the legal issues that often crop up for Google, this is a fairly logical stand: a website may have blocking or no-index directives in its robots.txt file, and Google could end up indexing it anyway if it is unable to read the file.
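For reference, the file Google is trying to fetch is just plain text at the site root. The hypothetical example below shows the kind of directives the paragraph above refers to: the Disallow rule is part of the robots.txt standard, while the Noindex line was only ever an unofficial directive that Google honored informally, not an official part of the protocol.

    User-agent: *
    Disallow: /private/
    Noindex: /private/

If Google cannot read this file at all, it has no way of knowing those restrictions exist, which is exactly why it chooses not to index the site in that situation.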

Although a server timeout on robots.txt is a rare scenario, it was good of Google to communicate this and alert webmasters.


About "" Has 249 Posts

Check out the About SRC Page for more details about Saptarshi Roy Chaudhury.

Leave a Reply

Your email address will not be published. Required fields are marked *