August 23, 2013

Solution: Robots.txt unreachable. We were unable to crawl your Sitemap

Posted by Bennixville   on
"Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely."

The error message above usually occurs when submitting your sitemap file to Google Webmaster Tools, and it can happen for several reasons.

First, the sitemap file you've submitted might be corrupted or in a format that doesn't meet Google's sitemap guidelines. Remember that Google accepts HTML and XML sitemap formats. For convenience, you can use one of the free sitemap creation tools online:
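If you'd rather see what a valid XML sitemap looks like, here is a minimal sketch in Python using only the standard library (www.example.com is a placeholder for your own domain; Google's guidelines cover the full set of optional tags):

```python
import xml.etree.ElementTree as ET

# Build a minimal XML sitemap in memory.
# www.example.com is a placeholder for your own domain.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=NS)
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "http://www.example.com/"
ET.SubElement(url, "lastmod").text = "2013-08-23"

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)

# Re-parsing the output confirms the file is well-formed XML;
# a corrupted sitemap would raise ET.ParseError here.
ET.fromstring(sitemap_xml)
```

Running the same round-trip parse on a sitemap you generated elsewhere is a quick way to rule out a corrupted file before blaming Google.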

After creating your sitemap file, download it and upload it to your site root using FTP. You can preview the sitemap file at the URL: www.your
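Besides previewing in a browser, a short script can confirm the upload is publicly reachable. This is a sketch with standard-library calls; `sitemap_status` is a helper name introduced here, and the example.com URL is a placeholder for your own domain:

```python
from urllib.request import urlopen
from urllib.error import URLError

def sitemap_status(url, timeout=10):
    """Return the HTTP status code for the URL, or None if unreachable."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return resp.status
    except URLError:  # covers HTTP errors and network failures alike
        return None

# www.example.com/sitemap.xml is a placeholder; substitute your own domain.
# A result of 200 means the file is publicly reachable; None means it isn't.
print(sitemap_status("http://www.example.com/sitemap.xml"))
```

If this reports anything other than 200, Googlebot will have the same trouble fetching the file that you do.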
This is how the error looks in Webmaster Tools:
The second reason you're getting the unreachable network error is that you don't have a robots.txt file in your site root. The robots.txt file is how your site tells search robots which pages you would like them to visit or skip. It's not mandatory for search engines, but they generally obey what they are asked not to do. A robots file can also be set up as protection against thieves accessing your private files or pages.

Check now whether you can see your robots.txt file at: www.your. If not, create your own robots file: open Notepad, then copy and paste the text below:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

Save it as robots.txt and upload it to your site root. Now try submitting your sitemap again.
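If you want to double-check what those two rules actually allow, Python's standard library can parse the file locally. The snippet below feeds the exact robots.txt content from above into `urllib.robotparser`:

```python
from urllib.robotparser import RobotFileParser

# The same robots.txt content as above, parsed locally.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot may fetch the sitemap, but not the disallowed directories.
print(rp.can_fetch("Googlebot", "/sitemap.xml"))   # True
print(rp.can_fetch("Googlebot", "/cgi-bin/test"))  # False
```

This confirms the file blocks only `/cgi-bin/` and `/tmp/` and leaves your sitemap crawlable.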
Third, the last possible issue is with your web hosting provider. Ask your web host to allow Googlebot to access your site, and this usually works like a charm.
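One way to gather evidence before contacting your host is to fetch your own site while identifying as Googlebot. This is a sketch, not a definitive test (some hosts filter by IP rather than User-Agent); `fetch_as_googlebot` is a helper name introduced here:

```python
from urllib.request import Request, urlopen
from urllib.error import URLError

def fetch_as_googlebot(url, timeout=10):
    """Fetch a URL using Googlebot's User-Agent string; return the HTTP
    status code, or None if the request is blocked or fails."""
    req = Request(url, headers={
        "User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"
    })
    try:
        with urlopen(req, timeout=timeout) as resp:
            return resp.status
    except URLError:
        return None

# www.example.com is a placeholder for your own domain.
# If a normal browser gets 200 but this returns None, the host may be
# filtering bot traffic by User-Agent.
print(fetch_as_googlebot("http://www.example.com/robots.txt"))
```

A mismatch between the browser result and this one is useful detail to include in your support ticket.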

If you still encounter the problem, don't hesitate to leave feedback below and I'll try to help you.