I decided to check the traffic statistics for this blog today (I usually do this daily..) and was surprised to see my traffic fall by over 50%. I wondered if I'd pissed somebody off and gotten a bad review.. or if something was wrong with the way I'd designed my site.
The answer was neither. The real problem was:
Googlebot not finding my robots.txt file
I logged into my Google Webmaster Tools account and found 62 errors, most of them "Robots.txt Unreachable" errors. The help icon beside the errors stated:
Google encountered an error when trying to access this URL. We may have encountered a DNS error or timeout, for instance. Your server may have been down or busy when we tried to access the page.
My guess is that my server had some downtime while Googlebot was crawling the site. Have you had any trouble getting your site crawled lately?
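If you want to test whether your robots.txt is reachable the way Googlebot sees it, a quick sketch like this works (Python, with a placeholder example.com URL - swap in your own domain):

```python
import urllib.request
import urllib.error

# Placeholder URL -- replace with your own site's robots.txt
ROBOTS_URL = "https://www.example.com/robots.txt"

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as response:
        # The server answered -- print the status and the file itself
        print(f"Reachable: HTTP {response.status}")
        print(response.read().decode("utf-8", errors="replace"))
except urllib.error.HTTPError as e:
    # The server answered, but with a 4xx/5xx error
    print(f"Server answered with HTTP {e.code}")
except urllib.error.URLError as e:
    # DNS failure, refused connection, or timeout -- the
    # "unreachable" case Google describes above
    print(f"Could not reach the server: {e.reason}")
```

Run it a few times over the day; if it intermittently times out, Googlebot is probably hitting the same wall.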
UPDATE (8:00PM) – I checked again and the errors were down to 17, most of them caused by rules in my robots.txt that block certain file types. I think the problem fixed itself as quickly as it appeared.
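For context, the restrictions I mean are file-type blocks along these lines (a simplified, hypothetical example, not my actual file; the * and $ wildcards are Googlebot extensions to the basic robots.txt syntax):

```
User-agent: *
# Keep crawlers out of certain file types (Googlebot wildcard syntax)
Disallow: /*.pdf$
Disallow: /*.zip$
```

Rules like these show up in Webmaster Tools as URLs restricted by robots.txt, which is expected behavior rather than a real error.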