Googlebot can't access your site

We are running Ultimate V3.0.6 and something odd is happening with Google trying to spider the site. We keep randomly getting emails similar to this:


[quote]

[url="http://www.littlefornow.com/"]http://www.littlefornow.com/[/url]: Googlebot can't access your site



Over the last 24 hours, Googlebot encountered 77 errors while attempting to access your robots.txt. To ensure that we didn’t crawl any pages listed in that file, we postponed our crawl. Your site’s overall robots.txt error rate is 100.0%.

You can see more details about these errors in Webmaster Tools.

Recommended action

If the site error rate is 100%:
[list]
[*]Using a web browser, attempt to access [url="http://www.littlefornow.com/robots.txt"]http://www.littlefornow.com/robots.txt[/url]. If you are able to access it from your browser, then your site may be configured to deny access to googlebot. Check the configuration of your firewall and site to ensure that you are not denying access to googlebot.
[*]If your robots.txt is a static page, verify that your web service has proper permissions to access the file.
[*]If your robots.txt is dynamically generated, verify that the scripts that generate the robots.txt are properly configured and have permission to run. Check the logs for your website to see if your scripts are failing, and if so attempt to diagnose the cause of the failure.
[/list]

If the site error rate is less than 100%:
[list]
[*]Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
[*]The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
[*]If your site redirects to another hostname, another possible explanation is that a URL on your site is redirecting to a hostname whose serving of its robots.txt file is exhibiting one or more of these issues.
[/list]

After you think you've fixed the problem, use Fetch as Google to fetch [url="http://www.littlefornow.com/robots.txt"]http://www.littlefornow.com/robots.txt[/url] to verify that Googlebot can properly access your site.
[/quote]

When I visit the robots.txt file in my browser, it loads just fine.



When I go into Webmaster Tools and use Fetch as Google, it fails.



Ultimate is installed on littlefornow.com; we have two other storefronts, greendiaperstore.com and angelbunz.com. When I go into WMT for Angel Bunz and tell it to fetch the SAME robots.txt file, it succeeds! I don't get it… any ideas?
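One quick way to rule out user-agent based blocking (the firewall/configuration issue Google's email mentions) is to request robots.txt twice, once with a browser-style User-Agent and once with a Googlebot-style one, and compare the responses. This is only a minimal sketch; the user-agent strings are illustrative examples, and a 403 or timeout on the Googlebot-style request alone would point at the firewall or server configuration rather than the file itself:

[code]
# Compare how the server answers a browser-style request versus a
# Googlebot-style request for robots.txt. The URL is the one from this
# thread; the User-Agent strings are just representative examples.
import urllib.request

URL = "http://www.littlefornow.com/robots.txt"
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 6.1; rv:20.0) Gecko/20100101 Firefox/20.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            print(f"{name}: HTTP {resp.status}, {len(resp.read())} bytes")
    except Exception as exc:  # e.g. HTTPError 403 if that user agent is blocked
        print(f"{name}: failed with {exc}")
[/code]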

What are the permissions on the robots.txt file?

I am not sure what they were before, but my server tech support changed them to 755 and now Google seems to be happy… it's just so odd that the other two stores did not have this issue, since they all share the same file.

[quote name='rmsilver7' timestamp='1365865661' post='159937']

I am not sure what they were before, but my server tech support changed them to 755 and now Google seems to be happy… it's just so odd that the other two stores did not have this issue, since they all share the same file.

[/quote]



That's really odd, as files shouldn't need more than 666; if they do, then there is another issue.
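For what it's worth, a static robots.txt only needs to be readable by the web server, so something like 644 is typically enough. A quick sketch for checking the current mode (the path is just an example, adjust it to your store root):

[code]
# Check the current permission bits on robots.txt.
# The path is an example; point it at the file in your store root.
import os
import stat

path = "/path/to/store/robots.txt"
mode = os.stat(path).st_mode
print("current mode:", oct(stat.S_IMODE(mode)))      # e.g. 0o644
print("world-readable:", bool(mode & stat.S_IROTH))  # should be True so the web server can read it

# To make it readable but not writable by others, 644 is the usual choice:
# os.chmod(path, 0o644)
[/code]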