Message From Google

Hi everyone. I got this e-mail from Google today.

“Googlebot cannot access CSS and JS files

To: Webmaster of …,

Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

Does anyone know where the CSS and JS files are located?

Thanks for any help in advance.

Mike W.

JS files are in the /js directory tree.

CSS files (in V4) are in the var/cache/ directory tree. (In earlier versions they can be all over the place.)

The current V4.3.3 robots.txt does not block these directories, so I assume you either have an older robots.txt file or have added entries to it. Just remove those entries from your robots.txt if they are there.

Here is the default robots.txt content for version 4.3.3:

User-agent: *
Disallow: /app/
Disallow: /store_closed.html
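If you want to verify what your robots.txt actually blocks before Googlebot complains, Python's standard urllib.robotparser can check it. This is just a minimal sketch; example.com and the file paths are placeholders, so substitute your own domain and assets:

```python
# Check whether a robots.txt rule set blocks Googlebot from CSS/JS URLs,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# The default V4.3.3 rules quoted above.
rules = """\
User-agent: *
Disallow: /app/
Disallow: /store_closed.html
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Neither asset path falls under a Disallow rule, so both are fetchable.
print(parser.can_fetch("Googlebot", "https://example.com/js/main.js"))
print(parser.can_fetch("Googlebot", "https://example.com/var/cache/style.css"))
```

The same check run against your live file (parser.set_url(...) plus parser.read()) will tell you whether a given entry is the one triggering Google's warning.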

Thank You!

Sorry, I forgot to mention I'm running CS-Cart version 3.04. Any ideas on where the bulk of the CSS files are?

Probably, as noted in other posts, the /var/cache directory for the cached .css, but the actual CSS in 3.04 is in skins/yourskin/customer.


And for addons, the CSS can be anywhere it is specified in V2/V3 (or even V4 if it's coded that way; you don't have to use the 'style' Smarty tag).

[quote name='Mikew' timestamp='1438195544' post='224816']
Thank You!

Sorry forgot to mention I'm running CS-Cart version 3.04. Any ideas on where the bulk of the CSS files are?
[/quote]
skins/YOUR_THEME/customer/addons: a CSS file can be in each directory or subdirectory.

Actually, V2/V3 addon CSS/JS files can be located anywhere, but that is the suggested location.
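Since the addon CSS can sit in any subdirectory, a short script saves hunting by hand. A minimal sketch, assuming the skins/YOUR_THEME/customer path from this thread (substitute your actual theme name):

```python
# List every .css file under a CS-Cart skin directory, recursively.
import os


def find_css(root):
    """Return sorted relative paths of all .css files under root."""
    found = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(".css"):
                found.append(os.path.relpath(os.path.join(dirpath, name), root))
    return sorted(found)


# Example usage (path is a placeholder):
# for path in find_css("skins/YOUR_THEME/customer"):
#     print(path)
```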

Thank you! I think I've got the bulk of the CSS locations corrected based on everyone's suggestions.

You are welcome!

Hmmm, I got the same message for a v2 website.

We had in robots.txt:

Disallow: /images/thumbnails/
Disallow: /skins/
Disallow: /payments/
Disallow: /store_closed.html
Disallow: /core/
Disallow: /lib/
Disallow: /js/
Disallow: /schemas/
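Assuming the same fix applies here as in v4, the /skins/ and /js/ lines are the likely culprits, since that's where the CSS and JS live in v2. Dropping just those two while keeping the sensitive directories blocked would give something like:

```
User-agent: *
Disallow: /images/thumbnails/
Disallow: /payments/
Disallow: /store_closed.html
Disallow: /core/
Disallow: /lib/
Disallow: /schemas/
```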

I don't get why Google needs to crawl the js directory, though.

Surely a bot only needs to access the same pages as any visitor would… clearly my understanding is a bit lacking.

Because they want to catch people who include links and other code that is hidden from the user but present in the source. Most robots are not sensitive to 'display'-type CSS properties and don't generally care about styling (or injections by JS). But now that CSS has gained lots of extended capabilities, they want a better understanding of what's driving the site. I.e., they are digging deeper into your site to detect any possible fraudulent actions. Their robots should already have this data, since they've done a GET of the page, but I'm assuming they are trying to reconcile the page source with robots.txt and are actually giving robots.txt priority over the actual HTML page source.

Ok, that makes sense… I guess. I'm not sure they should be called “fraudulent actions,” but I do see the importance of weeding out these pages.