Dear All,
I'm seeing an issue with a client's website where the description in search engine results is reported as blocked by the robots.txt file.
I have used Webmaster Tools to test this, and the pages are not being blocked by the robots.txt file. The only notable change we have made is adding a CS-Cart SEO add-on that adds a canonical URL to a possible duplicate product.
robots.txt
User-agent:
Disallow: /?
Disallow: /index.php
Disallow: /images/thumbnails/
Disallow: /skins/
Disallow: /payments/
Disallow: /store_closed.html
Disallow: /core/
Disallow: /lib/
Disallow: /install/
Disallow: /js/
Disallow: /schemas/
Thanks in advance
Steve
robots.txt controls access to files. It does not control how those files are parsed.
Yeah, it's a very odd one, although Googling the issue throws up a fair amount of discussion on it, especially related to WordPress. I feel it's nothing to do with robots.txt whatsoever.