I wrote something about this issue before. My SEO company noticed back in the summer that Googlebot wasn't crawling www.mysite.com for me. It was crawling mysite.com, which is how I had officially set the site up in Google Webmaster Tools.
What I did was change my domain in local.config.php to www.mysite.com. I then went to Google Webmaster Tools and added www.mysite.com to my list of domains. I then went into Site Settings for both profiles and asked Google to treat the www version as the preferred domain.
Now, about a week later, Googlebot has finally crawled www.mysite.com and I got this error message:
```
The requested URL /search?q=cache:6DoDVIBPTCkJ:www.mysite.com/+&cd=1&hl=en&ct=clnk&gl=us was not found on this server.
```

What is going on here? Googlebot had no issue crawling mysite.com but cannot crawl www.mysite.com, even though I clearly set it up in local.config.php. I also went to Bing.com and noticed that they have crawled www.mysite.com with no errors. Yahoo? Same thing... crawled with no errors. Nobody is having any issues with me switching from mysite.com to www.mysite.com, just Google and its bot.

I need some help. When I go into Google Webmaster Tools and visit Crawl > Blocked URLs for both the www.mysite.com and mysite.com profiles, I noticed today that www.mysite.com has 147 blocked URLs and mysite.com has 208. If this is the same site, why is there a 61-URL difference between the two? That seems strange.

This is what both profiles show in GWT as the current robots.txt content (scanned as of 10 hours ago):

```
User-agent: *
Disallow: /addons/
Disallow: /cgi-bin/
Disallow: /blog/
Disallow: /controllers/
Disallow: /core/
Disallow: /info_pages/
Disallow: /install33
Disallow: /files/
Disallow: /js/
Disallow: /lib/
Disallow: /new/
Disallow: /old/
Disallow: /payments/
Disallow: /production
Disallow: /schemas/
Disallow: /shippings/
Disallow: /skins/
Disallow: /var/
Disallow: /admin1310.php
Disallow: /config.php
Disallow: /config.local.php
Disallow: /init.php
Disallow: /prepare.php
Disallow: /shippingkit
Disallow: /*?
Disallow: /store_closed.html
Sitemap: http://www.mysite.com/sitemap.xml
```

I did try to fetch both sites again, and I am getting a response of [b]Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0[/b], which could be a problem.
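One thing worth noting about that robots.txt: the `Disallow: /*?` line, under Google's wildcard rules, blocks every URL that contains a query string, so the blocked-URL counts can legitimately differ between the two profiles if Google has simply discovered different sets of URLs on each host. Here is a small sketch of how Google-style wildcard patterns match a path (the function and the example paths are my own illustration, not CS-Cart or Google code):

```python
import re

def disallow_matches(pattern, path):
    """Rough sketch of Google-style robots.txt pattern matching:
    '*' matches any run of characters; a trailing '$' anchors the
    pattern to the end of the URL path. Matching starts at the
    beginning of the path, like a prefix rule."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape everything except '*', which becomes '.*'
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# 'Disallow: /*?' blocks any path that contains a query string
print(disallow_matches("/*?", "/index.php?dispatch=products.view"))  # True
# A plain prefix rule only blocks that directory
print(disallow_matches("/blog/", "/blog/some-post"))                 # True
print(disallow_matches("/blog/", "/bloggy-page"))                    # False
```

So if one host has more parameterized URLs in Google's index than the other, the `/*?` rule alone could account for a difference like 147 vs. 208.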
I'm honestly not sure.

Basically, it looks to me like Googlebot is not crawling the www version of my site. I don't know if that is because something in the CS-Cart software is blocking it, or if it is a server issue. My server host claims that CS-Cart needs to be configured to serve the cart both with and without www, and they say mine is not. I'm starting to believe them. I asked the helpdesk to assist me with this before; they said, "we failed to find any restrictions," and then told me my support credit had run out. Great. Has anybody else encountered this before?

My shopping cart was listed for about two years as www.mysite.com. Then, about four years ago, I changed it to mysite.com when I upgraded CS-Cart from 2.0 to 3.0. I didn't think much of it at the time. Now I find out that Google cannot crawl the www version of my site, and it is affecting my ranking. I'm not sure why, but I would love to get to the bottom of it.

I'm running CS-Cart version 3.0.6 PROFESSIONAL, by the way.

Thank you for any help that you can provide.
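P.S. In case it helps: assuming the site runs on Apache with mod_rewrite enabled (an assumption on my part, since I haven't confirmed the server stack), the usual way to make one host canonical is a 301 redirect in the document-root .htaccess, placed above any CS-Cart rewrite rules. Something like this sketch, with mysite.com standing in for the real domain:

```apache
RewriteEngine On
# Permanently redirect any request for the bare domain to the www host
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

That way both Googlebot and visitors always land on www.mysite.com, regardless of which hostname they start from.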