Is 4.0 Software Structured Much Better Than 3.0 For SEO?

I’ve noticed lately that in Google Webmaster Tools, a CS-Cart store of mine running CS-Cart 3.0.6 Professional has 36 URL parameter issues that Google does not like and 231 blocked URLs. I don’t know why Google is even scanning these URL parameters, but it has resulted in multiple duplicate meta description warnings, because Google cycles through them as individual URLs even though they are technically the same page with the same content. I don’t know why CS-Cart 3.0.6 was not set up to tell GWT to ignore these particular types of URLs. Oh well…
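(From what I’ve read, and this is just a sketch rather than anything official, the usual workaround is either to tell GWT how to treat those parameters under its URL Parameters settings, or to have the page template point every parameter variation back at the clean URL with a canonical tag. Something like the line below, where the URL is only a placeholder for the real category page, not my actual store:)

<!-- example only: tells crawlers that sorted/paged parameter variations are the same page -->
<link rel="canonical" href="http://www.example.com/my-category-page/" />

That way, even if Googlebot crawls the products_per_page variations, it should treat them all as one page instead of flagging duplicate meta descriptions.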



So, another site that I host uses CS-Cart 4.0.3. In GWT, that cart shows 11 blocked URLs (not hundreds, which yes could be because of the robots.txt settings), one parameter issue for attachment IDs, which is not as annoying as ones like products_per_page, and zero HTML improvement warnings!



So it seems like the majority of my problems are related to the way Google sees 3.0.6 content: how it is structured and how it is built for SEO.



It’s interesting, because my site dropped in ranking around August 1st of last year. A lot of that had to do with the Google Penguin update, but I also think a lot of it had to do with Google suddenly viewing CS-Cart content differently and not liking what it was seeing. As of 07/2012, we had zero URLs blocked to Googlebot on our 3.0.6 cart. Then around the 10th and 11th of 07/2013, Googlebot had 28 and 48 failed crawl attempts on those days, and our blocked URL report jumped up to 419 blocked URLs right after those failed attempts. This is also around the time that a lot of the Penguin activity started and Google was probably changing the way it crawled content.



I attached a screenshot below of the GWT report showing this odd jump.



[attachment=7551:ari-report-dc-robots-error.jpg]



At the time I figured my drop in ranking had much more to do with Penguin and less to do with CS-Cart, but I no longer believe that. You see, in November of 2012, I submitted a list of my URLs to be scanned for toxic links. The report showed that 55% of my inbound links at the time were toxic and 33% were suspicious. I uploaded a disavow list, then ran the report again and repeated the process a few more times. As of now, I am running at 1% toxic! Yet I have not climbed back to where I was in Google before Penguin.

Well, tonight, for the sake of argument, I asked the same software to scan another site of mine that targets the same keywords and ranks on the first page of Google, but runs on Wordpress, which I feel is better structured for Google and SEO (no offence). That report shows 12% toxic (compared to my 1%), 88% suspicious (compared to my 33%) and 0.8% healthy (compared to my 1%) against my CS-Cart shop, which has fallen off of Google. The only other thing I noticed on the Wordpress site, besides more errors and more toxic links, was in GWT: there is one HTML improvement needed for a short description, but nothing about duplicate content, no problems with parameters, and no crawl issues with Googlebot, apart from a little stint in December when my host was having problems; as of a day or two ago it has crawled with zero issues reported.



Could this be because Wordpress is simply formatted better in Google’s eyes, so Google can work around the problems it sees? Of course, if I bring this up to the helpdesk I know they’ll never say that 3.0.6 has bugs, but what about others out there? Should I upgrade to get the benefits of 4.0? It seems to me to be cleaner, better formatted software. I know I can go through and try to fix the URL parameter settings and patch the cart’s code to resolve this, but to me it just points to a bigger issue with 3.0.6.



What all of this says to me is that something changed in the way Googlebot viewed my site around 07/2013. I will mention that I had been running 2.0 since 2007 and have since upgraded to 3.0, and that may have left some problems simmering under the surface, problems that Google didn’t take notice of until this summer. One of those problems could have been cart URLs that at one time were not SEO-friendly, with a lot of the old URLs going missing because of the upgrade.



Still, the fact is that in and around that time, Googlebot made multiple failed attempts to crawl my site. Again, issues like these have not been a problem with 4.0.



Please give me some feedback. Thanks. :)


Not sure about all the points, but…

Remember, just because you see things as fixed and have reported them to G, many weeks can pass before it decides to start moving you back up the ranks.

Also, before Penguin you (and possibly other sites) were doing well, but by what G has decided are not “best practices”. So after Penguin, a whole load of other sites that were targeting similar keywords to yours, but weren’t ranking so well before Penguin because they perhaps weren’t using the tactics you were, have suddenly replaced you in Google’s eyes as a better option for the user. How many of them? Who knows, and how long until you can catch them up? Same old story: add content that the user finds engaging and usable and Google will follow.

Have you resubmitted the URLs etc.?

John

[quote name='johnbol1' timestamp='1389520019' post='174953']

Have you resubmitted the URLs etc.?

[/quote]



Thanks John.



Resubmitting the URLs? Do you mean, have I resubmitted my sitemap to Google? Yes, plenty of times.



I understand good content is the key. BUT the one site, which I ran the other report on for s***s and giggles, has not been updated in months, nor has it really been promoted through social media. I never did any link building for it either. It was a self-hosted Wordpress site with a lot of good content out of the gate, but that was it. My main site, which fell off of Google… I've been submitting new, well written 600 to 1,000 word articles every day, and I'm budging in Google's eyes, but I'm not where I want to be.



I really feel like my problem is a combination of toxic links and a cart with a lot of bugs that frustrate Google, bugs that it does not seem to run into when dealing with Wordpress blogs, for some reason.



I have a general idea of how SEO works: if you want a certain page to rank for certain keywords, you need healthy backlinks with keyword anchor text pointing to that specific page to make it rank better for that specific keyword. Of course, if the backlinks pointing to a page are toxic, then Google makes it so that page cannot rank for those particular keywords. What I've noticed is that we still rank for keywords (not on the first page, but like the 6th page), but instead of landing on the specific pages, Google is pushing that traffic to alternative pages. It's almost as if you sold tshirts and wanted to rank the homepage for “tshirts”, but had too many toxic tshirt backlinks: Google will still let you show up for tshirts, but it puts that keyword traffic on a post about buying tshirts, or somewhere else, as an alternative.



We're getting traffic. To alternative pages, sure, but it's traffic. Still, I feel like if you removed the error issues… it would be enough to push us back to the top. I say that because the other site, which ranks, again has only 0.8% healthy links, and of those, not one is a keyword-rich URL. I mean, even if you eliminated all of the toxic links on my struggling site, there should still be enough keyword juice left to outrank a site which has only 0.8% healthy links. That is what frustrates me.

Yeah, don't take it the wrong way, I am not saying you are doing the wrong things, just looking at it from a different angle. I see your evidence would suggest what you're saying, but I mean there are probably many other sites out there actually glad that Penguin came along and levelled the playing field for now by getting back at some black hats.

[quote]Resubmitting the URLs? Do you mean, have I resubmitted my sitemap to Google? Yes, plenty of times.[/quote]

As for the URLs, I mean under the Crawl menu, Fetch as Google, submit the link, not just the sitemap.



Thanks

John

I agree with ckad79. We work hard on SEO but can't help but think that Google really doesn't like CS-Cart as of late.



Our rankings just seem to keep dropping; we're even thinking about switching cart software as we can't seem to get the rankings we need.



Yes, it's a great cart packed with great features at a great price, but if we aren't getting the traffic we need due to software bugs, what's the point of sticking with CS-Cart?

I have brought up the SEO issues with this cart many times on here.

If you want to get the cart up to snuff, buy 4sprung's advanced SEO add-on. It fixes the numerous problems with CS-Cart. I have it and it really works well.



Also, just a hint for your SEO … “you need healthy backlinks with keyword anchor text pointing to that specific page to make it rank better for that specific keyword.”



Be careful, as anchor text links pointing at a page with the exact keyword don't work anymore. Or at least that is the current opinion.



What you need is contextual links pointing at your page, or links using just part of the keyword. Example: instead of “Birthday Gifts”, link with just “Birthday”.