
Rankling worse using version 2

 
  • ryzer
  • Junior Member
  • Members
  • Join Date: 31-Oct 08
  • 4 posts

Posted 05 November 2009 - 07:22 AM #1

Hi there

We were using CS-Cart 1.3.5, and now that we have upgraded to 2.0.8 it seems our rankings are going down. We are using a third-party sitemap submission mod, but it is from the same place we got our sitemap mod for v1.3.5. This is very troubling to us, as products that previously ranked really well now show up 2 or 3 pages in, with the categories they are included in ranking better than the products themselves.

Very confused at this point.



- sorry about the typo in the title.. haha

 
  • Lazy
  • Member
  • Members
  • Join Date: 06-Oct 08
  • 78 posts

Posted 05 November 2009 - 10:05 AM #2

Hi there

You have in effect changed all the site's HTML (coding), so what Google ranked you on is not the same = a drop in the SERPs.

The good news: if you have carried all the URLs and item titles across, you SHOULD recover the same positions quite quickly. I suggest an increased SEO campaign, adding as many quality, relevant links as possible. Also, CONTENT IS KING; this will help speed up the recovery. DON'T PANIC and sit tight, unless it's your season, and then get smashing AdWords just to keep the $$$$ rolling in.

PS: we are currently sitting very high on a good few hundred keywords and are in the process of upgrading now.


ryzer, on 05 November 2009 - 07:22 AM, said: [quote of post #1 above]



 
  • Darius
  • Douchebag
  • Members
  • Join Date: 20-Apr 08
  • 3541 posts

Posted 05 November 2009 - 03:42 PM #3

ryzer, on 05 November 2009 - 07:22 AM, said: [quote of post #1 above]


If you do not do anything about it, you will sink to the very bottom. I spent more than 2 months recovering from the 1.3.5 to 2.x move...

 
  • Darius
  • Douchebag
  • Members
  • Join Date: 20-Apr 08
  • 3541 posts

Posted 05 November 2009 - 03:48 PM #4

Lazy, on 05 November 2009 - 10:05 AM, said: [quote of post #2 above]


Won't help.

First you need to check exactly what, and how many, pages are indexed by Google.

For example, 1.3.5 built links like /?target=
2.x uses ?dispatch=

Now run a test. Go to google.com and search:

site:www.domain.com target

Every link you get as a result (I got about 5,000+ pages) now just resolves to your domain.com/index.php.

Your cart will not return a 404 code that would let Google drop these non-existent links, so they stay in the index, because some page is loaded for every request. Each such hit confuses Google about what the actual entry page of your domain is: domain.com, domain.com/index.php, index.htm, index.html, or whatever has target in the link...
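To make those stale links drop out, the old URLs have to answer with an error status instead of silently serving a page. Something like this in .htaccess should do it (just a rough sketch, assuming Apache with mod_rewrite; test it on your own setup first):

RewriteEngine On
# Any request still carrying the old 1.3.5-style "target=" query parameter
RewriteCond %{QUERY_STRING} (^|&)target= [NC]
# Answer 410 Gone so Google stops treating it as a live page on index.php
RewriteRule ^ - [G]

If you can map each old ?target= URL to its new ?dispatch= equivalent, a 301 redirect to the new URL would keep more of the old link value than a plain 410.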

If you don't do anything about it, you will be very sorry you moved to 2.x.

Plus, if you have SSL, you will be out of business soon.

 
  • ryzer
  • Junior Member
  • Members
  • Join Date: 31-Oct 08
  • 4 posts

Posted 05 November 2009 - 11:13 PM #5

Darius, on 05 November 2009 - 03:48 PM, said: [quote of post #4 above]


Sorry for my ignorance, but you have lost me here. What is it that I should be doing? Also, what do you mean by "if I have SSL, I will be out of business soon"?

 
  • Traveler
  • Senior Member
  • Members
  • Join Date: 02-Feb 07
  • 932 posts

Posted 06 November 2009 - 03:21 AM #6

I am also lost - but interested.

Version 4.9.2


 
  • Darius
  • Douchebag
  • Members
  • Join Date: 20-Apr 08
  • 3541 posts

Posted 06 November 2009 - 06:24 AM #7

OK, for starters, post here (or by PM) a link to your website.

What I meant is that an SSL-enabled checkout gets all of the shop (or most of it) indexed by Google as https.

 
  • moka
  • Senior Member
  • Members
  • Join Date: 09-Feb 08
  • 634 posts

Posted 06 November 2009 - 04:08 PM #8

My guess for getting around the SSL issue would be to put a nofollow and a disallow on the cart pages. That way Google can't go into the cart and come back out to the regular website as https.
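For the disallow half, something along these lines in robots.txt might do it (a rough sketch; the dispatch=checkout pattern is only a guess at the 2.x cart/checkout URLs, so check what your store actually uses):

User-agent: *
# Keep crawlers out of the checkout pages (hypothetical 2.x URL pattern)
Disallow: /*dispatch=checkout

The nofollow/noindex half would go in the cart templates themselves, e.g. a <meta name="robots" content="noindex, nofollow"> tag in the checkout page's head.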

My experience with 2.0 was a disaster. I switched back. My conversions went immediately back up after reverting, and traffic is slowly coming back up too.

 
  • Darius
  • Douchebag
  • Members
  • Join Date: 20-Apr 08
  • 3541 posts

Posted 06 November 2009 - 04:20 PM #9

I use

# When the request comes in over SSL (port 443), serve robots_ssl.txt instead
RewriteCond %{SERVER_PORT} ^443$
RewriteRule ^robots\.txt$ robots_ssl.txt [L]

in .htaccess for the SSL problem,

and

# robots_ssl.txt: block all crawlers from the https version of the site
User-agent: *
Disallow: /

in robots_ssl.txt.

It took about a month for Google to forget about the SSL pages I have.