Google indexing mainly categories

Hi,

I don’t understand the importance of a sitemap. Why would Google not index all the pages naturally once it has “discovered” the site?

I have another, non-CS-Cart site with no sitemap and really no SEO optimization, but every page is indexed.

Don’t they all get indexed, with or without a sitemap?

I find that once the site starts to get indexed, the robots come back more frequently and pick up all the pages.

Bob

pbannette, a sitemap is important for large stores. I also have a small site without a sitemap, no problem, and a WordPress blog, no issues, but those sites have at most 200 pages.

My online store has more than 20,000 product pages alone and, believe me, Google does not even index 4,000 of them by itself.

Like Madaha said, it was not like this before. I actually had nearly 50,000 entries; now I have barely 8,000. How come?

It is just too big a difference, wouldn’t you agree?

I am amazed that there hasn’t been any input from the CS team on this subject!



I think it needs looking at, and some sort of answer given, for everyone’s sake. :rolleyes:

[quote name=‘BarryH’]I am amazed that there hasn’t been any input from the CS team on this subject!



I think it needs looking at, and some sort of answer given, for everyone’s sake. :rolleyes:[/QUOTE]



I totally agree. You know, everyone congratulated me on how well positioned my store was, and now, well, I cannot even find myself on the net, let alone my customers finding me. Coincidence or not, it all changed when I upgraded. :frowning:



Going back to the cache cleaning issue, is it possible to set up a cron job to clear the cache every 2 hours or so? I tried, but the path is from the admin section and I forgot it requires a password, so cron didn’t work. Any ideas?
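
One idea, to dodge the admin password problem entirely: let cron wipe the file cache on disk instead of calling an admin URL. A minimal sketch, assuming the file cache lives in var/cache under the store root - the paths are only placeholders, so check them against your own installation before trusting it:

[code]
#!/usr/bin/env python3
# Rough sketch: clear the store's file cache from cron without logging in to admin.
# Assumes the cache lives in var/cache under the store root; CACHE_DIR is a
# placeholder and must be adjusted to the real installation path.
import shutil
from pathlib import Path

CACHE_DIR = Path("/home/youraccount/public_html/var/cache")  # hypothetical path

def clear_cache() -> None:
    if not CACHE_DIR.is_dir():
        return
    for entry in CACHE_DIR.iterdir():
        # Delete everything inside the cache directory, but keep the directory itself.
        if entry.is_dir():
            shutil.rmtree(entry, ignore_errors=True)
        else:
            entry.unlink(missing_ok=True)

if __name__ == "__main__":
    clear_cache()

# Example crontab entry to run it every 2 hours:
# 0 */2 * * * /usr/bin/python3 /home/youraccount/clear_cache.py
[/code]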



See what I mean?! I had the same URL for the sitemap before and never worried about clearing my cache; it all worked well, with no complaints from Google. Now I get error messages every day.

Before the upgrade to 2.1, almost all of my pages were indexed by Google. After the upgrade to 2.1, only categories and sub-categories are indexed. Other people are running into the same thing as I am. Don’t you think the difference was caused by the upgrade to 2.1?






[quote name=‘mirnitagl’]pbannette, a sitemap is important for large stores. I also have a small site without a sitemap, no problem, and a WordPress blog, no issues, but those sites have at most 200 pages.

My online store has more than 20,000 product pages alone and, believe me, Google does not even index 4,000 of them by itself.

Like Madaha said, it was not like this before. I actually had nearly 50,000 entries; now I have barely 8,000. How come?

It is just too big a difference, wouldn’t you agree?[/QUOTE]

How do I check what is indexed and the rank of the products? I have a Google Webmaster Tools account and have submitted a sitemap.



thanks.

Why does the CS-Cart technology team keep silent about this serious problem? Google only indexes the main and sub-category pages, and the page loading time is far too long! Xenu reports timeouts for most pages! The CS-Cart technology team needs to do something to solve this problem. It’s paid software, not a free script.

Mine is:

Submitted URLs: 96

URLs in web index: 57

I have SEO enabled and am using the domain.com/sitemap.xml file.

What might be the problem? Could the new cache method be preventing Googlebot from crawling the product pages properly, for example by serving it empty product pages or otherwise tripping it up when it crawls them?

The problem might be caused by the new cache method. Does anyone have a way to prove it?
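
One way to gather evidence instead of guessing: request one of your product pages with Googlebot’s user-agent string and see whether the HTML that comes back actually contains the product content. A quick sketch - the URL and the expected text are placeholders for one of your own pages:

[code]
#!/usr/bin/env python3
# Sanity check: fetch a product page the way a crawler would and report whether the
# response looks complete. PRODUCT_URL and EXPECTED_TEXT are placeholders.
import urllib.request

PRODUCT_URL = "http://www.example.com/some-category/some-product.html"  # hypothetical
EXPECTED_TEXT = "Some Product Name"  # a string that should appear on a fully rendered page
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(PRODUCT_URL, headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(req, timeout=30) as resp:
    body = resp.read().decode("utf-8", errors="replace")
    print("HTTP status:", resp.status)
    print("Page size:", len(body), "bytes")
    print("Product text found:", EXPECTED_TEXT in body)
[/code]

If the status is 200 but the page comes back tiny or without the product text, that would point at the caching/serving side rather than at Google.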

[quote name=‘purelife’]How do I check what is indexed and the rank of the products? I have a Google Webmaster Tools account and have submitted a sitemap.



thanks.[/quote]

Search in Google with this operator: ‘site:yourdomain.xx’



All Google operators are listed in the Google Guide.

I have 1,026 pages and 523 indexed, mostly categories/subcategories - roughly 50%.

I also have non-CS-Cart sites…

One has 115 pages with 111 indexed - roughly 95%.

Another has 213 pages with 198 indexed - again roughly 95%.

When we were using CS-Cart 1.3.5 (the last version before 2.x) we had approximately 905 pages and approximately 890 indexed - again roughly 95%.



There is something NOT quite right here.



Also, our page rankings plummeted after leaving 1.3.5. We had many products ranked 1 and 2 and were pretty well listed. Not any more!!!



Someone should raise this as a bug! Not me, because I don’t seem to get much joy with bug reports. :confused: :rolleyes:

[quote name=‘BarryH’]I have 1,026 pages and 523 indexed, mostly categories/subcategories - roughly 50%.

I also have non-CS-Cart sites…

One has 115 pages with 111 indexed - roughly 95%.

Another has 213 pages with 198 indexed - again roughly 95%.

When we were using CS-Cart 1.3.5 (the last version before 2.x) we had approximately 905 pages and approximately 890 indexed - again roughly 95%.



There is something NOT quite right here.



Also, our page rankings plummeted after leaving 1.3.5. We had many products ranked 1 and 2 and were pretty well listed. Not any more!!!



Someone should raise this as a bug! Not me, because I don’t seem to get much joy with bug reports. :confused: :rolleyes:[/QUOTE]



You have just described my situation, except that in our case we had more than 20,000 pages indexed; now you can easily knock a zero off that figure.

This problem should not be underestimated; it’s useless to have an updated cart if nobody can find it.

Please, can anyone tell me how to bring this to the attention of the CS-Cart team?

What’s the name of this forum’s admin?

[quote name=‘mirnitagl’]You have just described my situation, except that in our case we had more than 20,000 pages indexed; now you can easily knock a zero off that figure.[/QUOTE]



This may happen if Google sees your pages as duplicates within your own website, meaning the same content can be reached through tons of different links that all point to the same page: tags, feature options and so on…

Here’s a sample; the short sketch after the list shows how these all collapse into one page:



domain.com/category/product +



&sess_id=

?currency=

?subcats=

index.php?

?category_id=

&layout=

?sef_rewrite=

?productid=

?features_hash=
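
To make the duplicate problem concrete, here is a small sketch (the domain and product path are placeholders; the parameter names are taken from the list above) showing how several of those query strings all boil down to the same page:

[code]
#!/usr/bin/env python3
# Illustration of the duplicate-URL issue: the same product reachable under many
# query strings. Stripping the "noise" parameters collapses them to one canonical URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

NOISE_PARAMS = {"sess_id", "currency", "subcats", "category_id",
                "layout", "sef_rewrite", "productid", "features_hash"}

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

variants = [
    "http://domain.com/category/product",
    "http://domain.com/category/product?currency=EUR",
    "http://domain.com/category/product?subcats=Y&layout=grid",
    "http://domain.com/category/product?sess_id=abc123&features_hash=1-2-3",
]

# All four variants reduce to a single canonical URL - which is the kind of thing a
# rel="canonical" tag or parameter handling in Webmaster Tools is meant to tell Google.
print({canonicalize(u) for u in variants})
[/code]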

Does using blocks on each product page that contain items, like “Recommended items” or “You may also be interested in…”, cause this?

Those blocks each show a few items on every product page, which means links to other products. Is that counted as duplication, and is that why they are not indexed?



I searched my site using:

site:domain.com SPECIFIC PRODUCT NAME



and I get a few results, and at the bottom it says:



“In order to show you the most relevant results, we have omitted some entries very similar to the 15 already displayed.

If you like, you can repeat the search with the omitted results included.”



So are they omitting results because my site has too many references to the same products due to those blocks?

I just know one thing: for almost 2 years my CS-Cart store was very, very well positioned, with, like I said, 20,000 pages indexed, and ever since I upgraded it has gone down the toilet. It is a disaster, so obviously the upgrade has killed that SEO work.

Why? And, most importantly, how can I fix it?

Darius, I have almost the same pages I had before, so are you saying that after 2 years Google is finding my pages as duplicates all of a sudden?

Madaha, are you asking whether the new cache method is the problem, or asserting that it is?

Please, can someone shed some light on this? It is a serious issue. If you can’t be found in search engines you just do not exist; your business does not exist.

[quote name=‘mirnitagl’]I just know one thing: for almost 2 years my CS-Cart store was very, very well positioned, with, like I said, 20,000 pages indexed, and ever since I upgraded it has gone down the toilet. It is a disaster, so obviously the upgrade has killed that SEO work.

Why? And, most importantly, how can I fix it?

Darius, I have almost the same pages I had before, so are you saying that after 2 years Google is finding my pages as duplicates all of a sudden?

Madaha, are you asking whether the new cache method is the problem, or asserting that it is?

Please, can someone shed some light on this? It is a serious issue. If you can’t be found in search engines you just do not exist; your business does not exist.[/quote]



It is normal for an upgrade to produce a dip in traffic for a period of time. It’s merely Google recrawling the new body content (layout) and updating its cache.

JesseLeeStringer, thank you for participating here.



So, if I’ve got it right, all we have to do is wait and things will go back to how they used to be.

Does it also explain why sitemap.xml gives an error every single day? It was not like that before. Every time I checked, the sitemap was OK; now every time I check I see a red X all over the screen. I’m not the only one with this issue.

Any ideas or suggestions for solving that?

[quote name=‘mirnitagl’]JesseLeeStringer, thank you for participating here.



So, if I’ve got it right, all we have to do is wait and things will go back to how they used to be.

Does it also explain why sitemap.xml gives an error every single day? It was not like that before. Every time I checked, the sitemap was OK; now every time I check I see a red X all over the screen. I’m not the only one with this issue.

Any ideas or suggestions for solving that?[/quote]



Yes - it’s a PITA to explain to clients, since they don’t understand why it dips or the reasons for it. In most cases it’s due to big changes, such as image file names being rewritten by CS-Cart as product_detailed_.jpg etc. (I’ve engineered a solution to this).



Sitemap: remove the sitemap and re-add it - Google gets a little confused when you upgrade the store and it’s part old/part new. By removing it you effectively provide a fresh submission that is crawled from top to bottom - hence the recrawls.
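
Before re-adding it, it can also help to confirm the file itself is healthy from your side: fetch it, check that it parses as XML, and spot-check a few of the URLs it lists. A rough sketch - the sitemap URL is a placeholder for your own store’s:

[code]
#!/usr/bin/env python3
# Rough sitemap check: download sitemap.xml, make sure it is well-formed, count the
# <loc> entries and spot-check that the first few URLs respond. SITEMAP_URL is a
# placeholder.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "http://www.example.com/sitemap.xml"  # hypothetical
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

with urllib.request.urlopen(SITEMAP_URL, timeout=30) as resp:
    root = ET.fromstring(resp.read())  # raises ParseError if the XML is broken

urls = [loc.text for loc in root.iter(NS + "loc")]
print("Entries listed in sitemap:", len(urls))

for url in urls[:5]:  # spot-check the first few entries
    try:
        with urllib.request.urlopen(urllib.request.Request(url, method="HEAD"), timeout=30) as r:
            print(r.status, url)
    except urllib.error.HTTPError as e:
        print(e.code, url)
[/code]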

“Are you asking if the new cache method is the problem or asserting that it is?” - Asserting only; someone needs to find out whether Googlebot actually runs into problems when crawling the pages.