SEO Google Index - Thousands of sites being deindexed - WHY?

Hey yall,

We are running out of ideas. We started out with a staging environment, which unfortunately got 100k+ pages indexed unintentionally. We got that cleaned up and removed, but now Google seems to have big issues indexing our original walanco.de website.

Does Google have an issue with CS-Cart's code somehow? We had around 5k pages indexed, but as of today it is only 240. Sometimes we have server issues, but that cannot be the reason. If anything gets indexed at all, it is the worst pages. Not sure why.

It is all a guessing game by now, since we have basically tried everything there is to do. The website is decent, with lots of good content and optimized pages. Sometimes it is a bit slow due to API queries, but there is nothing obvious that would explain going from 5k to 240 pages.

In total we have 100k+ pages that could be indexed. We would be happy to get to 10k for starters.

Has anyone had similar issues with their new store? Any hints on what to optimize (backlinks?) etc. would be appreciated.

My boss thinks it must be an issue with CS-Cart's coding structure. Could that be it?

Same here: in March, indexed pages dropped by 30%; in the second and third week of May it was back to where it was. Now, since May 27th, it has dropped 30% again.

Yeah, pretty frustrating, and neither Google nor Bing tells us what the actual problem is. Could be anything.

We have the same issue: two CS-Cart websites on the same server. Google indexes pages and then starts deindexing them, then indexes them again and deindexes them again.


And this has been happening for more than 6 months. What is interesting is that both websites get indexed and deindexed almost at the same time.

We have now gone from 5k indexed pages to 8 pages indexed. We have no idea what's going on or what we can do to solve this issue.

Following up on my previous post: see the second indexing screenshot above; below is the latest one for this website, which continues:

And I believe that in a week's time Google will index again and then deindex again.
My two websites are 1+ year old. They have different themes but are on the same server. No issues with Bing or Yandex; they have the pages indexed.
I believe it is a problem with Google. Maybe they have technical issues, or they have a new algorithm that decides whether you will be visible or not. Google knows best what people need to see. :grinning:

What edition and version of CS-Cart do you have?

Our version: CS-Cart v4.17.2 SP3

I know Google is testing a new type of search called ‘AI Search’, so it’s possible that’s where the problem is until they fully implement it.

Multi-Vendor v4.18.1

Google started indexing my pages again, as I mentioned before.

Possibly it will deindex them again after some time.
Can the .htaccess file have an impact on indexing?

Thank you
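
It can, if rules there end up blocking Googlebot. A minimal sketch for checking that from the outside, using the walanco.de homepage mentioned in this thread purely as an example: request the page once with a normal browser user agent and once with Googlebot's, and compare the responses. A 403 or 5xx only for the Googlebot request points at .htaccess (or firewall) rules.

```
# Minimal sketch: compare responses for a normal browser UA vs. Googlebot's UA.
# If only the Googlebot request gets a 403/5xx, .htaccess (or firewall) rules
# are likely blocking the crawler. URL taken from this thread; adjust as needed.
import urllib.error
import urllib.request

URL = "https://walanco.de/"
USER_AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(f"{name}: HTTP {resp.status}, {len(resp.read())} bytes")
    except urllib.error.HTTPError as e:
        print(f"{name}: HTTP {e.code} ({e.reason})")
    except urllib.error.URLError as e:
        print(f"{name}: request failed ({e.reason})")
```

Note that this only catches user-agent based blocking; IP-based rules or rate limits would only show up in the server logs.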

I can confirm we are having the same issue. Since March our site went from #3 in Google for a niche market all the way down to #30, i.e. page 3. We made no changes, no updates … and as a result have concluded that Google's new algorithm, pushed in mid-March, does not like the structure CS-Cart is built upon. So after almost 10 years we are moving to a new vendor.

Edit: we think it has to do with how Core Web Vitals reads the site, as we have more than 300 pages rated Poor, which we cannot optimize because that is simply how CS-Cart is built and loads, e.g. external CSS. We even went as far as setting up Cloudflare to try to improve the numbers. For anyone with the same issue: set up Google Search Console and review the pages there.
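
For bulk-checking those Poor pages, a rough sketch using the public PageSpeed Insights API (v5) could look like the following. The URLs are placeholders, an API key is optional for a handful of requests, and the Lighthouse audit keys may change between versions:

```
# Rough sketch: pull mobile Lighthouse LCP/CLS for a few "Poor" URLs via the
# public PageSpeed Insights API (v5). The URLs below are placeholders.
import json
import urllib.parse
import urllib.request

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
pages = [
    "https://example-store.com/some-category/",     # placeholder
    "https://example-store.com/some-product.html",  # placeholder
]

for page in pages:
    query = urllib.parse.urlencode({"url": page, "strategy": "mobile"})
    with urllib.request.urlopen(f"{PSI}?{query}", timeout=120) as resp:
        data = json.load(resp)
    audits = data["lighthouseResult"]["audits"]
    print(page)
    print("  LCP:", audits["largest-contentful-paint"]["displayValue"])
    print("  CLS:", audits["cumulative-layout-shift"]["displayValue"])
```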


It's been like this since late April 2025.
Semrush signals high activity / high volatility on the (local, IT for me) SERPs,
on both the desktop and mobile databases.

The very crude workaround I actually use is "bouncing it back" in Search Console (requesting reevaluation), and it works with roughly 30-35% success.

It's a Google-only thing, as you can see, and it's NOT about CS-Cart or a specific country/category.

The annoying thing is that Big G gets slower every day, so getting things repaired takes longer and longer.

(In other words, they mark a page "useless" in a rush, but reverting back to normal takes a lot of time.)

Below is the same period with IT and US mobile compared: basically the same dynamics.

You can see that, at least, it's not a niche/category thing, nor a software brand, nor a "local" thing, English and the Romance languages being very different.

The "evil spikes" :joy: are basically the same, also in rollout timing.

(visual from Semrush)


I'd try removing /app/ from the robots.txt disallows, and also removing anything in the .htaccess that would prevent bots from crawling pages, crawl delays, etc.

I've had a huge deindexing issue, but IMO it was caused by my own attempts to filter the Ahrefs bot and all the Chinese and other bots that were crawling gigabytes of data; I used .htaccess and robots.txt for that. A couple of weeks ago I was bashing my head over what had happened, so I tested the robots.txt on some checker website, chose the Google user agent, and it said Google had no access. That shouldn't be the case, since I never filtered Google, but the checker read the Disallow and Crawl-delay rules meant for other robots, and one of the .htaccess lines was *bot, so maybe that blocked it.
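
A quick way to repeat that check locally is Python's built-in robots.txt parser. A sketch, using the walanco.de domain from this thread purely as an example and a made-up product path; note that Python's parser is simpler than Google's, so treat the result only as a hint:

```
# Sketch: parse the live robots.txt and ask whether various user agents may
# fetch a sample page. The product path is a made-up example.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://walanco.de/robots.txt")
rp.read()

for ua in ("Googlebot", "Googlebot-Image", "*"):
    allowed = rp.can_fetch(ua, "https://walanco.de/some-product.html")
    print(f"{ua}: {'allowed' if allowed else 'BLOCKED'}")
```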

Now I have started with new sitemaps etc., and I have gone through countless hours of fixing "Poor" pages. My time to first byte is slow; otherwise I get 85+ or even 90+ on mobile pages and my Core Web Vitals seem okay-ish, but it is slow. I didn't even realize that the page speed report integrated into Google Search Console actually tests an indexed version of the page, not the actual live page; testing that page I got great results, yet Google said it has CLS issues.

BTW, I'm still on 4.9.2-something with a lot of interconnected add-ons, and I couldn't upgrade to 4.11 on the test server, so I gave up for now.

It is getting worse for Walanco. We are 6 months in and basically flying blind with prospects. Neither we nor the SEO agencies can really explain what the issue is.

Can anyone else confirm having these issues when launching their CS-Cart store, and did it even out once domain authority increased? It would give us peace of mind if so.



What does Google say in the “See details” when you click on a specific URL from the list?

It does look okay to me; not sure what the issue is. Google keeps it very general, which doesn't help us at all.

Why does it say there is no referring sitemap?
The SEO module is on, I guess; did you rebuild the sitemap?

We are using a cron job together with the AB: Advanced Sitemap add-on. The job runs every night, but we have split the sitemap into several files (product1, product2, etc.).

I just saw that those all go to a 404 page. Also, the image sitemap showing 192 entries, the same as the product XML, is a bit confusing; we have 100k products in total.

https://walanco.de/index.php?dispatch=ab__advanced_sitemap.sitemap

Also not sure where this one is coming from:
https://walanco.de/custom_links1.xml
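
A sketch of how one might verify this from the outside, assuming the sitemap index above uses the standard sitemaps.org namespace: fetch the index, then report the HTTP status and URL count for every child sitemap it lists, which would show exactly which of the split product files return 404 and how many entries each one carries.

```
# Sketch: fetch the sitemap index, then report the HTTP status and number of
# <url> entries for every child sitemap it references. Assumes the standard
# sitemaps.org namespace; the index URL is the one posted above.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

INDEX = "https://walanco.de/index.php?dispatch=ab__advanced_sitemap.sitemap"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch(url):
    req = urllib.request.Request(url, headers={"User-Agent": "sitemap-check"})
    with urllib.request.urlopen(req, timeout=60) as resp:
        return resp.status, resp.read()

status, body = fetch(INDEX)
print("index: HTTP", status)

for loc in ET.fromstring(body).findall(".//sm:sitemap/sm:loc", NS):
    child = loc.text.strip()
    try:
        child_status, child_body = fetch(child)
        count = len(ET.fromstring(child_body).findall(".//sm:url", NS))
        print(f"{child}: HTTP {child_status}, {count} urls")
    except (urllib.error.URLError, ET.ParseError) as e:
        print(f"{child}: failed ({e})")
```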

So you have over 18,000 categories but only 192 products? If not, you need to contact AB and ask them why only 192 products end up in the sitemap. I'm not sure why you'd want to use a third-party add-on; it seems like overkill, but that's just me. Maybe if it were working correctly it would be a lot better than the default Google sitemap, though.

I think Largest Contentful Paint (LCP) is one of the biggest issues with CS-Cart. It takes 9 seconds for the consent tool to render, and sometimes a simple category description is the LCP element; how is that even possible? It is simple HTML at the top of the page, basically text blocking other elements from rendering. No wonder the Google crawler has no patience.
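
One way to narrow that down (a sketch, using the walanco.de homepage from this thread as the example URL) is to time the raw HTML delivery on its own: if the HTML arrives in well under a second, a 9-second LCP is coming from render-blocking CSS/JS such as the consent tool, not from the server.

```
# Sketch: rough server-side timing for one page. urlopen() returns once the
# response headers arrive, so the first timestamp approximates time to first
# byte; the second marks the full HTML download. Example URL from this thread.
import time
import urllib.request

URL = "https://walanco.de/"
req = urllib.request.Request(URL, headers={"User-Agent": "Mozilla/5.0"})

start = time.perf_counter()
with urllib.request.urlopen(req, timeout=60) as resp:
    first_byte = time.perf_counter()   # headers received
    html = resp.read()                 # body fully streamed
    done = time.perf_counter()

print(f"TTFB (approx): {first_byte - start:.2f}s")
print(f"full HTML:     {done - start:.2f}s ({len(html)} bytes)")
```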