We are running out of ideas. We started out with a staging site which unfortunately got 100k+ pages indexed unintentionally. We got this solved and the pages removed, but now Google seems to have big issues indexing our original walanco.de website.
Does Google somehow have an issue with the CS-Cart code? We had around 5k pages indexed, but as of today it is only 240. Sometimes we have server issues, but that cannot be the reason. If anything, it is the worst pages that are being indexed. Not sure why.
It is all a guessing game by now, since we have basically tried everything there is to do. The website is decent, with lots of good content and optimized pages. It is sometimes a bit slow due to API queries, but there is nothing obvious that would explain going from 5k to 240 indexed pages.
In total we have 100k+ pages that can be indexed. We would be happy with getting 10k for starters.
Has anyone had similar issues with their new store? Any hints on where to optimize (backlinks?) etc. would be appreciated.
My boss thinks it must be an issue with the way CS-Cart's code is structured?
I have the same issue. Two CS-Cart websites on the same server. Google indexes the pages and then starts deindexing them. Then it indexes again and deindexes again.
And I believe that in a week's time Google will index again and then deindex.
My two websites are 1+ years old. They have different themes but are on the same server. No issues with Bing or Yandex; they have the pages indexed.
I believe it is a problem with Google. Maybe they have technical issues, or they have a new algorithm that decides whether you will be visible or not. Google knows best what people need to see.
I can confirm we are having the same issue. Since March our site went from #3 in Google for a niche market all the way down to #30, i.e. page 3. We made no changes and no updates… and as a result we have concluded that Google's new algorithm, pushed in mid-March, does not like the structure CS-Cart is built upon. So after almost 10 years we are moving to a new vendor.
Edit: we think it has to do with how "Core Web Vitals" reads the site, as we have 300+ "Poor" rated pages that we cannot optimize, since it is simply a function of how CS-Cart is built and loads (e.g. external CSS). We even went as far as setting up Cloudflare to try to improve the stats. For anyone with the same issue: set up Google Search Console and review the pages there.
I'd try removing /app/ from the robots.txt disallow rules, and also removing anything in the .htaccess that would prevent bots from crawling pages, crawl delays, etc.
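For reference, the kind of robots.txt entries worth reviewing look roughly like this (a sketch from memory, not necessarily your exact file):

    User-agent: *
    # CS-Cart system directories are often disallowed by default
    Disallow: /app/
    Disallow: /var/
    # crawl delays like this are a candidate for removal
    Crawl-delay: 10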
I've had a huge deindexing issue, but IMO it was caused by my attempts to filter the Ahrefs bot and all the Chinese bots and so on that were crawling gigabytes of data. I used .htaccess and robots.txt for that. A couple of weeks ago I was bashing my head over what had happened, so I tested my robots.txt with an online checker, chose the Google user agent, and it said Google has no access. That shouldn't be the case, since I have never filtered Google, but it apparently read the Disallow and Crawl-delay rules meant for other robots, and one of the .htaccess lines matched *bot, so maybe that blocked it.
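To illustrate the kind of rule I mean (a rough sketch, not my exact file): a generic "bot" pattern like variant 1 below blocks any user agent containing "bot", which includes Googlebot and Bingbot, while variant 2 only targets the crawlers you actually want to keep out:

    RewriteEngine On

    # Variant 1 (too broad): "bot" also matches Googlebot, Bingbot, etc.
    RewriteCond %{HTTP_USER_AGENT} bot [NC]
    RewriteRule .* - [F,L]

    # Variant 2 (narrower alternative): name the specific crawlers to block
    RewriteCond %{HTTP_USER_AGENT} (AhrefsBot|SemrushBot|MJ12bot) [NC]
    RewriteRule .* - [F,L]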
Now I have started over with new sitemaps etc., and I went through countless hours of fixing "Poor" pages (my time to first byte is slow; otherwise I get 85+ or even 90+ on mobile pages and my Core Web Vitals seem okay-ish, but it is slow). I didn't even realize that the page speed report integrated into Google Search Console actually tests the indexed version of a page, not the actual live page; testing said page myself I got great results, yet Google said it has CLS issues.
BTW, I'm still on 4.9.2-something with a lot of interconnected add-ons, and I couldn't upgrade to 4.11 on the test server, so I gave up for now.
It is getting worse for Walanco. We are 6 months in and basically flying blind when it comes to prospects. Neither we nor the SEO agencies we asked can really explain what the issue is.
Can anyone else confirm having these issues when launching their CS-Cart store, and does it even out once the domain authority increases? It would give us peace of mind if so.
We are using a cron job together with the ab: Advanced Sitemap add-on. The job runs every night, but we have split the sitemap into several files (product1, product2, etc.).
I just noticed that these all lead to a 404 page. Also, the image sitemap showing 192 entries, the same as the product XML, is a bit confusing. We have 100k products in total.
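For reference, when a sitemap is split like this, the add-on should generate a sitemap index that points at the child files, each of which has to return 200 rather than 404, and a single sitemap file may hold at most 50,000 URLs (which is why 100k products need at least two product files). Roughly like this (the file names and paths are placeholders, not necessarily what the add-on writes):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-product1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-product2.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-images.xml</loc>
      </sitemap>
    </sitemapindex>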
So you have over 18,000 categories but only 192 products? If not, you need to contact AB and ask them why only 192 products end up in the sitemap. I'm not sure why you'd want to use a 3rd-party add-on; it seems like overkill, but that's just me. Maybe, if it were working correctly, it might be a lot better than the default Google sitemap.
I think Largest Contentful Paint (LCP) is one of the biggest issues with CS-Cart. It takes 9 seconds for the consent tool to render, and sometimes a simple category description is the LCP element. How is that even possible? It is plain HTML at the top of the page, basically text blocking other elements from rendering. No wonder the Google crawler has no patience.
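One generic mitigation, assuming the consent tool is loaded via a plain script tag in the head (the path below is made up for illustration), is to load it with defer so the browser does not stop parsing and painting the page while the script downloads:

    <!-- render-blocking: parsing stops until this is fetched and executed -->
    <script src="/js/consent-tool.js"></script>

    <!-- non-blocking: downloads in parallel, runs after the HTML is parsed -->
    <script src="/js/consent-tool.js" defer></script>

Whether that is acceptable depends on the consent tool, since some of them are meant to run before anything else loads.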