Inefficient File Caching

Hello,



I've been contacted several times by my host, and I have also noticed cs-cart becoming very slow. With file caching enabled, hundreds of thousands of files are being cached constantly, and the server is being killed by this process. I am on a very reliable service, WebFaction, and I was contacted because of the issues this problem is causing.



Clearing the cache using a cron job kills the functionality of caching. Why is this happening? I mean, for Christ's sake, there were 600,000 files taking up 70 MB!
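For what it's worth, rather than wiping the whole cache from cron (which defeats the point of caching), something like the sketch below could prune only the old pfilters* files and leave everything recent alone. The cache path and the one-day threshold are my assumptions, not anything official from cs-cart, so adjust them to wherever your store actually keeps its file cache:

#!/usr/bin/env python3
"""Prune only stale pfilters* cache files instead of wiping the whole cache.
Assumes the file cache lives under var/cache/ -- adjust CACHE_DIR for your setup."""
import os
import time

CACHE_DIR = "/home/myshop/www/var/cache"   # hypothetical path, change to your store's cache dir
MAX_AGE_SECONDS = 24 * 3600                # keep anything newer than one day

now = time.time()
removed = 0
for root, _dirs, files in os.walk(CACHE_DIR):
    for name in files:
        if not name.startswith("pfilters"):
            continue
        path = os.path.join(root, name)
        try:
            if now - os.path.getmtime(path) > MAX_AGE_SECONDS:
                os.remove(path)
                removed += 1
        except OSError:
            pass  # the cart may have removed or replaced the file in the meantime

print(f"removed {removed} stale pfilters* cache files")

Run from cron once a night; it only touches pfilters* files older than the threshold, so the hot cache entries stay warm.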



I am on 4.0.1 and had trouble updating to 4.1.2: a lot of things don't work properly and, most importantly, settings aren't saved. I'm also not sure whether updating would fix the problem anyway.



I am open to all suggestions, especially from the cs-cart team, whom I have paid several times. I contacted the Greek cs-cart representative about the update and haven't gotten a reply for a week now, which I find unacceptable. It's not an issue with the representatives; it's the technical department that slows this down.



I am very, VERY unsatisfied. After all these problems and bugs, my patience is reaching its end, and I still get calls from clients several times every day because their website is slow.

I was finally able to locate what's wrong. Whether I use the file cache or the MySQL cache, the same thing happens.



Hundreds of files (or records, in the case of MySQL) starting with pfilters* are being created CONSTANTLY.



What is this? Why is it happening, and how can it be avoided? Even without traffic, when I refresh the database in phpMyAdmin, I can see the records in the cache table increasing by the second. Within a few hours the database grows from its original size of 35 MB up to 400 MB.
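If anyone wants to watch this growth outside phpMyAdmin, a rough sketch like the one below polls the cache table and reports how many pfilters* rows it holds. I'm guessing at the table name (cscart_cache) and the column names, so treat them as placeholders for whatever your schema actually uses:

#!/usr/bin/env python3
"""Poll the MySQL cache table and report how many pfilters* rows it holds.
Table and column names below are assumptions -- check your own schema first."""
import time
import pymysql

conn = pymysql.connect(host="localhost", user="shop", password="secret", database="shop")

while True:
    with conn.cursor() as cur:
        # 'cscart_cache', 'name' and 'data' are guessed names, not confirmed ones
        cur.execute(
            "SELECT COUNT(*), COALESCE(SUM(LENGTH(data)), 0) "
            "FROM cscart_cache WHERE name LIKE %s",
            ("pfilters%",),
        )
        rows, size = cur.fetchone()
    print(f"pfilters rows: {rows}, approx size: {size / 1024 / 1024:.1f} MB")
    time.sleep(60)  # sample once a minute

Logging that output for a day would at least show whether the growth tracks traffic or keeps climbing on its own.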



Admins, HELP!



Edit: I am certain that the slowness other people are encountering comes from this same thing; your replies would be much appreciated!

Same thing happens here!!

Product filters are expensive.

They should not be. Product filters are just tags.

Not really… They are a matrix of values related to a product. Take, for instance, a fishing reel. It has various “line capacities” for different “line weights”. So you might have 200 yd @ 30 lb, 400 yd @ 20 lb, 1000 yd @ 10 lb, etc. And those reel capacities might apply to 100 products.



Given that (say, 3 capacity values across 100 products), there are 300 unique combinations of product and feature. Hence you have 300 cached items for just this one feature. The math can compound rather rapidly if you use features indiscriminately. I'm not sure, but I think even an empty feature value for a product might be cached, i.e. there might be an entry for “line capacities” for “fishing rods”, which will eat up another node.
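To put rough numbers on how that compounds, here's a back-of-the-envelope sketch; every figure in it is made up purely for illustration, not measured from any store:

# Back-of-the-envelope illustration of how filter cache entries compound.
# All numbers below are invented for the example.
products_with_feature = 100   # reels sharing the "line capacity" feature
values_per_feature = 3        # 200yd@30lb, 400yd@20lb, 1000yd@10lb

entries_one_feature = products_with_feature * values_per_feature
print(entries_one_feature)    # 300 cached items for this single feature

# Now use features indiscriminately: dozens of features across a larger catalog.
features = 30
products = 2000
values_per_feature = 5
print(features * products * values_per_feature)  # 300,000 potential cache nodes

So a store that leans heavily on features can plausibly end up with cache entry counts in the hundreds of thousands, which is the order of magnitude the original poster is seeing.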



Product features were a big strain on the DB in versions earlier than V4. In V4, they are now cached when first accessed.
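I don't have cs-cart's internals in front of me, but the “cached when first accessed” behaviour is the usual lazy pattern, roughly like this sketch (the function and key names are mine, not cs-cart's, and db.load_features is a stand-in for whatever query the cart actually runs):

# Generic cache-on-first-access sketch; names are illustrative only.
_feature_cache = {}

def get_product_features(product_id, db):
    """Return features for a product, hitting the DB only on the first access."""
    key = f"pfilters_{product_id}"
    if key not in _feature_cache:
        # the expensive query runs once; later requests reuse the cached result
        _feature_cache[key] = db.load_features(product_id)  # hypothetical loader
    return _feature_cache[key]

Each distinct key gets its own entry, which is why a big catalog with many features ends up with so many pfilters* nodes.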

Well, I am going all the way with features, i.e. dozens of features and filters per product. We will see what issues I encounter. Maybe it will require a Varnish-type cache to speed things up.

With the same features on 2.x I had no problems at all.

Just to be clear: I was not implying there is any problem in cs-cart as it relates to product features. The poster didn't know what the 'pfilter' files were for or why so many were generated, and I simply tried to explain how it all works. I was NOT trying to identify the cause of his problem (which is more than likely something outside of cs-cart).



Just wanted to clarify my discussion points.

Support has no solution apart from a cron script they offered to clear the cache. I've upgraded my plan to get more free space; we'll see how it goes.



PS: the new plan has 320 GB of space, but if the cart never removes the small registry files, this space will also fill up one day…

Attached is a disk read/write graph; 4.x is writing something all the time…

Same problem here!