What's the latest advice for caching CS-Cart?

Hi, I've been reading through some old threads about caching for CS-Cart - some people turning it off, some using SQLite, etc. Most of the instructions/hacks seem to be outdated.



I'm just wondering what people are doing for the best site performance from version 2.2.1 upward? (I'm running 2.2.2.)



I also read that some people have set up a cron job to clear the cache once or twice a day, so I'm also wondering about the implications of that - whether it actually helps or hinders site performance.



I have to say, I've noticed our site slowing down lately. I realise there could be a lot of other factors at play, but I would like to investigate the best caching method if possible.



Our site has medium to heavy traffic, almost all products have options, and there are a number of filters such as 'shop by brand'.



Lastly, under the Logs settings, I noticed you can check or uncheck a number of options - can some or most of these be disabled to improve site performance?



Many thanks,



Scott.

NEVER use MySQL as the backend cache. Why use the same backend to cache the very data you're caching from it (settings tables, language variables, etc.)? It makes no sense.



I use SQLite and have had good luck with shmem too. However, shmem is going to be a function of your hosting environment. If you don't have a dedicated server (and control over real, physical memory), you probably shouldn't use shmem. But it will be the fastest, since everything is held in memory rather than on disk.



Logging is a pretty small part of the overall process. It is also done at the end of most transactions, so it has very little impact on performance. The biggest win from a performance perspective is to turn off the Statistics addon. However, you should make a cost/value judgment as to whether this is a good idea for your particular situation; doing so might impact things like fraud detection and other features that keep track of IP addresses, sessions, etc. But if you have a pretty active site, this could gain you the most.



Performance of active cs-cart sites is penalized. Lower-traffic sites get benefits that just don't pan out for active sites. I love to see people publish page speeds for sites that are doing a single access to a page on an otherwise unloaded server. More active sites are penalized because cs-cart abuses AJAX requests, initiating them on every page load (versus on user actions, which is what AJAX was designed for). This causes about 3-6 requests to the server for every requested page, just to do things like initialize JavaScript data that might not even be used on most page accesses. If you have SSH access to your server, run the following command and you'll see what I mean. You will find multiple 'established' connections for every IP address accessing your store - each one is a separate PHP process, cache read, etc. The command is:

netstat -tp 2>/dev/null | grep -E 'ESTABLISHED|Send-Q' | grep -v ':imap'


This will also give you a good view of your actual crawler activity, etc.


Why wouldn't CS-Cart fix this if it's such a big issue, then?

backwards thinking, too much work, etc.

Hi Tony, thanks for your detailed reply. I tried SQLite on our Aussie server and kept getting errors about the 'database being locked'. The ISP seemed pretty clueless and wanted to roll PHP back from 5.3 to 5.2. We have a highly customised CS-Cart website and I didn't know what the ramifications of rolling back PHP might be. Shmem would've been good, but we just downgraded from our dedicated server to a VPS. Servers cost a fortune here in Aus - the dedicated was over $400 a month.



Are there any benefits to disabling or clearing out the cache once a day? I read of some users doing that, although it seems weird to empty a cache when it's there to speed things up. :~)



By the way, I didn't start this thread to bash CS-Cart. We are very happy with the product, but I am a little concerned about the performance at times, particularly logging into Admin. That can sometimes take a few minutes, even with the storefront still responding normally.


I wonder how much colocation costs in Aus?

cs-cart's target market is small merchants. Most merchants blame their host when performance problems occur. cs-cart does not look at (nor do they test) moderately or heavily used sites. If you run the command I provided, it will be clear what's going on on your site.



Regarding sqlite and errors… I do not know. I've not seen that problem on any site I've configured with sqlite.



I'm glad they provide options for the cache handler. Choosing/finding the right one for any particular site's needs requires analysis and an understanding of what resources the site's configuration uses. My performance ratings for the available options would be (highest to lowest):

Shmem

Sqlite

Files

MySQL



But like most performance issues, your mileage may vary…
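For reference, the backend is selected in config.local.php. The sketch below shows roughly what that looks like on a 2.x install; the key name and accepted values are assumptions based on a typical setup, so check the comments in your own config.local.php before editing:

// config.local.php (CS-Cart 2.x) - cache backend selection.
// Key name is illustrative; verify it against your own file.
// Typical values, in the performance order given above:
//   'shmem'  - shared memory (really needs a dedicated server)
//   'sqlite' - requires PHP's SQLite support
//   'file'   - plain files under var/cache/ (the safe default)
//   'mysql'  - stores the cache back in the same database (avoid)
$config['cache_backend'] = 'file';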

What do you mean by small business, not moderate to heavy? We have a couple of stores; each gets about 10k visitors a month. Would that be OK?

Hi Tony, thank you again. I'm afraid I can't seem to get your line - netstat -tp 2>/dev/null | grep -E 'ESTABLISHED|Send-Q' | grep -v ':imap' - running over SSH, but I have very little experience with that method.



In the absence of SQLite working, I'm thinking of changing from 'mysql' caching back to 'file' - just wondering the best way to do this on a live store? I know how to make the edit in the config.local.php file, but do I need to clear out the cache directory first? I don't want any nasty error messages appearing to customers on the frontend.



Regards,



Scott.



@Scott - make the change in config.local.php and then clear the cache. The cached files won't be there, so it will try to rebuild them if you do it while the site is active. No big deal; the cart will handle the change fine.



@Sole - it's more about simultaneous access than the total number of users. The only way to tell is to monitor performance periodically and see how your site behaves in relation to the load. All I can tell you is that if 3-6 CGI processes are spawned for each physical page read, that's going to add to the load on the server, and each of those processes is going to try to read/write the cache data.
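To make that switch concrete, a minimal sketch of the sequence (assuming a stock 2.x layout where the generated cache lives under var/cache/ - verify this on your own install first):

// 1. In config.local.php, point the cache at the file backend
//    (key name illustrative - see the comments in your own file):
$config['cache_backend'] = 'file';

// 2. Clear the existing cache so it is rebuilt with the new backend,
//    e.g. by emptying var/cache/. The cart regenerates the cache on
//    the next request, so visitors should only see a slower first hit.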

Hi Tony, we have CS-Cart on a US server (not yet live) and it has SQLite3 installed. I changed the caching setting in config.local.php to 'sqlite' and I get this error:



Warning: require(/home/mmc/public_html/core/db/sqlite.php) [function.require]: failed to open stream: No such file or directory in /home/mmc/public_html/init.php on line 34

Fatal error: require() [function.require]: Failed opening required '/home/mmc/public_html/core/db/sqlite.php' (include_path='/home/mmc/public_html/lib/pear/.:/usr/lib/php:/usr/local/lib/php') in /home/mmc/public_html/init.php on line 34




I tried changing it to 'file' and got this error:


Warning: require(/home/mmc/public_html/core/db/file.php) [function.require]: failed to open stream: No such file or directory in /home/mmc/public_html/init.php on line 34

Fatal error: require() [function.require]: Failed opening required '/home/mmc/public_html/core/db/file.php' (include_path='/home/mmc/public_html/lib/pear/.:/usr/lib/php:/usr/local/lib/php') in /home/mmc/public_html/init.php on line 34




Looking in the core/db directory, there are only two files: mysql.php and mysqli.php - the installation doesn't have any .php file for 'sqlite' or 'file'.



Do you have any ideas?



Kind Regards,



Scott.



What version of cs-cart are you running? The file it's trying to load should be

core/cache/class.cache_backend_sqlite.php



I think this is the file for 2.1 and beyond. Prior to that, you probably need to load Zeke's custom registry, which uses the PDO version of SQLite3 rather than the native one. You'd have to search the forums for “registry.php” for that version.
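One possible explanation for the earlier core/db/sqlite.php errors, offered as a guess rather than a diagnosis: config.local.php holds separate settings for the database driver (loaded from core/db/) and the cache backend (loaded from core/cache/), and setting the former to 'sqlite' would produce exactly that require() failure. The key names below are assumptions for illustration - check your own file:

// Database driver - selects core/db/mysql.php or core/db/mysqli.php.
// Leave this on MySQL; it has nothing to do with caching.
$config['db_type'] = 'mysql';

// Cache backend - the setting this thread is about; it loads
// core/cache/class.cache_backend_<name>.php.
$config['cache_backend'] = 'sqlite';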

Hi Tony, I'm running 2.2.2 and definitely have the core/cache/class.cache_backend_sqlite.php file - perhaps permissions need to be set for it, i.e. make it writable?



I did see Zeke's thread about the PDO hack, but the code he referenced seemed to be outdated - in other words, I couldn't find the snippet he referred to in the 2.2.2 file.








Tony,



I'm a little worried now that all the time and effort I have put into setting up CS-Cart may be a wash. It sounds like you're saying the performance could be horrible for me.

I plan to run 3 websites; as I mentioned, each averages about 10k visitors a month, probably doubling during the holidays - an average of 300-375 a day. I'm not sure if that's a lot for CS-Cart to handle or not. We're using a Wired Tree VPS with 1024 MB of RAM.



Do you think CS-Cart will perform well for us?

Solesurvivor, you can simulate 500 simultaneous visitors on your site using loadimpact.com.

The test takes 15 minutes and gives you a report on memory used and load speed.

@colortone - that tells you nothing about what's really going on on the server. I watched as it ran and it only created one connection to the server, so I'm not sure how it's deriving its figures of 10, 20, etc. simultaneous users. I also did not see the load average on the server change after the test started. So I'd take the info they report with a grain of salt.



You had me excited for a minute…



Update: I let this run for a while longer and it does tend to increase the number of connections. It seems to be grabbing images mostly. It doesn't seem to really be navigating the site (following links) as I monitor the access log, and what I see on the server side looks nothing like a normal shopper. I'm guessing this is due to things like browser type and it behaving more like a robot than a user, and since all accesses come from a single IP (different ports), the session manager and statistics won't kick in as heavily as real users' accesses would (from different IPs).



But it's a decent page-load analyzer… I'm looking at things from the server side, not the client (and yes, if the server is bad, the client will be bad too). Note also that this runs from Stockholm, so there will be some latency in the requests to my sites in Dallas, TX.




That said, do you think CSC can handle a modest number of concurrent visitors, say 30-50, or do you think it's not built for that? I wish someone from CSC would chime in on this one.

It's very difficult to answer whether CSC can handle a given number of visitors without knowing the page size and content. Use Firebug on your site and hit the “Net” tab. It will tell you the page size and how much of that load is served from the cache.



Think about it like a car race, where every bit of weight you shed gives you a chance to increase speed and gas mileage. So ask yourself: how many addons are you using, are your images as small as they can be, are there blocks with no purpose, etc. Get your site “fit” first.



It's hard to blame CSC; script optimization takes time. I read the forums of other shopping carts, and they all encounter the same problems. Once they increase traffic or get more addons working, the site demands more resources.






@colortone: I see your point. I just checked and my home page shows 1.7s (onload: 1.74s) after having cleared my cache. I think that's good. How can I tell what load is from the cache?

If your site is performing to your satisfaction, leave it alone.



If it starts to fade then hire someone who has the knowledge, skills and experience to characterize what's going on and recommend how the site/system can be adjusted.



CSC is a reliable shopping cart and makes pretty good use of database optimization, etc. However, it is very resource-intensive simply in the way it's put together (architected). That's why I say it is built for smaller merchants, so that they can afford to get reasonable performance without having to invest in dedicated servers and the other things a moderate-sized independent merchant would invest in.



I have clients with over 100,000 products and performance is good. But it takes a dedicated server to do so.



If you are not on a dedicated server, then you have far fewer options and much less control over how shared resources like disk controllers, ethernet cards, physical memory, CPU, etc. are allocated.



You can't judge the performance of a system by simply banging on the home page over and over again. The only way is to monitor the underlying system while it is in use by real shoppers/customers. The monitoring will reveal where resources are falling short. You can then work with your host to increase those resources that need it.



There are no silver bullets and there is no escaping careful analysis and characterization. Every site will be different. I can only convey to you what I see on the servers I manage and the characteristics of the sites that I host on those servers.