Optimizing CS-Cart for Better Performance

When you start putting blocks of content and banners on the page, it will slow down for sure. :-)

[quote name='kingsleypress' timestamp='1415814631' post='196802']

When you start putting blocks of content and banners on the page, it will slow down for sure. :-)

[/quote]



I'll be working hard on it over the next few weeks.

I still have a few fixes to make, to get it even faster.

Anyway, I want to keep this style: clean, easy and fast. I like it :mrgreen:

Let’s see… :mrgreen:

[quote name='Magpie Don' timestamp='1414695472' post='195732']

3. Enable gzip compression: the single greatest accelerator you are going to implement.

[/quote]



Have you had any user reports of broken layouts after enabling gzip? I have a corporate client with many users who reported problems after I turned that on. I was unable to replicate the problem locally, but from their screenshots it looked like something prevented the compressed CSS and JavaScript from reaching their browser. Perhaps an overzealous firewall was placing the gzipped content in quarantine? Disabling gzip fixed the problem immediately. My .htaccess file did specify mod_deflate.



thanks,

Glen

The gzip content compression is different from the gzip'd files that cs-cart uses for css. The gzip compression is an output filter in your Apache service (web service) and also an input filter in the browser. Note that if you're using V4, you must have the following lines in your .htaccess for proper css to be sent to the browser (it's an output filter).





AddEncoding gzip .gz
RewriteCond %{REQUEST_FILENAME} \.(js|css)$
RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [QSA,L]

<FilesMatch "\.css\.gz$">
Header unset ETag
FileETag None
ExpiresActive On
ExpiresDefault "access plus 1 year"
ForceType text/css
Header set Content-Encoding: gzip
Header set Cache-control: private
</FilesMatch>

<FilesMatch "\.js\.gz$">
Header unset ETag
FileETag None
ExpiresActive On
ExpiresDefault "access plus 1 year"
ForceType text/javascript
Header set Content-Encoding: gzip
Header set Cache-control: private
</FilesMatch>

Header set Access-Control-Allow-Origin "*"




This directs Apache to strip the .gz suffix from the file name and to send the "Content-Encoding: gzip" header.
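In effect, those rewrite rules implement the decision logic below. This is only an illustrative Python sketch of what Apache does, not CS-Cart code; the function name and arguments are made up for the example:

```python
def choose_file(request_path, accept_encoding, existing_files):
    """Mimic the .htaccess rewrite: serve file.gz with
    Content-Encoding: gzip when the request is for a js/css
    asset, the browser accepts gzip, and a pre-compressed
    copy exists on disk. Otherwise serve the file as-is."""
    headers = {}
    is_asset = request_path.endswith((".js", ".css"))
    gz_path = request_path + ".gz"
    if is_asset and "gzip" in accept_encoding and gz_path in existing_files:
        headers["Content-Encoding"] = "gzip"
        headers["Cache-Control"] = "private"
        return gz_path, headers
    return request_path, headers

# A browser that sends "Accept-Encoding: gzip, deflate" gets the .gz copy:
path, headers = choose_file("styles.css", "gzip, deflate",
                            {"styles.css", "styles.css.gz"})
```

The key point is the last `RewriteCond`: if no `.gz` file exists on disk, the request falls through untouched, which is why the rules are safe to leave in place.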

[quote name='tbirnseth' timestamp='1415851535' post='196840']

The gzip content compression is different from the gzip'd files that cs-cart uses for css. The gzip compression is an output filter in your Apache service (web service) and also an input filter in the browser. Note that if you're using V4, you must have the following lines in your .htaccess for proper css to be sent to the browser (it's an output filter).

[/quote]



My .htaccess does contain this (I think CS-Cart 4.2.x adds that automatically). I believe my customers are behind a firewall, proxy, or antivirus software that is blocking the compressed content regardless of the .gz suffix.



I found a Google Code blog post discussing the problem:

http://googlecode.bl…web-faster.html



This blog further discusses the issues:

http://www.stevesoud…t-getting-gzip/



I haven't been able to find a reliable online gzip capability tester. I'd like to send my client to one to confirm my hunch.



-Glen

Wow, I've used gzip compression since 2002 on all websites I've ever developed. Much of the world is behind firewalls, and most of the world uses AV software. I've never heard a single reported problem from users in these environments (which I was in myself when developing the websites, i.e. behind a firewall and anti-virus protected by Norton).

Those posts are of little help, as they are from 2009 and start with a concern about IE6.

Could it be that your users were experiencing a browser caching problem?

[quote name='tbirnseth' timestamp='1415851535' post='196840']

The gzip content compression is different from the gzip'd files that cs-cart uses for css. The gzip compression is an output filter in your Apache service (web service) and also an input filter in the browser. Note that if you're using V4, you must have the following lines in your .htaccess for proper css to be sent to the browser (it's an output filter).


AddEncoding gzip .gz
RewriteCond %{REQUEST_FILENAME} \.(js|css)$
RewriteCond %{HTTP:Accept-encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule ^(.*)$ $1.gz [QSA,L]

<FilesMatch "\.css\.gz$">
Header unset ETag
FileETag None
ExpiresActive On
ExpiresDefault "access plus 1 year"
ForceType text/css
Header set Content-Encoding: gzip
Header set Cache-control: private
</FilesMatch>

<FilesMatch "\.js\.gz$">
Header unset ETag
FileETag None
ExpiresActive On
ExpiresDefault "access plus 1 year"
ForceType text/javascript
Header set Content-Encoding: gzip
Header set Cache-control: private
</FilesMatch>

Header set Access-Control-Allow-Origin "*"


This directs Apache to strip the .gz suffix from the file name and to send the "Content-Encoding: gzip" header.

[/quote]



I tried this, Tony, but it did not speed things up for us.

If you are using out of the box V4, then this must be in your .htaccess file or your site will simply not work.

It won't speed anything up since the css is already compressed. This simply changes the name of the file from [ugly_filename].css.gz to [ugly_filename].css and sends the appropriate header to tell the receiver that the content is gzipped.



If you are not using V4, then you need an output filter to gzip the css/js, which will zip the data and tell the receiver that the data is zipped.
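What an output filter does is conceptually simple: compress the body on the way out and let the browser undo it. The sketch below shows the idea in Python; the real filter (mod_deflate) does this in C inside Apache, and for `Content-Encoding: deflate` it uses zlib framing rather than gzip, but the round trip is the same:

```python
import gzip

def compress_response(body: bytes) -> bytes:
    """Compress a response body the way a gzip output filter
    would before it leaves the server. The browser reverses
    this transparently when it sees Content-Encoding: gzip."""
    return gzip.compress(body)

# Repetitive text like CSS compresses very well:
css = ("body { margin: 0; padding: 0; }\n" * 200).encode("utf-8")
compressed = compress_response(css)
```

Because the compression happens per-response, it costs a little CPU on every request, which is exactly why CS-Cart pre-gzips its css/js to `.gz` files instead.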

Do you guys have any ideas on reducing the number of database queries? (I know CS-Cart say they have done some good work on this for the next version, but I need to reduce them for 4.2.2 now.)



My problem in a nutshell:



I don't have money to throw at new hosting. I updated the site last week and for a few days it was throwing over 100,000 queries at the database. I am using “file” cache frontend and “database” backend.



The site is fine as it is for local UK traffic, but at times I can see a first-byte time of 8 seconds for the US and some parts of Europe. The whole site loads in 2 seconds in the UK.

Given what you describe, your issue is more related to latency than to the DB. Remember that every page requires many different "requests" to fulfill it: images, JavaScript, off-site JavaScript like Google Analytics, etc. For each of these transactions, the further away you are from the host (usually measured in "hops" rather than actual distance), the longer each request takes. Many of these are synchronous (i.e. serial, and the page doesn't complete until they are all done).



So if you're seeing longer times for US customers than UK, you might want to consider using a CDN. But if you can't afford better hosting, then affording a quality CDN is probably not in your budget either.



So suggestions would be for you to:
[list=1]
[*]Reduce/remove all unnecessary conversion scripts you might be running for various shopping/indexing networks
[*]Reduce the size of your images
[/list]

The symptoms you describe are not DB issues and you've given no indication of how you measured the 100K queries.
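A back-of-the-envelope model shows why distance dominates here. The request counts and round-trip times below are illustrative assumptions, not measurements from this site:

```python
def page_load_time(num_serial_requests, round_trip_ms, server_ms_per_request):
    """Total time when requests are serial: every request
    pays the full network round trip plus server work,
    so latency is multiplied by the number of requests."""
    return num_serial_requests * (round_trip_ms + server_ms_per_request)

# Same site, same server work per request, different distance from the host:
uk_ms = page_load_time(20, 20, 80)    # assume ~20 ms RTT within the UK
us_ms = page_load_time(20, 120, 80)   # assume ~120 ms RTT UK -> US
```

With 20 serial requests, adding 100 ms of round-trip latency adds two full seconds to the page, even though the server is doing exactly the same work, which is why fewer requests and a CDN help more than a faster database.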

[quote name='tbirnseth' timestamp='1424294887' post='205742']

Given what you describe, your issue is more related to latency than to the DB. Remember that every page requires many different "requests" to fulfill it: images, JavaScript, off-site JavaScript like Google Analytics, etc. For each of these transactions, the further away you are from the host (usually measured in "hops" rather than actual distance), the longer each request takes. Many of these are synchronous (i.e. serial, and the page doesn't complete until they are all done).



So if you're seeing longer times for US customers than UK, you might want to consider using a CDN. But if you can't afford better hosting, then affording a quality CDN is probably not in your budget either.



So suggestions would be for you to:
[list=1]
[*]Reduce/remove all unnecessary conversion scripts you might be running for various shopping/indexing networks
[*]Reduce the size of your images
[/list]

The symptoms you describe are not DB issues and you've given no indication of how you measured the 100K queries.

[/quote]



I'm using CloudFront; it's just the database issue. I don't have any scripts other than the normal Google crap. I have about ten sites on this hosting account, and all the other sites are fine; although a few show the same "first-byte" problem, it's not as high as CS-Cart's.



It would probably be helpful if CS-Cart had a button to rebuild the basic cache rather than delete the whole of it. At least then it wouldn't be doing so much on every visit until that basic cache is first built.

Not sure what you mean by "basic cache". There are basically (no pun intended) two caches (not counting the thumbnail directories built on the fly): the registry and the templates. You can clear the templates separately by using ?ctpl.



The vast majority of DB queries are not cached. What is cached are the settings and other DB queries that don't tend to change, like company info, etc. These are all stored in the registry, and if the registry has an entry, the DB is NOT queried. But things like products, categories, filters and the like are not cached, since varying degrees of information are needed depending on the user context.

I upgraded yesterday from 4.2.4 to 4.3.3. My GTmetrix scores went up from B B to A B. I will probably have A A when I turn the CDN back on again (I turned it off during the upgrade, and while I get all my add-ons sorted out).



However, on Google's PageSpeed Insights, I'm only getting 82/100 for desktop and a miserable 64/100 for mobile.

Hello,

I have a problem with CS-Cart speed.

After many hours, or a day, something happens with the cache and it slows down the website.

If I go and clean the cache, then the speed is OK.

Is there any way to find out what is happening?

CS-Cart version: 4.2.4

PHP: 5.6.32

I also have set Redis as the "cache_backend".

That would be a question for Redis... You might want to turn off Redis until you get the issue addressed.

I replaced my .htaccess file with the suggested one, but none of my web pages are working; just the home page is loading, and other pages are giving the error "The requested URL /biscuits/ was not found on this server.

Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request."

Can you suggest a post or some details on how to optimize CS-Cart? I'm using 4.3.3.

Thanks in advance

Can anyone give me a clue what to add to leverage browser caching for the below report from GTmetrix?

https://www.google.com/recaptcha/api.js?onload=onRecaptchaLoaded&render=explicit (5 minutes)
https://www.google.com/recaptcha/api2/webworker.js?hl=en&v=v1523860362251 (5 minutes)
https://www.google-analytics.com/analytics.js (2 hours)
https://www.hivis.co.uk/app/addons/product_designer/data/icons/resize.svg (2 days)
https://www.hivis.co.uk/app/addons/product_designer/data/icons/rotate.svg (2 days)
https://www.hivis.co.uk/app/addons/product_designer/data/icons/trash.svg (2 days)
https://www.hivis.co.uk/design/themes/responsive/media/images/icons/ajax_loader.svg?1524598904 (2 days)

my htaccess

DirectoryIndex index.html index.php

<IfModule mod_deflate.c>
    # Compress HTML, CSS, JavaScript, Text, XML, fonts
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript text/javascript application/json
    AddOutputFilterByType DEFLATE application/x-font application/x-font-opentype application/x-font-otf application/x-font-truetype application/x-font-ttf font/opentype font/otf font/ttf application/x-woff application/x-font-woff
    AddOutputFilterByType DEFLATE text/css text/html text/plain

    # Remove browser bugs (only needed for really old browsers)
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    Header append Vary User-Agent
</IfModule>

Header set Access-Control-Allow-Origin "*"

# Cache all images for 2 weeks
ExpiresActive on
ExpiresDefault "access plus 2 weeks"
Header set Cache-Control "max-age=1209600"

my htaccess

DirectoryIndex index.html index.php

<IfModule mod_deflate.c>
    # Compress SVG
    AddType image/svg+xml svg svgz
    AddOutputFilterByType DEFLATE image/svg+xml

    # Compress HTML, CSS, JavaScript, Text, XML, fonts
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript text/javascript application/json
    AddOutputFilterByType DEFLATE application/x-font application/x-font-opentype application/x-font-otf application/x-font-truetype application/x-font-ttf font/opentype font/otf font/ttf application/x-woff application/x-font-woff
    AddOutputFilterByType DEFLATE text/css text/html text/plain

    # Remove browser bugs (only needed for really old browsers)
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
    Header append Vary User-Agent
</IfModule>

Header set Access-Control-Allow-Origin "*"

# Cache all images for 2 weeks
ExpiresActive on
ExpiresDefault "access plus 2 weeks"
Header set Cache-Control "max-age=1209600"
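As a sanity check on that Expires line (simple arithmetic, nothing CS-Cart-specific), `max-age=1209600` is exactly two weeks expressed in seconds, so the `Cache-Control` header and the `ExpiresDefault "access plus 2 weeks"` directive agree:

```python
# 14 days * 24 hours * 60 minutes * 60 seconds
two_weeks_seconds = 14 * 24 * 60 * 60
print(two_weeks_seconds)  # 1209600, matching max-age=1209600
```

If you change one of the two directives, keep the other in sync, or browsers and intermediate caches may disagree about how long the images are fresh.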

Howdy,

I adjusted it in the quote.

Sadly, you cannot change the headers of the Google Analytics JS file; kind of hypocritical, but that's just the way it is.

Kind regards,