MySQL nightmares saga

I installed Jet Profiler for MySQL. This shop is really one big MySQL nightmare. :smiley: It can easily melt MySQL on an 8 GB RAM, quad-core machine, overwhelming the other 20 sites (mostly TYPO3) running on it.



This is the result of calling dispatch=sales_reports.reports.







The first try fails with:

```
Allowed memory size of 134217728 bytes exhausted (tried to allocate 86 bytes)
```
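That 134217728 bytes is PHP's common default memory_limit of 128 MB; raising it is a one-line change in php.ini (or per vhost):

```ini
; php.ini -- per-request memory ceiling (the default here was 128M)
memory_limit = 256M
```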



OK, let’s try a second time with a 256 MB memory_limit. Another error:

```
mod_fcgid: read data timeout in 360 seconds
Premature end of script headers
```
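mod_fcgid aborts the request when the FastCGI process sends no data within IPCCommTimeout seconds (360 in this setup). Raising it is an Apache config change along these lines (newer mod_fcgid releases spell the directive FcgidIOTimeout):

```apache
# httpd.conf / vhost: allow long-running FastCGI requests
IPCCommTimeout 3600
```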




OK, upped IPCCommTimeout to 3600; let’s try again:

```
Connection reset by peer: mod_fcgid: read data from fastcgi server error…
```

You don’t even want to know what kind of queries it runs on the product_features tables. To be continued… :smiley:

Welcome to my life :stuck_out_tongue:

A sales report is probably about the worst DB hog you can find in this or any other cart that does not generate separate sales data as orders are placed. This is because it has to look at all orders to derive the info it’s looking for.
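To make that concrete: with no pre-aggregated sales data, a “revenue per day” report reduces to a full pass over the order history, something of this shape (table and column names are my assumptions, loosely modeled on CS-Cart’s cscart_orders):

```sql
-- Every run scans and groups the whole order table; the cost
-- grows with the order count, not with the size of the report.
SELECT DATE(FROM_UNIXTIME(o.timestamp)) AS day,
       COUNT(*)                         AS orders,
       SUM(o.total)                     AS revenue
FROM cscart_orders AS o
WHERE o.status IN ('C', 'P')   -- completed / processed (assumed codes)
GROUP BY day
ORDER BY day;
```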



I’d be interested to see what it does on just a normal view of a product. A fair test would be to:

  1. Clear the cart cache (see the note after this list).
  2. Refresh a product detail page, capturing the DB stats.
  3. Refresh again now that the cache is built, capturing the DB stats.
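For step 1, if memory serves, CS-Cart clears its cache when you append the cc parameter to an admin URL (this is from memory; verify against your version):

```
# Hypothetical example -- admin URL with the cache-clear parameter
http://www.example.com/admin.php?cc
```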

Result: can’t launch the sales report even with a 384M PHP memory_limit; that must be a bug.



Anyway, new test:

  1. Launch the profiler.
  2. Clear the cache (in my case, the mysql backend).
  3. Run ab -n 1000 -c 10 http://www.xxx.xx/xxxx/xxx/xxx.html



This is the result:

```
Concurrency Level:      10
Time taken for tests:   98.613 seconds
Complete requests:      1000
Failed requests:        0
Write errors:           0
Total transferred:      62889000 bytes
HTML transferred:       62396000 bytes
Requests per second:    10.14 [#/sec] (mean)
Time per request:       986.133 [ms] (mean)
Time per request:       98.613 [ms] (mean, across all concurrent requests)
Transfer rate:          622.79 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.0      0       1
Processing:   292  983 1197.2    875   25570
Waiting:      141  598 297.3    578    4622
Total:        292  983 1197.2    875   25571

Percentage of the requests served within a certain time (ms)
  50%    875
  66%    988
  75%   1059
  80%   1096
  90%   1229
  95%   1347
  98%   1543
  99%   5691
 100%  25571 (longest request)
```
And this is how it looks:

![](upload://6I6PuuEmauXO02up6hgyYXrei26.jpeg)

So 98% of requests are served within about 1.5 seconds?

Seems like you have configured the cache to live in the same database you’re caching! That’s one reason mysql should not be used as the cache backend.
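Switching the backend should be a one-line change; on the CS-Cart versions I’ve seen, it lives in config.local.php (key name from memory, check your version):

```php
// config.local.php -- cache backend selection (key name assumed;
// 'file' or 'sqlite' keeps the cache out of the MySQL server)
$config['cache_backend'] = 'sqlite';
```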



You also seem to be using an add-on that is DB-intensive: the table tx_realurl_urlencodecache is not part of the standard product.



Yes, CS-Cart does do a LOT of JOINs. Most are pretty well indexed, so they should not require table scans, but I’m assuming this would be an area to drill into.
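A quick way to drill in is MySQL’s EXPLAIN, which shows per table whether an index is used (type ref/eq_ref) or a full scan happens (type ALL). The query below is an illustrative CS-Cart-style join, not one captured from the cart:

```sql
-- Illustrative join against assumed CS-Cart feature tables;
-- watch the 'type' and 'key' columns in the EXPLAIN output.
EXPLAIN
SELECT p.product_id, fv.feature_id, fv.value
FROM cscart_products AS p
JOIN cscart_product_features_values AS fv
  ON fv.product_id = p.product_id
WHERE p.product_id = 42;
```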



Surprised that sessions are so heavy. Probably because you cleared the cache. It would be interesting to see this run:

  1. Turn off add-ons that are not part of the standard distribution (apples to apples).
  2. Change the caching method to sqlite or files.
  3. Load a page with features to build the cache.
  4. Start your profiler and use another product detail page (the slow query log sketched below can run alongside it).
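As a supplement to the profiler in step 4, MySQL’s slow query log catches the worst statements (the 1-second threshold is just a starting point; on MySQL versions before 5.1, set this in my.cnf instead of at runtime):

```sql
-- Log every statement slower than 1 second to the slow query log.
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
```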