70% memory used - VPS

Hi all, I am a little amazed, and also worried, about our site! I have just migrated to a 4 x Xeon i7 VPS with 256MB dedicated and 512MB burst RAM. We have about 5-10 people on the site at any one time, 866 products, and yslurp (Yahoo's Slurp crawler) constantly crawling the site.



According to Google, the site is 74% slower than other sites online (obviously not accurate at all, but still not nice to read), and memory usage sits constantly at 74-76%.



I have read through the forums: gzip compression is enabled, Smart Optimizer is installed, and KeepAlive is not on.



I would LOVE any further advice on what else I might be able to do. My ini_set (PHP memory_limit) is currently at 128MB.
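
For anyone wondering, by ini_set I mean the PHP memory limit, set along these lines (where exactly it lives depends on the cart and hosting setup, so take this as a sketch rather than my exact config):

ini_set('memory_limit', '128M'); // per-script PHP memory limit override

or the equivalent memory_limit = 128M line in php.ini.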



Regards

[quote name=‘AmitP’]Hi all, I am a little amazed, and also worried, about our site! I have just migrated to a 4 x Xeon i7 VPS with 256MB dedicated and 512MB burst RAM. We have about 5-10 people on the site at any one time, 866 products, and yslurp (Yahoo's Slurp crawler) constantly crawling the site.



According to Google, the site is 74% slower than other sites online (obviously not accurate at all, but still not nice to read), and memory usage sits constantly at 74-76%.



I have read through the forums: gzip compression is enabled, Smart Optimizer is installed, and KeepAlive is not on.



I would LOVE any further advice on what else I might be able to do. My ini_set (PHP memory_limit) is currently at 128MB.



Regards[/quote]

Hi,



Be aware that cPanel and other server applications will consume at least 128MB of the RAM.

Can more RAM be assigned to a VPS account? When I was purchasing the VPS, the sales staff were concerned enough to ask whether or not I would have enough RAM. If I were to get 512MB with a 1GB burst, would that vastly improve performance? I just have visions of the site getting more traffic and grinding to a halt.



I also realise that server and website performance matter to Google, and they do like faster-serving websites.

[quote name=‘AmitP’]Can more RAM be assigned to a VPS account? When I was purchasing the VPS, the sales staff were concerned enough to ask whether or not I would have enough RAM. If I were to get 512MB with a 1GB burst, would that vastly improve performance? I just have visions of the site getting more traffic and grinding to a halt.



I also realise that server and website performance matter to Google, and they do like faster-serving websites.[/quote]

Sure, you can upgrade your package to a higher VPS tier. The staff can't know in advance whether, for example, 256MB will be enough. Some carts work well on a shared account; some don't once the traffic and number of products increase. It's different for each account (site).

If you’re really having 5-10 concurrent visitors and yslurp is not being throttled, I would use the following to calculate your needed RAM:

Overhead = 512;

Users = 5;

Mem_size = 128;

RAM = Overhead + (Mem_size * Users);
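
Plugging in the example values above, that works out to:

RAM = 512 + (128 * 5) = 1152MB

which is well above a 256MB dedicated / 512MB burst plan.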



This would be a minimum.



Also, be sure to put an entry in your robots.txt of:

Crawl-delay: 2.0

Request-rate: 30



The first tells yslurp not to request more than 1 page every 2 seconds (i.e. to wait 2 seconds between requests).

The second tells Google and others not to request more than 30 pages/minute (or it could be one page every 30 seconds, I forget).
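
For reference, a minimal robots.txt sketch with those directives in place (the User-agent grouping is my own example; adjust it to target specific bots, and keep in mind that Crawl-delay and Request-rate are non-standard directives that not every crawler honours):

User-agent: *
Crawl-delay: 2.0
Request-rate: 30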

Thank you for this,



I have set the request-rate to 1/30m.



Regards

Hello Tbirnseth,


[quote name=‘tbirnseth’]If you’re really having 5-10 concurrent visitors and yslurp is not being throttled, I would use the following to calculate your needed RAM:

Overhead = 512;

Users = 5;

Mem_size = 128;

RAM = Overhead + (Mem_size * Users);



This would be a minimum.



Also, be sure to put an entry in your robots.txt of:

Crawl-delay: 2.0

Request-rate: 30



The first tells yslurp not to request more than 1 page every 2 seconds (i.e. to wait 2 seconds between requests).

The second tells Google and others not to request more than 30 pages/minute (or it could be one page every 30 seconds, I forget).[/QUOTE]



Thank you for the info!



Very interesting,





Lee Li Pop

Regardless of the crawl rate, I would have to say you have other issues. I am on shared hosting with bots crawling pages every second, sometimes multiple pages at the same time, and I have not experienced server load or memory issues.