My website is using over 70% CPU on shared hosting with FastComet. I can't afford a VPS, so can someone advise whether I can run my website on a home server or a NAS such as a D-Link or QNAP? I have 100 Mbps broadband and only get about 100 visitors per day. Thanks.
I use FastComet's shared hosting, get about the same number of visitors per day as you, and don't have a problem with that setup. Where did the 70% CPU usage figure come from? 70% sounds excessive, but I don't know how to test that.
A 100 Mbps residential connection is usually around 100 Mbps down but only about 20 Mbps up, and the upload speed is what determines how fast your site will feel to visitors. You can check both here: http://www.speedtest.net/. Mine shows 95 Mbps down and 19 Mbps up.
I only recently started seeing over 70% CPU usage, and I haven't changed anything. I have noticed a very high number of hits, though. I have blocked all of these IPs; hopefully that will resolve the issue.
Unknown robot (identified by 'bot' followed by a space or one of the following characters _+:,.;/\-)

| IP address     | Pages  | Hits   | Bandwidth  | Last visit          |
|----------------|--------|--------|------------|---------------------|
| 188.143.232.70 | 17,210 | 17,210 | 3.88 GB    | 23 Jun 2016 - 07:00 |
| 188.143.232.72 | 10,986 | 10,986 | 1.74 GB    | 23 Jun 2016 - 07:06 |
| 188.143.232.40 | 7,303  | 7,303  | 1.18 GB    | 20 Jun 2016 - 09:12 |
| 188.143.232.22 | 7,058  | 7,058  | 1.14 GB    | 20 Jun 2016 - 09:06 |
| 188.143.232.34 | 7,053  | 7,053  | 1.13 GB    | 23 Jun 2016 - 06:49 |
| 188.143.232.62 | 7,013  | 7,013  | 1.13 GB    | 23 Jun 2016 - 06:51 |
| 188.143.232.13 | 6,404  | 6,404  | 1.03 GB    | 23 Jun 2016 - 06:54 |
| 188.143.232.21 | 6,180  | 6,180  | 1016.99 MB | 23 Jun 2016 - 06:33 |
| 188.143.232.24 | 6,154  | 6,154  | 1014.15 MB | 23 Jun 2016 - 06:31 |
| 188.143.232.19 | 6,118  | 6,118  | 1006.35 MB | 23 Jun 2016 - 06:30 |
| 188.143.232.35 | 6,093  | 6,093  | 1002.36 MB | 23 Jun 2016 - 06:30 |
| 188.143.232.15 | 6,081  | 6,081  | 999.99 MB  | 23 Jun 2016 - 06:32 |
| 188.143.232.11 | 6,079  | 6,079  | 1000.52 MB | 23 Jun 2016 - 06:29 |
| 188.143.232.43 | 6,076  | 6,076  | 1001.64 MB | 23 Jun 2016 - 06:30 |
| 188.143.232.41 | 6,038  | 6,038  | 993.80 MB  | 23 Jun 2016 - 06:32 |
| 188.143.232.37 | 6,020  | 6,020  | 988.59 MB  | 23 Jun 2016 - 06:32 |
| 188.143.232.14 | 5,993  | 5,993  | 986.99 MB  | 20 Jun 2016 - 09:13 |
| 188.143.232.16 | 5,971  | 5,971  | 983.87 MB  | 23 Jun 2016 - 06:32 |
| 188.143.232.26 | 5,964  | 5,964  | 981.01 MB  | 23 Jun 2016 - 06:32 |
| 188.143.232.10 | 5,804  | 5,804  | 952.25 MB  | 23 Jun 2016 - 06:32 |
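Since every one of those IPs is in the 188.143.232.x range, it may be simpler to block the whole /24 rather than each address individually. A minimal sketch for an Apache `.htaccess` file, assuming your shared host runs Apache 2.4 and allows `.htaccess` overrides (check with your provider if unsure):

```
# Block the 188.143.232.0/24 range seen in the AWStats report above,
# while still allowing everyone else.
<RequireAll>
    Require all granted
    Require not ip 188.143.232
</RequireAll>
```

On older Apache 2.2 hosts the equivalent would be `Order Allow,Deny` / `Allow from all` / `Deny from 188.143.232`. Blocking at this level stops the requests before your site's PHP runs, which is what actually saves CPU.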
Search engine spiders "crawl" your website. If this occurs too frequently, performance may be affected. Please make sure that your robots.txt file contains a Crawl-delay setting of 30 seconds or higher. For example:
User-agent: *
Crawl-delay: 30
I also recommend setting up connection and request limits on your web server to prevent CPU overload. If you use nginx, you can do this easily with the limit_req and limit_conn directives. If you don't know how, ask your hosting provider to do it for you.
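For anyone who does have access to their nginx config, a minimal sketch of what that rate limiting could look like (the zone names and the 2 req/s, burst, and connection limits are illustrative values to tune for your traffic):

```
http {
    # One shared-memory zone keyed by client IP: 10 MB of state,
    # allowing an average of 2 requests per second per IP.
    limit_req_zone $binary_remote_addr zone=per_ip:10m rate=2r/s;

    # A second zone for tracking concurrent connections per IP.
    limit_conn_zone $binary_remote_addr zone=per_ip_conn:10m;

    server {
        location / {
            # Permit short bursts of up to 5 extra requests,
            # then reject the excess instead of running them.
            limit_req zone=per_ip burst=5;

            # At most 10 simultaneous connections from any one IP.
            limit_conn per_ip_conn 10;
        }
    }
}
```

Well-behaved visitors never notice limits like these, but a scraper hammering the site from one address gets error responses instead of consuming PHP/CPU time.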
I enabled Cloudflare, and the CPU load has been below 25% since then.