Backing Up & Monitoring

I saw the site monitoring threads and will check those out.



Does anyone use a third-party tool for site backups? My host only backs up once a week, which scares me, plus I suspect they only keep one version. I can back up through the control panel, and of course the database backup built into CS-Cart is nice too, but if I have to do it manually it ain't gonna happen. I'd prefer something automatic that runs every day.



It would be -really- great if one tool could handle the monitoring and backups, just to make my life easier and happier.



Or if there is a tool within CS-Cart that I have not discovered yet, let me know!

New host? Backing up once a week would be a disaster if it was ever needed.

Your site and DB should be backed up daily by your hosting company.

[quote name='tbirnseth' timestamp='1308875681' post='115638']

New host? Backing up once a week would be a disaster if it was ever needed.

Your site and DB should be backed up daily by your hosting company.

[/quote]

Yup, it's HostGator, which otherwise I am happy with. I was surprised by that because I have cheaper hosting plans that are backed up daily or every other day, so honestly I didn't even think to look into it before I switched. I took it as a given, which was a mistake.

Since you have a VPS, I believe there is some way to do a cron job that does it automatically for you. I've just not figured this one out yet.

cd /home/of/your/store; tar -czf /your/safe/place/store_backup.tgz .
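
If you wanted cron to run that nightly, the crontab entry might look something like this (the 3am time and the paths here are just placeholders):

# minute hour day-of-month month day-of-week  command
0 3 * * * cd /home/of/your/store && tar -czf /your/safe/place/store_backup.tgz .

Note that a fixed filename like that overwrites the previous night's archive; the date-stamped version later in this thread avoids that.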

HostGator will not take a backup once you exceed 25GB of space, as happened with my Baby plan. :huh:

[quote name='tbirnseth' timestamp='1308888322' post='115653']

cd /home/of/your/store; tar -czf /your/safe/place/store_backup.tgz .

[/quote]



This is why I still have not figured the "cron" thing out yet. The "cron" world is still lost on me, and I just do not know how or where to use it.

[quote name='clips' timestamp='1308926205' post='115694']

This is why I still have not figured the "cron" thing out yet. The "cron" world is still lost on me, and I just do not know how or where to use it.

[/quote]

Me too! I’m fairly nerdy but Cron is a big mystery. :unsure:

Just about any question has a video answer on YouTube these days. I use them on a daily basis, which has taught me lots that I otherwise would never have known.



Search for "cPanel cron job" and you will find lots of video tutorials. Typically I pick the video with the most views or the most recent one and go with it. These tend to be the most relevant.



Stu

WHM → Backup → configure it → done



(I have mine set so it backs up my VPS every night at 3am and copies the backup onto my other VPS at a different host, and the other way around as well.)



Also, on one VPS I pay 10 euro a month so that the host backs up to an outside location daily.

If you have a VPS then you should have the administrative skills to manage it. If you don't, then you need to hire someone who does. That's why I strongly discourage merchants without Linux/Unix administration experience from going the VPS route. In most cases it is far overkill. Not to repeat myself, but a VPS is still a "share" of a machine. If the host overloads it, then you'll get less than you would with a shared virtual server.



Since most VPS environments are canned profiles (can be easily recreated by the host), it shouldn't be necessary to back up the whole VPS more than once a month or so. But your store and DB should be backed up daily.

Oh thanks, I see the backup option under WHM. When I checked with HostGator they didn't mention that. I will check it out. I totally agree that I am in over my head a little with VPS.



Also, I am looking into outside backup services. With the news being what it has been lately, I think trusting my hosts to back up is a really bad idea. So far whenever I needed to restore a backup with a host it's been no problem, but times are different now.

I have created a bash script that backs up the cs-cart site and then sends it via FTP to a backup server at home. It is started via cron, which also sends an email to me when done. I am still on my own for backing up the database, which I just do with the tools in the admin; that database backup then gets included in the files this script archives. I try to do the database backup daily and run this script weekly.

I have tried to document the script.


#!/bin/bash
# a bash script needs to start off with a shebang line like the one above; it can change a bit from machine to machine
cd /home/were-ever/public_html  # change to the base dir of cs-cart
f=backup$(date +%d-%m-%Y_%H%M).tgz  # assign "f" the name of the backup file. Mine uses the word backup plus the date and time, so the name is always different and never overwrites a previous backup.

bakdir=/home/were-ever/site_backups/  # assign "bakdir" the path where I want the backup file created. Mine is above the website directory, so as not to create backups of other backups.
b=$bakdir$f  # put together "bakdir" and "f" from above
echo $f  # show the string "f"; reported in the email from the cron job
tar -cvzf $b -X ~/public_html/exclude.files . > /dev/null  # create the backup file, using the exclude file to leave out some temp directories
echo "backup" $b "is done"  # tell me the backup is done; reported in the email from the cron job
c=/home/were-ever/site_backups/backup$(date +%d-%m-%Y)*  # assign "c" the path and filename of that day's backups, without the hour and minutes, with an * to catch all of that day's backup files
ls -ogh $c | awk '{print ("\n") $7, $3, $4, $5, $6}'  # sorts the list of files from the line above into a readable format for me; reported in the email from the cron job
echo -e "\n ftp has started \n "  # show that I'm about to start the ftp of the file to my off-site server (it happens to be in my basement); reported in the email from the cron job
# everything between "<< cmd" and the closing "cmd" line below is passed straight to the ftp client:
#   user  - passes the username and password
#   lcd   - changes the local working directory on the website server
#   cd    - changes the directory on the remote server
#   put   - uploads the current backup file assigned to "f" above
#   dir   - prints a list of files on the remote backup machine; reported in the email from the cron job
#   quit  - quits the ftp program
ftp -n "ftp_domain_name_of_off_site_server" << cmd
user "ftp_username" "ftp_password"
lcd ~/site_backups
cd /jdnabackups
put $f
dir
quit
cmd




And here is the email I receive:


backup15-10-2012_0001.tgz
backup /home/site_backups/backup15-10-2012_0001.tgz is done
/home/site_backups/backup15-10-2012_0001.tgz 343M Oct 15 00:07
ftp has started

Local directory now /home/site_backups
drwxr-xr-x 2 5004 client1 4096 Oct 14 23:08 .
drwxr-xr-x 32 5004 client1 4096 Sep 18 16:45 ..
-rw-r--r-- 1 5004 client1 360251858 Sep 30 23:11 backup01-10-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324872380 Jul 1 23:10 backup02-07-2012_0001.tgz
-rw-r--r-- 1 5004 client1 350120123 Sep 2 23:11 backup03-09-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324422374 Jun 3 23:07 backup04-06-2012_0001.tgz
-rw-r--r-- 1 5004 client1 840377742 Mar 5 2012 backup05-03-2012_0001.tgz
-rw-r--r-- 1 5004 client1 341198243 Aug 5 23:15 backup06-08-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324386503 May 6 23:08 backup07-05-2012_0001.tgz
-rw-r--r-- 1 5004 client1 358772249 Oct 7 23:22 backup08-10-2012_0001.tgz
-rw-r--r-- 1 5004 client1 358314884 Apr 8 2012 backup09-04-2012_0001.tgz
-rw-r--r-- 1 5004 client1 325057330 Jul 8 23:08 backup09-07-2012_0001.tgz
-rw-r--r-- 1 5004 client1 823128895 Feb 10 2012 backup10-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 357640655 Sep 9 23:19 backup10-09-2012_0001.tgz
-rw-r--r-- 1 5004 client1 823158139 Feb 11 2012 backup11-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324476414 Jun 10 23:06 backup11-06-2012_0001.tgz
-rw-r--r-- 1 5004 client1 823219587 Feb 12 2012 backup12-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 840727289 Mar 11 2012 backup12-03-2012_0001.tgz
-rw-r--r-- 1 5004 client1 823240712 Feb 13 2012 backup13-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 830494518 Feb 13 2012 backup13-02-2012_1235.tgz
-rw-r--r-- 1 5004 client1 341665356 Aug 12 23:20 backup13-08-2012_0001.tgz
-rw-r--r-- 1 5004 client1 830522750 Feb 14 2012 backup14-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324493573 May 13 23:07 backup14-05-2012_0001.tgz
-rw-r--r-- 1 5004 client1 830559073 Feb 15 2012 backup15-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 359247192 Oct 14 23:09 backup15-10-2012_0001.tgz
-rw-r--r-- 1 5004 client1 831510094 Feb 16 2012 backup16-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 358495264 Apr 15 2012 backup16-04-2012_0001.tgz
-rw-r--r-- 1 5004 client1 325539055 Jul 15 23:20 backup16-07-2012_0001.tgz
-rw-r--r-- 1 5004 client1 831513268 Feb 17 2012 backup17-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 358770695 Sep 16 23:15 backup17-09-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324522258 Jun 17 23:08 backup18-06-2012_0001.tgz
-rw-r--r-- 1 5004 client1 840188195 Mar 18 2012 backup19-03-2012_0001.tgz
-rw-r--r-- 1 5004 client1 831595875 Feb 20 2012 backup20-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 341640229 Aug 19 23:12 backup20-08-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324403117 May 20 23:05 backup21-05-2012_0001.tgz
-rw-r--r-- 1 5004 client1 358964010 Apr 22 23:07 backup23-04-2012_0001.tgz
-rw-r--r-- 1 5004 client1 327338891 Jul 22 23:10 backup23-07-2012_0001.tgz
-rw-r--r-- 1 5004 client1 360002438 Sep 23 23:31 backup24-09-2012_0001.tgz
-rw-r--r-- 1 5004 client1 324813239 Jun 24 23:07 backup25-06-2012_0001.tgz
-rw-r--r-- 1 5004 client1 348247412 Mar 25 2012 backup26-03-2012_0001.tgz
-rw-r--r-- 1 5004 client1 831377509 Feb 27 2012 backup27-02-2012_0001.tgz
-rw-r--r-- 1 5004 client1 349925130 Aug 26 23:24 backup27-08-2012_0001.tgz
-rw-r--r-- 1 5004 client1 356455187 Mar 28 2012 backup28-03-2012_1509.tgz
-rw-r--r-- 1 5004 client1 324379423 May 27 23:05 backup28-05-2012_0001.tgz
-rw-r--r-- 1 5004 client1 360311261 Apr 29 23:23 backup30-04-2012_0001.tgz
-rw-r--r-- 1 5004 client1 327020895 Jul 29 23:21 backup30-07-2012_0001.tgz
-rw-r--r-- 1 5004 client1 68646104 Apr 18 08:02 survey_backup18-04-2012_0902.tgz
-rw-r--r-- 1 5004 client1 68646104 Apr 18 08:04 survey_backup18-04-2012_0904.tgz




It may not be pretty, but it seems to do the job for me.

Hope this helps,

Dave

Add this to your script for backing up the DB. This assumes you have a my.cnf file that sets the DB user and password so you don't have them sitting on the command line for someone to grab from ps.

```bash
db_name="whatever your db name is"
db_bkup_file=$(echo $f | sed 's;tgz;sql.gz;')
mysqldump $db_name | gzip > $db_bkup_file
```
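
If you don't already have one, the my.cnf piece could be a ~/.my.cnf readable only by you. A minimal sketch, assuming the user and password values below are placeholders for your own credentials:

```bash
# create ~/.my.cnf so mysqldump picks up the credentials automatically
cat > ~/.my.cnf <<'EOF'
[client]
user=your_db_user
password=your_db_password
EOF
chmod 600 ~/.my.cnf   # keep the credentials readable only by you
```

mysqldump reads the [client] group from this file, so it no longer needs the user and password on the command line.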

After hearing a horror story from someone whose host just disappeared, backups, website and all, I started looking for a third-party / offsite backup provider again. I used Amazon storage before but kept running into problems with cancelled backups, etc…
In short: I stumbled upon Backupsheep (https://backupsheep.com/), set it up (7 months ago now) and have not looked back. So for everyone who is still not backing up their website or is relying on their host's backup… go check it out. I set it up in minutes and sleep better now :slight_smile:

FYI, I back up my complete website each month, and the database daily.


While there are a lot of backup solutions available (including the built-in CS-Cart backup), we wrote our own small bash script to make backups quickly.

#!/bin/sh
DATA=`date +%Y%m%d_%H_%M_%S`   # timestamp used to name the backup files
SERVER='your_server'
DB='database_name'
USER='database_user'
PASS='database_password'
# archive the whole store (one level up), skipping caches, thumbnails, old backups and dev directories
tar --exclude='../var/cache' --exclude='../images/thumbnails' --exclude='../backup' --exclude='../old' --exclude='../dev' -zcf ./$DATA.tgz ../
# dump the database next to the archive
mysqldump --skip-lock-tables -h $SERVER -u $USER -p"$PASS" $DB > ./$DATA.sql

You should put this script in the /backup directory (on the same level as the /app or /design directory), replace the credentials with those from your store, and launch the script. It packs both the files and the database of the store, ignoring some directories which are usually not necessary and often take a lot of space.
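
As a usage sketch, assuming you saved it as backup.sh (the filename and path below are assumptions), launching it would look like this:

cd /path/to/your/store/backup   # the /backup directory inside the store root
chmod +x backup.sh              # make the script executable
./backup.sh                     # leaves a dated .tgz of the files and a .sql dump next to it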


But this backs up on your own server? That doesn’t help a lot with host problems or a hardware failure :slight_smile:

Flow - after the backup is prepared, you can transfer it anywhere via FTP or SFTP.
We have been using this solution for a long time and it has saved our "ass" many times :wink:
This is a great way to make a quick backup if, for example, you are about to start making changes to the software.
In addition, there is a second script that quickly sets the store up in a different location from the files prepared earlier.
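
As a rough sketch of that transfer step, a line like this could be appended to the script above to push both files off-site over SSH; the user, host and remote path are placeholders:

# copy the archive and the SQL dump to an off-site machine
scp ./$DATA.tgz ./$DATA.sql backupuser@offsite.example.com:/backups/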

Best regards
Robert


Maybe the following module can help you?


Ah yes, of course… this script is perfect for then uploading to a service like Backupsheep.
