How to speed up this backup script.

On 07.12.2005 15:08:46 by Bart Kastermans

Below is the script I use for making backups. I am on a rather slow
connection, and even when there is nothing new to back up the script
takes 20 minutes. That is partly due to the slow connection, of course,
but there is also a lot of authentication going on. Any suggestions on
how to speed things up?

Best,
Bart
http://www.kastermans.nl/bart/

************** the script ***************************
#!/bin/bash
# script to take care of backup of files
#
# This script does not protect against accidental deletions that are
# not discovered before a hard disk crash (the older local backups are
# then lost, and the uploaded copies mirror the deletion of the file).
# This could be fixed by removing --delete; the reason for keeping it
# is to conserve some space, and weeding old files out of the backups
# by hand would be a terrible hassle (removing them from between
# still-interesting files).
#
# Even then it would not protect against accidental changes. That is
# what subversion is for, though.
#
# Note that the VU files are backed up by the VU itself, so there is
# some protection coming from that. It would be somewhat complicated
# to make use of, though.
#
# The Music and Picture files do not get this protection.

echo running .ssh/ssh_prime
/home/bkasterm/.ssh/ssh_prime
source /home/bkasterm/.ssh-agent-info-bkastermpc

echo Starting backup >> .backuplog
date >> .backuplog
echo
echo Starting backup to VU
echo
echo >> .backuplog
echo Starting backup to VU >> .backuplog

# Files in ToVu.txt
for i in `cat /home/bkasterm/bin/backuphelpfiles/ToVu.txt`
do
echo rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i
echo rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i >> .backuplog
rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i >> .backuplog
done

echo
echo Backup to VU done
echo Backup to VU done >> .backuplog
echo
echo Starting backup to Office Mac
echo Starting backup to Office Mac >> .backuplog
echo


# Files in ToOfficeMac.txt
for i in `cat /home/bkasterm/bin/backuphelpfiles/ToOfficeMac.txt`
do
echo rsync -avz --delete /home/bkasterm/$i bkastermmac.math.lsa.umich.edu:laptop-backup/$i
echo rsync -avz --delete /home/bkasterm/$i bkastermmac.math.lsa.umich.edu:laptop-backup/$i >> .backuplog
rsync -avz --delete /home/bkasterm/$i bkastermmac.math.lsa.umich.edu:laptop-backup/$i >> .backuplog
done

echo >> .backuplog
echo Backup to Office Mac done
echo Backup to Office Mac done >> .backuplog
echo >> .backuplog

echo Starting local backup
echo Starting local backup >> .backuplog

dir=$(date +%Y%W)

if test ! -e /home/bkasterm/Backup/$dir
then
mkdir /home/bkasterm/Backup/$dir
echo made weekly directory $dir
echo made weekly directory $dir >> .backuplog
fi

for i in `cat /home/bkasterm/bin/backuphelpfiles/Local.txt`
do
echo rsync -avz --delete /home/bkasterm/$i /home/bkasterm/Backup/$dir/$i
echo rsync -avz --delete /home/bkasterm/$i /home/bkasterm/Backup/$dir/$i >> .backuplog
rsync -avz --delete /home/bkasterm/$i /home/bkasterm/Backup/$dir/$i >> .backuplog
done

echo local backup done
echo local backup done >> .backuplog

echo Complete backup done
echo Complete backup done >> .backuplog
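
As an aside on the script as posted: every message is written twice,
once to the screen and once to .backuplog. A small helper function
would halve those lines; a minimal sketch against the same .backuplog:

# log: print a message and append the same line to .backuplog
log () {
    echo "$@"
    echo "$@" >> .backuplog
}

log Starting backup to VU
# inside the loops:
log rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i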

Re: How to speed up this backup script.

On 07.12.2005 20:57:47 by nospam

One method:

Schedule a job that runs locally and generates a file containing the
filenames to be backed up (include a logfile that you can parse on the
other end, to let you know of any issues in the running of this script).

Then, when you subsequently log in, you can simply iterate through the
list of files to be copied.

Should you need it to be faster still, modify the modified-file-list
idea to create a compressed tarball containing the files to be
transferred, so that you have only one compressed file to transfer.
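
A minimal sketch of that tarball variant, assuming GNU tar and a
hypothetical list file changed-files.txt of paths relative to the home
directory (hostname and remote directory taken from the script):

#!/bin/bash
# Pack everything on the (hypothetical) list into one compressed
# tarball, copy it in a single authenticated transfer, unpack remotely.
cd /home/bkasterm || exit 1
tar czf /tmp/backup.tar.gz -T changed-files.txt
scp /tmp/backup.tar.gz flits.cs.vu.nl:laptop-backup/
ssh flits.cs.vu.nl 'cd laptop-backup && tar xzf backup.tar.gz && rm backup.tar.gz'
rm /tmp/backup.tar.gz

This costs two authentications (scp and ssh) instead of one per entry
in the file list.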

Re: How to speed up this backup script.

On 08.12.2005 03:31:50 by Bart Kastermans

I have "static" files with directories and files to be backed up. So
no need to generate such a thing.

The one compressed file to transfer would not be quicker because
as I understand it rsync works with diffs. If a big file changes in a
small
way then only the small change is transferred. And for files that have
not changed, this is noted, and nothing is done about the file.

The only speed increase I can currently imagine is to remove the
repeated authentication. You suggest this is somehow possible
by "when you subsequently login you can simply iterate through the
files to be copied list". I have no idea how to achieve this though.
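
One way to get rid of the repeated authentication is OpenSSH connection
sharing: with ControlMaster enabled, the first connection to a host
authenticates, and later ssh and rsync runs reuse its socket. A minimal
sketch, assuming an OpenSSH recent enough to support ControlMaster,
with the hostnames from the script:

# in ~/.ssh/config
Host flits.cs.vu.nl bkastermmac.math.lsa.umich.edu
    ControlMaster auto
    ControlPath ~/.ssh/control-%r@%h-%p

# in the backup script: authenticate once per host, up front
ssh -fN flits.cs.vu.nl     # -f: background after auth, -N: no remote command
# ... all rsync runs to flits.cs.vu.nl now reuse the master socket ...
ssh -O exit flits.cs.vu.nl # close the master (newer OpenSSH; otherwise kill the ssh)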

Best,
Bart

Re: How to speed up this backup script.

On 08.12.2005 19:12:48 by eedmit

In article <1133964526.470001.282490@g14g2000cwa.googlegroups.com>, "Bart Kastermans" writes:
> Below is the script I use for making backups. I am on a rather slow
> connection, and even when there is nothing new to back up the script
> takes 20 minutes. That is partly due to the slow connection, of course,
> but there is also a lot of authentication going on. Any suggestions on
> how to speed things up?
>
> Best,
> Bart
> http://www.kastermans.nl/bart/
>
> ************** the script ***************************
> #!/bin/bash
> # script to take care of backup of files
> #
> # This script does not protect against accidental deletions that are
> # not discovered before a hard disk crash (the older local backups are
> # then lost, and the uploaded copies mirror the deletion of the file).
> # This could be fixed by removing --delete; the reason for keeping it
> # is to conserve some space, and weeding old files out of the backups
> # by hand would be a terrible hassle (removing them from between
> # still-interesting files).
> #
> # Even then it would not protect against accidental changes. That is
> # what subversion is for, though.
> #
> # Note that the VU files are backed up by the VU itself, so there is
> # some protection coming from that. It would be somewhat complicated
> # to make use of, though.
> #
> # The Music and Picture files do not get this protection.
>
> echo running .ssh/ssh_prime
> /home/bkasterm/.ssh/ssh_prime
> source /home/bkasterm/.ssh-agent-info-bkastermpc
>
> echo Starting backup >> .backuplog
> date >> .backuplog
> echo
> echo Starting backup to VU
> echo
> echo >> .backuplog
> echo Starting backup to VU >> .backuplog
>
> # Files in ToVu.txt
> for i in `cat /home/bkasterm/bin/backuphelpfiles/ToVu.txt`
> do
> echo rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i
> echo rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i >> .backuplog
> rsync -avz --delete /home/bkasterm/$i flits.cs.vu.nl:laptop-backup/$i >> .backuplog
> done
>
> echo
> echo Backup to VU done
> echo Backup to VU done >> .backuplog
> echo
....

Have *one* rsync with an exclude file!

command="rsync -avz --delete \
--exclude-from=/home/bkasterm/bin/backuphelpfiles/NotToVu.txt \
/home/bkasterm flits.cs.vu.nl:laptop-backup"

echo "$command" >> .backuplog
eval "$command"
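
For illustration, the exclude file might look something like this
(hypothetical contents; one pattern per line, lines starting with '#'
are comments, and anything not matched by a pattern is transferred):

# /home/bkasterm/bin/backuphelpfiles/NotToVu.txt (hypothetical contents)
Backup/
.cache/
tmp/
*.tmp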

--
Michael Tosch @ hp : com

Re: How to speed up this backup script.

On 09.12.2005 12:00:13 by Bart Kastermans

That is an excellent option. Not only does it answer my question
perfectly, it also makes the script more robust: excluding the things
that need not be backed up means that new directories and files get
backed up automatically unless specifically excluded.

Best,
Bart

Re: How to speed up this backup script.

On 09.12.2005 14:36:53 by cfajohnson

On 2005-12-09, Bart Kastermans wrote:
> That is an excellent option.

What is? Please read .

> Not only does it answer my question
> perfectly, it also makes the script more robust: excluding the things
> that need not be backed up means that new directories and files get
> backed up automatically unless specifically excluded.


--
Chris F.A. Johnson, author |
Shell Scripting Recipes: | My code in this post, if any,
A Problem-Solution Approach | is released under the
2005, Apress | GNU General Public Licence