transferring files (curl, wget, ...) but check local existence first
on 26.09.2007 17:17:51 by spluque

Hi,
I'm planning a small bash script to regularly download files from an ftp
site. The files are created daily in the ftp site and have the date
included in their name (yymmdd.txt). I'm thinking of having an option to
specify the beginning date for the downloads, so that one can download the
files starting at any date through today. The two utilities I'm somewhat
familiar with for doing this are curl and wget, but neither of them has an
option to avoid overwriting files that already exist locally. Therefore, I'd
appreciate some input as to how best to proceed. Are there other tools
that allow for this proviso? If not, I'm thinking of a while loop:
---<---------------cut here---------------start-------------->---
# ${BEG} and ${TODAY} hold yymmdd strings for the starting date and for
# today's date, respectively.
DATE_COUNTER="${BEG}"
while [ "${DATE_COUNTER}" -lt "${TODAY}" ]; do
if [ ! -f "${DATE_COUNTER}.txt" -a -r "${DATE_COUNTER}.txt" ]; then
curl ftp://url.com -O
else
echo "${DATE_COUNTER}.txt exists locally. Skipping."
fi
DATE_COUNTER=$(date -d "${DATE_COUNTER} + 1 days" +%y%m%d)
done
---<---------------cut here---------------end---------------->---
but this has the disadvantage of calling curl (or wget) once per file, which
may put some strain on the server. curl has a convenient wildcard syntax for
specifying several URLs in the same call, but without the overwrite check it
is not useful to me here. Is there a more efficient way of doing this,
perhaps along the lines of the rough sketch below? Thanks.
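
Untested sketch of what I mean (it assumes the files sit directly under
ftp://url.com/, that GNU date is available, and that bash arrays may be
used): the loop only checks local existence and collects arguments, and
curl is invoked once at the end with every missing file.
---<---------------cut here---------------start-------------->---
# Build the curl argument list: one "-O URL" pair per file missing locally.
args=()
DATE_COUNTER="${BEG}"
while [ "${DATE_COUNTER}" -lt "${TODAY}" ]; do
    if [ ! -f "${DATE_COUNTER}.txt" ]; then
        args+=(-O "ftp://url.com/${DATE_COUNTER}.txt")
    fi
    # Same yymmdd increment as above (assumes dates in 2000-2099).
    DATE_COUNTER=$(date -d "20${DATE_COUNTER} + 1 day" +%y%m%d)
done

# A single curl invocation fetches everything that is missing, so the
# server should see one session rather than one per file.
if [ "${#args[@]}" -gt 0 ]; then
    curl "${args[@]}"
fi
---<---------------cut here---------------end---------------->---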
--
Seb