I have a bash shell script that works perfectly for making backups. The problem is that large files cause trouble: the script has to compress files into tar.gz format, and it does, but at around 6 GB the compression is still running while execution already moves on to the next lines, so the backups come out faulty. I suspect the server needs something like set_time_limit.
The script:
#!/bin/bash
# Dump the database and compress it
MYSQLDUMP="$(which mysqldump)"
"$MYSQLDUMP" -u "$DBUSER" -h "$DBHOST" -p"$DBPASS" "$DBNAME" | gzip > "$TIMESTAMP.sql.gz"
# Stream a gzipped tar of $HOME from the remote host into a local file
ssh "$USER_SSH@$HOST_SSH" "tar -zcf - $HOME" > "$TIMESTAMP.backup.tar.gz"
# Bundle both archives together and record the exit status
tar -zcf "$TIMESTAMP.tar.gz" "$TIMESTAMP.backup.tar.gz" "$TIMESTAMP.sql.gz"
SUCCESS=$?
# Remove the intermediate files
rm "$TIMESTAMP.sql.gz"
rm "$TIMESTAMP.backup.tar.gz"
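Right now the two rm lines run no matter what, since $SUCCESS is captured but never checked. A minimal sketch of what I have been considering, deleting the intermediates only when the final tar exits cleanly (the check is my own idea, not something the script does today):

if [ "$SUCCESS" -eq 0 ]; then
    # tar exited cleanly, so it should be safe to drop the intermediate archives
    rm "$TIMESTAMP.sql.gz" "$TIMESTAMP.backup.tar.gz"
else
    # keep the pieces for inspection and report the failure
    echo "tar failed with status $SUCCESS, keeping intermediate files" >&2
fi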
I did not include the variable definitions because I don't think they are necessary.
I even call set_time_limit(0) before it finishes, but the script still removes the two files in the last lines before the archive is complete. If the file is smaller than roughly 6 or 7 GB, this does not happen.
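Since set_time_limit is a PHP function, I assume the script is launched from a PHP wrapper and that it is the wrapper's time limit that cuts things off. A minimal sketch, under that assumption: launch the backup detached with nohup so the caller's limit cannot interrupt it (the script path and log file below are placeholders, not my real paths):

# /path/to/backup.sh and the log location are hypothetical examples
nohup /path/to/backup.sh > /tmp/backup.log 2>&1 &

Would something like this, or checking $SUCCESS before the cleanup, be the right way to keep the backups intact for the large files?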