Syncing with Rclone
I have a ton of data spread across two different GSuite accounts, well into the tens of TBs. I need to pull it all back down without manually screwing with it, so I grabbed someone else's sync script and set it up in my local user's crontab.
If you have any issues setting up rclone, definitely check out my post:
Getting rclone Setup For Google Team Drive.
Below is the script, which I've saved as `rclone-sync.sh`:
```shell
#!/bin/bash
# This was pilfered from https://creedofman.com/scripting-tidbits-google-drive-and-rclone-backups/
# rclone flags:
# --stats=5s (print stats every 5 seconds; the default is 1 minute)
# -P (show progress)
# --max-transfer=[int]G (end the transfer after that many GB; for Google Drive, keep the script's total at 750GB or less)
# --transfers=[int] (number of parallel transfers; the default is 4)
# --exclude-if-present .rclone-ignore (if `.rclone-ignore` is present in a directory, skip the directory and its contents; `.rclone-ignore` can be any file)

echo "Checking for already running rclone"
if ! pgrep -x "rclone" > /dev/null; then
    rclone -P --stats=5s --max-transfer=100G sync config1:/ ~/storage/gdrive/config1/
    rclone -P --stats=5s --max-transfer=650G sync config2:/ ~/storage/gdrive/config2/
else
    echo "rclone already running, exiting"
fi
```
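The `pgrep` check works, but there is a small race window between the check and the launch. A sketch of an alternative using `flock(1)` to hold an exclusive lock instead; the lockfile path `/tmp/rclone-sync.lock` is my own choice, not from the original script:

```shell
#!/bin/bash
# flock -n tries to grab an exclusive lock on file descriptor 9 and
# fails immediately if another instance of the script already holds it.
(
    flock -n 9 || { echo "rclone already running, exiting"; exit 1; }
    echo "lock acquired, starting sync"
    # the two rclone sync commands from the script above would go here
) 9>/tmp/rclone-sync.lock
```

The subshell keeps fd 9 open for the duration of the sync, so the lock is released automatically when it exits, even if the script crashes.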
Now make it executable (`chmod +x rclone-sync.sh`) and run it daily at 4am by adding this line to your crontab (`crontab -e`):

```shell
0 4 * * * /home/livearchivist/storage/la/rclone-sync-script/rclone-sync.sh
```
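Cron runs the job silently, so failures can go unnoticed. A variant of the entry that appends both stdout and stderr to a log file; the log path here is my own choice, not from the original post:

```shell
0 4 * * * /home/livearchivist/storage/la/rclone-sync-script/rclone-sync.sh >> /home/livearchivist/storage/la/rclone-sync.log 2>&1
```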