I have about 5,000 Google accounts whose Google Drive files I'd like to download.
If I back up all of the accounts in a single run, one bad file crashes the backup for all of the remaining accounts. Plus, the run takes days.
Right now I have a Bash script that backs up the accounts one at a time, but runs 15 of them simultaneously. (I can actually max out our company's internet pipe this way.) However, I then wind up with 5,000 backup folders, each holding only one account, so I wrote another script that rsyncs all of the folders into one "combined" folder.
If I were to add a command-line parameter that makes the destination folder NOT add the timestamped subfolder, would this create issues? Since I'm using a ZFS filesystem with snapshotting, I don't really need the timestamped folder names.
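For reference, the "15 at a time, then merge" pattern above can be sketched with `xargs -P`. Everything here is a stand-in: `accounts.txt`, the `BACKUP_CMD` stub, and the folder names are hypothetical placeholders for whatever my real script invokes; only the parallelism and merge shape is the point.

```shell
#!/bin/sh
# BACKUP_CMD is a stub for the real per-account backup command.
BACKUP_CMD=${BACKUP_CMD:-'echo backing up'}

# Stand-in account list: one account per line.
printf 'alice\nbob\ncarol\n' > accounts.txt

mkdir -p backups combined

# -P 15: keep up to 15 backups in flight at once. A failing account is
# logged to failed.txt instead of aborting the remaining accounts.
xargs -P 15 -I ACCT < accounts.txt \
  sh -c "mkdir -p backups/ACCT && $BACKUP_CMD ACCT > backups/ACCT/log.txt || echo ACCT >> failed.txt"

# Afterwards, fold the per-account folders into the single combined tree.
rsync -a backups/ combined/
```

With a real backup command in `BACKUP_CMD`, each account lands in its own `backups/<account>/` folder, and the final rsync collapses them into `combined/` exactly as the two-script workflow does today.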
I'm trying to figure out if this is a bad thing.