Ever run into the problem of maxing out your cloud capacity with one provider (most likely on the free tier) and being left with no option but to upgrade because you can’t move your data?
Or needed to migrate data from a work account to a backup?
I got into somewhat of a fix recently when I had to transfer over 4TB of data from AWS to a Google Drive account (with infinite free storage, courtesy of my alma mater). To make matters worse, my broadband plan gives me only 4TB of metered usage, after which I’m downgraded to a slower speed (and we obviously don’t like slow networks).
The final solution I came up with didn’t drain my broadband quota and cost me next to nothing in time or money, which I think is a great outcome.
The trick is to run the transfer on a VPS provider like DigitalOcean, using tmux to resume sessions. This lets you use the VPS provider’s bandwidth instead of your home network’s!
Let’s get everything set up:
- Get a free DigitalOcean account using this link.
- Spin up a basic Ubuntu instance on DigitalOcean.
- Install rclone using the following command.
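The install itself is a one-liner; rclone ships a documented install script (worth skimming at https://rclone.org/install/ before piping it to bash):

```bash
# Download and run rclone's official install script
curl https://rclone.org/install.sh | sudo bash

# Confirm the binary is on the PATH
rclone version
```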
rclone is the Swiss Army knife for transferring files between different service providers; unlike scp, it lets you transfer directly between two remote hosts and doesn’t force everything through one local node. With rclone set up, the rest is easy.
Configure the cloud providers (rclone calls them remotes) that you want to transfer files between.
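A minimal sketch of that step, assuming the source is an S3 bucket and the destination is Google Drive; the remote names `s3` and `gdrive` and the bucket name `my-bucket` are just placeholders:

```bash
# Launch rclone's interactive wizard and add two remotes:
# one for the source (Amazon S3) and one for the destination (Google Drive)
rclone config

# Sanity-check both remotes before kicking off a multi-terabyte copy
rclone listremotes
rclone lsd s3:my-bucket   # list top-level prefixes in the source bucket
rclone lsd gdrive:        # list top-level folders in the Drive account
```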
Once you have the source and destination remotes set up, the transfer is easy. Running with --ignore-existing makes the process idempotent in case the SSH connection drops. I prefer running the command in a tmux session so that I can reattach to it in a few days, when the transfer might have completed.
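Putting it together, here is a rough sketch of how the run looks; again, the session name, remote names, and paths are placeholders:

```bash
# Start a named tmux session so the transfer survives SSH disconnects
tmux new -s transfer

# Inside the session: copy from the S3 bucket to a Drive folder.
# --ignore-existing skips files already copied, so re-running after a
# failure picks up where it left off; -P prints live progress.
rclone copy s3:my-bucket gdrive:backup --ignore-existing -P

# Detach with Ctrl-b d, log out, and reattach later with:
tmux attach -t transfer
```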
Until the next post!