Encrypted offsite backup
Having a good "set and forget" (but really "set and double-check every now and then") strategy for your backups is important. Backups need to be automated to actually get done, but they also need to be tested so that you know you can recover files when needed. This article looks at a home or small-company setup doing large-scale backups on a budget.
Besides basic data security with redundant disks or RAID, you also need an offsite backup, for several reasons: a fire might destroy all the redundancy, or a break-in might cost you all the hardware. You can manage your own offsite backup if you have the hardware and a second location; back in the day I had a BitTorrent-based backup between my apartments on two different continents. Most people don't have that option, so we are going to look at the cloud, the obvious solution to this issue.
To make this work we need a lot of storage at a low cost, and we need to secure the data. In my case I'm looking at about 5 TB of data that I need to offload from my NAS to a secure backup location. The cheapest storage I found is Google Drive: since I have more than 5 users on my G Suite account, I get unlimited storage. There may be some throttling when you upload a large amount of data in a short time, but that shouldn't be a problem, since we will upload everything once and then just run differential backups that shouldn't be that large.
There are a number of different solutions here, but I'm looking for a few things that I think are important.
- Security: I want built-in encryption for my offsite backup, preferably seamless and handled in the background.
- Open source: if I'm trusting it with the encryption of my backups, I want to be able to verify that the program is solid.
- Support for several different storage systems: at the moment Google Drive works best for me, but I might change solutions in the future.
- No service dependency: I don't want a cloud service holding my encryption keys or anything like that. I want to run it locally against whatever storage I want.
Putting all this together, rclone looks like a very good candidate for this setup. It ticks all the boxes above and supports a wide range of cloud storage services.
Rclone is supported on a lot of different platforms. I will start the initial setup by running it on Windows. You can download rclone for different architectures from their website. Once it's downloaded and unzipped, we run the initial configuration.
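On Windows that means opening a command prompt in the folder where the rclone binary was unzipped and starting the interactive configuration wizard:

```shell
# Launches rclone's interactive menu for creating, editing
# and deleting remotes; choose "n" to create a new remote.
rclone config
```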
The first step is to set up a new remote, in this case Google Drive. When selecting a name for your remote, do not use spaces: it's allowed, but it will make things more difficult later. I will name mine GDrive for future reference.
Under storage, I select number 13, "Google Drive", as the type. It will then ask for a Google Application Client Id; if none is provided, rclone runs under its own id. That is fine for testing, but I recommend against it for a few reasons. The client id identifies the application gaining access to your Google Drive. The built-in id is shared between all rclone users and is potentially a security risk; with your own client id for each service you run, you can disable access centrally whenever needed. Also, the Google Drive API comes with limits on the number of requests per second per client id, so your backup will be faster on its own client id. How to set up your own id is well documented here: https://rclone.org/drive/#making-your-own-client-id
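If you created the remote before making your own client id, you don't have to redo the whole wizard; recent rclone versions can update a saved remote from the command line. The id and secret below are placeholders, not real credentials:

```shell
# Attach your own Google API client id and secret to the
# existing remote named GDrive (placeholder values shown).
rclone config update GDrive client_id 123456789-example.apps.googleusercontent.com
rclone config update GDrive client_secret EXAMPLE-SECRET
```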
The next section is file access. "Access to files created by rclone only" seems like the logical choice, but it's not: that scope is tied to the client id, and if the id is lost or changed at any point, everything will have to sync again. I recommend full access if you're using your own client id that you control. Then set the root folder to a specific backup folder, or even better its own shared drive; more on that later.
Now it will ask for a root_folder_id; this is the folder that rclone will sync your backup to. I created a shared Google Drive, since I'm using G Suite, and created a folder named after my backup site. Then navigate to that folder and copy the id from the browser's URL.
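The id is the last path segment of the folder URL in the address bar. As a sketch (the id below is a made-up placeholder), you can peel it off like this in a shell:

```shell
# Hypothetical folder URL copied from the browser's address bar
url="https://drive.google.com/drive/folders/1AbCdEfGhIjKl"
# Strip everything up to and including the last "/" to get the folder id
echo "${url##*/}"
```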
The next two questions are not needed at this time; just accept the default answers, and a consent/login screen will appear. Since I'm using a shared drive (team drive), I answer yes to the question "Configure this as a team drive?" and then select the correct drive from the list presented.
Confirm all the information and save the remote.
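After you confirm, rclone writes the remote into its configuration file (rclone.conf). The saved entry ends up looking roughly like the sketch below; all values here are placeholders, and the exact set of keys varies a little between rclone versions and the choices you made:

```ini
[GDrive]
type = drive
client_id = 123456789-example.apps.googleusercontent.com
client_secret = EXAMPLE-SECRET
scope = drive
root_folder_id = 1AbCdEfGhIjKl
team_drive = 0AExampleTeamDriveId
token = {"access_token":"...","expiry":"..."}
```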
Encrypt the remote
We can now sync all our files to Google Drive, but we also need to encrypt them. The encryption is handled by another remote that in turn uses the Google Drive remote we just created. Create a new remote and name it something appropriate; I will name mine EncGDrive.
Select option 10, "crypt", to create an encryption remote. It then wants a remote to encrypt and optionally a subfolder. I want to encrypt the entire GDrive remote that I created, which means GDrive:
The next question is about filename encryption. You really should enable it, to protect the context and metadata around your data; otherwise the backup will contain all your filenames in cleartext. The same goes for directory names.
It will now ask for two passwords for the encryption. Use strong passwords and keep track of them, or you will lose your backups. Then just save the configuration and you are done. I also recommend password protecting your configuration, since it contains your encryption passwords; you can do that from the main menu.
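For reference, the saved crypt remote becomes its own section in rclone.conf, pointing at the Drive remote. A rough sketch with placeholder values (the passwords are stored obscured rather than in plain text, but obscuring is reversible, which is exactly why protecting the config file matters):

```ini
[EncGDrive]
type = crypt
remote = GDrive:
filename_encryption = standard
directory_name_encryption = true
password = <obscured password>
password2 = <obscured salt password>
```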
Start the sync
rclone sync \\192.168.6.5\Pictures EncGDrive:Pictures -P
This will sync all the files on the NAS share Pictures to an encrypted folder on the Google team drive. The -P flag shows progress as it runs.
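Since a backup you cannot restore from is no backup at all, it is worth verifying the result once the first sync finishes. A couple of hedged examples, using the remote and share names from above:

```shell
# List files as seen through the crypt remote; names come back
# decrypted, so you can eyeball the structure.
rclone ls EncGDrive:Pictures

# Verify the source share against the encrypted remote. cryptcheck
# is the variant of "rclone check" that understands crypt remotes
# and can compare checksums through the encryption layer.
rclone cryptcheck \\192.168.6.5\Pictures EncGDrive:Pictures
```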