Martin Aspeli
Many of us used CrashPlan Home to back up a FreeNAS volume to the cloud. That service is being discontinued, and to be honest, CrashPlan was always fairly painful to set up. We now have better alternatives, and here's one of them:
- Backblaze B2 as cloud storage. It's faster, cheaper and easier to manage.
- rclone in a cron job to sync files nightly.
1) Create a jail.
2) Mount a dataset on /mnt/Backup or similar that contains the data you want to back up.
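For steps 1 and 2, if your FreeNAS version uses iocage, it looks roughly like this (a sketch, not exact commands; the jail name, release and dataset path are placeholders for your own):

    iocage create -n backup -r 11.1-RELEASE dhcp=on bpf=yes boot=on
    iocage start backup
    iocage exec backup mkdir -p /mnt/Backup
    iocage fstab -a backup /mnt/tank/data /mnt/Backup nullfs ro 0 0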
3) Install postfix via ports and configure it as an MTA. I used Gmail's SMTP server as a relay. This is only needed if you want email alerts of successful/failed backups.
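If you go the Gmail route, the relevant lines in /usr/local/etc/postfix/main.cf look something like this (a sketch; you'll also need a sasl_passwd file with your Gmail credentials, hashed with postmap, and an app password on the Google side):

    relayhost = [smtp.gmail.com]:587
    smtp_use_tls = yes
    smtp_sasl_auth_enable = yes
    smtp_sasl_password_maps = hash:/usr/local/etc/postfix/sasl_passwd
    smtp_sasl_security_options = noanonymous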
4) Sign up for Backblaze B2 and create a 'bucket' for your backups. Note that the bucket name has to be globally unique on B2 (waa?).
5) Download rclone from https://rclone.org/downloads/. You likely want the FreeBSD AMD64 binary. Copy it into your jail (or download it via curl in your jail itself). Copy the rclone binary to /usr/local/bin if you prefer to avoid typing the path each time.
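From inside the jail, that's roughly (assuming the current FreeBSD AMD64 build; the URL follows the pattern on the downloads page):

    fetch https://downloads.rclone.org/rclone-current-freebsd-amd64.zip
    tar -xf rclone-current-freebsd-amd64.zip
    cp rclone-*-freebsd-amd64/rclone /usr/local/bin/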
6) Configure rclone to talk to your B2 bucket. See https://rclone.org/b2.
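In short: run rclone config, select 'n' for a new remote, name it "backblaze", choose the b2 storage type, and use the bucket's applicationKeyId as the account and the Application Key itself as the key. If you'd rather skip the interactive prompts, recent rclone versions can do the same in one line (the credentials are placeholders):

    rclone config create backblaze b2 account YOUR_KEY_ID key YOUR_APPLICATION_KEY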
7) Run the initial backup. I did about 350GB in 4 days. You likely want to use something like 'screen' to run this in a terminal you can later detach from and come back to. I used the following command to do this:
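    # Reconstructed from the description; the bucket name is a placeholder.
    rclone sync /mnt/Backup backblaze:your-bucket-name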
You should obviously adjust the backup path and bucket name according to your setup. If you have a large data set, running this command can take a very long time. If you want to get more information about what it is doing, add the option '--log-level INFO'.
8) Create a script to run nightly backups. Here's mine:
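    #!/bin/sh
    # Nightly rclone backup to B2 -- a reconstructed sketch based on the
    # description below; the bucket name, email address and --min-age value
    # are placeholders.
    LOG=/var/log/rclone.log
    if /usr/local/bin/rclone sync /mnt/Backup backblaze:your-bucket-name \
        --fast-list \
        --copy-links \
        --b2-hard-delete \
        --min-age 15m \
        --log-file "$LOG" \
        --log-level INFO
    then
        echo "Backup to B2 succeeded" | mail -s "Backup OK" you@example.com
    else
        echo "Backup to B2 FAILED, see $LOG" | mail -s "Backup FAILED" you@example.com
    fi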
The if/else/fi bit at the end sends an email telling me whether the backup succeeded or failed (the absence of an email overnight means the backup didn't run). For that to work, you have to have postfix or another MTA set up in the jail; Google how to do this if you don't know already. You could also just ignore this part and check the logs periodically, or find some other alerting system.

The other thing to note is the choice of flags:
- --fast-list saves money and time, but uses more memory.
- --copy-links follows symlinks in the backup directory.
- --b2-hard-delete: by default, rclone only hides files you delete locally, which means they still cost you money; with this option they are gone from the remote backup. That may or may not be what you want!
- --min-age lets rclone ignore files that have been modified very recently, e.g. because they are still being downloaded or transferred to the NAS.

I write all the logs to /var/log/rclone.log (for which you'd ideally set up log rotation) and log at INFO level. That's reasonably chatty if you have a lot of changes, so you may want to dial it down to NOTICE level to avoid filling up your disk with log files.
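If you want that log rotation, a single newsyslog entry inside the jail does it; e.g. in /etc/newsyslog.conf (the counts and size here are arbitrary: 7 compressed generations, rotating at about 10MB):

    /var/log/rclone.log    644  7  10000  *  JC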
9) Create a cron job to run this script nightly, e.g.
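    # minute hour day month weekday command (the script path is a placeholder)
    0 5 * * * /path/to/backup.sh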
... and then install with
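    crontab /path/to/your-cron-file

(assuming you saved the line above to a file; crontab -e and pasting it in works just as well).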
Adjust the path and timings (5am every night in this case) as required.
That's it. At B2's $0.005/GB/month storage price, backing up my 350GB of photos works out to about $21/yr, which is pretty good, and there's no weird Java/ssh-tunnel/config-file mangling like there was with CrashPlan.