Using Rclone to backup Filespace data to Cloud Storage

Maintaining a copy of your critical data in a secure, off-site location is a key aspect of a robust disaster recovery strategy. While LucidLink Filespaces offer excellent data redundancy and availability, backing up your data to an additional cloud storage provider such as AWS S3 ensures that you have multiple layers of protection.


While there are many software tools that can be used for backup tasks, Rclone is a free, open-source package that supports all major operating systems. This guide describes how to use Rclone to streamline the process of backing up data from a Filespace to a cloud storage provider.

Tools Required

  • Rclone: A command-line program that supports various cloud storage providers and simplifies data operations.
  • A Supported Cloud Storage Provider
    • Amazon S3
    • Backblaze B2
    • Wasabi
    • Box
    • Dropbox
    • Google Cloud Storage
    • Microsoft Azure Blob Storage
    • Microsoft OneDrive
    • And many more...

Backup Use Case: Filespace to AWS S3

In this guide we will walk through the steps of backing up data from a LucidLink Filespace to an AWS S3 bucket as an example of the general process. While there are many performance and cost level options for the AWS S3 service, in this example we will just be using S3 Standard storage class.

Step-by-Step Guide

Host system:

It is important to choose a host system with the appropriate resources (CPU, RAM, and SSD cache space) and the highest Internet bandwidth available, for both uploads and downloads, so that data transfers can execute as fast as possible. For some organizations, this may be a local workstation or server. If local Internet speeds are limited, creating a host system on a cloud service such as AWS EC2 may be the best option.

1. Download and Install Rclone

Download Rclone for the OS of the host system (Windows, macOS, or Linux) from Rclone's official website.

2. Configure Rclone for AWS S3

Open a terminal or command prompt and run the following command to start the Rclone configuration:

rclone config

Follow these steps in the Rclone configuration (the exact menu numbers can vary between Rclone versions):

  • Create a new remote: Type n for a new remote and press Enter.
  • Name the remote: Give your remote a name, e.g., myS3Backup.
  • Choose the storage type: Type 5 for Amazon S3 and press Enter.
  • Choose the provider: Type 1 for AWS S3
  • AWS Access Key ID: Enter your AWS Access Key ID.
  • AWS Secret Access Key: Enter your AWS Secret Access Key.
  • Region: Choose the region your S3 bucket is in (e.g., us-east-1).
  • Skip optional settings: Press Enter to accept the defaults for endpoint, location_constraint, acl, and SSE.
  • Choose storage class: Type 1 for Default, in this example.
  • Advanced configuration: Type n to skip advanced configuration.
  • Type Y to keep remote settings.
  • Type Q to quit config.
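
Once saved, Rclone stores the remote in its configuration file (typically ~/.config/rclone/rclone.conf on macOS and Linux, or %APPDATA%\rclone\rclone.conf on Windows). The resulting entry looks roughly like the sketch below; the key values are placeholders, not working credentials:

```
[myS3Backup]
type = s3
provider = AWS
access_key_id = AKIAXXXXXXXXXXXXXXXX
secret_access_key = your-secret-access-key
region = us-east-1
```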

3. Mount Your Filespace

Mount your LucidLink Filespace on your host system. On a Mac or Windows system, follow the standard process for connecting to a Filespace using the Lucid App. If you are using a Linux system, please consult the Knowledge Base article that describes the connection process for Linux systems.

4. Perform the Backup

Use the rclone copy command to copy your Filespace data to your AWS S3 bucket. The rclone copy command copies files and subfolders from the Filespace source path to the S3 bucket, but skips files that are identical on the source and destination, checking by size and modification time or MD5 checksum. Importantly, rclone copy does not delete data from the destination the way the rclone sync command does. Adding the --transfers 30 option enables multiple concurrent transfers to speed up the backup, and the -P flag provides real-time progress updates.

On Windows systems:

rclone copy L:\path\to\filespace myS3Backup:bucket-name/path --transfers 30 -P

On macOS:

rclone copy /Volumes/path/to/filespace myS3Backup:bucket-name/path --transfers 30 -P

On Linux systems, replace /Volumes/path/to/filespace with the configured mountpoint for the Filespace.

Since large data transfers can take many hours, adding the --dry-run or --interactive flag to your command lets you confirm that Rclone will copy the right source data in the desired manner before committing to a long run. Once you have confirmed the command settings, remove the --dry-run flag to execute the backup task.
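
One way to double-check before a long run is to assemble the command string first, with the --dry-run flag included, and review it before executing; the paths and remote name below are the illustrative ones used earlier in this guide:

```shell
# Build the backup command as a string so it can be reviewed (or logged) first.
SRC="/Volumes/path/to/filespace"     # Filespace mount point from step 3
DEST="myS3Backup:bucket-name/path"   # remote name and bucket path from step 2
CMD="rclone copy $SRC $DEST --transfers 30 -P --dry-run"
echo "$CMD"   # review the output; when satisfied, remove --dry-run and run it
```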

The first time you run the command, Rclone executes a full backup of the dataset. Because rclone copy skips files that are identical on the source and destination, subsequent runs provide an incremental backup of any data added or changed since the last run. Incremental backups should take less time than the initial full backup, assuming the rate of change of the Filespace data is less than 100%. Automating the backup process can provide continuous incremental backups on a schedule that meets the needs of your organization and its specific workflow patterns. A separate Knowledge Base article provides an example of setting up an Rclone backup script to run at scheduled intervals.
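
As a hypothetical example of such a schedule, a crontab entry on macOS or Linux could run the incremental backup nightly at 2:00 AM; the mount point, remote name, and log path below are illustrative:

```
# crontab entry: minute hour day-of-month month day-of-week command
0 2 * * * rclone copy /Volumes/path/to/filespace myS3Backup:bucket-name/path --transfers 30 --log-file /tmp/rclone-backup.log --log-level INFO
```

On Windows, the equivalent rclone copy command can be scheduled with Task Scheduler.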

Additional Commands and Options

Rclone offers a variety of commands and options to manage your backups. Some useful commands include:

  • rclone sync - Makes the destination an exact mirror of the source, deleting destination files that no longer exist on the source. Useful for periodically purging files from the backup destination that have been deleted or moved on the source.
  • rclone move - Moves files from the source to the destination, which can be used for an archive process.
  • rclone check - Checks whether the files in the source and destination match.

For more detailed command and options information, please refer to the Rclone documentation.


This guide provides a step-by-step process for backing up your Filespace data to AWS S3 using Rclone. While this guide focuses on AWS S3, Rclone supports many other storage providers, making it a versatile backup tool. Designing and executing a data backup plan is crucial for protecting your data and ensuring its availability during unexpected events; each organization has unique data lifecycle and operating-expense considerations, but a robust backup strategy is essential for data security and availability.
