Hostwinds Tutorials


Table of Contents


Prerequisites
Step 1: Install rclone
Step 2: Set Up a Remote
Step 3: Sync Your Files
Step 4: Automate the Process with Cron
Step 5 (Optional): Encrypt Your Files
Optional Settings for Better Control
Alternatives to rclone
Tool Comparison at a Glance
s3cmd – Simple and Script-Friendly
s3fs – Mount Object Storage Like a Filesystem
AWS CLI – Ideal for Amazon S3 Integrations
Wrapping Up

Sync a Linux VPS with Object Storage (Rclone)

Tags: VPS, Linux

Keeping your Linux VPS synced with object storage is a smart way to manage backups, store static assets, or offload data for performance and storage flexibility.

This guide walks through the process using rclone, a lightweight and powerful tool that works with most object storage services.

Prerequisites

Here's what you'll want to have in place before getting started:

  • A Linux VPS: You'll need shell access to your VPS. This is where the files or directories you want to sync are located.

  • An object storage provider: There are several object storage service providers to choose from. Pick one that can scale with your resources, integrates with your existing infrastructure, and of course fits your budget.

  • Access credentials: You'll need an access key and secret key to authenticate with your storage provider's API. These are used securely by the sync tool to interact with your bucket.

Step 1: Install rclone

rclone is a free, open-source command-line utility for managing files on cloud storage. It supports more than 40 providers and works well for syncing, copying, encrypting, and automating transfers.

1. Log in to your VPS via SSH:

ssh user@your-vps-ip

2. Install rclone:

curl https://rclone.org/install.sh | sudo bash

This will fetch and install the latest version.

If you prefer using your system's package manager:

Debian/Ubuntu:

sudo apt install rclone

CentOS/RHEL:

sudo yum install epel-release
sudo yum install rclone

Fedora:

sudo dnf install rclone

Step 2: Set Up a Remote

Now you'll set up rclone so it can talk to your storage bucket.

Start the interactive config tool:

rclone config

Follow these steps in the menu:

  1. Choose n to create a new remote.
  2. Give it a name like myremote. This name is used in commands to refer to the connection.
  3. Pick your storage provider from the list.
  4. Enter your access key and secret key.
  5. Input any region-specific endpoints or configuration as required.
  6. Accept the default options unless you know you need something specific.

Now let's test your setup:

rclone ls myremote:

If everything is configured correctly, you'll see a list of your storage buckets, or no output at all (with no error message) if the account doesn't contain any yet.

This configuration creates a persistent, reusable connection profile that you can use across multiple directories and scripts.
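If you'd rather script the setup than walk through the interactive menu, rclone also supports non-interactive configuration with rclone config create. The remote name, provider type, keys, and endpoint below are placeholders—substitute your own values:

```shell
# Create an S3-compatible remote in one command (values are examples)
rclone config create myremote s3 \
    provider=Other \
    access_key_id=YOUR_ACCESS_KEY \
    secret_access_key=YOUR_SECRET_KEY \
    endpoint=https://objects.example.com
```

This writes the same entry to rclone's config file that the interactive tool would, which is handy for provisioning servers automatically.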

Step 3: Sync Your Files

Now that rclone is set up, you can sync a local directory (like your website or project files) with your object storage.

For example, say you want to back up your website from /var/www/html to your object storage:

rclone sync /var/www/html myremote:backups/html --progress

Here's what this command does:

  • Compares your local folder (/var/www/html) to the target in object storage (myremote:backups/html)
  • Transfers any new or updated files
  • Removes files from the destination that no longer exist in the local source (you can avoid this—see below)

If you want to only upload new or changed files without removing anything from the destination, you can use copy instead:

rclone copy /var/www/html myremote:backups/html --progress
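Because sync deletes files from the destination, it's worth previewing a run before letting it loose. rclone's --dry-run flag reports what would change without transferring or deleting anything:

```shell
# Show what sync would transfer and delete, without doing it
rclone sync /var/www/html myremote:backups/html --dry-run

# Add -v to list each affected file individually
rclone copy /var/www/html myremote:backups/html --dry-run -v
```

Once the preview looks right, rerun the same command without --dry-run.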

Step 4: Automate the Process with Cron

To keep your files updated automatically, you can set up a cron job:

1. Open your crontab:

crontab -e

2. Add this line to sync files every day at midnight:

0 0 * * * /usr/bin/rclone sync /var/www/html myremote:backups/html --quiet

You can change the time and path as needed. Make sure the path to rclone matches the location where it was installed (running which rclone will tell you).
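For long-running syncs, it's also worth guarding against overlapping cron runs and keeping a log. One way to sketch that is to wrap the command in flock (from util-linux) and add rclone's logging flags—the lock file and log path below are examples, not requirements:

```shell
# Nightly sync: flock -n skips the run if the previous one is still going,
# and --log-file keeps a record you can check later
0 0 * * * /usr/bin/flock -n /tmp/rclone-backup.lock /usr/bin/rclone sync /var/www/html myremote:backups/html --log-file=/var/log/rclone-backup.log --log-level INFO
```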

Step 5 (Optional): Encrypt Your Files

If you're handling sensitive data—user files, internal documents, or anything you wouldn't want exposed—rclone lets you add client-side encryption before uploading.

Here's how we set that up:

1. Run rclone config again:

rclone config

2. Add a new remote and choose crypt as the storage type.

3. Point it to your original remote's folder (e.g., myremote:backups/html).

4. Choose a password and confirm.

After that, you can use your encrypted remote to upload files like this:

rclone sync /var/www/html mycryptremote:html --progress
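To confirm the encryption is working (assuming your crypt remote is named mycryptremote and wraps myremote:backups/html as above), compare listings through the two remotes. With rclone's default filename encryption, the underlying remote shows scrambled names while the crypt remote decrypts them transparently:

```shell
# Through the crypt remote: filenames appear in plain text
rclone ls mycryptremote:

# Directly on the underlying remote: names and contents are encrypted
rclone ls myremote:backups/html
```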

Optional Settings for Better Control

Here are a few helpful flags you can use:

--fast-list speeds up scanning remotes that contain many files by using more memory during listing:

rclone sync /var/www/html myremote:backups/html --fast-list

--bwlimit caps bandwidth to avoid slowing down other processes:

rclone sync /var/www/html myremote:backups/html --bwlimit 1M

--log-file keeps a detailed record of each sync to a log file:

rclone sync /var/www/html myremote:backups/html --log-file=/var/log/rclone.log

Alternatives to rclone

While rclone is one of the most flexible and widely recommended tools for syncing with object storage, there are other utilities worth knowing about—especially if you're looking for different workflows, deeper integration, or specific compatibility.

Here are some alternatives, how they work, and when they might be a better fit depending on your needs.

Tool Comparison at a Glance

Tool     | Best For                                           | Notes
---------|----------------------------------------------------|-------------------------------------------------------
rclone   | Cross-provider sync, advanced workflows            | Supports 40+ cloud services, very customizable
s3cmd    | S3-only tasks, scripting, simplicity               | Lightweight and straightforward
s3fs     | Filesystem-style access, application compatibility | Good for legacy apps or simple drag-and-drop workflows
AWS CLI  | Deep S3 integration, AWS-native setups             | Ideal for full AWS environments

s3cmd – Simple and Script-Friendly

s3cmd is a command-line tool specifically built for interacting with Amazon S3 and S3-compatible object storage services. It's well suited for scripting simple upload, download, and sync tasks, especially in cron jobs or automated deployment pipelines.

If you're managing S3 buckets across environments or want a tool that sticks closely to the S3 API, s3cmd offers simple, familiar commands and solid documentation.

When to use it:

  • You're already working with S3 or a compatible provider (Wasabi, Backblaze B2 with S3 API enabled, etc.).
  • You want a lightweight, straightforward tool that plays well with shell scripts.
  • You prefer a tool designed specifically for the S3 protocol.

Installation:

sudo apt install s3cmd   # On Debian/Ubuntu

Configuration:

s3cmd --configure

You'll be prompted for:

  • Access Key
  • Secret Key
  • Endpoint (if not using AWS)
  • Optional encryption or HTTPS settings

Basic sync example:

s3cmd sync /var/www/html/ s3://your-bucket/html/
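Note that, unlike rclone sync, s3cmd sync does not delete remote files by default when they disappear locally. If you want true mirroring, s3cmd provides --delete-removed, and --dry-run lets you preview the result first (bucket name is a placeholder):

```shell
# Preview the changes, then mirror deletions like rclone sync does
s3cmd sync --dry-run /var/www/html/ s3://your-bucket/html/
s3cmd sync --delete-removed /var/www/html/ s3://your-bucket/html/
```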

s3fs – Mount Object Storage Like a Filesystem

s3fs lets you mount an S3 bucket as if it were a local directory. This means you can use standard Linux commands (cp, mv, rsync, etc.) to work with your cloud storage like you would a normal disk.

This is especially useful if you have tools that aren't cloud-aware but can write to a file path. s3fs makes it seamless to redirect that output to object storage. However, performance isn't always as fast as native file systems, and it's not ideal for high-frequency read/write activity.

When to use it:

  • You need applications to read/write files as if they were on a local disk.
  • You're working with software that doesn't support direct cloud uploads.
  • You prefer the flexibility of direct filesystem access to object storage.

Installation:

sudo apt install s3fs

Setup credentials:

echo ACCESS_KEY:SECRET_KEY > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

Mount the bucket:

mkdir ~/mybucket
s3fs your-bucket-name ~/mybucket -o passwd_file=~/.passwd-s3fs
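To make the mount survive reboots, you can add an entry to /etc/fstab using the fuse.s3fs filesystem type. The bucket name, mount point, and password file path below are examples; allow_other (which lets non-root users access the mount) additionally requires user_allow_other to be enabled in /etc/fuse.conf:

```shell
# /etc/fstab entry so the bucket mounts automatically at boot
your-bucket-name /home/user/mybucket fuse.s3fs _netdev,passwd_file=/home/user/.passwd-s3fs,allow_other 0 0
```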

AWS CLI – Ideal for Amazon S3 Integrations

The AWS CLI is Amazon's official command-line tool for managing nearly every part of AWS—including S3.

If you're doing more than just backups—such as setting bucket permissions, managing versioning, or triggering Lambda functions—the AWS CLI provides more control and options than other tools.

When to use it:

  • You're already using AWS services and want to integrate storage tasks into your workflow.
  • You need full access to S3 features like lifecycle rules, permissions, and object tagging.
  • You want compatibility with automation or CI/CD pipelines.

Installation (Linux):

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install

Configure it:

aws configure

When prompted, enter:

  • Access Key
  • Secret Key
  • Default region
  • Output format (json, text, etc.)

Sync example:

aws s3 sync /var/www/html s3://your-bucket-name/html
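A few commonly useful variations on that command: --delete mirrors local deletions to the bucket, --exclude filters out files you don't want uploaded, and --endpoint-url points the CLI at an S3-compatible provider other than AWS (the endpoint below is a placeholder):

```shell
# Mirror deletions and skip log files
aws s3 sync /var/www/html s3://your-bucket-name/html --delete --exclude "*.log"

# Target an S3-compatible service instead of AWS itself
aws s3 sync /var/www/html s3://your-bucket-name/html --endpoint-url https://objects.example.com
```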

Wrapping Up

Keeping your VPS in sync with object storage is a reliable way to back up data and make content accessible. Whether you're running regular uploads or syncing live folders, rclone gives you the tools to do it efficiently.

With options for encryption, logging, and automation, you can set up a system that works quietly in the background—just the way you want it.

Written by Hostwinds Team  /  June 11, 2021