
rclone Sync

rclone sync makes the destination identical to the source. It copies new and changed files, and deletes files from the destination that don't exist in the source. This makes it the most powerful — and most dangerous — rclone transfer command.

Destructive Command

sync will delete files at the destination that don't exist at the source. Always run with --dry-run first, and consider using --backup-dir to keep deleted files for recovery.

Basic Syntax

rclone sync SOURCE DESTINATION [flags]

The destination becomes an exact mirror of the source. Direction matters:

# Local → Remote (push mirror)
rclone sync /local/path remote:bucket/path

# Remote → Local (pull mirror)
rclone sync remote:bucket/path /local/path

# Remote → Remote
rclone sync remoteA:path remoteB:path

Key Flags

Flag                 Description
--dry-run / -n       Preview changes without executing (essential)
--progress / -P      Show real-time progress
--verbose / -v       Show files transferred and deleted
--backup-dir PATH    Move deleted/overwritten files here instead of permanently deleting
--suffix SUFFIX      Add a suffix to backup files (e.g., .bak)
--max-delete N       Abort if more than N files would be deleted
--track-renames      Detect renamed files (avoids re-upload)
--transfers N        Parallel file transfers (default 4)
--bwlimit RATE       Bandwidth limit (e.g., 10M)
--log-file PATH      Write a detailed log
--log-level LEVEL    DEBUG, INFO, NOTICE, or ERROR
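Several of these flags are typically combined in one invocation. The sketch below wraps them in a reusable shell function; `safe_sync`, the paths, and the flag values are illustrative choices, not part of rclone, and setting `RCLONE=echo` previews the assembled command without running it.

```shell
#!/bin/sh
# Sketch: combine the safety flags above into one reusable function.
# safe_sync is a hypothetical helper; RCLONE=echo previews the command.
safe_sync() {
    src="$1"
    dst="$2"
    ${RCLONE:-rclone} sync "$src" "$dst" \
        --backup-dir "$dst-deleted/$(date +%Y-%m-%d)" \
        --max-delete 50 \
        --track-renames \
        --transfers 4 \
        --log-level INFO
}

# Preview only (prints the command instead of running rclone):
# RCLONE=echo safe_sync /srv/data remote:bucket/data
```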

Safe Execution Pattern

Step 1 — Preflight Check

# Verify source exists and is not empty
test -d /srv/data && test "$(ls -A /srv/data | wc -l)" -gt 0 || {
    echo "Source is missing or empty — aborting"
    exit 1
}

# Verify remote is reachable
rclone lsd remote:bucket > /dev/null 2>&1 || {
    echo "Remote unreachable — aborting"
    exit 1
}

Step 2 — Dry Run

rclone sync /srv/data remote:bucket/data \
    --dry-run -v --log-file /var/log/rclone-dryrun.log

Review the output carefully — look for unexpected deletions.
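A quick way to scan the dry-run log for planned deletions is to grep for rclone's skip message. The `count_planned_deletes` helper below is an illustrative sketch; the exact NOTICE wording can differ between rclone versions, so match it against your own log.

```shell
#!/bin/sh
# Sketch: count how many deletions the dry run would perform.
# count_planned_deletes is a hypothetical helper; the matched text is
# the NOTICE rclone prints for skipped deletions (version-dependent).
count_planned_deletes() {
    grep -c "Not deleting as --dry-run is set" "$1"
}

# Example review gate:
# n=$(count_planned_deletes /var/log/rclone-dryrun.log)
# [ "$n" -eq 0 ] || echo "WARNING: $n files would be deleted"
```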

Step 3 — Execute with Safety Net

rclone sync /srv/data remote:bucket/data \
    --backup-dir remote:bucket/_deleted/$(date +%Y-%m-%d) \
    --progress --log-file /var/log/rclone-sync.log --log-level INFO

Step 4 — Verify

rclone check /srv/data remote:bucket/data --one-way
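rclone check exits non-zero when files differ or are missing, so scripts can gate on its status. The `verify_mirror` helper below is an illustrative sketch of that pattern, not an rclone feature.

```shell
#!/bin/sh
# Sketch: treat a non-zero exit from rclone check as a failed mirror.
# verify_mirror is a hypothetical helper name.
verify_mirror() {
    if rclone check "$1" "$2" --one-way > /dev/null 2>&1; then
        echo "verified"
    else
        echo "MISMATCH between $1 and $2" >&2
        return 1
    fi
}
```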

Practical Examples

Mirror Website to Cloud Backup

rclone sync /var/www/html backup-s3:my-bucket/www-mirror/ \
    --backup-dir backup-s3:my-bucket/www-deleted/$(date +%Y-%m-%d) \
    --progress

Mirror with Deletion Protection

# Abort if more than 50 files would be deleted (prevents empty-source disasters)
rclone sync /srv/media remote:archives/media \
    --max-delete 50 \
    --progress

Nightly Mirror Script

nightly-sync.sh
#!/bin/bash
DATE=$(date +%Y-%m-%d)
LOG="/var/log/rclone/sync-${DATE}.log"
SRC="/var/www/html"
DST="backup-s3:my-bucket/www-mirror"

# Safety checks
test -d "$SRC" || { echo "Source missing" >> "$LOG"; exit 1; }
COUNT=$(find "$SRC" -type f | wc -l)
test "$COUNT" -gt 0 || { echo "Source empty" >> "$LOG"; exit 1; }

# Sync with backup dir for deleted files
rclone sync "$SRC" "$DST" \
    --backup-dir "backup-s3:my-bucket/deleted/${DATE}" \
    --max-delete 100 \
    --log-file "$LOG" --log-level INFO \
    --transfers 8

# Post-sync verification
rclone check "$SRC" "$DST" --one-way >> "$LOG" 2>&1
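A script like this usually runs from cron. A sample crontab entry, assuming the script is installed at /usr/local/bin/nightly-sync.sh (path and schedule are placeholders):

```
# m  h  dom mon dow  command
30 2 * * * /usr/local/bin/nightly-sync.sh
```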

Sync with Rename Tracking

# Detect renamed files to avoid re-uploading
rclone sync /srv/photos remote:photos-mirror/ \
    --track-renames --progress

sync vs copy

Behavior                         sync            copy
Adds new files                   Yes             Yes
Updates changed files            Yes             Yes
Deletes destination extras       Yes             No
Safe for append-only archives    No              Yes
Use case                         Exact mirrors   Additive backups
Rule of Thumb
  • Use copy when the destination is an archive (you never want to lose data there)
  • Use sync when the destination must be an exact mirror of the source
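The rule of thumb reduces to a one-line dispatch. `choose_verb` below is a hypothetical helper name, shown only to make the decision explicit:

```shell
#!/bin/sh
# Sketch: map the destination's role to the right rclone verb.
# choose_verb is a hypothetical helper.
choose_verb() {
    case "$1" in
        archive) echo "copy" ;;  # never delete at the destination
        mirror)  echo "sync" ;;  # destination must exactly match the source
        *) echo "unknown role: $1" >&2; return 1 ;;
    esac
}
```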

Common Pitfalls

Pitfall                            Consequence                                       Prevention
Running sync without --dry-run     Unexpected mass deletion at destination           Always dry-run first
Empty source directory             Deletes everything at destination                 Preflight checks for source size
Swapping source and destination    Production data overwritten with backup           Double-check direction
No --backup-dir                    Deleted files unrecoverable                       Always use backup-dir for important data
No --max-delete                    Catastrophic deletion from bugs or empty mounts   Set a reasonable deletion limit
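The empty-source and empty-mount pitfalls can be caught with a single guard run before every sync. `source_is_safe` and its `min_files` threshold are illustrative, not rclone features:

```shell
#!/bin/sh
# Sketch: refuse to sync from a missing or suspiciously empty source.
# source_is_safe is a hypothetical helper; min_files guards against
# an unmounted directory that would otherwise look empty.
source_is_safe() {
    dir="$1"
    min_files="${2:-1}"
    [ -d "$dir" ] || { echo "missing source: $dir" >&2; return 1; }
    n=$(find "$dir" -type f | wc -l)
    [ "$n" -ge "$min_files" ] || { echo "only $n files in $dir" >&2; return 1; }
}
```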

Examples with Output

1. Mirror Local Folder to Cloud with Deletions

Make the remote exactly match your local folder. Command:

rclone sync /home/user/photos gdrive:photos --dry-run

Output:

2024/01/15 12:00:00 NOTICE: old_photo.jpg: Not deleting as --dry-run is set
2024/01/15 12:00:00 NOTICE: new_photo.jpg: Not copied as --dry-run is set

Note: Always use --dry-run before actual sync to avoid accidental deletions.

2. Actual Sync with Progress

Execute the mirror and see real-time updates. Command:

rclone sync /home/user/docs remote:docs --progress

Output:

Transferred:   5 MiB / 5 MiB, 100%, 1 MiB/s, ETA 0s
Checks: 100 / 100, 100%
Deleted: 5 (files)
Transferred: 2 / 2, 100%

3. Sync with Deletion Protection (Backup Dir)

Move files that would be deleted to a safety folder instead. Command:

rclone sync /local remote:backup --backup-dir remote:old_versions

Output:

(No output on success unless -v is used)

Goal: Files removed from /local will appear in remote:old_versions.
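To recover a file that sync moved aside, copy it back out of the backup directory with rclone copy. The `restore_from_backup` helper below is an illustrative sketch using the paths from this example:

```shell
#!/bin/sh
# Sketch: pull one file back from the backup dir used above.
# restore_from_backup is a hypothetical helper; it preserves the
# file's relative directory under /local.
restore_from_backup() {
    file="$1"
    rclone copy "remote:old_versions/$file" "/local/$(dirname "$file")"
}
```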

4. Sync and Track Renames

Avoid re-uploading large files that were just moved or renamed locally. Command:

rclone sync /movies remote:movies --track-renames -v

Output:

2024/01/15 12:00:00 INFO  : movie_v2.mp4: Moved (server-side)
2024/01/15 12:00:00 INFO  : movie_v1.mp4: Deleted

5. Limit Concurrent Operations

Prevent overwhelming the remote API with too many checks. Command:

rclone sync /src remote:dest --checkers 4 --transfers 2

Output:

(Standard progress output reflecting slower but safer execution)