rclone Sync
rclone sync makes the destination identical to the source. It copies new and changed files, and deletes files from the destination that don't exist in the source. This makes it the most powerful — and most dangerous — rclone transfer command.
sync will delete files at the destination that don't exist at the source. Always run with --dry-run first, and consider using --backup-dir to keep deleted files for recovery.
Basic Syntax
rclone sync SOURCE DESTINATION [flags]
The destination becomes an exact mirror of the source. Direction matters:
# Local → Remote (push mirror)
rclone sync /local/path remote:bucket/path
# Remote → Local (pull mirror)
rclone sync remote:bucket/path /local/path
# Remote → Remote
rclone sync remoteA:path remoteB:path
Key Flags
| Flag | Description |
|---|---|
| --dry-run / -n | Essential — preview changes without executing |
| --progress / -P | Show real-time progress |
| --verbose / -v | Show files transferred and deleted |
| --backup-dir PATH | Move deleted/overwritten files here instead of permanently deleting |
| --suffix SUFFIX | Add suffix to backup files (e.g., .bak) |
| --max-delete N | Abort if more than N files would be deleted |
| --track-renames | Detect renamed files (avoids re-upload) |
| --transfers N | Parallel file transfers (default 4) |
| --bwlimit RATE | Bandwidth limit (e.g., 10M) |
| --log-file PATH | Write detailed log |
| --log-level LEVEL | DEBUG, INFO, NOTICE, ERROR |
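Several of these flags compose naturally into one defensive invocation. A minimal sketch (the remote name and paths are placeholders): the command is only assembled and printed here so the flag set can be reviewed before anything runs.

```shell
# Build a defensive sync command as a string and print it for review.
# remote:bucket/data is a placeholder, not a configured remote.
CMD="rclone sync /srv/data remote:bucket/data \
  --dry-run \
  --max-delete 100 \
  --backup-dir remote:bucket/_deleted/$(date +%Y-%m-%d) \
  --log-level INFO"
echo "$CMD"
# Once the preview looks right, drop --dry-run and run the command.
```

Keeping the command in a variable (or a script) makes it easy to diff the flag set between runs.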
Safe Execution Pattern
Step 1 — Preflight Check
# Verify source exists and is not empty
test -d /srv/data && test "$(ls -A /srv/data | wc -l)" -gt 0 || {
echo "Source is missing or empty — aborting"
exit 1
}
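The guard above can be exercised without touching rclone at all. A sketch that wraps the same test in a function and runs it against two throwaway directories created with mktemp:

```shell
# Same empty-source guard as above, wrapped in a function for testing.
check_src() {
  test -d "$1" && test "$(ls -A "$1" | wc -l)" -gt 0
}

EMPTY=$(mktemp -d)   # stays empty: the guard should block it
FULL=$(mktemp -d)
touch "$FULL/file.txt"

check_src "$EMPTY" && echo "empty dir: pass" || echo "empty dir: blocked"
check_src "$FULL"  && echo "full dir: pass"  || echo "full dir: blocked"

rm -rf "$EMPTY" "$FULL"
```

Running it prints "empty dir: blocked" followed by "full dir: pass", confirming the guard trips exactly when the source is empty.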
# Verify remote is reachable
rclone lsd remote:bucket > /dev/null 2>&1 || {
echo "Remote unreachable — aborting"
exit 1
}
Step 2 — Dry Run
rclone sync /srv/data remote:bucket/data \
--dry-run -v --log-file /var/log/rclone-dryrun.log
Review the output carefully — look for unexpected deletions.
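One way to review is to grep the dry-run log for pending deletions. A sketch against a fabricated two-line log; the NOTICE wording is modeled on rclone's dry-run messages and may differ between versions:

```shell
# Fabricated dry-run log (illustrative lines only).
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2024/01/15 12:00:00 NOTICE: old_photo.jpg: Not deleting as --dry-run is set
2024/01/15 12:00:00 NOTICE: new_photo.jpg: Not copied as --dry-run is set
EOF

# List and count the files a real run would delete.
grep "Not deleting" "$LOG"
echo "pending deletions: $(grep -c "Not deleting" "$LOG")"
rm -f "$LOG"
```

If the deletion count is higher than expected, stop and investigate before the real run.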
Step 3 — Execute with Safety Net
rclone sync /srv/data remote:bucket/data \
--backup-dir remote:bucket/_deleted/$(date +%Y-%m-%d) \
--progress --log-file /var/log/rclone-sync.log --log-level INFO
Step 4 — Verify
rclone check /srv/data remote:bucket/data --one-way
Practical Examples
Mirror Website to Cloud Backup
rclone sync /var/www/html backup-s3:my-bucket/www-mirror/ \
--backup-dir backup-s3:my-bucket/www-deleted/$(date +%Y-%m-%d) \
--progress
Mirror with Deletion Protection
# Abort if more than 50 files would be deleted (prevents empty-source disasters)
rclone sync /srv/media remote:archives/media \
--max-delete 50 \
--progress
Nightly Mirror Script
#!/bin/bash
DATE=$(date +%Y-%m-%d)
LOG="/var/log/rclone/sync-${DATE}.log"
SRC="/var/www/html"
DST="backup-s3:my-bucket/www-mirror"
# Safety checks
test -d "$SRC" || { echo "Source missing" >> "$LOG"; exit 1; }
COUNT=$(find "$SRC" -type f | wc -l)
test "$COUNT" -gt 0 || { echo "Source empty" >> "$LOG"; exit 1; }
# Sync with backup dir for deleted files
rclone sync "$SRC" "$DST" \
--backup-dir "backup-s3:my-bucket/deleted/${DATE}" \
--max-delete 100 \
--log-file "$LOG" --log-level INFO \
--transfers 8
# Post-sync verification
rclone check "$SRC" "$DST" --one-way >> "$LOG" 2>&1
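Scheduling the script is then a one-line crontab entry (the install path is a placeholder):

```
# m h dom mon dow  command  -- run the nightly mirror at 02:00
0 2 * * * /usr/local/bin/rclone-mirror.sh
```

Logs accumulate in /var/log/rclone/ as the script specifies, so rotate or prune them separately.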
Sync with Rename Tracking
# Detect renamed files to avoid re-uploading
rclone sync /srv/photos remote:photos-mirror/ \
--track-renames --progress
sync vs copy
| Behavior | sync | copy |
|---|---|---|
| Adds new files | ✅ | ✅ |
| Updates changed files | ✅ | ✅ |
| Deletes destination extras | ✅ | ❌ |
| Safe for append-only archives | ❌ | ✅ |
| Use case | Exact mirrors | Additive backups |
- Use copy when the destination is an archive (you never want to lose data there)
- Use sync when the destination must be an exact mirror of the source
Common Pitfalls
| Pitfall | Consequence | Prevention |
|---|---|---|
| Running sync without --dry-run | Unexpected mass deletion at destination | Always dry-run first |
| Empty source directory | Deletes everything at destination | Preflight checks for source size |
| Swapping source and destination | Production data overwritten with backup | Double-check direction |
| No --backup-dir | Deleted files unrecoverable | Always use --backup-dir for important data |
| No --max-delete | Catastrophic deletion from bugs or empty mounts | Set a reasonable deletion limit |
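The swapped-direction pitfall can also be caught with a small interactive interlock. A sketch (paths are placeholders; the rclone call itself is only echoed here):

```shell
# Refuse to sync unless the operator confirms the exact SRC -> DST pair.
confirm_direction() {
  printf 'Mirror %s -> %s? Destination extras will be DELETED. [yes/no] ' "$1" "$2"
  read -r answer
  [ "$answer" = "yes" ]
}

if confirm_direction /srv/data remote:bucket/data; then
  echo "confirmed: would run rclone sync /srv/data remote:bucket/data here"
else
  echo "aborted"
fi
```

Anything other than a literal "yes" (including running non-interactively) aborts, which also stops the interlocked command from firing inside unattended jobs by accident.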
Examples with Output
1. Mirror Local Folder to Cloud with Deletions
Make the remote exactly match your local folder. Command:
rclone sync /home/user/photos gdrive:photos --dry-run
Output:
2024/01/15 12:00:00 NOTICE: old_photo.jpg: Not deleting as --dry-run is set
2024/01/15 12:00:00 NOTICE: new_photo.jpg: Not copied as --dry-run is set
Note: Always use --dry-run before actual sync to avoid accidental deletions.
2. Actual Sync with Progress
Execute the mirror and see real-time updates. Command:
rclone sync /home/user/docs remote:docs --progress
Output:
Transferred: 5 MiB / 5 MiB, 100%, 1 MiB/s, ETA 0s
Checks: 100 / 100, 100%
Deleted: 5 (files)
Transferred: 2 / 2, 100%
3. Sync with Deletion Protection (Backup Dir)
Move files that would be deleted to a safety folder instead. Command:
rclone sync /local remote:backup --backup-dir remote:old_versions
Output:
(No output on success unless -v is used)
Goal: Files removed from /local will appear in remote:old_versions.
4. Sync and Track Renames
Avoid re-uploading large files that were just moved or renamed locally. Command:
rclone sync /movies remote:movies --track-renames -v
Output:
2024/01/15 12:00:00 INFO : movie_v2.mp4: Moved (server-side)
2024/01/15 12:00:00 INFO : movie_v1.mp4: Deleted
5. Limit Concurrent Operations
Prevent overwhelming the remote API with too many checks. Command:
rclone sync /src remote:dest --checkers 4 --transfers 2
Output:
(Standard progress output reflecting slower but safer execution)