rclone Copy
rclone copy transfers files from source to destination, skipping files that already exist with the same size and modification time. It never deletes files at the destination — making it the safest transfer command for everyday use.
copy is rclone's workhorse, non-destructive transfer command. It adds new files and updates changed files at the destination but leaves everything else untouched. If you're unsure whether to use copy or sync, start with copy.
Basic Syntax
rclone copy SOURCE DESTINATION [flags]
| Part | Description |
|---|---|
| SOURCE | Local path or remote:path to copy from |
| DESTINATION | Local path or remote:path to copy to |
Transfer Directions
# Local → Remote (upload)
rclone copy /local/path remote:bucket/path
# Remote → Local (download)
rclone copy remote:bucket/path /local/path
# Remote → Remote (server-side copy if supported)
rclone copy remoteA:path remoteB:path
# Local → Local
rclone copy /source/path /dest/path
Key Flags
| Flag | Description |
|---|---|
| --dry-run / -n | Preview what would be copied without transferring |
| --progress / -P | Show real-time transfer progress |
| --verbose / -v | Show files as they are transferred |
| --transfers N | Number of parallel file transfers (default 4) |
| --checkers N | Number of parallel file checks (default 8) |
| --bwlimit RATE | Limit bandwidth (e.g., 10M for 10 MiB/s) |
| --max-age DURATION | Only copy files newer than this (e.g., 24h, 7d) |
| --min-size SIZE | Skip files smaller than SIZE |
| --max-size SIZE | Skip files larger than SIZE |
| --include PATTERN | Only copy files matching the pattern |
| --exclude PATTERN | Skip files matching the pattern |
| --log-file PATH | Write logs to a file |
| --log-level LEVEL | Set log level: DEBUG, INFO, NOTICE, ERROR |
Practical Examples
Upload a Website Backup to S3
rclone copy /var/www/html backup-s3:my-bucket/www-backup/ --progress
Download Remote Files to Local Server
rclone copy gdrive:projects/2024 /home/user/projects/2024 --progress
Copy Only Recent Files
# Copy files modified in the last 24 hours
rclone copy /var/log/nginx remote:backups/logs --max-age 24h --progress
# Copy files modified in the last 7 days
rclone copy /var/www/uploads remote:backups/uploads --max-age 7d
Bandwidth-Limited Transfer
# Limit upload to 5 MB/s to avoid saturating the connection
rclone copy /backup/db remote:offsite/db --bwlimit 5M --progress
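To pick a sensible --bwlimit, it helps to estimate how long a transfer will take at a given cap. A minimal sketch, using a hypothetical shell helper (not part of rclone) and assuming whole-megabyte sizes and rates:

```shell
# eta_seconds TOTAL_MB RATE_MB_PER_S — rough transfer time at a fixed bandwidth cap
eta_seconds() {
    echo $(( $1 / $2 ))
}

# A 500 MB database dump at --bwlimit 5M:
eta_seconds 500 5    # prints 100
```

If the estimate is longer than your backup window, raise the cap or split the job.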
Copy with Filters
# Only copy images
rclone copy /var/www/uploads remote:media/ \
--include "*.{jpg,jpeg,png,gif,webp}" --progress
# Exclude cache and temp files
rclone copy /var/www/app remote:backups/app/ \
--exclude "cache/**" \
--exclude "*.tmp" \
--exclude "node_modules/**" \
--progress
Production Backup Script
#!/bin/bash
DATE=$(date +%Y-%m-%d)
LOG="/var/log/rclone/backup-${DATE}.log"
mkdir -p "$(dirname "$LOG")"
# Preflight check
rclone lsd backup-s3:my-bucket > /dev/null 2>&1 || {
    echo "Remote unreachable" | tee -a "$LOG"
    exit 1
}
# Dry run first
rclone copy /var/www/html backup-s3:my-bucket/www/${DATE}/ \
--dry-run --log-file "$LOG" --log-level INFO
# Execute
rclone copy /var/www/html backup-s3:my-bucket/www/${DATE}/ \
--progress --log-file "$LOG" --log-level INFO \
--transfers 8
echo "Backup completed: $(date)" >> "$LOG"
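To run a script like this on a schedule, a crontab entry along these lines would work (the script path here is an assumption; adjust it to wherever you install the script):

```
# Run the backup nightly at 02:00; cron has a minimal PATH,
# so use absolute paths inside the script or set PATH explicitly.
0 2 * * * /usr/local/bin/rclone-backup.sh >> /var/log/rclone/cron.log 2>&1
```

Combined with --log-file inside the script, this leaves an audit trail even when no one is watching the terminal.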
copy vs sync vs move
| Behavior | copy | sync | move |
|---|---|---|---|
| Adds new files | ✅ | ✅ | ✅ |
| Updates changed files | ✅ | ✅ | ✅ |
| Deletes destination extras | ❌ | ✅ | ❌ |
| Removes source files | ❌ | ❌ | ✅ |
| Safe for append-only archives | ✅ | ❌ | ❌ |
Use copy for backups, uploads, and any scenario where you want to add data without risking deletion. Only reach for sync when you specifically need the destination to mirror the source exactly.
Common Pitfalls
| Pitfall | Consequence | Prevention |
|---|---|---|
| Using copy when you need a mirror | Stale files accumulate at the destination | Use sync for exact mirrors |
| No --log-file in cron jobs | Silent failures go unnoticed | Always log automated copies |
| Forgetting --progress on large transfers | No visibility into transfer status | Add -P for interactive sessions |
| Source path typo | Copies wrong directory or creates empty destination | Verify paths with rclone ls first |
| Default 4 transfers too slow | Under-utilizes fast connections | Raise --transfers to 8 or 16 on fast links |
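The path-typo pitfall can also be caught in scripts with a small guard that refuses to copy from a missing or empty source. A minimal sketch (the preflight_ok name and the paths are hypothetical):

```shell
# preflight_ok DIR — succeed only if DIR exists and contains at least one entry,
# so a mistyped source can't silently produce an empty destination
preflight_ok() {
    [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

# Usage — the remote path here is an assumed example:
# preflight_ok /var/www/html && rclone copy /var/www/html backup-s3:my-bucket/www --progress
```

This complements the rclone lsd preflight shown in the backup script above: one check validates the remote, the other the source.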
Examples with Output
1. Copy a Local Directory to a Remote
Upload your project files to cloud storage. Command:
rclone copy ./my-project gdrive:backups/my-project --progress
Output:
Transferred: 10.5 MiB / 10.5 MiB, 100%, 2.1 MiB/s, ETA 0s
Transferred: 150 / 150, 100%
Elapsed time: 5.2s
2. Copy a Single File and Rename It
rclone copy always treats the destination as a directory, so renaming a file in transit requires the copyto command instead.
Command:
rclone copyto /tmp/report.pdf remote:archives/2023-report.pdf
Output:
(No output on success unless -v is used)
3. Parallel Transfer Enhancement
Speed up the transfer of many small files by increasing parallel workers. Command:
rclone copy ./assets remote:bucket/assets --transfers 16 --checkers 32
Output:
2024/01/15 12:00:00 INFO :
Transferred: 1.2 GiB / 1.2 GiB, 100%, 45 MiB/s, ETA 0s
Transferred: 2500 / 2500, 100%
4. Copy Only New Files
Use a dry-run to see which files would be added to the destination. Command:
rclone copy /src /dest --dry-run -v
Output:
2024/01/15 12:00:00 NOTICE: newfile.txt: Skipped copy as --dry-run is set
2024/01/15 12:00:00 NOTICE:
Transferred: 1 / 1, 100%
5. Limit Bandwidth to Save Network Resources
Cap the upload speed so the transfer doesn't crowd out other traffic. Command:
rclone copy /large-video gdrive:videos --bwlimit 2M --progress
Output:
Transferred: 50.2 MiB / 500 MiB, 10%, 2 MiB/s, ETA 3m45s