rclone md5sum
rclone md5sum prints MD5 hashes for every file at the given path. The output matches the standard md5sum utility, so it can be fed to standard verification tools such as md5sum -c.
Use md5sum for quick integrity checks. Not all backends support MD5 natively; use rclone hashsum for maximum flexibility, or rclone sha1sum where SHA-1 is preferred.
Basic Syntax
rclone md5sum REMOTE:PATH [flags]
# Generate MD5 sums for all files in a bucket
rclone md5sum remote:my-bucket
# Generate for a specific directory
rclone md5sum remote:my-bucket/backups/2024
# Save to file for later verification
rclone md5sum remote:my-bucket > checksums.md5
Output Format
d41d8cd98f00b204e9800998ecf8427e documents/report.pdf
e4d909c290d0fb1ca068ffaddf22cbd0 config/settings.json
098f6bcd4621d373cade4e832627b4f6 data/export.csv
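Each line pairs a 32-character hex digest with a file path, which makes the list easy to consume programmatically. A minimal Python sketch of a parser (the sample lines are the ones shown above):

```python
# Parse md5sum-style output: "<32-hex-digest>  <path>" on each line.
sample = """\
d41d8cd98f00b204e9800998ecf8427e  documents/report.pdf
e4d909c290d0fb1ca068ffaddf22cbd0  config/settings.json
"""

def parse_md5sums(text):
    """Return {path: digest} from md5sum-style output."""
    sums = {}
    for line in text.splitlines():
        if not line.strip():
            continue
        # Split on the first run of whitespace; the rest is the path,
        # so paths containing spaces are preserved.
        digest, path = line.split(None, 1)
        sums[path.strip()] = digest
    return sums

result = parse_md5sums(sample)
print(result)
```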
Key Flags
| Flag | Description |
|---|---|
| `--download` | Download files and compute the hash locally (for backends without server-side MD5) |
| `--include PATTERN` | Only hash files matching the pattern |
| `--exclude PATTERN` | Skip files matching the pattern |
Practical Examples
Verify Backup Integrity
# Generate checksums of source
rclone md5sum /var/www/html > /tmp/source.md5
# Generate checksums of backup
rclone md5sum remote:backups/www > /tmp/backup.md5
# Compare
diff /tmp/source.md5 /tmp/backup.md5
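diff only works when both lists happen to be in the same order; if ordering can differ between runs, comparing the two lists as path-to-hash maps is more robust. A Python sketch of that comparison (the file names and hashes here are made-up placeholders; in practice you would read /tmp/source.md5 and /tmp/backup.md5):

```python
# Compare two md5sum-style checksum lists without relying on line order.
source_list = """\
e4d909c290d0fb1ca068ffaddf22cbd0  index.html
d41d8cd98f00b204e9800998ecf8427e  robots.txt
"""
backup_list = """\
d41d8cd98f00b204e9800998ecf8427e  robots.txt
ffffffffffffffffffffffffffffffff  index.html
"""

def load(text):
    """Parse md5sum-style lines into {path: digest}."""
    out = {}
    for line in text.splitlines():
        if line.strip():
            digest, path = line.split(None, 1)
            out[path] = digest
    return out

src, bak = load(source_list), load(backup_list)
missing = sorted(src.keys() - bak.keys())   # files absent from the backup
changed = sorted(p for p in src.keys() & bak.keys() if src[p] != bak[p])
print("missing:", missing)
print("changed:", changed)
```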
Hash Specific File Types
# Only hash database dumps
rclone md5sum remote:backups --include "*.sql.gz"
Save Checksums with Backup
# Generate and upload checksums alongside the backup
rclone md5sum remote:backups/daily/$(date +%Y-%m-%d) > /tmp/checksums.md5
rclone copyto /tmp/checksums.md5 remote:backups/daily/$(date +%Y-%m-%d)/checksums.md5
Backend Support
| Backend | Native MD5 | Notes |
|---|---|---|
| S3 / Wasabi | ✅ | MD5 stored as ETag (for non-multipart uploads) |
| Google Drive | ✅ | MD5 always available |
| OneDrive | ❌ | Uses QuickXorHash instead |
| Dropbox | ❌ | Uses its own hash format |
| SFTP | ❌ | Only if the server provides an md5sum binary; otherwise use --download |
| Local | ✅ | Computed locally |
If MD5 is not supported by your backend, use rclone hashsum with a supported hash type, or add --download to compute hashes locally (slower but always works).
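With --download, rclone streams each file and hashes it client-side; the per-file computation is conceptually the same as this local sketch (the 1 MiB chunk size is an arbitrary choice for illustration, not what rclone uses internally):

```python
import hashlib
import os
import tempfile

def md5_of_file(path, chunk_size=1 << 20):
    """Stream a file through MD5 in chunks, so arbitrarily large
    files are hashed without loading them fully into memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a throwaway local file.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"hello rclone")
digest = md5_of_file(tmp.name)
os.unlink(tmp.name)
print(digest)
```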
Common Pitfalls
| Pitfall | Consequence | Prevention |
|---|---|---|
| Backend doesn't support MD5 | Empty hash output | Use hashsum with a supported type, or --download |
| Multipart uploads on S3 | ETag is not a simple MD5 | Use rclone check instead for S3 verification |
| Large datasets | Slow; every file must be read and hashed | Filter with --include to narrow scope |
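The multipart pitfall is detectable: S3 gives multipart uploads an ETag of the form <hex>-<part count>, while a single-part ETag is a plain 32-hex-character MD5. A small sketch of that check (the sample ETags are made up):

```python
import re

def is_plain_md5_etag(etag):
    """True if an S3 ETag looks like a plain MD5 (exactly 32 hex chars).
    Multipart ETags carry a '-<part count>' suffix and are NOT directly
    comparable with md5sum output."""
    return re.fullmatch(r"[0-9a-f]{32}", etag.strip('"').lower()) is not None

single = is_plain_md5_etag('"d41d8cd98f00b204e9800998ecf8427e"')    # single-part upload
multi = is_plain_md5_etag('"3858f62230ac3c915f300c664312c11f-9"')   # multipart upload
print(single, multi)
```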
Examples with Output
1. Generate MD5 sum for a single file
Get the hash of a specific object for verification. Command:
rclone md5sum remote:backups/db.sql
Output:
b3d8c8e8d8c8e8d8c8e8d8c8e8d8c8e8 db.sql
2. Batch hash all files in a folder
Create a checksum list for a directory of files. Command:
rclone md5sum gdrive:photos/2023
Output:
d41d8cd98f00b204e9800998ecf8427e IMG_001.jpg
e4d909c290d0fb1ca068ffaddf22cbd0 IMG_002.jpg
3. Save hashes to a local file
Export checksums to match the standard md5sum utility format.
Command:
rclone md5sum remote:bucket > remote_files.md5
Output:
(No output; remote_files.md5 created with hashes and paths)
4. Force local hash computation
Compute hashes client-side when the backend cannot supply MD5 server-side. Command:
rclone md5sum remote:bucket --download
Output:
(If not natively supported, rclone downloads and hashes each file)
5. Filter files before hashing
Only calculate checksums for critical file types. Command:
rclone md5sum remote:bucket --include "*.enc"
Output:
a94a8fe5ccb19ba61c4c0873d391e987 secrets.enc