
rclone md5sum

rclone md5sum prints MD5 hashes for every file at the given path. The output format matches that of the standard md5sum utility, so the results plug directly into existing verification workflows.

Quick Summary

Use md5sum for quick integrity checks. Note that not all backends support MD5 natively — use hashsum for maximum flexibility, or sha1sum where SHA-1 is preferred.

Basic Syntax

rclone md5sum REMOTE:PATH [flags]
# Generate MD5 sums for all files in a bucket
rclone md5sum remote:my-bucket

# Generate for a specific directory
rclone md5sum remote:my-bucket/backups/2024

# Save to file for later verification
rclone md5sum remote:my-bucket > checksums.md5

Output Format

d41d8cd98f00b204e9800998ecf8427e  documents/report.pdf
e4d909c290d0fb1ca068ffaddf22cbd0  config/settings.json
098f6bcd4621d373cade4e832627b4f6  data/export.csv
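Each line is the 32-character hex digest, two spaces, then the path relative to the scanned root. This is the same line format GNU md5sum emits, as a local sketch shows (the file path below is made up for the demo):

```shell
# Create a throwaway file and hash it with the standard utility; the output
# line has the same "<hash>  <path>" shape that rclone md5sum prints.
printf 'example contents' > /tmp/report.pdf
md5sum /tmp/report.pdf
```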

Key Flags

Flag                 Description
--download           Download each file and compute the hash locally (for backends without server-side MD5)
--include PATTERN    Only hash files matching PATTERN
--exclude PATTERN    Skip files matching PATTERN

Practical Examples

Verify Backup Integrity

# Generate checksums of source
rclone md5sum /var/www/html > /tmp/source.md5

# Generate checksums of backup
rclone md5sum remote:backups/www > /tmp/backup.md5

# Compare
diff /tmp/source.md5 /tmp/backup.md5
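diff only works if both listings are in the same order, and listing order can differ between backends. A safer sketch sorts both files by path first (the two checksum files are simulated here with stand-in content):

```shell
# Stand-in checksum files: same entries, different listing order.
printf '%s\n' 'd41d8cd98f00b204e9800998ecf8427e  a.txt' \
              'e4d909c290d0fb1ca068ffaddf22cbd0  b.txt' > /tmp/source.md5
printf '%s\n' 'e4d909c290d0fb1ca068ffaddf22cbd0  b.txt' \
              'd41d8cd98f00b204e9800998ecf8427e  a.txt' > /tmp/backup.md5
# Sort both by the path field before comparing:
sort -k2 /tmp/source.md5 > /tmp/source.sorted
sort -k2 /tmp/backup.md5 > /tmp/backup.sorted
diff /tmp/source.sorted /tmp/backup.sorted && echo "backup verified"
# prints: backup verified
```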

Hash Specific File Types

# Only hash database dumps
rclone md5sum remote:backups --include "*.sql.gz"

Save Checksums with Backup

# Generate and upload checksums alongside the backup
rclone md5sum remote:backups/daily/$(date +%Y-%m-%d) > /tmp/checksums.md5
rclone copyto /tmp/checksums.md5 remote:backups/daily/$(date +%Y-%m-%d)/checksums.md5
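Because the saved file uses the standard format, a restored copy can later be re-verified with plain md5sum -c run from the backup root (paths in the checksum file are relative to the path that was hashed; the directory and file below are stand-ins for this sketch):

```shell
# Simulate a restored backup plus the checksum file stored alongside it:
mkdir -p /tmp/restore-demo
printf 'dump contents' > /tmp/restore-demo/db.sql
( cd /tmp/restore-demo && md5sum db.sql > checksums.md5 )
# Later: verify every entry from the backup root.
( cd /tmp/restore-demo && md5sum -c checksums.md5 )
# prints: db.sql: OK
```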

Backend Support

Backend         Native MD5    Notes
S3 / Wasabi     Yes           MD5 stored as the ETag (for non-multipart uploads)
Google Drive    Yes           MD5 always available
OneDrive        No            Supports SHA-1 instead
Dropbox         No            Uses its own hash format
SFTP            No            Use --download to compute locally
Local           Yes           Computed locally
Tip: If your backend does not support MD5, use rclone hashsum with a supported hash type, or add --download to compute hashes locally (slower, but works everywhere).

Common Pitfalls

Pitfall                        Consequence                                Prevention
Backend doesn't support MD5    Empty hash output                          Use hashsum with a supported type, or --download
Multipart uploads on S3        ETag is not a plain MD5                    Use rclone check instead for S3 verification
Large datasets                 Slow: every file must be read and hashed   Filter with --include to narrow scope
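The first two pitfalls show up in the checksum file itself as entries whose hash field is missing or is not a plain 32-hex-digit MD5 (multipart S3 ETags carry a "-N" part-count suffix). A quick sanity check over a saved file, with fabricated contents for the sketch:

```shell
# Flag any entry whose first field is not a plain 32-hex MD5:
printf '%s\n' \
  'd41d8cd98f00b204e9800998ecf8427e  good.bin' \
  'deadbeefdeadbeefdeadbeefdeadbeef-4  multipart.bin' > /tmp/check.md5
grep -Ev '^[0-9a-f]{32}  ' /tmp/check.md5   # prints only the suspect lines
```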

Examples with Output

1. Generate MD5 sum for a single file

Get the hash of a specific object for verification. Command:

rclone md5sum remote:backups/db.sql

Output:

b3d8c8e8d8c8e8d8c8e8d8c8e8d8c8e8  db.sql

2. Batch hash all files in a folder

Create a checksum list for a directory of files. Command:

rclone md5sum gdrive:photos/2023

Output:

d41d8cd98f00b204e9800998ecf8427e  IMG_001.jpg
e4d909c290d0fb1ca068ffaddf22cbd0  IMG_002.jpg

3. Save hashes to a local file

Export checksums to match the standard md5sum utility format. Command:

rclone md5sum remote:bucket > remote_files.md5

Output:

(No output; remote_files.md5 created with hashes and paths)

4. Force local hashing with --download

Compute hashes client-side when the backend cannot supply MD5 server-side. Command:

rclone md5sum remote:bucket --download

Output:

(rclone downloads each file and computes its MD5 locally; slower, but works on any backend)

5. Filter files before hashing

Only calculate checksums for critical file types. Command:

rclone md5sum remote:bucket --include "*.enc"

Output:

a94a8fe5ccb19ba61c4c0873d391e987  secrets.enc