
rclone ls

rclone ls lists all files recursively at the given path, showing each file's size (in bytes) and full relative path. It's the most common inspection command: the equivalent of ls -lR, but for any rclone remote.

Quick Summary

ls is recursive by default and shows files only (not directories). Use lsd for directories, and lsf for customizable output formats.

Basic Syntax

rclone ls REMOTE:PATH [flags]

# List all files in a remote bucket
rclone ls remote:my-bucket

# List files under a specific prefix/path
rclone ls remote:my-bucket/backups/2024

# List local directory
rclone ls /var/www/html

Output Format

Each line shows: SIZE PATH

    1234 documents/report.pdf
   56789 images/photo.jpg
     456 config/settings.json
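Because the size column is plain bytes, the listing pipes cleanly into standard text tools. A minimal sketch that sorts a listing largest-first; the printf sample stands in for real rclone ls output:

```shell
# Sort a listing by size, largest first. In practice, pipe
# `rclone ls remote:bucket` in place of the printf sample.
printf '%s\n' \
  '    1234 documents/report.pdf' \
  '   56789 images/photo.jpg' \
  '     456 config/settings.json' |
sort -rn -k1,1
```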

Key Flags

| Flag | Description |
| --- | --- |
| --max-depth N | Limit recursion depth |
| --include PATTERN | Only show files matching pattern |
| --exclude PATTERN | Hide files matching pattern |
| --min-size SIZE | Only show files larger than SIZE |
| --max-size SIZE | Only show files smaller than SIZE |
| --min-age DURATION | Only show files older than DURATION |
| --max-age DURATION | Only show files newer than DURATION |

Practical Examples

Inspect a Remote Bucket

# See everything in a bucket
rclone ls remote:my-bucket

# See only the top level (no deep recursion)
rclone ls remote:my-bucket --max-depth 1

Find Large Files

# List files larger than 100 MB
rclone ls remote:backups --min-size 100M

Filter by File Type

# List only SQL dump files
rclone ls remote:backups --include "*.sql.gz"

# List only images
rclone ls remote:media --include "*.{jpg,png,gif,webp}"

Find Recent Files

# Files modified in the last 24 hours
rclone ls remote:uploads --max-age 24h

# Files older than 30 days
rclone ls remote:archive --min-age 30d

Verify Backup Contents

# Count files and check sizes after a copy/sync
rclone ls remote:backups/www | wc -l
rclone ls remote:backups/www | awk '{sum += $1} END {printf "%.2f GB\n", sum/1073741824}'
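To see where the bytes are going, the same listing can be aggregated per top-level directory with awk. A sketch on sample data; in practice, pipe real rclone ls remote:backups/www output in place of the printf lines:

```shell
# Sum listing sizes per top-level directory, largest first.
# Replace the printf sample with real `rclone ls` output.
printf '%s\n' \
  '1000 logs/app.log' \
  '2000 logs/error.log' \
  ' 500 html/index.html' |
awk '{split($2, parts, "/"); sum[parts[1]] += $1}
     END {for (d in sum) print sum[d], d}' |
sort -rn
```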

ls vs lsd vs lsf vs lsl

| Command | Shows | Recursive | Directories | Format |
| --- | --- | --- | --- | --- |
| ls | Files only | ✅ | ❌ | SIZE PATH |
| lsd | Directories only | ❌ (1 level) | ✅ | MODIFIED_DATE DIR |
| lsf | Files and/or dirs | Configurable | Configurable | Customizable |
| lsl | Files only | ✅ | ❌ | SIZE MODIFIED_DATE PATH |
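Since lsl adds a modification timestamp (SIZE DATE TIME PATH), its output can be sorted by date to find the most recently changed file. A sketch on sample lsl-style lines; pipe real rclone lsl output in their place:

```shell
# Pick the most recently modified file from lsl-style output:
# columns are SIZE DATE TIME PATH, and ISO dates sort lexically.
printf '%s\n' \
  '1234 2024-05-01 10:00:00.000000000 old/report.txt' \
  '5678 2024-06-01 09:30:00.000000000 new/photo.jpg' |
sort -k2,3 | tail -n 1 | awk '{print $4}'
```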

Common Pitfalls

| Pitfall | Consequence | Prevention |
| --- | --- | --- |
| Running ls on a bucket with millions of files | Extremely slow, high API costs | Use --max-depth 1 or --include to narrow scope |
| Expecting directories in output | ls only shows files | Use lsd for directories |
| No size units | Sizes are in raw bytes, hard to read | Pipe through awk or use rclone size |
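For a quick human-readable total, the byte column can be summed and scaled in a single awk pass (sample data below; rclone size remote:bucket reports the same figure directly):

```shell
# Sum the byte column and print it in MiB. Pipe real
# `rclone ls remote:bucket` output in place of the sample.
printf '%s\n' \
  '10485760 backup.tar.gz' \
  ' 5242880 dump.sql.gz' |
awk '{sum += $1} END {printf "%.1f MiB\n", sum / 1048576}'
```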

Examples with Output

1. List all files in a bucket

Get a recursive view of every file in your storage. Command:

rclone ls gdrive:backups

Output:

    1234 archive.tar.gz
   56789 logs/app.log
     456 metadata.json

2. Limit listing depth

See only the files in the top two levels. Command:

rclone ls remote:data --max-depth 2

Output:

   10000 level1.txt
    5000 folder1/level2.txt

3. Filter for large files

Identify files taking up significant space. Command:

rclone ls remote:bucket --min-size 1G

Output:

10737418240 giant_backup.iso

4. Search for specific file types

List only files that match a pattern. Command:

rclone ls remote:bucket --include "*.png"

Output:

  150000 images/logo.png
   80000 assets/icon.png

5. Check for recently modified files

List files modified in the last hour. Command:

rclone ls remote:bucket --max-age 1h

Output:

     1024 current_status.txt