sizes: better disk usage reporting in Terminal
I’ve come up with a lot of ways to see what’s taking up space in my directories from Terminal over the years. I’ve finally built one that covers all the little niggles I’ve had in the past.
Let’s start with the other ways you can do this.
du
Since we’re talking about disk usage, the obvious choice is du, the “disk usage” command. To see the filesize of every file in the directory, you can run du -sh *. The -h switch tells it to output human-readable sizes, so it looks like:
4.0K test.rb
4.0K token.js
8.0K utils
1.2M webexcursions
This is pretty close to what I want, but it can’t be sorted by size. Also, du reports in 512-byte blocks rather than exact byte counts, so if you’re interested in accurate readings on files under 4KB, it won’t do it.
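To see that block-based rounding for yourself, compare du against a byte-accurate reading on a tiny file. This is just an illustration (the filename is arbitrary, stat -f%z is the BSD/macOS form, and the rounded figure depends on your filesystem’s allocation size):
printf 'hello' > tiny.txt   # a 5-byte file
du -sh tiny.txt             # likely reports 4.0K, a whole allocation block
stat -f%z tiny.txt          # actual size in bytes: 5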
ls
You can also use ls -l to list all files along with their file sizes (and a whole bunch of other info). You can sort by size with -S (or -Sr for reverse order), and -h works here too to show human-readable size formats. So that’s closer to what I want, but there’s a whole bunch of irrelevant info, as well as the fact that ls isn’t going to report the total size of directories (all the files they contain added together) the way du will.
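For reference, combining those switches looks like this (directory lines will still only show the size of the directory entry itself, not its contents):
ls -lSh     # long listing, sorted by size with the largest files first, human-readable sizes
ls -lShr    # same, but reversed so the biggest files end up at the bottom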
ncdu
I also have to mention ncdu, an ncurses utility that’s excellent for exploring disk usage. It’s overkill for what I want, but worth checking out (and available via Homebrew, brew install ncdu).
My Solution
You have no reason to recall this, but I’ve tried to solve this in the past. I wrote a bash function called sizeup that would do the trick. It’s super slow, though, and does things the hard way. So I decided to put ls and du together with some of my own sorting and formatting to get fast filesize info. I call it sizes.
Installation
I have the script posted in this gist. Save that file in your path, name it sizes, and make it executable (chmod a+x sizes).
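As a quick sketch of those steps, assuming you saved the gist as ~/Downloads/sizes and keep personal scripts in ~/bin (both paths are just examples, use whatever directory is in your own $PATH):
mv ~/Downloads/sizes ~/bin/sizes
chmod a+x ~/bin/sizes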
Usage
To use it, just run sizes. You can optionally pass it a directory, e.g. sizes ~/Desktop, and it will operate there. And for whatever reason, I added help to it, so sizes -h will show you the obvious lack of other options.
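In other words:
sizes              # report on the current directory
sizes ~/Desktop    # report on a specific directory
sizes -h           # show the help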
The script will output a listing of all of the files in the current directory with sizes in bytes, kilobytes, megabytes, etc., calculated to 2 decimal places. It includes hidden files and reports actual sizes of directories (without traversing them). Output is colorized, with colors ascending from blue to red based on file size, and filenames colorized to indicate regular files, directories, and hidden files.
On any directory containing under 20GB it’s quite fast. Large directories can take a while to calculate, but you’d have the same delay using du directly.
How It Works
The script starts with an ls -l listing (actually ls -lSrAF) of the directory, using Ruby regex to extract the size (in bytes) and filename from the output.
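The script itself lives in the gist, but the general shape of that step looks something like this minimal Ruby sketch (the regex and variable names are my own illustration, not the actual code):
# Minimal sketch: turn `ls -lSrAF` output into [bytes, filename] pairs.
listing = `ls -lSrAF`.split("\n")

entries = listing.map do |line|
  # A typical `ls -l` line: perms, links, owner, group, size, month, day, time/year, name
  if line =~ /^[-dl]\S*\s+\d+\s+\S+\s+\S+\s+(\d+)\s+\S+\s+\d+\s+\S+\s+(.+)$/
    [$1.to_i, $2]
  end
end.compact
Thanks to -Sr, the entries come back already ordered from smallest to largest.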
Then it detects directories, which will be insufficiently reported by ls, and passes them to du to get the block-based filesize of the combined contents. It multiplies the block count by 512 to get as close to an accurate byte reading as possible. (It should be noted that the GNU coreutils version of du has a -b switch that will report in bytes. I wanted to make this without additional dependencies.)
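Again as a rough sketch of that step rather than the gist’s exact code:
# Sketch: `ls -l` only reports the size of a directory entry itself, so for
# directories ask `du -s` for the total block count and convert blocks to bytes.
def dir_size_in_bytes(path)
  blocks = `du -s "#{path}"`.split(/\s+/).first.to_i
  blocks * 512 # BSD du counts 512-byte blocks by default
end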
The sizes are humanized, colorized, sorted, and output with the filenames and a total for the directory at the bottom.
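The humanizing is simple enough to sketch as well; this is my own two-decimal version, not necessarily the exact formatting the script uses:
# Sketch: convert a raw byte count to a human-readable string with two decimal places.
def humanize(bytes)
  units = %w[B KB MB GB TB]
  size = bytes.to_f
  unit = 0
  while size >= 1024 && unit < units.length - 1
    size /= 1024
    unit += 1
  end
  format('%.2f %s', size, units[unit])
end

humanize(1_234_567) # => "1.18 MB"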
I could easily be wrong, but I assume there are other people like me who want to find the space hogs without decoding too much output or loading up DaisyDisk. If so, enjoy the script.