My MacBook Ran Out of Storage Mid-Deadline. So I Built the Tool That Fixed It Forever.

A personal story about invisible disk hogs, a year-long secret weapon, and 14.5GB recovered in under 2 minutes.

Chetan Chinchulkar · InfoSec Write-ups · ~7 min read · April 4, 2026

The Docker Build That Never Happened

It was 45 minutes before I needed to demo a project. I hit build. My Mac beachballed. Then came an error I'd never seen before — not a code error, not a dependency conflict. Storage. I had run out of disk space so completely that Docker couldn't even start building an image.

On a 256GB MacBook Air M1, that moment hits differently. There's no easy upgrade path. There's no "just add more storage." You sit there, demo clock ticking, wondering how your machine went from fine to full without a single obvious warning.

I'd felt the slowness creeping in for weeks — the beachball spinning at inopportune moments, apps taking longer to open, builds dragging. I kept dismissing it. I'll clean it up this weekend. I never did. And then: deadline pressure, no Docker build, and a very uncomfortable few minutes scrambling to free up space fast enough to matter.

That was the moment I stopped treating storage as an afterthought.

The Problem Nobody Warns You About

Here's the thing about running out of disk space as a developer: the culprit is almost never what you expect. It's not your photos. It's not your videos. It's the invisible tax of being someone who builds things.

Every Node.js project you've started — including the ones you abandoned after two hours — has a node_modules folder sitting there, quietly occupying anywhere from 200MB to over 1GB. Every Python project has a virtual environment. Every build pipeline generates artifacts. Every framework has its cache. And they compound. Fast.

I had projects from two years ago I'd completely forgotten about. Tutorial repos I followed along with once and never touched again. Side ideas that never made it past day one. Every single one of them had its own full copy of dependencies sitting on my disk.

The worst part? A lot of these were invisible. If you use .venv as your Python virtual environment name (which I do, as a convention), macOS Finder won't even show it to you by default. Dotfiles are hidden. You can't see what you can't see. I'd open Storage in System Settings and see the generic "Developer" category eating up dozens of gigabytes, with no way to drill down into exactly what the problem was. It was maddening.

My Embarrassing Manual Process

Before I built anything, I had a process. I'm going to share it because I think many developers will recognize themselves in it.

My first instinct after the Docker incident was to reach for dust — a du replacement written in Rust that gives you a visual breakdown of what's occupying space. If you haven't used it, it's genuinely excellent. I ran it from my home directory and finally got a picture of where all my gigabytes were going.

The answer, at first glance, was caches. Application caches, framework caches, build caches. So I deleted them. Freed up a couple of gigabytes. Felt good about it. Two weeks later, they were back.
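If you're curious what your own caches are holding right now, du will show you. The paths below are common macOS defaults (the system cache folder plus npm's and pip's caches), not an exhaustive list, so treat this as a rough sketch:

    # Largest application caches, sizes in megabytes (macOS default location)
    du -sm ~/Library/Caches/* 2>/dev/null | sort -n | tail -15

    # Common package-manager caches (default npm and pip locations)
    du -sh ~/.npm ~/Library/Caches/pip 2>/dev/null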
Caches are designed to rebuild themselves — that's the whole point. I'd treated the symptom, not the problem.

So I ran dust again, this time pointing it specifically at my projects folder. That's when node_modules started showing up — project after abandoned project, each one carrying hundreds of megabytes of dependencies I hadn't touched in months. Sometimes over a gigabyte per folder.

I started deleting those manually. But then I remembered I'd also worked on Python projects. I checked a few folders and didn't see anything obvious. Then I actually opened a project directory in the terminal and looked carefully — there it was, a .venv folder, completely invisible at a glance because macOS hides dotfiles by default. dust hadn't flagged it prominently. Finder didn't show it. It had just been sitting there, silently occupying space.

That's when I switched to the find command.

    find . -name "node_modules" -type d -maxdepth 3

Run it. Get a wall of output. Try to figure out which of these I could safely delete. Second-guess myself. Run another find for .venv. Then another for venv. Then remember build, dist, __pycache__, .cache... each one a separate command, a separate wall of output, a separate round of manual rm -rf.

The whole process would take hours. Not because it was technically hard — it wasn't. But it was tedious, error-prone, and incomplete by design. I always finished feeling like I'd probably missed something.

I did this every few months when things got bad enough. And every time, I told myself there had to be a better way.

The Pattern I Couldn't Unsee

One day — after probably my fourth or fifth round of this manual process — I noticed something obvious that I'd somehow never consciously registered before. It was always the same folders.

node_modules. .venv. venv. build. dist. __pycache__. .cache. target (for Rust projects).

The names were consistent. The problem was consistent. Only my response to it was inconsistent and manual.

I didn't need to think about what to find. I just needed something that could find all of them, show me what was there, let me decide what to delete, and do it safely.

I went looking for existing tools. I found some disk usage analyzers, storage cleaners, and GUI apps. But none of them gave me what I actually wanted: something that lived in my terminal, could be run from anywhere with a single command, presented the results interactively so I could check and uncheck things before committing, and had zero external dependencies so I could trust it completely.

I wanted a TUI. An interactive checklist in the terminal. Something I could make a symlink to and run from any directory, any time, without thinking about it.

Since it didn't exist the way I wanted it, I built it.

Building Cruft

I called it cruft — because that's what it finds: the accumulated junk that builds up silently over months of development work.

The design goals were simple and came directly from my own frustration with the manual process:

It had to be interactive. No more piping find output into rm -rf. Cruft presents everything it finds in a TUI with checkboxes. You see the folder, you see the size, you decide. Nothing gets deleted without your explicit say-so.

It had to be safe. Before any deletion happens, you get a confirmation step. The tool tells you exactly what it's about to remove and how much space it will free. You confirm. Only then does it act.

It had to be zero dependencies. I didn't want to install a Python package or a Homebrew formula, or have to manage an external runtime. Cruft is published as an npm package, but the tool itself has no runtime dependencies.

    npm i -g cruft-cleaner

And you're done.

It had to live in the terminal. I set it up as a symlink so I can type cruft from anywhere on my system. It searches from that directory downward, shows me everything it finds, and I'm done in minutes.

The categories it hunts (all of which are configurable): node_modules, Python virtual environments (both venv and .venv), build directories, dist folders, package caches, and more — the same list of usual suspects that used to eat hours of my time.
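If you just want a one-off sweep without installing anything, the manual hunt can be collapsed into a single shell command. This is a rough sketch, not what Cruft does internally: it only lists matching directories with their sizes in megabytes, deletes nothing, and assumes your code lives somewhere like ~/projects, so adjust the path and patterns to fit your setup.

    # List the usual suspects with their sizes in MB; nothing is deleted.
    # Replace ~/projects with wherever your code actually lives.
    find ~/projects -type d \
      \( -name node_modules -o -name .venv -o -name venv \
         -o -name build -o -name dist -o -name __pycache__ \
         -o -name .cache -o -name target \) \
      -prune -exec du -sm {} + 2>/dev/null | sort -n

The -prune keeps find from descending into folders it has already matched, so nested node_modules directories aren't counted twice. Review the output before you rm -rf anything; names like build and dist are occasionally real source directories.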
The Secret Weapon Year

Here's something I haven't mentioned yet: I built Cruft about a year before I made it public. I used it quietly, just for myself, for twelve months. Every couple of weeks I'd run it, clear out whatever had accumulated, and go back to work. It became as routine as emptying the trash.

All that year, I went back and forth on whether I should share it. And I kept talking myself out of it. It's too simple. Someone's probably already built something better. Who would actually want this?

The thing that changed my mind was remembering what it felt like before I had it. The manual hours. The missed .venv folders. The Docker build that killed my deadline. If even one other developer avoids that afternoon because of Cruft, it was worth publishing.

14.5GB. Under 2 Minutes.

The first time I ran Cruft on my full system, it recovered 14.5GB in under 2 minutes. That same cleanup, done manually with find commands and careful folder-by-folder review, would have taken me the better part of an afternoon. Realistically, I would have gotten tired, missed things, and not even recovered the full amount.

14.5GB on a 256GB MacBook Air is not nothing. That's 5.6% of your total storage — recovered from folders that were doing absolutely nothing except sitting there.

Since then, I have run it regularly. A few minutes, a few gigabytes freed, and my machine stays healthy without me ever having to think about it.

Try It Yourself

If any of this sounds familiar — the sluggish builds, the mystery storage usage, the manual find commands — Cruft is now public.

    npm i -g cruft-cleaner

or, for one-time use (no installation needed):

    npx cruft-cleaner

Then just run cruft from any directory. It'll show you everything it finds. You pick what to delete. It handles the rest.

NPM: https://www.npmjs.com/package/cruft-cleaner
GitHub: https://github.com/Chetan-Chinchulkar/cruft

How much storage did you recover? And what would you want it to do that it doesn't do yet? Drop a comment — I read all of them, and the roadmap is genuinely shaped by what people actually need. Let's build this together.

Thanks for reading. If this was useful, a clap or two goes a long way — it helps other developers find it when they're Googling "where did my disk space go" at 11pm the night before a deadline.

#open-source #software-development #programming #javascript #developer-tools