Deleting Files Older Than X Days
on October 6, 2017
Today I'm sharing a handy tip that lets you easily delete old files recursively (i.e., inside a directory that may contain multiple levels of subdirectories).
I needed to do this on my hosting server at Siteground, to delete entries in ~/tmp where there was a lot of junk, including files as old as 2007. I was doing some spring cleaning, so to speak, and wanted to remove old junk to keep my account well-maintained and comfortably under my account limits. (Honestly, I was already well under the limit; I was just trying to minimize the usage numbers in my control panel. I just can't stop myself from doing optimizations - even micro-optimizations.)
My first instinct was to use "find" and pipe the results to "rm". However, piping to "rm" doesn't work the way you'd expect (the short of it is that rm only looks at its arguments and ignores stdin). You can still achieve the same thing using xargs, like so:
find /your/folder/path/ -mindepth 1 -mtime +200 | xargs rm
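One caveat with the xargs approach: a filename containing a space gets split into two separate arguments, so rm ends up trying to delete the wrong paths. Pairing find's -print0 with xargs -0 avoids that. Here's a minimal sketch using a hypothetical /tmp/find-demo scratch directory (the -mtime test is omitted so the freshly created file matches):

```shell
# Hypothetical scratch setup, just for demonstration.
mkdir -p /tmp/find-demo/sub
touch "/tmp/find-demo/sub/old log.txt"

# A plain `| xargs rm` would split "old log.txt" into two arguments.
# -print0 and -0 delimit names with NUL bytes instead, so spaces in
# filenames survive intact. -type f restricts the match to regular files.
find /tmp/find-demo -mindepth 1 -type f -print0 | xargs -0 rm
```

After this runs, the file is gone but the directories remain, since -type f excluded them from the match.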
However, I did some googling, and it turns out that you don't need the pipe, or xargs, or even rm for that matter. Using find to select files and then delete them must be such a common use case that find has the capability built in:
find /your/folder/path/ -mindepth 1 -mtime +200 -delete
-mindepth 1 means any file at depth 1 (i.e., directly inside the main folder) or deeper (i.e., files inside subdirectories of the main folder, no matter how deep) is fair game for deletion - as long as it matches the other criteria, of course.
-mtime +200 means "match files modified more than 200 days ago" (strictly speaking, more than 200 whole 24-hour periods ago). 200 days was the cutoff I used to clear out the contents of my ~/tmp folder; most of these were various log files.
-delete tells find to delete every file it matches. (Note that it will also try to delete matching directories, which fails unless they're empty; add -type f if you only want regular files removed.)
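If you want to see the -mtime test in action without waiting 200 days, GNU touch can backdate a file's modification time with its -d flag (BSD touch spells this differently). A small sketch with a hypothetical /tmp/mtime-demo directory:

```shell
# Hypothetical scratch setup.
mkdir -p /tmp/mtime-demo
touch /tmp/mtime-demo/recent.txt
# Backdate a second file by roughly 300 days (GNU touch syntax).
touch -d "300 days ago" /tmp/mtime-demo/ancient.txt

# Only the backdated file was modified more than 200 days ago,
# so only it matches and gets printed.
find /tmp/mtime-demo -mindepth 1 -mtime +200
```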
There is no "undo delete" when you're working in the terminal, though, so be careful with this new power.
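One habit that makes this safer: run the exact same find expression without -delete first (printing is find's default action), inspect the list, and only then append -delete. A sketch with a hypothetical /tmp/cleanup-demo scratch folder (again omitting -mtime so the fresh file matches):

```shell
# Hypothetical scratch setup.
mkdir -p /tmp/cleanup-demo
touch /tmp/cleanup-demo/stale.log

# Step 1: dry run - the same expression without -delete just prints
# what would be removed.
find /tmp/cleanup-demo -mindepth 1 -type f

# Step 2: once the list looks right, append -delete.
find /tmp/cleanup-demo -mindepth 1 -type f -delete
```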