Monday, October 5, 2015

Finding old files

One of the systems at work automatically backs up to one of the file servers, but unfortunately it doesn't follow its own rules. The backups are supposed to happen once a week and get purged after a year; the thousands and thousands of backup files on the server suggest that isn't working right. Fixing that side of it is on the network team, but removing all the old files is on me. The decision was that we don't need anything from before Jan 1, 2015. There are 50 devices that back up, and each one leaves a multi-file backup from each day, so there are far too many files to go through and remove by hand. I'm fairly new to PowerShell, but I knew I could whip up a quick script.

Finding old files is easy. Piping Get-ChildItem to Get-Member shows that the objects it returns have a "LastWriteTime" property.
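For example, this lists the properties on whatever Get-ChildItem returns (FileInfo and DirectoryInfo objects):

Get-ChildItem | Get-Member -MemberType Property

With that property in hand, grabbing anything that hasn't been written in the past year is a one-liner: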

Get-ChildItem | where LastWriteTime -le (Get-Date).AddYears(-1)
That'll work, but it's not really what I wanted. I need files that were written on or before a specific date, not just files more than a year old.

Get-ChildItem | where LastWriteTime -le '1/1/2015'
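A quick aside: the quoted date there is just a string, and PowerShell converts it to a datetime behind the scenes so it can compare against LastWriteTime. If I wanted to spell that out, an explicit cast does the same thing:

Get-ChildItem | where LastWriteTime -le ([datetime]'1/1/2015')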
Ok, that's better. I'm not really worried about specific files, since each backup is in its own folder, so I can just look at directories:
Get-ChildItem -Directory -Recurse | where LastWriteTime -le '1/1/2015'
That did it. I just get a list of directories back. Unfortunately I'm getting some false positives, presumably directories outside the actual backup folders and parent folders whose timestamps don't reflect what's inside them. Looks like I'll have to actually write a script.

cd 'Relevant Folder'

$directory = Get-ChildItem -Directory

foreach ($d in $directory) {
    # For each device folder, find backup directories older than the cutoff and delete them
    Get-ChildItem .\$d\backup -Directory -Recurse |
        where LastWriteTime -le '1/1/2015' |
        Remove-Item -Force -Recurse
}
There we go. That grabs all the device directories in the share, steps through each one, finds the backup directories that are too old, and sends them off to be deleted.
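Since -Force -Recurse is unforgiving, a dry run first is probably smart. Remove-Item supports -WhatIf, which just prints what it would delete without actually deleting anything. The same script with -WhatIf tacked on makes a decent sanity check:

cd 'Relevant Folder'

$directory = Get-ChildItem -Directory

foreach ($d in $directory) {
    # -WhatIf reports each directory that would be removed, but deletes nothing
    Get-ChildItem .\$d\backup -Directory -Recurse |
        where LastWriteTime -le '1/1/2015' |
        Remove-Item -Force -Recurse -WhatIf
}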
