WIDBA

WIDBA - The busiest DBA in Wisconsin, it's all I do and I do it well

Thursday, July 12, 2012

Using Powershell to Monitor Files


Just another quick script. A previous coworker had left me instructions to check about 70 folders each week to see if files had been placed in them.  I got through about 20 of them before the lazy gene started to kick in; by the time I would have finished checking them all by hand that first week, this script was done.  This particular script only looks in the root of each folder (skipping archive and other subfolders) for files that have been modified in the past 180 days.



$basepath = "\\mynetworkshare\ftpfiles\"
$path = Get-ChildItem -Path $basepath

# Only want files that have been modified in the past 180 days
$dt = (Get-Date).AddDays(-180)

# Loop through each folder and only look in that folder for a file (use -Recurse if you want to go deeper)
foreach($flder in $path)
{
    # Verify the $flder is a folder and not something else
    if((Test-Path $flder.FullName -PathType Container) -eq $true)
    {
        $flder.GetFiles() | Where-Object { $_.LastWriteTime -gt $dt } | Select-Object FullName, LastWriteTime
    }
}
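
Since the point of the weekly check is to have something to look back at, the same loop can be captured into a variable and exported instead of just scrolling by in the console. Below is a minimal sketch; the output path is only a placeholder, so point it wherever you keep your weekly reports.

# Collect the results instead of writing them straight to the console,
# then export them for the weekly report (output path is a placeholder)
$results = foreach($flder in $path)
{
    if(Test-Path $flder.FullName -PathType Container)
    {
        $flder.GetFiles() | Where-Object { $_.LastWriteTime -gt $dt } | Select-Object FullName, LastWriteTime
    }
}
$results | Export-Csv -Path "C:\temp\ftpfile_check.csv" -NoTypeInformation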

2 comments:

  1. Any reason in particular why you are not using Get-Childitem for this?

    You could shorten the command to something like this:
    Get-ChildItem -Recurse -Force | Where {$_.LastWriteTime -gt (Get-Date).AddDays(-180)} | Select FullName, LastWriteTime

    Replies
    1. Nope - the biggest thing in my case is I don't want to "recurse" each folder's folders. I only want to go one level deep from the root (due to archive folders, etc).

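Follow-up for anyone else who wants to avoid recursing: Get-ChildItem can still do the one-level-deep version discussed in the comments. A minimal sketch is below; it assumes PowerShell 3.0 or later for the -Directory and -File switches.

# One level deep only: enumerate the subfolders of the share, then list
# the files sitting directly in each subfolder (no deeper recursion)
$basepath = "\\mynetworkshare\ftpfiles\"
$dt = (Get-Date).AddDays(-180)
Get-ChildItem -Path $basepath -Directory |
    ForEach-Object { Get-ChildItem -Path $_.FullName -File } |
    Where-Object { $_.LastWriteTime -gt $dt } |
    Select-Object FullName, LastWriteTime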