
I created a PowerShell script to remove all files and folders older than X days. It works fine and the logging is also OK, but because PowerShell is a bit slow, it can take quite some time to delete these files and folders when large quantities have to be processed.

My question: How can I have this script run on multiple directories ($Target) at the same time?

Ideally, we would like to run this as a scheduled task on a Windows Server 2008 R2 machine, with an input file (txt, csv) into which new target locations can be pasted.

Thank you for your help/advice.

The script

#================= VARIABLES ==================================================
$Target = "\\share\dir1"
$OlderThanDays = 10
$Logfile = "$Target\Auto_Clean.log"
#================= BODY =======================================================
# Set start time
$StartTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
Write-Output "`nDeleting folders that are older than $OlderThanDays days:`n" | Tee-Object $LogFile -Append 
Get-ChildItem -Directory -Path $Target | 
 Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$OlderThanDays) } | ForEach-Object {
 $Folder = $_.FullName
 Remove-Item $Folder -Recurse -Force -ErrorAction SilentlyContinue
 $Timestamp = (Get-Date).ToShortDateString()+" | "+(Get-Date).ToLongTimeString() 
 # If folder can't be removed
 if (Test-Path $Folder)
 { "$Timestamp | FAILLED: $Folder (IN USE)" } 
 else
 { "$Timestamp | REMOVED: $Folder" } 
 } | Tee-Object $LogFile -Append # Output folder names to console & logfile at the same time 
# Set end time & calculate runtime
$EndTime = (Get-Date).ToShortDateString()+", "+(Get-Date).ToLongTimeString()
$TimeTaken = New-TimeSpan -Start $StartTime -End $EndTime
# Write footer to log
Write-Output ($Footer = @"
 Start Time : $StartTime
 End Time : $EndTime
 Total runtime : $TimeTaken
$("-"*79)
"@)
# Append footer to logfile
Out-File -FilePath $LogFile -Append -InputObject $Footer
# Clean up variables at end of script
$Target=$StartTime=$EndTime=$OlderThanDays = $null
asked May 12, 2014 at 12:12

1 Answer


One way to achieve this would be to write an "outer" script that passes the directory to be cleaned into the "inner" script as a parameter.

For your "outer" script, have something like this:

# Read one target directory per line from the DirList file next to this script
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
 # Launch a separate PowerShell process per directory (asynchronous by default)
 Start-Process -FilePath powershell.exe -ArgumentList ('-File "{0}\InnerScript.ps1" -Path "{1}"' -f $PSScriptRoot, $Directory);
}
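
The DirList file referenced above is just a plain text file with one target path per line, which also covers the input file you asked about. A hypothetical example (paths are illustrative):

\\share\dir1
\\share\dir2
\\share\archive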

Note: Using Start-Process kicks off a new process that is, by default, asynchronous. If you use the -Wait parameter, then the process will run synchronously. Since you want things to run more quickly and asynchronously, omitting the -Wait parameter should achieve the desired results.

Invoke-Command

Alternatively, you could use Invoke-Command to kick off a PowerShell script, using the parameters -FilePath, -ArgumentList, -ThrottleLimit, and -AsJob. The Invoke-Command command relies on PowerShell Remoting, so that must be enabled, at least on the local machine.
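
As a rough sketch only (InnerScript.ps1 and DirList are the hypothetical names from the example above, and PowerShell Remoting is assumed to be enabled on the local machine):

$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
 # -AsJob returns immediately and runs the script as a background job
 Invoke-Command -ComputerName localhost -FilePath "$PSScriptRoot\InnerScript.ps1" -ArgumentList $Directory -AsJob;
}
# Wait for every job to finish and collect the output each one produced
Get-Job | Wait-Job | Receive-Job;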


Add a parameter block to the top of your "inner" script (the one you posted above), like so:

param (
 [Parameter(Mandatory = $true)]
 [string] $Path
)

That way, your "outer" script can pass in the directory path, using the -Path parameter for the "inner" script.
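
Inside the inner script, $Path then replaces the hard-coded $Target (for example, $Logfile = "$Path\Auto_Clean.log"), and a single target can be tested by hand like this (script path shown for illustration only):

.\InnerScript.ps1 -Path "\\share\dir1"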

answered May 12, 2014 at 13:53

7 Comments

Thank you Trevor. I was also looking at Start-Job, which seems a bit similar. Is there also a way to limit it to only 3 or 4 directories being processed at the same time? Otherwise it might kill the server.
Check out the Invoke-Command command. It has the -AsJob and -ThrottleLimit parameters. These should help you control script execution. Keep in mind that Invoke-Command relies on PowerShell Remoting, so that has to be enabled at least locally.
Oops - disregard. The -ThrottleLimit parameter only throttles the systems that a script is being remotely deployed to. It doesn't throttle the objects that are being inputted to it.
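
A rough way to get that local throttling with Start-Job (InnerScript.ps1 and DirList are the hypothetical names used above; the limit of three is arbitrary) would be to poll the running-job count before starting the next one:

$MaxJobs = 3;
$DirectoryList = Get-Content -Path $PSScriptRoot\DirList;
foreach ($Directory in $DirectoryList) {
 # Block until fewer than $MaxJobs background jobs are still running
 while (@(Get-Job -State Running).Count -ge $MaxJobs) { Start-Sleep -Seconds 5 }
 Start-Job -FilePath "$PSScriptRoot\InnerScript.ps1" -ArgumentList $Directory;
}
# Wait for the remaining jobs and collect their output
Get-Job | Wait-Job | Receive-Job;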
Maybe it's even better with regards to load balancing to have the script run remotely with Enter-PSSession and use variables for server name and local path. But then I have to rewrite it completely and I need to know if PowerShell is a requirement to run this remotely.
Yes Trevor, on remote with Invoke-Command it runs in only 22 seconds instead of 30 minutes for over 10,000 files. Big win there I would say :) Thanks again for your help.
