How to delete empty subfolders with PowerShell?

#1
I have a share that is a "junk drawer" for end-users. They are able to create folders and subfolders as they see fit. I need to implement a script that deletes files created more than 31 days ago.

I have that started with PowerShell. I need to follow up the file deletion script by deleting subfolders that are now empty. Because of the nesting of subfolders, I need to avoid deleting a subfolder that is empty of files but has a subfolder below it that contains a file.

For example:

- `FILE3a` is 10 days old. `FILE3b` is 45 days old.
- I want to clean up the structure by removing files older than 30 days and deleting empty subfolders.

C:\Junk\subfolder1a\subfolder2a\FILE3a
C:\Junk\subfolder1a\subfolder2a\subfolder3a
C:\Junk\subfolder1a\subfolder2B\FILE3b

Desired result:

- Delete: `FILE3b`, `subfolder2B` & `subfolder3a`.
- Leave: `subfolder1a`, `subfolder2a`, and `FILE3a`.

I can recursively clean up the files. How do I clean up the subfolders without deleting `subfolder1a`? (The "Junk" folder will always remain.)
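
For reference, the file-cleanup pass I have so far looks roughly like this (the 31-day cutoff and the C:\Junk path are just my setup; -WhatIf is left on as a dry run):

# Pass 1: remove files created more than 31 days ago, anywhere under the junk share
# (drop -WhatIf to actually delete)
$cutoff = (Get-Date).AddDays(-31)
Get-ChildItem -Path 'C:\Junk' -Recurse -Force |
    Where-Object { -not $_.PSIsContainer -and $_.CreationTime -lt $cutoff } |
    Remove-Item -Force -WhatIf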

#2
I would do this in two passes - deleting the old files first and then the empty dirs:

# Pass 1: delete files not written to in the last 31 days
Get-ChildItem -recurse | Where {!$_.PSIsContainer -and `
    $_.LastWriteTime -lt (get-date).AddDays(-31)} | Remove-Item -whatif

# Pass 2: delete directories that have no files anywhere beneath them
Get-ChildItem -recurse | Where {$_.PSIsContainer -and `
    @(Get-ChildItem -Lit $_.Fullname -r | Where {!$_.PSIsContainer}).Length -eq 0} |
    Remove-Item -recurse -whatif

This type of operation shows off the power of nested pipelines in PowerShell: the second command uses a nested pipeline to recursively determine whether a directory has zero files anywhere under it.

#3
In the spirit of the first answer, here is the shortest way to delete the empty directories:

ls -recurse | where {!@(ls -force $_.fullname)} | rm -whatif

The -force flag is needed for cases where the directories contain hidden folders, like .svn.
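
To illustrate with a hypothetical folder whose only content is a hidden .svn directory:

# Without -force the folder looks empty and would be deleted;
# -force reveals the hidden .svn folder, so it is kept.
@(ls 'C:\Junk\project').Count          # 0
@(ls -force 'C:\Junk\project').Count   # 1  (.svn)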

#4
Adding on to the last one:

while (Get-ChildItem $StartingPoint -Recurse | Where {!@(Get-ChildItem -Force $_.FullName)} | Test-Path) {
    Get-ChildItem $StartingPoint -Recurse | Where {!@(Get-ChildItem -Force $_.FullName)} | Remove-Item
}

This will keep looping so that it continues searching for and removing any empty folders under $StartingPoint, including folders that only become empty once their subfolders have been removed.
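
The snippet assumes $StartingPoint is already defined; for the share in the question it would simply be the root of the junk drawer (only folders under it are ever matched, so the root itself is never removed):

$StartingPoint = 'C:\Junk'   # subfolders of this path are the removal candidates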

#5
I needed some enterprise-friendly features. Here is my take.

I started with code from the other answers, then added a JSON file that holds the original folder list (including a file count per folder). It then removes the empty directories and logs them.



param (
    [switch]$Clear
)

# if you want to reload a previous file list
#$stat = ConvertFrom-Json ((gc dir-cleanup-filecount-by-directory.json) -join "`n")

if ($Clear) {
    $stat = @()
} elseif ($stat.Count -ne 0 -and (-not "$($stat[0].DirPath)".StartsWith($PWD.ProviderPath))) {
    Write-Warning "Path changed, clearing cached file list."
    Read-Host -Prompt 'Press -Enter-'
    $stat = @()
}

$lineCount = 0
if ($stat.Count -eq 0) {
    $stat = gci -Recurse -Directory | %{ # -Exclude 'Visual Studio 2013' # test in 'Documents' folder

        if (++$lineCount % 100 -eq 0) { Write-Warning "folder count $lineCount" }

        New-Object psobject -Property @{
            DirPath       = $_.FullName;
            DirPathLength = $_.FullName.Length;
            FileCount     = ($_ | gci -Force -File).Count;
            DirCount      = ($_ | gci -Force -Directory).Count
        }
    }
    $stat | ConvertTo-Json | Out-File dir-cleanup-filecount-by-directory.json -Verbose
}

$deleteListTxt = 'dir-cleanup-emptydirs-{0}-{1}.txt' -f ((date -f s) -replace '[-:]','' -replace 'T','_'), $env:USERNAME

$stat |
    ? FileCount -eq 0 |
    sort -Property @{Expression="DirPathLength";Descending=$true}, @{Expression="DirPath";Descending=$false} |
    select -ExpandProperty DirPath | #-First 10 |
    ?{ @(gci $_ -Force).Count -eq 0 } | %{
        Remove-Item $_ -Verbose # -WhatIf # uncomment to see the first pass of folders to be cleaned**
        $_ | Out-File -Append -Encoding utf8 $deleteListTxt
        sleep 0.1
    }

# ** - The list you'll see from -WhatIf isn't a complete list, because parent folders
#      might also qualify after the first level is cleaned. The -WhatIf list will
#      show the correct breadth, which is what I want to see before running the command.


#6
This worked for me.

$limit = (Get-Date).AddDays(-15)

$path = "C:\Some\Path"

Delete files older than the `$limit`:

Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force

Delete any empty directories left behind after deleting the old files:

Get-ChildItem -Path $path -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse

#7
To remove files older than 30 days:

get-childitem -recurse |
? {$_.GetType() -match "FileInfo"} |
?{ $_.LastWriteTime -lt [datetime]::now.adddays(-30) } |
rm -whatif

(Just remove the `-whatif` to actually perform the deletion.)

Follow up with:

get-childitem -recurse |
? {$_.GetType() -match "DirectoryInfo"} |
?{ $_.GetFiles().Count -eq 0 -and $_.GetDirectories().Count -eq 0 } |
rm -whatif

#8
This will sort subdirectories before parent directories, working around the empty nested directory problem.

dir -Directory -Recurse |
%{ $_.FullName} |
sort -Descending |
where { !@(ls -force $_) } |
rm -WhatIf


