Tuesday, August 29, 2023

Remotely Check Disk Space, Find Large Files and Remove Old Files - PowerShell

For a key client, I manage over 200 SQL Server hosts, all running as VMware virtual machines, hosting more than 500 SQL instances and over 5,000 databases. Aside from about 50 large, business‑critical databases, I have limited visibility into the rest; their purpose, supported applications, workloads, and user groups are often unknown.

This is largely due to the organization’s policy: they “buy” rather than build applications. Most software solutions come bundled with the hardware they purchase, often including multiple interfaces: one for IT administrators, one for employees, and another for customers.

The only reason I’m familiar with the large and critical databases is because they naturally draw attention whenever there’s an incident or outage.

Regarding SQL Server storage configuration, I suspect most DBAs, myself included, leave the default settings: auto‑growth enabled with no maximum file size. While exceptions exist, this configuration is common practice.

One example from my past exemplifies why that’s not always ideal. I once supported a highly transactional publisher database in a complex replication topology, serving over a thousand users entering data continuously during business hours. To maintain DML performance, the database had minimal indexing, and all read workloads were offloaded to replicated databases hosted nationwide and internationally.

In this setup, large transactions posed significant risk because they could clog the replication stream and disrupt business operations. To mitigate that, we set a hard 100 MB limit on the transaction log file. Any oversized transaction would immediately fail, a deliberate safeguard for a system generating more than a billion dollars in profit.

In my current environment, such constraints aren’t necessary, so all databases still use the default auto‑growth settings with unlimited size. This works fine, until a disk runs out of space.

We closely monitor SQL Servers for uptime, availability, and performance, including disk usage. Our disk‑space alert rule is straightforward:

% free space falls below 15% AND the amount of free space is below 100 GB


This dual condition minimizes false positives. For example, a 10 TB drive at 5% free space (500 GB remaining) won’t raise an alert, as adequate space remains. This threshold generally provides enough lead time to act before an outage occurs.
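As a rough sketch of how that dual condition could be evaluated in PowerShell (the 15% and 100 GB thresholds come from the rule above; the variable names are mine, not part of our monitoring tool):

# Flag only drives that fail BOTH conditions: under 15% free AND under 100 GB free
$disks = Get-WmiObject Win32_LogicalDisk -Filter "DriveType = 3"   # 3 = local fixed disk
foreach ($disk in $disks) {
    $freeGB      = $disk.FreeSpace / 1GB
    $percentFree = $disk.FreeSpace / $disk.Size * 100
    if ($percentFree -lt 15 -and $freeGB -lt 100) {
        "{0} {1:n0} GB free ({2:n0}% free) - ALERT" -f $disk.DeviceID, $freeGB, $percentFree
    }
}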

On average, I receive about five disk‑space alerts per week, each requiring prompt investigation and resolution. Fortunately, standardized SQL Server configurations and consistent disk layout conventions have helped us maintain overall stability and predictability in this environment:


Drive Letter    Purpose
C:              OS/system drive
D:              SQL Server software/binaries
K:              Data files
L:              Transaction log files
O:              SQL Server database backups
T:              SQL Server temp files

If the FILESTREAM feature is enabled, it’s typically placed on a separate drive, since those files tend to grow much faster in our environment.

When a disk space alert email arrives for one or more SQL Servers, I can usually make an educated guess about the cause just by looking at the drive letter. Over time, you develop an instinct for the usual suspects.

This write‑up isn’t a detailed troubleshooting guide, but here are a few examples:

O Drive (Database Backups):

  • Backup jobs may be failing or not deleting old backups as intended.
  • A large, one‑time transaction could have generated an oversized log backup and temporarily increased nightly differential backup sizes until the next full backup.
  • Significant ongoing data changes can inflate both transaction log and differential backup sizes.
  • Databases may have grown substantially, increasing backup sizes.
  • New databases might have been restored or populated with large data loads.

L Drive (Transaction Logs):

  • Transaction log backups may be failing or not running at all.
  • A large, single transaction could have caused excessive log growth. Since logs aren’t automatically reclaimed, that space remains allocated until manually reduced.

K Drive (Data Files):

  • The database may be expanding organically faster than anticipated.
  • New databases could have been added.

In some cases, it helps to reach out to the application teams to understand recent changes or unusual activity.

Before connecting to the SQL Server or logging in via RDP, I typically run a PowerShell script that generates a quick disk space summary and lists the largest files on each drive. The script also identifies file types (data, log, or backup), their corresponding SQL instance, and database. I can usually pinpoint the cause and resolve it within five minutes, all without directly logging into the server.

Common fixes include:

  • Adding more disk space when databases have legitimately grown or new ones have been added.
  • Shrinking a large transaction log back to normal size after a one‑off growth event.
  • Deleting outdated or unnecessary backup files.
  • Repairing or restarting failed backup or maintenance jobs.
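
For the second item above (shrinking a transaction log after a one‑off growth event), here’s a minimal sketch using Invoke-Sqlcmd from the SqlServer module; the instance name, database, logical log file name, and the 1,024 MB target size are placeholders, not values from my environment:

# Shrink an oversized transaction log back to a target size (1024 MB) after a one-off growth event
Invoke-Sqlcmd -ServerInstance 'SQLServer1\INST1' -Database 'TESTDB1' `
    -Query "DBCC SHRINKFILE (N'TESTDB1_log', 1024);"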


Now, on to the PowerShell script itself. Here are its parts:

Get the disk space info:

Remember to change the value for $computer_fqdn

This assumes that your currently logged-on user account has the necessary permissions on the remote computer. If not, you can provide different credentials by adding "-Credential (Get-Credential)" to the Get-WMIObject command.

# Report capacity, free space, and percent free for each fixed local disk on the remote computer
Get-WMIObject  -ComputerName $computer_fqdn Win32_LogicalDisk -ErrorVariable ErrVar -ErrorAction SilentlyContinue | 
    Where-Object {$_.MediaType -eq 12} |   # MediaType 12 = fixed hard disk
    Select-Object __SERVER,
                 @{n='DriveLetter';e={$_.Name}}, 
                 VolumeName,   
                 @{n='Capacity (Gb)' ;e={"{0:n0}" -f ($_.size/1gb)}},
                 @{n='FreeSpace (Gb)';e={"{0:n0}" -f ($_.freespace/1gb)}}, 
                 @{n='PercentFree';e={[Math]::round($_.freespace/$_.size*100)}} | 
    Format-Table -AutoSize 

Caveat: the above PowerShell command currently does not report mount points.
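
If you do use mount points, one workaround (an assumption on my part, not part of the original script) is to query Win32_Volume instead, which reports volumes mounted to folders as well as lettered drives:

# Win32_Volume includes volumes mounted to folders, not just drive letters
Get-WMIObject -ComputerName $computer_fqdn Win32_Volume -ErrorAction SilentlyContinue |
    Where-Object {$_.DriveType -eq 3} |   # 3 = local fixed disk
    Select-Object Name,
                  @{n='Capacity (Gb)' ;e={"{0:n0}" -f ($_.Capacity/1gb)}},
                  @{n='FreeSpace (Gb)';e={"{0:n0}" -f ($_.FreeSpace/1gb)}},
                  @{n='PercentFree'   ;e={[Math]::round($_.FreeSpace/$_.Capacity*100)}} |
    Format-Table -AutoSize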


Results:







Finding Files on a Remote Computer:

More precisely, this script searches for files on a specific disk drive rather than the entire computer. To scan multiple drives, you’ll need to run the script separately for each one. You can also specify a folder or subfolder path if you’d prefer not to search from the root of the drive.

The script is designed to locate the largest files, with the threshold defined by the variable $filter_by_size. You can further refine your search by:

  • Setting a file name wildcard using $filter_by_name
  • Filtering by the last modified date using $filter_by_last_modified_before_days

In the example below, the script searches the O:\ drive for the five largest files whose names match *FULL*.BAK, are larger than 100 MB, and were last modified more than 8 days ago.

You can adjust any of the filter values to suit your needs. To disable a filter, simply comment it out by placing a hash sign (#) at the beginning of that line. Finally, make sure to update the $computer_name variable with your target server name. You can also include multiple server names, separated by commas, for example:


$computer_name = @("Server1", "Server2", "Server3")

# Name of the remote computer
$computer_name = @("SQLServer1")
 
# Drive letter or a subfolder on a drive to search for the large sizes
$remote_folder = 'O:\'   

# Add optional filters
$filter_by_name = '*FULL*.BAK'
$filter_by_size = 100*1024*1024    # 100*1024*1024 = 100MB
$filter_by_last_modified_before_days = -8 # Note the minus (-) sign

$top_n = 5  # Limit the results to the top n files by size
# Make sure the $filter_by_last_modified_before_days is a negative value
if($filter_by_last_modified_before_days -gt 0) {$filter_by_last_modified_before_days = $filter_by_last_modified_before_days * -1}

# Set the filters to default values if not already Set by the caller
if($top_n -eq $null -or $top_n -eq 0 -or $top_n -eq '') {$top_n=50}
if($filter_by_name -eq $null -or $filter_by_name -eq '') {$filter_by_name='*'}
if($filter_by_size -eq $null -or $filter_by_size -eq '') {$filter_by_size=0}
if($filter_by_last_modified_before_days -eq $null -or $filter_by_last_modified_before_days -eq '') 
{$filter_by_last_modified_before_days=0}


# Let's get the FQDN for each remote computer
$computer_fqdn = @()
foreach($computer in $computer_name){$computer_fqdn += @([System.Net.Dns]::GetHostEntry($computer).HostName)}


$large_files = @(Invoke-Command -computername $computer_fqdn -ArgumentList $remote_folder, $filter_by_name, $filter_by_size, $top_n  `
               -scriptBlock {Get-ChildItem $Using:remote_folder -Filter $Using:filter_by_name -Recurse -Force -ErrorAction SilentlyContinue | 
                             where-object {$_.length -gt $Using:filter_by_size} | 
                             Sort-Object length -Descending | 
                             select   fullname, 
                                      @{Name="Size(GB)";Expression={[Math]::round($_.length / 1GB, 2)}}, 
                                      @{Name="Size(MB)";Expression={[Math]::round($_.length / 1MB, 0)}}, 
                                      LastWriteTime `
                             -First $Using:top_n
                            }  
                )
# Display
$large_files | sort -Property 'Size(MB)'  -Descending | ft PSComputerName, 'Size(MB)', 'Size(GB)', LastWriteTime, FullName

#$files_to_delete = $large_files | where-object {$_.LastWriteTime -lt ((Get-date).AddDays($filter_by_last_modified_before_days))} | select FullName

$large_files_count = $large_files.Count
"Number of files: $large_files_count"


Download the above PowerShell script from GitHub at:

Find Largest Files on a Remote Server


Deleting Files Remotely:

Finally, let’s cover how to delete files on a remote computer.

To remove a single file, you can use PowerShell’s Invoke-Command with Remove-Item inside the script block:

Invoke-Command -ComputerName '<Computer Name>' -ScriptBlock {Remove-Item '<FileName>'}

To delete multiple files, list them directly in the Remove-Item command: 

Remove-Item 'O:\MSSQL\BACKUPS\FULL\TESTDB1.BAK', 'O:\MSSQL\BACKUPS\FULL\TESTDB2.BAK'


Alternatively, you can pass the file paths as an array, which is often cleaner when handling many files:


Remove-Item @('O:\MSSQL\BACKUPS\FULL\TESTDB1.BAK', 'O:\MSSQL\BACKUPS\FULL\TESTDB2.BAK')


You can also use wildcards, as shown in this Microsoft example: 

Remove-Item * -Include *.doc -Exclude *1*
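
When using wildcards like this, it’s a good idea to preview what would be removed first; adding the -WhatIf switch lists the matching files without deleting anything:

# Preview only: shows which files would be deleted, but removes nothing
Remove-Item * -Include *.doc -Exclude *1* -WhatIf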


However, I usually first use the previous PowerShell script to find the files on the remote computer that meet my criteria for deletion, store the results in the $large_files variable, and then use the following PowerShell code to delete them.

$files_to_delete = $large_files
foreach($file_to_delete in $files_to_delete)
{
    Invoke-Command -ArgumentList $file_to_delete.PSComputerName, $file_to_delete.FullName -ComputerName $file_to_delete.PSComputerName -ScriptBlock {get-item $Using:file_to_delete.FullName}
#    Invoke-Command -ArgumentList $file_to_delete.PSComputerName, $file_to_delete.FullName -ComputerName $file_to_delete.PSComputerName -ScriptBlock {remove-item $Using:file_to_delete.FullName}
}

For safety reasons, I have commented out the line that actually deletes the files, so you have the opportunity to review the output before uncommenting it and re-executing the foreach block.


We can’t always delete files just because we think they’re no longer needed. Sometimes it’s unclear whether certain files are truly redundant or still required. In those cases, instead of deleting them, I prefer to compress the files using NTFS file compression. It compresses them in place, transparently, and operates in online mode, meaning there’s no need to stop SQL Server services before applying it.  


As far as I know, enabling NTFS compression through PowerShell is a bit tricky. While the Set-ItemProperty cmdlet can modify file and folder attributes such as ReadOnly or Archive, it doesn’t support toggling the Compressed attribute. Instead, I use the compact.exe command from within PowerShell, like this:

compact /C <File Name>
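
To compress a file on a remote server, the same command can be wrapped in Invoke-Command (a sketch; the computer name and path are just examples):

# Apply NTFS compression in place to a remote file using compact.exe
Invoke-Command -ComputerName 'SQLServer1' -ScriptBlock {
    compact /C 'O:\MSSQL\BACKUPS\FULL\TESTDB1.BAK'
}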

PowerShell does come with a Compress-Archive cmdlet that compresses files into Zip format.

Invoke-Command -ComputerName '<ComputerName>' -ScriptBlock {
    $compress = @{
        Path             = "O:\dbbackup.bak"
        CompressionLevel = "Fastest"
        DestinationPath  = "O:\dbbackup.bak.zip"
    }
    Compress-Archive @compress
}

But now you still have the original file taking up the disk space. Once you verify that the Zip file was successfully created, you can then manually delete the original file.
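
For example, something along these lines (a sketch, reusing the paths from the example above):

# Delete the original backup only after confirming the Zip archive exists
Invoke-Command -ComputerName '<ComputerName>' -ScriptBlock {
    if (Test-Path 'O:\dbbackup.bak.zip') {
        Remove-Item 'O:\dbbackup.bak'
    }
}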



