
Performance Counter Script


This particular script started out as a project tasking.   The request was simple, if not vague: find out why our computers are so slow.  Rather than remoting into a bunch of random computers and then taking screen shots of Task Manager, I came up with the following script: Download

I’ll walk through some basic examples of what the parameters do, and then delve into how they do it.

Example 1:

.\LocalCounterScript.ps1 -TargetComputer TEST-BOX-1 -MonitorDuration 5 -SamplingRate 5

This example runs the script against a remote system for five minutes, taking one sample every five seconds.  We’re not looking for any specific processes here, so it’s going to capture from the master list embedded in the script itself.  When it finishes running, the script will not only dump a CSV report containing the highest (or lowest, where relevant) values, but it will also dump the raw data it collected.

Example 2:

.\LocalCounterScript.ps1 -TargetComputer TEST-BOX-2 -TargetProcess TaniumClient -logPath C:\Temp

This time, we’re still going into a remote system, but we’re specifically looking for the TaniumClient process.  While we are only capturing that one application process, the script will always capture basic system performance counters such as hard drive activity, memory usage, and CPU usage.  Also, we’ve redirected the log file to a new location.  Because we didn’t specify a monitor duration or frequency, it falls back to the script defaults of one minute duration with counters captured every two seconds.

Alternatively, you can run this script against a list of systems via a simple loop. Combined with the Start-Job cmdlet, you can quickly collect performance data from a wide range of systems, as in the sketch below.
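Here’s a rough sketch of that pattern; the computer list file, the script path, and the parameter values below are placeholders for whatever you use in your environment:

# Kick off one background job per computer (paths below are placeholders)
$computers = Get-Content -Path "C:\Temp\Computers.txt"
foreach ($computer in $computers) {
    Start-Job -Name "Counters-$computer" -ScriptBlock {
        param($target)
        & "C:\Scripts\LocalCounterScript.ps1" -TargetComputer $target -MonitorDuration 5 -SamplingRate 5
    } -ArgumentList $computer
}
# Wait for everything to finish and collect any console output
Get-Job -Name "Counters-*" | Wait-Job | Receive-Job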

 

But How Does It Work?

Starting with the parameter block, here are the basic items I’ve decided we would want to change:

param (
        [Parameter()]
        [string]$TargetComputer=$ENV:COMPUTERNAME,
        [Parameter()]
        [double]$monitorDuration = 1,
        [Parameter()]
        [int]$samplingRate = 2,
        [Parameter()]
        [string]$logPath = "\\testServer\uploads\synack\counterscript\",
        [Parameter()]
        [string]$targetProcess = "BLANK"  # I put this in here if you want to only track a specific process (i.e collecting Tanium Report data so you only want to monitor the TaniumClient process). It also collects the generic system ones
)

The first thing we might want to change is the target computer. Since I was frequently running this script against a computer I was already logged into, the default value is left as the local computer.  You can run this with “hostname” or even nothing at all, but because the script generates a report at the end, having the actual computer name matters.

Monitor duration and sampling rate are pretty much what they sound like: how long you want to monitor the computer and how frequently you check your counters.

Log Path is similarly self-explanatory.  It’s a root folder rather than a specific file name; since you might be running against several computers at a time, I didn’t want to hard-code a log file name.

Target Process is where you specify what process you’re trying to monitor. If you don’t specify anything, it will monitor everything on the master list I’ve hard coded into the script.  This was based on a list of all the processes we thought would significantly impact system performance.  If you do specify a process, it monitors that process as well as the basic system counters like hard drive, memory, and CPU usage.
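I won’t reproduce the embedded master list here, but the branch presumably boils down to something like this sketch (the process names are illustrative, and the exact way the script populates $ProcessList may differ):

# Hypothetical sketch of how -TargetProcess could feed the list of processes to monitor;
# the real script embeds its own hard-coded master list.
if ($targetProcess -ne "BLANK") {
    # Only monitor the single process the caller asked for
    $ProcessList = @($targetProcess)
}
else {
    # Otherwise fall back to the full "usual suspects" list (truncated example)
    $ProcessList = "TaniumClient", "CcmExec", "MsMpEng"
}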

# Variable Declaration
$masterCountersList = @() #this is the array that will hold the GenericCounters combined with the process specific counters so I can run them all at once
$startTime = (Get-Date -format "yyyy-MM-d_hhmmss")
$GenericCounters =
"\\$TargetComputer\PhysicalDisk(*)\% Idle Time",
"\\$TargetComputer\Memory\% committed bytes in use",
"\\$TargetComputer\Memory\Available MBytes",
"\\$TargetComputer\Memory\Free System Page Table Entries",
"\\$TargetComputer\Memory\Pool Paged Bytes",
"\\$TargetComputer\Memory\Pool Nonpaged Bytes",
"\\$TargetComputer\Memory\Pages/sec",
"\\$TargetComputer\Processor(_total)\% processor time",
"\\$TargetComputer\Processor(_total)\% user time" # these are all the counters that aren't process specific

This section is all of the basic counters I want to track whether I’m after a specific process or not.  Also, this sets our $startTime variable, which we’re using for logging purposes.

After the generic counters comes the massive list of counters we were tracking on a generic system scan.  I’m not going to paste them here, but they include everything from antivirus to Tanium client and SCCM.

# This turns the process names into full counter paths so we don't have to enter four of them for each process we add later down the line
function Return-CounterArray ($processName)
{
      $counters = @()
      $counters += "\\$TargetComputer\Process($processName)\Handle Count"
      $counters += "\\$TargetComputer\Process($processName)\Thread Count"
      $counters += "\\$TargetComputer\Process($processName)\Private Bytes"
      $counters += "\\$TargetComputer\Process($processName)\% Processor Time"
      return $counters
}

# This takes the generic counters and adds them to the master list along with all the process-based ones. I feel like this could probably be rolled up into the Return-CounterArray one, but it looks pretty like this
function Generate-Counters (){
    $allCounters = @()
    $allCounters += $GenericCounters
    $ProcessList | % {$allCounters += (Return-CounterArray $_)}
    return $allCounters
}

This section actually takes all of those processes in the list and generates the four different counters we want to track for each one. If I don’t use a function like this, we just end up with a colossal list in the script itself, and nobody wants that.
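To make that concrete, here’s what the expansion looks like with a made-up, three-entry $ProcessList (the real embedded list is much longer):

# Hypothetical short process list, for illustration only
$ProcessList = "TaniumClient", "CcmExec", "MsMpEng"
$masterCountersList = Generate-Counters
# With the 9 generic counters above, that's 9 + (3 x 4) = 21 counter paths, e.g.:
#   \\TEST-BOX-1\Process(TaniumClient)\Handle Count
#   \\TEST-BOX-1\Process(TaniumClient)\Thread Count
#   \\TEST-BOX-1\Process(TaniumClient)\Private Bytes
#   \\TEST-BOX-1\Process(TaniumClient)\% Processor Time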

# Actually start the script now
$ReportArray = @() # this holds all the generate report objects for later export
$TempArray = @() # this holds the objects while we figure out the highest value
Write-host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Starting report on $TargetComputer. Please be patient as the process begins."
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Generating master list based on $($GenericCounters.Count) System counters and $($ProcessList.count) Processes."
$masterCountersList = (Generate-Counters)
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Master list created with $($masterCountersList.count) items."
$maxSamples = [Math]::Round(($monitorDuration*60/$samplingRate), 0)	 #multiplies your monitor duration minutes by 60 and divides by your sampling interval. Rounds to 0 decimal places because Integers
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Will take $maxSamples samples over the course of $monitorDuration minutes."
$rawCounterDump = @()

# This actually goes and gets the counter information. Woot. 
$rawCounterDump = Get-Counter -Counter $masterCountersList -SampleInterval $samplingRate -MaxSamples $maxSamples -ComputerName $TargetComputer -ErrorAction SilentlyContinue

# This will export everything to BLG files so you can review them in Perfmon later if you'd like (gives a pretty line graph!) 
if ($logPath[-1] -ne "\") {$logPath += "\"}
$endTime = (Get-Date -format "yyyy-MM-d_hhmmss")
$blgDump = $logPath+"$TargetComputer-$endTime-RawData.blg"
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Dumping raw Perfmon data to $blgDump."

Now we’re getting into what actually does the work.  We’re using our Get-Date cmdlet to track time along the way, just in case something hangs. We’re also declaring our different arrays for holding information.  I also have it give me a count of the counters and processes, mostly for diagnostic purposes. If something looks off, it probably is.  This is also where we do a bit of math to tell the Get-Counter cmdlet how many samples we’re taking: a five-minute run sampled every five seconds works out to 5 * 60 / 5 = 60 samples. Since Get-Counter only accepts an integer, we round to 0 decimal places.  After the math and setup are complete, we get the counters from our target computer, log when we finish, and dump the raw data as a BLG file that you can open later in PerfMon.

# now that the raw data has already been exported, this chunk turns that raw data into an array for further processing.
$rawCounterDump | Export-Counter -Path $blgDump
$rawCounterDump.countersamples | % {
    $path = $_.Path
    $obj = new-object psobject -property @{
        ComputerName = $TargetComputer
        Counter = $path.Replace("\\$($TargetComputer.ToLower())","")
        Item = $_.InstanceName
        Value = [Math]::Round($_.CookedValue, 2)	
        DateTime = (Get-Date -format "yyyy-MM-d hh:mm:ss")
    }
    $TempArray += $obj
}
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - $($TempArray.count) total samples collected."

Here, we are taking that raw data and converting it to an object array that is easier to search later.   Again, this outputs some diagnostic information just in case something looks off during the conversion.

# This bit takes all the entries in TempArray, gets the unique counter names, finds all entries for that counter name, looks for the highest (or lowest where it matters) value, and then adds only the matching entry to the "highest value" report
$UniqueCounters = ($TempArray | select -Property Counter -Unique).counter
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - $($UniqueCounters.count) unique counters discovered"
foreach ($c in $UniqueCounters)
{
    $targetEntries = $TempArray | ? {$_.Counter -eq $c}
    if ($c -eq "\PhysicalDisk(*)\% Idle Time" -or $c -eq "\Memory\Available MBytes" -or $c -eq "\Memory\Pool Nonpaged Bytes") {$highValue = ($targetEntries | Measure-Object -Property Value -Minimum).Minimum}
    else {$highValue = ($targetEntries | Measure-Object -Property Value -Maximum).Maximum}
    $selectedEntry = $TempArray | ? {$_.Counter -eq $c -and  $_.Value -eq $highValue}
    if ($selectedEntry.count -gt 1) {$selectedEntry = $selectedEntry[0]}
    $ReportArray += $selectedEntry
}

In this specific case, we wanted the most “significant” value for each counter over the measured time period. For available memory, that would be the lowest number; for a process CPU usage counter, it would be the highest.  We find each uniquely named counter, gather every entry that has that name, and then find the most significant value for that name.  Once we have it, we save it to our reporting array.

# Generates a file name based on what you asked the script to do, and dumps it to a CSV for manager-ization later. 
if ($targetProcess -eq "BLANK") {$outLog = $logPath+"$TargetComputer-$startTime-to-$endTime-Results.csv"}
else {$outLog = $logPath+"$TargetComputer-$TargetProcess-$startTime-to-$endTime-Results.csv"}
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Writing report to $outLog."
$ReportArray | Export-Csv -Path $outLog -NoClobber -NoTypeInformation -Force
Write-Host -ForegroundColor Green "$(get-date -format hh:mm:ss) - Complete.`n"

This generates the log file based on the parameters you provided earlier. If you were monitoring a specific process, this will name the log file based on that.  Otherwise, it just uses the computer name. The date and time are stamped on as well for future reference, and a spreadsheet is generated for later processing by managers who like spreadsheets.   If your IT department is like ours, you probably already know which processes are killing your CPU cycles, but this will let your manager know that you’ve done due diligence.  I hope this helps, and it sure beats taking screen captures of Perfmon for hours on end.

Archiving AD Accounts with PowerShell


Problem:

I work for a small K-12 school district. Often when teachers retire, they return as substitutes. We are also required to maintain public records for seven years. For these reasons, we do not delete staff accounts immediately after their employment ends.

I had a process for archiving employees and reactivating old accounts, but it was time-consuming. I also wanted to ensure that procedures were followed by new staff members.  A few of the steps I take when archiving an account are setting Logon Workstation and Logon Hours, adding the account to an “Archived Users” group, removing all other groups, and, of course, disabling the account.

I knew I could automate this with PowerShell, but I am late to the game and didn’t really know where to start. I have adapted scripts for my needs, but I have not done a project like this from start to finish. In addition to automating some account management, I wanted to expand my PowerShell skills.

Solution:

I was lucky enough to attend Ignite 2018 and caught “Tools, tips and tricks from the SysAdmin field” by my friend Harjit (@hoorge). Towards the end of the session, around 17:20, he shows Active Directory Administrative Center (ADAC) and the awesome Windows PowerShell History Viewer. For those unfamiliar with this feature, every action you take in ADAC generates PowerShell code.

Active Directory Administrative Center

Copy and paste the code into PowerShell ISE and string each step of your process together. You will end up with a script like the one below:

Set-ADUser -Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com" -LogonWorkstations:"No-PC-for-You" -Replace:@{"logonHours"="0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0","0"}
Add-ADPrincipalGroupMembership -Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com" -MemberOf:"CN=Archived Users,OU=Security Groups,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com"
Set-ADObject -Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com" -Replace:@{'primaryGroupID'="2103"}
Remove-ADPrincipalGroupMembership -Confirm:$false -Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com" -MemberOf:"CN=Domain Users,CN=Users,DC=corp,DC=viamonstra,DC=com"
Set-ADAccountExpiration -DateTime:"12/29/2018 00:00:00" -Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com"

Not only do we have a starting point for a User Management script, but we are also getting a solid introduction to PowerShell and how it works. Taking this a step further, let’s look at how to replace the -Identity parameter with a variable. To accomplish this, I created two variables for the User Account, $Account and $AccountDetails. $Account asks for the user name. $AccountDetails uses Get-AdUser to gather details of the Account, specifically DistinguishedName.

$Account = Read-Host -Prompt 'Input the user name'
$AccountDetails = Get-ADUser $Account

At this point, we can now replace

-Identity:"CN=Mike,OU=Users,OU=ViaMonstra,DC=corp,DC=viamonstra,DC=com"

With

-Identity:$AccountDetails.DistinguishedName 

This small addition greatly increases the usefulness of this script. Since the Distinguished Name contains the full path, I do not need to worry about typos in OU names. Also, I can hand this script off to help desk staff to archive accounts without worrying that a step was missed. Below is the full script that I currently use when archiving accounts. You will notice that I added several more variables: $DenyHours, $ArchivedUsersGroup, and $DisableOU. I used Richard Siddaway’s blog post, https://richardspowershellblog.wordpress.com/2012/01/26/setting-a-users-logon-hours/, to declare $DenyHours. $ArchivedUsersGroup and $DisableOU are variables that I declared for my group and my OU for disabled accounts.

#Declare Logon Hours
[byte[]]$DenyHours = @(0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0)
#Declare OUs
$DisableOU = "OU=- Disabled Accounts,OU=Users,DC=ViaMonstra,DC=org"
#Declare Groups
$ArchivedUsersGroup = "CN=Archived Users,OU=Groups,DC=ViaMonstra,DC=org"
# Input Account and get Account details
$Account = Read-Host -Prompt 'Input the user name'
$AccountDetails = Get-ADUser $Account
#Disable Account
Disable-ADAccount -Identity:$AccountDetails.DistinguishedName
# Set Logon Restrictions
Set-ADUser -Identity:$AccountDetails.DistinguishedName -LogonWorkstations:"No-PC-for-You" -Replace:@{logonHours=$DenyHours}
# Add Archive Flag for Email Autoreply Rules
Set-ADUser -Identity:$AccountDetails.DistinguishedName -Office:"Archive"
# Add Account to 'Archived Users' group and Set as primary
Add-ADGroupMember -Identity:$ArchivedUsersGroup -Members:$AccountDetails
Set-ADObject -Identity:$AccountDetails -Replace:@{'primaryGroupID'="ChangeToYourGroupID"}
# Set Description to Disabled Date
Set-ADUser $AccountDetails -Description "Account Disabled on $(Get-Date -format 'd')"
# Remove From all the Groups
Get-ADGroup -Filter {name -notlike "*Archived Users*"}  | Remove-ADGroupMember -Members $AccountDetails.samaccountname -Confirm:$False 
# Move Account to Disabled Users OU
Move-ADObject -Identity:$AccountDetails.DistinguishedName -TargetPath:$DisableOU

To recap, I defined a procedure to archive accounts. Using ADAC, I captured the PowerShell commands for each step. Finally, I replaced static information such as Account Name, OUs, and Groups with variables. I hope that this gives you some ideas for managing user accounts with PowerShell in your environment.

Update – I added a screenshot of ADAC, changed the disable account to the cleaner “Disable-ADAccount” and fixed a couple of typos.

Have you heard about Get-WQLObject?




WQL is a part of SCCM administration whether you like it or not. Over time you may become quite savvy with writing WQL queries and appreciate that it mimics SQL just enough to allow some fancy collections.

You can also use the ‘Queries’ node under Monitoring to view, develop, and run these WQL queries outside of collection evaluation. This can be useful for getting some quick data in a one-off fashion at a point in time. Commonly, though, I need to interact with this data, such as when I’m remediating some issue via a script. As much fun as copy-pasting to Excel and importing a CSV is… I wrote a script instead!


The Meat:


What I’m doing at its core is a WMI query. That is the real meat here. So… here’s a basic function that takes in a WQL query and gives you the raw output.


$Query = @"
select distinct s.Name,
sw.ProductName,
sw.ProductVersion,
cbs.CNIsOnline,
os.Caption
from  SMS_R_System s
inner join SMS_G_System_INSTALLED_SOFTWARE sw on sw.ResourceID = s.ResourceId
inner join SMS_CollectionMemberClientBaselineStatus cbs ON cbs.ResourceID = s.ResourceId
inner join SMS_G_System_Operating_System os ON os.ResourceID = s.ResourceID
where sw.ProductName like "%7-Zip%"
and sw.ProductVersion NOT IN ("18.05.00.0","18.05")
order by sw.ProductVersion
"@

function Get-WQLObject {
    param(
        # WQL formatted query to perform
        [Parameter(Mandatory = $true)]
        [string]$Query,
        # SMS Provider to query against
        [Parameter(Mandatory = $true)]
        [string]$SMSProvider
    )
    $SiteCode = (Get-WmiObject -Namespace "root\sms" -ClassName "__Namespace" -ComputerName $SMSProvider).Name.Substring(5, 3)
    $Namespace = [string]::Format("root\sms\site_{0}", $SiteCode)
    Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query $Query
}

Get-WQLObject -SMSProvider 'SCCM.CONTOSO.COM' -Query $Query

This almost shouldn’t even be a function, but it is because I put ‘function’ in front of it with a name and a {.

So, if you run this you will get some data back, though it is a bit ugly honestly. 

Ugly

But hey, this is PowerShell. It is object oriented. I can dig into this object. I can expand cbs (Founders Canadian Breakfast Stout anyone?) and then I can expand CNIsOnline and I’ll get the value I truly care about. Same for sw, and then ProductName, and ProductVersion. But that sounds tedious.
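For example, digging values out of a single row by hand looks something like this (assuming $Results holds the output of the Get-WQLObject call above):

$Results = Get-WQLObject -SMSProvider 'SCCM.CONTOSO.COM' -Query $Query
# Each joined class (s, sw, cbs, os) comes back as a nested object you expand manually
$Results[0].s.Name
$Results[0].sw.ProductName
$Results[0].sw.ProductVersion
$Results[0].cbs.CNIsOnline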


The Potatoes:


Right, so I technically have what I need. But I want more! Fatten things up a bit, ya know? Couple scoops of potatoes. Disclaimer: This IS potatoes; it does slow things down a bit. Though to be fair, any large WQL query via PowerShell is going to be slow no matter what.


function Get-WQLObject {
    param(
        # WQL formatted query to perform
        [Parameter(Mandatory = $true)]
        [string]$Query,
        # SMS Provider to query against
        [Parameter(Mandatory = $true)]
        [string]$SMSProvider,
        # Optional PSCredential 
        [Parameter(Mandatory = $false)]
        [pscredential]$Credential
    )
    Begin {
        if ($PSBoundParameters.ContainsKey('Credential')) {
            $AddedDefaultParam = $true
            $PSDefaultParameterValues.Add("Get-WmiObject:Credential", $Credential)
        }
        $SiteCode = (Get-WmiObject -Namespace "root\sms" -ClassName "__Namespace" -ComputerName $SMSProvider).Name.Substring(5, 3)
        $Namespace = [string]::Format("root\sms\site_{0}", $SiteCode)
    }
    Process {
        $RawResults = Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query $Query
        $PropertySelectors = $RawResults | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name | ForEach-Object {
            $Class = $_
            $Properties = $RawResults.$Class | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name
            foreach ($Property in $Properties) {
                [string]::Format("@{{Label='{1}.{0}';Expression = {{`$_.{1}.{0}}}}}", $Property, $Class)
            }
        }
    }
    end {
        if ($AddedDefaultParam) {
            $PSDefaultParameterValues.Remove("Get-WmiObject:Credential")
        }
        $PropertySelector = [scriptblock]::Create($($PropertySelectors -join ','))
        $RawResults | Select-Object -Property $(. $PropertySelector)
    }
}

Starting to look a lot more like a function! You know it should be a function when it starts doing weird things that you don’t feel like typing out over and over. 


$PropertySelectors = $RawResults | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name | ForEach-Object {
    $Class = $_
    $Properties = $RawResults.$Class | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name
    foreach ($Property in $Properties) {
        [string]::Format("@{{Label='{1}.{0}';Expression = {{`$_.{1}.{0}}}}}", $Property, $Class)
    }
}

Remember how I said we could ‘dig into’ the objects? The snippet above generates a string (that we will join and turn into a scriptblock) to do just that. We are going to dynamically find our classes and properties by leveraging Get-Member and filtering out system properties. Those system properties don’t interest me in this context, so I’m filtering them out with "Where-Object { -not $_.Name.StartsWith('__') }", leaving me with the properties I care about.

Couple notes:

  •  I like [string]::Format. It is actually REALLY fast in terms of CPU time, and for me personally it makes things more ‘manageable.’ For some people it is just confusing. You can simply do $Potatoes = "$Var1"+'bacon'+"$Var2" or many other methods if you prefer. ($Var1 = 'Cheese' by the way; see the short comparison after this list.)
  • In our generated string I am escaping $_ by writing out `$_. This is so that we don’t actually expand $_ right now. I want to treat it as a string to be used later.
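To make that first note concrete, here’s a quick comparison of the two styles (the values are made up):

$Var1 = 'Cheese'
$Var2 = 'Potatoes'
# Plain concatenation / interpolation
$Snack1 = "$Var1" + 'bacon' + "$Var2"
# [string]::Format -- {0} and {1} are positional placeholders
$Snack2 = [string]::Format('{0}bacon{1}', $Var1, $Var2)
$Snack1 -eq $Snack2   # True: both are 'CheesebaconPotatoes'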

The output, based on the 7-Zip software query above, can be seen below and should look familiar if you’ve ever done some custom Select-Object with calculated properties.


@{Label='cbs.CNIsOnline';Expression = {$_.cbs.CNIsOnline}},
@{Label='os.Caption';Expression ={ $_.os.Caption}},
@{Label='s.Name';Expression = {$_.s.Name}},
@{Label='sw.ProductName';Expression ={$_.sw.ProductName}},
@{Label='sw.ProductVersion';Expression = {$_.sw.ProductVersion}}

To leverage this code in the way I am hoping to though, it cannot just be a string. It has to be a scriptblock that can be dot-sourced. Let’s do that!


 $PropertySelector = [scriptblock]::Create($($PropertySelectors -join ','))

Because we want to expand $_ in the context of $RawResults | Select-Object…, we need to execute the scriptblock. Simply passing $PropertySelector won’t work, so instead we will dot-source the scriptblock.


$RawResults | Select-Object -Property $(. $PropertySelector)

This will ‘execute’ our $PropertySelector statement that we generated. So {$_.s.Name} will actually be {$<instance of $RawResults that we are piping>.s.Name} which is what we want!
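If that still feels abstract, here’s a tiny standalone demo of the same pattern with made-up data (not part of the function):

# A nested object shaped roughly like one row of $RawResults
$row = [pscustomobject]@{ s = [pscustomobject]@{ Name = 'TEST-BOX-1' } }
# Build the calculated-property string; the backtick keeps $_ from expanding yet
$selectorString = "@{Label='s.Name';Expression={`$_.s.Name}}"
$selector = [scriptblock]::Create($selectorString)
# Dot-sourcing the scriptblock hands Select-Object the hashtable it expects
$row | Select-Object -Property $(. $selector)   # -> an object with an 's.Name' property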

And of course the output from the function now looks MUCH nicer.

Mmmm Potatoes!

The Gravy!


I know, I know. This has all been really dry. Enter… the gravy!

WQL… you’ve got that down, right? You know a good 200-300 WMI classes under the root\sms\site_<sitecode> namespace, including their properties, right? Yeah, me too.

What if we could leverage all those queries you have under your ‘Queries’ node in Monitoring?

$Gravy = 'DynamicParam'


function Get-WQLObject {
    param(
        # WQL formatted query to perform
        [Parameter(Mandatory = $true, ParameterSetName = 'CustomQuery')]
        [string]
        $Query,
        # SMS Provider to query against
        [Parameter(Mandatory = $true)]
        [string]
        $SMSProvider,
        # Optional PSCredential (unfortunately I can't figure out how to use this cred in the DynamicParam WMI queries without providing info outside the function)
        [Parameter(Mandatory = $false, ParameterSetName = 'CustomQuery')]
        [pscredential]
        $Credential
    )
    DynamicParam {
        if (($SMSProvider = $PSBoundParameters['SMSProvider'])) {
            $ParameterName = 'SCCMQuery'
            $RuntimeParameterDictionary = New-Object System.Management.Automation.RuntimeDefinedParameterDictionary
            $AttributeCollection = New-Object System.Collections.ObjectModel.Collection[System.Attribute]
            $ParameterAttribute = New-Object System.Management.Automation.ParameterAttribute
            $ParameterAttribute.Mandatory = $true
            $ParameterAttribute.ParameterSetName = 'ExistingQuery'
            $ParameterAttribute.HelpMessage = 'Specify the name of a query that already exists in your ConfigMgr environment'
            $AttributeCollection.Add($ParameterAttribute)
            $SiteCode = (Get-WmiObject -Namespace "root\sms" -ClassName "__Namespace" -ComputerName $SMSProvider).Name.Substring(5, 3)
            $Namespace = [string]::Format("root\sms\site_{0}", $SiteCode)
            $arrSet = Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query "SELECT Name FROM SMS_Query WHERE Expression not like '%##PRM:%'" | Select-Object -ExpandProperty Name
            $ValidateSetAttribute = New-Object System.Management.Automation.ValidateSetAttribute($arrSet)
            $AttributeCollection.Add($ValidateSetAttribute)
            $RuntimeParameter = New-Object System.Management.Automation.RuntimeDefinedParameter($ParameterName, [string], $AttributeCollection)
            $RuntimeParameterDictionary.Add($ParameterName, $RuntimeParameter)
            return $RuntimeParameterDictionary
        }
    }
    Begin {
        $SCCMQuery = $PsBoundParameters[$ParameterName]
        if ($PSBoundParameters.ContainsKey('Credential') -and -not $PSDefaultParameterValues.ContainsKey("Get-WmiObject:Credential")) {
            $AddedDefaultParam = $true
            $PSDefaultParameterValues.Add("Get-WmiObject:Credential", $Credential)
        }
        $SiteCode = (Get-WmiObject -Namespace "root\sms" -ClassName "__Namespace" -ComputerName $SMSProvider).Name.Substring(5, 3)
        $Namespace = [string]::Format("root\sms\site_{0}", $SiteCode)
        if ($PSCmdlet.ParameterSetName -eq 'ExistingQuery') {
            $Query = Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query "SELECT Expression FROM SMS_Query WHERE Name ='$SCCMQuery'" | Select-Object -ExpandProperty Expression
        }
    }
    Process {
        $RawResults = Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query $Query
        $PropertySelectors = $RawResults | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name | ForEach-Object {
            $Class = $_
            $Properties = $RawResults.$Class | Get-Member -MemberType Property | Where-Object { -not $_.Name.StartsWith('__') } | Select-Object -ExpandProperty name
            foreach ($Property in $Properties) {
                [string]::Format("@{{Label='{1}.{0}';Expression = {{`$_.{1}.{0}}}}}", $Property, $Class)
            }
        }
    }
    end {
        if ($AddedDefaultParam) {
            $PSDefaultParameterValues.Remove("Get-WmiObject:Credential")
        }
        $PropertySelector = [scriptblock]::Create($($PropertySelectors -join ','))
        $RawResults | Select-Object -Property $(. $PropertySelector)
    }
}

There we go! Now we are nice and bloated!

What have I done!!!

  • Added ParameterSets
  • Added DynamicParam
  • Justified creating a function by making this thing nice and ugly

We now have two ParameterSets, one of which is less obvious because it is introduced in the ‘DynamicParam’ block. 


$ParameterAttribute.ParameterSetName = 'ExistingQuery'

And then we do a bit of magic which allows us to tab-complete our existing Queries that you have in SCCM right now. (Note: I’m excluding those that require parameters because… yeah I don’t feel like writing in that logic right now)


$arrSet = Get-WmiObject -ComputerName $SMSProvider -Namespace $Namespace -Query "SELECT Name FROM SMS_Query WHERE Expression not like '%##PRM:%'" | Select-Object -ExpandProperty Name

So what does this do for me? 

Tab completion!

You now have tab completion based on the name of the queries in SCCM. You can execute all of your pre-existing queries, and even find them with tab-completion or by using ctrl+space.
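In practice the two parameter sets look like this (the query name below is just an example; tab completion will offer whatever exists in your own Queries node):

# Tab-complete (or Ctrl+Space) the -SCCMQuery value against your environment
Get-WQLObject -SMSProvider 'SCCM.CONTOSO.COM' -SCCMQuery 'All Systems'

# The original custom-query parameter set still works exactly as before
Get-WQLObject -SMSProvider 'SCCM.CONTOSO.COM' -Query $Query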

Ohhh buddy!

I did mention this above, but I will say it again. A slow query is still a slow query, and our calculated properties aren’t going to make it faster. This function adds some overhead to your query since we are also piping everything to a Select-Object after the fact. BUT it is pretty cool, right? Get those gears turning on DynamicParams!

Some good-to-mentions about the function:

  •  To use the -SCCMQuery parameter you need to supply -SMSProvider first. if (($SMSProvider = $PSBoundParameters['SMSProvider']))
  • Unfortunately, no -Credential param when you are using -SCCMQuery because I wasn’t able to figure out how to use the credentials from the parameter inside the DynamicParam which would be needed to get our list of expressions. Let me know if you figure it out!
  • It is on GitHub
  • I might have made some gross oversights and overcomplicated this

Cody Mathis
@CodyMathis123