PowerShell for Windows Admins


September 5, 2013  3:38 PM

Getting remote services

Richard Siddaway

Getting the services on a remote server is easy

Get-Service -ComputerName Exch10

Getting a set of services on a remote machine isn’t difficult

Get-Service -ComputerName Exch10 -Name "MSExchangeAB", "W32Time", "W3SVC"

OK, so what about the scenario with multiple servers?

Get-Service -ComputerName Exch10, server02

Multiple servers with a set of services just means adding the –Name parameter and the list of services

Get-Service -ComputerName Exch10, server02 -Name "W32Time", "W3SVC"

but you have to have the same list of services for each server.

Now let's get really picky and go for multiple servers where each one has a different set of services.

I thought about CSV files, but then how do you represent the list? Nested arrays – yuck.

I ended up with this approach. It's a bit messy but it works and is easily expandable and changeable

$servers = "server02", "exch10"
$server02_services = "BITS", "NtFrs", "MSMQ", "Kdc"
$exch10_services = "MSExchangeAB", "W32Time", "W3SVC"

foreach ($server in $servers){
  Get-Service -ComputerName $server -Name (Get-Variable -Name ($server + "_services")).Value |
  select @{N="Server"; E={$server}}, Status, Name, DisplayName
}

Create a list of servers. Create a list of services per server. Notice the naming convention for the variables.

Iterate over the servers using foreach. The server name comes from the foreach iteration variable. The list of services is retrieved with Get-Variable: substitute the server name into the variable name and use the Value property to get the services of interest. Use select to add the computer name to the output.

I don't use the *Variable cmdlets very often and this was a neat use of Get-Variable.
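An alternative, if you'd rather avoid the *Variable cmdlets altogether, is a hashtable keyed on the server name – a minimal sketch using the same server and service names:

# Map each server to its list of services
$serviceMap = @{
  'server02' = 'BITS', 'NtFrs', 'MSMQ', 'Kdc'
  'exch10'   = 'MSExchangeAB', 'W32Time', 'W3SVC'
}

foreach ($server in $serviceMap.Keys){
  Get-Service -ComputerName $server -Name $serviceMap[$server] |
  select @{N='Server'; E={$server}}, Status, Name, DisplayName
}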

September 5, 2013  2:46 PM

Discovering a user's OU

Richard Siddaway

Interesting question – how do you discover the OU in which an AD user is sitting? The Quest cmdlets were very helpful because they had a ParentContainer property. With the Microsoft cmdlets you have to do a bit of work

There are two places to look – the distinguished name and the canonical name

PS> $user = Get-ADUser -Identity Richard -Properties Canonicalname
PS> $user

CanonicalName : Manticore.org/Users/Richard
DistinguishedName : CN=Richard,CN=Users,DC=Manticore,DC=org
Enabled : True
GivenName : Richard
Name : Richard
ObjectClass : user
ObjectGUID : b94a5255-28d0-4f91-ae0f-4c853ab92520
SamAccountName : Richard
SID : S-1-5-21-3881460461-1879668979-35955009-1104
Surname :
UserPrincipalName : Richard@Manticore.org

Notice the different formats

The distinguished name is easiest

PS> ($user.DistinguishedName -split ",", 2)[1]
CN=Users,DC=Manticore,DC=org

Use -split on the DistinguishedName. Note the format of the split operation: ",", 2

It means split on a comma and give me two elements – one containing the data before the first comma and the second containing all the data after the first comma.
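You can see the count argument at work on any string:

PS> 'a,b,c,d' -split ',', 2
a
b,c,d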

The canonical name needs a bit more work

PS> $elements = $user.CanonicalName -split '/'
PS> $elements[0..($elements.Count - 2)] -join '/'
Manticore.org/Users

Split the canonical name on '/' and then recreate the string, dropping the last element.
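An alternative sketch using the string methods directly – same idea, dropping everything after the last '/':

PS> $cn = $user.CanonicalName
PS> $cn.Substring(0, $cn.LastIndexOf('/'))
Manticore.org/Users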


September 4, 2013  2:26 PM

Integer sizes save the day

Richard Siddaway

I was looking to convert some string data into integers – the data was in bytes and I needed it in GB. The value I needed was buried in the middle of a string, so I had to do some string processing. Some other conditions forced me to use the Parse method of the integer class.

Let's say I get a number like 2147483169 back in the data and I need to divide it by 1GB

£> [int]::Parse("2147483169") / 1gb
1.99999955389649

I’ve put the string value directly in here rather than the complicated string processing I was actually using.

The point came when I realised that I could have much bigger numbers than 2GB, so what was the maximum I could work with?

£> [int]::maxValue
2147483647
£> [int]::maxValue / 1gb
1.99999999906868

OK, that's not enough. The standard [int] is 32 bits and I know its big brother has 64 bits, so how big is big brother?

£> [int64]::maxValue
9223372036854775807
£> [int64]::maxValue / 1gb
8589934592

That's more than big enough, so I went with 64-bit integers.
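So the final form of the conversion is just a one-character change – a sketch, with the literal string standing in for the real string processing:

£> [int64]::Parse("2147483169") / 1gb
1.99999955389649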

This isn't just a developer-type thing because I was working with Exchange mailbox sizes. So the moral of the story is to think about the possible values you'll see in your data while you are coding, to avoid errors or failures when you are running the script.


September 3, 2013  2:24 PM

Import and Export PSSession

Richard Siddaway

A couple of cmdlets that I've not really used much before have just come into prominence for me to solve a problem I've been working on. These cmdlets are:

Import-PSSession
Export-PSSession

The two cmdlets work with commands in a remote session but have subtle differences in terms of what they actually do.

Import-PSSession

This cmdlet imports commands (cmdlets, functions and aliases) from a PSSession on a remote (or local) machine into your current PowerShell session. You must have a remoting session open to the machine from which you want to import, and that session must remain open while you are using the commands.

A temporary module is created to hold the imported commands. When you close PowerShell it is removed. Each command has an –AsJob parameter added.

The commands are imported into your session as functions and rely on remoting to access the remote machine. This means that you get inert objects back exactly as if you had run the command on the remote machine through a remoting session.
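A minimal sketch of the workflow – the computer name and module are illustrative:

# Open a remoting session and import its ActiveDirectory commands into this session
$s = New-PSSession -ComputerName server02
Import-PSSession -Session $s -Module ActiveDirectory -Prefix Rem

# The imported proxy function runs the command on the remote machine
Get-RemADUser -Identity Richard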

Export-PSSession

This cmdlet imports commands from a PowerShell remoting session BUT it saves them as a module on your local machine. You then have to import the module to use the commands. You will need a remoting session to the remote machine for the module to work – PowerShell will create one for you when you import the module if one doesn't exist.

The advantage is that you are importing a local module, which should be faster than using Import-PSSession each time.
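Again a sketch, with illustrative names:

# Save the remote commands to disk as a local module
$s = New-PSSession -ComputerName server02
Export-PSSession -Session $s -Module ActiveDirectory -OutputModule RemoteAD

# Later (even in a new PowerShell session) just import the saved module;
# the remoting session is recreated when the commands are first used
Import-Module RemoteAD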


September 2, 2013  3:28 PM

Get-Alias surprise

Richard Siddaway

On the principle that any day on which you learn something new is not a complete waste I stumbled over this today:

Get-Alias has a –Definition parameter

I’d always done this

Get-Alias | where Definition -eq 'select-string'

now I can save some typing

Get-Alias -Definition 'select-string'

Easy


September 2, 2013  11:29 AM

Winter Scripting Games–maybe?

Richard Siddaway

Did you enjoy the Summer Scripting Games? Do you want more?

We are exploring the possibility of running a Winter Scripting Games – see http://powershell.org/wp/forums/topic/winter-games-2013/

This will be a team event (minimum of 2) that would run late this year or early next year.

Interested?

Add a comment with your thoughts on the process and the timing to the thread at the above URL


September 1, 2013  1:56 PM

Get-Command trick

Richard Siddaway

A useful trick with Get-Command. Use the –ListImported parameter

Get-Command -ListImported -CommandType cmdlet

shows the cmdlets imported into your session

Get-Command -ListImported -CommandType function

shows the functions imported into your session

Get-Command -ListImported

shows cmdlets and functions

Useful for knowing what you've imported versus what's available.
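A quick way to compare the two views – a sketch, and the numbers will vary per machine:

(Get-Command -ListImported).Count
(Get-Command).Count

The first counts only the commands already in your session; the second, with PowerShell 3.0 module auto-discovery, counts everything it can find whether imported or not.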


September 1, 2013  4:57 AM

Find the key of a WMI class

Richard Siddaway

At least one property of a WMI class will be marked as a key. This information is held in the property's Qualifiers, which is a collection of data. The key is needed if you want to create an instance of the class. I've shown how to find the key with Get-WmiObject before but not with the new CIM cmdlets.

It's actually easier with the CIM cmdlets

$class = Get-CimClass -ClassName Win32_Process

foreach ($property in $class.CimClassProperties) {
  $property | select -ExpandProperty Qualifiers |
  foreach {
    if ($_.Name -eq 'key'){
      $property
    }
  }
}

Get the CIM class. Iterate over the properties and expand the Qualifiers of each property. Check the names of the qualifiers for 'key' and when you find it, output the property data.
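A more compact sketch of the same idea, relying on PowerShell 3.0 member enumeration over the Qualifiers collection:

$class = Get-CimClass -ClassName Win32_Process
$class.CimClassProperties | where { $_.Qualifiers.Name -contains 'key' }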


August 31, 2013  2:18 PM

Server Core Module

Richard Siddaway

On a Windows Server 2012 system you will find a ServerCore module with two cmdlets

Get-DisplayResolution
Set-DisplayResolution

On a full GUI system the cmdlets work

PS> Get-DisplayResolution
1366×768
1280×1024

And that's it! Not a lot, but it shows the basic information.

You set the resolution like this

Set-DisplayResolution -Width 1366 -Height 768

If you look inside the functions, both actually call the setres command!
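You can check that for yourself by looking at the function body:

PS> (Get-Command Set-DisplayResolution).Definition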


August 31, 2013  8:21 AM

Third Age of PowerShell

Richard Siddaway

We’re now firmly in the Third Age of PowerShell.

The First Age covered the betas and PowerShell 1.0

PowerShell was adopted by developers and admins (with a scripting background) who saw the need for better automation tools and went looking for them. Information was sketchy, and every new discovery of how to do something generated a blog post.

Exchange 2007 relied on PowerShell for some activities but most admins only used it when they had to and in a very begrudging way. The moans about functionality that was only available through PowerShell went on & on

while ($true){
  Write-Host "Why can't I use the GUI"
}

PowerShell was very niche with a relatively small number of (very vocal) supporters and was viewed as something that had to be used rather than a tool admins were comfortable with.

The Second Age started with the release of Windows Server 2008 R2 and PowerShell 2.0

Many of the functionality gaps were filled and PowerShell came of age. Microsoft made PowerShell support mandatory for all products – some did it better than others, which is still true today.

Admins began to sit up and take notice as the body of information grew. Blogs began to die away though, which is a shame in many ways.

The Scripting Games cut over to being PowerShell only.

The start of the Third Age is defined by the release of PowerShell 3.0 and Windows Server 2012. The amount of PowerShell functionality has gone through the roof – there are still bits of the PowerShell functionality in Server 2012 I haven’t touched.

Admins are beginning to embrace PowerShell. Over the last 12 months or so I've heard a lot of statements that start "I can use PowerShell to do that…"

PowerShell is here to stay and it's a must-learn technology. The self-proclaimed industry experts are now jumping on the bandwagon and pushing PowerShell as if they invented it.

So where do we go from here?

PowerShell 4.0 will be with us in October with the availability of Server 2012 R2. It has some evolutionary features but I don’t think there’s anything revolutionary.

We’ll still be in the Third Age.

The Fourth Age will start when the majority of admins use PowerShell as a matter of course and you can’t really work on the Windows platform without it.

Come on Microsoft – make my day and remove the GUI permanently from Windows Server.

