PowerShell for Windows Admins


June 30, 2014  12:55 PM

Workflows 6: suspending jobs

Richard Siddaway
PowerShell, Workflow

One of the great things about workflows is that you can stop and start them. A workflow can be suspended temporarily by using the Suspend-Workflow activity.

workflow suspend1 {
Get-Service

Suspend-Workflow

Get-Process

}
suspend1

 

This will run the Get-Service activity – and produce output to the console. The workflow will suspend and automatically create a job. You will see output like this:

HasMoreData     : True
StatusMessage   :
Location        : localhost
StartParameters : {}
Command         : suspend1
JobStateInfo    : Suspended
Finished        : System.Threading.ManualResetEvent
InstanceId      : bbe0903d-1720-46da-a6dd-e0a927aa9e11
Id              : 8
Name            : Job8
ChildJobs       : {Job9}
PSBeginTime     : 30/06/2014 19:40:29
PSEndTime       : 30/06/2014 19:40:29
PSJobTypeName   : PSWorkflowJob
Output          : {}
Error           : {}
Progress        : {}
Verbose         : {}
Debug           : {}
Warning         : {}
State           : Suspended

 

Notice the state.

You can manage the job with the standard job cmdlets:

£> Get-Job

Id Name PSJobTypeName State     HasMoreData Location  Command
-- ---- ------------- -----     ----------- --------  -------
8  Job8 PSWorkflowJob Suspended True        localhost suspend1

 

The job is restarted using Resume-Job. Once the job has finished you can use Receive-Job to get the rest of the data.
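
A quick sketch of that sequence (assuming the job Id 8 from the output above; Wait-Job is optional but makes sure the workflow has finished before you collect the output):

# resume the suspended workflow job
Resume-Job -Id 8

# wait for it to complete, then collect the remaining output
Wait-Job -Id 8
Receive-Job -Id 8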

June 27, 2014  2:02 PM

Expanding server names

Richard Siddaway
PowerShell

I had a comment left on my recent post – “Bad practices – making scripts needlessly interactive” – asking how to deal with N servers that are consecutively numbered, e.g. server01 to server05. Taking the script from that post as an example:

 

[CmdletBinding()]
param (
[string[]]$computername
)

foreach ($computer in $computername){
Restart-Computer -ComputerName $computer
}

 

the questioner wanted to be able to do something like

./reboot -computername server0[1-5]

which would have the same effect as typing

./reboot -computername server01, server02, server03, server04, server05

 

There are two approaches that come to mind. The first is to put the code that performs the expansion into the script itself and use parameter sets to differentiate between a set of full names and a group of names to expand. This would give you something like this:

[CmdletBinding()]
param (
  [parameter(ParameterSetName="Names")]
  [string[]]$computername,

  [parameter(ParameterSetName="Group")]
  [string]$computergroup
)

switch ($psCmdlet.ParameterSetName) {
  "Names" {
    foreach ($computer in $computername){
      Restart-Computer -ComputerName $computer
    }
  }
  "Group" {
    $groupdata = $computergroup.Replace("]", "") -split "\["
    $prefix = $groupdata[0]
    $nums = $groupdata[1] -split "-"

    $nums[0]..$nums[1] |
    foreach {
      Restart-Computer -ComputerName "$prefix$_"
    }
  }
  default {Write-Error "Error!!! Should not be here"}
}

 

Two parameters are defined. The computername parameter takes one or more computer names. The computergroup parameter takes a shorthand notation that describes a group of consecutively named machines, such as server0[1-5]. Parameter sets are used to make the parameters mutually exclusive.
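
Assuming the script above is saved as reboot.ps1, either parameter set can be used – for example:

# full names
./reboot.ps1 -computername server01, server02

# shorthand notation that the script expands
./reboot.ps1 -computergroup server0[1-5]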

Processing is based on a switch determined by parameter set name.

For a set of computer names the processing is as before.

For the shorthand notation the data is split to provide the prefix – ‘server0’ in this case.

The numbers are then split on the "-" to give a first and last value, which are used with the range operator and piped through Foreach-Object to the Restart-Computer cmdlet, where each computer name is constructed.
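
To see what that processing does, here is the expansion in isolation for the shorthand server0[1-5] (run at the prompt, outside the script):

$groupdata = "server0[1-5]".Replace("]", "") -split "\["
$groupdata[0]                      # server0  - the prefix
$groupdata[1]                      # 1-5      - the number range
$nums = $groupdata[1] -split "-"
$nums[0]..$nums[1] | foreach { "$($groupdata[0])$_" }    # server01 ... server05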

This approach works but it has one major drawback. You would need to put the code to expand the server names into each and every script or function you wrote. That’s extra work we can avoid.

The essence of PowerShell cmdlets is that they are small pieces of code that each perform a discrete job. If we follow that pattern we can split the expansion of the server names out into a reusable piece of code that then passes computer names to our other scripts or functions. I would make the expansion code a function so that you can load it through a module or your profile.

function expand-servername {
  [CmdletBinding()]
  param (
    [string]$computergroup
  )
  $computers = @()

  $groupdata = $computergroup.Replace("]", "") -split "\["
  $prefix = $groupdata[0]
  $nums = $groupdata[1] -split "-"

  $nums[0]..$nums[1] |
  foreach {
    $computers += "$prefix$_"
  }

  Write-Output $computers
}

 

Using the function generates the server names:

£> . .\expand-servername.ps1
£> expand-servername -computergroup server0[1-5]
server01
server02
server03
server04
server05

 

You can use the reboot script like this:

.\reboot.ps1 -computername (expand-servername -computergroup server0[1-5])

 

If I was doing this for my production environment I would make the reboot script into an advanced function that accepted pipeline input – see the sketch below. The reboot and expand-servername functions would be part of a module that could auto-load; alternatively, make expand-servername part of a module that you auto-load. You could expand the options in expand-servername to accommodate multiple patterns of names.
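
A minimal sketch of what that might look like – the function name Restart-Server is my own for illustration, not part of the original script:

function Restart-Server {
  [CmdletBinding()]
  param (
    [parameter(Mandatory=$true, ValueFromPipeline=$true)]
    [string[]]$computername
  )
  process {
    foreach ($computer in $computername){
      Restart-Computer -ComputerName $computer
    }
  }
}

# expand the names and pipe them straight into the reboot function
expand-servername -computergroup server0[1-5] | Restart-Server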

 


June 26, 2014  11:34 AM

Workflows: 5a CDXML modules update

Richard Siddaway
PowerShell, Workflow

In my last post I questioned why commands from CDXML modules didn’t fail even though there weren’t any activities defined for them. It turns out that functions and other commands that don’t explicitly have their own workflow activities are implicitly wrapped in the InlineScript activity. As CDXML modules effectively create functions, workflows accept them and run.

Thanks to Steve Murawski for the answer.


June 24, 2014  2:11 PM

Workflows: 5 CDXML modules

Richard Siddaway
PowerShell, Workflow

Last time we saw that you’re not really using cmdlets in PowerShell workflows – you’re using workflow activities. Some cmdlets haven’t been packaged as activities, and for those you need to put them in an InlineScript block. You can also use an InlineScript block to run any arbitrary piece of PowerShell.
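
As a simple illustration (my own minimal example, not from the original post), any ordinary PowerShell can be dropped into an InlineScript block inside a workflow:

workflow demo-inline {
  InlineScript {
    # ordinary PowerShell - runs in isolation inside the workflow
    "Running on $env:COMPUTERNAME at $(Get-Date)"
  }
}
demo-inline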

One thing I hadn’t tried was using some of the CDXML modules (a WMI class wrapped in XML and published as a PowerShell module) that ship in Windows 8 and later. As far as I was aware they hadn’t been packaged as activities, so I thought I’d try this:

workflow net1 {
  parallel {
    Get-NetAdapter
    Get-NetIPAddress
  }
}
net1

 

Surprisingly it worked.

The only reason I can think of that it works is the way CDXML publishes the commands as functions – ls function:\Get-NetAdapter | fl * shows the generated function – which enables a workflow to use them. Even if the module containing the command hasn’t been explicitly or implicitly loaded, the workflow still runs.
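
You can check how the command is published for yourself (the NetAdapter module auto-loads when you use Get-Command):

# CDXML commands show up as functions, not compiled cmdlets
Get-Command Get-NetAdapter | Format-List CommandType, ModuleName

# once the module is loaded you can inspect the generated function directly
ls function:\Get-NetAdapter | fl *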

You can tell these commands haven’t been packaged as activities because the activity common parameters aren’t available on them.

One of life’s little mysteries. I’ll get to the bottom of it and find out why eventually. In the meantime our workflows just became a whole lot richer in terms of functionality.


June 23, 2014  11:24 AM

Bad practices – making scripts needlessly interactive

Richard Siddaway
PowerShell

The PowerShell community spends a lot of time talking about best practices when using PowerShell. I’m not convinced this approach is working as we keep seeing the same bad practices coming through in forum questions. I thought I’d turn the problem on its head and present a now-and-again series of bad practices. These are things I’ve seen people doing that either make their scripts far more complicated than they need to be or completely negate the power that PowerShell brings to your daily admin tasks.

I think that a lot of this is due to people not taking the time to learn the best way to use PowerShell. The thinking goes something like this  – I can script with XXX language.  PowerShell is a scripting language. Therefore I know how to use PowerShell.

NO. WRONG.

PowerShell should be thought of more as an automation engine than a scripting language. The reason for using PowerShell is to perform tasks that are one or more of: repetitive, boring, long, complicated, error prone, or impossible in the GUI. The goal of automating a task should be that you can run it automatically, without human intervention, if you need to.

I’ll give an example.

Some years ago I was involved in migrating a new customer into the managed service environment of the company I was working for. Step one was to move them out of their previous supplier’s data centre. As part of that move I wrote a PowerShell script that shut down every machine in that environment. The script took a list of computers from a file and shut them down in order. My final action was to shut down the machine I’d used to run the script.

I actually ran the script manually but I could have run it as a scheduled job – which I did another time.

What do I mean by making scripts needlessly interactive?

Consider this example:

 

$hs = @"
Press 1 to reboot computera
Press 2 to reboot computerb
Press 3 to reboot computera and computerb

"@

$x = Read-Host -Prompt $hs

switch ($x) {
  1  {Restart-Computer -ComputerName $computera; break}
  2  {Restart-Computer -ComputerName $computerb; break}
  3  {
    Restart-Computer -ComputerName $computera
    Restart-Computer -ComputerName $computerb
    break
  }
}

 

The script defines the prompt – a menu in effect – gets the user’s choice and performs the required reboots. It works, BUT it’s wasteful in terms of the effort required. Even worse, it’s not sustainable. When you add another computer your choices become a, b, c, a+b, a+c, b+c, a+b+c. Now think what happens as your list of possible computers grows to 5 or 10 or 100!

Every time you find yourself typing Read-Host stop and ask yourself if it’s really needed. I’m not suggesting that cute little animals will die if you use it but you will cause yourself unnecessary work – now and in the future.

So what should you do?

Use a parameterised script.

 

[CmdletBinding()]
param (
[string[]]$computername
)

foreach ($computer in $computername){
Restart-Computer -ComputerName $computer
}

 

Call the script reboot.ps1 for the sake of argument.

You can then use it like this:

./reboot -computername computerA
./reboot -computername computerb

./reboot -computername computerA, computerb, computerc

 

If you have a lot of machines to reboot:

./reboot -computername (Get-Content computers.txt)

Put the names in a file (one per line) and call the script as above. You could even have different files for different groups of computers if required – see the example below.
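
For example, assuming files called webservers.txt and sqlservers.txt each hold one computer name per line:

./reboot -computername (Get-Content webservers.txt)
./reboot -computername (Get-Content sqlservers.txt)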

Less typing and easier to use and maintain.

Now you’re using the power that PowerShell brings you.


June 22, 2014  8:23 AM

PowerShell.org – TechSession webinars

Richard Siddaway
PowerShell

PowerShell.org are starting a series of technical webinars. Details from:

http://powershell.org/wp/techsession-webinars/


June 22, 2014  8:13 AM

Workflows: 4 Using cmdlets

Richard Siddaway
PowerShell, Workflow

This is a simple function to return some data about a system:

function get-serverdata {

Get-CimInstance -ClassName Win32_OperatingSystem

Get-Service

Get-Process
}

get-serverdata

 

The function will return the CIM data about the operating system, then the service data and then the process data. The only difference from running the cmdlets interactively is that the display for services and processes defaults to a list rather than a table.

Status      : Stopped
Name        : WwanSvc
DisplayName : WWAN AutoConfig

Id      : 1524
Handles : 81
CPU     : 0.015625
Name    : armsvc

 

Now let’s try that as a workflow:

 

workflow get-serverdata {

Get-CimInstance -ClassName Win32_OperatingSystem

Get-Service

Get-Process
}

get-serverdata

 

You’ll see the data returned in exactly the same order – operating system, services and processes. The only difference is that a PSComputerName property is added to the output.

Status         : Stopped
Name           : WwanSvc
DisplayName    : WWAN AutoConfig
PSComputerName : localhost

Id             : 1524
Handles        : 81
CPU            : 0.015625
Name           : armsvc
PSComputerName : localhost

 

I didn’t emphasise this when discussing parallel and sequential processing earlier, but the default action for a workflow is to process the commands sequentially. You use the parallel and sequence keywords to control that processing – for instance, to run the cmdlets in parallel:

workflow get-serverdata {
  parallel {
    Get-CimInstance -ClassName Win32_OperatingSystem

    Get-Service

    Get-Process
  }
}

get-serverdata

 

From this I see a block of service data (in table format) followed by the first process, another service, the CIM data and more services, then mixed services and process data until the end.

I can’t emphasise this point enough – when running workflow tasks in parallel you have no control of the order in which data is returned.

 

You may be tempted to try something like this:

workflow get-serverdata {
  parallel {
    Get-CimInstance -ClassName Win32_OperatingSystem

    Get-Service | Format-Table

    Get-Process | Format-Table
  }
}

get-serverdata

 

Don’t. You see an error message:

At line:6 char:15
+ Get-Service | Format-Table
+               ~~~~~~~~~~~~
Cannot call the ‘Format-Table’ command. Other commands from this module have been packaged as workflow activities, but this command was specifically excluded. This is likely because the command requires an interactive Windows PowerShell session, or has behavior not suited for workflows. To run this command anyway, place it within an inline-script (InlineScript { Format-Table }) where it will be invoked in isolation.
+ CategoryInfo          : ParserError: (:) [], ParseException
+ FullyQualifiedErrorId : CommandActivityExcluded

 

There are a couple of very important pieces of information in this message:

“packaged as workflow activities”

The commands used in the workflows in this post ARE NOT CMDLETS. I’ve capitalised that for emphasis. Workflows do not run PowerShell cmdlets – they use workflow activities. These are cmdlets repackaged to run in workflows. Not all cmdlets have been repackaged as activities, and not all cmdlets can be, which leads to the second important point from the error:

“To run this command anyway, place it within an inline-script (InlineScript { Format-Table }) where it will be invoked in isolation. “

 

Any cmdlet or command that hasn’t been specifically converted to a workflow activity can be wrapped in an inlinescript block if you need to run it in a workflow. The example would become:

workflow get-serverdata {
  parallel {
    Get-CimInstance -ClassName Win32_OperatingSystem

    inlinescript {
      Get-Service | Format-Table

      Get-Process | Format-Table
    }
  }
}

get-serverdata

 

You can see a list of the cmdlets that have not been implemented as workflow activities at this link: http://technet.microsoft.com/en-us/library/jj574194.aspx

Scroll down to the excluded cmdlets.

 

PowerShell workflows look like PowerShell but they are not PowerShell. Next time we’ll dig a bit further under the covers of PowerShell workflows.


June 21, 2014  10:03 AM

PowerShell Summit Europe 2014 – reasons to attend 1..7

Richard Siddaway
PowerShell

With the opportunity to register for the first PowerShell Summit in Europe fast approaching you may be looking for reasons you can use to convince your boss to let you go.  These are just some of the reasons you should attend:

Excellent sessions – our sessions are deliberately kept short and focussed. With only 45 minutes there’s no time for the speaker to waffle and pad. You get more information for your money this way. The speakers are acknowledged experts in their areas with years of experience using PowerShell. We emphasise code over slides, so you get to see PowerShell in use – not just a long list of PowerPoint slides.

Meet some of the world’s foremost PowerShell experts – the speakers remain available throughout the whole conference. Our sessions aim to be interactive so you can ask questions and even provoke a discussion.

Get your questions answered – the Summit will be attended by members of the PowerShell team and a significant number of PowerShell experts such as PowerShell MVPs. They will be more than happy to answer your questions. If you can’t get your PowerShell question answered at a PowerShell Summit there probably isn’t an answer.

Talk to people with similar problems to solve – the attendees are all PowerShell users; this is not a conference to attend unless you have mastered at least the basics of PowerShell. Many of them will have faced similar problems, and may even have the solution. The PowerShell community loves to share. This is an opportunity to talk to those people and find your answers.

Contribute to a community resource – become a member of the PowerShell Team for a night. We will be solving one of the PowerShell Team’s wish-list items in a special evening event. Work with other PowerShell experts to create a resource that you can use. It’s amazing how many learning points come out of these activities.

Get additional AWPP benefits – the conference is one of your AWPP benefits – http://powershell.org/wp/association-for-windows-powershell-professionals/ – take advantage of the script review process to get additional problems solved

Become better at using PowerShell – learn new techniques that you can apply. The demo code from the speakers will be available so you can use and adapt it to help solve your own problems.

You won’t find a better three days of PowerShell-related activity. Hope to see you there.


June 20, 2014  1:38 AM

PowerShell Summit Europe 2014 – update 2

Richard Siddaway
PowerShell

There seems to have been a bit of confusion regarding the European PowerShell Summit as the site will tell you that registration is currently unavailable.
There isn’t a problem and the Summit HAS NOT sold out at this time. We just haven’t opened registration yet.
Registration will open on 15 July 2014.


June 19, 2014  1:08 PM

Workflows: 3 parallel and sequence

Richard Siddaway
PowerShell, Workflow

I said in the first post in this series that you could force a workflow to perform tasks in parallel or in sequence. Starting with parallel you can force parallel execution by using the parallel keyword:

workflow thursday1 {
  parallel {
    1..26 | foreach {$psitem}
    65..90 | foreach {[char][byte]$psitem}
    97..122 | foreach {[char][byte]$psitem}

    1..26 | foreach {$psitem}
    65..90 | foreach {[char][byte]$psitem}
    97..122 | foreach {[char][byte]$psitem}

    1..26 | foreach {$psitem}
    65..90 | foreach {[char][byte]$psitem}
    97..122 | foreach {[char][byte]$psitem}
  }
}

thursday1

In this workflow I’m printing out the numbers 1-26, the characters A-Z and a-z, then repeating those actions another two times. The repetition is to force the parallelism to be visible. If you just run one repetition everything runs so fast that you don’t see any evidence of parallel activity. If you look carefully at the output you will see evidence of parallelism. For instance, the test I ran showed this partial sequence of results:

Y
16
17
B
18
C
19
20
D
a
b

No prizes for guessing that the keyword to force things to run sequentially is sequence:

workflow thursday2 {
  sequence {
    1..26 | foreach {$psitem}
    65..90 | foreach {[char][byte]$psitem}
    97..122 | foreach {[char][byte]$psitem}
  }
}

thursday2

No matter how many times you run this you’ll always get the same sequence of results – numbers, upper case then lower case characters.

You can mix and match

workflow thursday3 {
  parallel {
    sequence {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
    sequence {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
    sequence {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
  }
}

thursday3

This runs the sequence of numbers, upper and lower case three times in parallel

or

workflow thursday4 {
  sequence {
    parallel {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
    parallel {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
    parallel {
      1..26 | foreach {$psitem}
      65..90 | foreach {[char][byte]$psitem}
      97..122 | foreach {[char][byte]$psitem}
    }
  }
}

thursday4

 

which runs numbers, upper and lower case in parallel – three times in sequence

Try thursday3 and thursday4 and observe the results. I’d encourage you to experiment with combinations of sequence and parallel so that you get a feel for the difference between the two types of action.

Your observations should help reinforce the message from the first post – you can’t predict the order of results when performing tasks in parallel.

