Monthly Archive

Categories

Deep Dive

PowerShell Deep Dive abstracts

With the Deep Dive just over 6 weeks away, the programme is shaping up. Abstracts of some of the sessions can be viewed at

http://blogs.msdn.com/b/powershell/archive/2011/09/01/8-abstracts-for-the-powershell-deep-dive-in-frankfurt.aspx

 

I will also be doing a session on working with events in PowerShell – WMI, .NET and the PowerShell engine.

The discounted registration is open until 6 September – see http://blogs.msdn.com/b/powershell/archive/2011/08/02/extending-discounted-registration-amp-session-proposal-deadline.aspx for details.

 

This is going to be a fantastic event – if you use PowerShell you need to be there.

European PowerShell Deep Dive–presenting

I was notified last night that my submission to the Deep Dive was successful and I will be presenting. There will be a good number of PowerShell MVPs attending. This will be the PowerShell event of the year in Europe. The one in April in the USA was brilliant – this will be at least as good.

Event details from http://blogs.msdn.com/b/powershell/archive/2011/08/02/extending-discounted-registration-amp-session-proposal-deadline.aspx

PowerShell Deep Dive–registration

If you are thinking of going to the European Deep Dive (and if not, why not?), there is good news on the price. The discounted registration period has been extended to 6 September.

http://blogs.msdn.com/b/powershell/archive/2011/08/02/extending-discounted-registration-amp-session-proposal-deadline.aspx

European Deep Dive–more info

More information plus registration details from http://blogs.msdn.com/b/powershell/archive/2011/07/20/powershell-deep-dive-registration-info-amp-call-for-session-proposals.aspx

The one in Vegas in April was brilliant. This is going to be better.

European PowerShell Deep Dive

In April there was a PowerShell Deep Dive at The Experts Conference. It went so well that the event is to be repeated at the European version of The Experts Conference on October 17-18.

Available details are limited but start here

http://blogs.msdn.com/b/powershell/archive/2011/07/12/powershell-deep-dive-the-experts-conference-europe-2011.aspx

Method definitions

When we are dealing with .NET objects we have methods and properties to deal with. Properties are easy.

Let's create a simple string object

$str = "QWERTYUIOP"

 

Pipe our string object into Get-Member to see the properties

$str | Get-Member -MemberType property

 

In this case we get one property – Length.

We can get the methods like this

$str | Get-Member -MemberType method

 

and we find there are 33 of them on a string object. Some of the methods can be used in different ways, i.e. they have different definitions. For instance, the Substring method has a couple of definitions

PS> $str.substring.OverloadDefinitions
string Substring(int startIndex)
string Substring(int startIndex, int length)

 

When we look at the output of get-member for a method such as Replace we get this

Replace          Method     string Replace(char oldChar, char newChar), string Replace(string oldValue, string newVa...

Ideally we want to be able to see all of the definitions.  We could use

$str | Get-Member -MemberType method | Format-Table -Wrap

 

but it's not easy to read. If you want to dig into the method definitions try this

 

function get-methoddefinitions {
 [CmdletBinding()]
 param ($obj)

 $obj | Get-Member -MemberType method | select name |
 foreach {
   $_.Name

   $cmd = '$obj.' + "$($_.Name).Overloaddefinitions"
   Invoke-Expression -Command $cmd
   ""
  }
}

 

We can use string substitution to get the method name into the string and then run it with Invoke-Expression. Note how we use single quotes on the first part of the string to prevent substitution. Our output for the Replace method becomes

Replace
string Replace(char oldChar, char newChar)
string Replace(string oldValue, string newValue)

which is easy to read. 

The function could be extended to accept a method name to avoid displaying everything.
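As a sketch of that extension (the -name parameter and its wildcard default are my additions, not part of the original function), a Where-Object on the method name does the filtering. Using the member-access operator with a variable property name also avoids the Invoke-Expression step:

```powershell
## Sketch: get-methoddefinitions extended with an optional -name filter.
## The -name parameter is a hypothetical addition; "*" keeps the
## original show-everything behaviour as the default.
function get-methoddefinitions {
 [CmdletBinding()]
 param (
  $obj,
  [string]$name = "*"
 )
 $obj | Get-Member -MemberType Method |
 Where-Object { $_.Name -like $name } |
 ForEach-Object {
   $_.Name
   # member access with a variable name replaces Invoke-Expression
   $obj.($_.Name).OverloadDefinitions
   ""
 }
}
```

get-methoddefinitions -obj $str -name Substring would then display only the Substring overloads.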

Splat

It sounds like something that should pop up on the TV while watching a rerun of the 1960s Batman TV series. It's not, though.

Splatting is another way of passing parameters to commands.  This command shouldn’t come as a surprise.

Get-ChildItem -Path "c:\test" -Recurse

We can also pass the parameters like this

$params = @{
  Path = "c:\test"
  Recurse = $true
}
Get-ChildItem @params

 

 

Create a hash table of your parameters using parameter name and value pairs. Recurse is a switch parameter so we need to set it to $true.

We then use Get-ChildItem @params to run the command. Notice the @ symbol rather than a dollar sign. This is the splatting operator and it splats (sends) the values to the parameters.
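Splatting and normal parameter binding can also be mixed in a single call – the splatted hash table supplies some parameters and the rest are named directly. A small sketch (get-sum is a made-up function for illustration):

```powershell
## get-sum is a hypothetical function used to demonstrate mixing
## a splatted hash table with a directly named parameter
function get-sum {
 [CmdletBinding()]
 param (
  [string]$comment,
  [int]$first,
  [int]$second
 )
 "$comment $($first + $second)"
}

$params = @{
  first = 12
  second = 3
}
# first and second come from the splat; -comment is bound normally
get-sum @params -comment "Mixed answer is"
```

The parameters in the hash table and the ones supplied directly are merged when the command runs.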

This also works for our own functions. Let's create a simple function

 

function testsplat {
 [CmdletBinding()]
 param (
  [string]$comment,
  [int]$first,
  [int]$second,
  [string]$action
 )
 switch ($action){
  times {"$comment $($first * $second)"}
  add   {"$comment $($first + $second)"}
 }
}

 

We have four parameters and depending on the value of the action parameter we multiply or add. Not very sophisticated but it demonstrates what we are doing.

We can run the function and use it like this

testsplat -comment "Answer is" -first 12 -second 3 -action add
testsplat -comment "Answer is" -first 12 -second 3 -action times

 

and we will get the expected results

Answer is 15
Answer is 36

We can splat the parameter values

$params = @{
comment = "Splat Answer is"
first = 12
second = 3
action = "add"
}
testsplat @params

 

and get the expected result

Splat Answer is 15

 

Because we are dealing with a hash table we can simply change one parameter

$params["action"] = "times"
testsplat @params

and get

Splat Answer is 36

 

Where this will really work for me is in testing functions. I can put this together to give me a simple way of testing.

 

function testsplat {
 [CmdletBinding()]
 param (
  [string]$comment,
  [int]$first,
  [int]$second,
  [string]$action
 )
 switch ($action){
  times {"$comment $($first * $second)"}
  add   {"$comment $($first + $second)"}
 }
}

$params = @{
 comment = "Splat Answer is"
 first = 12
 second = 3
 action = "add"
}
testsplat @params

$params["action"] = "times"
testsplat @params
######################################################

 

I can store my parameters in a hash table and change parameters as I wish. If I want to rerun the tests I have all of my previous tests effectively documented. The row of ### symbols is just to draw attention to the results when I use ISE.

Splatting looks like it is going to make my testing easier.

PowerShell Deep Dive: VII using SMO

SMO = SQL Server Management Objects. They first shipped with SQL Server 2005 and continued into SQL Server 2008. They are .NET classes that enable us to manage SQL Server systems programmatically – in our case from PowerShell. SQL Server Management Studio in SQL Server 2005 and above is built on SMO.

One of the talks at Deep Dive was about discovering the Recovery Model across a number of SQL Server 2000/2005/2008 systems. They were using the SQL Server PowerShell provider that ships with SQL Server 2008 to access SQL 2005 and SQL 2008 systems. A different method was used for SQL Server 2000.

I asked why a simple SMO script wasn’t run remotely against all of the servers. I was told that SQL Server 2000 doesn’t support SMO therefore they couldn’t.

I didn’t think this was correct but didn’t have an environment available in which I could test it.

I’ve since managed to find time to test the idea

 

## get-database
## This uses the SMO assemblies to retrieve databases
##
## Richard Siddaway May 2010

function get-database {
param([string]$sqlbox="")
## load SMO assemblies
## use $null to prevent display of assembly load information

$null = [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo")
#$null = [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum")
$null = [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended")
$null = [reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

$server = New-Object -TypeName Microsoft.SqlServer.Management.Smo.Server -ArgumentList $sqlbox
$server.databases
}

This simple function loads the SMO assemblies. If you are running on SQL Server 2005 then uncomment SmoEnum and comment out SmoExtended.

The function connects to the specified server and retrieves the database collection.  We can simply find the recovery model like this

get-database Asql2008-server | ft Name, RecoveryModel -a
get-database Asql2005-server | ft Name, RecoveryModel -a
get-database Asql2000-server | ft Name, RecoveryModel -a

 

So – YES we can access SQL Server 2000, 2005 and 2008 with SMO. Simply install the SQL Server client tools from SQL Server 2005 or 2008 onto your workstation and you can access all SQL Server 2000, 2005 and 2008 systems in your environment.
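Building on get-database above, the original requirement – the Recovery Model across a list of servers – could be sketched like this (the server names are placeholders; substitute your own):

```powershell
## Sketch: loop get-database (defined above) over several servers.
## The server names below are placeholders, not real machines.
$servers = "Asql2000-server", "Asql2005-server", "Asql2008-server"

foreach ($sqlbox in $servers) {
 get-database $sqlbox |
 # add a calculated Server column so the output shows which
 # machine each database came from
 Select-Object @{Name="Server";Expression={$sqlbox}}, Name, RecoveryModel
}
```

One SMO function, one loop, and the recovery models of every database in the environment come back in a single pass.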

Enjoy

PoshWSUS

One of the people I met at the recent PowerShell Deep Dive was Boe Prox.  Among other claims to fame he has produced a PowerShell module for administering WSUS. http://poshwsus.codeplex.com/

The latest version was posted in January 2011 and further updates are in the pipeline.

If you need to manage WSUS this is just the tool for you. I'd recommend you download it and give it a try.

Very useful set of tools.

PowerShell Deep Dive: VI WQL Query Speed–Remote

Looking at a WQL query vs a Get-WmiObject filter on the local machine, we saw that they were practically the same. If we used Where-Object to do the filtering it took nearly twice as long.

I wanted to repeat these runs against a remote machine. I used two Windows 2008 R2 servers for the test.

 

PS> 1..100 | foreach {Measure-Command -Expression {Get-WmiObject -Class Win32_Process -Filter "Name='Notepad.exe'" -computername webr201}} |

Measure-Object -Property TotalMilliseconds -Average

Count    : 100
Average  : 29.678681
Sum      :
Maximum  :
Minimum  :
Property : TotalMilliseconds

 

PS> 1..100 | foreach {Measure-Command -Expression {Get-WmiObject -Query "SELECT * FROM Win32_Process WHERE Name='Notepad.exe'" -computername webr201}} | Measure-Object -Property TotalMilliseconds -Average

Count    : 100
Average  : 30.669341
Sum      :
Maximum  :
Minimum  :
Property : TotalMilliseconds

 

PS> 1..100 | foreach {Measure-Command -Expression {Get-WmiObject -Class Win32_Process -computername webr201 | Where {$_.Name -eq 'Notepad.exe'} }} |

Measure-Object -Property TotalMilliseconds -Average

Count    : 100
Average  : 59.997321
Sum      :
Maximum  :
Minimum  :
Property : TotalMilliseconds

 

Results Summary

Filter: 29.678681

Query:  30.669341

Where:  59.997321

Again the filter and the query are nearly the same. A millisecond difference in the average of 100 runs is not enough to worry about. Using Where-Object again takes about twice the time.

The results this time are quicker than running on the local machine. This is because the server I used is more powerful than the laptop I used for the local test. The important thing is the relationships, not the exact numbers. I ran the tests locally on the server and got a similar pattern of results.

After all this I would say that running a full WQL query or using -Filter are about the same in speed. There may be a gain for the query if we selected properties as well, but the extra typing and checking probably don't justify the gain. Use a query or use a filter – the results will be similar. I'll stick with the filter because it's less typing.
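For completeness, the property-selection idea could be measured the same way as the runs above. This is untested speculation on my part – Name and WorkingSetSize are just example properties, and webr201 is the test server used earlier:

```powershell
## WQL SELECT naming specific properties instead of *
## Name and WorkingSetSize are example properties; measure it the
## same way as the earlier runs before drawing any conclusions
1..100 | foreach {Measure-Command -Expression {
  Get-WmiObject -Query "SELECT Name, WorkingSetSize FROM Win32_Process WHERE Name='Notepad.exe'" -computername webr201
}} | Measure-Object -Property TotalMilliseconds -Average
```

Restricting the SELECT list reduces the data WMI has to return, so any saving would show up in the average.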