Monthly Archives: October 2010

Dynamic Management Views

I’ve used DMVs in PowerShell and SQL Server demos a few times but hadn’t really dug into them. There isn’t a lot of information available on DMVs, but that will change early next year when SQL Server DMVs in Action is published - http://www.manning.com/stirk/

I’ve recently read the early access version and it is well worth investing in a copy if you are responsible for administering SQL Server.

The book doesn’t say so, but it is very easy to run the scripts in the book from PowerShell.
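
As a minimal sketch – assuming the SQL Server 2008 snap-in that supplies Invoke-Sqlcmd is installed, and using sys.dm_exec_connections purely as an illustrative DMV – running a query from the console looks like this:

## load the SQL Server cmdlets if they aren't already available
Add-PSSnapin SqlServerCmdletSnapin100 -ErrorAction SilentlyContinue

## run a DMV query against the local default instance
Invoke-Sqlcmd -ServerInstance "." -Database master -Query "SELECT session_id, connect_time, client_net_address FROM sys.dm_exec_connections"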

PowerShell on Bing UK

I’ve mentioned before that Bing US included an online reference to the PowerShell cmdlets under its Visual Search heading.

This is now also available on the UK version of Bing.  Navigate from the home page or jump straight in with

http://www.bing.com/visualsearch?g=uk_powershell_cmdlets&qpvt=Windows+PowerShell&FORM=Z9GE52#p=4

Creating a module

I’m writing a new PowerShell book and have got to the fun bit where I’m creating lots of scripts. I want to supply these as a series of modules, with each individual listing as a function, but I don’t want to manually maintain the module file as I work through each chapter. Usually I create a module with one .psm1 file that contains all of the functions.

 


## get the folder name - the module takes its name from the folder
$path = Get-Location
$folder = Split-Path -Path $path.Path -Leaf

$mfile = "$folder.psm1"

## remove any old module file
if (Test-Path -Path $mfile){Remove-Item -Path $mfile}

## get the script files and write a dot-source line for each one
Get-ChildItem -Filter *.ps1 |
Sort-Object -Property Name |
ForEach-Object {
 ". ./$($_.Name)" | Out-File -FilePath $mfile -Append
}

 

It is also possible to have a .psm1 file that runs a set of other PowerShell scripts, so that’s what I’ve done. The script shows how I can create a module file by getting the names of the individual PowerShell files and adding “. ./” in front of each one to dot source them.
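
For a folder of chapter scripts the generated .psm1 ends up looking something like this (the file names are purely illustrative). Because the dot-source paths are relative, this assumes the module is imported from its own folder – inside the .psm1 you could use $PSScriptRoot instead to make the paths absolute:

## contents of MyChapter.psm1 - each line dot sources one listing
. ./Get-Something.ps1
. ./Set-Something.ps1
. ./Test-Something.ps1

Import it and check the exported commands in the usual way (assuming the folder, and therefore the module, is called MyChapter):

Import-Module ./MyChapter.psm1
Get-Command -Module MyChapter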

One quick module file with minimum effort.

PowerShell in Practice Review

Jonathan has written a review of my PowerShell in Practice book at http://www.jonathanmedd.net/2010/10/powershell-in-practice-review-and-39-discount-code.html

There is also a 39% discount code if you are quick.

User Group Meeting Cancelled

Due to unforeseen circumstances tomorrow’s (Tuesday 19 October) Live Meeting presented by the UK PowerShell User Group is cancelled.

Apologies for the short notice.

I will reschedule for November.

PowerShell news

A couple of bits of news you might not have seen.

 

There’s a new PowerShell ebook available by PowerShell MVP Don Jones at http://nexus.realtimepublishers.com/accwp.php?ref=dj4

 

PASS (Professional Association for SQL Server) is starting a virtual PowerShell group

http://sqlvariant.com/wordpress/index.php/2010/10/announcing-the-new-powershell-virtual-chapter-of-pass/

More details at

http://powershell.sqlpass.org/

Recording script output

Problem

I need to record the output from my scripts.

 

Solution

Use Start/Stop-Transcript.

Oops. I forgot to tell you that I sometimes use ISE as well as the PowerShell console. Hmm. The Transcript cmdlets don’t work in ISE. The Scripting Guys showed how to capture the output of a script that was run in ISE in this recent post http://blogs.technet.com/b/heyscriptingguy/archive/2010/09/25/create-a-transcript-of-commands-from-the-windows-powershell-ise.aspx

OK, so I know how to capture output in the console and in ISE, but the methods don’t match.

We’ll assume that we want to capture the output of each script independently. We’ll also assume a simple script of:

Get-Process
Get-Service

 


if ($psISE) {
  ## running in ISE - clear the output pane so only this run is captured
  Clear-Host
  $file = "c:\test\ise-tran1.txt"

  ## build a header similar to the one Start-Transcript writes
  $header = @"
    **********************
    Windows PowerShell Transcript Start
    Start time: $(Get-Date)
    Username : $env:USERDOMAIN\$env:USERNAME
    Machine : $env:COMPUTERNAME (Microsoft Windows NT $((Get-WmiObject win32_operatingsystem).Version))
    **********************

"@

  Out-File -FilePath $file -InputObject $header
}
else {
  ## in the console the transcript cmdlets work as normal
  Start-Transcript -Path "c:\test\ps-tran1.txt"
}

Get-Process
Get-Service

if ($psISE) {
  $out = "Transcript stopped at $(Get-Date), output file is $file"
  $out
  Out-File -FilePath $file -InputObject $out -Append

  ## copy the contents of the ISE Output pane into the transcript file
  $psISE.CurrentPowerShellTab.Output.Text |
  Out-File -FilePath $file -Append
}
else {Stop-Transcript}

 

Amending the Scripting Guys’ script, we use the $psISE variable to test whether we are running in ISE. If not, we call Start-Transcript. If we are in ISE, we build the header and write it to the transcript file.

We then run the script.

At the end of the script we test again whether we are in ISE. If not, we call Stop-Transcript. If we are, we copy the contents of the Output pane to our file.

This code can either be made part of your script templates or turned into two functions that are loaded in your profile.
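
As a minimal sketch of the two-function approach (the function names and default path are purely illustrative; it relies on the same ISE Output pane trick as above):

function Start-MyTranscript {
 param ([string]$Path = "c:\test\transcript.txt")
 if ($psISE) {
  Clear-Host
  ## remember the file so Stop-MyTranscript can find it
  $global:MyTranscriptFile = $Path
  "Transcript started at $(Get-Date)" | Out-File -FilePath $Path
 }
 else {Start-Transcript -Path $Path}
}

function Stop-MyTranscript {
 if ($psISE) {
  ## copy the ISE Output pane into the remembered file
  $psISE.CurrentPowerShellTab.Output.Text |
  Out-File -FilePath $global:MyTranscriptFile -Append
 }
 else {Stop-Transcript}
}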

PowerShell UG October 2010


When: Tuesday, Oct 19, 2010 7:30 PM (BST)


Where: Virtual



Following on from the discussion at last month’s Live Meeting we will cover PowerShell jobs.

Richard Siddaway has invited you to attend an online meeting using Live Meeting.

To join the meeting:

  1. Copy this address and paste it into your web browser:
    https://www.livemeeting.com/cc/usergroups/join
  2. Copy and paste the required information:
    Meeting ID: CW2FRT
    Entry Code: MJ?K4h5KG
    Location: https://www.livemeeting.com/cc/usergroups

If you still cannot enter the meeting, contact support


Tee-Object cmdlet

How often do you use the Tee-Object cmdlet? If you are anything like me, I would guess fairly infrequently – if at all. It does, though, deserve some consideration.

Think of running Get-Process. We get a nice display on screen. We may decide to save the output:

Get-Process | Out-File c:\test\proc1.txt

but now we don’t see the output. It’s in the file, so we can look at it:

Get-Content c:\test\proc1.txt

but that now becomes a two-step process.

 

Tee-Object supplies the answer because it functions as a “T” junction and effectively splits the data

Get-Process | Tee-Object -FilePath c:\test\proc2.txt

gives us the display on screen (Tee-Object passes the objects on down the pipeline, and as it is the last command here they are displayed) and the file output – test with Get-Content c:\test\proc2.txt

We can add further processing after the Tee

Get-Process | Tee-Object -FilePath c:\test\proc2.txt | where {$_.Name -like "p*"} | Format-Table

 

Instead of a file we can tee to a variable

Get-Process | Tee-Object -Variable procs

Notice that we don’t use a $ in front of the variable name!

$procs

will display the data again

If we look at the type of the data

$procs | gm

We see that its TypeName is System.Diagnostics.Process, which means we can perform all our usual PowerShell processing on the saved objects.
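
As an illustrative example (the filter and sort shown here are arbitrary choices, not from the original post):

## work with the saved process objects rather than re-querying
$procs | Where-Object {$_.WorkingSet -gt 50MB} |
Sort-Object CPU -Descending |
Format-Table Name, Id, CPU -AutoSize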

 

Unfortunately there isn’t a way to append data to a file with Tee-Object, but it does have a very practical purpose and deserves a bit more attention.

Architecture, Administration and Automation

At first glance these three topics may seem to have nothing in common apart from the fact that they all begin with the letter “A”. They are, however, intimately linked, as we will see.

Architecture in an IT sense has many definitions (one per practising IT Architect at the last count) but I regard it as the set of principles around which you design your IT. To keep it simple I’ll restrict the discussion to infrastructure. I have heard much debate about the difference between architecture and design. I have a simple view – if products are mentioned it’s a design. As an example, the decision to utilise virtualisation is an architectural one, but whether to use Hyper-V or VMware, for instance, is a design decision.

So now we’ve decided what architecture is, what is its impact on administration? Quite simple really. One of the biggest problems facing IT administrators today is the complexity of the environment they are working in. You can easily find yourself in an environment with six versions of Windows (NT, 2000, 2003, 2003 R2, 2008, 2008 R2 – I know the first two are out of support but I bet a lot of organisations are still using them) and that’s before you add in the complexity of 32 vs 64 bit and standard vs enterprise (or even datacenter) editions. Add a few applications – multiple versions and editions of SQL Server, a few Exchange servers, SharePoint, web servers, a raft of third-party applications – plus file and print, and you have a wide spread of skills that are needed. We mustn’t forget Windows itself plus the necessary additions of Active Directory and DNS. We haven’t even got to the client systems and their applications, which further muddy the waters. Then we get to the servers themselves – virtualised, plus one or more of the big hardware vendors (usually more) and a whole bunch of different models to add to the fun. Throw in the odd Linux or Unix server just to keep us awake, plus all the network, remote access and other issues, and we end up with a very busy set of people.

It is the IT architect’s responsibility to architect/design complexity out of the environment: standardise on specific sizes of servers and a single virtualisation platform, minimise the number of Windows versions, and so on. This makes the administrator’s job easier because there is a relatively simple, standard set of items to work with.

One of the biggest causes of downtime is human error. Reducing the complexity of the environment helps to reduce the possibility of error. The other way to reduce human error is to introduce as much automation as possible. The administrator has a responsibility to embrace and use automation to make their job easier and reduce errors. The architect has the responsibility to ensure that the components selected in the architecture/design can be automated using the standard toolset within the organisation.

Architecture + Automation = improved Administration