The PowerShell year – 2013

This year has been a pretty good year for the PowerShell community. The highlights include:

  • PowerShell 4.0 becomes available
  • A very successful PowerShell Summit in April
  • A community hosted and judged Scripting Games – though as PowerShell is the only accepted language maybe a name change is needed?
  • PowerShell in Depth and PowerShell Deep Dives are published

The big ticket item in PowerShell 4.0 is Desired State Configuration. This functionality was extended at the end of the year with the publication of the Desired State Configuration Resource Kit Wave 1 – see the PowerShell team blog for details.

The most important part of the announcement is that it is wave 1 – meaning we should expect more DSC resources in the New Year.

Looking forward to 2014 what do we expect?

  • More DSC resources
  • 2014 Winter Scripting Games – this time we’re making them a team based event. Should be interesting
  • A PowerShell Summit in Seattle in April
  • A European PowerShell summit later in the year

Assuming you already know the PowerShell basics, or more, where should you be spending your PowerShell time in 2014?

If your work involves creating servers on a regular basis, make sure you understand DSC.
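As a rough illustration of what DSC looks like in PowerShell 4.0 – the configuration name, node name and paths below are made up for the example, not taken from the Resource Kit:

```powershell
# Minimal DSC configuration sketch. 'WebServerConfig', 'Server01'
# and the paths are illustrative only.
Configuration WebServerConfig {
    Node 'Server01' {
        WindowsFeature IIS {
            Ensure = 'Present'
            Name   = 'Web-Server'
        }
        File WebContent {
            Ensure          = 'Present'
            Type            = 'Directory'
            DestinationPath = 'C:\inetpub\wwwroot\site'
        }
    }
}

# Compiling the configuration produces a MOF file per node;
# Start-DscConfiguration then pushes it to the target machine.
WebServerConfig -OutputPath C:\DSC
Start-DscConfiguration -Path C:\DSC -Wait -Verbose
```

The declarative style is the point: you state what the server should look like and the Local Configuration Manager makes it so.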

If you need to administer many servers – look to PowerShell workflow, PowerShell jobs and scheduled jobs. These options seem to have slipped out of the limelight lately.

Workflows are different – they use PowerShell syntax but aren’t pure PowerShell. Some of the rules for using them are a bit strange and take some practice.
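A small sketch of one of those “strange rules” – `foreach -parallel` is workflow-only syntax that won’t compile inside an ordinary function (the workflow name and server names below are illustrative):

```powershell
# Workflow sketch: activities run in isolation, so variables don't
# persist between them the way they do in a normal function, and
# some cmdlets aren't available inside a workflow at all.
workflow Test-ServerPing {
    param([string[]]$ComputerName)

    # foreach -parallel pings all the machines concurrently –
    # this keyword only exists inside a workflow.
    foreach -parallel ($computer in $ComputerName) {
        $alive = Test-Connection -ComputerName $computer -Count 1 -Quiet
        "$computer : $alive"
    }
}

Test-ServerPing -ComputerName 'Server01','Server02'
```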

PowerShell jobs were introduced in PowerShell 2.0 but have always been overshadowed by remoting. The ability to run PowerShell jobs asynchronously and schedule them makes for a very powerful system for performing bulk tasks overnight.
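A sketch of both pieces – an ad hoc background job, then the same idea registered to run overnight (the script blocks, job name and 2am trigger are illustrative, not a recommendation):

```powershell
# Run a bulk task asynchronously as a background job.
$job = Start-Job -ScriptBlock {
    Get-EventLog -LogName System -Newest 100
}
Wait-Job $job | Out-Null
$results = Receive-Job $job

# Scheduled jobs (PSScheduledJob module, PowerShell 3.0+) persist
# across reboots and run whether or not a console is open – ideal
# for the overnight bulk work mentioned above.
$trigger = New-JobTrigger -Daily -At '2:00 AM'
Register-ScheduledJob -Name NightlyInventory -Trigger $trigger -ScriptBlock {
    Get-Process | Export-Csv C:\Reports\processes.csv -NoTypeInformation
}
```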

The last recommendation for 2014 – learn more about CIM/WMI.  A significant fraction of the PowerShell functionality in Windows 8/2012 and later is built on WMI. If you don’t understand how it works you won’t get the best out of it. The OMI initiative is gaining traction which makes CIM an even more important technology to learn.
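For those starting on the CIM side, the CIM cmdlets introduced in PowerShell 3.0 are the place to begin – `Get-CimInstance` is the modern replacement for `Get-WmiObject` and talks WS-MAN by default, which is exactly what makes it relevant to OMI (the remote machine name below is illustrative):

```powershell
# Query a WMI/CIM class locally.
Get-CimInstance -ClassName Win32_OperatingSystem |
    Select-Object Caption, Version, LastBootUpTime

# A CIM session reuses one connection for many queries against
# a remote machine ('Server01' is a placeholder).
$session = New-CimSession -ComputerName Server01
Get-CimInstance -CimSession $session -ClassName Win32_LogicalDisk -Filter "DriveType=3"
Remove-CimSession $session
```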

I’d also recommend experimenting with any of the areas of PowerShell you don’t know so well.

Finally, and most importantly, share what you learn with the rest of the PowerShell community.

Future thoughts

I spent last night configuring a new Windows 8.1 device – finally retired my venerable HP laptop. One thing that struck me as I was working through the various installs was where my data was. The last time I’d configured a new machine all of the data was held locally and I’d needed to copy the data onto the new machine – a crossover cable was used, if I remember correctly.

This time the majority of my data is on my SkyDrive. Logging into Windows 8.1 with a Microsoft Live id means I get immediate access to SkyDrive content. SkyDrive is an integral part of Windows 8.1 rather than being an additional install.

The SkyDrive folder appeared automatically. The configuration is flexible in that you can move SkyDrive’s position on your local disk – open the Location tab on the SkyDrive folder’s properties.

What did surprise me, and started me thinking, is that by default under Windows 8.1 SkyDrive data is not automatically downloaded to your local disk. You get a stub file that triggers the download of the contents when you click on it. In previous versions, as soon as you configured the SkyDrive app on your local machine it would start analysing your data and downloading it.

It’s easy enough to trigger a download for everything, or just some folders – right-click the appropriate folder within your SkyDrive area on the local disk and select Make available offline.


This started me thinking – with SkyDrive defaulting to online content and Microsoft Office being able to save to SkyDrive, how long before all of our data is in the cloud? I wouldn’t be surprised to see future machines with much smaller disks than we assume are necessary today – enough to store the OS and applications. Though with web-based Office applications available through Office 365 there could be a lot of people who only need a thin OS – sort of like a Chromebook, but actually usable.

The only data you will store locally will be those files you are working on.

This change will require much better network support than we receive today. The broadband offerings in the UK are not up to supporting this approach, and until there is consistent, fast broadband connectivity everywhere it will remain a pipe dream.

iOS 7 – one giant step backwards

Just upgraded my iPad to iOS 7 overnight. It looks awful. Like a child’s toy. Apple may have been a triumph of style over substance in the past, but they’ve lost any claim to style with this one.

Third Age of PowerShell

We’re now firmly in the Third Age of PowerShell.

The First Age covered the betas and PowerShell 1.0.

PowerShell was adopted by developers and admins (with a scripting background) who saw the need for better automation tools and went looking for them. Information was sketchy, and every new discovery of how to do something generated a blog post.

Exchange 2007 relied on PowerShell for some activities, but most admins only used it when they had to, and in a very begrudging way. The moans about functionality that was only available through PowerShell went on and on:

while ($true){
  Write-Host "Why can't I use the GUI"
}

PowerShell was very niche with a relatively small number of (very vocal) supporters and was viewed as something that had to be used rather than a tool admins were comfortable with.

The Second Age started with the release of Windows Server 2008 R2 and PowerShell 2.0.

Many of the functionality gaps were filled and PowerShell came of age. Microsoft made PowerShell support mandatory for all products – some did it better than others, which is still true today.

Admins began to sit up and take notice as the body of information grew. Blogs began to die away, though, which is a shame in many ways.

The Scripting Games cut over to being PowerShell only.

The start of the Third Age is defined by the release of PowerShell 3.0 and Windows Server 2012. The amount of PowerShell functionality has gone through the roof – there are still bits of the PowerShell functionality in Server 2012 I haven’t touched.

Admins are beginning to embrace PowerShell. Over the last 12 months or so I’ve heard a lot of statements that start “I can use PowerShell to do that…”

PowerShell is here to stay and it’s a must-learn technology. The self-proclaimed industry experts are now jumping on the bandwagon and pushing PowerShell as if they invented it.

So where do we go from here?

PowerShell 4.0 will be with us in October with the availability of Server 2012 R2. It has some evolutionary features but I don’t think there’s anything revolutionary.

We’ll still be in the Third Age.

The Fourth Age will start when the majority of admins use PowerShell as a matter of course and you can’t really work on the Windows platform without it.

Come on Microsoft - Make my day & remove the GUI permanently from Windows Server.

Need for speed?

How fast does an admin script have to be?

My opinion has always been that if it’s significantly faster than me doing the same task by hand then that’s more than fast enough.

Is my time better spent developing new functionality or shaving a few per cent off the execution time of my scripts? If the script is long running, it’s either because I’m hitting a lot of data or a lot of machines. In both cases the script itself probably isn’t the bottleneck, and if it’s that long an operation I can always run it overnight. A mass mailbox migration may run over a long weekend!

Speed is relative, and as long as the script delivers its results in an acceptable time frame the absolute time doesn’t really matter.

Opinion–automate or suffer

I was on a course last week and one attendee uttered words to this effect: “I won’t automate – it takes too long to write the code. I’ll keep doing it manually.”

You may be able to do it faster the first time by performing the task manually. I can guarantee that the second time my automation will be faster and by the third time I’ll have recovered the effort I made in automating.

IT is getting more complicated, with more dependencies and less time to set things up. If you want consistent, quick results – automate. If you don’t, you’re in the wrong business.

What’s wrong with the Kindle app for Windows 8

A few things really including:

  • If you sign out you lose any downloads!
  • There doesn’t appear to be any way to read kindle format ebooks obtained from other sources. In other words you are locked into Amazon
  • If you sync your settings between Windows 8 systems – seems like a good idea – then you only get a single registration across all your Windows 8 Kindles
  • Not accessible directly from the Windows desktop

On the plus side it seems to work OK and the reading experience is reasonable, but the inability to read other ebooks means I need yet another ereader. I decided to standardise on the Kindle format for my ebooks – I may need to rethink that decision.

Overall the Kindle for Windows 8 app is disappointing and a backward step compared to the Kindle for Windows 7 application.

Infrastructure Architecture–from the Middle Ages to Now

I often wonder how we go about performing Infrastructure Architecture in particular and Architecture in general. We spend a lot of time and effort creating frameworks such as Zachman and TOGAF; we create large bodies of data in the form of Enterprise Architecture repositories; we have patterns and reference architectures, and the body of white papers and other advertorial content produced by, or on behalf of, vendors.

When it comes down to actually putting infrastructure on the ground, how much of this do we actually use and think about?

Are we like the research scientist looking for the best answer to the problem under investigation, or are we more like the master masons of the Medieval period who built Europe’s great cathedrals, churches and castles?

My feeling is that in many cases we are more like the latter.

We have a set of techniques, tricks and tips that we know work because we have used them before. We learn new techniques by working with different people – changing jobs, contractors coming into the organisation, and so on. Often this information ends up being traded. We often serve a long apprenticeship working up through building servers, configuring OS and applications, and troubleshooting. When we are deemed worthy – skilled and knowledgeable enough – we are presented with our own projects.

Pretty much parallels the Guild system of the Middle Ages!

Next time you are planning some infrastructure architecture, think back on the heritage of how we apply our knowledge. Hopefully one day we will be in the position that it becomes more science than art – though when that happens some of the fun will have gone.

The New IT ?

A recent headline asked "How will you fit into the New IT?"

My answer is what new IT?

I have been working in IT for well over 20 years and in that time I can't remember a period when there wasn't a significant change coming:

  • introduction of PCs
  • growth of networks
  • Internet
  • Rise of Windows and fall of Novell
  • Token ring giving way to Ethernet
  • Viruses and other malware
  • Virtualisation

The so-called new IT is the "cloud" and all its ramifications. We've been hosting applications elsewhere for years, but all of a sudden it's the only way to run your IT shop - at least according to the experts in the computer press. When was the last time one of these pundits actually worked in IT - if they ever did?

IT is constantly changing - if you don't realise that and can't live with it you shouldn't be in the industry. An IT professional's job is to work out which of the changes are beneficial to their organisation and which are a distraction.

No matter what the "experts" tell you, there is no single answer that fits every organisation - pick what's needed and disregard the rest.

The only constant is change