Monthly Archives: March 2012

More community TFS build extensions documentation

As part of the ongoing documentation effort, I have recently published more documentation for the TFS build extensions project activities:

  • AssemblyInfo
  • CodeMetric (updated) and CodeMetricHistory
  • File
  • Twitter

Visual Studio Live @ Las Vegas Presentations – Tips and Tricks on Architecting Windows Azure for Costs

Unfortunately I wasn’t able to speak at Visual Studio Live @ Las Vegas as scheduled, due to an illness that made it impossible for me to travel and kept me in bed for a few days.

Even though I wasn’t there, I would like to share with you some of the key points from the session “Tips and Tricks on Architecting Windows Azure for Costs”.

Tips & Tricks On Architecting Windows Azure For Costs

View more presentations from Nuno Godinho
The key points to achieve this are:
  • Cloud pricing isn’t more complex than on-premises pricing – it’s just different
  • Every component has its own pricing characteristics; adjust them to your needs
  • Always remember that requirements impact costs, so choose the ones that are really important
  • Always remember that developers, and the way things are developed, impact costs – so plan, learn and then code
  • The Windows Azure pricing model can improve code quality, because you pay for what you use and can discover very early where things are going off plan
  • But don’t over-analyze! Don’t freeze just because every choice has a cost impact – the same choices impact you today, you just don’t normally see the effects as quickly or as transparently. So “GO FOR IT” – you’ll find it’s really worth it.

In upcoming posts I’ll go in-depth into each one of those.

Special thanks to Maarten Balliauw for providing a presentation he had done previously that I could build on.

Visual Studio Live @ Las Vegas Presentations – Architecture Best Practices in Windows Azure

Unfortunately I wasn’t able to speak at Visual Studio Live @ Las Vegas as scheduled, due to an illness that made it impossible for me to travel and kept me in bed for a few days.

Even though I wasn’t there, I would like to share with you some of the key points from the session “Architecture Best Practices in Windows Azure”.

Here are 10 key Architecture Best Practices in Windows Azure:

  1. Architect for Scale
  2. Plan for Disaster Recovery
  3. Secure your Communications
  4. Pick the right Compute size
  5. Partition your Data
  6. Instrument your Solution
  7. Federate your Identity
  8. Use Asynchronous and Reduce Coupling
  9. Reduce Latency
  10. Make Internal Communication Secure

In upcoming posts I’ll go in-depth into each one of those.

Windows To Go

One of my favourite enterprise features that Microsoft is adding to Windows 8 is Windows To Go, which lets you provision a desktop on a USB flash drive and take it with you to boot on any hardware that meets the usual Windows 8 requirements. An IT department can build a desktop image, with applications installed (perhaps some of the intranet apps that you wouldn’t let your staff install on their home PC), and even domain join it before passing it to someone who needs to travel light, or who wants to be able to do some sensitive work on their personal laptop (the one that’s full of spyware and crap because their kids have had the ability to install anything – you know the one – it’s got so many browser toolbars that any web page is only an inch or two tall!). You can even secure it with BitLocker, without requiring a TPM chip in the hardware that’s going to host it.

Speaking of that host hardware: as I said, so long as it supports Windows 8 and will boot from USB, you’re good to go. You won’t have access to any internal drives in that hardware (unless you’re also the administrator of that machine), but you will be able to use additional devices that you’ve plugged into its other USB ports, for example. When you use Windows To Go on a host PC for the first time, it will do some plug’n’play detection (which may take a few minutes), then continue to boot. Each new bit of hardware is stored in a profile, so the next time you use the same host it will boot much faster (about as fast as you would expect from an internal drive).

Windows To Go isn’t, as a recent TechTarget mailing so cleverly pointed out, the answer to all your “Consumerisation of IT” dreams – they astutely observed that Windows To Go won’t run on an iPad. Running Windows from a USB flash drive on a device that has no USB port is apparently beyond Microsoft – shame on them! 😉

As an additional security measure, if you need to exit in a hurry (I like to imagine myself using Windows To Go behind enemy lines while I’m on some kind of secret mission – I don’t know why!), then you can just pull the drive out and the machine will freeze. If you don’t push it back into the same USB port within 60 seconds then the machine will reboot. If you knocked it out by accident (because the guy entering the internet cafe wasn’t actually a SPECTRE assassin hot on your heels), then you can plug it back in and carry on – if you were playing a video at the time, for example, it’ll take under a second to continue playback.

So to recap, as the IT guy, you can give somebody a Windows 8 instance (which you trust) that they can boot on their own hardware (which you don’t trust!), and you can continue to manage that instance like you would any other domain computer. You can give them software that you wouldn’t let them install on an untrusted computer without all the expense of giving them a trusted computer that you’ve configured. Just as importantly, your user can do important work stuff on the shiny new laptop that they bought for themselves without having to give it to you so that you can configure it and take away their admin rights. It’s a fantastic step in the right direction where “Bring Your Own Device/Computer” (BYOD/BYOC) is concerned.

With Windows 8 just in Consumer Preview (and Windows Server 8 in Beta) at present, not all the details of this feature have been released yet, so some of this may not be 100% accurate by the time you read it:

You need at least a 32GB drive (my test image has Windows 8, Office 2010, Windows Live Essentials and a bunch of files on it, and it still has 15GB free). The drive should be USB 3.0, although it will work when plugged into a USB 2.0 port. These flash drives aren’t especially cheap at the moment, and they don’t all work as you’d hope…

When OEMs build drives, they have firmware that includes (among other things) a Removable Media Bit (RMB). The RMB is the thing that tells Windows whether the drive is “fixed” or “removable” (it defines the separation you see in Windows Explorer). The trouble is that if you get one where the RMB is set to “removable” then Windows won’t do certain things with it: it won’t let you partition the drive, so you can’t use BitLocker; it won’t run Windows Update (including standalone WU packages); it won’t let you download apps from the Microsoft Store; and I dare say there are other things that I haven’t come up against yet. With some drives you can flip the value of the RMB, but on the Kingston DT Ultimate G2 32GB that I have, you can’t (I asked Kingston about this and told them why it was an issue – they’re going to bear it in mind for future products).
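
If you want to see how Windows classifies a drive before you commit to buying a stack of them, a quick WMI query will show the media type that Windows reports. Here is a minimal C# sketch (the class and output format are mine, and it assumes a reference to System.Management; it only tells you what Windows reports, not whether a given drive will definitely work with Windows To Go):

    using System;
    using System.Management; // add a reference to System.Management.dll

    class DriveMediaType
    {
        static void Main()
        {
            // Win32_DiskDrive reports how Windows classifies each disk,
            // e.g. "Fixed hard disk media" or "Removable Media".
            var searcher = new ManagementObjectSearcher(
                "SELECT Model, InterfaceType, MediaType FROM Win32_DiskDrive");

            foreach (ManagementObject disk in searcher.Get())
            {
                Console.WriteLine("{0} ({1}): {2}",
                    disk["Model"], disk["InterfaceType"], disk["MediaType"]);
            }
        }
    }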

The upshot is that while you may be able to get Windows To Go to work today, you might not be able to do everything with it, and you might want to exercise caution before buying a load of drives, even if someone says that it works with a particular model.

All that said, if you want to give it a go, there are step-by-step instructions on the TechNet wiki, and a very informative video from the 2011 BUILD conference. Also, Ars Technica has a step-by-step guide with a slightly different method, using the WAIK and a single partition, so you can do it on a “removable” drive (although you can tweak the TechNet steps to do that too).

Before I forget (and because this is one of the things that I was asked at the TechDays UK IT Camp this week), you are going to be activating Windows via AD or a key management server, hence my pointing out right at the start of this post that this is an enterprise feature.

Open Source Microsoft–Build MVC, WebAPI, Razor, and WebPages

Scott Guthrie has announced on his blog that as of this very moment, ASP.NET MVC, ASP.NET WebAPI, and WebPages with Razor syntax have all been open sourced on CodePlex at http://aspnetwebstack.codeplex.com. That’s huge news. Oh, and the ASP.NET Web Stack repository can be accessed using TFS, Subversion, Mercurial, and the newly added Git.

So, you may be thinking, “This sounds cool, but what does it mean for me?” It means you’re awesome. It means that you can now take your favorite features and patches to their framework and submit them back to the team for review. It means you can use their framework when it is eventually ported over to Mono and other open-source platforms. It means you’ll eventually be able to run ASP.NET wherever you’d like.

Be sure to check it out and provide feedback to the team. If you’re not sure what type of feedback to provide, choose from the following:

  • “The ASP.NET team just knocked it out of the park with this: http://aspnetwebstack.codeplex.com. Go OSS!”
  • “ScottGu and his team delivered yet again.”
  • “Who said that Microsoft can’t release software using an open source license?”
  • “Congrats to the ASP.NET team for, yet again, exceeding expectations!”

Your choice. In the meantime, great job Microsoft!


SQL Server # Storing Hierarchical Data – Parent Child n’th level # TSQL

Introduction

Today, I would like to explain one way in which we can store hierarchical data in SQL tables. A general table structure that people come up with to store this kind of data is:

EmployeeID | ManagerID
-----------+----------
1          | NULL
2          | 1
3          | 1
4          | 2
5          | 4

Here, EmployeeID is the unique ID allotted to every new employee record inserted into the table, and ManagerID is the EmployeeID of the employee’s immediate manager – keeping in mind that a manager is also an employee.

Problem Statement

This table structure serves the purpose very well as long as we have a 1-level hierarchy. However, if the hierarchy is n levels deep, the SELECT statement to fetch the records becomes more complex with this kind of table structure. Suppose we want to fetch the complete tree of a particular employee, i.e. the list of all the employees who are directly or indirectly managed by that employee. How do we do it?

Thanks to CTEs for making life a bit easier – using them recursively, we can get the work done. Please follow this MSDN link to see an implementation using a recursive CTE.

Suggested Table Structure

EmployeeID (INT) | ManagerID (INT) | [Path] (VARCHAR(MAX))

Here, I have just included a new column, [Path], of type VARCHAR(MAX). I have made it VARCHAR(MAX) just to make sure the field is long enough to store the complete path, but one can assign an appropriate size as per their system’s requirements.

The basic idea of the [Path] column is to store the complete hierarchical path of any employee, with each ID separated by a delimiter, as below:

EmployeeID | ManagerID | Path
-----------+-----------+-----------
1          | NULL      | /1/
2          | 1         | /1/2/
3          | 1         | /1/3/
4          | 2         | /1/2/4/
5          | 4         | /1/2/4/5/

Calculating the new path is very simple: {New Path} = {Parent Path} + {Self ID} + {Delimiter}. For example, with ‘/’ as the delimiter, EmployeeID 4 (whose manager is EmployeeID 2) gets ‘/1/2/’ + ‘4’ + ‘/’ = ‘/1/2/4/’.

Now, suppose I want to fetch all the employees who are directly or indirectly working under EmployeeID = 2. I can use the T-SQL below (the sample data is hard-coded in a CTE; note that the filter matches employee 2’s own row as well):

;WITH CTE AS (
    SELECT 1 EmployeeID, NULL ManagerID, '/1/' [Path]
    UNION ALL
    SELECT 2 EmployeeID, 1 ManagerID, '/1/2/' [Path]
    UNION ALL
    SELECT 3 EmployeeID, 1 ManagerID, '/1/3/' [Path]
    UNION ALL
    SELECT 4 EmployeeID, 2 ManagerID, '/1/2/4/' [Path]
    UNION ALL
    SELECT 5 EmployeeID, 4 ManagerID, '/1/2/4/5/' [Path]
)
SELECT
    *
FROM
    CTE
WHERE
    -- searching for '/2/' (rather than just '2') avoids false
    -- matches on IDs such as 12, 20 or 21
    [Path] LIKE '%/2/%'

We can use a simple piece of logic with the same CTE to even find out the level of an employee:

SELECT
    *,
    -- count the delimiters in the path, then subtract the two that
    -- even a Level-0 path ('/1/') already contains
    (LEN([Path]) - LEN(REPLACE([Path], '/', ''))) - 2 [Level]
FROM
    CTE
WHERE
    [Path] LIKE '%/2/%'

EmployeeID | ManagerID | Path       | Level
-----------+-----------+------------+------
2          | 1         | /1/2/      | 1
4          | 2         | /1/2/4/    | 2
5          | 4         | /1/2/4/5/  | 3

2 is subtracted in the formula because even a Level-0 path (e.g. ‘/1/’) already contains two delimiter characters.

Conclusion

Hope this simple trick saves a lot of time for those who find themselves lost playing with hierarchical data.

Unit testing in VS11Beta and getting your tests to run on the new TFSPreview build service

One of my favourite new features in VS11 is that unit testing is pluggable. You don’t have to use MSTest; you can use any test framework for which an adaptor is available (at the release of the beta this meant the list of frameworks on Peter Provost’s blog, but I am sure this will grow).

So what does this mean and how do you use it?

Add some tests

First, it is worth noting that you no longer need a test project to contain your MSTest tests – you can use one if you want, but you don’t need to. So you can:

  1. Add a new class library to your solution
  2. Add a reference to Microsoft.VisualStudio.TestTools.UnitTesting and create an MSTest test
  3. Add a reference to xUnit (I used NuGet to add the reference) and create an xUnit test
  4. Add a reference to the xUnit extensions (NuGet again) and add a row-based xUnit test
  5. Add a reference to NUnit (you guessed it – via NuGet) and create an NUnit test

All these test frameworks can live in the same assembly, as the sketch below shows.
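
Here is a minimal sketch of what such a mixed assembly might look like. The class and method names are purely illustrative; it assumes references to MSTest, xUnit 1.x (plus the extensions package) and NUnit:

    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using NUnit.Framework;
    using Xunit;
    using Xunit.Extensions; // xUnit 1.x row-based tests ([InlineData])

    [TestClass]
    public class MsTestExamples
    {
        [TestMethod]
        public void TwoPlusTwo_IsFour()
        {
            // Assert is fully qualified because all three frameworks define one.
            Microsoft.VisualStudio.TestTools.UnitTesting.Assert.AreEqual(4, 2 + 2);
        }
    }

    [TestFixture]
    public class NUnitExamples
    {
        [Test]
        public void TwoPlusTwo_IsFour()
        {
            NUnit.Framework.Assert.AreEqual(4, 2 + 2);
        }
    }

    public class XUnitExamples
    {
        [Fact]
        public void TwoPlusTwo_IsFour()
        {
            Xunit.Assert.Equal(4, 2 + 2);
        }

        // A row-based test: the runner reports one result per InlineData row.
        // (Qualified because NUnit 2.5+ also defines a TheoryAttribute.)
        [Xunit.Extensions.Theory]
        [InlineData(1, 2, 3)]
        [InlineData(2, 3, 5)]
        public void Addition_Works(int a, int b, int expected)
        {
            Xunit.Assert.Equal(expected, a + b);
        }
    }

With the adapters from the next section installed, all of these show up in a single test run.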

Add extra frameworks to the test runner

By default the VS11 test runner will only run the MSTest tests, but by installing the xUnit.net runner for Visual Studio 11 Beta and the NUnit Test Adapter (Beta) – either from the Visual Studio Gallery or via Tools –> Extension Manager (and restarting VS) – you can see that all the tests are run.

[screenshot]

You can, if you want, set it so that the test runner triggers every time you compile (Unit Testing –> Unit Test Settings –> Run Test After Build). All very nice.

[screenshot]

Running the tests in an automated build

However, what happens when you want to run these tests as part of your automated build?

The build box needs to have a reference to the extensions. This can be done in three ways. However, if you are using the new TFSPreview hosted build service, as announced at VS Live, only one method – the third – is open to you, as you have no access to the VM running the build to upload files other than via source control.

By default, if you create a build and run it on the hosted build service, you will see that it all compiles, but only the MSTest test is run.

[screenshot]

The fix is actually simple.

  1. First you need to download the xUnit.net runner for Visual Studio 11 Beta and NUnit Test Adapter (Beta) .VSIX packages from the Visual Studio Gallery.
  2. Rename the downloaded files to .ZIP files and unpack them.
  3. In TFSPreview source control, create a folder under BuildProcessTemplates for your team project. I called mine CustomActivities (the same folder can be used for custom build extensions, hence the name; see Custom Build Extensions for more details).
  4. Copy the .DLLs from the renamed .VSIX files into this folder and check them in. You should have a list as below.

    [screenshot]

  5. In the Team Explorer –> Build hub, select the Actions menu option –> Manage Build Controllers, and set the ‘Version control path for custom assemblies’ to the new folder.

    [screenshot]

You do not need to add any extra files to enable the xUnit or NUnit tests themselves, as long as you checked in the runtime xUnit and NUnit assemblies from the NuGet packages at the solution level. This should have happened by default with NuGet in VS11 (i.e. there should be a packages folder structure in source control, as shown in the Source Control Explorer graphic above).

You can now queue a build and you should see that all the tests are run (in my case MSTest, xUnit and NUnit). The only difference from a local run is that the xUnit row-based tests appear as separate lines in the report.

[screenshot]

So now you can run tests of any type on a standard TFSPreview hosted build box – a great solution for the many projects where just a build and test run is all that is required.

What Do the Performance Values in Windows Task Manager Represent?

If you’ve ever taken a look at Windows Task Manager, you’ve undoubtedly wondered what all the numbers mean. This guide briefly explains each value and helps you familiarize yourself with what these values represent. The performance information is broken down into four categories: CPU, Physical Memory, Kernel Memory, and System. CPU (Central Processing Unit) usage […]

Let the VS Team know about VS 11

You should really take advantage of the opportunity: tell them what you don’t like and what you really like.

Link to the team’s blog: http://blogs.msdn.com/b/visualstudio/archive/2012/03/21/visual-studio-11-beta-survey.aspx

So far so good. My laptop is a little clunky, and I had a lot of problems installing on a 64-bit machine, but the problem was Windows Update and not VS.

Microsoft Most Influential Virtualization Professional and MMS 2012

Hello everyone.

Since October of last year I have been taking part, along with other community members, in a contest called Most Influential Virtualization Professional (MIVP). The idea, created by Microsoft Brazil product manager Danilo Bordini, was to get the community producing and sharing content related to virtualization, System Center and Windows Server. For six months I took part with articles, videos, forum posts and webcasts, as well as the #MIVP hashtag on Twitter.

Last week Danilo posted the contest winners on his blog and, to my surprise, I was one of them, with the prize being a trip to the Microsoft Management Summit 2012 (MMS) in Las Vegas, all expenses paid by Microsoft. o/

This was the list of the 5 MIVP winners:

  • Third to fifth place – prize: one Xbox 360 per IT Pro. Winners:
    • Herleson Pontes
    • Jordano Mazzoni
    • Rafael Bernardes
  • Top 2, i.e. the two first-place finishers – prize: a trip to MMS 2012 (Microsoft Management Summit) in Las Vegas, United States. Winners:
    • Cleber Marques
    • Leandro Carvalho

During the event I will be digging into all the news about virtualization and System Center, and you will be able to follow everything here on the blog. Stay tuned!

I would like to thank Danilo very much for the prize and for the opportunity to take part even though I live abroad, and also the judges Vinícius Apolinário, Fabio Hara and Daniel Camillo. My sincere thanks for the recognition.

I would also like to congratulate the other four winners: Herleson, Jordano, Rafael and Cleber.

Leandro Carvalho 
MCSA+S+M| MCSE+S | MCTS | MCITP | MCBMSS | MCT | MVP Virtual Machine 
MSVirtualization | BetterTogether | LinhadeCodigo | MVP Profile
TwitterLeandroEduardo | LinkedInLeandroesc

How I try to keep up to date

I have just added a page to the blog that lists some of the podcasts I try to listen to, in an attempt to keep up to date.

Please Help Fix A Possible Oversight

I am sure by now you are sick of reading the vote-for-me pleas coming from all your friends. Please take a moment to finish reading this note; it is not one of those requests.
There are about 10 days left in the 2012 SMB 150 Awards competition and it seems some folks who are […]

Windows 8 CP saves the day..

So I needed to re-install an HP DV9000 and the DVD unit would not play ball. Time to set up a bootable Vista flash drive. It’s all so easy and fast too..

Open CMD as admin and type:

  1. diskpart
  2. select disk 1 – or whatever the flash drive’s disk number is (run list disk first to check)
  3. clean
  4. create partition primary
  5. select partition 1
  6. active
  7. format fs=fat32
  8. assign
  9. exit

Next, copy the entire OS DVD to the flash drive and you are ready to go..

Even though I knew that the flash drive was disk #4, I still typed in ‘select disk 1’. Unfortunately for the Windows 8 CP installation, it was sitting on disk 1. Fortunately for me, it was ONLY Windows 8 that was sitting on disk 1 – it could so easily have been the 500GB drive which holds the first bunch of operational backups.

Hurrah for Windows 8..

🙂

Problems editing TFS11 build templates in VS11Beta

Whilst writing documentation for the TFS community build extensions (I just published the Zip activity documentation) I hit upon a problem working with TFS11. The TFS community build extensions support both TFS2010 and TFS11 Beta; unfortunately the two versions need to be built separately (once against the TFS2010 DLLs and once against the TFS11 ones). As of version 1.3 of the extensions, both versions ship in the download.

In the past I have tended to work against TFS2010 on this community project, but since the VS/TFS11 Beta release I have been trying to move over to the new build. So, to write the documentation for the Zip activity, I started in TFS11. I followed the usual method of using a custom activity (there is no improvement over this frankly horrible process in VS11), so within VS11 I added the Zip activity to a copy of the DefaultProcessTemplate.xaml. All appeared OK, but when I ran a build with this new template I got the error:

The build process failed validation. Details:

Validation Error: The private implementation of activity ‘1: DynamicActivity’ has the following validation error:   Compiler error(s) encountered processing expression "BuildDetail.BuildNumber".

Type ‘IBuildDetail’ is not defined.

[screenshot]

On checking the .XAML file you can see there is duplication in the namespaces

[screenshot]

[Note the greatly improved compare tooling in VS11]

This, it turns out, is a known issue logged on Microsoft Connect. The answer, at this time, is to do your build process .XAML editing in a text editor like Notepad – not a good story, but a workaround.

More thoughts on Typemock Isolator, Microsoft Fakes and SharePoint

I posted yesterday on using Typemock and Microsoft Fakes with SharePoint. After a bit more thought, I realised that the key thing I found easier with Typemock was the construction of my SPListItem data set. Typemock allowed me to fake SPListItems and put them in a generic List<SPListItem>, then just make this the return value for the Items collection using the magic .WillReturnCollectionValuesOf() method, which converts my List to the required collection type. With Microsoft Fakes I had to think about a delegate that constructed my test data at runtime. This is not a problem, just a different way of working.

A side effect of using the Typemock .WillReturnCollectionValuesOf() method is that if I check the number of SPListItems in the returned SPListItemCollection, I have a real collection, so I can use the collection’s own .Count property – I don’t have to fake it out. With Microsoft Fakes no real collection is returned, so I must fake its return value.
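
For reference, here is a minimal sketch of the Typemock pattern described above. The class and method names are illustrative, and it assumes references to Typemock Isolator, MSTest and the SharePoint object model:

    using System.Collections.Generic;
    using Microsoft.SharePoint;
    using Microsoft.VisualStudio.TestTools.UnitTesting;
    using TypeMock.ArrangeActAssert;

    [TestClass]
    public class SpListFakingExample
    {
        [TestMethod, Isolated]
        public void Items_ReturnsARealCollection()
        {
            // Build the test data as an ordinary generic list of faked items.
            var items = new List<SPListItem>();
            var item = Isolate.Fake.Instance<SPListItem>();
            Isolate.WhenCalled(() => item.Title).WillReturn("Test item");
            items.Add(item);

            // WillReturnCollectionValuesOf() converts the List<SPListItem>
            // into the SPListItemCollection that fakeList.Items returns.
            var fakeList = Isolate.Fake.Instance<SPList>();
            Isolate.WhenCalled(() => fakeList.Items).WillReturnCollectionValuesOf(items);

            // The returned collection is real, so its own Count just works.
            Assert.AreEqual(1, fakeList.Items.Count);
        }
    }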

This is a trend common across Typemock Isolator: it does much of the work for you. Microsoft Fakes, like Moles, requires you to do the work. In Moles this was addressed by the use of behaviour packs to get you started with the standard items you need in SharePoint.

I would say again that there may be other ways of using the Microsoft Fakes library, so there may be ways to address these initial comments of mine; I am keen to see if that is the case.
