I have to admit I slept rather well last night. Train traffic diminished greatly after 22:00, and the windows had double glazing as well, which helped reduce the noise.
What really ensured my night’s rest, however, were the earplugs. Advice to newbie business travelers: always bring earplugs.
The hotel did not have bacon for breakfast.
I finished my day 1 report last night, so I used the extra time this morning to post that report and catch up on my email. Apparently, some spamming company or other has discovered the contact form on my blog. I might have to disallow anonymous contacts soon, since the deluge of spam is increasing daily.
Improve database application performance with SQL Server Service Broker
This session is hosted by Bob Beauchemin. He looks like my boss’s twin brother. Really bizarre.
This session is one I went to because there was nothing else that interested me, but now I am glad I went.
It was 100% code demo. T-SQL code, to be precise. It showed how you can design your database-enabled logic around Service Broker queues and SQL transactions. The main advantage of this approach is that your applications can keep on working even if one or more of the back-end servers are offline.
This technology uses built-in message queues, which allow much faster transactions than you would get with the DTC (Distributed Transaction Coordinator).
These queues turn the database actions into asynchronous operations.
This demo was very clear and concise, and one of the better code demos I have ever seen. Nuff said.
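For reference, the queued pattern from the demo looks roughly like this in T-SQL. This is my own minimal sketch (all object names are invented, not the speaker’s):

```sql
-- Minimal Service Broker sketch: one message type, one contract,
-- one queue shared by an initiator and a target service.
CREATE MESSAGE TYPE OrderRequest VALIDATION = WELL_FORMED_XML;
CREATE CONTRACT OrderContract (OrderRequest SENT BY INITIATOR);
CREATE QUEUE OrderQueue;
CREATE SERVICE OrderService ON QUEUE OrderQueue (OrderContract);
CREATE SERVICE OrderClient  ON QUEUE OrderQueue;

-- The sender commits locally right away; the message simply waits in
-- the queue until the receiving end picks it up.
BEGIN TRANSACTION;
DECLARE @dialog UNIQUEIDENTIFIER;
BEGIN DIALOG CONVERSATION @dialog
    FROM SERVICE OrderClient
    TO SERVICE 'OrderService'
    ON CONTRACT OrderContract
    WITH ENCRYPTION = OFF;
SEND ON CONVERSATION @dialog
    MESSAGE TYPE OrderRequest (N'<order id="1"/>');
COMMIT;

-- Elsewhere (possibly much later), a reader drains the queue in its
-- own local transaction:
BEGIN TRANSACTION;
RECEIVE TOP (1) message_type_name, CAST(message_body AS XML)
    FROM OrderQueue;
COMMIT;
```

The key point is that the SEND commits even when the target service is unreachable; Service Broker delivers the message once the other side is back online, which is what keeps the application working.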
ADO.NET vNext – LINQ, Object Services and the Entity Data Model
Originally I was going to go to a Workflow Foundation demo by Ingo Rammer, but I had already seen a beginner’s intro once at Tech-Ed.
I knew literally nothing about vNext and object services, so another code demo by Bob Beauchemin might be a better use of my time.
vNext, Object Services and the Entity Data Model are going to be part of Orcas.
The Entity Data Model is a new data model that is accompanied by three XML schemas for defining and mapping the data flow from a custom data store (like the registry, the GAC or whatever) to ADO.NET. The model contains hooks for third-party provider writers.
Object Services is a technology that lets you work with databases of data objects that support inheritance and other object-oriented features.
I have to admit that I still don’t know that much about it, let alone understand it, but now at least I know it exists.
Windows PowerShell
This session is hosted by Bruce Payette, and is about Windows PowerShell.
This is actually an IT session, not a development session.
PowerShell is the new .NET-enabled command-line interface. From the demos it looked really cool. It can use all available .NET components, so it lets you do basically anything you could do with a Unix or DOS command shell, as well as anything you can do with .NET.
It is also fully customizable via plug-in modules.
To show the power of PowerShell, there were a couple of interesting demos: a Space Invaders program, and some really interesting uses of plug-in modules, like an extension that lets you ‘cd’ into the registry to get information.
PowerShell cannot do anything that e.g. C# could not also do, but because you are writing shell scripts, it is much easier to string different programs together to get something done.
Another nicety is that PowerShell accepts both Unix-style commands (ln, rm, …) and DOS-style commands (dir, copy, …).
One important improvement over standard Unix-style scripting is that you can pipe .NET objects into other commands without having to format and parse everything to and from text.
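To give a flavour of both points, here is a small sketch of my own (not the speaker’s demos):

```powershell
# A pipeline passes real .NET Process objects, not text: each stage
# filters or sorts on typed properties.
Get-Process |
    Where-Object { $_.WorkingSet -gt 50MB } |   # filter on a .NET property
    Sort-Object WorkingSet -Descending |
    Select-Object Name, WorkingSet

# Providers make other stores look like drives; the registry demo was
# essentially this:
cd HKLM:\SOFTWARE\Microsoft
Get-ChildItem
```

Every stage of that pipeline sees actual Process objects with typed properties; nothing ever has to be re-parsed from text.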
At this point I should point out that the room was packed, and the majority of the people in the audience were older sysadmins, and not developers.
Everybody seemed to find it perfectly obvious that 44/7 equals 6.2857… That is a sure sign of not being a developer; I would have expected it to be 6. Whatever.
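For the record, the difference is just integer versus floating-point division. A quick illustration (Python used here purely for brevity):

```python
# The "developer" answer: integer (floor) division.
int_result = 44 // 7

# The "sysadmin" answer: plain floating-point division.
float_result = 44 / 7

print(int_result)               # 6
print(round(float_result, 4))   # 6.2857
```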
Anyway, PowerShell looks really interesting. So interesting, in fact, that I bought a book about it. It looks like something that can be really useful in a sysadmin role. Since I am going to have to maintain a standalone production line network, it might be worth diving into.
Bruce is an excellent speaker. He also did a book-signing session at the book stand, so I had him sign my copy of ‘PowerShell’. What can I say… I am a geek.
Continuous integration with and without Team System
Another talk delivered by Roy Osherove.
This session is about continuous integration. CI means that every time a check-in is done (or at a fixed time interval) your project is checked out to a build server and built.
This way you can immediately verify whether your changes (or someone else’s) broke the build.
Continuous integration has the advantage that you discover build problems as soon as possible.
Roy showed several tools that can help you to automate a build:
· NAnt: works for smaller projects, but can be tedious to set up if you are not an XML masochist.
· MSBuild: same problem. XML hell.
· Team Foundation Server MSBuild: better, but very limited in the current version of TFS. The Orcas TFS build utility will be much better, though.
· FinalBuilder: a third-party tool that lets you configure an automated build graphically. The very nice thing about this tool is that it comes with dozens of configurable actions, like deploying build output, burning DVDs, running unit tests, … This was the favorite tool of the speaker.
· Visual Builder: comparable to FinalBuilder.
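To show what the “XML hell” complaint is about, here is a minimal hand-written MSBuild project file (a toy example of my own, not one from the session): even compiling two files into an exe takes this much ceremony.

```xml
<!-- Minimal MSBuild project: compile two C# files into an exe. -->
<Project DefaultTargets="Build"
         xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <OutputPath>bin\</OutputPath>
  </PropertyGroup>
  <ItemGroup>
    <Compile Include="Program.cs" />
    <Compile Include="Helpers.cs" />
  </ItemGroup>
  <Target Name="Build">
    <MakeDir Directories="$(OutputPath)" />
    <Csc Sources="@(Compile)"
         TargetType="exe"
         OutputAssembly="$(OutputPath)App.exe" />
  </Target>
</Project>
```

Tools like FinalBuilder generate or hide exactly this kind of plumbing behind a graphical editor, which is why the speaker preferred them for anything non-trivial.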
Having such a build system in place is only the first step, because now you’ll have to set it up so that it starts by itself with each change.
There are several tools for this:
- CruiseControl.NET
- Draco.NET
- Something that I cannot read in my notes anymore
These were not really shown, but their use was explained.
Continuous building is also possible with TFS, but it takes some custom programming. TFS supports third-party plug-ins through .NET interfaces. With a custom plug-in you can subscribe to TFS events, like ‘check-in complete’. These events can be used to let the plug-in trigger a check-out and a rebuild. That takes quite a bit of programming, but there are already a couple of open-source plug-ins you can use.
This session also came with a song, but it was not as nice as the one from his first session. It was very interesting, and convinced me of the necessity of having an automated build system in place if you are on a project with more than one other programmer.
Building WCF based services with WF enabled business logic
Another session delivered by Chris Weyer.
As with his previous sessions, he again began with the ABC story, which I’d heard four times by now. Sigh… But I digress. After 15 minutes the session really took off.
Thinking about it, it would make perfect sense to implement WF functionality in a WCF service. After all, if you start a custom workflow, you’d want someone to be able to interact with it, no?
As it turns out, the current releases of WCF and WF make this very hard to do, and require a lot of custom programming.
Chris showed how you can do it, and demonstrated how his multimedia demo project implemented this.
He also showed how this is implemented in Orcas. Luckily, WF and WCF are perfectly matched in the Orcas release of .NET, so that bodes well for the future. One of these days, Chris will upload his demo project to his blog so that we can all look at it. It uses WPF, WCF, WF, SQL Server and some other stuff.
End of the day
The conference is over. I decided not to stay for the closing keynote. I don’t believe in keynotes. At least with the kick-off keynote you have the comfort that you can do something interesting when it is over.
The trip home took 2.5 hours because (surprise, surprise) there was an accident.
I was completely exhausted from trying to remember and learn all these new things over the last 2 days.
But all in all it was very interesting, despite the lack of C++ (or any native code at all).