Mix 2008 Dates Announced

It sounds like Mix 2007 was the place to be this year.


Mix 2008 has been announced and I'm guessing it's likely to sell out even quicker than this year's event did.


From Jennifer Ritzinger, I received:

Mark your calendar and save the date for MIX08! We are pleased to announce that MIX will be returning to the Venetian Hotel in Las Vegas on March 5-7, 2008. Stay tuned for announcements related to registration at MIX and don't forget to register early and sign up for those poker lessons you've been meaning to take. :-) http://visitmix.com/

Why SQL Server Performance Tuning Matters

I've had a good week. I've been working on SQL Server performance tuning for a large client here in Melbourne; I've been spending a week a month doing that for them.


Today's results are why I love this work. No matter how much you tune your .NET code, you don't get returns like you can with database tuning. I turned on STATISTICS IO to document the change in the proc I worked on today. How cool is this? It used to take 9,383,786 logical page reads to execute the proc. It now takes 11 (yes, eleven). Eat your heart out, upper-layer coders :-)
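
For anyone who wants to reproduce this kind of before-and-after comparison, the mechanics are simple (the proc name below is obviously a placeholder):

SET STATISTICS IO ON;
EXEC dbo.SomeProc;   -- placeholder; run the original and the modified version
SET STATISTICS IO OFF;

SQL Server then reports per-table scan counts and logical, physical and read-ahead reads, which is exactly the output shown below.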

Original proc

Table 'TableA'. Scan count 1, logical reads 4, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableB'. Scan count 696588, logical reads 3093369, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableC'. Scan count 0, logical reads 2089764, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableD'. Scan count 698103, logical reads 2097440, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableE'. Scan count 5, logical reads 2103203, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableF'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableG'. Scan count 1, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

Modified proc

Table 'TableA'. Scan count 1, logical reads 4, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableE'. Scan count 0, logical reads 3, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'TableB'. Scan count 0, logical reads 4, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.



(table names changed to keep the client’s details private)

NSimil – A proposal for a simulation extension to .NET unit testing and mocking infrastructures – Part #1

One thing I'm always interested in is reassessing ideas that were once used in the IT industry and have fallen by the wayside for some reason, but that might have new application today now that circumstances have changed. This series of blog posts covers my thinking around taking the essence of simulation languages and using it to create dynamic unit tests or dynamic mock objects. I haven't thought all of this through yet, but I decided to kick the ideas around anyway.


Simula (and Simula 67) were amazing languages. The Wikipedia article mentions that Simula 67 introduced objects, classes, subclasses, virtual methods, coroutines, garbage collection and discrete event simulation. It’s hard to comprehend that a language had these concepts back in 1967.


What I loved about Simula was its simulation abilities (hence the name). The idea was that, to simulate say the flow of people at an airport counter, you'd describe each of the actors with procedural class code, then run the simulation and see how they interacted. But the real genius was that, just as real people don't react the same way all the time, the classes included code that introduced random variation into the responses, and Simula provided the framework for running the simulation.


As an example, the Hold() statement was a bit like System.Threading.Thread.Sleep() in that it made the object stop for a period of time, but it also let you plug in functions that determined the distribution of the random values governing that time. For example, from the Wikipedia article:


 Hold (Normal (12, 4, u));
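
In .NET terms, you could imagine something similar. Here's a quick hypothetical sketch (not part of any existing framework): a Hold() helper that sleeps for a normally distributed time, with the normal sample generated via the Box-Muller transform:

using System;
using System.Threading;

static class Sim
{
    static readonly Random _rand = new Random();

    // Pause for a normally distributed number of seconds, roughly
    // analogous to Simula's Hold(Normal(mean, stdDev, u)).
    public static void Hold(double meanSeconds, double stdDevSeconds)
    {
        // Box-Muller transform: two uniform samples give one standard normal sample.
        double u1 = 1.0 - _rand.NextDouble();   // in (0, 1], so Log is safe
        double u2 = _rand.NextDouble();
        double standardNormal =
            Math.Sqrt(-2.0 * Math.Log(u1)) * Math.Sin(2.0 * Math.PI * u2);

        double seconds = Math.Max(0.0, meanSeconds + stdDevSeconds * standardNormal);
        Thread.Sleep(TimeSpan.FromSeconds(seconds));
    }
}

Calling Sim.Hold(12, 4) would then pause for around 12 seconds, give or take, much like the Simula statement above.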


As another example, imagine if, when writing code that said:


if (something == somethingelse)


you could add an attribute that specified the odds of taking each branch of the subsequent code.
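
Nothing like that exists in today's frameworks, but as a rough sketch of the shape it might take (the attribute and helper names here are entirely made up):

using System;

// Hypothetical marker: a simulation-aware test rig could read this and
// force the comparison's outcome with the stated probability at run time.
[AttributeUsage(AttributeTargets.Method)]
class BranchOddsAttribute : Attribute
{
    public double TrueProbability { get; private set; }

    public BranchOddsAttribute(double trueProbability)
    {
        TrueProbability = trueProbability;
    }
}

static class BranchSim
{
    static readonly Random _rand = new Random();

    // Returns true with the given probability; a stand-in for the rig
    // overriding (something == somethingelse) during a simulation run.
    public static bool Branch(double trueProbability)
    {
        return _rand.NextDouble() < trueProbability;
    }
}

Adorning a method with [BranchOdds(0.1)] would then tell the rig to take the true branch about one run in ten.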


It occurs to me that current unit testing tools do a great job of testing code in a static way (particularly testing edge cases) but completely miss being able to test the dynamics of the code. By introducing some simulation constructs, we should be able to expand testing infrastructure into a dynamic test rig rather than a static one. This could be particularly useful in creating dynamic mock objects.
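
To make the dynamic mock idea concrete, here's a hand-rolled sketch (the interface and class are hypothetical; no current mocking library offers this out of the box). Instead of returning one canned answer, the mock varies its latency and occasionally fails, so a test exercises the caller's retry and timeout paths dynamically:

using System;
using System.Threading;

// Hypothetical service interface for the code under test.
interface IPaymentGateway
{
    bool Charge(decimal amount);
}

// A "dynamic mock": simulates variable latency and occasional failures,
// the way a real gateway would behave, rather than one fixed response.
class SimulatedPaymentGateway : IPaymentGateway
{
    static readonly Random _rand = new Random();

    public bool Charge(decimal amount)
    {
        // Latency varies from call to call (5-200 ms, uniformly distributed).
        Thread.Sleep(_rand.Next(5, 200));

        // Roughly 2% of calls fail, as a real dependency sometimes would.
        return _rand.NextDouble() >= 0.02;
    }
}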

Books: Richard Dawkins – The God Delusion

I was led to this book while listening to the New Scientist podcast. Richard Dawkins had come up in an episode about how a well-known geologist deals with being both a Christian and a geologist, given the two are pretty much always at odds. Dawkins is often regarded as one of the most vocal figures in the anti-religion movement.


It's an interesting read. I have to say I pretty much agree with most of the points he makes in the book, although I don't like the early chapters much. They tend to come across a bit like a point-scoring match with the religious communities, particularly the fundamentalist movements in the United States. However, the later chapters are well written and make points that are pretty hard to ignore.


I'd recommend reading it, but I'll warn you in advance that you mightn't enjoy the early chapters as much as the later ones.


http://www.amazon.com/God-Delusion-Richard-Dawkins/dp/0618680004/ref=pd_bbs_sr_1/102-4052044-2687347?ie=UTF8&s=books&qid=1180054806&sr=1-1

Books: Bill Bryson – A Short History Of Nearly Everything

If ever there was a book that I think anyone with even a semblance of interest in science should read, it's this one. I was pointed to it by fellow Readifarian Chris Hewitt and it's just wonderful. The writing style sets a great pace, and the blend of facts and humour makes it a joy to read. Bill takes us on a journey through the history of our discovery of the world, right up to the present day. The best aspect of the book is the human side he brings to the discussions of notable people throughout the ages. Highly recommended: http://www.amazon.com/Short-History-Nearly-Everything/dp/076790818X/ref=pd_bbs_sr_2/102-4052044-2687347?ie=UTF8&s=books&qid=1180054543&sr=8-2

Deterministic vs Volatile

One of the things that I've never been comfortable with is when people take existing computing terminology and use it for something different to what it was designed for. For example, the first time I saw a Windows setting for Default Gateway, I wasn't pleased. Router was a perfectly good term. Gateway was also a perfectly good term, but it sure didn't mean Router. I think when this happens, it ends up confusing people unnecessarily. Every time I've heard someone describe the Default Gateway, they say "oh, that's just the address of the router". So why not call it Router?

What brought me back to this the other day was watching a demo of the new Excel Services. I noticed that they had adorned methods with an IsVolatile attribute. However, what it was used for was to indicate whether or not a function always returns the same output for the same input parameters. But isn't that the definition of Deterministic? In SQL CLR programming, they've correctly used IsDeterministic as an attribute for this purpose.

Volatile usually refers to a value that can change without the knowledge of the program, i.e. if I have a variable that represents some external thing that can be updated by code other than mine, the compiler needs to know that to avoid incorrectly applying optimizations.

Does anyone else think that's odd, or am I being too picky about this stuff? I just find it frustrating and think it leads to confusion.
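
For comparison, this is how SQL CLR expresses it (a minimal sketch; the function itself is just an illustrative placeholder):

using Microsoft.SqlServer.Server;
using System.Data.SqlTypes;

public partial class UserDefinedFunctions
{
    // IsDeterministic = true declares that the same inputs always produce
    // the same output, which lets SQL Server use the function in places
    // like persisted computed columns and indexed views.
    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlInt32 AddValues(SqlInt32 a, SqlInt32 b)
    {
        return a + b;
    }
}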