Sysinternals Live

A colleague of mine showed me this, and I looked around for some information about it. Here it is:

Sysinternals Live: We’re excited to announce the beta of Sysinternals Live, a service that enables you to execute Sysinternals tools directly from the Web without hunting for and manually downloading them. Simply enter a tool’s Sysinternals Live path into Windows Explorer or a command prompt as \\live.sysinternals.com\tools\<toolname> or view the entire Sysinternals Live tools directory in a browser at http://live.sysinternals.com.

Getting .NET 1.1 CLR String Hash Codes In The .NET 2.0 CLR

Everyone knows (or should know) that values retrieved from the GetHashCode method should not be persisted for later use, especially with strings, because:

The behavior of GetHashCode is dependent on its implementation, which might change from one version of the common language runtime to another. A reason why this might happen is to improve the performance of GetHashCode. [^]

Nevertheless, code that persists values retrieved from the GetHashCode method for later use can land in your lap. And if you need to upgrade that code to, for example, use WCF or WF, you have a problem.

The usual solution would be to use Reflector to see how it was done in the 1.1 CLR and implement the same algorithm in the 2.0 runtime.

Unfortunately, the 1.1 implementation isn’t managed.

Today, thanks to Nicole, I found System.Collections.Specialized.BackCompatibleStringComparer in the 2.0 CLR. This class implements the 1.1 CLR String.GetHashCode algorithm and can be found in the System.dll and System.Windows.Forms.dll assemblies. You can’t use it directly because it’s internal, but you can see its implementation using Reflector. You can also see it here.
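For illustration, here is a sketch of that algorithm ported to Python. It assumes the djb2-style hash that BackCompatibleStringComparer shows in Reflector (seed 5381, shift-add-XOR per character, 32-bit signed overflow); if you copy it into production code, verify its output against an actual 1.1 runtime first.

```python
def legacy_string_hash(s: str) -> int:
    """Sketch of the hash computed by BackCompatibleStringComparer
    (the 1.1 CLR String.GetHashCode algorithm, as seen in Reflector)."""
    h = 5381
    for ch in s:
        # h = ((h << 5) + h) ^ ch, wrapped to 32 bits like the CLR's int
        h = (((h << 5) + h) ^ ord(ch)) & 0xFFFFFFFF
    # Reinterpret the unsigned 32-bit value as a signed .NET Int32
    return h - 0x100000000 if h >= 0x80000000 else h
```

In C# you would copy the same loop into your own helper rather than call the class, since it is internal.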

Now I have a few questions:

.NET Framework Client Profile – What Will Be On It?

Justin Van Patten has posted on BCL Team Blog about the .NET Framework Client Profile.

In this post he goes through the list of the assemblies that will and will not be part of this client profile.

If you know how classes are packaged in the .NET Framework, you know that they are often packaged not by concern but by ownership. That’s why you find a provider model in System.Web.dll, as well as HttpUtility, or even the Cache (which can be very useful in non-web applications).

If you read through the comments you’ll come to the conclusion that there is a need for several client profiles (or a bootstrapper cooked up by each developer, which is worse). Some of these profiles could be (loosely defined):

  • Application Foundation – mscorlib.dll, System.dll, System.Core.dll
  • Windows Forms – System.Windows.Forms.dll
  • Windows Communication Foundation (client) – System.ServiceModel.dll (split into core and client-only classes)
  • Windows Communication Foundation (server) – System.ServiceModel.dll (split into core and server-only classes)
  • Web – System.Web.dll (not full-fledged ASP.NET just because you need HttpUtility)

This could be taken to the point where Silverlight, XNA, Robotics, and the Compact Framework would become profiles of the framework, something like the concept of server roles in server operating systems.

More important than what each of us would like to see in this Client Profile is the fact that the team is analyzing it.

Internet Explorer vs. Firefox

Until recently I had never used Firefox (FF) because Internet Explorer (IE) was good enough for me.

I don’t do much web page development, and because I own licenses for Visual Studio (VS), HttpWatch, and IEWatch, I never needed anything else. (I tried the Internet Explorer Developer Toolbar, but it keeps blowing up and killing IE; and I’ve seen Nikhil Kothari‘s Web Development Helper installed, and it doesn’t work well when non-US English characters are displayed.)

Over the years I’ve seen all the campaigning against IE, promoting FF as better, more standards-compliant, more secure, and so on.

A few days back I had to do some work with the ASP.NET validation summary and validators and needed to check whether they worked in FF.

Talk about disappointment:

  • Firebug is no better than the tools I’ve been using.
  • FF needs its own proxy configuration – For me, any application running on Windows that needs its own proxy settings is just a badly developed application.
  • (I’m sure I’d find much more if I used it.)

IE isn’t a good developer tool yet (not even IE8 at this time [^]), and it should have been for a long time. Or, at least, VS should have better support for HTML and CSS debugging.

But, on the other hand, Windows Internet Explorer is just another application built on top of the Web Browser Control [^] (which is part of the IE installation but can be used by itself). You can build any Windows application that uses a Web Browser Control (I’ve built more than one). It looks like the same is not as simple with FF [^].

I don’t intend to start a web browser war. I just wanted to state my disappointment. I guess FF fans set my expectations too high.

My Shared Podcasts

I used to share the podcasts I watch through a feed created using FeedDemon, NewsGator, and FeedBurner.

It was very easy. All it took was adding the post to a clippings folder created for that purpose. That folder was shared as an RSS feed, and I ran it through FeedBurner to get some statistics and a friendlier URL. For downloading the webcasts/podcasts, I used FeedBurner’s companion: FeedStation.

Because I listen to a fair amount of podcasts, I got myself an 8GB Zune. The Zune uses its own software to load content, and that software is capable of handling podcast subscriptions, so I won’t be needing FeedDemon and FeedStation for subscribing to and downloading podcasts.

Because I have a few subscribers to my feed, I wanted to keep it. Then I found PodShow [^][^], created my own profile, and changed the source of the feed.

So, if you are already one of my followers, you don’t need to do anything. If you aren’t and would like to be, go ahead and subscribe to it.


MSDN And TechNet Virtualized

Lately I’ve been analyzing various solutions for resource virtualization (applications, desktops, and servers), pooling, and provisioning from the various players in this market (Microsoft, Citrix, VMware, Sun, HP).

There are many advantages to virtualization:

  • Hardware consolidation without the need for server consolidation.
    • Energy cost reduction.
      • Servers and desktops can be instantiated on demand. No more need to have machines turned on waiting for users.
    • Hardware cost reduction.
      • One big machine can host many servers and desktops.
  • Ease of deployment and maintenance.
    • Deploying is just copying a file.
    • Patching can be done on a copy that is deployed afterwards.
  • Ease of diagnostics.
    • If a problem occurs, it can be diagnosed on a copy that will be then patched and redeployed.
  • Ease of development.
    • When development teams need a new environment for a new application they just need to deploy and start up a copy of a pre-existing environment.
  • Business continuity.
    • If a data center is taken offline for any reason, all it takes is new machines and the latest backup and you’re up and running. (It’s not that easy, but it’s a lot easier than installing all the applications in the data center.)
  • … and much more.

Lately there have been some discussions about database virtualization. Database systems are very resource-intensive (both memory and I/O), but the advantages for business continuity purposes are starting to weigh in on some IT departments’ decisions.

When a co-worker and good friend of mine told me that Microsoft had virtualized MSDN and TechNet, I couldn’t believe it. You can get the detailed report here.