/YX Switch Is The Best Thing Ever

I guess I miss a lot of things, but in particular, Microsoft managed to sneak the /YX switch into the compiler without my noticing. Wow! I could never decide if I hated the abysmally slow C++ compile times I was getting for our products or the Hell Of Pre-Compiled Header Setup And Maintenance (to quote Big Trouble In Little China, I have a lot of hells…). This newfangled /YX thingy figures out where the headers stop automatically and builds PCH files all by itself, and then next time it finds them and uses them! I’m sure that this is common knowledge, but automatic PCH generation is about the Best Thing Ever (apologies to whatevs).
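
For anyone who hasn’t stumbled across it yet, there’s no setup to speak of; a hypothetical command line like the following (file names invented) is all it takes to get automatic PCH generation going:

    REM Sketch only: /YX tells the compiler to create and reuse a precompiled
    REM header on its own; /Fp just controls where the .pch file lands.
    cl /nologo /W3 /YX /Fpobj\auto.pch /c foo.cpp bar.cpp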

Then again, Johnny Cash’s American IV, now playing, is a close #2.

BUILD Breakage

Armed with my excellent reasons for wanting to move my company’s build system to the DDK, I sat down and tried to get one of our static libraries to build. This particular library gets linked with a couple of different binaries that we produce, includes headers from three other static libraries of ours, and yet is small and discrete enough to not be too much of a handful to build. So, I made a SOURCES file, tweaked the include paths a bit, set up my 3790 DDK, and started whacking at trial builds.
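
For the curious, the SOURCES file itself is the easy part; mine looked roughly like this (names and paths are placeholders, not our real tree):

    # Hypothetical SOURCES file for a user-mode static library
    TARGETNAME=mylib
    TARGETTYPE=LIBRARY
    TARGETPATH=obj

    USE_MSVCRT=1

    # Headers from our three other static libraries
    INCLUDES=..\liba\inc;..\libb\inc;..\libc\inc

    SOURCES=alpha.cpp \
            beta.cpp  \
            gamma.cpp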

Problem #1: Missing SDK header files. Yes, the DDK includes a (mostly) complete SDK, along with all of the ATL madness necessary to build, well, ATL-based projects, but it turns out that it’s missing some files. On the list are wininet.h and mprapi.h. Curious fact: both import libraries are in fact in the DDK. Just not the headers. Odd. For what it’s worth, the missing headers might be in the Longhorn DDK, but as it is a year or two pre-release at this point, that doesn’t seem like a feasible requirement. Especially due to the NDA. 😉 Anyway, it is obvious that I can’t rely on the DDK for my SDK – this is the first of dozens of projects that I’m trying to convert; who’s to say what else will be missing? Besides, I can’t exactly constrain my development team to using only headers in the DDK’s version of the SDK down the road.
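
One obvious workaround, for what it’s worth, is to point INCLUDES at a separately installed Platform SDK ahead of the DDK’s bundled headers, something along these lines (the PSDK path is whatever yours happens to be):

    # Sketch: pick up wininet.h and friends from an installed PSDK before
    # falling back to the DDK's own SDK headers in $(SDK_INC_PATH).
    INCLUDES=C:\Program Files\Microsoft SDK\include;$(SDK_INC_PATH);..\inc

Which, of course, just means keeping the PSDK around anyway.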

Problem #2: Missing STL headers. We make heavy use of the C++ STL in our code. In fact, one of our developers actually branched out a bit and used hash_map, which is not quite part of the C++ standard, but is present in SGI’s STL implementation and recent Microsoft CRT header sets. However, much to my chagrin, there is no hash_map in the 3790 DDK. How odd. Furthermore, attempts to use the SGI STL led to Internal Compiler Errors. Did I mention that I hate C++? The only hash_map I could get working with the DDK compiler was the one that comes with current Visual Studio. Since one of the objectives that I failed to mention last time is to get the system away from dependence on an IDE for development, I don’t exactly love the idea of requiring Visual Studio for building. And yes, I know about the excellent and free Visual C++ Toolkit 2003, which is a big improvement, but it would still mean that my dev environment would need the PSDK, the Visual C++ tools, and the DDK. Not much of a simplification from what I have now.
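
For context, the usage in question is nothing exotic; here’s a sketch of the kind of thing that compiles cleanly with the Visual Studio .NET 2003 headers (where hash_map lives in namespace stdext) but has no counterpart in the 3790 DDK’s headers:

    // Hedged sketch: hash_map is non-standard, and its namespace has moved
    // around between header sets; the VS.NET 2003 CRT puts it in stdext.
    #include <hash_map>
    #include <string>

    int main()
    {
        stdext::hash_map<int, std::string> names;
        names[42] = "forty-two";
        return (int)names.size();
    }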

Problem #3: Brain-dead source layout requirements. This one is my favorite. Yeah, we can work around the goofy SDK with its missing header files, and we can find a workable STL somewhere, but the driving reason behind my wanting to use the DDK for building – the BUILD.EXE utility itself – is broken. Really. I never really appreciated it until this conversion. According to the build utility documentation, source files must be present in either the current directory (i.e. the one with the SOURCES file in it), or in the parent directory, or in a platform-specific subdirectory. Dumb. Really.

The havoc that this creates with source trees is laughable, even for the small library I tried converting. Essentially, in order to use BUILD, you have to dump all of your source files (other than those #include’d from somewhere else) into one of these directories. Look around the DDK for some interesting examples of this. Who would want 200 source files in one directory, if they can be sanely and naturally broken up into subdirectories? For what it’s worth, BUILD is *not* the only utility with this assumption built in – there are others (Perforce’s Jam system, for example; more on that later). I know Microsoft builds lots of stuff with BUILD, and I can’t imagine that they like this. The NT source tree must have some crazy exceptions built into the main makefile (looks like it does, from makefile.new examination).

For better or for worse, this kills BUILD for my project, unless we attempt a major source code re-structuring, or do something dumb like generate a static library at every directory.
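
Just to illustrate the “static library at every directory” option I’m dismissing, the tree would end up looking something like this (names invented), with one SOURCES file and one output library per subdirectory:

    # DIRS file at the library root: BUILD recurses into each listed directory
    DIRS=parsing \
         network \
         util

    # parsing\SOURCES (repeated, with variations, in every subdirectory)
    TARGETNAME=mylib_parsing
    TARGETTYPE=LIBRARY
    TARGETPATH=obj
    INCLUDES=..\inc
    SOURCES=lexer.cpp \
            parser.cpp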

Problem #4: Platform support problems. I alluded to this one last time, but was willing to accept it as a limitation, assuming everything else worked as designed. There is no great way to run BUILD.EXE on Linux (WINE might do it, of course, but I haven’t tried it), and there is no way at all on the Mac that I’m aware of, due to the different CPU architecture.

So, the moral of the story is that BUILD is really great for the smallish projects I typically use it on, but it presents a major scalability problem as the complexity of the project increases. Now, with that said, it’s entirely possible (likely?) that I don’t know what the heck I’m doing, and hopefully some bright reader will leave feedback pointing out my stupidity. A couple of readers offered their experiences; I’d love to hear what you guys came up with when you moved your projects.

So, the search for the ultimate build system continues.

Microsoft’s Shared Source Director Is Blogging

I noticed from Larry Osterman’s Blog that Jason Matusow, Microsoft’s Director of Shared Source, has started a blog. Free Software, Open Source, and Microsoft’s versions of both are topics of great interest to me. In addition to being a commercial software development company, we make heavy use of both Windows-based software and of Free/Open Software.

Jason does a good job in his first couple of posts of getting some of the issues on the table. He certainly sounds like he knows what he’s talking about (a rarity when geeks and geek-like people start talking about licensing). I’ll be keeping tabs on what he has to say.

I couldn’t leave this topic without plugging another blog I read, that of Stanford Law Professor Lawrence Lessig. Lessig, along with Richard Stallman of the Free Software Foundation, is one of the world’s leading thinkers on the topics of Free Software and Free Culture, and is always worth a read, even if you happen to disagree with him. Which I rarely do. 🙂

Building with BUILD

At Positive Networks, where I work, we have a complicated set of products to build. We keep hiring new people, and training people on our build system is a pain. When I tell new developers that… yeah… we’re using… Visual Studio 6, they all respond with the same look of surprise, followed by the obligatory “Will that work OK alongside my real compiler?”. Throw in the fact that we support a bunch of drivers for both WinNT+ and Win98+, and that much of our code builds on both Linux and Windows (with Mac support coming), and you have a mess.

I’ve long since switched to using BUILD.EXE, from the DDK, for my own personal projects. It took a little hunting around in the DDK samples and a pretty detailed reading of the BUILD.EXE documentation, but I’ve managed to figure out how to get all of my little side projects building easily.
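
Day-to-day use is pleasantly boring: open one of the DDK’s build environment command prompts, cd to the directory containing the SOURCES (or DIRS) file, and run something like:

    REM -c cleans out old object files first; the rest of the (long) switch
    REM list is covered in the BUILD documentation mentioned above.
    build -cZ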

The benefits to moving to the DDK for building are significant (particularly compared to what we’re using!):

  • New compiler! ISO C++! Yeah!
  • New headers! The new compiler actually supports newer versions of various SDKs and libraries that we use
  • Support for all of the best current practices – /GS for example (more on this later)
  • Cool tools that come with the DDK
  • Simple, well-defined upgrade path – never worry about converting DSP files again
  • Great multi-architecture, multi-OS support (as long as the OS has “Windows” in the name)
  • Multi-processor support
  • More that I’m not thinking of at the moment…

So, it looks good on paper, but will it work? Well, I decided to take one of our static libraries as an example and get it building with the DDK. And so began the struggle. . .

Inside Windows, 4e

Hello again, and sorry for the incredibly long delay between posts. Work has gotten busy lately, and I’ve been running around doing various non-technical things for the past six or so weeks (trade show, media tour, partner launch, major version upgrade of our main product, etc.). I’ve barely had time to think, let alone blog. 🙂

I don’t know how it happened, but Russinovich and Solomon managed to slip another edition of the Inside Windows series past me entirely; I only noticed it last January. The current (fourth) edition, Microsoft Windows Internals, is the update to the very popular Inside Windows 2000, and now covers Windows XP, Windows Server 2003, 64-bit computing, and more.

In particular, they have updated the memory manager section to cover how Windows deals with 64-bit architectures, and have added a section on WOW64. They also added a new chapter on crash dump analysis, which should answer most of the basic questions we see pop up on the WINDBG list.

Overall, it’s a good buy, especially if you happened to like the third edition (which I did). Recommended.

In unrelated news, I’m starting on a bit of an experiment with the build system for my company’s products. We have been using a cobbled-together Visual Studio 6 / PSDK / DDK environment for a very long time, and it’s getting very difficult to manage. So, I’m toying with switching entirely to the DDK for product building. It’ll be interesting to see how well it works, and I’ll definitely write an article or two on it.