I really don’t think the article is all that fair – maybe Kevin Kean and Debby Fry Wilson weren’t able to convince the journalist to tell the other side of the story, I don’t know.
Certainly, while I was at Microsoft, new code was run through code review by real, intelligent people with a keen eye to ensuring that it could not be abused. New features were run through threat modeling by a Security PM, whose job it was to find all kinds of avenues of misuse and abuse, and to rate mitigations and risk. I don’t believe that approach was unique to IIS.
So, far from this being something completely new to Microsoft that has never been addressed, this is something that’s looked at for new code all the time.
Old code, I’m not so sure about, and maybe the article is right in that regard – perhaps old code is not rigorously scanned for unintended uses.
But what software company out there currently scans through old code for unintended uses as a regular part of its security process?
Okay, so we do it here at Texas Imperial Software, every time a new and interesting abuse is revealed against someone else’s software – but ours is a much smaller body of code.