You ask: why did Microsoft train ALL developers on Security?

One of you readers asked me to investigate why Microsoft decided to train all developers on Security, rather than targeting either (a) those developers who touch security-related features or (b) one designated “security expert” on each development team.

You asked, I answer with a collection of quotes from various sources, but basically all from the horse’s mouth (yes Michael, that makes *you* the horse in this analogy).  Please enjoy, and feel free to link to others you might stumble across…
“We need to teach more people about security. Now, you’re probably a geek, or a geek-wanna-be, and I bet you’re thinking, “ah, he’s trying to sell more copies of his book, he wants to teach people about writing secure code.” Ok, that’s true, I think software designers, developers & testers need to understand what it takes to build secure software; the threats have changed, and security no longer resides in the realm of the Security High Priesthood nor the Security Learned Few. Building secure software is simply part of getting the job done. Just like we learned the basics of optimal algorithms in school, kids coming out of school need to know the basics of building code that will run in that most hostile of environments – The Internet.”
“We require our SDL training to emphasize the basics of secure design, development and test – then allow employees and their management to select the training that meets the needs of their particular product or service.  There is one other point that bears mentioning – our training is constantly being reviewed or embellished to make sure that emerging security or privacy issues are being addressed. ”
“If your engineers know nothing about the basic security tenets, common security defect types, basic secure design, or security testing, there really is no reasonable chance they could produce secure software. I say this because, on the average, software engineers don’t pay enough attention to security. They may know quite a lot about security features, but they need to have a better understanding of what it takes to build and deliver secure features. It’s unfortunate that the term security can imply both meanings, because these are two very different security realms. Security features looks at how stuff works, for example the inner operations of the Java or common language runtime (CLR) sandbox, or how encryption algorithms such as DES or RSA work. While these are all interesting and useful topics, knowing that the DES encryption algorithm is a 16-round Feistel network isn’t going to help people build more secure software. Knowing the limitations of DES, and the fact that its key size is woefully small for today’s threats, is very useful, and this kind of detail is the core tenet of how to build secure features.

“The real concern is that most schools, universities, and technical colleges teach security features, and not how to build secure software. This means there are legions of software engineers being churned out by these schools year after year who believe they know how to build secure software because they know how a firewall works. In short, you cannot rely on anyone you hire necessarily understanding how to build security defenses into your software unless you specifically ask about their background and knowledge on the subject.”
(a) “But it is important to note that an education program is critical to the success of the SDL. New college and university graduates in computer science and related disciplines generally lack the training necessary to join the workforce ready and able to design, develop, or test secure software. Even those who have completed course work in security are more likely to have encountered cryptographic algorithms or access control models than buffer overruns or canonicalization flaws. In general, software designers, engineers and testers from industry also lack appropriate security skills.

“Under those circumstances, an organization that seeks to develop secure software must take responsibility for ensuring that its engineering population is appropriately educated. Specific ways of meeting this challenge will vary depending on the size of the organization and the resources available. An organization with a large engineering population may be able to commit to building an in-house program to deliver ongoing security training to its engineers, while a smaller organization may need to rely on external training. At Microsoft, all personnel involved in developing software must go through yearly “security refresher” training.”

(b) “One key aspect of the security pushes of early 2002 was product group team-wide training for all developers, testers, program managers, and documentation personnel. Microsoft has formalized a requirement for annual security education for engineers in organizations whose software is subject to the SDL. The need for an annual update is driven by the fact that security is not a static domain: threats, attacks and defenses evolve. As a result, even engineers who have been fully competent and qualified on the aspects of security that affect their software must have additional training as the threat landscape changes. For example, the importance of integer overflow vulnerabilities has increased dramatically in the last four years, and it has been demonstrated recently that some cryptographic algorithms have previously unrecognized vulnerabilities.

“Microsoft has developed a common introduction and update on security that is presented to engineers in both “live training” and digital media form. Microsoft has used this course as the basis for specialized training by software technology and by engineer role. Microsoft is in the process of building a security education curriculum that will feature further specialization by technology, role, and level of student experience.”
“Hopefully, you realize that reviewing other people’s code, while a good thing to do, is not how you create secure software. You produce secure software by having a process to design, write, test, and document secure systems, and by building time into the schedule to allow for security review, training, and use of tools. Simply designing, writing, testing, and documenting a project, and then looking for security bugs doesn’t create secure software. Code reviewing is just one part of the process, but by itself does not create secure code.”

The Security Development Lifecycle Chapter 5

“If your engineers know nothing about basic security tenets, common security bug types, basic secure design, or security testing, there really is no reasonable chance that they will produce secure software. We say this because, on average, software engineers know very little about software security. By security, we don’t mean understanding security features; we mean understanding what it takes to build and deliver secure features.”

Something broke my CacheMyWork app!

Ever since I joined up with my current employer, I’ve been unable to get consistent results out of my CacheMyWork application. It wasn’t exactly professional quality when I released it, but it did what I wanted nicely on some XP & Vista systems I’d been using.

Since getting my IT-issued notebook, however, I’ve been unable to get the darned thing to work consistently. When I “cache” a half-dozen or more apps, I’ve never yet seen *all* of them start up at my next logon; sometimes I’ve even seen that NONE of them run. And yes, I’m quite certain that the Registry entries are getting successfully created (under HKCU\…\RunOnce) – which means that something is interrupting the execution of these once I’ve logged on.
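For anyone curious about the mechanics: RunOnce entries are just name → command-line string values under HKCU\Software\Microsoft\Windows\CurrentVersion\RunOnce, and the shell deletes each value before launching it, so each entry fires once at the next logon. Here’s a rough Python sketch of the write side (CacheMyWork isn’t written in Python, and the `quote_command` helper is purely illustrative):

```python
import sys

RUNONCE_KEY = r"Software\Microsoft\Windows\CurrentVersion\RunOnce"

def quote_command(exe_path, args=""):
    """Build a RunOnce-style command line; a path containing spaces must be
    quoted or the shell will truncate it at the first space."""
    cmd = f'"{exe_path}"' if " " in exe_path else exe_path
    return f"{cmd} {args}".rstrip()

def cache_apps(apps):
    """Write one RunOnce value per app (Windows only). The shell deletes
    each value before launching it, so entries run once per logon.

    `apps` maps a value name to an (exe_path, args) tuple."""
    if sys.platform != "win32":
        raise OSError("RunOnce lives in the Windows registry")
    import winreg  # stdlib, Windows-only
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, RUNONCE_KEY) as key:
        for name, (exe, args) in apps.items():
            winreg.SetValueEx(key, name, 0, winreg.REG_SZ,
                              quote_command(exe, args))
```

Nothing fancy – which is exactly why it’s frustrating that the values get created fine and yet the apps never launch.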

My suspicions are heavily weighted towards the McAfee suite of security apps, especially AntiSpyware and HIPS (Host Intrusion Prevention Service, aka “Entercept”). I’ve been trying to figure out how to block their activity, even temporarily (which, admittedly, is pretty much what I’m sure they were built to withstand), but no luck.

I’ve tried escalating through the IT service desk folks, but they are pretty much lacking in cluefulness – all I get is pointers to a couple of web pages and a vague escalation process (which seems to terminate in “single-app exceptions that *might* be added to the configuration”). I need to be able to unblock whatever it is that’s intercepting CreateProcess (presumably) by the shell when it’s iterating through the HKCU\…\RunOnce values – the whole point of my app is to let me restart *any* user app, so one-by-one exception allowances is hardly an efficient solution.

I’ll keep digging, but if anyone has ever seen this kind of behaviour and/or has any pearls of wisdom on how to log/troubleshoot RunOnce activity, I’d sure appreciate a nice smack upside the head. [figuratively speaking – I’ll provide other rewards for anyone that actually helps me make progress…]

Heading to Portland, Leaving Microsoft…

After five years living in Seattle (which my wife considers a pale comparison to Portland), Robin has been offered a kick-butt job at a really cool-sounding law firm in Portland, where she’ll practice tax-exempt and estate planning law.  Obviously we’ll be moving to Portland very shortly (in fact, in the next few weeks it looks like…)

That put me in the difficult position of considering whether to remain with Microsoft or venture out to new horizons.  Working for my current team (Security Accelerators – Security and Compliance) has been awesome, but it just can’t be done when not stationed on campus.  My next-best option would be to re-join Microsoft Consulting Services, but I don’t think I’m ready to go back to a delivery role after a few years of the intellectual rewards of working on the fringes of R&D.  I started my career at Microsoft in MCS Canada (take off eh!), and though I don’t regret those experiences, I’m more interested in making a broad impact on the security of large numbers of individuals and organizations than in going back to the one-at-a-time approach (not yet, anyway).

Looking around Portland, there are a lot of high-tech opportunities, but the big players are Intel and McAfee.  Against all odds, a really incredible job opening was available in a smallish, growing group at Intel.  Intel seems like a really interesting place to work, and the group I’m joining has their fingers in many pies: product pen testing, product security development lifecycle, product security consulting.  [I’m pretty sure I’m mischaracterizing their efforts but I’ll clear that up shortly.]

So yes, I’m leaving Microsoft and joining Intel!  For those of you that know me personally, this may come as a shock that I’m joining another international corporation (one good friend of mine thought I’d join a Save the Whales or militant tree-hugging organization – heh :), but it’s just too tempting to pass up: leveraging my past experiences in a new environment, learning a whole new set of technologies around hardware development, software & hardware programming, and stepping into a new role with new perspectives, leaders and fresh thinking – how can I beat that?

One of my colleagues asked me to start blogging about life after Microsoft (aka life in my new job), and I think I’ll take him up on that.  It should be fun to reflect on my new experiences, especially after so long in the belly of the Beast. 🙂

‘Embarrassed’ Gun Suspect Sues Microsoft After FBI Finds Sex Videos On His PC – Technology News by InformationWeek

You’re kidding, right?  This guy surfs for porn, bases a purchase decision on his need to surf for porn anonymously, and then sues Microsoft when he (inevitably) gets caught?  I know there are many folks out there that don’t understand the difference between “delete” and “wipe all traces”.  I’ve had to explain this to countless folks who want to understand how to actually protect their data from snoopers and thieves.

However, I’ve rarely known anyone with “questionable” surfing habits – and especially those who know their habits are questionable or at least embarrassing – to not investigate deep into the computer to make sure there are six ways from Sunday protecting them from unaware spouses, curious kids and wary employers.

What you do in the privacy of your own computer is your business, and I’m hardly condemning this guy for his legal and not-uncommon activities.  Still, he needs to cash a reality check if he has some expectation of privacy from law enforcement officials when he’s relying only on “automated delete” features of any piece of software.  There are tons of discussions, web sites and vendors hawking cheap/strong/free encryption products, disk wipers, “trace erasers” and the like.

It’s no coincidence that Zimmermann’s PGP stood for “Pretty Good Privacy” – if you want to keep something to yourself, you’d better lock it behind a reasonable key and not leave the kitchen window wide open.

Heck, at least try some of the in-box encryption technology, before you go laying blame at Microsoft’s feet.  Then you’ve got at least a toe or two to stand on…

Link to ‘Embarrassed’ Gun Suspect Sues Microsoft After FBI Finds Sex Videos On His PC – Technology News by InformationWeek

Trusted Computing Best Practices, the TNC spec, and Microsoft’s involvement – hypocritical?

Below are excerpts from Bruce Schneier’s “Schneier on Security” blog, asserting that Microsoft is making an effort to prevent the TCG’s software-only TPM spec from applying to Windows Vista before its release:

In May, the Trusted Computing Group published a best practices document: “Design, Implementation, and Usage Principles for TPM-Based Platforms.” Written for users and implementers of TCG technology, the document tries to draw a line between good uses and bad uses of this technology.


Meanwhile, the TCG built a purely software version of the specification: Trusted Network Connect (TNC). Basically, it’s a TCG system without a TPM.

The best practices document doesn’t apply to TNC, because Microsoft (as a member of the TCG board of directors) blocked it. The excuse is that the document hadn’t been written with software-only applications in mind, so it shouldn’t apply to software-only TCG systems.

This is absurd. The document outlines best practices for how the system is used. There’s nothing in it about how the system works internally. There’s nothing unique to hardware-based systems, nothing that would be different for software-only systems. You can go through the document yourself and replace all references to “TPM” or “hardware” with “software” (or, better yet, “hardware or software”) in five minutes. There are about a dozen changes, and none of them make any meaningful difference.

If true, this feels to me like some form of hypocrisy, at least at a company level. Microsoft took a decidedly different stance on the use of the “no execute” (NX) feature of the latest generation of CPUs from Intel and AMD, and in an ideal world I’d expect them to do the same here.

In the release of Windows XP’s Service Pack 2 (SP2), they implemented changes to the OS that would enable it to assert the “no execute” flag on any and all processes running on the system – if a process attempted to execute a “page” that was previously considered a data page (i.e. data, not executable code), then the OS could immediately halt the program and alert the user. The intent is to prevent things like “buffer overruns” from being able to successfully circumvent a program’s intended purpose and ultimately cause the program to do something the attacker wishes (usually a malicious attack on the OS, its programs, or the user’s data). Worms and viruses have had a field day with this kind of attack for years, and Microsoft and the CPU vendors finally got around to implementing an idea that had kicked around the security community for quite a while.

So far so good. However, while this feature was intended to work with the cooperation of software and hardware, it left most of the existing base of XP users (those without NX-capable CPUs) up the creek. So Microsoft decided to implement a subset of those ideas on any computer running Windows XP SP2. This is a software-only implementation of NX – not perfect, not foolproof, and definitely not as strong as the hardware-backed NX you get with the NX-capable CPUs, but a major leap forward from the “buffer overrun friendly” versions of Windows that have preceded it.
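As an aside, hardware NX is easy to demonstrate on any modern OS: map a page read/write but not execute, drop an instruction into it, and jump to it. Here’s a quick Python-on-Linux sketch – nothing to do with XP’s software-only DEP, it just shows the CPU feature doing its job (assumes an x86-64 Linux box with NX enabled):

```python
import ctypes
import mmap
import os
import signal

def nx_blocks_data_execution():
    """Map a read/write (but NOT execute) page, write a single x86 'ret'
    instruction into it, then try to call it from a forked child.
    Under hardware NX the child dies with SIGSEGV instead of returning."""
    buf = mmap.mmap(-1, mmap.PAGESIZE,
                    prot=mmap.PROT_READ | mmap.PROT_WRITE)
    buf.write(b"\xc3")  # x86/x86-64 'ret' opcode
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
    pid = os.fork()
    if pid == 0:
        # Child: treat the data page as code. With NX this faults immediately.
        ctypes.CFUNCTYPE(None)(addr)()
        os._exit(0)  # only reached if the page was somehow executable
    _, status = os.waitpid(pid, 0)
    return os.WIFSIGNALED(status) and os.WTERMSIG(status) == signal.SIGSEGV
```

The software-only DEP in SP2 can’t give you that page-level guarantee; it approximates it with checks like exception-handler validation, which is why it’s “not as strong” but still a big improvement.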

And actually, it seems to work pretty well. I’ve enabled the NX feature on all the computers I touch, and seen it catch a number of programs (in most cases accidentally) doing the very things that NX is set to trap. It doesn’t interfere with the stable, mature applications I’m running, and it hasn’t yet prevented me from doing anything really important. Mostly, it’s trapped this behaviour in the third-party “shareware” type apps that are nice to have. [Hopefully I’ve been able to help the developers of these apps by sending them the crash dumps. When XP SP2 notifies me that an app was caught by NX, I’ll trace through the dialogs that tell me where the dump files are located – indicated as the “technical information” that would be submitted to Microsoft through the Error Reporting feature – find the dump folder, Zip up a copy, and email that Zip file to the ISV who developed the app. Microsoft probably does this as well for apps that often show up in their error reporting queues, but I figure it can’t hurt to make sure anyway. Hint: I don’t have one on my system right now – the folder is deleted once it’s uploaded to Microsoft’s error reporting site – but the crash dump files will be written to your %temp% folder, with a folder name containing “WER”, and the major files will have the extensions “.hdmp” and “.mdmp”. The files compress quite well.]
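If anyone wants to automate that dump-hunting, here’s a rough Python sketch of the zip-up step (the “WER” folder-name match and the .hdmp/.mdmp extensions are from my memory of SP2’s behaviour, so treat them as assumptions):

```python
import os
import tempfile
import zipfile

def collect_crash_dumps(temp_dir=None, out_zip="crashdumps.zip"):
    """Scan the temp folder for WER-named subfolders and zip up any
    .hdmp/.mdmp minidumps found inside them.
    Returns the list of file paths that were archived."""
    temp_dir = temp_dir or tempfile.gettempdir()
    archived = []
    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(temp_dir):
            if "WER" not in os.path.basename(root):
                continue  # only look inside Error Reporting folders
            for name in files:
                if name.lower().endswith((".hdmp", ".mdmp")):
                    path = os.path.join(root, name)
                    zf.write(path, arcname=name)
                    archived.append(path)
    return archived
```

Run it right after the NX dialog appears (before the folder gets cleaned up on upload) and you’ve got a ready-to-email Zip for the ISV.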

So here’s my concern: if Microsoft’s Windows division was comfortable with taking a hardware-assisted feature like NX and implementing it as a “software-only” feature, wouldn’t it seem hypocritical to resist applying a software-only spec for TPM to the premier OS next on the horizon? I know I’m being naive here, but it seems like Microsoft would be in a near-ideal position to apply TNC to Vista. They’ve been working on the formerly code-named “Palladium” technology for ages now – or at least talking about it in the press. As well, they’ve apparently been involved with the TCG and the development of these documents for quite a while now, and presumably had at least some level of influence over their content (though probably not a dominant hand in them, given the number of other players with just as much at stake here).

So I wonder aloud: what possible benefit does Microsoft gain from Vista “escaping” the confines of the TNC spec? I would guess it’s because, at this late stage in the development of Windows Vista (they just passed Beta 1), there aren’t a lot of fundamental changes to the OS that could be introduced – without significant risk of delaying the release of Vista AGAIN. [How many scheduling delays now, and how many valuable features REMOVED to keep the schedule from slipping further?]

Perhaps there are other just as innocent explanations as well, e.g.:

  • They’ve been trying to get the TNC spec worked into Vista all along, but at the same time as they decided to pull the “Palladium” features out of Vista, they also had to decide whether to further delay Vista (and continue to stabilize the TNC components) or take the TNC components out of Vista and stabilize the Vista ship schedule.
  • The TNC spec may have taken a late change that drastically altered the requirements for Vista, and the Vista team couldn’t add the major code change without resetting the Vista development milestones.
  • There are plans to add TNC into Vista post-RTM – not unlike the way that many significant features were added to XP via SP2.

It would certainly help quell a potential firestorm of controversy if Microsoft got out ahead of Schneier’s allegations and discussed their plans for TNC implementation in Windows, and what prevents them from incorporating the spec in Vista before it ships. Despite the nefarious personality that some would like to attribute to every action from Microsoft, I’ve found that the people I’ve met and with whom I’ve worked there really do have the best of intentions at heart.