Windows Vista’s Full Volume Encryption & TPM, part 4: available PCs that include TPM 1.2 chip

[Edit: corrected the Broadcom adapter model #, and removed the listing for the Dell Precision 380 Workstation, which turns out to only have a TPM 1.1b chip via the Broadcom BCM5751 chip.]

Since I only talked about Tablet PCs in part 2, I figure I owe it to the community to collect together a listing of any and all shipping PCs that include a v1.2 TPM chip.

What follows are all the servers, desktops, notebooks and Tablet PCs that I could confirm currently include a TPM 1.2 chip:

Servers
none to date

Desktops & Workstations
Dell Optiplex GX620
http://www1.us.dell.com/content/products/productdetails.aspx/optix_gx620?c=us&cs=555&l=en&s=biz
Gateway FX400XL (via Broadcom NIC referenced here)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400xl&seg=hm
Gateway FX400S (via Broadcom NIC referenced here)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400s&seg=hm
Gateway FX400X (via Broadcom NIC referenced here)
http://www.gateway.com/products/GConfig/proddetails.asp?system_id=fx400x&seg=hm
Gateway E-6500D SB (via Broadcom NIC referenced here)
http://www.gateway.com/products/gconfig/proddetails.asp?system_id=e6500dsb&seg=sb
HP Compaq Business Desktop DC7600 (via Broadcom NIC)
http://h10010.www1.hp.com/wwpc/us/en/sm/WF04a/12454-64287-89301-321860-f56.html
Vector GZ desktop
http://www.pdspc.com/products/vectorgz.aspx

Notebooks
Gateway M250 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym250b
Gateway M460 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym460b
Gateway M680 Series
http://www.gateway.com/products/gconfig/prodseries.asp?seg=sb&gcseries=gtwym680b

** HP TC4200 [THEORY: the TPM is an orderable part (Part #383545-001, $42.00 list price), which implies that it’s a removable/replaceable part (and thus that a TPM 1.2 chip could be swapped in later), but this is only an unconfirmed theory on my part] **

Tablets
Gateway M280 Series
http://www.gateway.com/products/gconfig/proddetails.asp?seg=sb&system_id=m280eb

Bonus 1: Add-on Components
Broadcom BCM5752 & BCM5752M network controller chips (each of which has an integrated TPM 1.2 chip)
http://www.broadcom.com/press/release.php?id=700509

Bonus 2: Linux drivers
Linux driver with support for Infineon’s TPM v1.2 chip
http://www.prosec.rub.de/tpm/

And again, don’t forget to check Tony McFadden’s TPM Matrix. NOTE: I only used Tony’s TPM Matrix to start my search – I haven’t copied any entries without external confirmation, so there may be disagreements between our pages. When in doubt, remember that unless I could confirm a TPM 1.2 chip was included in a PC system, I did not list that system here. Tony’s page is meant to be more comprehensive, so he lists both PC systems with TPM 1.1 chips as well as those with unknown chips or which haven’t been confirmed to include a TPM chip.

P.S. Do you know of any other PC systems shipping a TPM 1.2 chip? If so, add your comment below!


P.P.S. What have I learned in my searches for TPM 1.2-integrated PC systems? Here are a couple of tips that may be helpful if and when you head off on your own search:

  1. If the spec sheet only mentions non-version-specific phrases such as “TPM chip”, “TPM Embedded Security Chip” or “the TCG standard” [emphasis mine], you can and should assume that the chip is a TPM 1.1 chip. Anytime I was able to confirm a TPM 1.2 chip, the PC system vendor made specific and repeated mention of the 1.2 version number. [Apparently this is a big differentiator, though few if any references on the Internet have clarified why.]
  2. If you are looking into a PC that was shipped before Summer 2005, you can rest assured that it did NOT ship with a TPM 1.2 chip, since the TPM chip vendors didn’t have production chips on the market until at least mid-summer of 2005.

Windows Vista’s Full Volume Encryption & TPM, part 3: links & background reading

Paul Thurrott indicates that FVE will appear in Enterprise & Ultimate editions of Vista:
http://www.winsupersite.com/showcase/winvista_beta1_vs_tiger_02.asp

Bart DeSmet digs in deep on EFS, TPM, Secure Startup and more:
http://community.bartdesmet.net/blogs/bart/archive/2005/08/17/3471.aspx

David Berlind speculates on possible incompatibility between Vista/TPM & virtual machine technology:
http://blogs.zdnet.com/microsoftvistas/?p=17

George Ou shines light on a potential key export “backdoor” for FVE, and his ideas on why smartcards would be an ideal FVE key storage mechanism:
http://blogs.zdnet.com/Ou/?p=109

William Knight vaguely alludes to some proprietary algorithms used in FVE that could lead to “a possibility of in-memory attacks for keys.”
http://www.contractoruk.com/002386.html

David Berlind speculates again on a possible use of the TPM by Windows Product Activation (totally unconfirmed at this point):
http://blogs.zdnet.com/microsoftvistas/?p=44

An out-of-date but still “best there is” collection of TPM-related hardware, software and integration information:
http://www.tonymcfadden.net/tpmvendors.html

And last but not least, Microsoft’s Technical Overview of FVE:
http://www.microsoft.com/whdc/system/platform/pcdesign/secure-start_tech.mspx

Windows Vista’s Full Volume Encryption & TPM, part 2: FVE on Tablet PC?

OK, so where was I when I last left the TPM topic? Oh yeah…

Frankly I don’t know what to think about the state of TPM-backed data encryption. I really *want* to be able to say “yeah baby – your best bet for securing data on a laptop will be Vista’s FVE” (or any other OS-level TPM-backed file encryption option). For a few hours, I actually believed it could be true – not just for an individual, but for any really big organization as well.

However, the past couple of months’ effort has me pretty much convinced otherwise. I’m not exactly optimistic for the prospect of widespread TPM-secured data protection in the near future.

It looks to me like Full Volume Encryption (FVE) in Windows Vista won’t be a viable option for anyone who isn’t prepared to drop a bundle on new computing hardware at the same time. That’s because there are almost no computers – especially mobile computers – on the market that have a v1.2 TPM.

While I realize that there are other IHV- and ISV-supplied TSS packages to support TPM-backed file encryption, I am mostly focused on Vista FVE for a few reasons:

  1. Until a service is provided in the box with the OS, my experience with customers is that integrating vendor-specific security software is a huge hassle, and isn’t supportable at scale over shorter periods of time (e.g. 2-3 years).
  2. There’ll often be more than one TPM-enabled package to support – generally, it looks like an organization will end up with multiple packages: one for each desktop/notebook/tablet/server vendor that integrates a different TPM module.
  3. It’s not clear at this time how the TSS packages are licensed, but I’ll take a SWAG and assume that you’re only licensed to use the TSS package on the system with which it was shipped, and that you’ll have to pay extra to use that package on PCs that were shipped with a different TSS package.
  4. An organization could scrap the bundled software packages entirely and just license a third-party product across the board (e.g. Wave), but the choices are pretty limited from what I’ve seen, and personally (without having had any hands-on experience to support my gut feeling) I don’t know how much confidence I’d have locking my organization’s most prized data up under this – e.g. what’s the enterprise management (archival & recovery, configuration management, identity management) story like?
  5. [Disclosure: I’m a former Microsoft employee, security consultant and spent most of my tenure consulting on EFS, RMS and other security technologies.]

I’ve been in the market for a new laptop for a while, and one of the reasons for my recent obsession with TPM is that (a) any purchase I make now will have to last well beyond the release date of Vista, (b) since I intend to continue to leverage my Windows security expertise, I should really get a computer that supports FVE so I get first-hand knowledge of how it works, and (c) you generally can’t add a TPM chip to a computer after you’ve purchased it (with at least one known exception).

Oh, and I’ve committed myself to the Tablet PC variant, since I am a committed “whiteboard zealot” and I expect to use the freehand drawing capability quite a bit.

So my mission is to find a Tablet PC that meets my “requirements”:

  • TPM v1.2 chip
  • max RAM > 1 GB
  • dedicated video RAM > 32 MB (to support the lunatic Vista graphical improvements)
  • can run from battery for at least three hours a day (e.g. bus rides to and from work, meetings away from my desk)
  • won’t break my wrist if I use it standing up (e.g. weight under 5 lbs)
  • will withstand dropping it once in a while – I’m more than a bit clumsy

I have spent countless hours scouring the Internet for TPM-enabled Tablets. After my initial survey of the PC vendors’ offerings, I figured there’d be at least a couple of options from which to choose. However, the longer I looked, the bleaker it became. Of the major vendors of Tablet PCs (Acer, Fujitsu, Gateway, HP, Lenovo, Motion and Toshiba), I have so far found exactly ONE Tablet on the market with a v1.2 TPM chip.

One.

And not exactly the industry standard for large enterprise deployment – Gateway!

Did I mention that Windows Vista will require the v1.2 chip to support Secure Startup and Full Volume Encryption?

Oh, and did you hear that Microsoft is trying like h*** to get Tablet PCs in the hands of as many users as possible?

Geez Louise, I even went so far as to contact Fujitsu (who have a really fantastic Tablet with a v1.1 TPM chip) to see if they were sitting on any about-to-be-released v1.2-enabled Tablets, asking them the following:

Could you give me some idea of the following:
– whether Fujitsu is committed to integrating v1.2 TPM chips in their computing products?
– when we can expect to see Tablet PCs with v1.2 TPM chips integrated into them?
– Any planned model or series of Tablets that the v1.2 TPM chips will be used in e.g. Lifebook 4000 series, Slate vs. Convertible, etc.?

And this is the response I got:

We fully intend to continue our support of TPM and transition to v1.2.

However, at this time we can not provide a date as to when this will be available. Fujitsu company policy and NDA agreements with suppliers do not allow us to publicly disclose future plans prior to product launch.

So what’s a guy to think? Right now we’ve got exactly one FVE-ready Tablet on the market, and according to this guy, the big wave of computer upgrades in the business sector may already be passing by. [Let me ignore the fact that I haven’t looked into notebooks yet, and assume that TPM v1.2-equipped notebooks are just as scarce. I’ll check into this further and report back.]

Between now and the shipment of Vista (perhaps October 2006, if you can believe these rumours), less than a year away, am I to believe that hordes of TPM v1.2-equipped PCs will show up on people’s desks? If so, then perhaps there might be a minority of organizations who would consider testing the Vista FVE technology (though I doubt they’d be ready to standardize on it, assuming – rightly – that they’ll have less than a majority of Vista FVE-ready PCs in their organization).

But even if TPM v1.2-equipped PCs were to quickly dominate these organizations, would I feel comfortable urging such organizations to adopt Vista to enable use of FVE to protect their data? I honestly don’t know – I don’t feel a resounding “YES” coming on, but neither do I feel a “NO” building in my gut. Perhaps it’s because I feel like this question won’t be practical for a number of years yet.

By requiring the v1.2 TPM chip for FVE & Secure Startup, I believe that:

  • Third-party TSS packages will get a lot of leeway to take the “organizational standard” position – especially for those TSS packages that also support v1.2 TPM chips
  • Most mid-sized to large organizations won’t be in a position to adopt FVE & SS as their data protection standard until say 2008 or later.

This leaves me wondering: what data will be left to protect by then? Given that most organizations are being forced through one regulation or another to encrypt customer-sensitive data, I believe that the next couple of years will be the final window for unencrypted user data to reside on client PCs.

Put another way: if you’re the InfoSec officer in charge of recommended strategies for regulatory compliance & avoiding liability, wouldn’t you rather just encrypt every disk on every “physically insecure” PC throughout the organization? That’s one sure-fire way to know that users haven’t accidentally stored a sensitive file in an unencrypted volume, folder or file. Only then would the organization be able to claim that a lost or stolen PC did not contain unencrypted customer data.

[Now, sure, in 3-5 years there’ll be room to re-evaluate the technology used to maintain protected data on hard drives, and it’s quite possible that by then Vista’s SS & FVE will get the nod from many organizations. Migrating from one highly-technical solution to another is never easy in large orgs, and is pretty scary for small outfits or self-supporting end users, but I’m leaving the door open for the landscape to change beyond my wildest imaginings in the 3-5 year timeframe…]

Does anyone see things differently? Does Vista FVE look like it’ll capture a significant portion of the “data protection” market? I’d really like to be wrong about this – it would suck if the best “free” on-disk data protection technology to come out of Microsoft won’t be practical for the majority until long after they had to commit to another on-disk encryption solution.

Database "intrusion detection" appliance – nice thinking, hope to see more like this

http://www.computerworld.com/securitytopics/security/story/0,10801,105429,00.html

I am very much interested in anything that helps an organization get a handle on the kinds of “attacks” this device is intended to detect.

My first reaction when I read “The current version of the Symantec appliance does not actually block suspicious queries — it simply monitors and reports on what the database is up to — but that feature is being considered for a future version…” was – Wow, doesn’t that make this a pretty useless piece of tech then?

However, when I think back on all the customers with whom I’ve worked, I’ve found that most of them are happy enough to be able to detect unauthorized behaviour. Sure, if preventative controls cost no more (time, effort, resources, usability) than the equivalent detective control, they’d be happy to use that instead. However, most of us have had enough experience with “prevention is the only path to security” approaches to understand that preventative security can only guarantee that it’ll block some form of intended usage, and that (as Schneier so often points out) the bad guys will always find some other way to accomplish their goals, if they’re determined enough.

Such as: if you block unauthorized use through a database “intrusion prevention” appliance, the bad guys will then try other attack vectors such as:

  • escalating the privilege of an account that doesn’t start with sufficient privilege
  • finding an account that does have sufficient privilege and breaking its password
  • finding an alternate path to the database that doesn’t go through the database IP appliance
  • cracking the appliance (sure, of course it’s impregnable, but…)
  • DoS’ing the appliance (say if nothing else worked, and they’re just frustrated enough to want to do *some* harm)

Bottom line: I like the thinking that went into Symantec’s database security appliance, and I hope to see more creative ideas like this in the future. As the article said, “…enterprise users are becoming increasingly focused on data security and regulation compliance.” [emphasis mine]

Windows Vista’s Full Disk Encryption is only available if you have Microsoft Software Assurance?

http://www.computerworld.com/softwaretopics/os/windows/story/0,10801,104918,00.html

Wow – personally, I think someone in marketing at Microsoft has miscalculated on this one. Don’t get me wrong, I can understand the rationale – “Well, most of the customers that have asked us for this feature are already on Software Assurance or wouldn’t have to spend much additional $$$ to get it. The smaller orgs still have EFS to be able to protect their data, and since they haven’t asked for anything else, they must be satisfied with EFS right?”

I don’t buy it – here’s my thinking:

  • Just because those few organizations who’ve actually taken the time to articulate their needs happen to have the SA arrangements already made (or have the EA leverage to negotiate cheap SA rates), doesn’t mean they’re the only ones who would (or could) use this feature;
  • Just because SA has been considered by many Microsoft customers to be a rip-off, and not worth buying again, shouldn’t lead to the effect (intentional or not) of holding some of the most critical features of Vista hostage from the rest of the Microsoft customer base (especially those who wish to purchase one of the premium Vista SKUs such as the rumoured Professional or Full Media editions);
  • Many of the organizations who haven’t explicitly articulated a need to their Microsoft reps for Windows-native full disk encryption [at least based on my experience with them] are either (a) still struggling with their much more limited – and challenging, in most cases – deployments of some form of file encryption on users’ PCs, and are sick of talking about encryption, or (b) have committed to another technology because Microsoft hasn’t yet provided a solution for this critical business need. However counterintuitive it might sound, those organizations who fall under (b) should be given the chance to try Vista’s full-disk encryption without having to commit to SA to do so. Many organizations with whom I’ve worked have told me they’d far rather use a technology that already comes with the products they’re using, than have to integrate yet another piece of third-party hardware into an already-overly-complex “desktop” deployment – just so long as they believe the built-in technology reasonably achieves their overall goals. Nothing like hands-on testing (and widespread talk from others also testing) to help convince them – but it’s very difficult to get that groundswell of opinion when so few organizations even qualify to be able to use a technology like Secure Startup.

It’s not like the need isn’t critical in every organization – just the opposite in fact, based on my experience with customers over the years. I wonder if it just happens that there hasn’t been enough formal market research at Microsoft to show how widespread the need really is.

Makes me wonder what ELSE is being locked up in the SA-only Vista Enterprise SKU. I’d love to hear a response to this from those at Microsoft who’ll have to defend this to the legions of Microsoft customers for whom Secure Startup won’t be available…

EFS + SYSKEY followup, NTBackup and EFS-TPM integration

A colleague recently asked me about a previous post of mine:

“Mike, in your blog you mentioned you must use Syskey for real protection of EFS protected data. You said if you didn’t use Syskey, it was relatively easy to get to EFS files. So 3 questions that I haven’t been able to find an answer:

  1. Are there any public attacks documented or tools to get to EFS protected data, other than cracking the user desktop login password? If yes, please link. I guess this would be cracking the DPAPI secure store.
  2. What NTBackup options are required to keep the data encrypted in the .bkf file? If there isn’t a way, how can data files in incremental backups be safely encrypted?
  3. Dell is now shipping TPMv1.1 chips in their Inspiron & Latitude laptops. Can EFS private keys be stored there? How can you know that the private key is actually stored in the TPM chip?”

First I should clear up the misunderstanding I may have created regarding SYSKEY and EFS. What I meant to assert is that EFS files are relatively easy to get at (for educated attackers) unless you use either:
(a) SYSKEY boot floppy or SYSKEY boot password, or
(b) domain logon accounts (and a relatively decent password/passphrase).

I don’t generally recommend SYSKEY in a domain environment; instead I recommend domain accounts and strong passwords or passphrases for reasonable security against brute force attacks.

As for the direct questions I *was* asked:

  1. There are no cryptographic “backdoors” to attack EFS data – the cryptography behind EFS, combined with the reliance on multiple layers of protection of the encryption keys, follows the usual best practices for software-based data encryption. I have faith in DPAPI to do what it sets out to do, and to be as secure as any software-based encryption implementation can be. However, there are a number of potential attacks on EFS’d data – none of them “magic”, but really just predictable consequences of both (a) the ways that keys must be stored on disk and (b) the integration of EFS with the Windows logon infrastructure.
  2. No parameters or configuration are necessary for NTBackup to back up encrypted files – its default behaviour natively handles EFS-encrypted files. NTBackup is one of a class of applications that use the RAW APIs. Applications that call these APIs are requesting that NTFS give them the “raw” file along with the EFS alternate data streams, all in a single binary stream. This means that NTBackup gets a copy of the encrypted file and its keys, so that the backup files contain everything that’s needed to decrypt the files later. When NTBackup restores such files to an NTFS filesystem, you get back the encrypted file intact with its encryption keys. So you can back up any files you like with NTBackup – full, incremental, whatever – and rest assured that the backups are no more vulnerable than the original files. While some backup solutions end up with plaintext copies of the files, those backup apps that use the RAW APIs never expose the unencrypted file contents to later attack.
  3. All currently released versions of Windows are hard coded to ONLY use the native software CSPs for EFS (specifically, the Base or Enhanced CSPs) – they can’t use any other CSPs for EFS, even the oft-requested smart cards (nor the TPM-enabled CSPs). I have no idea whether there will be support for TPM storage of EFS private keys in Windows Vista, though they have announced plans to include EFS-private-key-on-smartcard support. They also mention support for a “full volume encryption” feature (AFAIK, unrelated to EFS) that would work on systems with TPM v1.2 chips. I assume the TPM software dictates how keys are managed, but until there’s any information on whether non-smartcard CSPs are supported, I can only speculate how “enforcing” TPM storage could possibly work. At this point, I believe the “enforce smartcard” option in Windows Vista EFS is a simple checkbox, so it’s probably hard-coded to look for smartcard CSPs only.
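Under the covers, the “raw” backup path that NTBackup uses is exposed through a small set of Win32 APIs. Here’s a minimal C sketch of the export half (Windows-only, link against advapi32; the file names are made up and most error handling is omitted – an illustration, not production backup code):

```c
#include <windows.h>

/* Export callback: NTFS hands back opaque chunks of the still-encrypted
   stream (file data plus the EFS metadata/key blobs); we just write
   them to the backup file. */
static DWORD WINAPI ExportCb(PBYTE pbData, PVOID pvCallbackContext, ULONG ulLength)
{
    DWORD written;
    HANDLE hBackup = (HANDLE)pvCallbackContext;
    return WriteFile(hBackup, pbData, ulLength, &written, NULL)
               ? ERROR_SUCCESS : GetLastError();
}

int main(void)
{
    PVOID ctx = NULL;
    HANDLE hBackup = CreateFileW(L"secret.efsraw", GENERIC_WRITE, 0, NULL,
                                 CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);

    /* Flags = 0 opens the file for raw export: nothing is decrypted,
       and the caller does not need the owner's EFS private key. */
    if (OpenEncryptedFileRaw(L"C:\\data\\secret.txt", 0, &ctx) == ERROR_SUCCESS) {
        ReadEncryptedFileRaw(ExportCb, hBackup, ctx);  /* pump the stream */
        CloseEncryptedFileRaw(ctx);
    }
    CloseHandle(hBackup);
    return 0;
}
```

Restore is the mirror image: OpenEncryptedFileRaw with the CREATE_FOR_IMPORT flag, then WriteEncryptedFileRaw with an import callback that feeds the saved stream back to NTFS, which stores the file re-encrypted with its keys intact.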

I had a quick look around the Internet for current details on leveraging a TPM (Trusted Platform Module) chip for encrypting files on disk – here’s what I learned on my first pass:

  • There’s very little mention of which version of the TPM spec is supported on most PCs in the market today – or at least, that information is not easy to uncover. So far the only mention I’ve found on Dell & Toshiba’s sites is “v1.2” for certain Optiplex models, and v1.1 Infineon chips in the Toshiba Tecra M4 & Latitude systems you mentioned.
  • So far I don’t know if there are any significant differences between v1.1 & v1.2 TPM chips in terms of support from the CSPs, and what application scenarios are/are not supported by each version. Maybe the differences are negligible, maybe there’s an order of magnitude more possibilities once you have v1.2. [Or maybe that just happens to be what the “full volume encryption” team was willing to test, even if v1.1 would have been just as good for this scenario.]
  • Seems like every PC vendor has some models shipping with TPM chips – IBM/Lenovo (Atmel), Toshiba (Infineon), HP, Dell. Good news for us.
  • Seems like there’s only a small number of application + CSP suites out there so far that enable TPM in XP:
    • Some suites leverage particular application APIs that require third party plug-ins (e.g. Dell/Wave)
    • Others (e.g. Toshiba’s suite) “support” EFS features – I don’t know what this means, as the documentation I’ve seen is too vague to be sure:
      • Does it merely leverage the DRA public key to provide a recovery path for the Personal Secure Drive (encrypted virtual drive)?
      • Does it encrypt the contents of the user’s profile with keys protected by the TPM?
      • Does it somehow provide a redirection layer so that the RSA files in the user’s profile are actually encrypted by TPM-protected keys before the Windows CSPs drop the files on disk?

This is fascinating, and a lot more than I expected to turn up. It seems that TPM has finally started to catch on with the PC vendors – I was shocked to see that pretty much all the major PC vendors had TPM-enabled PCs. It’s not that I didn’t expect this to happen, but that since I hadn’t heard any of my customers asking me about this so far, I assumed it was still “on the horizon” (like “the year of the PKI” is still just a year or two away, for the tenth year in a row).

I’m going to devote some serious research into the state of TPM-enabled data encryption, and over the next few posts I’ll be putting up my findings and opinions on where I think TPM-enabled encryption fits into the kinds of solutions I normally recommend.

Watch for it.

National Standard for Data Security? It’s about freakin’ time

http://www.computerworld.com/securitytopics/security/story/0,10801,103558,00.html?source=NLT_AM&nid=103558

I predict this will be a watershed moment in terms of focus on the security of DATA, and will (thankfully) take the primacy away from perimeter, network and host security (which in my opinion has consumed an inordinate share of attention, leaving the ONLY UNIQUE [information security] AND IRREPLACEABLE ASSET EACH ORGANIZATION HAS – their data – to languish in insecure obscurity). Let’s hope this helps get those infosec security audit & remediation efforts refocused on the ASSETS part, not the IMPACT part, of the threat analysis equation.

Not to underestimate the efforts this will kick off, I believe those truly interested in securing the privacy and confidentiality of their customers’ data (credit cards, PII and other privacy-occluded data) will have to spend considerable effort on:

  • re-examining their business process data flows, and the processes for assuring the security of the data at all stages (and throughout the process) in its storage, processing and transmission
  • cryptography and key management – not just in implementing “encryption”, but in ensuring that the implemented encryption isn’t just an obfuscation step – that the encryption provides real security benefit against the expected (and likely) threats
  • backup and recovery processes, to ensure data is handled in “archived” form just as securely as in “live” form
  • ensuring they have good reasons to collect any and all data from or about their customers, and having solid justification for storing any of that data (whether short- or long-term).

Encrypting files on the server – WTF???

I can’t tell you how irritated I get when I read yet another recommendation from some well-meaning security expert that says you should use EFS to encrypt files on a Windows SERVER. I have little or no problem with EFS on a Windows CLIENT (though if you’re not using domain accounts, or you don’t use SYSKEY [shudder], you’re only keeping your files safe from grandma, not your kids), but I have to wonder how many people understand how decryption keys are protected (and by what) when they recommend using EFS on a server.

SQL Database (mdf) encryption example
Let’s take a simple case: you want to protect your SQL database files from remote attackers, so naturally you think “I’ll encrypt the data using EFS – cheap, free and easy – and then remote attackers won’t be able to access the data.” Yes, in one sense that is quite true – if a remote attacker were to try to copy the files on disk – e.g. from a buffer overflow exploit that gave them remote LocalSystem access – then NTFS would return an Access Denied error. But consider what it takes to set that up:

  • When you encrypt a file that is to be accessible to a Service (such as the “MS SQL Server” service that mounts the SQL database files), you are in reality required to encrypt the file in the context of the service account under which the service runs.
  • In this example, you’d have to encrypt in the MSSQLServer service’s account context – and if you’ve been reading your SQL Server security guidance, you’ll already have created a service account and downgraded MSSQLServer from the default LocalSystem service account context.
  • This means that only the service account (e.g. you’ve created a local account named SERVER\SQLServiceAcct) can decrypt the files.
  • What happens when the service starts? The service “logs on” with the SQLServiceAcct (actually the Service Control Manager calls CreateProcessAsUser() or similar API and runs the new process in the context of the user account specified as the Service Account in the service’s configuration).
  • How does the Service Control Manager “authenticate” the service? The service account name is stored in cleartext in the Registry, and the service account password is stored as an LSA Secret elsewhere in the Registry.
  • LSA Secrets are ACL’d so they are not readable by any user except the LocalSystem, and they are further encrypted with the System Key (aka SYSKEY), so that only the LSA process (which has the ability to use the SYSKEY decryption key) could access the LSA Secrets.
  • [AFAIK] The Service Control Manager requests that the LSA decrypt the service account password and pass it to the Service Control Manager for use in the CreateProcessAsUser() API call.
  • Once the MSSQLServer service is running in the correct user context, then the EFS driver in NTFS will decrypt the encrypted database files for the MSSQLServer process, and SQL Server will be able to mount the now-decrypted database files.
  • Any process running in any other user context will not be able to supply the correct RSA private key for EFS to decrypt the files. In our example, if the attacker could remotely run a script in the LocalSystem context that tried to copy the database files, NTFS would return an Access Denied error to that script process.
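Concretely, the encryption step has to happen in that service account’s context – e.g., something along these lines at a command prompt (the account name and paths here are hypothetical):

```
:: Encrypt the database files while running as the service account
:: (you'll be prompted for SQLServiceAcct's password)
runas /user:SERVER\SQLServiceAcct "cmd /c cipher /e /a D:\SQLData\*.mdf"

:: cipher /c lists the accounts that can decrypt a given file - here you
:: should see only SQLServiceAcct (plus any Data Recovery Agent)
cipher /c D:\SQLData\customers.mdf
```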

However, if that same remote attacker were really interested in getting access to that encrypted file, they could quite easily grant themselves access:

  • Anyone with LocalSystem access (or local Administrators membership as well) could grant themselves the SeDebugPrivilege, and then run any number of “hacker” tools that could dump the LSA Secrets from memory into cleartext form.
  • e.g. the family of lsadump*.exe tools attach to the LSASS.EXE process (via the Debug privilege) and dump out all the decrypted LSA Secrets.
  • Once you have the decrypted LSA Secrets, you can quickly find the SQLServiceAcct password, which then gives you the ability to logon as that user account.
  • Once you can authenticate as the SQLServiceAcct user account, you’ll have access to all the RSA decryption keys stored in that user’s profile. Then any attempts to read/copy files encrypted by that user will be automatically decrypted by EFS.
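The layering that makes this attack decisive can be modelled abstractly: EFS encrypts the file with a symmetric File Encryption Key (FEK), and stores the FEK wrapped under key material that ultimately derives from the account’s logon secret – so recovering that secret unravels the whole chain. A toy Python model (a hash-based stream cipher stands in for the real RSA/DPAPI machinery; this illustrates the key layering only, not EFS’s actual algorithms):

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Counter-mode keystream from SHA-256 (toy cipher, illustration only)."""
    out = b""
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return out[:n]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

# Layer 1: a random FEK encrypts the file contents.
fek = os.urandom(32)
plaintext = b"customer database record"
ciphertext = xor(plaintext, keystream(fek, len(plaintext)))

# Layer 2: the FEK is stored wrapped under key material derived from the
# account's credentials (in real EFS: the user's RSA key, protected by
# DPAPI, protected in turn by the logon secret).
account_secret = b"SQLServiceAcct password recovered from LSA Secrets"
master_key = hashlib.sha256(account_secret).digest()
wrapped_fek = xor(fek, keystream(master_key, len(fek)))

# An attacker holding the account secret unwraps the FEK, then decrypts.
recovered_fek = xor(wrapped_fek, keystream(master_key, len(wrapped_fek)))
recovered = xor(ciphertext, keystream(recovered_fek, len(ciphertext)))
assert recovered == plaintext
```

The weak link is the account secret itself – which, as the steps above show, the Service Control Manager must be able to recover without any user intervention.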

This is an unavoidable consequence of the scenario. Services must be able to start automatically (at least, on all Windows servers for which I’ve had to recommend security measures), which means that the Service Control Manager must be able to read the password from LSA Secrets without user intervention.

[This also usually means that SYSKEY boot passphrases or boot floppies won’t be used, since the use of an “off-system SYSKEY” means the server will never boot without an administrator intervening, which makes remote management a heckuva lot harder. Unless you have some of those fancy Remote Insight boards AND a sysadmin who doesn’t mind getting paged every time the server has to reboot.]

My conclusion: EFS-encrypting files for processes that start without user intervention provides very little protection against remote attackers who can gain LocalSystem or Administrators access to your server. This means *any* Service, whether on a server or a client (e.g. the ol’ ArcServ backup agent that runs on every Windows server and client, and [at least used to] “require” a Domain Admin account as the service account. That’s another hairy security implementation for another day’s rant, lemme tell you…).

[Note: Netscape web server had this same “problem” back in the days when I still administered Netscape-on-Windows. If you had an SSL certificate configured for the site, and you didn’t want to have to stand at the keyboard every time you wanted to start the web server, you’d have to store the private key’s decryption password in a plaintext file on the server. Kinda ruled out any *real* security that you could claim for that private key, but whatever – SSL was just there to encrypt the session key anyway, and very few SSL sessions lasted long enough for the fabled “sniff the SSL session on the wire” attacks anyway.]

SQL Database dump file example
“But wait Mike – what if the MSSQLServer service was always running? Doesn’t SQL have an exclusive lock on all its database files while the service is running?” Yes, absolutely. This brings to mind a couple of different thoughts:

  • how do you make sure the service is always running – prevent it being shut down, or ensure that the server reboots as soon as the service is no longer running?
  • if the files are already exclusively locked, doesn’t that mean the remote attacker won’t be able to read or copy the files off the filesystem? Why bother encrypting if the service *never* stops running?

Also note: the “exclusive lock” principle obviously won’t apply to scheduled database dump files – the files are written once, then unlocked by the scheduled dump process/thread. This should make you think twice/thrice about encrypting the database dump files on disk – the files will sit unlocked on the filesystem, waiting for that same LocalSystem/Admin attacker to log on in the dump user’s context and copy them at their leisure. [It would also mean that any remote process that reads or copies the dump files – e.g. an enterprise backup system running on a central server – would have to be able to decrypt the files remotely. That requires a “Trusted for Delegation” configuration for the server where the dump files are held, which is a security headache that warrants careful thought before implementing.]

My best advice for protecting the database dumps from remote attackers?

  • Don’t ever dump to the local filesystem of the server – stream your database backups over the network, either to a remote file share that wouldn’t be accessible to the remote attackers, or directly to a backup device that writes the files to backup media; OR,
  • Minimize the amount of time that the database dumps are stored on a locally-accessible filesystem. Have the files copied off-device as soon as possible, and if possible wipe the free space after you’ve deleted the files (if you’re concerned about the remote attackers undeleting the files).
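On that last point, “wiping” usually means overwriting the file’s blocks before deletion. A minimal sketch in Python (the function name is mine; note that on journaling filesystems, SSDs or compressed NTFS volumes an in-place overwrite is best-effort at most, so a dedicated wipe tool is still preferable):

```python
import os

def wipe_and_delete(path, passes=1):
    # Overwrite the file's contents in place, flush the writes to disk,
    # then delete it. On journaling filesystems or SSDs the old blocks
    # may still survive elsewhere, so treat this as best-effort only.
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)
```

In the same spirit: run the dump, copy it off-box, then wipe – and keep the window between dump and wipe as short as the schedule allows.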

News article: "The scramble to protect personal data"

The New York Times recently posted an article about a wide-ranging set of data security issues that I found interesting.

This is the kind of thing that’s recently been guiding my thinking: not just encryption because CA SB 1386 [for example] says you should, but holistic means of preventing loss of Confidentiality via Information Disclosure threats, e.g. not just the always-discussed-but-rare-in-the-real-world SSL MITM attacks, but lo-tech attacks like getting hired at the delivery company that picks up your backup tapes.

Data security as a superior approach to perimeter, network, host or application security?

I’ve always believed that the network is a poor substitute for protecting a host, its apps or (ultimately) the Data made available via the n/w, server and its apps. To me, the other layers of perimeter, network, host & applications are necessary means to the end, which is to access, manipulate and sometimes even add to that Data. Similarly, security enforced at the perimeter, network, host or application layers, while necessary, is not sufficient when what you’re really trying to accomplish is to secure the Data.

Disclosure: I started my IT career in the desktop arena, and I’ve grown into the server and enterprise data arena over time. It’s been a great journey, but I’m probably still biased towards the PC at heart.

Certainly your ability to protect each set of data scales much better the further out towards the perimeter you enforce the protection. However, in my experience, you sacrifice the ability to provide very specific protections for the data inside the perimeter, for example:

  • the firewall rule which blocks or allows 80/tcp access doesn’t provide an ability to authorize access for specific groups of users (only for IP addresses or subnets)
  • the host security that allows/denies a user to Logon to the host (e.g. “access this computer from the network”) doesn’t give you the ability to allow the user to only access one set of data on the host but not access all other sets of data on the host
  • the application protections (e.g. SQL per-user/group roles assignment, Samba per-folder ACLs, email per-server/mailstore ACLs) don’t allow you to protect individual pieces of data accessed through the application that “front-ends” the data, though often the application can also provide built-in “bulk data protection” mechanisms. However, if another application were used to access the same on-disk data, it is often possible for the application-level protections to be completely bypassed.

In my opinion, the defining characteristic of data-level protection is that it only allows access for the authorized user(s) to the authorized data (set(s)), no matter what application or OS is used to access the data.

In the extreme, this definition would only include data that has been encrypted such that only the key that is in the authorized user’s possession can unlock the data (e.g. PGP, S/MIME).
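To illustrate the “only the key holder can unlock it” property, here’s a toy authenticated-encryption sketch in Python (stdlib only; strictly an illustration of the concept, NOT production crypto – real deployments should use PGP, S/MIME or a vetted crypto library):

```python
import hashlib, hmac, os

def _keystream(key, nonce, length):
    # Derive a pseudorandom keystream by hashing key||nonce||counter.
    # TOY construction for illustration only.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key, plaintext):
    # XOR-encrypt under a fresh nonce, then append an HMAC tag.
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce + ct + tag

def unseal(key, blob):
    # Verify the tag first; the wrong key (or tampering) fails here.
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("wrong key or tampered data")
    return bytes(c ^ k for c, k in zip(ct, _keystream(key, nonce, len(ct))))
```

No matter which application or OS reads the sealed blob, the bytes are useless without the key – that’s the data-level property the perimeter, host and application layers can’t give you.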

Examine for a moment what happens when you rely only on the application-level security:

  • Windows file share-level permissions could be bypassed by users who can access the admin shares, a Remote desktop or an FTP session
  • SQL roles-based access can be bypassed by anyone who can mount the .mdf file as a new database instance, or who has one of the privileged server or database roles such as sysadmin or db_owner.

In each example, the application protections that were meant to protect the data itself have not been changed, but the data could still be accessed by the user (attacker) even though those protections were configured to block the user.
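You can demonstrate the bypass in miniature with SQLite standing in for SQL Server (the app_query function and its “admin-only” rule are hypothetical; the point is that the authorization check lives in the application, not in the file):

```python
import os
import sqlite3
import tempfile

# A hypothetical "application database" with one sensitive table.
path = os.path.join(tempfile.mkdtemp(), "app.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE salaries (name TEXT, amount INTEGER)")
con.execute("INSERT INTO salaries VALUES ('alice', 123456)")
con.commit()
con.close()

def app_query(user):
    # The application's own authorization check (hypothetical rule).
    if user != "admin":
        raise PermissionError("not authorized")
    with sqlite3.connect(path) as c:
        return c.execute("SELECT * FROM salaries").fetchall()

# The application blocks the unauthorized user...
try:
    app_query("guest")
except PermissionError:
    pass

# ...but direct filesystem access bypasses the application entirely:
raw = open(path, "rb").read()
assert b"alice" in raw  # the sensitive data is sitting right there in the file
```

The same applies to mounting a copied .mdf as a new database instance: the roles travel with the application, not with the bytes on disk.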

The principle of “defense in depth” implies that the more layers of defense, the better the asset is protected against a failure at any one layer. However, in my experience, a valued asset will “grow” new methods of access over its lifetime – e.g. a mainframe application may start out with access only from terminal emulators, then PC access is added via Communications Server & PComm, then later a Web Services interface is added to the mainframe for direct browser-based access.

The mainframe application may not have changed in the slightest throughout all this, but the “DiD” model that was suitable at first may not be appropriate to protecting from the threats that are now possible via the new interface.

Data-level protection may not require encryption to prevent access, but it’s the only significant protection I can think of, and it seems to be the protection to which most people turn. You can also add DRM (or ERM, RMS, etc.), but if it doesn’t also include encryption, to me it’s pretty much ineffective at actually protecting content you’ve given to someone else from being exploited beyond the rights you wished to assign.