In this post, I'll cover the Windows Vista Secure Startup feature, one of the big security enhancements in the Windows code-named Longhorn wave. In short, Secure Startup is a new security feature in Vista that addresses the need for better data protection, based on hardware support.
So, what is this Secure Startup stuff all about? Take another look at my short definition of Secure Startup above: it's a security feature, it addresses data protection, and it's based on hardware support.
Let's start with the last one, the hardware support. Currently, almost all security-related work is handled by the operating system and by applications running on top of the OS in order to protect data stored somewhere on a machine. In the end, security is all about enforcing security rules and policies to make sure data and services are protected against malicious usage in whatever form it may take (e.g. data disclosure). However, this approach of having software deal with everything security-related also makes the system vulnerable by its very nature. As the system itself has to store and retrieve keys for data protection (e.g. the global system key, abbreviated as syskey, in Windows), the information used to secure things is right there on the hard disk. Think of the very real risk of stolen laptops: it's a simple task to retrieve the data stored on such a machine using tools that can be found on the internet, no tricks involved. Check out Steve Riley's (blog) column on The Case of the Stolen Laptop as well.
Okay, there are solutions that partially address this issue. For example, you could store encryption keys on a separate medium, e.g. some removable storage device. However, it's key to realize that you only improve the security of the data secured by that very key. For example, it's wrong to think that storing the syskey on a floppy disk greatly enhances the security of all data stored on disk. Data that was unencrypted before will still be unencrypted; you're only securing the data that is secured by the syskey. And of course, physical security still plays a very important role.
A computer contains a bunch of critical security-related information nowadays. It's clear that this information needs to be protected against a broad variety of attacks, including information disclosure. This information includes the local Security Account Manager (SAM) database, Active Directory account information, IPSec keys, wireless network keys (WEP, WPA), computer keys (e.g. used for Kerberos), system recovery passwords (e.g. Active Directory restore mode), secrets managed by the LSA (Local Security Authority), SSL keys, EFS keys, etc. All this stuff protects something, but how is this stuff protected itself? The answer is that there are keys to encrypt the (private) keys which are stored on the system. These are called master keys; an example is the master key associated with a user to protect his/her EFS, SSL, etc. keys. In turn, these master keys are encrypted by a kind of "root key" called the syskey. I've blogged about this earlier at http://community.bartdesmet.net/blogs/bart/archive/2004/02/29/226.aspx as I'm a syskey addict :-).
By default, the syskey is stored on the computer itself and is randomly generated during Windows Setup. This information is then spread across the registry (called "scattering") in a pattern which is unique for your Windows installation ("obfuscation"). Using the syskey.exe tool included with Windows in the system32 folder, you can change the way the syskey is stored or derived. A first option is to store the key on a floppy disk. You simply can't boot the PC without the floppy, and you can't retrieve secured information without having the key (but remember that unprotected data physically stored on the hard disk is still not protected). Another mode enables you to enter a system boot password from which the system key is derived. Check out the syskey.exe tool, but do so with care.
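To make the password mode concrete, here's a small Python sketch of the password-to-key idea. This is purely illustrative: the actual derivation Windows uses for the syskey is different, and the function name, salt and parameters below are my own invention.

```python
import hashlib

def derive_boot_key(password: str, salt: bytes) -> bytes:
    # Illustrative only: turn a boot password into a fixed-size key.
    # Windows' real syskey derivation differs; PBKDF2 just shows the idea
    # that the key never needs to be stored on the hard disk.
    return hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                               salt, 100_000, dklen=16)

key = derive_boot_key("my boot password", b"machine-unique-salt")
assert len(key) == 16  # a 128-bit key, recomputed at every boot
```

The point of the sketch: in password mode, no secret sits on the disk at all; the key only exists in memory after the user types the password at boot.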
Syskey can help against information theft when combined with EFS. The use of EFS encrypts the data on disk, while syskey encrypts the keys used to encrypt that data. By moving the system key off the hard disk using syskey.exe, e.g. to a floppy or by using a password, security is enhanced in the "case of the stolen laptop (or whatever machine)". The syskey password mode is the safest of the three available modes, as there exist cracking tools to extract the syskey from the registry (no, I won't be posting links to these tools :p).
Another example is the use of EFS, the Encrypting File System, in Windows 2000, XP and 2003. With EFS (an NTFS-related feature), a user can encrypt his/her data stored on disk, completely transparently for further usage. However, this only secures users against each other on a multi-user system. It does not secure you against a stolen computer from which the EFS keys are recovered in order to decrypt a user's data. And there's also a threat coming from recovery agents, who therefore need to be trustworthy people.
Let's talk about EFS a little further. EFS stands for Encrypting File System and was introduced in the Windows 2000 operating system family as an add-on device driver for the NTFS driver (from XP on, it's merged into the NTFS driver itself). What it does is provide a transparent way to encrypt/decrypt files and folders at the file system level. (Read: EFS is all about data confidentiality through encryption, not about integrity and protection against tampering.) By transparent I mean that you don't need a password in order to access a file. EFS is available on Windows 2000, Windows XP Professional and Windows Server 2003. Now for the technical stuff.
As a side note, I want to mention that you should never encrypt single files; always encrypt entire folders. This is because EFS creates a temporary plaintext "shred" of the file (called Efs0.tmp) in order to encrypt it and then copies the result back to the folder where it belongs. This shred can remain on the disk, leaving a potential attack open. The only effective way to solve this is to encrypt entire folders or to wipe the disk (prefer the former if you can).
First, a little word on how to encrypt a file or folder. In Windows Explorer, right-click a file or folder and go to Properties. On the General tab, click Advanced and check "Encrypt contents to secure data". That's it. Now, when another user tries to open the file (assuming he/she has read permission on the file), he/she will get an "access denied" error. Another way to encrypt/decrypt a file/folder is by using the cipher.exe tool with the /E and /D flags. More information can be found in the Windows Help and Support documentation. Geeks can also use advapi32.dll's EncryptFile function, which calls into feclient.dll, the library that handles the file encryption work. The actual encryption/decryption is done by part of the LSA system, which runs as SYSTEM, so impersonation and user profile loading are performed.
Now, how is this encryption done behind the scenes? It should be clear we need a key in order to encrypt the file. Next, we need an algorithm to encrypt and decrypt the data efficiently, which means we should use symmetric encryption. The former, the key, is the so-called File Encryption Key (FEK), a key that's randomly generated on a file-by-file basis (when you encrypt a folder, you're in fact flagging the folder so that every single file inside the directory gets encrypted). The latter, the encryption algorithm, is DESX on Windows 2000. On Windows XP SP1 and later and on Windows Server 2003, there's also support for 3DES and AES. By default, Windows 2000 and Windows XP (pre-SP1) use DESX, whereas Windows XP SP1 and later as well as Windows Server 2003 use AES.
Note: You can force 3DES to be used as the encryption standard for all cryptographic services on the system by altering the local system policy (Security, System Cryptography, "Use FIPS Compliant Algorithms For Encryption, Hashing and Signing"). If you prefer to use the registry, go to HKLM\SYSTEM\CurrentControlSet\Control\LSA and set FipsAlgorithmPolicy to 1. To control the encryption algorithm for EFS only, go to HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\EFS and change AlgorithmID to 0x6603, which enables 3DES. Note that on Windows 2000, this won't work if the High Encryption Pack (separate floppy disk) isn't installed. Changing the encryption mechanism this way doesn't change the encryption for existing files, it only affects newly encrypted files.
To continue, what happens with the FEK? Well, it's stored together with the file. In order to secure the encryption key so that only the owning user can decrypt the file, the FEK itself is encrypted using public/private key encryption (RSA) with the user's public EFS key. If multiple users need to be able to access the file (which is the case when using recovery agents, for instance; by default the administrator is a recovery agent), there will be one encrypted FEK for each user (encrypted with that user's public key). All these encrypted FEK instances are stored in the so-called Data Decryption Field of the file.
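A toy sketch of the Data Decryption Field idea, assuming nothing about the real on-disk format: one random FEK per file, wrapped once per authorized user. Real EFS wraps the FEK with RSA; to keep the snippet self-contained I use a reversible XOR stand-in, and the user names and key material are invented.

```python
import hashlib, os

def toy_wrap(fek: bytes, user_key_material: bytes) -> bytes:
    # Stand-in for RSA encryption with a user's public EFS key:
    # XOR the FEK against a keystream derived from the key material.
    stream = hashlib.sha256(user_key_material).digest()
    return bytes(a ^ b for a, b in zip(fek, stream))

# XOR is its own inverse; with real RSA, unwrapping would of course
# require the user's *private* key rather than the same material.
toy_unwrap = toy_wrap

fek = os.urandom(16)  # random per-file File Encryption Key
ddf = {user: toy_wrap(fek, key_material)
       for user, key_material in {
           "alice": b"alice-public-efs-key",
           "Administrator": b"recovery-agent-public-key",  # default recovery agent
       }.items()}

# Each authorized user recovers the same FEK from his/her own entry:
assert toy_unwrap(ddf["alice"], b"alice-public-efs-key") == fek
```

Note the design: the bulk data is encrypted only once (symmetric, fast), and only the small FEK is wrapped per user (asymmetric, slow).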
This mechanism yields a whole stack of encryptions. At the very bottom we have the system startup key (syskey, see above). On top of that, there's a master key for the user. Next, we have the EFS key for the user. And last but not least, there is a separate key for each encrypted file. You might wonder where the private keys for the user live and how they are protected. The private keys are stored in %appdata%\Microsoft\Crypto\RSA\<usersid> and are encrypted with the user's master key, which lives in %appdata%\Microsoft\Protect\<userid> and is itself encrypted based on the user's password. If you change your password, the master keys are re-encrypted automatically. However, if your password is reset, this does not happen (it simply can't, because there is no way to use the old password for decryption first) and so you lose your information if you don't have a recovery agent in place.
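The key stack described above can be sketched as a chain of wrapped keys. Again a toy model (XOR wrapping, invented names and parameters), but it shows why a password change is cheap while a password reset is fatal: only the topmost wrap depends on the password.

```python
import hashlib, os

def wrap(key: bytes, kek: bytes) -> bytes:
    # Toy "encrypt key under key-encryption-key"; XOR is self-inverse.
    return bytes(a ^ b for a, b in zip(key, hashlib.sha256(kek).digest()))

unwrap = wrap

password_key = hashlib.pbkdf2_hmac("sha256", b"user password", b"salt",
                                   10_000, dklen=16)
master_key  = os.urandom(16)   # the user's master key
efs_private = os.urandom(16)   # stand-in for the user's private EFS key
fek         = os.urandom(16)   # per-file key

stored = {  # each layer protects the one below it
    "master_key": wrap(master_key, password_key),
    "efs_private": wrap(efs_private, master_key),
    "fek": wrap(fek, efs_private),
}

# Password *change*: the old password is still known, so the system can
# unwrap the master key and re-wrap it under the new password's key.
new_password_key = hashlib.pbkdf2_hmac("sha256", b"new password", b"salt",
                                       10_000, dklen=16)
stored["master_key"] = wrap(unwrap(stored["master_key"], password_key),
                            new_password_key)
assert unwrap(stored["master_key"], new_password_key) == master_key

# Password *reset*: the old password_key is gone, so nothing below the
# master key can ever be unwrapped again (hence the recovery agents).
```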
Some general EFS-related guidelines include:
- Use EFS in a domain configuration. On a standalone PC, you can break EFS using various tools because all the secrets involved in encryption/decryption are kept locally (unless you're using syskey in mode 2 or 3, i.e. floppy or password mode). Another possible attack is to clear the local Administrator password by removing the local SAM database in %windir%\system32\config\sam. As the local Administrator is a recovery agent by default, logging in as local Administrator then makes it possible to recover the encrypted file. Syskey modes 2 and 3 solve this problem because you need the EFS keys, which are stored by the LSA in a so-called "secrets cache" that is physically protected by the syskey. In a domain, the domain admin is a recovery agent by default, which establishes a "physical gap" between the EFS-encrypted files on one system and the recovery agent material on another one.
- Be aware of the "interactive logon cache", a Local Security setting that caches the last n logons on the machine, which is particularly interesting for mobile users who need to log on to their computers while on the road, without access to a domain controller. Using this cache, an attacker might be able to authenticate and decrypt a user's files.
- Avoid encrypting single files, because of the plaintext shred that gets created in efs0.tmp. Although this plaintext backup file is deleted after the EFS encryption has taken place, the data will still live on the hard disk and can be recovered by an attacker (e.g. by using the Support Tool dskprobe.exe). As I mentioned above, cipher.exe /w can be used to wipe it (it was actually implemented by Microsoft in reaction to a Bugtraq post about this issue).
Finally, my personal advice is to use EFS in a domain environment and to consider using syskey mode 2 or 3 to move the system key off the hard disk.
Also note the existence of so-called ATA passwords, which have nothing to do with the OS but can be used to secure all data stored on a hard disk. I won't cover these in further detail here.
So, Secure Startup aims to offload certain aspects of data protection to the hardware, but in an OS-controlled fashion. Very concretely, the Secure Startup feature uses a Trusted Platform Module (TPM 1.2) to protect user data and to protect against tampering while the system is offline. Just like EFS, Secure Startup is transparent to users. Basically, what it does is encrypt the entire Windows volume.
Trusted Platform Module
A Trusted Platform Module is a microcontroller that stores keys, passwords and digital certificates, and typically sits on the motherboard (of the upcoming generation of computers). TPM was invented by the Trusted Computing Group, which is promoted by several computer companies, including Microsoft (see member list). TCG is a non-profit organization that aims to enhance the security of computing in general and was formed in early 2003, based on the work done by the Trusted Computing Platform Alliance (TCPA). TCG has an impact on both hardware and software by delivering standards proposals such as TPM. Detailed specification documents can be found at https://www.trustedcomputinggroup.org/downloads/specifications/.
Back to TPM in particular now. In order to understand the role of TPM, you'll need to have a basic understanding of what TCG calls the "Trusted Platform". What follows is based on the "TCG Specification Architecture Overview" specification. In order for a platform to be called trusted, it should provide three basic features:
- Protected capabilities are a collection of commands that can interact with sensitive data that is "shielded" against malicious access. Such places are called "shielded locations" and include registers and places in memory where manipulations of sensitive data are guaranteed to be protected. The protected capabilities have exclusive access to these shielded locations. Examples include management of cryptographic keys and random number generation.
- Attestation is all about the accuracy of information, which is an important factor in the trustworthiness of a platform. First, there is attestation by the TPM, which can be used to provide proof of data known to the TPM itself: using a so-called Attestation Identity Key (AIK), internal TPM data is digitally signed. Secondly, there is attestation to the platform, which provides proof of a platform's trustworthiness to report integrity measurements. Hand in hand with attestation to the platform, we have attestation of the platform, used to provide proof of a platform's integrity measurements, again using an AIK, which signs PCRs (Platform Configuration Registers, part of the protected capabilities). Last but not least, there is the need for authentication of the platform's identity.
- Integrity measurement, storage and reporting is the collection of integrity-related services to enhance and measure the trustworthiness of a platform. The integrity measurement part helps to obtain metrics that have an impact on the overall trustworthiness of the platform, based on trusted states. Integrity storage is used to log integrity metrics in a safe manner, by digesting the contents and storing them in PCRs. Finally, integrity reporting is a form of attestation for the data kept in integrity storage. The overall idea of this set of "services" is to have trustworthy evidence of the state a platform was/is in, to assist processes in evaluating these integrity states and taking appropriate action.
In order to provide this functionality, the TCG architecture requires components that need to be fully trusted, since otherwise misbehavior can't be detected. These components are critical to the trustworthiness of the system and are called roots of trust. The collection of roots of trust in a system has to provide the functionality needed to describe the platform characteristics that affect its trustworthiness. This includes:
- Root of Trust for Measurement (RTM) to make reliable measurements of system integrity.
- Root of Trust for Storage (RTS) for the secure maintenance of integrity digests.
- Root of Trust for Reporting (RTR) for reporting of the RTS' contents.
Now, how is integrity measured in order to be stored and reported? To support integrity measurement, there is a so-called measurement kernel component that generates measurement events, each consisting of two parts: a measured value and a hash digest of that value (e.g. SHA-1). Basically, these measurements and the corresponding digests are snapshots of the operational state of the machine. The digest is stored by the RTS, whereas the measured value can be consumed anywhere. In order to verify the measured data, digest values are recreated and compared to the stored data. Sequences of related measurement event data are kept in a Stored Measurement Log, in which a common measurement digest is used as the starting point. Newly acquired measurement values are appended to the common measurement digest and rehashed in order to preserve ordering. This process is called extending the digest.
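The extend operation itself is easy to sketch. The hashing scheme below follows the TPM 1.2 style (a PCR holds a SHA-1 value, extended by hashing the old value together with the new measurement's digest); the event names are made up.

```python
import hashlib

def extend(pcr: bytes, measurement: bytes) -> bytes:
    # TPM-style extend: new PCR = SHA-1(old PCR || SHA-1(measurement))
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

pcr = bytes(20)  # PCRs start out as all zeroes
for event in [b"firmware", b"option ROM", b"boot loader"]:
    pcr = extend(pcr, event)

# The final digest depends on the events *and* their order, which is
# exactly how this construction preserves the ordering of the log:
reordered = bytes(20)
for event in [b"boot loader", b"option ROM", b"firmware"]:
    reordered = extend(reordered, event)
assert pcr != reordered
```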
Furthermore, the TPM can act as an endpoint of communication which means that it can be used to provide several security related services for secure exchange of data between systems, relying on a trustworthy identification of the systems involved in the communication. By providing key management support and configuration management, the TPM can help to improve security for communication between systems. In order to support this scenario, TCG defines four classes of protected message exchange:
- Binding is the process of encrypting the to-be-transferred data with a public key of the intended recipient, who can only recover the message using the corresponding (non-shared) private key.
- Signing generates a signature for the message that can be used to validate the integrity of the message; this is typically done by hashing the message and encrypting the hash using a (signing-only) key.
- Sealing is an extension of binding and also encrypts messages. Furthermore, it binds the message to a series of platform metrics that need to be fulfilled in order for the message to be decrypted. This binds the symmetric key used to encrypt the message (for speed) to a set of PCR values and the asymmetric key. Sealing clearly improves the trustworthiness of a platform by requiring the platform to be in a certain state before decryption can be done.
- Sealed-signing links the signing operation with the PCR registers of the machine creating the signature. This allows a verifier to check the PCR contents included with the signed message in order to get a clear picture of the signing platform's configuration at the time of signing.
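Here's a toy model of the sealing idea, invented purely for illustration (a real TPM does this internally and derives its protection from the SRK, which never leaves the chip): a secret is bound to an expected PCR value and refuses to come back out if the platform state has changed.

```python
import hashlib, os

def seal(secret: bytes, pcr_value: bytes) -> dict:
    # Toy seal: derive a wrapping key from a (stand-in) SRK plus the
    # PCR value that must hold again at unseal time.
    kek = hashlib.sha256(b"toy-srk" + pcr_value).digest()[:len(secret)]
    return {"blob": bytes(a ^ b for a, b in zip(secret, kek)),
            "pcr_digest": hashlib.sha256(pcr_value).digest()}

def unseal(sealed: dict, current_pcr: bytes) -> bytes:
    if hashlib.sha256(current_pcr).digest() != sealed["pcr_digest"]:
        raise PermissionError("platform state differs from sealing time")
    kek = hashlib.sha256(b"toy-srk" + current_pcr).digest()[:len(sealed["blob"])]
    return bytes(a ^ b for a, b in zip(sealed["blob"], kek))

volume_key = os.urandom(16)
good_pcr = b"\x11" * 20
sealed = seal(volume_key, good_pcr)
assert unseal(sealed, good_pcr) == volume_key  # same state: key released
```

With a different PCR value, unseal refuses to release the key; that refusal is exactly what Secure Startup relies on further down.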
The protected storage component (RTS) holds keys and data entrusted to the TPM. Embedded in the TPM there are two (semi-)fixed keys: the Endorsement Key (EK, used to establish a platform owner) and the Storage Root Key (SRK, associated with the platform owner; can be replaced). Keys stored in the TPM fall into two categories: migratable keys (which can be transferred to another TPM) and non-migratable keys (which cannot leave the system). An AIK (see the attestation discussion above) is a prime example of a non-migratable key. TCG defines seven key types, each dedicated to certain functionality:
- Signing keys - asymmetric keys, migratable or non-migratable, sign application data and/or messages
- Storage keys - asymmetric keys, encrypt data or other keys
- Identity keys (AIK) - asymmetric keys, non-migratable, used to sign data originating from the TPM (e.g. PCR register values)
- Endorsement keys (EK) - decryption keys, non-migratable, used to decrypt owner authorization data when the owner of a platform is established, and also to decrypt AIK-creation related messages
- Bind keys - encrypt/decrypt small amounts of information to be transferred across platforms, e.g. symmetric keys
- Legacy keys - keys created outside the TPM (e.g. by the OS or an application), imported into the TPM, and also migratable
- Authentication keys - symmetric keys that protect transport sessions in which the TPM is involved
The RTS interfaces with storage devices through a so-called Key Cache Manager, which I won't cover here. You can find more information in the "TCG Specification Architecture Overview" document mentioned above.
This brings us to the TPM components, which I'll cover a little further. Keep in mind we're talking about a hardware component here; the software aspect will be covered further on. A first important component is the I/O component, which acts as the interface of the TPM to the outside world. It's connected to the TPM's internal communication bus and routes messages to the right destination. Next, there are a couple of engines for cryptographic functionality, such as the random number generator, an engine for SHA-1 hashing, a key generation engine and an RSA engine. Further on, there is a piece of non-volatile storage that holds the EK, SRK and other owner-related data. In terms of memory, there are also the PCRs, which can be either volatile or non-volatile. Another piece of storage contains the AIKs, but TCG recommends storing these keys outside the TPM (nevertheless, there is reserved room inside the TPM component itself). Further on, there is program code living inside the TPM (which acts as the Core Root of Trust for Measurement or CRTM). The execution engine runs the program code. Finally, a so-called Opt-In component is used to enable/disable/deactivate various operations the TPM is capable of providing.
For the operational states of the TPM component, I refer to the "TCG Specification Architecture Overview" document again.
On to the software aspect of TPM. At the very bottom we have, of course, the TPM component itself, which is managed by the TPM Device Driver in the OS' kernel mode. On top of that we have user mode, consisting of system processes and user processes. In the system processes space, you'll find the TCG Device Driver Library (TDDL), which provides an interface (TDDLI) to players higher in the stack. The TSS Core Services (abbreviated as TCS; TSS stands for TCG Software Stack) talk to the TDDLI interface and provide a TCSI interface to the user processes. In the user processes space, there's a service model called the TCG Service Provider or TSP, which has an associated interface called TSPI. Finally, we end up with the application interacting with this TSPI interface. Detailed information about these interfaces can be found in the "TCG Specification Architecture Overview". The TCG Software Stack Specification (TSS) is available over here as well as the header file (C).
The Windows Longhorn Secure Startup feature acts on three different fields:
- Data protection for offline systems - a non-connected, offline system is safe against data theft because of encryption of user and system data (read: the entire Windows volume), including the hibernation file, the page file, system files, temporary files, user data, etc. All applications on the machine also benefit from this additional (implicitly added) security for offline scenarios.
- Ensuring boot integrity - includes tamper detection for monitored files. The system won't boot if tampering (while the system was offline) is detected.
- Hardware recycling - as the TPM contains encryption keys and other encryption material, simply erasing its contents makes the volume useless and non-recoverable. This eliminates the need for disk sweeping to physically delete critical content.
The overall idea is to encrypt the entire Windows volume, therefore including the syskey, without keeping the top-most encryption key on the hard disk, but moving it to a hardware component (the TPM). When the system is booting, the keys needed to read the data from the hard disk to boot Windows are only released when no tampering is detected. This is done by taking unique measurements of multiple factors during system boot, which results in a digest. If someone has tampered with the boot system, the digest won't be the same and the tampering is detected. The use of the TPM is completely transparent to the operating system during boot and further operation. Notice that the encrypted data on a protected partition is bound to a particular installation of Windows.
Let's go into somewhat more detail and look at the boot process with TPM enabled. First of all, there's a firmware security bootstrap mechanism that kicks in. This procedure is encoded in the TPM hardware and is ultimately responsible for ensuring system integrity, through the Core Root of Trust for Measurement (CRTM). When the system is considered secure and trustworthy, a series of measurements is taken to create a fingerprint of the system. At boot time, the same measurements are made in order to check the system's integrity. This mechanism is called the Static Root of Trust for Measurement (SRTM) and makes sure the encryption key for the system volume is only unsealed when the integrity verification process succeeds. The measurements I've referred to are stored in the PCRs of the TPM component and include the following: the firmware, option ROMs, a step-by-step measurement of the boot process up to the BOOTMGR level, and a measurement of the boot-time events that occur (OS boot time). The volume encryption itself is based on block encryption.
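Putting the pieces together, the SRTM check boils down to: replay the measurements over the boot chain, and only release the volume key if the resulting digest matches the reference recorded when Secure Startup was enabled. A sketch (component names and the helper functions are invented; real measurements cover the firmware, option ROMs, the MBR code and so on, up to BOOTMGR):

```python
import hashlib

def extend(pcr: bytes, component: bytes) -> bytes:
    # TPM 1.2-style extend over a 20-byte SHA-1 PCR value.
    return hashlib.sha1(pcr + hashlib.sha1(component).digest()).digest()

def measure_boot_chain(components) -> bytes:
    pcr = bytes(20)
    for c in components:
        pcr = extend(pcr, c)
    return pcr

trusted_chain = [b"firmware", b"option ROMs", b"MBR", b"BOOTMGR"]
reference = measure_boot_chain(trusted_chain)  # recorded at enable time

def release_volume_key(components, reference) -> bool:
    # Only an unmodified boot chain reproduces the reference digest;
    # otherwise the user must supply the recovery key or password.
    return measure_boot_chain(components) == reference

assert release_volume_key(trusted_chain, reference)
assert not release_volume_key(
    [b"firmware", b"option ROMs", b"tampered MBR", b"BOOTMGR"], reference)
```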
In order to run Secure Startup in Vista, the system needs to match the following requirements:
- TPM 1.2 should be available on the hardware and should be enabled (see operational modes in "TCG Specification Architecture Overview"). Ownership should be taken as well (see further).
- The Master Boot Record (MBR) of the hard disk should be replaced by a version shipping with Vista that can talk to the TPM hardware (to collect measurement data, etc.).
- Vista must be installed on an NTFS volume.
Notice that Vista also supports the use of EFI (extra link) with TPM. The classic non-EFI scenario is to have an MBR-based active NTFS partition that contains Vista. EFI is a more recent firmware specification replacing classic boot loaders, created especially with trusted computing scenarios in mind. The idea is to offload the system hardware preparation work to the firmware at boot time, freeing the OS from these tasks. TPM boot procedures can be supported by such an EFI BIOS.
Vista surfaces TPM support through the Windows Security Center, where various management tasks are available. First of all, there's of course support to enable Secure Startup on the machine; this is where the encryption is started after a reboot. Next, there's support to disable Secure Startup, with the option to retain encryption (which moves the key to a floppy, or to the hard disk with support for a password; compare to the syskey modes) or to remove encryption (decryption kicks in). To support recovery, an administrator can choose to store a recovery key on removable media or to supply a password for recovery. Recovery is needed when the hardware changes (an old hard disk in a new computer with a different TPM), when the system is tampered with, upon data corruption of the MBR or the system, for debugging, or for offline system updates. In all of these cases, the measured data does not match the originally gathered data and the boot process is blocked until a recovery password or medium is supplied by the end user. These recovery passwords and keys can be stored in Active Directory, enabling enterprise-level scenarios.
With this, I conclude this brief overview of the Secure Startup technology based on TPM in Windows Vista. I want to stress that Secure Startup is not the only feature of Vista that leverages the power of the TPM. Other places include WMI support for TPM administration tasks, Group Policy support for TPM, a Key Storage Provider (KSP) that works with the TPM, and the exposure of the TSS interfaces to third-party applications. More information can be found via the links mentioned throughout this post.