Security Risks in Key Recovery

Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park, CA 94025-3493
Telephone: 1-415-859-2375; 1-650-859-2375 beginning August 1997
Internet: Neumann@CSL.SRI.com; Website: http://www.csl.sri.com/neumann.html

Written testimony for the Senate Judiciary Committee hearing originally scheduled for 25 June 1997, but postponed until 9 July 1997. Oral testimony will also be available in the final printed Senate hearing proceedings.

I am very grateful for the opportunity to address you today on a matter that is one of our nation's most pressing sociotechnological issues. I speak to you as an individual, although I refer to several collaborative efforts in which I have been involved.

This written testimony begins with the executive summary of a recent report on risks related to key recovery (Reference 1), of which I was a coauthor. It then discusses some of the potential risks related to key recovery and draws some conclusions. At the end, it provides various relevant references and some of my background. In particular, I draw your attention to the National Research Council crypto study (Reference 2) and the earlier ACM crypto report (Reference 3); I was a coauthor of both of those reports as well. Also of special relevance is my testimony for the Senate Permanent Subcommittee on Investigations from June 1996 (Reference 6).

For the record of this session, I have appended a copy of the cited report on risks related to key recovery (Reference 1). (On-line availability is noted in the reference.)

In addition, to provide further background for these hearings, I also have appended the most recent version of my summary of illustrative risks to the public (Reference 8), which contains numerous examples of security risks in our computer-communication environments, from which we can infer some of the many potential risks facing any would-be key-recovery infrastructures. (On-line availability is noted in the reference.)

Introduction

There are significant potential risks, costs, and implications that must be carefully considered prior to deployment of any key-management and key-recovery schemes. This testimony considers primarily the technological risks, and urges that legislation not be carried out hastily in the absence of detailed investigations of the long-term potential social and economic effects of those risks and the associated costs.

A self-constituted group of eleven cryptographers and computer scientists, Hal Abelson (MIT/HP), Ross Anderson (Cambridge University), Steven M. Bellovin (AT&T Research), Josh Benaloh (Microsoft Research), Matt Blaze (AT&T Research), Whitfield Diffie (Sun Microsystems), John Gilmore, Peter G. Neumann (SRI International), Ronald L. Rivest (MIT), Jeffrey I. Schiller (MIT), and Bruce Schneier (Counterpane Systems), has issued a report (Reference 1) on the technical implications, risks, and costs of `key recovery', `key escrow', and `trusted third-party' encryption systems. The report evolved via e-mail exchanges, achieving iterative consensus over a four-month period following a single meeting in January 1997.

As a coauthor and someone who has studied computer-related risks for many years, I believe that the report deserves your closest study and further discussion. The next five paragraphs represent the executive summary taken from the appended full report.

A variety of `key recovery,' `key escrow,' and `trusted third-party' encryption requirements have been suggested in recent years by government agencies seeking to conduct covert surveillance within the changing environments brought about by new technologies. This report examines the fundamental properties of these requirements and attempts to outline the technical risks, costs, and implications of widely deploying systems that provide government access to encryption keys.

The deployment of key-recovery-based encryption infrastructures to meet law enforcement's stated specifications will result in substantial sacrifices in security and greatly increased costs to the end-user. Building the secure computer-communication infrastructures necessary to provide adequate technological underpinnings demanded by these requirements would be enormously complex and is far beyond the experience and current competency of the field. Even if such infrastructures could be built, the risks and costs of such an operating environment may ultimately prove unacceptable. In addition, these infrastructures would generally require extraordinary levels of human trustworthiness.

These difficulties are a function of the basic government access requirements proposed for key-recovery encryption systems. They exist regardless of the design of the recovery systems -- whether the systems use private-key cryptography or public-key cryptography; whether the databases are split with secret-sharing techniques or maintained in a single hardened secure facility; whether the recovery services provide private keys, session keys, or merely decrypt specific data as needed; and whether there is a single centralized infrastructure, many decentralized infrastructures, or a collection of different approaches.

All key-recovery systems require the existence of a highly sensitive and highly available secret key or collection of keys that must be maintained in a secure manner over an extended time period. These systems must make decryption information quickly accessible to law-enforcement agencies without notice to the key owners. These basic requirements make the problem of general key recovery difficult and expensive -- and potentially too insecure and too costly for many applications and many users.

Attempts to force the widespread adoption of key-recovery encryption through export controls, import or domestic use regulations, or international standards should be considered in light of these factors. The public must carefully consider the costs and benefits of embracing government-access key recovery before imposing the new security risks and spending the huge investment required (potentially many billions of dollars, in direct and indirect costs) to deploy a global key recovery infrastructure.
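
To make concrete the summary's point about design variations, the following sketch (my own illustration, not part of the report) shows perhaps the simplest secret-sharing variant: a recovery key split by exclusive-or among n holders, all of whom must cooperate to reconstruct it. The function names and parameters are hypothetical. Note that however the split is arranged, the moment of reconstruction recreates exactly the concentrated secret whose protection is at issue; the sharing changes who must collude, not whether the fundamental risks exist.

    # Illustrative sketch only (mine, not the report's): an n-of-n split
    # of a data-recovery key by exclusive-or. All n shares are needed to
    # recover the key; any n-1 shares reveal nothing about it.

    import secrets

    def split_key(key, n):
        """Split 'key' into n shares, all of which are needed to recover it."""
        shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
        last = bytes(key)
        for s in shares:
            last = bytes(a ^ b for a, b in zip(last, s))
        shares.append(last)
        return shares

    def recover_key(shares):
        """XOR every share together to reconstruct the original key."""
        key = bytes(len(shares[0]))
        for s in shares:
            key = bytes(a ^ b for a, b in zip(key, s))
        return key

    if __name__ == "__main__":
        k = secrets.token_bytes(16)      # a hypothetical 128-bit session key
        parts = split_key(k, 3)          # e.g., three escrow agents
        assert recover_key(parts) == k   # recovery requires every share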

Risks

This is an extremely complex subject that requires discussion of technical issues as well as policy matters. I hope my presentation is understandable; if not, please ask for clarifications or further details.

On one hand, cryptography is not a panacea for attaining security and privacy; it is just one technique among many. The cryptographic and system-security communities themselves must work harder to overcome some of the deficiencies in existing computer-communication environments (Reference 6) -- ideally with greater encouragement from the U.S. law-enforcement community (which, as you know, focuses primarily on prosecution, to the detriment of preventing computer misuse and related crime). This is a difficult problem, because essentially all systems have some potentially serious security risks.

On the other hand, the trapdoor access implicit in key recovery is not a panacea for law enforcement or for fighting terrorism; at best, it provides peepholes into certain kinds of information. It would also create substantial administrative problems -- for law enforcement and for everyone else.

The real costs that must underlie any extensive key-retrieval mechanisms and recovery infrastructures are a serious source of concern. To date, those costs have not been adequately considered by proponents of key-recovery, key-escrow, and key-management mechanisms and their supporting computer-communication environments. The costs in the large necessarily involve the entire key-recovery infrastructure itself, including its operational procedures, management, oversight, enforcement costs, legal liabilities, and the costs of litigation against misusers. There are also hidden costs, namely those of ratcheting up the security of commercially available computer-communication systems and networks generally. Some of those cost issues are discussed at length in the key-recovery report.

We focus here more on the security and social risks, which to date have also not been adequately considered by the proponents of key-recovery and key-escrow infrastructures. Among the numerous potential risks are the following:

There are appalling weaknesses in the security of today's computer systems and networks -- including operating systems, network software, Web browsers and servers, programming languages, cryptographic implementations, application software systems, and so on. Weak links exist at many points. Even strong cryptographic implementations can often be broken or completely circumvented by devious means.

Some of these security weaknesses could add considerably to the risks in key-recovery infrastructures. User and system authentication techniques in most commercially available systems represent enormous risks; typically, fixed reusable passwords are used, and transmitted unencrypted across unprotected communication media. In addition, systems are susceptible to penetration by other means. As a result, masquerading is often relatively simple to achieve. System accountability is often very poor, which makes it difficult to detect when a major misuse has occurred (particularly if it were to involve a critical component of the key-recovery infrastructure). In turn, the absence of meaningful authentication makes it even more difficult to identify the culprits -- assuming their misuse can even be detected.
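
As a small illustration of the authentication problem (a hypothetical sketch, not drawn from any particular system), compare a fixed reusable password sent in the clear -- which an eavesdropper can record and replay at will -- with a simple challenge-response exchange, in which the shared secret never crosses the wire and a recorded transcript is useless against any fresh challenge:

    # Illustrative sketch only; the names and the shared secret here are
    # hypothetical assumptions, not any particular deployed protocol.

    import hmac, hashlib, secrets

    SHARED_SECRET = b"per-user secret provisioned out of band"  # assumed

    def cleartext_login(password_on_wire):
        # Weak pattern: the reusable password itself crosses the wire,
        # so anyone who sniffs it once can replay it indefinitely.
        return password_on_wire == SHARED_SECRET

    def make_challenge():
        # Stronger pattern: the server issues a fresh random challenge
        # for each login attempt.
        return secrets.token_bytes(16)

    def client_response(challenge):
        # The client proves knowledge of the secret without revealing it.
        return hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()

    def server_verify(challenge, response):
        expected = hmac.new(SHARED_SECRET, challenge, hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)

    if __name__ == "__main__":
        c = make_challenge()
        assert server_verify(c, client_response(c))        # fresh exchange works
        assert not server_verify(make_challenge(),
                                 client_response(c))       # replayed response fails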

Conclusions

It is absolutely fundamental that security be addressed as a systemic problem. Risks can certainly arise in the cryptographic algorithms and key lengths, as in the recent cracking of the RSA-challenge DES key (demonstrating that DES and 56-bit keys may be outliving their utility). However, even greater risks typically arise in how the cryptography is encapsulated in operating systems, networking software, and applications, and in other weaknesses of those components themselves.

Key escrow and key recovery are sometimes (I believe, mistakenly) touted as inherently increasing security. They actually have the potential to decrease security seriously overall. Even if such schemes are very carefully conceived, implemented, and analyzed for security vulnerabilities, they will remain vulnerable to misuse, particularly by insiders; the third-party agents are themselves enormous potential sources of risk. We must examine much more seriously all of the relevant security risks that can arise with key-escrow and key-recovery schemes. Those who seek to increase system and network security and those who believe the inherent risks of key recovery are controllable face the same underlying problem: because the infrastructure is weak, vulnerabilities are inevitable -- and vulnerabilities that exist will be exploited.
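
The arithmetic behind the DES remark above is simple and worth making explicit. The following back-of-the-envelope sketch uses assumed round-number search rates (purely illustrative, not measurements of the actual RSA-challenge effort) to show how an exhaustive search of the 56-bit keyspace collapses from millennia to hours as an attacker harnesses more parallelism:

    # Back-of-the-envelope arithmetic only; the rates below are assumed.
    # On average, an exhaustive search examines half the keyspace.

    KEYSPACE = 2 ** 56                    # 72,057,594,037,927,936 DES keys

    for rate in (1e6, 1e9, 1e12):         # keys tested per second (assumed)
        expected_seconds = (KEYSPACE / 2) / rate
        years = expected_seconds / (365.25 * 24 * 3600)
        print(f"{rate:8.0e} keys/s -> about {years:,.2f} years on average")

    # At a million keys per second the expected search takes over a
    # thousand years; at a billion, just over a year; at a trillion
    # (massive parallelism), about ten hours.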

Whereas legislation and strict administrative supervision of employees could help to reduce some of the risks, the fundamental weaknesses in today's computer-communication infrastructure are not likely to be overcome in the near future. Although there has been some improvement in recent years, many of the conclusions of the 1991 Computers at Risk study (Reference 4) are still valid today; furthermore, new security vulnerabilities are introduced with each new system. As a result, supposedly secure systems are penetrable (e.g., Reference 10). The risks are still ubiquitous, and are likely to remain so (Reference 5). Anyone who tells you they can develop an infrastructure that avoids or contains the risks -- including but not limited to those that I have outlined here -- is simply not familiar with the realities of computer-communication system security and the foibles of real human beings.

Consequently, the entire concept of key recovery is riddled with potential risks. Because the underlying computer-communication infrastructure is so weak with respect to security, it would be extremely difficult to provide serious assurances that a key-recovery infrastructure built on top of it would not be weaker still. However, some recommendations for improving matters are included in my earlier Senate testimony from June 1996 (Reference 6) and in Computer-Related Risks (Reference 5), both of which I urge you to read.

Any meaningful assessment of the risks relating to key recovery must consider the costs and risks to the law-enforcement community and to society associated with an inability to detect and prosecute crime. The notion of preventing computer-related crime (opportunities for which are likely to increase dramatically, even in the near future) should not be antithetical to prosecuting it. Arguments about costs and risks must be broadly based, not narrowly drawn within the confines of law enforcement, and the genuine tradeoffs must be clearly understood. To justify a widespread key-recovery infrastructure, the expected misuses of crypto would have to clearly dominate the benefits of its expected uses. To date, there is little real evidence that crypto is becoming a significant problem for law enforcement, and considerable evidence that, worldwide, it is not -- at least not yet.

It must be recognized that the common goal is to reduce total crime, for which multiple approaches are undoubtedly necessary. However, key-recovery schemes do not help the intelligence community (and probably hinder it), and they might backfire badly on the law-enforcement community -- because of the risks outlined here. Law enforcement desperately needs to pursue other avenues. Among many alternatives, database tracking facilities are already widespread, through telephone records, credit-card billing, airline reservations, and so on. Intelligent programs for data fusion could be very effective, although perhaps risky from a privacy point of view. In addition, biometric and other less spoofable forms of identification and authentication would significantly aid in determining who is doing what to whom.

I conclude that the costs of implementing and administering something whose overall feasibility entails so many unresolved risks (some of which are enumerated here) do not seem justifiable at this time, and certainly not until thorough, objective, detailed studies of the implications have been completed. To that end, detailed architectures and procedural definitions are required before the costs and risks can be realistically assessed. I note that this emphasis on understanding the costs and risks is completely consistent with the principles of the OECD Guidelines for Cryptography Policy, established 27 March 1997.

My primary observation here is that the nation is not ready for any widespread key-recovery infrastructure, especially one that might be mandated by the U.S. Government for nationwide use. Furthermore, this is an international issue, not just a national one -- which may significantly complicate the search for adequate solutions. The difficulty of legislating the social implications of emerging Internet technologies is illustrated by the experiences surrounding the Communications Decency Act. Rushing into legislation without serious consideration (as appears to be happening with McCain-Kerrey, for example) runs the risk of prematurely establishing an unworkable and counterproductive policy. If there are genuine commercial demands for key recovery for stored information, those needs will be satisfied naturally. However, there is no real need for key recovery for communications apart from the needs of law enforcement, and the costs and risks are potentially too great to bear in the absence of further study. I firmly support the conclusion of our National Research Council study, which recommends in essence that the U.S. Government use itself as a guinea pig and explore the risks, costs, and other factors before instituting any widespread key-recovery infrastructure.

References

  1. Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption, 27 May 1997 (ftp://research.att.com/dist/mab/key_study.txt or .ps; http://www.crypto.com/key_study). [Attached]

  2. Kenneth W. Dam, W.Y. Smith, Lee Bollinger, Ann Caracristi, Benjamin R. Civiletti, Colin Crook, Samuel H. Fuller, Leslie H. Gelb, Ronald Graham, Martin Hellman, Julius L. Katz, Peter G. Neumann, Raymond Ozzie, Edward C. Schmults, Elliot M. Stone, and Willis H. Ware, Cryptography's Role In Securing the Information Society (a.k.a. the CRISIS report), Final Report of the National Research Council Cryptographic Policy Study Committee, National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1996. The executive summary is available on-line (http://www2.nas.edu/cstbweb).

  3. Susan Landau, Stephen Kent, Clinton Brooks, Scott Charney, Dorothy Denning, Whitfield Diffie, Anthony Lauck, Douglas Miller, Peter G. Neumann, and David Sobel, Codes, Keys, and Conflicts: Issues in U.S. Crypto Policy, Report of a Special Panel of the ACM U.S. Public Policy Committee (USACM), June 1994 (http://info.acm.org/reports/acm_crypto_study.html).

  4. David D. Clark, W. Earl Boebert, Susan Gerhart, John V. Guttag, Richard A. Kemmerer, Stephen T. Kent, Sandra M. Mann Lambert, Butler W. Lampson, John J. Lane, M. Douglas McIlroy, Peter G. Neumann, Michael O. Rabin, Warren Schmitt, Harold F. Tipton, Stephen T. Walker, and Willis H. Ware, Computers at Risk: Safe Computing in the Information Age, National Research Council, National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1991.

  5. Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.

  6. Peter G. Neumann, Security Risks in the Emerging Infrastructure, written testimony for the U.S. Senate Permanent Subcommittee on Investigations of the Senate Committee on Governmental Affairs, 25 June 1996. See Security in Cyberspace, Hearings, S. Hrg. 104-701, 1996, pages 350-363, with oral testimony included on pages 106-111. ISBN 0-16-053913-7 (http://www.csl.sri.com/neumannSenate.html).

  7. Peter G. Neumann, Security and Integrity Controls for Federal, State, and Local Computers accessing NCIC, SRI Technical Report for the FBI, 29 June 1990.

  8. Peter G. Neumann, Illustrative Risks to the Public in the Use of Computer Systems and Related Technology (ftp://www.csl.sri.com/pub/illustrative.PS). [Attached]

  9. Laurie E. Ekstrand, ``National Crime Information Center: Legislation Needed to Deter Misuse of Criminal Justice Information,'' U.S. General Accounting Office testimony before the U.S. House of Representatives Subcommittee on Information, Justice, Agriculture, and Transportation, of the Committee on Government Operations, and the Subcommittee on Civil and Constitutional Rights, of the Committee on the Judiciary, 28 July 1993.

  10. Information Security: Computer Attacks at Department of Defense Pose Increasing Risks, U.S. General Accounting Office, May 1996, GAO/AIMD-96-84.

  11. Phillip A. Porras and Peter G. Neumann, EMERALD: Event Monitoring Enabling Responses to Anomalous Live Disturbances, Proceedings of the National Information Systems Security Conference, October 1997. A preprint is available on-line at http://www.csl.sri.com/intrusion.html, along with other information on current work and historical background.

Personal Background

I have been involved with the U.S. Government in different technological contexts for many years, including (among others) national security, law enforcement, air-traffic control, and NASA. My first computer-related job was for the Navy in the summer of 1953, 44 years ago next week.

I have long been concerned with security, reliability, human safety, system survivability, and privacy in computer-communication systems and networks, and with how to develop systems that can dependably do what is expected of them. For example, I have been involved in designing operating systems and networks, secure database-management systems, and monitoring systems that seek to identify abnormal patterns of behavior. I have also been seriously involved in identifying and preventing risks. Some of this experience is distilled into my book, Computer-Related Risks (Reference 5).

In activities directly related to cryptography and its applications, I was a member of the National Research Council committee (1994-96) that studied U.S. cryptographic policy (Reference 2). I participated in an earlier study of the same subject sponsored by the ACM U.S. Public Policy Committee (USACM) (Reference 3). I was also a coauthor of the 1988-90 National Research Council study report, Computers at Risk (Reference 4).

Over the years, I have had several opportunities to consider the security needs of the FBI. From 1987 to 1989, I served on an expert panel for the House Judiciary Committee Subcommittee on Civil and Constitutional Rights, addressing law-enforcement database systems at the request of then-Congressman Don Edwards. In 1990, at the request of Al Bayse, then Deputy Director of the FBI, I wrote a report on security requirements in the use of the national (NCIC), state, and local databases (Reference 7). In addition, the SRI Computer Science Laboratory had an ongoing project to study the application of our technology for misuse and anomaly detection ("intrusion detection") to FBI internal applications. The most recent incarnation of that technology is summarized in Reference 11.

I am a Fellow of the American Association for the Advancement of Science, the Institute of Electrical and Electronics Engineers, and the Association for Computing Machinery (ACM). My present title is Principal Scientist in the Computer Science Laboratory at SRI International (a not-for-profit institution, formerly Stanford Research Institute), where I have been since 1971 -- after ten years at Bell Telephone Laboratories in Murray Hill, New Jersey. I have doctorates from Harvard and the Technische Hochschule, Darmstadt, Germany (the latter obtained while I was on a Fulbright grant from 1958 to 1960). I am a member of the ACM U.S. Public Policy Committee (USACM), chairman of the ACM Committee on Computers and Public Policy, and Moderator of its widely read Internet Risks Forum (comp.risks).

Attachments for my Senate hearing written testimony:

1. Eleven-authored crypto report, 27 May 1997 (http://www.crypto.com/key_study).

2. Latest edition of ``Illustrative Risks'' summary (ftp://www.csl.sri.com/pub/illustrative.PS).