Computer-Related Risks and the National Infrastructures

Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park CA 94025-3493
Telephone: 1-650-859-2375
E-mail: Neumann@CSL.SRI.com; Web site: http://www.csl.sri.com/neumann.html

Written testimony, 6 November 1997, for the U.S. House Science Committee Subcommittee on Technology

The written testimony is published in The Role of Computer Security in Protecting U.S. Infrastructures, Hearing, 105th Congress, 1st session, No. 33, 1998, pages 64--99, ISBN 0-16-056151-5, 1997, preceded by the oral presentation on pages 61--63. Oral responses to oral questions are on pages 101--118, and written responses to subsequent written questions are on pages 148--161.

Summary

The President's Commission on Critical Infrastructure Protection (PCCIP) has completed its investigation, having addressed eight major critical national infrastructures: telecommunications; generation, transmission and distribution of electric power; storage and distribution of gas and oil; water supplies; transportation; banking and finance; emergency services; and continuity of government services. Perhaps most important is the Commission's recognition that very serious vulnerabilities and threats exist in all of these critical infrastructures. Perhaps equally important if not more so is the PCCIP's recognition that all of these critical infrastructures are closely interdependent and that they all depend on underlying computer-communication information infrastructures, such as computing resources, databases, private networks, and the Internet.

Background

Because the PCCIP report itself contains considerable background, I merely summarize my main conclusions. The computer-communication vulnerabilities, threats, and risks are documented in considerable detail in References 1, 2, 3, 4 and 5. In addition, specific vulnerabilities and risks related to the use of cryptography are given in References 6, 7, 8, and 9. I hope that you will review some of that material.

Conclusions

Here are a few brief summary conclusions.

Vulnerabilities, Threats, and Risks

Critical Dependencies

Risks in Key-Recovery Cryptography

System Development and Operation

Other System Considerations

Attaining systems and networks that are dependably survivable, secure, and reliable is a very difficult problem. It is essentially impossible to have any guarantees whatsoever that a particular system will work properly whenever, wherever, and however it is needed. Furthermore, the information infrastructures are highly heterogeneous, which makes it even more difficult to have any guarantees that information infrastructures in the large --- that is, aggregations of different computer systems and networks --- will behave properly. Survivability, security, and reliability are all weak-link phenomena, and there are far too many weak links today. On the other hand, there will always be weak links and resulting risks, and there will always be cases involving multiple weak links.
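The weak-link point can be made quantitative with a simple hypothetical calculation (the numbers are purely illustrative, not drawn from any actual system): if correct operation requires every one of n independent components to survive, each with probability p, then even uniformly "good" components yield a fragile aggregate.

```latex
% Illustrative weak-link arithmetic (hypothetical numbers):
% a system that functions only when all n independent components
% survive, each with survival probability p, itself survives with
P_{\mathrm{system}} = p^{\,n}
% For example, with n = 100 components, each 99% dependable:
P_{\mathrm{system}} = 0.99^{100} \approx 0.37
```

A single markedly weaker link lowers the product still further, which is one reason large heterogeneous infrastructures with many weak links are so hard to defend.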

Analysis

Our job in the research and development community is to find ways to avoid as many of those risks as possible, to minimize the consequences of the exploitation or accidental triggering of those that cannot be avoided, and to provide well-founded assurances that systems and networks are likely to be able to satisfy their critical requirements. One of your tasks in Congress is to encourage such R&D and to ensure that viable new approaches can find their way into the computer-communication information infrastructures --- and then to encourage their adoption within the national infrastructures. One of the main tasks of those who control the national infrastructures is to ensure that they are using the most robust computer-communication technology available, that management is fully aware of the risks, and that they coordinate their efforts. As the PCCIP notes, education and awareness are vital (along with research and development), although it helps greatly if what is being taught and learned can actually lead to much greater survivability, reliability, and security --- benefitting from the best R&D efforts and the best practical systems. This is a very difficult challenge for all of us.

We cannot afford to wait for massive disasters. The Commission has outlined some initial measures that may be beneficial --- if they are pursued vigorously. However, the PCCIP has identified only the tip of a very large iceberg, and there is much more work to be done.

The Commission has clearly recognized that protecting the national infrastructures is a matter of shared responsibility between the private and public sectors, and also that it is a matter of national security. However, it is of the utmost importance that the term "national security" be interpreted not in the narrow sense of the Department of Defense and the National Security Agency, but in the broadest possible sense of the well-being, if not the survival, of the nation.

The PCCIP is to be commended for what they have accomplished, recognizing that an effort of this breadth and scope is almost unprecedented. Their report is very impressive and provides an important basis for future action. It deserves very careful public analysis and discussion. (The unclassified version of the Commission's report became available only a few hours before the deadline for me to submit this written testimony, and therefore I cannot provide any in-depth analysis at this time.) Apparently, there is also considerable backup material relating to the information infrastructure that will be made available later.

The PCCIP's recommendations that may cause the greatest difficulty are those involving industry cooperation and information sharing. Historically, there have always been difficulties in getting competitors within a particular national infrastructure to collaborate, and there have always been problems getting the different national infrastructures to coordinate with each other. The Commission is completely correct that such cooperation is essential. The model of the Office of the Manager of the National Communications System has been fairly successful in some respects, but not in others. For example, the process of improving the security of the telecommunications infrastructure has been slow. Overall, the OMNCS experience bears deeper study.

I am concerned that deregulation of various national infrastructures could have a deleterious effect with respect to the critical infrastructures. Indeed, the PCCIP also recognizes deregulation as a risk that could run counter to their hopes. More generally, the tension between government control and private enterprise is a serious source of difficulty. Individual infrastructures such as electrical power and telecommunications have incentives to be efficient and profitable, but critical security, reliability, and system survivability appear desirable to them only insofar as profitability motivates those requirements. That is not acceptable when national survivability is at stake. On the other hand, draconian government regulation also is not likely to succeed. As an example for consideration, there seems to be less surplus power today, with various providers relying on each other for load sharing; when rolling outages occur, as they did in July and August of 1996, there seems to be increasingly little margin for error.

Critical requirements also seem to be less important to computer-communication system purveyors. We generally suffer from bad system security, weak network security, and unpredictable system and network survivability. There are no easy answers in this respect. However, the efforts needed to protect the critical information infrastructures against threats to the national infrastructures are greater than those that computer system developers perceive their bread-and-butter customers to require. Consequently, some government involvement is essential.

The PCCIP recognizes that the infrastructures are vulnerable and must be improved --- especially the computer-communication infrastructures. Computer-system development practice must become much more firmly grounded in sound principles and good engineering practice. The government procurement process must be reformed to avoid future development fiascoes.

The Commission's chapter on research and development is unfortunately extraordinarily superficial --- only three pages in my Web version. R&D is absolutely essential to the problems confronting us. I can only hope that much more detailed supplementary recommendations will become available later.

The Commission has largely ducked the issue of cryptography, other than to note that it is important to securing the information infrastructure. Unfortunately, they recommend the adoption of key-recovery techniques (simply because they think it seems prudent?), without having analyzed any of the risks and other implications.

I applaud you for holding these hearings. There is an enormous need for open discussion of these issues rather than seeking simplistic would-be remedies --- which in this case do not exist. Before you take any legislative action relating to the critical national infrastructures and the computer-communication infrastructures, I hope you will study the PCCIP report and its backup materials, and then read my book on the risks associated with computer-communication technologies (Reference 2) and perhaps my earlier testimonies (References 3, 4, and 6), as well as some of the other reports mentioned below (References 1, 7, 8, 9, 10, and 11). You must fully understand the vulnerabilities, threats, risks, and potential consequences. The issues are complex; the PCCIP's recommendations for education and awareness must include everyone --- including Congress.

References

  1. David D. Clark, W. Earl Boebert, Susan Gerhart, John V. Guttag, Richard A. Kemmerer, Stephen T. Kent, Sandra M. Mann Lambert, Butler W. Lampson, John J. Lane, M. Douglas McIlroy, Peter G. Neumann, Michael O. Rabin, Warren Schmitt, Harold F. Tipton, Stephen T. Walker, and Willis H. Ware, Computers at Risk: Safe Computing in the Information Age, National Research Council, National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1991.

  2. Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.

  3. Peter G. Neumann, Security Risks in the Emerging Infrastructure, U.S. Senate Permanent Subcommittee on Investigations of the Senate Committee on Governmental Affairs, 25 June 1996. The written testimony appears in Security in Cyberspace, Hearings, S. Hrg. 104-701, ISBN 0-16-053913-7, 1996, pages 350--363, with oral testimony included on pages 106--111 (http://www.csl.sri.com/neumannSenate.html).

  4. Peter G. Neumann, Computer Security in Aviation: Vulnerabilities, Threats, and Risks, International Conference on Aviation Safety and Security in the 21st Century, White House Commission on Safety and Security, and George Washington University. 13-15 January 1997 (http://www.csl.sri.com/neumann/air.html).

  5. Peter G. Neumann, Illustrative Risks to the Public in the Use of Computer Systems and Related Technology (periodically updated index of risks cases, ftp://www.csl.sri.com/pub/illustrative.PS).

  6. Peter G. Neumann, Security Risks in Key Recovery, written testimony for the Senate Judiciary Committee hearing on cryptographic key recovery, 9 July 1997 (http://www.csl.sri.com/neumann/judiciary.html).

  7. Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption, 27 May 1997. This report is published in the World Wide Web Journal (Web Security: A Matter of Trust) 2, 3, O'Reilly & Associates, Summer 1997, pages 241-257 (ftp://research.att.com/dist/mab/key_study.txt or .ps; http://www.crypto.com/key_study).

  8. Kenneth W. Dam, W.Y. Smith, Lee Bollinger, Ann Caracristi, Benjamin R. Civiletti, Colin Crook, Samuel H. Fuller, Leslie H. Gelb, Ronald Graham, Martin Hellman, Julius L. Katz, Peter G. Neumann, Raymond Ozzie, Edward C. Schmults, Elliot M. Stone, and Willis H. Ware, Cryptography's Role In Securing the Information Society (a.k.a. the CRISIS report), Final Report of the National Research Council Cryptographic Policy Study Committee, National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1996. The executive summary is available on-line (http://www2.nas.edu/cstbweb).

  9. Susan Landau, Stephen Kent, Clinton Brooks, Scott Charney, Dorothy Denning, Whitfield Diffie, Anthony Lauck, Douglas Miller, Peter G. Neumann, and David Sobel, Codes, Keys, and Conflicts: Issues in U.S. Crypto Policy, Report of a Special Panel of the ACM U.S. Public Policy Committee (USACM), June 1994 (http://info.acm.org/reports/acm_crypto_study.html).

  10. Information Security: Computer Attacks at Department of Defense Pose Increasing Risks, U.S. General Accounting Office, May 1996, GAO/AIMD-96-84.

  11. "National Crime Information Center: Legislation Needed to Deter Misuse of Criminal Justice Information," statement of Laurie E. Ekstrand, U.S. General Accounting Office, as testimony before the U.S. House of Representatives Subcommittee on Information, Justice, Agriculture, and Transportation, of the Committee on Government Operations, and the Subcommittee on Civil and Constitutional Rights, of the Committee on the Judiciary, 28 July 1993. The appendix to that testimony documents 62 cases of misuses of law-enforcement computer data.

Personal Background

By way of introduction, I note that I have been involved with the U.S. Government (as well as state and local governments) in different technological contexts for many years, including (for example) national security, law enforcement, air-traffic control, and aviation safety and security (including the early stages of fly-by-wire research and space-station planning). My first computer-related job was for the Navy in the summer of 1953.

I have long been concerned with security, reliability, human safety, system survivability, and privacy in computer-communication systems and networks, and with how to develop systems that can dependably do what is expected of them. For example, I have been involved in designing operating systems and networks, secure database-management systems, and systems that monitor activities and seek to identify abnormal patterns of behavior. I have also been seriously involved in identifying and preventing risks. I received AB, SM, and PhD degrees from Harvard in 1954, 1955, and 1961, respectively, and in 1960 received a Dr. rerum naturalium from the Technische Hochschule, Darmstadt, Germany --- where I was a Fulbright scholar for two years. In the Computer Science Lab at Bell Telephone Labs at Murray Hill, N.J., throughout the 1960s, I was involved in research in computers and communications; during 1965-69, I participated extensively in the design, development, and management of Multics, a pioneering secure system, developed jointly by MIT, Honeywell, and Bell Labs. I was a visiting Mackay Lecturer at Stanford in 1964 and at Berkeley in 1970-71. I am a Principal Scientist in the Computer Science Laboratory at SRI, where I have been since 1971, concerned with computer systems having critical requirements such as security, reliability, human safety, and high assurance.

I am a member of the General Accounting Office's Executive Council for Information Management. From 1994 to 1996, I served a 2.5-year term on the Internal Revenue Service Commissioner's Advisory Group, where I addressed privacy and security issues as well as attempting, with considerable futility, to give the IRS some remedial advice on the seriously flawed Tax Systems Modernization effort. From 1987 to 1989, I served on an expert panel for the House Judiciary Committee Subcommittee on Civil and Constitutional Rights, addressing law-enforcement database systems, at the request of Congressman Don Edwards.

I was an organizer of the 1982 Air Force Studies Board database security study, and was a member of the 1989-90 National Research Council System Security Study Committee that produced the report, Computers at Risk. I recently served on three studies targeted at reviewing U.S. crypto policy: an ACM panel (June 1994, Reference 9), a more intensive National Research Council study (1996, Reference 8), and a report (Reference 7) introduced as part of my written Senate testimony on 9 July 1997 (Reference 6), on the technical implications, risks, and costs of "key recovery", "key escrow", and "trusted third-party" encryption systems.

For the Association for Computing Machinery, I was founder and Editor of the SIGSOFT Software Engineering Notes (1976-1993) and now Associate Editor for the RISKS material; Chairman of the ACM Committee on Computers and Public Policy (since 1985); and a Contributing Editor for CACM (since 1990) for the monthly "Inside Risks" column. I co-chaired SIGSOFT '91 on software for critical systems. In 1985 I created, and still moderate, the ACM Forum on Risks to the Public in the Use of Computers and Related Technology, which is one of the most widely read of the on-line computer newsgroups. RISKS (comp.risks) provides a medium for discussion of issues relating to all aspects of computers and the social and technological problems that they create. My RISKS-derived book (Computer-Related Risks, Reference 2) explores the benefits and pitfalls of computer-communication technology and suggests ways of avoiding risks in critical systems.

My Web page (http://www.CSL.sri.com/neumann/) includes pointers to my 25 June 1996 Senate testimony on security risks in the critical infrastructure (Reference 3), a position statement for the Gore Commission on Aviation Safety and Security (Reference 4), testimony for the House Ways and Means Committee subcommittee on the Social Security Administration and a slightly extended statement for a subsequent SSA panel in San Jose, 28 May 1997, and testimony for the Senate Judiciary committee on key recovery, 9 July 1997 (Reference 6).

I am a Fellow of the American Association for the Advancement of Science, the ACM, and the Institute of Electrical and Electronics Engineers (and a member of the Computer Society). I received the ACM Outstanding Contribution Award for 1992, the first SRI Exceptional Performance Award for Leadership in Community Service in 1992, the Electronic Frontier Foundation Pioneer Award in 1996, the ACM SIGSOFT Distinguished Service Award in 1997, and the CPSR Norbert Wiener Award in 1997. I am a member of the Advisory Board of the Electronic Privacy Information Center (EPIC).

Federal Funding Disclosure Statement

I am testifying as an individual, not as a representative of my employer (SRI International) or the ACM or any other organization in which I participate (such as the GAO). I note for the record that I am currently receiving U.S. Government funding under two research projects. Both projects are directed at crucial aspects of the problems of protecting computer-communication information infrastructures, as is the vast majority of research and development work that I have done over the past 44 years. See my Web site (http://www.CSL.sri.com/neumann/) for more details.