Information Security Is Not Improving, Relative to the Risks

Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park CA 94025-3493
Telephone: 1-650-859-2375

Testimony for the U.S. House Committee on Government Reform, Subcommittee on Government Efficiency, Financial Management, and Intergovernmental Relations, San Jose, California, 29 August 2001


This is the fourth time I have provided testimony for a U.S. House of Representatives committee relating to computer-communication security, the previous three having been presented in Washington, D.C., in 1997, 1999, and 2000 [1,2,3]. The situation has not been noticeably improving; indeed, we seem to be falling further behind.

Although there have been advances in the research community on information security, trustworthiness, and dependability, the overall situation in practice continues to worsen, relative to the increasing threats and risks -- for a variety of reasons. The information infrastructure is still fundamentally riddled with security vulnerabilities, affecting end-user systems, routers, servers, and communications; new software is typically flawed, and many old flaws still persist; worse yet, patches for residual flaws often introduce new vulnerabilities. There is much greater dependence on the Internet, for Governmental use as well as private and corporate use. Many more systems are being attached to the Internet all over the world, with ever-increasing numbers of users -- some of whom have decidedly ulterior motives. Because so many systems are so easily interconnected, the opportunities for exploiting vulnerabilities have multiplied, and the sources of threats have become ubiquitous. Furthermore, even supposedly stand-alone systems are often vulnerable. Consequently, the risks are increasing faster than our ability to ameliorate them.


There are quite a few realistic but sometimes dirty truths that remain largely unspoken and under-appreciated.


One conclusion from the above discussion is very simple: we are not progressing sufficiently in our attempts to achieve acceptable information security. Essentially everything I wrote in my 1995 book [5] about computer-related risks -- and particularly security risks -- still seems to apply today.

A broadly coordinated effort is needed, not just palliative measures. In principle, technological problems need technological solutions, not legal solutions; legal problems need laws and enforcement, not technological solutions. In general, technologists are better at understanding the technical problems, and the legal community is better at understanding the legal ones. Mismatched solutions tend not to be effective. However, many of our emerging problems require a careful combination of approaches, cognizant of the full spectrum of social, economic, technological, legal, and other needs. At the very minimum, we need vastly improved security, reliability, dependability, and survivability in the face of adversity, in the computer and communication systems on which we critically depend for so many things.

It is unfortunate that many important research advances are not finding their way into practice. In the research community, we have known how to do much better for a long time. For example, many approaches for developing and operating vastly more secure systems and networks can be found in a recent report [6], including system and network architectures that sharply reduce the need to trust potentially untrustworthy components and individuals, while also achieving extensive interoperability and the ability to evolve over time while still fulfilling the desired requirements. However, many factors have contributed to our having less information security than we deserve, including (for example) the U.S. Government's past restrictions on cryptography policy, the House's predominant concern with the immediate future rather than with the longer term, corporations often determined to deliver functionality without regard to security, customers lacking awareness of the risks, and a general lack of commitment to progress.

What Might Congress Do?

Overall, there are few incentives today for the development, operation, and maintenance of robust, secure, reliable computer-communication systems that are so badly needed as a basis for our future. That needs to be corrected.


References

Hot links to the references are included in the Web version of this document.

  1. Peter G. Neumann, Computer-Related Risks and the National Infrastructures. U.S. House Science Committee Subcommittee on Technology, 6 November 1997. In The Role of Computer Security in Protecting U.S. Infrastructures, Hearing, 105th Congress, 1st session, No. 33, ISBN 0-16-056151-5, 1998, pages 64--99, preceded by the oral presentation on pages 61--63. Oral responses to oral questions are on pages 101--118, and written responses to subsequent written questions are on pages 148--161.

  2. Peter G. Neumann, Melissa is Just the Tip of a Titanic Iceberg. Written testimony for the U.S. House Science Committee Subcommittee on Technology, hearing on 15 April 1999.

  3. Peter G. Neumann, Risks in Our Information Infrastructures: The Tip of a Titanic Iceberg Is Still All That Is Visible. Written testimony for the U.S. House Science Committee Subcommittee on Technology, hearing on 10 May 2000, introduced into the record on my behalf by Keith Rhodes of the General Accounting Office.

  4. Tom Marsh (ed.), Critical Foundations: Protecting America's Infrastructures, President's Commission on Critical Infrastructure Protection, October 1997.

  5. Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.

  6. Peter G. Neumann, Practical Architectures for Survivable Systems and Networks, SRI report for the U.S. Army Research Laboratory, 30 June 2000.

Personal Background

I have been in the Computer Science Laboratory at SRI International since 1971, following ten years (1960 to 1970) in the Computer Science Laboratory at Bell Telephone Laboratories in Murray Hill, New Jersey, where from 1965 to 1969 I was heavily involved, jointly with MIT and Honeywell, in Multics, a highly innovative secure operating system. I have doctorates from Harvard and the Technische Hochschule Darmstadt, Germany, where I was on a Fulbright grant. I have taught at Stanford, the University of California at Berkeley, and the University of Maryland. I am a Fellow of the AAAS, ACM, and IEEE, and a recipient of the CPSR Norbert Wiener Award for Professional and Social Responsibility in Computing and the Electronic Frontier Foundation Pioneer Award. I have testified for Senate and House committees, and for the PCCIP. I served on several National Research Council studies, including Computers at Risk (1990) and Cryptography's Role in Securing the Information Society (1996). I am currently on the U.S. General Accounting Office's Executive Council on Information Management and Technology and the U.S. National Science Foundation Computer Information Science and Engineering Advisory Board. I am a co-founder of People For Internet Responsibility. See my Web site for further background.