Computer-Communications Security Risks:
Melissa is Just the Tip of a Titanic Iceberg

Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park CA 94025-3493
Telephone: 1-650-859-2375
E-mail:; Web site:

Written testimony, for the U.S. House Science Committee Subcommittee on Technology, hearing on 15 April 1999

This testimony, for the record, updates my 6 November 1997 testimony, ``Computer-Related Risks and the National Infrastructures'', which appears in your subcommittee's hearing record for that date, The Role of Computer Security in Protecting U.S. Infrastructures, Hearing, 105th Congress, 1st session, No. 33, 1998 (ISBN 0-16-056151-5), with my oral testimony on pages 61--63, and my written testimony on pages 64--99. My oral responses to oral questions at the hearing are on pages 101--118 of that record, and my written responses to subsequent written questions are on pages 148--161. [These 1997 pages are collectively referred to herein as Reference 0.]


This testimony addresses the recent Melissa episode in the larger context of our information system infrastructures and concludes that Melissa is just one more relatively benign (albeit annoying) example of what could happen much more disastrously unless the proper conclusions are drawn and acted upon.

In a nutshell, my 1997 testimony (Reference 0) discussed some of the most serious vulnerabilities that existed then in our critical national infrastructures and in the computer-communication infrastructures on which we all depend, with respect to the security, reliability, and survivability of those systems in the face of numerous adversities. Typical adversities include system penetrators and malicious insiders, and also systems that fall apart all by themselves (without any malicious users) -- as a result of design flaws, implementation bugs, operational mistakes, and many other events that have not been properly anticipated.

In the intervening year and one-half since my 1997 testimony, the already serious situation has in many respects worsened rather than improved, relative to other events that are rapidly overtaking us. Desperately needed technological improvements have been slow to emerge or have not even been attempted. Many new flaws have been uncovered. Even where some improvements have occurred, the likelihood of serious adversities has in many instances increased faster than the improvements. For example, the relatively unconstrained exponential growth of the Internet has opened up new vulnerabilities that can be exploited by penetrators, terrorists, disgruntled ex-employees, trusted but untrustworthy insiders, and other malfeasors. The increased dependence on technology and the fundamental difficulties in software development have actually increased the risks. Systems and networks continue to fall apart on their own. The AT&T Frame Relay Network outage and the Galaxy IV satellite outage are just two examples. Another example involved the Yorktown Aegis missile cruiser being dead in the water for 2 hours and 45 minutes as the result of an unchecked divide-by-zero in a Windows NT application. More Web sites have been broken into and altered by crackers with very little deep knowledge, because scripted attacks on intolerably insecure systems are increasingly widely known.
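The Yorktown failure mode can be illustrated with a small sketch. (The function and parameter names here are hypothetical, invented for exposition; they are not drawn from the actual shipboard software.) An operator-supplied zero reaches an unchecked division, and the resulting exception propagates instead of being contained:

```python
def propulsion_ratio(fuel_flow, valve_setting):
    # Unchecked division: a zero entered into a data field by an
    # operator raises ZeroDivisionError, which -- if unhandled --
    # can take down the entire application.
    return fuel_flow / valve_setting

def propulsion_ratio_checked(fuel_flow, valve_setting):
    # Defensive version: validate the divisor at the point of entry,
    # so bad data is rejected rather than crashing the system.
    if valve_setting == 0:
        raise ValueError("valve_setting must be nonzero; check data entry")
    return fuel_flow / valve_setting
```

The point is not the arithmetic but the engineering discipline: input validation and containment of failures must be designed in from the start, not patched in afterward.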

The Melissa Word macro virus is just one more hint of how vulnerable some systems are to unanticipated combinations of features that may seem separately desirable (notably, certain Microsoft products that were affected), and how vulnerable they are to attacks or collapses potentially of much greater consequence than Melissa. It must be recognized as a symptom of a much broader problem, rather than the problem itself. There is an easy tendency on the part of certain Government departments and agencies to blame high-school students (as in the Cloverdale case), ``hackers'', and other malfeasors, and to believe in law enforcement as a suitable deterrent. On the other hand, there is very little concerted effort to improve the systems and networks, which would be a much more effective defensive approach toward prevention. The most obvious conclusion is that serious preventive and other remedial technological actions are urgently needed across the board of computer-communication systems, and that legislation and law enforcement that focus on punishment of attackers will remain inadequate as long as the technology is so lacking in robustness. The opportunities for much more serious attacks are pervasive throughout government systems, defense systems, and the private sector. The rush into electronic commerce often ignores some of these potential threats. Because of the international nature of the Internet and telecommunication systems (dial-up access is often possible even when systems are not connected to the Internet, or indirect routes may suddenly appear to seemingly disconnected classified systems), and because terrorist activities can wreak irreparable damage, legislation and law enforcement are necessarily only secondary measures. We must have computer-communication systems that are more robust in the face of a wider range of adversities.

The fundamental reason that Melissa could so easily replicate itself is that the most widely-used operating system on the Internet lacks any form of compartmentalization, and Microsoft apparently has no plans to add any such protection. The potential risks of Word macro viruses have been well known for as long as there have been Word macros. In addition, Melissa was so easily able to propagate because the widely-deployed e-mail infrastructure lacks authentication and integrity (along with definitive traceability), and as yet there is no effective scheme that can be retrofitted. The spread of Melissa also involved a lot of people unwittingly allowing an e-mail enclosure to execute within a computer system environment that offered no defenses. The propagation of Melissa was human-aided, in contrast with the 1988 Internet Worm -- which was self-replicating. Many of the lessons that should have been learned from the Internet Worm were evidently not learned. Now we have some further incentives.
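To illustrate what the missing authentication and integrity would provide, consider a keyed message-authentication code. (This is a generic sketch using a shared secret key, not a description of any deployed e-mail protocol; real deployment would require solving key distribution.) Any alteration of the message body, or any message from a sender who lacks the key, fails verification:

```python
import hashlib
import hmac

def tag_message(shared_key: bytes, message: bytes) -> bytes:
    # Compute an authentication tag binding the message to the key.
    return hmac.new(shared_key, message, hashlib.sha256).digest()

def verify_message(shared_key: bytes, message: bytes, tag: bytes) -> bool:
    # Recompute the tag and compare in constant time; a tampered
    # body or a forged sender (wrong key) fails the check.
    return hmac.compare_digest(tag_message(shared_key, message), tag)
```

An e-mail infrastructure with such checks built in would make it much harder to spread something like Melissa under the apparent identity of a trusted correspondent, which is precisely what induced so many recipients to open the enclosure.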

Conclusions Related to the Previous Testimony

Here are a few observations updating my earlier testimony of Reference 0. Within this section, I summarize my earlier testimony and precede by ``&&&&'' the incremental discussion that concludes each item.

Vulnerabilities, Threats, and Risks

Critical Dependencies

Risks in Key-Recovery Cryptography

&&&& As noted above, Melissa succeeded in part because of the lack of widespread authentication. One reason that there is no widespread authentication is that the U.S. Government has discouraged vendors from using encryption, which is essential for good authentication. More recently, the debate over law-enforcement access to unencrypted information has diverted much intellectual energy that could otherwise have gone toward deploying encryption so as to prevent incidents such as Melissa. The French government has recently decided not only that it can get along without key-recovery access, but that it should encourage secure private communication in order not to be left behind in the new world of electronic commerce. It has recognized that its own national well-being is seriously at stake.

System Development and Operation

Other System Considerations

Attaining systems and networks that are dependably survivable, secure, and reliable is a very difficult problem. It is essentially impossible to have any guarantees whatsoever that a particular system will work properly whenever, wherever, and however it is needed. Furthermore, the information infrastructures are highly heterogeneous, which makes it even more difficult to have any guarantees that information infrastructures in the large --- that is, aggregations of different computer systems and networks --- will behave properly. Survivability, security, and reliability are all weak-link phenomena, and there are far too many weak links today. On the other hand, there will always be weak links and resulting risks, and there will always be cases involving multiple weak links. &&&& See Reference 13 for an extensive report on attaining survivable systems and networks.


The challenge for the research and development community is to find ways to avoid as many of these risks as possible, to minimize the consequences of the exploitation or accidental triggering of those that cannot be avoided, and to provide well-founded assurances that systems and networks are likely to be able to satisfy their critical requirements. One of your tasks in the Science Committee must be to encourage such R&D and to ensure that viable new approaches can find their way into our information infrastructures --- and then to encourage their adoption within the national infrastructures. One of the main tasks of those who control the national infrastructures and those who depend on our computer-communication infrastructures is to ensure that they are using the most robust information technology available, that management is fully aware of the risks, and that they coordinate their efforts. As the PCCIP report notes, education and awareness are vital (along with research and development), although it helps greatly if what is being taught and learned can actually lead to much greater survivability, reliability, and security --- benefiting from the best R&D efforts, the most robust open systems, and the best practical systems. This is a very difficult challenge for all of us.

From the above discussion, the realization that so little tangible progress has been made in the technology and that we may indeed be more vulnerable now than we were in 1997 is very disheartening. It suggests that our priorities are not properly aligned. Of utmost concern is the conclusion that the necessary improvements require serious technical measures, more fundamentally than legislation and law enforcement -- which are largely ineffective in the absence of more-robust systems and networks. Defensive legislation and law enforcement are of course necessary -- because no improvements can result in perfectly robust systems -- but they can never be sufficient. Furthermore, the current reliance on a proprietary-system monoculture is extremely dangerous.

The most important realization that you must grapple with is that the Melissa problem is in reality merely a microcosm of the collection of serious problems described here, affecting all of our national and information infrastructures. It is essential that you look at the big picture, rather than focusing on Melissa itself (herself?).


Much greater emphasis must be devoted to the development of secure, reliable, and highly survivable systems and networks. Strong cryptography that is well implemented is just one necessity. The system development process is in general a national disgrace. Good software development practice is seldom used, as illustrated by the bad state of system security, the Y2K problem, failures of defense systems, cancellations of major procurements, and so on. Reversal of that trend requires better education and training, improvements of the procurement process including more stringent and carefully defined requirements, improved oversight of system developments, liability and responsibility for failed systems, and recognition of the risks if dramatic improvement does not occur rapidly.

Significant new research and development efforts are needed, and ways must be found to get the R&D results into the commercial mainstreams. For example, we must find ways of making systems and networks more secure, more reliable, and more survivable, in the face of realistic adversities. Various approaches to making certain nonproprietary software (so-called open software and free software) more robust seem to be very promising (Reference 14). Also, nonintrusive systems for detecting and responding to system misuses (from insiders as well as outsiders) need to be vigorously pursued (although I should note, for disclosure, that I have long been involved in advanced projects on anomaly and misuse detection and responses thereto -- as noted at the end of this testimony). System administrators need all the help they can get -- they are currently expected to keep abreast of a massive flow of security patches, without having much of an understanding of how proprietary code works or how vulnerable it is to misuse. The Y2K system-upgrading exercise shows that enormous resources must be expended in a crash effort to take care of even a single problem that was known long ago. Unfortunately, the myopic focus on that effort has diverted attention from many of the deeper problems.
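As a toy illustration of the anomaly-detection idea (the class name, thresholds, and event format here are invented for exposition, and bear no relation to any specific SRI system), a monitor can keep a sliding window of recent events and flag a user whose failure rate departs from the expected norm:

```python
from collections import deque

class FailedLoginMonitor:
    """Toy sliding-window misuse detector: flags a user whose recent
    failed-login count exceeds a fixed threshold."""

    def __init__(self, window: int = 100, threshold: int = 5):
        self.events = deque(maxlen=window)  # recent (user, failed) pairs
        self.threshold = threshold

    def observe(self, user: str, failed: bool) -> bool:
        # Record the event, then count this user's failures within
        # the window; True means the activity looks anomalous.
        self.events.append((user, failed))
        recent_failures = sum(1 for u, f in self.events if u == user and f)
        return recent_failures > self.threshold
```

Real misuse detection is far more subtle (statistical profiles, correlation across hosts, response policies), but even this sketch shows why such monitoring must be nonintrusive, continuously maintained, and aimed at insiders as well as outsiders.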

In the past year and one-half, system and network security has not improved commensurate with the increased threats. Experience suggests that many new vulnerabilities will exist in new systems. The same kind of mistakes tend to be made over and over. You must become aware in depth of the extent of the threats, the vulnerabilities, the risks, and the measures necessary for some remediation. Although Melissa is really just one more ringing alarm clock, it is time to wake up and spring to action.


Computer-communication vulnerabilities, threats, and risks are documented in considerable detail in References 1, 2, 3, 4, 5, and 12. The recent NRC report Trust in Cyberspace (Reference 12) revisits the earlier NRC report Computers at Risk (Reference 1), and concludes that the problems have not diminished. In addition, specific vulnerabilities and risks related to the use of cryptography are given in References 6, 7, 8, and 9.

NOTE: To simplify the text, specific references to most of the cases cited herein can be found in Reference 5, including the AT&T frame relay outage, Galaxy IV, the rampant Website alterations, the San Francisco blackout, and so on.

  1. David D. Clark, W. Earl Boebert, Susan Gerhart, John V. Guttag, Richard A. Kemmerer, Stephen T. Kent, Sandra M. Mann Lambert, Butler W. Lampson, John J. Lane, M. Douglas McIlroy, Peter G. Neumann, Michael O. Rabin, Warren Schmitt, Harold F. Tipton, Stephen T. Walker, and Willis H. Ware, Computers at Risk: Safe Computing in the Information Age, National Research Council, National Academy Press, 1991, 2101 Constitution Ave., Washington, D.C. 20418.

  2. Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.

  3. Peter G. Neumann, Security Risks in the Emerging Infrastructure, U.S. Senate Permanent Subcommittee on Investigations of the Senate Committee on Governmental Affairs, 25 June 1996. The written testimony appears in Security in Cyberspace, Hearings, S. Hrg. 104-701, ISBN 0-16-053913-7, 1996, pp. 350-363, with oral testimony included on pages 106-111.

  4. Peter G. Neumann, Computer Security in Aviation: Vulnerabilities, Threats, and Risks, International Conference on Aviation Safety and Security in the 21st Century, White House Commission on Safety and Security, and George Washington University, 13-15 January 1997.

  5. Peter G. Neumann, Illustrative Risks to the Public in the Use of Computer Systems and Related Technology (periodically updated index of risks cases). The latest version is browsable on-line (also in .ps and .pdf form).

  6. Peter G. Neumann, Security Risks in Key Recovery, written testimony for the Senate Judiciary Committee hearing on cryptographic key recovery, 9 July 1997.

  7. Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption, 27 May 1997. This report is published in the World Wide Web Journal (Web Security: A Matter of Trust) 2, 3, O'Reilly & Associates, Summer 1997, pages 241-257, and is browsable on-line.

  8. Kenneth W. Dam, W.Y. Smith, Lee Bollinger, Ann Caracristi, Benjamin R. Civiletti, Colin Crook, Samuel H. Fuller, Leslie H. Gelb, Ronald Graham, Martin Hellman, Julius L. Katz, Peter G. Neumann, Raymond Ozzie, Edward C. Schmults, Elliot M. Stone, and Willis H. Ware, Cryptography's Role In Securing the Information Society (a.k.a. the CRISIS report), Final Report of the National Research Council Cryptographic Policy Study Committee, National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1996. The executive summary is available on-line.

  9. Susan Landau, Stephen Kent, Clinton Brooks, Scott Charney, Dorothy Denning, Whitfield Diffie, Anthony Lauck, Douglas Miller, Peter G. Neumann, and David Sobel, Codes, Keys, and Conflicts: Issues in U.S. Crypto Policy, Report of a Special Panel of the ACM U.S. Public Policy Committee (USACM), June 1994.

  10. Information Security: Computer Attacks at Department of Defense Pose Increasing Risks, U.S. General Accounting Office, May 1996, GAO/AIMD-96-84.

  11. ``National Crime Information Center: Legislation Needed to Deter Misuse of Criminal Justice Information,'' statement of Laurie E. Ekstrand, U.S. General Accounting Office, as testimony before the U.S. House of Representatives Subcommittee on Information, Justice, Agriculture, and Transportation, of the Committee on Government Operations, and the Subcommittee on Civil and Constitutional Rights, of the Committee on the Judiciary, 28 July 1993. The appendix to that testimony documents 62 cases of misuses of law-enforcement computer data.

  12. F.B. Schneider and M. Blumenthal, editors, Trust in Cyberspace, National Research Council, 1998.

  13. Peter G. Neumann, Practical Architectures for Survivable Systems and Networks, U.S. Army Research Lab report, 28 January 1999, browsable on-line (also in .ps and .pdf form).

  14. Peter G. Neumann, Robust Open-Source Software, Communications of the ACM 42, 2, February 1999.

  15. Wayne L. O'Hern, Jr., task force chairman, An Open Systems Process for DoD, Open Systems Task Force, Defense Science Board, October 1998.

Personal Background

By way of introduction, I note that I have been involved with the U.S. Government (as well as state and local governments) in different technological contexts for many years, including (for example) national security, law enforcement, air-traffic control, and aviation safety and security (including the early stages of fly-by-wire research and space-station planning). My first computer-related job was for the Navy in the summer of 1953.

I have long been concerned with security, reliability, human safety, system survivability, and privacy in computer-communication systems and networks, and with how to develop systems that can dependably do what is expected of them. For example, I have been involved in designing operating systems and networks, secure database-management systems, and systems that monitor activities and seek to identify abnormal patterns of behavior. I have also been seriously involved in identifying and preventing risks.

I received AB, SM, and PhD degrees from Harvard in 1954, 1955, and 1961, respectively, and in 1960 received a Dr rerum naturalium from the Technische Hochschule, Darmstadt, Germany --- where I had a Fulbright grant for two years. In the Computer Science Lab at Bell Telephone Laboratories at Murray Hill, N.J. throughout the 1960s, I was involved in research in computers and communications; during 1965-69, I participated extensively in the design, development, and management of Multics, a pioneering secure system, developed jointly by MIT, Honeywell, and Bell Labs. (Multics made great advances in security, and incidentally also recognized and avoided the Y2K problem -- in 1965!) I was a visiting Mackay Lecturer at Stanford in 1964 and at Berkeley in 1970-71. I am a Principal Scientist in the Computer Science Laboratory at SRI, where I have been since 1971, concerned with computer systems having critical requirements such as security, reliability, information system survivability, human safety, and high assurance.

I am a member of the General Accounting Office's Executive Council for Information Management, which is heavily involved in Y2K. From 1994 to 1996, I served a 2.5-year term on the Internal Revenue Service Commissioner's Advisory Group, where I addressed privacy and security issues as well as attempting with considerable futility to give the IRS some remedial advice on the seriously flawed Tax Systems Modernization effort. From 1987 to 1989, I served on an expert panel for the House Judiciary Committee Subcommittee on Civil and Constitutional Rights, addressing law-enforcement database systems, at the request of then Congressman Don Edwards.

I was an organizer of the 1982 Air Force Studies Board database security study, and was a member of the 1989-90 National Research Council System Security Study Committee that produced the report Computers at Risk. I recently served on three studies targeted at reviewing U.S. crypto policy: an ACM panel (June 1994, Reference 9), a more intensive National Research Council study (1996, Reference 8), and a report (Reference 7) introduced as part of my written Senate testimony on 9 July 1997 (Reference 6), on the technical implications, risks, and costs of `key recovery', `key escrow', and `trusted third-party' encryption systems.

For the Association for Computing Machinery (ACM), I was founder and Editor of the SIGSOFT Software Engineering Notes (1976-1993) and now Associate Editor for the RISKS material; Chairman of the ACM Committee on Computers and Public Policy (since 1985); and a Contributing Editor for CACM (since 1990) for the monthly `Inside Risks' column. I co-chaired SIGSOFT '91 on software for critical systems. In 1985 I created, and still moderate, the ACM Forum on Risks to the Public in the Use of Computers and Related Technology, which is one of the most widely read of the on-line computer newsgroups. RISKS (comp.risks) provides a medium for discussion of issues relating to all aspects of computers and the social and technological problems that they create. My RISKS-derived book (Computer-Related Risks, Reference 2) explores the benefits and pitfalls of computer-communication technology and suggests ways of avoiding risks in critical systems.

My Web page includes pointers to my 25 June 1996 Senate testimony on security risks in the critical infrastructure (Reference 3), a position statement for the Gore Commission on Aviation Safety and Security (Reference 4), testimony for the House Ways and Means Committee subcommittee on the Social Security Administration and a slightly extended statement for a subsequent SSA panel in San Jose, 28 May 1997, and testimony for the Senate Judiciary Committee on key recovery, 9 July 1997 (Reference 6).

I am a Fellow of the American Association for the Advancement of Science, the ACM, and the Institute of Electrical and Electronics Engineers (and a member of its Computer Society). I received the ACM Outstanding Contribution Award for 1992, the first SRI Exceptional Performance Award for Leadership in Community Service in 1992, the Electronic Frontier Foundation Pioneer Award in 1996, the ACM SIGSOFT Distinguished Service Award in 1997, and the CPSR Norbert Wiener Award in 1997. I am a member of the Advisory Board of the Electronic Privacy Information Center (EPIC).

Federal Funding Disclosure Statement

I am testifying as an individual, not as a representative of my employer (SRI International, a not-for-profit R&D institute), or the ACM, or any other organization. I note for the record that I am currently receiving U.S. Government funding as an SRI employee under research projects from the Army Research Lab and DARPA. These projects are directed at crucial aspects of the problems of protecting computer-communication information infrastructures, as is the vast majority of research and development work that I have done over the past 46 years. See my Web site for further background. (Incidentally, my current role on the GAO Executive Council for Information Management and Technology is a pro bono effort.)