Computer-Communications Security Risks:
Melissa is Just the Tip of a Titanic Iceberg
Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park CA 94025-3493
Telephone: 1-650-859-2375
E-mail: Neumann@CSL.SRI.com; Web site: http://www.csl.sri.com/neumann.html
Written testimony, for the U.S. House Science Committee Subcommittee
on Technology, hearing on 15 April 1999
This testimony, for the record, updates my 6 November 1997 testimony,
``Computer-Related Risks and the National Infrastructures'', which appears
in your subcommittee's hearing record for that date, The Role of
Computer Security in Protecting U.S. Infrastructures, Hearing, 105th
Congress, 1st session, No. 33, 1998 (ISBN 0-16-056151-5), with my oral
testimony on pages 61--63, and my written testimony on pages 64--99. My
oral responses to oral questions at the hearing are on pages 101--118 of
that record, and my written responses to subsequent written questions are on
pages 148--161. [These 1997 pages are collectively referred to herein as
Reference 0.]
Summary
This testimony addresses the recent Melissa episode in the larger context of
our information system infrastructures and concludes that Melissa is just
one more relatively benign (albeit annoying) example of what could happen
much more disastrously unless the proper conclusions are drawn and acted
upon.
In a nutshell, my 1997 testimony (Reference 0) discussed some of the most
serious vulnerabilities that existed then in our critical national
infrastructures and in the computer-communication infrastructures on which
we all depend, with respect to the security, reliability, and survivability
of those systems in the face of numerous adversities. Typical adversities
include system penetrators and malicious insiders, and also systems that
fall apart all by themselves (without any malicious users) -- as a result of
design flaws, implementation bugs, operational mistakes, and many other
events that have not been properly anticipated.
In the intervening year and one-half since my 1997 testimony, the already
serious situation has in many respects worsened rather than improved, as
other events rapidly overtake us. Desperately needed
technological improvements have been slow to emerge or have not even been
attempted. Many new flaws have been uncovered. Even in cases where some
improvements have occurred, the likelihood of serious adversities has in many
instances increased faster than the improvements. For example, the
relatively unconstrained exponential growth of the Internet has opened up
new vulnerabilities that can be exploited by penetrators, terrorists,
disgruntled ex-employees, trusted but untrustworthy insiders, and other
malfeasors. The increased dependence on technology and fundamental
difficulties in software development have actually increased the risks.
Systems and networks continue to fall apart on their own. The AT&T Frame
Relay Network outage and the Galaxy IV satellite outage are just two
examples. Another example involved the Yorktown Aegis missile cruiser being
dead in the water for 2 hours and 45 minutes as the result of an unchecked
divide-by-zero in a Windows NT application. More Web sites have been broken
into and altered by crackers with very little deep knowledge, because
scripted attacks on intolerably insecure systems have become widely known.
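To make the Yorktown example concrete (the actual shipboard application is
not public, so the following is purely a hypothetical sketch in Python), the
failure mode is a single unvalidated zero in an input field reaching a
division:

# Hypothetical sketch only; the real Yorktown code is not public. It shows
# how one unvalidated zero in an input record can abort an entire processing
# loop unless the division is explicitly guarded.

def naive_rate(distance, elapsed):
    return distance / elapsed        # raises ZeroDivisionError on bad data

def guarded_rate(distance, elapsed):
    if elapsed == 0:                 # reject the bad record, keep running
        return None
    return distance / elapsed

for elapsed in (4.0, 0.0, 2.5):      # one operator-entered zero among good data
    r = guarded_rate(100.0, elapsed)
    print("skipped bad record" if r is None else r)

The point is not this particular check, but that the absence of one such
check in one application was reportedly enough to leave the ship dead in the
water for hours.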
The Melissa Word macro virus is just one more hint of how vulnerable
some systems are to unanticipated combinations of features that may
seem separately desirable (notably, certain Microsoft products that
were affected), and how vulnerable they are to attacks or collapses
potentially of much greater consequence than Melissa. It must be
recognized as a symptom of a much broader problem, rather than the
problem itself. There is an easy tendency on the part of certain
Government departments and agencies to blame high-school students (as
in the Cloverdale case), ``hackers'', and other malfeasors, and to
believe in law enforcement as a suitable deterrent. On the other
hand, there is very little concerted effort to improve the systems and
networks, which would be a much more effective defensive approach
toward prevention. The most obvious conclusion is that serious
preventive and other remedial technological actions are urgently
needed across the board of computer-communication systems, and that
legislation and law enforcement that focus on punishment of attackers
will remain inadequate as long as the technology is so lacking in
robustness. The opportunities for much more serious attacks are
ubiquitous, and pervasive throughout government systems, defense
systems, and the private sector. The rush into electronic commerce
often ignores some of these potential threats. Because of the
international nature of the Internet and telecommunication systems
(dial-up access is often possible even when systems are not connected
to the Internet, or indirect routes may suddenly appear to
seemingly disconnected classified systems), and because terrorist
activities can wreak irreparable damage, legislation and law
enforcement are necessarily only secondary measures. We must have
computer-communication systems that are more robust in the face of a
wider range of adversities.
The fundamental reason that Melissa could so easily replicate itself
is that the most widely-used operating system on the Internet lacks
any form of compartmentalization, and Microsoft apparently has no plans
to add any such protection. The potential risks of Word macro viruses
have been well known for as long as there have been Word macros. In
addition, Melissa was so easily able to propagate because the
widely-deployed e-mail infrastructure lacks authentication and
integrity (along with definitive traceability), and as yet there is no
effective scheme that can be retrofitted. The spread of Melissa also
involved a lot of people unwittingly allowing an e-mail enclosure to
execute within a computer system environment that offered no defenses.
The propagation of Melissa was human-aided, in contrast with the 1988
Internet Worm -- which was self-replicating. Many of the lessons
that should have been learned from the Internet Worm were evidently
not learned. Now we have some further incentives.
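As a minimal sketch of the kind of origin authentication and integrity
checking that the preceding paragraph says is missing from the deployed
e-mail infrastructure (this is an illustration in Python, not any deployed
e-mail standard; real systems would use public-key signatures rather than a
shared key), a mail client could refuse to act on any enclosure whose message
fails verification:

# Minimal sketch, not a deployed e-mail protocol: it shows only the idea of
# verifying message integrity and origin before an enclosure is ever acted on.
import hmac, hashlib

SHARED_KEY = b"out-of-band key"          # assumption: key agreed in advance

def sign(message: bytes) -> bytes:
    return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    return hmac.compare_digest(sign(message), tag)

msg = b"See the attached document."
tag = sign(msg)                          # attached by the authentic sender

assert verify(msg, tag)                      # genuine message passes
assert not verify(b"tampered " + msg, tag)   # altered or spoofed mail fails
# A client enforcing such a check would simply refuse to open the enclosure.

With no such verification in place, a message that appears to come from a
trusted colleague -- as Melissa's did -- carries no evidence of its actual
origin or of whether its content has been altered.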
Conclusions Related to the Previous Testimony
Here are a few observations updating my earlier testimony of Reference 0.
Within this section, each bulleted item summarizes the earlier testimony; the
incremental discussion that concludes each item is preceded by ``&&&&''.
Vulnerabilities, Threats, and Risks
- The existing information infrastructures are riddled with
vulnerabilities, including hardware and software unreliability and system
unavailability. Examples include the nationwide 1990 AT&T long-distance
collapse and many recent outages and saturations of Internet service
providers. Further cases are anticipated in computer systems that may break
in the Year 2000 because of two-digit date fields. Serious security flaws
are also abundant in computer systems, networks, Web software, and
programming languages, and have been widely reported. The extent of the risks is still
not widely recognized, and preventive measures have been very slow to
develop. Indeed, we are in all likelihood not even aware of many still
unidentified vulnerabilities, and new vulnerabilities are continually being
introduced. Future disasters may exploit vulnerabilities we do not know
about. &&&& THIS IS ALL STILL VERY TRUE, AND SEEMS TO BE
INCREASINGLY PROBLEMATIC RATHER THAN LESS SO. Some of the apparent progress
in the Y2K problem is still to be verified by future events: although local
testing may be successful, large-scale fault modes may remain hidden. (A
small sketch of the underlying two-digit date problem follows this item.)
Furthermore, some of the self-reported improvement is the result of having
redefined certain noncompliant systems as no longer considered ``critical''.
The massive costs incurred in attempting to confront Y2K are further
examples of endemic shortsightedness over many years and the continual lack
of good software engineering practice. Ironically, there is now widespread
awareness of the Y2K problem, and significant effort expended in avoiding
it; however, the security problem is larger, more pervasive, liable to
strike at unpredictable times, and yet has received much less attention
(perhaps because it does not have a fixed ``drop-dead'' date or a catchy
name). Systems continue to be broken into or to break by themselves in the
absence of intruders. In addition, insider misuse remains an enormous
potential problem that is largely unrecognized and undefended.
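A minimal illustration of the two-digit date-field problem referred to in
this item (hypothetical values, sketched in Python):

# Minimal illustration of the two-digit date-field problem: with only two
# digits stored, the year 2000 ("00") sorts before 1999 ("99"), so any age,
# interval, or expiration computation built on such fields silently breaks.

issued, expires = "99", "00"             # meant as 1999 and 2000

print(int(expires) - int(issued))        # -99, not the intended 1
print(sorted(["99", "00"]))              # ['00', '99']: 2000 ordered first

# A windowing repair (one common Y2K fix) reinterprets two-digit years:
def expand(yy: str, pivot: int = 50) -> int:
    y = int(yy)
    return 1900 + y if y >= pivot else 2000 + y

print(expand(expires) - expand(issued))  # 1, as intended

Such windowing repairs are stopgaps; the deeper issue, as noted above, is the
shortsighted representation choice itself.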
- There are many realistic threats to the information infrastructures,
including malicious insiders and intruders, terrorists, saboteurs, and just
plain incompetent administrative and operational staff. These threats may
come from corporate and national interests as well as individuals --- in
addition to effects of the environment, natural phenomena, accidental
interference, and so on. Malicious attacks may come from anywhere in the
world, via dial-up lines and network connections, often anonymously. The
list of threats is long and multidimensional (and is discussed in the report
of the President's Commission on Critical Infrastructure Protection, PCCIP).
Consequently, it is not possible to predict which threats will be
exploited, and under what circumstances. &&&& THIS IS STILL TRUE.
THE KNOWN THREATS HAVE INCREASED IN SCOPE AND LIKELIHOOD, while the defenses
have lagged; furthermore, new vulnerabilities are continually being
discovered. The number of detected and catalogued virus types continues to
double each year, partly because of the availability of easy-to-use toolkits
for generating them!
- Thus far, there have been relatively few truly serious malicious
attacks on computer systems and networking (for example, see Reference
10, which includes analysis of the Rome Lab case), although such
activities from both insiders and outsiders appear to be increasing,
particularly in financial systems (such as the $588 million Japanese
Pachinko frauds and the Citibank case). There have been numerous cases of
more than mere nuisance value (for example, the hacking of Web sites of the
Justice Department, CIA, US Air Force, and NASA), including many denials of
service (for example, network flooding attacks that have disabled entire
networks). There have also been penetration studies that have
constructively demonstrated the extent of the vulnerabilities, without
malicious intent (such as the 1988 Internet Worm and numerous analyses and
demonstrations of flaws in Web browsers, servers, protocols, algorithms, and
encryption schemes). It is good that we have so many friendly participants
in this struggle to increase dependability. Perhaps because there have been
no truly devastating attacks, concern is less than it should be ---
considering the magnitude of the potential risks. However, the rapid
acceleration of electronic commerce can be expected to inspire some
ingenious massive frauds that systematically exploit various major
vulnerabilities on the information infrastructure --- which could be a
goldmine for organized crime. &&&& THIS IS STILL TRUE. Melissa is
just one more reminder, but still not a case of massive malicious
destruction. It is often speculated that it would take an event of
Chernobyl-like destruction with world-wide consequences before awareness of
the risks is raised sufficiently. However, the reactions to the Internet
Worm were mostly palliative -- a few people fixed up a few things and again
stuck their heads in the sand. The reactions to Melissa appear to be about
the same -- it is seen by some people as just another silly prank, and by
law enforcement as a case that must be prosecuted to the hilt. It would be
very appropriate to recognize the long-term deeper risks of flawed
information infrastructures, and to respond proactively with substantive
corrective action.
- In many cases, system collapses attributable to reliability problems
could also have been triggered maliciously, because of corresponding
security vulnerabilities. &&&& THIS IS STILL TRUE AND WILL REMAIN TRUE
in the absence of enormous improvements in system design.
Critical Dependencies
- Because all eight of the national infrastructures cited by the
President's Commission on Critical Infrastructure Protection are more or
less critically dependent on the computer-communication infrastructures,
risks to the latter immediately become of concern to the former. In
addition, the computer-communication information infrastructures are
themselves dependent on many of the national infrastructures --- most
notably electric power and telecommunications. To date, we have seen
relatively few attacks on national infrastructures, although the 23 October
1997 power outage in the northern half of San Francisco appears to have been
sabotage, following an earlier physical attack. Many other attacks would be
very easy to perpetrate, especially those that exploit vulnerabilities in
the information infrastructures. &&&& THIS IS STILL TRUE, although a
few marginal improvements are noted in the critical infrastructures, partly
as a byproduct of their having to deal with the Y2K problem and having the
opportunity to fix a few other problems as well. Also, some of the
PCCIP/CIAO organizational structures have been established. However, such
organizational approaches are by themselves not capable of overcoming the
inherent limitations of the technology as it exists today.
- We have become massively interconnected and interdependent. Whether we
like it or not, we must coexist with people and systems of unknown and
unidentifiable trustworthiness (including unidentifiable hostile parties),
within the U.S. and elsewhere. The problems have become international as
well as national. &&&& THIS HAS ESCALATED FURTHER, including the
information warfare attacks on NATO systems and the discovery of
Trojan horses that have been introduced into this country in software that
had been shipped abroad for Y2K-related repairs. Furthermore, dependence
on information on the Internet is inherently risky: much information is
simply wrong or misleading, and the possibility of intentionally altered
information is a consequence of the lack of adequate security.
- The use of certifiably strong cryptography is essential to the future
of computer-communication systems, and therefore to the protection of the
critical national infrastructures. Cryptography is only one small link in
achieving confidentiality, authentication, and integrity, but it is a vital
link. Although cryptographic compromises have not been a major source of
security risks in the past, preventing them will be especially critical to
the successful conduct of electronic commerce, which is growing very rapidly
and places stringent demands on computer-communication systems. Secure
implementation of cryptographic systems requires strong operating systems,
networking, and application software, and strong authentication of users and
systems. &&&& THIS IS A MIXED BAG. 56-bit DES encryption is now
demonstrably broken by efforts that have been able to crack the keys in a
matter of hours. U.S. cryptography policy has reluctantly acknowledged
that, but is otherwise still troglodytic. Much to the surprise of many
people, implementations of cryptography have also been broken -- largely
because of the intricate nature of the task and their being embedded in weak
operating systems. The Advanced Encryption Standard (AES) effort holds
considerable promise with respect to algorithmic successors to DES. New
legislation is pending in the House and Senate on cryptographic policy, but
it is not clear what will emerge.
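To put the 56-bit and 40-bit figures in perspective, here is a
back-of-the-envelope key-search calculation; the assumed search rate is
purely illustrative, not a measurement of any particular cracking effort:

# Back-of-the-envelope key-search arithmetic; the search rate is an assumed,
# illustrative figure, not a measurement of any real cracking effort.

keys_56 = 2 ** 56           # DES keyspace: about 7.2e16 keys
keys_40 = 2 ** 40           # exportable 40-bit keyspace: about 1.1e12 keys
rate    = 1e11              # assumed keys tested per second

print(keys_56 / rate / 3600)   # ~200 hours to sweep the full 56-bit space
print(keys_40 / rate)          # ~11 seconds for the full 40-bit space
print(keys_56 // keys_40)      # 65536: each added 16 bits means 65,536x the work

On average only about half the space need be searched, and dedicated hardware
or large distributed efforts can do considerably better, which is consistent
with 56-bit DES keys falling in tens of hours and 40-bit keys in seconds.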
Risks in Key-Recovery Cryptography
&&&& As noted above, Melissa succeeded in part because of the lack of
widespread authentication. One reason that there is no widespread
authentication is that the U.S. Government has discouraged vendors from
using encryption, which is essential for good authentication. More recently,
the debate over law-enforcement access to unencrypted information has
diverted much intellectual energy that would otherwise have been working on
deploying encryption so as to prevent incidents such as Melissa. The French
government has recently decided not only that they can get along without
access involving key control, but that they should encourage secure private
communication in order not to be left behind in the new world of electronic
commerce. They have recognized that their own national well-being is
seriously at stake.
- The risks involved in key-recovery, key-escrow, and supposedly
trustworthy third parties (or even second parties) have barely begun to be
assessed by Congress and the Administration. Cryptography was considered
rather tangentially by the PCCIP --- recognizing that it is important, and
concluding that key management with key recovery would be advisable.
However, I believe that a thorough analysis of the potential risks of key
management must be conducted before instituting any key-recovery
technologies that could be inherently flawed, extremely riskful, and
possibly counterproductive to the overall goal of protecting the
infrastructures. In reality, the use of flawed key-management techniques
would greatly reduce security rather than increase it. Indeed, the inherent
security weaknesses in our computer systems themselves can often lead to the
subvertibility of even the best cryptographic techniques; cryptographic
systems are typically broken not by exhaustively searching for the keys, but
rather by finding much simpler ways to bypass or compromise the cryptography
and key management. Although key-recovery systems can of course be built,
there are no real assurances that they can be operated satisfactorily.
&&&& THIS IS UNCHANGED, although the names of the same flawed
trap-doored concepts keep changing -- from surreptitious key-escrow access
to surreptitious key-recovery access to surreptitious plain-text access. In
addition, long-term economic issues are finally becoming more widely
recognized in Congress. Cryptographic implementations that are at least as
good as what can be obtained within the U.S. are increasingly appearing
abroad, and the presumed dominance of U.S. software is now noticeably
beginning to erode with respect to cryptography.
- The desires of law enforcement for access to cryptographic keys run
directly counter to the need to protect our critical infrastructures.
Key-recovery cryptography is a particularly complex subject, even more than
the other topics discussed here; it does not lend itself to simplistic
would-be solutions. I only hint at the complexity and the risks here, and
urge you to see References 6, 7, 8, and 9 if you are actively pursuing this particular aspect of the overall
problem. &&&& THIS IS STILL A PROBLEM. No satisfactory bounded-risk
solution to surreptitious access to decrypted content has emerged, and none
is likely to emerge as long as the computer-communication infrastructures
are so flaky. However, it is becoming obvious that unbreakable crypto is
essential for our own protection and those of our allies. Furthermore, the
availability in Kosovo of only easily broken exportable crypto was a serious
risk to those people attempting to communicate securely in defending
themselves, in light of the fact that 40-bit crypto can now be broken in a
matter of seconds.
- There are numerous alternatives to key recovery that could be
considered. For example, database tracking facilities are already
widespread, through telephone records, credit-card billing, airline
reservations, etc. Intelligent programs for data fusion could be very
effective -- although perhaps risky from a privacy point of view.
Additionally, use of biometric and other forms of less spoofable
identification and authentication would add significantly to determining who
is doing what to whom. &&&& THIS IS STILL A PROBLEM. Much better
authentication is needed in systems, to deter penetrations. Much better
monitoring and analysis of real-time activities can be achieved without
having to build in trapdoors that can themselves be misused. The potential
for insider misuse remains a serious threat that needs much greater
attention. (Fortunately, earlier legislation to couple key recovery with
authentication infrastructures seems to have disappeared; it was a true
disaster waiting to happen.)
- One of the serious risks of cryptographic key recovery is that today's
information infrastructures are so weak, and that inherent risks are likely
to arise in the key management itself. However, even if the
computer-communication technology could be greatly improved, there are still
some very serious potential risks that must be resolved --- for example,
relating to sloppy administration and system operation and to insider misuse
of the key infrastructures. The ultimate necessity of having to rely on
second- and third-party agents and other people who could be corruptible
must be a serious concern. On balance, the risks may outweigh the
benefits. &&&& THIS IS STILL TRUE TODAY, AND IS NOT LIKELY TO CHANGE
IN THE FORESEEABLE FUTURE.
- There have been notable cases of computer system and database misuse
within the government, including misuse by IRS agents (as demonstrated in
recent House hearings) and law-enforcement personnel (for example, see
Reference 11). There have also been many computer-related misuses
committed by authorized system insiders outside of government. Given the
opportunities for and past experiences with insider misuse, schemes that
require surreptitious access to otherwise completely secret and highly
protected cryptographic keys are inherently suspect. Such schemes represent
trapdoors in search of illegal exploitation. &&&& THIS IS
ESSENTIALLY UNCHANGED. The recent Los Alamos case illustrates the ease with
which information can be leaked. Insider misuse will remain a serious
threat, even if outsider threats can be reduced through better user
authentication.
System Development and Operation
- There are enormous weaknesses in the software development process, with
many large systems that have been drastically over budget, seriously late,
and ultimately incapable of satisfying their stated requirements. Recent
experience with the federally mandated California Statewide Automated Child
Support System is illustrative of the difficulties in developing large
systems. Several major system developments have been terminated after large
expenditures and prolonged developments, including the FAA's efforts to
upgrade an archaic air-traffic control system, the IRS's Tax Systems
Modernization effort, the FBI's automated fingerprint system, and the
California DMV system. Other developments are struggling along, including
the FBI NCIC upgrade. According to various GAO sources, roughly one third
of all large system developments are scuttled before being operational.
Pervasive difficulties in system development suggest that critical
information systems will continue to be seriously flawed. &&&& THIS
IS STILL TRUE. Incidentally, various efforts are underway to extend the
open-source software paradigm toward meaningfully robust and well-supported
nonproprietary systems (or licensed proprietary systems with modifiable
source code), with trustworthy sources from which to obtain trustworthy code
via trustworthy distribution paths. Proprietary systems
have the serious drawback that it is very difficult for anyone other than
the developers to analyze these systems, in order to be able to identify
potentially serious problems and to remediate them. (See Reference
14.) The most popular proprietary closed-source systems are
seriously flawed with respect to security and reliability, but are not
easily modified by anyone but the developers. An additional trend that has
emerged since Reference 0 was written is the tendency toward a monoculture,
in which just a single vendor or even a single system type becomes universal
-- for example, within the Department of Defense. The recent report of an
open-systems task force (Reference 15) also points in an important
direction: the ability to combine components into predictably dependable
systems. Melissa suggests that cyberdiversity akin to biodiversity is a
much more sensible strategy, because it was precisely the prevailing
monoculture that was affected by Melissa -- as had been expected.
- The development, operation, maintenance, and management of computer
systems and networks that must survive despite arbitrary threats require a
highly disciplined approach to our information infrastructures. Such an
approach must address survivability, reliability, and security within a
common integrated framework and must recognize that these attributes are
closely interrelated --- just as the national infrastructures and the
information infrastructures are closely interrelated. Furthermore, system
and network behavior tends to be very unpredictable, typically because of
flawed software, but sometimes because of unexpected hardware behavior or
invalid or outdated assumptions about the environment (e.g., Ariane 5, the
Year-2000 problem, supposedly redundant circuits cut by a single backhoe, a
single day on the New York Stock Exchange and NASDAQ vastly exceeding all
previous days). (Incidentally, several of my current research projects
noted in the disclosure statement at the end of this testimony are actively
addressing such a disciplined approach.) &&&& SOME PROGRESS EXISTS
in the research community (for example, Reference 13), but not
much of that has been transitioned into the real world yet. The Y2K
clean-up experience is very instructive. The code correction has itself
been expensive, but the testing effort is soaking up huge resources
(reportedly 75% of the entire costs); however, because testing is inherently
incomplete and local testing can mask more global failure modes, many uncertainties
remain. A lesson from this process is that up-front effort in requirements
and design can pay off enormously later on. But that lesson never seems to
get learned, because the rush to market and the belief that users are the
least expensive beta-debugging platform are both contributing to some
mediocre systems.
Other System Considerations
- There are many potential failure modes in which a single apparently
minor event can trigger a chain reaction resulting in massive outages, such
as the 1980 ARPAnet collapse, the 1990 AT&T collapse, and the Western power
outages of July and August 1997. It may seem surprising to some of you that
such widespread outages can result from local events, but such risks grow
with both the complexity of interconnected systems and with the attempts to
optimize the performance of the whole by increasing the coupling of the
parts. Much more effort must be devoted to analyzing and preventing such
occurrences, considering them as both security and reliability problems in a
common context. Furthermore, we must improve the ability of our computer
systems to continue to operate acceptably in the presence of security
attacks and reliability collapses. &&&& THIS IS STILL TRUE, and
remains as an extremely difficult problem -- both in research and in
practical systems.
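The coupling argument can be made concrete with a toy load-redistribution
model (entirely hypothetical numbers, sketched in Python; it is intended only
to show the qualitative effect, not to model any real network):

# Toy cascading-failure model with hypothetical numbers: each node carries
# some load; when a node fails, a "coupling" fraction of its load is pushed
# onto its neighbors, which may then exceed capacity and fail in turn.
def cascade_size(n=100, load=0.6, capacity=1.0, coupling=0.5):
    loads = [load] * n
    failed = {0}                         # one initially failed node
    frontier = [0]
    while frontier:
        nxt = []
        for f in frontier:
            share = coupling * loads[f] / 2
            for nb in ((f - 1) % n, (f + 1) % n):   # ring of neighbors
                if nb not in failed:
                    loads[nb] += share
                    if loads[nb] > capacity:
                        failed.add(nb)
                        nxt.append(nb)
        frontier = nxt
    return len(failed)

for c in (0.2, 0.5, 0.8, 1.0, 1.4):
    print(f"coupling={c}: {cascade_size(coupling=c)} of 100 nodes fail")

With these particular numbers the transition is abrupt: below a coupling
threshold the initial fault stays local, while just above it the entire ring
collapses -- qualitatively the behavior seen in the ARPAnet, AT&T, and
power-grid cases cited above.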
- As system defenses become more sophisticated, so do the methods at the
disposal of would-be misusers. This seems to be a never-ending upwardly
escalating spiral. Although computer skills are much more widely known than
before, the on-line dissemination of knowledge about techniques for
penetrating systems suggests that misuse can often be carried out without
great expertise, from great distances, and often anonymously. &&&&
THIS IS STILL TRUE. But it would be unwise to blame the existence of the
Internet for the rapid spread of such information. That would be equivalent
to shooting the messenger.
- Our defenses are inadequate to protect us from even isolated
attacks and unanticipated events. Risks include not just penetrations
and insider misuse, but also insidious Trojan horse attacks that can
lie dormant until triggered. However, our defenses are even more
inadequate to protect us against large-scale coordinated attacks. The
unintended effects of the accidentally harmful Internet Worm merely
hint at the potentially devastating effects that could have resulted
if something along those lines had been carried out maliciously.
&&&& THIS IS STILL TRUE. Surprisingly little progress has been
seen in the intervening years, and Melissa is just one example. It is
essential to realize that much worse attacks would still be easy to
perpetrate.
Attaining systems and networks that are dependably survivable, secure, and
reliable is a very difficult problem. It is essentially impossible to have
any guarantees whatsoever that a particular system will work properly
whenever, wherever, and however it is needed. Furthermore, the information
infrastructures are highly heterogeneous, which makes it even more difficult
to have any guarantees that information infrastructures in the large ---
that is, aggregations of different computer systems and networks --- will
behave properly. Survivability, security, and reliability are all weak-link
phenomena, and there are far too many weak links today. On the other hand,
there will always be weak links and resulting risks, and there will always
be cases involving multiple weak links. &&&& See Reference 13 for an
extensive report on attaining survivable systems and networks.
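Under a deliberately crude simplifying assumption of independent components,
the weak-link observation can be stated quantitatively. If a mission depends
on components 1 through n, each of which must work, and component i works
with probability p_i, then

  P(system works) = p_1 * p_2 * ... * p_n <= min_i p_i,

so the aggregate is never more dependable than its weakest link, and even
uniformly good components compound: one hundred components that are each 99%
dependable yield about 0.99^100, roughly 0.37, for the aggregate. (Real
systems depart from this model in both directions -- failures are not
independent, and redundancy can help -- but the weak-link intuition stands.)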
Analysis
The challenge for the research and development community is to find
ways to avoid as many of these risks as possible, to minimize the
consequences of the exploitation or accidental triggering of those
that cannot be avoided, and to provide well-founded assurances that
systems and networks are likely to be able to satisfy their critical
requirements. One of your tasks in the Science Committee must be
to encourage such R&D and to ensure that viable new approaches can
find their way into our information infrastructures --- and then to
encourage their adoption within the national infrastructures. One of
the main tasks of those who control the national infrastructures and
those who depend on our computer-communication infrastructures is to
ensure that they are using the most robust information technology
available, that management is fully aware of the risks, and that they
coordinate their efforts. As the PCCIP report notes, education and
awareness are vital (along with research and development), although it
helps greatly if what is being taught and learned can actually lead to
much greater survivability, reliability, and security --- benefitting
from the best R&D efforts, the most robust open systems, and the best
practical systems. This is a very difficult challenge for all of us.
From the above discussion, the realization that so little tangible
progress has been made in the technology and that we may indeed be
more vulnerable now than we were in 1997 is very disheartening. It
suggests that our priorities are not properly aligned. Of utmost
concern is the conclusion that the necessary improvements require
serious technical measures, more fundamentally than legislation and
law enforcement -- which are of minimal effectiveness in the absence of
more robust systems and networks. Defensive legislation and law
enforcement are of course necessary -- because no improvements can
result in perfectly robust systems -- but can never be sufficient.
Furthermore, the current reliance on a proprietary-system monoculture
is extremely dangerous.
The most important realization that you must grapple with is that the
Melissa problem is in reality merely a microcosm of the collection of
serious problems described here, affecting all of our national and
information infrastructures. It is essential that you look at the big
picture, rather than focusing on Melissa itself (herself?).
Recommendations
Much greater emphasis must be devoted to the development of secure,
reliable, and highly survivable systems and networks. Strong cryptography
that is well implemented is just one necessity. The system development
process is in general a national disgrace. Good software development
practice is seldom used, as illustrated by the bad state of system security,
the Y2K problem, failures of defense systems, cancellations of major
procurements, and so on. Reversal of that trend requires better education
and training, improvements of the procurement process including more
stringent and carefully defined requirements, improved oversight of system
developments, liability and responsibility for failed systems, and
recognition of the risks if dramatic improvement does not occur rapidly.
Significant new research and development efforts are needed, and ways
must be found to get the R&D results into the commercial mainstream. For
example, we must find ways of making systems and networks more secure, more
reliable, and more survivable, in the face of realistic adversities.
Various approaches to making certain nonproprietary software (so-called open
software and free software) more robust seem to be very promising (Reference
14). Also, nonintrusive systems for detecting and responding to system
misuses (from insiders as well as outsiders) need to be vigorously pursued
(although I need to point out that I have long been involved in advanced
projects on anomaly and misuse detection and responses thereto -- as noted
at the end of this testimony). System administrators need all the help they
can get -- they are currently expected to keep abreast of a massive flow of
security patches, without having much of an understanding of how proprietary
code works or how vulnerable it is to misuse. The Y2K system-upgrading
exercise suggests that enormous resources had to be expended in a crash
effort to take care of one particular problem that was known long ago.
Unfortunately, the myopic focus on that effort has ignored many of the
deeper problems.
In the past year and one-half, system and network security has not improved
commensurately with the increased threats. Experience suggests that many new
vulnerabilities will exist in new systems. The same kinds of mistakes tend
to be made over and over. You must become aware in depth of the extent of
the threats, the vulnerabilities, the risks, and the measures necessary for
some remediation. Although Melissa is really just one more ringing alarm
clock, it is time to wake up and spring to action.
References
Computer-communication vulnerabilities, threats, and risks are documented in
considerable detail in References 1, 2, 3,
4, 5, and 12. The recent NRC report
Trust in Cyberspace (Reference 12) revisits the
earlier NRC report Computers at Risk (Reference 1), and
concludes that the problems have not diminished. In addition, specific
vulnerabilities and risks related to the use of cryptography are given in
References 6, 7, 8, and 9.
NOTE: To simplify the text, specific references to most of the cases cited
herein can be found in Reference 5, including the AT&T frame
relay outage, Galaxy IV, the rampant Website alterations, the San Francisco
blackout, and so on.
1.
David D. Clark, W. Earl Boebert, Susan Gerhart, John V. Guttag,
Richard A. Kemmerer, Stephen T. Kent, Sandra M. Mann Lambert, Butler
W. Lampson, John J. Lane, M. Douglas McIlroy, Peter G. Neumann,
Michael O. Rabin, Warren Schmitt, Harold F. Tipton, Stephen T. Walker,
and Willis H. Ware, Computers at Risk: Safe Computing in the
Information Age, National Research Council, National Academy Press,
1991, 2101 Constitution Ave., Washington, D.C. 20418.
2.
Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.
3.
Peter G. Neumann, Security Risks in the Emerging Infrastructure, U.S. Senate
Permanent Subcommittee on Investigations of the Senate Committee on
Governmental Affairs, 25 June 1996. The written testimony appears in
Security in Cyberspace, Hearings, S. Hrg. 104-701, ISBN 0-16-053913-7,
1996, pp. 350-363, with oral testimony included on pages 106-111
(http://www.csl.sri.com/neumann/senate96.html).
4.
Peter G. Neumann, Computer Security in Aviation: Vulnerabilities,
Threats, and Risks, International Conference on Aviation Safety and
Security in the 21st Century, White House Commission on Safety and Security,
and George Washington University. 13-15 January 1997
(http://www.csl.sri.com/neumann/air.html).
5.
Peter G. Neumann, Illustrative Risks to the Public in the Use of
Computer Systems and Related Technology (periodically updated index of
risks cases). The latest version is browsable
at http://www.csl.sri.com/neumann/illustrative.html (also in
.ps and .pdf form).
6.
Peter G. Neumann, Security Risks in Key Recovery, written
testimony for the Senate Judiciary Committee hearing on cryptographic key
recovery, 9 July 1997 (http://www.csl.sri.com/neumann/judiciary.html).
7.
Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze,
Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey
I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and
Trusted Third-Party Encryption, 27 May 1997. This report is published
in the World Wide Web Journal (Web Security: A Matter of Trust) 2,
3, O'Reilly & Associates, Summer 1997, pages 241-257, and is
browsable at
http://www.cdt.org/crypto/risks98/ .
8.
Kenneth W. Dam, W.Y. Smith, Lee Bollinger, Ann Caracristi, Benjamin
R. Civiletti, Colin Crook, Samuel H. Fuller, Leslie H. Gelb, Ronald Graham,
Martin Hellman, Julius L. Katz, Peter G. Neumann, Raymond Ozzie, Edward
C. Schmults, Elliot M. Stone, and Willis H. Ware, Cryptography's Role In
Securing the Information Society (a.k.a. the CRISIS report), Final
Report of the National Research Council Cryptographic Policy Study
Committee, National Academy Press, 2101 Constitution Ave., Washington,
D.C. 20418, 1996. The executive summary is available on-line
(http://www2.nas.edu/cstbweb).
9.
Susan Landau, Stephen Kent, Clinton Brooks, Scott Charney, Dorothy
Denning, Whitfield Diffie, Anthony Lauck, Douglas Miller, Peter G. Neumann,
and David Sobel, Codes, Keys, and Conflicts: Issues in U.S. Crypto
Policy, Report of a Special Panel of the ACM U.S. Public Policy
Committee (USACM), June 1994
(http://info.acm.org/reports/acm_crypto_study.html).
10.
Information Security: Computer Attacks at Department of Defense
Pose Increasing Risks, U.S. General Accounting Office, May 1996,
GAO/AIMD-96-84.
11.
``National Crime Information Center: Legislation Needed to Deter
Misuse of Criminal Justice Information,'' statement of Laurie E. Ekstrand,
U.S. General Accounting Office, as testimony before the U.S. House of
Representatives Subcommittee on Information, Justice, Agriculture, and
Transportation, of the Committee on Government Operations, and the
Subcommittee on Civil and Constitutional Rights, of the Committee on the
Judiciary, 28 July 1993. The appendix to that testimony documents 62 cases
of misuses of law-enforcement computer data.
12.
F.B. Schneider and M. Blumenthal, editors, Trust in
Cyberspace, National Research Council, 1998.
13.
Peter G. Neumann, Practical Architectures for Survivable
Systems and Networks, U.S. Army Research Lab report, 28 January 1999,
browsable at
http://www.csl.sri.com/neumann/arl-one.html (also in .ps and
.pdf form).
14.
Peter G. Neumann, Robust Open-Source Software,
Communications of the ACM 42, 2, February 1999.
http://www.csl.sri.com/neumann/illustrativerisks.html
15.
Wayne L. O'Hern, Jr., task force chairman,
An Open Systems Process for DoD, Open Systems Task Force,
Defense Science Board, October 1998.
Personal Background
By way of introduction, I note that I have been involved with the U.S.
Government (as well as state and local governments) in different
technological contexts for many years, including (for example) national
security, law enforcement, air-traffic control, and aviation safety and
security (including the early stages of fly-by-wire research and
space-station planning). My first computer-related job was for the Navy in
the summer of 1953.
I have long been concerned with security, reliability, human safety, system
survivability, and privacy in computer-communication systems and networks,
and with how to develop systems that can dependably do what is expected of
them. For example, I have been involved in designing operating systems and
networks, secure database-management systems, and systems that monitor
activities and seek to identify abnormal patterns of behavior. I have also
been seriously involved in identifying and preventing risks.
I received AB, SM, and PhD degrees from Harvard in 1954, 1955, and 1961,
respectively, and in 1960 received a Dr rerum naturarum from the Technische
Hochschule, Darmstadt, Germany --- where I had a Fulbright grant for two
years. In the Computer Science Lab at Bell Telephone Laboratories at Murray
Hill, N.J. throughout the 1960s, I was involved in research in computers
and communications; during 1965-69, I participated extensively in the
design, development, and management of Multics, a pioneering secure system,
developed jointly by MIT, Honeywell, and Bell Labs. (Multics made great
advances in security, and incidentally also recognized and avoided the Y2K
problem -- in 1965!) I was a visiting Mackay Lecturer at Stanford in 1964
and at Berkeley in 1970-71. I am a Principal Scientist in the Computer
Science Laboratory at SRI, where I have been since 1971, concerned with
computer systems having critical requirements such as security, reliability,
information system survivability, human safety, and high assurance.
I am a member of the General Accounting Office's Executive Council for
Information Management, which is heavily involved in Y2K. From 1994
to 1996, I served a 2.5-year term on the Internal Revenue Service
Commissioner's Advisory Group, where I addressed privacy and security
issues as well as attempting (with considerable futility) to give the IRS
some remedial advice on the seriously flawed Tax Systems Modernization
effort. From 1987 to 1989, I served on an expert panel for the House
Judiciary Committee Subcommittee on Civil and Constitutional Rights,
addressing law-enforcement database systems, at the request of then
Congressman Don Edwards.
I was an organizer of the 1982 Air Force Studies Board database
security study, and was a member of the 1989-90 National Research
Council System Security Study Committee that produced the report
Computers at Risk. I recently served on three studies
targeted at reviewing U.S. crypto policy: an ACM panel (June 1994,
Reference 9), a more intensive National Research Council study
(1996, Reference 8), and a report (Reference 7) introduced
as part of my written Senate testimony on 9 July 1997 (Reference
6), on the technical implications, risks, and costs of
`key recovery', `key escrow', and `trusted third-party' encryption
systems.
For the Association for Computing Machinery (ACM), I was founder and Editor
of the SIGSOFT Software Engineering Notes (1976-1993) and now Associate
Editor for the RISKS material; Chairman of the ACM Committee on Computers
and Public Policy (since 1985); and a Contributing Editor for CACM (since
1990) for the monthly `Inside Risks' column. I co-chaired SIGSOFT '91 on
software for critical systems. In 1985 I created, and still moderate, the
ACM Forum on Risks to the Public in the Use of Computers and Related
Technology, which is one of the most widely read of the on-line computer
newsgroups. RISKS (comp.risks) provides a medium for discussion of issues
relating to all aspects of computers and the social and technological
problems that they create. My RISKS-derived book (Computer-Related
Risks, Reference 2) explores the benefits and pitfalls of
computer-communication technology and suggests ways of avoiding risks in
critical systems.
My Web page
(http://www.CSL.sri.com/neumann/) includes pointers to my 25 June 1996
Senate testimony on security risks in the critical infrastructure (Reference
3), a position statement for the Gore Commission on Aviation
Safety and Security (Reference 4), testimony for the House Ways and
Means Committee subcommittee on the Social Security Administration and a
slightly extended statement for a subsequent SSA panel in San Jose, 28 May
1997, and testimony for the Senate Judiciary committee on key recovery, 9
July 1997 (Reference 6).
I am a Fellow of the American Association for the Advancement of Science,
the ACM, and the Institute of Electrical and Electronics Engineers (and a
member of its Computer Society). I received the ACM Outstanding
Contribution Award for 1992, the first SRI Exceptional Performance Award for
Leadership in Community Service in 1992, the Electronic Frontier Foundation
Pioneer Award in 1996, the ACM SIGSOFT Distinguished Service Award in 1997,
and the CPSR Norbert Wiener Award in 1997. I am a member of the Advisory
Board of the Electronic Privacy Information Center (EPIC).
Federal Funding Disclosure Statement
I am testifying as an individual, not as a representative of my
employer (SRI International, a not-for-profit R&D institute), or the
ACM, or any other organization. I note for the record that I am
currently receiving U.S. Government funding as an SRI employee under
the following research projects from the Army Research Lab and DARPA.
These projects are directed at crucial aspects of the problems of
protecting computer-communication information infrastructures, as is
the vast majority of research and development work that I have done
over the past 46 years. See my Web site at
http://www.CSL.sri.com/neumann/ for further background.
(Incidentally, my current role on the GAO Executive Council for
Information Management and Technology is a pro bono effort.)
-
Practical Architectures for Survivable Systems, for the Army Research Lab
(administered by the Harry Diamond Lab, Adelphi, MD, under Contract
No. DAKF11-97-C-0020, $699,833, 25 September 1997 through 31 January 2000;
Principal Investigator). This project is considering survivable systems and
networks and their explicit dependence on security and reliability,
identifying specific shortcomings in the existing computer-communication
infrastructures, and exploring system and network architectures for
overcoming those deficiencies. The first-phase final report is cited
as Reference 13.
-
Analysis and Response for Intrusion Detection in Large Networks, for
the Defense Advanced Research Projects Agency (administered by Rome
Laboratory, under Contract No. F30602-96-C-0294, DARPA Order No. E302,
within the ITO survivability program, $1,749,658, 28 August 1996
through 27 August 1999; co-Principal Investigator with Phillip
A. Porras). This project is developing EMERALD (Event Monitoring
Enabling Responses to Anomalous Live Disturbances), a system for
analyzing network misuse, a successor to our earlier forefront work on
IDES (Intrusion Detection Expert System) and NIDES (Next-Generation
IDES). (See http://www.csl.sri.com/intrusion.html for our Web pages
on anomaly and misuse
detection, including recent papers.) A related contract for the
DARPA ISO organization is integrating EMERALD into various DARPA
platforms.
-
In addition, I am providing some support to the U.S. Government in a
small effort to help organize two workshops, the first of which will
be devoted to detecting and responding to insider misuse.