Computer-Related Risks and the National Infrastructures
Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International, Menlo Park CA 94025-3493
Telephone: 1-650-859-2375
E-mail: Neumann@CSL.SRI.com; Web site: http://www.csl.sri.com/neumann.html
Written testimony, 6 November 1997, for the
U.S. House Science Committee Subcommittee on Technology
The written testimony is published in The Role of Computer Security in
Protecting U.S. Infrastructures, Hearing, 105th Congress, 1st session,
No. 33, 1998, pages 64--99, ISBN 0-16-056151-5, preceded by the oral
presentation on pages 61--63. Oral responses to oral questions are on pages
101--118, and written responses to subsequent written questions are on pages
148--161.
Summary
The President's Commission on Critical Infrastructure Protection (PCCIP) has
completed its investigation, having addressed eight major critical
national infrastructures: telecommunications; generation,
transmission and distribution of electric power; storage and distribution of
gas and oil; water supplies; transportation; banking and finance; emergency
services; and continuity of government services. Perhaps most important is
the Commission's recognition that very serious vulnerabilities and threats
exist in all of these critical infrastructures. Perhaps equally important
if not more so is the PCCIP's recognition that all of these critical
infrastructures are closely interdependent and that they all depend on
underlying computer-communication information infrastructures, such
as computing resources, databases, private networks, and the Internet.
Background
Because the PCCIP report itself contains considerable background, I merely
summarize my main conclusions. The computer-communication vulnerabilities,
threats, and risks are documented in considerable detail in References
1, 2, 3, 4, and 5. In
addition, specific vulnerabilities and risks related to the use of
cryptography are given in References 6, 7, 8,
and 9. I hope that you will review some of that material.
Conclusions
Here are a few brief summary conclusions.
Vulnerabilities, Threats, and Risks
- The existing information infrastructures are riddled with
vulnerabilities, including hardware and software unreliability and system
unavailability. Examples include the nationwide 1990 AT&T long-distance
collapse and many recent outages and saturations of Internet service
providers. Further cases are anticipated in the many computer systems that
are expected to break in the Year 2000 because of two-digit date fields.
Serious security flaws are also abundant in computer systems, networks, Web
software, and programming languages, and have been widely reported. The extent
of the risks is still not widely recognized, and preventive measures have
been very slow to develop. Indeed, we are in all likelihood not even aware
of many still unidentified vulnerabilities, and new vulnerabilities are
continually being introduced. Future disasters may exploit vulnerabilities
we do not know about.
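The two-digit date-field problem mentioned above can be illustrated with a minimal sketch. This is hypothetical code, not drawn from any particular system: it shows how naive two-digit year arithmetic goes wrong at the century boundary, and one common remediation (a sliding "pivot" window) that legacy-system maintainers have used.

```python
# Minimal sketch of the two-digit date-field problem (hypothetical example).
# With only two digits stored, "00" (meaning 2000) appears to precede
# "97" (meaning 1997), breaking age and expiration calculations.

def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Naive two-digit arithmetic, as in much legacy code."""
    return end_yy - start_yy

# A loan issued in 1997 ("97") checked in 2000 ("00"):
print(years_elapsed(97, 0))    # -97 instead of the correct 3

def years_elapsed_windowed(start_yy: int, end_yy: int, pivot: int = 50) -> int:
    """One common remediation: a sliding window that maps two-digit
    years below the pivot to 20xx and the rest to 19xx."""
    def expand(yy: int) -> int:
        return 2000 + yy if yy < pivot else 1900 + yy
    return expand(end_yy) - expand(start_yy)

print(years_elapsed_windowed(97, 0))   # 3
```

The windowing fix is itself only a stopgap, of course: it merely moves the ambiguity to a different pair of centuries, which is consistent with the point above that new vulnerabilities are continually being introduced.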
- There are many realistic threats to the information infrastructures,
including malicious insiders and intruders, terrorists, saboteurs, and just
plain incompetent administrative and operational staff. These threats may
come from corporate and national interests as well as individuals --- in
addition to effects of the environment, natural phenomena, accidental
interference, and so on. Malicious attacks may come from anywhere in the
world, via dial-up lines and network connections, often anonymously. The
list of threats is long and multidimensional (and discussed in the PCCIP
report). Consequently, it is not possible to predict which threats will be
exploited, and under what circumstances.
- Thus far, there have been relatively few truly serious malicious
attacks on computer systems and networking (for example, see Reference
10, which includes analysis of the Rome Lab case), although such
activities from both insiders and outsiders appear to be increasing,
particularly in financial systems (such as the $588 million Japanese
Pachinko frauds and the Citibank case). There have been numerous cases of
more than mere nuisance value (for example, the hacking of Web sites of the
Justice Department, CIA, US Air Force, and NASA), including many denials of
service (for example, flooding attacks that have disabled entire networks).
There have also been penetration studies that have constructively
demonstrated the extent of the vulnerabilities, without malicious intent
(such as the 1988 Internet Worm and numerous analyses and demonstrations of
flaws in Web browsers, servers, protocols, algorithms, and encryption
schemes). It is good that we have so many friendly participants in this
struggle to increase dependability. Perhaps because there have been no
devastating attacks, concern is less than it should be --- considering the
magnitude of the potential risks. However, the rapid acceleration of
electronic commerce can be expected to inspire some ingenious massive frauds
that systematically exploit various major vulnerabilities on the information
infrastructure --- which could be a goldmine for organized crime.
- In many cases, system collapses attributable to reliability problems
could also have been triggered maliciously, because of corresponding
security vulnerabilities.
Critical Dependencies
- Because all eight of the national infrastructures are more or less
critically dependent on the computer-communication infrastructures, risks to
the latter immediately become of concern to the former. In addition, the
computer-communication information infrastructures are themselves dependent
on many of the national infrastructures --- most notably electric power and
telecommunications. To date, we have seen relatively few attacks on the
national infrastructures, although the 23 October 1997 power outage in the
northern half of San Francisco appears to have been sabotage, following an
earlier physical attack. Many other attacks would be very easy to
perpetrate, especially those that exploit vulnerabilities in the information
infrastructures.
- We have become massively interconnected and interdependent. Whether we
like it or not, we must coexist with people and systems of unknown and
unidentifiable trustworthiness (including unidentifiable hostile parties),
within the U.S. and elsewhere. The problems have become international as
well as national.
- The use of certifiably strong cryptography is essential to the future
of computer-communication systems, and therefore to the protection of the
critical national infrastructures. Cryptography is only one small link in
achieving confidentiality, authentication, and integrity, but it is a vital
link. Although cryptographic compromises have not been a major source of
security risks in the past, preventing them will be especially critical to
the successful conduct of electronic commerce, which is growing very rapidly
and places stringent demands on computer-communication systems. Secure
implementation of cryptographic systems requires strong operating systems,
networking, and application software, and strong authentication of users and
systems.
Risks in Key-Recovery Cryptography
- The risks involved in key-recovery, key-escrow, and supposedly
trustworthy third parties (or even second parties) have barely begun to be
assessed by Congress and the Administration. Cryptography was considered
rather tangentially by the PCCIP --- recognizing that it is important, and
concluding that key management with key recovery would be advisable.
However, I believe that a thorough analysis of the potential risks of key
management must be conducted before instituting any key-recovery
technologies that could be inherently flawed, extremely riskful, and
possibly counterproductive to the overall goal of protecting the
infrastructures. In reality, the use of flawed key-management techniques
would greatly reduce security rather than increase it. Indeed, the inherent
security weaknesses in our computer systems themselves can often lead to the
subvertibility of even the best cryptographic techniques; cryptographic
systems are typically broken not by exhaustively searching for the keys, but
rather by finding much simpler ways to bypass or compromise the cryptography
and key management. Although key-recovery systems can of course be built,
there are no real assurances that they can be operated satisfactorily.
- The desires of law enforcement for access to cryptographic keys could
run directly counter to the needs to protect our critical infrastructures.
Key-recovery cryptography is a particularly complex subject, even more than
the other topics discussed here; it does not lend itself to simplistic
would-be solutions. I only hint at the complexity and the risks here, and
urge you to see References 6, 7, 8, and 9 if you are actively pursuing this particular aspect of the overall
problem.
- There are numerous alternatives to key recovery that could be
considered. For example, database tracking facilities are already
widespread, through telephone records, credit-card billing, airline
reservations, etc. Intelligent programs for data fusion could be very
effective -- although perhaps risky from a privacy point of view.
Additionally, use of biometric and other forms of less spoofable
identification and authentication would add significantly to determining who
is doing what to whom.
- One of the serious risks of cryptographic key recovery is that today's
information infrastructures are so weak that inherent risks are likely
to arise in the key management itself. However, even if the
computer-communication technology could be greatly improved, there are still
some very serious potential risks that must be resolved --- for example,
relating to sloppy administration and system operation and to insider misuse
of the key infrastructures. The ultimate necessity of having to rely on
second- and third-party agents and other people who could be corruptible
must be a serious concern. On balance, the risks may outweigh the
benefits.
- There have been notable cases of computer system and database misuse
within the government, including misuse by IRS agents (as demonstrated in
recent House hearings) and law-enforcement personnel (for example, see
Reference 11). There have also been many computer-related misuses
committed by authorized system insiders outside of government. Given the
opportunities for and past experiences with insider misuse, schemes that
require surreptitious access to otherwise completely secret and highly
protected cryptographic keys are inherently suspect. Such schemes represent
trapdoors in search of illegal exploitation.
System Development and Operation
- There are enormous weaknesses in the software development process, with
many large systems that have been drastically over budget, seriously late,
and ultimately incapable of satisfying their stated requirements. Current
experience with the federally mandated California Statewide Automated Child
Support System is illustrative of the difficulties in developing large
systems. Several major system developments have been terminated after large
expenditures and prolonged developments, including the FAA's efforts to
upgrade an archaic air-traffic control system, the IRS's Tax Systems
Modernization effort, the FBI's automated fingerprint system, and the
California DMV system. Other developments are struggling along, including
the FBI NCIC upgrade. According to various GAO sources, roughly one third
of all large system developments are scuttled before becoming operational. The
ubiquitous difficulties in system development suggest that critical
information systems will continue to be seriously flawed.
- The development, operation, maintenance, and management of computer
systems and networks that must survive despite arbitrary threats require a
highly disciplined approach to our information infrastructures. Such an
approach must address survivability, reliability, and security within a
common integrated framework and must recognize that these attributes are
closely interrelated --- just as the national infrastructures and the
information infrastructures are closely interrelated. Furthermore, system
and network behavior tends to be very unpredictable, typically because of
flawed software, but sometimes because of unexpected hardware behavior or
invalid or outdated assumptions about the environment (e.g., Ariane 5, the
Year-2000 problem, supposedly redundant circuits cut by a single backhoe, a
single day of trading volume on the New York Stock Exchange and NASDAQ vastly
exceeding all previous days). (Incidentally, both of my current research projects noted
in the disclosure statement at the end of this testimony are actively
addressing such a disciplined approach.)
Other System Considerations
- There are many potential failure modes in which a single apparently
minor event can trigger a chain reaction resulting in massive outages, such
as the 1980 ARPAnet collapse, the 1990 AT&T collapse, and the Western power
outages of July and August 1997. It may seem surprising to some of you that
such widespread outages can result from local events, but such risks grow
with both the complexity of interconnected systems and with the attempts to
optimize the performance of the whole by increasing the coupling of the
parts. Much more effort must be devoted to analyzing and preventing such
occurrences, considering them as both security and reliability problems in a
common context. Furthermore, we must improve the ability of our
computer systems to continue to operate acceptably in the presence of
security attacks and reliability collapses.
- As system defenses become more sophisticated, so do the methods at the
disposal of would-be misusers. This seems to be a never-ending upwardly
escalating spiral. Although computer skills are much more widely known than
before, the on-line dissemination of knowledge about techniques for
penetrating systems suggests that misuse can often be carried out without
great expertise, from great distances, and often anonymously.
- Our defenses are inadequate to protect us from even isolated attacks
and unanticipated events. Risks include not just penetrations and insider
misuse, but also insidious Trojan horse attacks that can lie dormant until
triggered. However, our defenses are even more inadequate to protect us
against large-scale coordinated attacks. The unintended effects of the
nonmalicious 1988 Internet Worm merely hint at the potentially devastating
effects that could have resulted if something along those lines had been
carried out maliciously.
Attaining systems and networks that are dependably survivable, secure, and
reliable is a very difficult problem. It is essentially impossible to have
any guarantees whatsoever that a particular system will work properly
whenever, wherever, and however it is needed. Furthermore, the information
infrastructures are highly heterogeneous, which makes it even more difficult
to have any guarantees that information infrastructures in the large ---
that is, aggregations of different computer systems and networks --- will
behave properly. Survivability, security, and reliability are all weak-link
phenomena, and there are far too many weak links today. On the other hand,
there will always be weak links and resulting risks, and there will always
be cases involving multiple weak links.
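The weak-link point can be made quantitative with a back-of-the-envelope sketch (the numbers here are illustrative assumptions, not measurements): when survival depends on every one of a chain of components working, overall dependability is the product of the component values, so a single modestly weak link dominates the result.

```python
# Illustrative back-of-the-envelope calculation (hypothetical numbers):
# if a system's survival depends on every one of its components, the
# probability that the whole works is the product of the component
# probabilities -- one weak link drags the entire system down.

def serial_reliability(component_probs):
    """Probability that a chain of independent, serially required
    components all function."""
    result = 1.0
    for p in component_probs:
        result *= p
    return result

# Ten strong components at 99.9% each still yield about 99%:
print(round(serial_reliability([0.999] * 10), 4))

# Adding a single weak link at 90% pulls the whole chain below 90%:
print(round(serial_reliability([0.999] * 10 + [0.90]), 4))
```

The sketch assumes independent failures; in practice, common-mode failures (such as the single backhoe cut noted earlier) make matters worse, not better.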
Analysis
Our job in the research and development community is to find ways to avoid
as many of those risks as possible, to minimize the consequences of the
exploitation or accidental triggering of those that cannot be avoided, and
to provide well-founded assurances that systems and networks are likely to
be able to satisfy their critical requirements. One of your tasks in
Congress is to encourage such R&D and to ensure that viable new approaches
can find their way into the computer-communication information
infrastructures --- and then to encourage their adoption within the national
infrastructures. One of the main tasks of those who control the national
infrastructures is to ensure that they are using the most robust
computer-communication technology available, that management is fully aware
of the risks, and that they coordinate their efforts. As the PCCIP notes,
education and awareness are vital (along with research and development),
although it helps greatly if what is being taught and learned can actually
lead to much greater survivability, reliability, and security ---
benefitting from the best R&D efforts and the best practical systems. This
is a very difficult challenge for all of us.
We cannot afford to wait for massive disasters. The Commission has outlined
some initial measures that may be beneficial --- if they are pursued
vigorously. However, the PCCIP has identified only the tip of a very large
iceberg, and there is much more work to be done.
The Commission has clearly recognized that protecting the national
infrastructures is a matter of shared responsibility between the private and
public sectors, and also that it is a matter of national security. However,
it is of the utmost importance that the term "national security" be
interpreted not in the narrow sense of the Department of Defense and the
National Security Agency, but in the broadest possible sense of the
well-being if not the survival of the nation.
The PCCIP is to be commended for what they have accomplished, recognizing
that an effort of this breadth and scope is almost unprecedented. Their
report is very impressive and provides an important basis for future action.
It deserves very careful public analysis and discussion. (The unclassified
version of the Commission's report became available only a few hours before
the deadline for me to submit this written testimony, and therefore I cannot
provide any in-depth analysis at this time.) Apparently, there is also
considerable backup material relating to the information infrastructure that
will be made available later.
The PCCIP's recommendations that may cause the greatest difficulty are those
involving industry cooperation and information sharing. Historically, there
have always been difficulties in getting competitors within a particular
national infrastructure to collaborate, and there have always been problems
getting the different national infrastructures to coordinate with each
other. The Commission is completely correct that such cooperation is
essential. The model of the Office of the Manager of the National
Communication System has been fairly successful in some respects, but not in
others. For example, the process of improving the security of the
telecommunications infrastructure has been slow. Overall, the OMNCS
experience bears deeper study.
I am concerned that deregulation of various national infrastructures could
have a deleterious effect with respect to the critical infrastructures.
Indeed, the PCCIP also recognizes deregulation as a risk that could run
counter to their hopes. More generally, the tension between government
control and private enterprise is a serious source of difficulty.
Individual infrastructures such as electrical power and telecommunications
have incentives to be efficient and profitable, but critical security,
reliability, and system survivability are requirements that appear desirable
only to the extent they are motivated by profitability. That is not
acceptable when national survivability is at stake. On the other hand,
draconian government regulation also is not likely to succeed. As an
example for consideration, there seems to be less surplus power today, with
various providers relying on each other for load sharing; when rolling
outages occur as they did in July and August of 1997, there seems to be
increasingly less margin for error.
Critical requirements are also seemingly less important to
computer-communication system purveyors. We generally suffer from bad
system security, weak network security, and unpredictable system and network
survivability. There are no easy answers in this respect. However, the
efforts needed to protect the critical information infrastructures against
threats to the national infrastructures are greater than those demanded by
the customers that computer system developers regard as their bread and butter.
Consequently, some government involvement is essential.
The PCCIP recognizes that the infrastructures are vulnerable and must be
improved --- especially the computer-communication infrastructures. The
computer-system development practice must become much more firmly based on
sound principles and good engineering practice. The government procurement
process must be reformed, to avoid future development fiascoes.
The Commission's chapter on research and development is unfortunately
extraordinarily superficial --- only three pages in my Web version. R&D is
absolutely essential to the problems confronting us. I can only hope that
much more detailed supplementary recommendations will become available later.
The Commission has largely ducked the issue of cryptography, other than to
note that it is important to securing the information infrastructure.
Unfortunately, they recommend the adoption of key-recovery techniques
(simply because they think it seems prudent?), without having analyzed any
of the risks and other implications.
I applaud you for holding these hearings. There is an enormous need for
open discussion of these issues rather than seeking simplistic would-be
remedies --- which in this case do not exist. Before you take any
legislative action relating to the critical national infrastructures and the
computer-communication infrastructures, I hope you will study the PCCIP
report and its backup materials, and then read my book on the risks
associated with computer-communication technologies (Reference 2) and
perhaps my earlier testimonies (References 3, 4, and
6), as well as some of the other reports mentioned below
(References 1, 7, 8, 9, 10, and
11). You must fully understand the vulnerabilities, threats,
risks, and potential consequences. The issues are complex; the PCCIP's
recommendations for education and awareness must include everyone ---
including Congress.
References
-
David D. Clark, W. Earl Boebert, Susan Gerhart, John V. Guttag,
Richard A. Kemmerer, Stephen T. Kent, Sandra M. Mann Lambert, Butler
W. Lampson, John J. Lane, M. Douglas McIlroy, Peter G. Neumann,
Michael O. Rabin, Warren Schmitt, Harold F. Tipton, Stephen T. Walker,
and Willis H. Ware, Computers at Risk: Safe Computing in the
Information Age, National Research Council, National Academy Press,
2101 Constitution Ave., Washington, D.C. 20418, 1991.
-
Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.
-
Peter G. Neumann, Security Risks in the Emerging Infrastructure, U.S. Senate
Permanent Subcommittee on Investigations of the Senate Committee on
Governmental Affairs, 25 June 1996. The written testimony appears in
Security in Cyberspace, Hearings, S. Hrg. 104-701, ISBN 0-16-053913-7,
1996, pp. 350-363, with oral testimony included on pages 106-111
(http://www.csl.sri.com/neumannSenate.html).
-
Peter G. Neumann, Computer Security in Aviation: Vulnerabilities,
Threats, and Risks, International Conference on Aviation Safety and
Security in the 21st Century, White House Commission on Safety and Security,
and George Washington University. 13-15 January 1997
(http://www.csl.sri.com/neumann/air.html).
-
Peter G. Neumann, Illustrative Risks to the Public in the Use of
Computer Systems and Related Technology
(periodically updated index of risks cases,
ftp://www.csl.sri.com/pub/illustrative.PS).
-
Peter G. Neumann, Security Risks in Key Recovery, written
testimony for the Senate Judiciary Committee hearing on cryptographic key
recovery, 9 July 1997 (http://www.csl.sri.com/neumann/judiciary.html).
-
Hal Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze,
Whitfield Diffie, John Gilmore, Peter G. Neumann, Ronald L. Rivest, Jeffrey
I. Schiller, Bruce Schneier, The Risks of Key Recovery, Key Escrow, and
Trusted Third-Party Encryption, 27 May 1997. This report is published
in the World Wide Web Journal (Web Security: A Matter of Trust) 2,
3, O'Reilly & Associates, Summer 1997, pages 241-257
(ftp://research.att.com/dist/mab/key_study.txt or .ps;
http://www.crypto.com/key_study).
-
Kenneth W. Dam, W.Y. Smith, Lee Bollinger, Ann Caracristi, Benjamin
R. Civiletti, Colin Crook, Samuel H. Fuller, Leslie H. Gelb, Ronald Graham,
Martin Hellman, Julius L. Katz, Peter G. Neumann, Raymond Ozzie, Edward
C. Schmults, Elliot M. Stone, and Willis H. Ware, Cryptography's Role In
Securing the Information Society (a.k.a. the CRISIS report), Final
Report of the National Research Council Cryptographic Policy Study
Committee, National Academy Press, 2101 Constitution Ave., Washington,
D.C. 20418, 1996. The executive summary is available on-line
(http://www2.nas.edu/cstbweb).
-
Susan Landau, Stephen Kent, Clinton Brooks, Scott Charney, Dorothy
Denning, Whitfield Diffie, Anthony Lauck, Douglas Miller, Peter G. Neumann,
and David Sobel, Codes, Keys, and Conflicts: Issues in U.S. Crypto
Policy, Report of a Special Panel of the ACM U.S. Public Policy
Committee (USACM), June 1994
(http://info.acm.org/reports/acm_crypto_study.html).
-
Information Security: Computer Attacks at Department of Defense
Pose Increasing Risks, U.S. General Accounting Office, May 1996,
GAO/AIMD-96-84.
-
``National Crime Information Center: Legislation Needed to Deter
Misuse of Criminal Justice Information,'' statement of Laurie E. Ekstrand,
U.S. General Accounting Office, as testimony before the U.S. House of
Representatives Subcommittee on Information, Justice, Agriculture, and
Transportation, of the Committee on Government Operations, and the
Subcommittee on Civil and Constitutional Rights, of the Committee on the
Judiciary, 28 July 1993. The appendix to that testimony documents 62 cases
of misuses of law-enforcement computer data.
Personal Background
By way of introduction, I note that I have been involved with the U.S.
Government (as well as state and local governments) in different
technological contexts for many years, including (for example) national
security, law enforcement, air-traffic control, and aviation safety and
security (including the early stages of fly-by-wire research and
space-station planning). My first computer-related job was for the Navy in
the summer of 1953.
I have long been concerned with security, reliability, human safety, system
survivability, and privacy in computer-communication systems and networks,
and with how to develop systems that can dependably do what is expected of
them. For example, I have been involved in designing operating systems and
networks, secure database-management systems, and systems that monitor
activities and seek to identify abnormal patterns of behavior. I have also
been seriously involved in identifying and preventing risks.
I received AB, SM, and PhD degrees from Harvard in 1954, 1955, and 1961,
respectively, and in 1960 received a Dr rerum naturarum from the Technische
Hochschule, Darmstadt, Germany --- where I was a Fulbright scholar for two
years. In the Computer Science Lab at Bell Telephone Labs at Murray Hill,
N.J., throughout the 1960s, I was involved in research in computers and
communications; during 1965-69, I participated extensively in the design,
development, and management of Multics, a pioneering secure system,
developed jointly by MIT, Honeywell, and Bell Labs. I was a visiting Mackay
Lecturer at Stanford in 1964 and at Berkeley in 1970-71. I am a Principal
Scientist in the Computer Science Laboratory at SRI, where I have been since
1971, concerned with computer systems having critical requirements such as
security, reliability, human safety, and high assurance.
I am a member of the General Accounting Office's Executive Council for
Information Management. From 1994 to 1996, I served a 2.5-year term on the
Internal Revenue Service Commissioner's Advisory Group, where I addressed
privacy and security issues, as well as attempting, with considerable futility, to
give the IRS some remedial advice on the seriously flawed Tax Systems
Modernization effort. From 1987 to 1989, I served on an expert panel for
the House Judiciary Committee Subcommittee on Civil and Constitutional
Rights, addressing law-enforcement database systems, at the request of
Congressman Don Edwards.
I was an organizer of the 1982 Air Force Studies Board database security
study, and was a member of the 1989-90 National Research Council System
Security Study Committee that produced the report, Computers at Risk. I
recently served on three studies targeted at reviewing U.S. crypto policy:
an ACM panel (June 1994, Reference 9), a more intensive National
Research Council study (1996, Reference 8), and a report (Reference
7) introduced as part of my written Senate testimony on 9 July 1997
(Reference 6), on the technical implications, risks, and costs
of `key recovery', `key escrow', and `trusted third-party' encryption
systems.
For the Association for Computing Machinery, I was founder and Editor of the
SIGSOFT Software Engineering Notes (1976-1993) and now Associate Editor for
the RISKS material; Chairman of the ACM Committee on Computers and Public
Policy (since 1985); and a Contributing Editor for CACM (since 1990) for the
monthly `Inside Risks' column. I co-chaired SIGSOFT '91 on software for
critical systems. In 1985 I created, and still moderate, the ACM Forum on
Risks to the Public in the Use of Computers and Related Technology, which is
one of the most widely read of the on-line computer newsgroups. RISKS
(comp.risks) provides a medium for discussion of issues relating to all
aspects of computers and the social and technological problems that they
create. My RISKS-derived book (Computer-Related Risks, Reference
2) explores the benefits and pitfalls of computer-communication
technology and suggests ways of avoiding risks in critical systems.
My Web page
(http://www.CSL.sri.com/neumann/) includes pointers to my 25 June 1996
Senate testimony on security risks in the critical infrastructure (Reference
3), a position statement for the Gore Commission on Aviation
Safety and Security (Reference 4), testimony for the House Ways and
Means Committee subcommittee on the Social Security Administration and a
slightly extended statement for a subsequent SSA panel in San Jose, 28 May
1997, and testimony for the Senate Judiciary committee on key recovery, 9
July 1997 (Reference 6).
I am a Fellow of the American Association for the Advancement of Science,
the ACM, and the Institute of Electrical and Electronics Engineers (and a
member of the Computer Society). I received the ACM Outstanding
Contribution Award for 1992, the first SRI Exceptional Performance Award for
Leadership in Community Service in 1992, the Electronic Frontier Foundation
Pioneer Award in 1996, the ACM SIGSOFT Distinguished Service Award in 1997,
and the CPSR Norbert Wiener Award in 1997. I am a member of the Advisory
Board of the Electronic Privacy Information Center (EPIC).
Federal Funding Disclosure Statement
I am testifying as an individual, not as a representative of my employer
(SRI International) or the ACM or any other organization in which I
participate (such as the GAO). I note for the record that I am currently
receiving U.S. Government funding under two research projects. Both
projects are directed at crucial aspects of the problems of protecting
computer-communication information infrastructures, as is the vast majority
of research and development work that I have done over the past 44 years.
See my Web site (http://www.CSL.sri.com/neumann/) for more details.
-
Practical Architectures for Survivable Systems, for the Army Research Lab
(administered by the Harry Diamond Lab, Adelphi, MD, under Contract
No. DAKF11-97-C-0020, $699,833, 25 September 1997 through 24 September 1999;
Principal Investigator). This project is considering survivable systems and
networks and their explicit dependence on security and reliability,
identifying specific shortcomings in the existing computer-communication
infrastructures, and exploring system and network architectures for
overcoming those deficiencies.
-
Analysis and Response for Intrusion Detection in Large Networks, for the
Defense Advanced Research Projects Agency (administered by Rome Laboratory,
under Contract No. F30602-96-C-0294, DARPA Order No. E302, within the ITO
survivability program, $1,749,658, 28 August 1996 through 27 August 1999;
co-Principal Investigator with Phillip A. Porras). This project is
developing EMERALD (Event Monitoring Enabling Responses to Anomalous Live
Disturbances), a system for analyzing network misuse, a successor to our
earlier forefront work on IDES (Intrusion Detection Expert System) and NIDES
(Next-Generation IDES) (http://www.csl.sri.com/intrusion.html).