Responses [PGN] sent 4 December 1997 from Peter Neumann to questions [Qi] (i = 1 to 18) received on 19 Nov 1997 from Chairwoman Morella, House Science Committee subcommittee on Technology, following up on my written testimony (http://www.csl.sri.com/neumann/house97.html). This set of responses is also on-line (http://www.csl.sri.com/neumann/house97.ans).
Dr. Peter Neumann, Principal Scientist,
Computer Science Laboratory, SRI International, Menlo Park, California
[Q1] What is the state of technology for an individual institution to defend itself? What is the state of technology for defense of the network as a whole?
[PGN] An adequate answer to your question must address the state of the technology for security, availability, reliability, and overall system survivability. The state of that technology is fairly miserable in practice, although there are many efforts in the research community that could be brought to bear if there were a concerted recognition of the need. But the real gap is not between research and practice -- it is between what can be developed and what has actually been developed. A very secure commercial product was developed in the 1960s (Multics, a joint effort of MIT, Bell Labs, and Honeywell) that was vastly more secure than the personal computer systems available today; concepts from that effort are very slowly finding their way into other systems. Unix, which grew out of Multics, has better networking security than PC systems -- which are inherently flawed despite their popularity. Operating system and networking security must be dramatically improved before highly survivable distributed systems of personal computers can be configured.
[PGN] Certainly, a concerned institution could install firewalls, seek to use the most robust system components available, improve the education and awareness on the part of its system administrators and users, and undertake a few other largely palliative but nevertheless useful measures. However, the truth of the matter is that underlying commercial operating systems and networking software are basically very weak, and the administrative practice is also. The same conclusion applies even more acutely to the defense of networks as a whole, because the networks themselves depend largely on their constituent systems. There is very little network survivability per se beyond what is provided by the constituent computer systems. Although commercial vendors may tell you that they have solutions, in practice the solutions available today are inadequate.
[PGN] When you refer to "the network", I presume you might be thinking of the Internet. In that case, the situation is much more difficult than it is for the networks within a given enterprise, because the Internet is an international entity, it has no owners or controllers, its survival depends on the cooperation of many different institutions and governments, and it ultimately depends on the integrity of many of its constituent systems. Although the Internet has the potential for being the most highly survivable system of all, unfortunately the management of the Net and its constituent systems renders it highly vulnerable to isolated failures and systematic attacks. So, the bottom-line answer is that in principle it is possible to build and configure systems and networks that are more survivable than what is commercially available today, but in practice system developers, infrastructure purveyors, corporate institutions, and end-users are not doing so; in addition, procurers of government systems are either not demanding such systems or else not having them successfully developed.
[Q2] Does the data which you have collected support the claims that the national information infrastructure is at risk? If so, how would you propose this infrastructure be protected?
[PGN] The archives of the Risks Forum absolutely support the position that the national information infrastructures are at risk. These archives include cases over five decades documented in the ACM Software Engineering Notes (summarized in the handout that accompanied my testimony, http://www.csl.sri.com/illustrative.PS), the ongoing Risks Forum electronic newsgroup comp.risks, and my book, Computer-Related Risks (Addison-Wesley, 1995). The collection of cases is riddled with vulnerabilities, threats, and risks, many of which are relevant today and are likely to be relevant in the future. Cases involving telecommunications, power, transportation, and computer-communication infrastructures are particularly well represented!
[PGN] Protecting the information infrastructure is a very difficult task. It requires significant increases in the security, reliability, and survivability of our systems and networks. It requires much better system and Web software. It requires enormous improvements in authentication infrastructures; we must urgently phase out the use of fixed reusable passwords in transit across unencrypted networks, and replace them with one-time nonreusable authentication mechanisms. It requires drastic improvements in the awareness of the vulnerabilities, threats, and risks. It requires people in government and the private sector who really understand the risks.
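The one-time nonreusable authentication mechanisms mentioned above can be illustrated by a Lamport-style hash chain, the idea underlying the S/KEY one-time password system of that era. This is a minimal sketch, not any particular deployed product; the use of SHA-256 (rather than the MD4/MD5 of 1990s S/KEY) is an assumption for the example.

```python
import hashlib


def hash_chain(seed: bytes, n: int) -> list:
    """Build a chain seed, H(seed), H(H(seed)), ... of length n+1."""
    chain = [seed]
    for _ in range(n):
        chain.append(hashlib.sha256(chain[-1]).digest())
    return chain


def verify(otp: bytes, stored: bytes) -> bool:
    """Server check: one more hash of the offered value must match."""
    return hashlib.sha256(otp).digest() == stored


# The server stores only the last link; the client reveals earlier
# links in reverse order, one per login.  An eavesdropper who captures
# a link cannot compute the previous (not-yet-used) one.
chain = hash_chain(b"secret-seed", 100)
server_stored = chain[-1]

assert verify(chain[-2], server_stored)       # valid one-time password
assert not verify(chain[-3], server_stored)   # wrong link is rejected
```

After a successful login the server replaces its stored value with the link just presented, so a replayed password no longer verifies.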
[Q3] Given the Commission's definition of national security related information, what is the impact on personal privacy of the Commission's recommendations?
[PGN] Our National Research Council "CRISIS" crypto study (K.W. Dam and H.S. Lin, editors, "Cryptography's Role In Securing the Information Society," National Academy Press, 2101 Constitution Ave., Washington, D.C. 20418, 1996) strongly urged that "national security" be considered in the broadest possible terms, relating to the well-being and indeed the survivability of the nation and its infrastructures, not just to defense, signals intelligence, and law enforcement. With that perspective, cryptography and privacy become national assets, not national detriments. The Commission's very cursory recommendations on cryptography could have an extremely negative impact on privacy because of the Commission's failure to distinguish between key management and key recovery (see below). Privacy is a very delicate issue that is often relegated to a subordinate consideration. Overall, privacy is continually being compromised as we become increasingly dependent on information infrastructures. (For example, see P.E. Agre and M. Rotenberg, editors, Technology and Privacy: The New Landscape, MIT Press, Cambridge, Massachusetts, 1997, and B. Schneier and D. Banisar, The Electronic Privacy Papers, John Wiley and Sons, New York, 1997.) On the other hand, the law enforcement community needs to explore alternatives to key recovery that could effectively achieve similar results -- especially when confronted with anonymizing mechanisms.
[Q4] Would you elaborate on the technical risks posed by the use of key-recovery systems in encryption products?
[PGN] In essence, the information infrastructure is inherently incapable of enforcing stringent privacy requirements, because of the inherent weaknesses and security flaws in the systems and networking, because of the ultimate reliance on people who may be not entirely trustworthy, and because of risks arising in the secondary uses of information derived from computer systems. Key-recovery schemes put the most sensitive information (keys) into specific locations that become identifiable targets for attack and internal misuse. Any compromise of a key-recovery system places an enormous amount of information at risk. The notion of having to trust a third party makes very little sense in competitive or sensitive situations; no sensible corporation or country should be forced to do that, even if the government certifies the integrity of the third party, and especially not if the government absolves key holders from liability (as has been proposed in pending legislation). Certainly the intelligence and defense organizations would be unlikely to set themselves up for such compromises. Furthermore, the linking of key recovery with digital certificates used for authentication (as proposed in McCain-Kerrey) or indeed with any public-key infrastructures could completely undermine the credibility of all information infrastructures. This is equivalent to giving every law-enforcement person the ability to forge your signature. (In the case of McCain-Kerrey, this would also be true of anyone who can get a subpoena!) Perhaps worst of all, there is a serious confusion between good key management (which enables secure key distribution in some cases with absolutely no sharing of keys, although in certain cases it may enable carefully controlled key regeneration) and key recovery (which purports to give surreptitious access to law enforcement under situations whose risks have not even begun to be analyzed).
Intentional trapdoors that allow surreptitious access (irrespective of what controls are intended to exist) are likely to be vulnerable to misuse -- potentially by outsiders as well as insiders. As a critical example, digital commerce simply cannot survive in the presence of trapdoor access to master keys. The compromise of a master key used worldwide by a corporation or government could result in widespread disaster.
[PGN] It would be a great mistake to believe that legislation can prevent risks. You might contemplate laws that outlaw the use of crypto that does not mandate surreptitious third-party key recovery, as the FBI has sought. But that would be a true disaster for the future of the nation. You might contemplate laws that, in supporting key recovery, outlaw the misuse of supposedly protected escrowed keys by supposedly trustworthy individuals and the misuse of computer systems in general. But that is not likely to stop concerted government insiders and outside attackers -- especially if the latter can operate from foreign or even off-shore locations, or anonymously within the country. Laws are not a sufficient disincentive when the potential gains from illegal activities are considered -- especially when there is low probability of detection and prosecution. And the presence of laws whose enforcement is very difficult can be very counterproductive.
[PGN] Once again, see our NRC CRISIS report (noted above) and the more recent study of key recovery (Abelson, Anderson, Bellovin, Benaloh, Blaze, Diffie, Gilmore, Neumann, Rivest, Schiller, and Schneier, "The Risks of Key Recovery, Key Escrow, and Trusted Third-Party Encryption," World Wide Web Journal, Web Security: A Matter of Trust, O'Reilly & Associates, 2, 3, 241-257, Summer 1997) for extensive background on this subject.
[Q5] You say that "some government involvement is essential" in protecting critical infrastructures. Would you limit the government's role to the two areas identified by Mr. Stevenson, or follow the Commission's recommendations to establish those 5 new organizational entities, or something in between?
[PGN] Mr. Stevenson has recommended two areas (pages 70--71 of the draft testimony): increasing dialog between private and public sectors, and getting Congress to resolve the encryption issues. Those are two important issues, but clearly there is much more to be considered.
[PGN] The Commission recommended five new entities (1,2,3,6,7) and two organizational allocations of responsibility (4,5) in their report's Figure 9, page 49:
(1) an Office of National Infrastructure Assurance;
(2) a National Infrastructure Assurance Council of CEOs, Cabinet Secretaries, and representatives of state and local governments;
(3) an Infrastructure Assurance Support Office;
(4) Federal lead agencies for each sector;
(5) sector infrastructure assurance coordinators to represent each of the national infrastructures;
(6) an Information Sharing and Analysis Center;
(7) a Warning Center.
[PGN] Each of these PCCIP recommendations has significant merits, and deserves very careful study of potential problems such as overlaps, conflicts, and an inordinately complex bureaucracy. I think that establishing sector coordinators within the infrastructures and Federal lead agencies could be very beneficial. What is needed most is communication and cooperation.
[Q6] You note the lack of depth in the section of the Commission's report on the need for additional Federal funding for R&D. From your own experience and expertise, does this Commission's recommendation of adding $1 billion between now and 2004 seem about right? Is it too much? Too little?
[PGN] By way of clarification, I interpret the wording on Page 89 of the Commission's report to recommend that the funding be increased to reach a total of $1 billion in the year 2004. However, R&D is not a question of how much, but rather of how the funding is directed. For example, research is needed in relating localized research results to the overall problems of survivable systems and networks. On one hand, tight research budgets are extremely counterproductive. On the other hand, it is likely that with such a rapid increase, $1 billion could not be spent wisely; it is more likely that much of it would be frittered away on improperly focused R&D. Some R&D is absolutely essential to fill serious gaps in existing technology and its application -- for example, ratcheting up the authentication infrastructure, networking security, techniques for higher assurance (such as formal methods for critical requirements and the specification and implementation of critical functions), and technologies for detecting, analyzing and responding to misuse and accidental failures.
[PGN] The Commission's terse list of six information infrastructure R&D items has several topics that involve the development and use of tools or simulation, including one for risk management decision support. Such techniques are important only if they are applied properly. Decision support tools and simulations can be extremely misleading when they are based on incorrect models. For example, the Risks Forum archives are littered with cases in which flawed tools and incorrect simulation models were used, or reasonable tools were used improperly. Tools and simulations are not an end in themselves. Greater emphasis must be devoted to good system and network architectures that constructively avoid risks, using subsystems that can be readily and predictably combined, along with prevention and detection of misuse, and enlightened management.
[PGN] The Commission's report suggests that the National Research Council should steer the selection of areas for R&D efforts (Page 90). This is a good idea, although it should be augmented with inputs from many diverse communities including government, infrastructures, and industry. Some of the resulting R&D efforts could well be funneled through DARPA, with an appropriately broadened charter to encompass the survivable information infrastructures on which the national infrastructures depend. One question we might ask is whether any of the Commission's proposed entities could add real motivations and guidance to the research efforts. Perhaps one of these entities should be asked to provide inputs to the selection of R&D topics. It would seem to be desirable that R&D directions should grow out of the natural needs of the national infrastructures themselves. However, it is not clear to me that such a role falls into the purview of any one of the seven proposed entities.
[PGN] A major realization in recent years on the part of both the defense establishment and the non-defense-related U.S. Government agencies is that they have become almost totally dependent on commercial systems. However, those commercial systems (particularly the software) are seriously inadequate when applied to the protection of critical infrastructures -- including information infrastructures. Unfortunately, developers of commercial systems have for the most part been very slow to adopt new ideas from the research community, for a wide variety of reasons (including lack of customer awareness and demand, lack of understanding of and commitment to the problems on the part of developers, lack of demonstrated massive threats that force awareness, and active impediments created by government crypto policy).
[PGN] The R&D funding that is allocated should be given both to private-sector institutions and to a few appropriate Federal Government agencies, for example, NSA and the FBI -- the latter of which desperately needs greater computer-technical competence and the ability to pursue other technological alternatives than trying to put all its eggs into the key-recovery basket. I would also like to see the Army Research Lab, Naval Research Lab, and Air Force Research Lab merged into one coherent joint-services R&D lab. In particular, there is an enormous need for coordination among the services; creating and funding a combined lab for R&D related to the information infrastructure could contribute nicely.
[PGN] To conclude my response to this very important question: $1 billion seems like much too much to me. In any event, before allocating such funds, it is imperative that clear R&D leadership be established (e.g., accepting recommendations from the NRC, with funding channeled through various government agencies including DARPA) and that clear directions be agreed upon.
[Q7] The Commission's report in Chapter 9 under the title encryption states that "Establishment of trustworthy key management infrastructures (KMI's) is the only way to enable encryption on a large scale..." It would appear that there are many leading minds in the field who strongly disagree with that statement. In fact, the National Research Council in its study "Cryptography's Role in Securing the Information Society," as well as a recently released report by a group of highly regarded cryptographers "The Risks of Key Recovery, Key Escrow and Trusted Third Party Encryption," stated that "Certification can (and currently does) exist without any form of key recovery. Conversely, a key recovery infrastructure can exist completely independently of any key." In fact, these studies seem to indicate that significant weaknesses can be introduced to the infrastructure through the use of such a KMI. Do you think that KMI could lead to an increased security risk?
[PGN] The Commission has seriously confused key recovery and key management. Key management involves the generation, distribution, deletion, and other handling of keying information; secure key management is essential to the use of cryptography, secure networking, the conduct of electronic commerce, and protection of the infrastructures. Unsecure key management is a disaster waiting to happen. Key recovery is a euphemism for key escrow whereby keys must be retained in a manner that allows them to be accessed surreptitiously by law enforcement and by corporations. Unsecure key recovery is extremely bad key management. Even supposedly secure surreptitious key recovery is potentially unsecure key management. Diffie-Hellman public-key cryptography (whose main patent has now expired, bringing that scheme into the public domain) never exchanges a key -- it allows a shared key to be created by both parties without the key itself having been transmitted. In fact, one of its primary uses is to ensure that communications, key agreements, and authentications remain private even if a spy has disclosed all previous keying material. There is no need to save a key generated by Diffie-Hellman, because if it is lost or mangled, a new key can be generated. The idea of having to escrow or otherwise retain a key that is never shared and never needs to be saved is absolutely absurd; it has the potential of totally undermining all of the security that is available. So, in short, secure key management is absolutely essential, but very difficult to achieve even without key recovery. Key recovery -- particularly third-party surreptitious key recovery -- is intrinsically unsound key management, because of the risks (which have yet to be analyzed). It has the potential of destroying attempts to create sound key management -- particularly if poorly drawn requirements for key recovery were to engender a weak key-management infrastructure.
In short, KMI and key-recovery infrastructures should not be coupled.
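The point that a Diffie-Hellman shared key is never transmitted, and need never be saved, can be made concrete with a toy sketch of the exchange. The parameters below are illustrative assumptions chosen for readability (the modulus is far too small for real use; deployed systems use primes of hundreds of digits).

```python
import secrets

# Public, agreed-upon parameters (toy-sized for illustration only).
p = 4294967291  # a prime modulus -- far too small for real security
g = 2           # public generator

# Each party picks a private exponent that is never sent anywhere.
a = secrets.randbelow(p - 2) + 1  # Alice's secret
b = secrets.randbelow(p - 2) + 1  # Bob's secret

# Only these public values cross the wire.
A = pow(g, a, p)  # Alice -> Bob
B = pow(g, b, p)  # Bob -> Alice

# Each side computes the same shared key locally; the key itself is
# never transmitted, so there is nothing in transit to escrow.
key_alice = pow(B, a, p)
key_bob = pow(A, b, p)
assert key_alice == key_bob

# If the key is lost or mangled, the parties simply rerun the exchange
# with fresh secrets -- which is why "recovery" of such a key is moot.
```

Both sides arrive at g^(ab) mod p, so the assertion holds on every run; an eavesdropper sees only p, g, A, and B.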
[Q8] The Commission's report in Chapter 9 under the title of "encryption" also goes on to state that "Law enforcement agencies should have lawful access to the decrypted information when necessary to prevent or detect serious crime. Procedures for judicial review prior to granting government access must be defined in law." How does this recommendation help protect our national infrastructures?
[PGN] It doesn't! In many ways, the existence of mechanisms for surreptitious access would hinder the protection of our national infrastructures. Some government folks want you to believe that surreptitious third-party key recovery is necessary to overcome the problems of lost keys or critical individuals being unavailable. Indeed, first-party and second-party key recovery can be useful for stored data, but not for pure information transmission. Whereas strong encryption is an essential requirement for communication, recovery of lost keys is never needed -- because the communicating parties can simply generate a new key and try again. The only effects key recovery has on communication are to reduce security and to allow law-enforcement access. Third-party key recovery is inherently dangerous. Its primary proponent is law enforcement; very few other communities believe that it is a good idea, although there are of course a few applications in which it makes sense. Surreptitious access is also inherently dangerous. Indeed, enabling surreptitious access is the most dangerous aspect of key-recovery proposals; the likely risks therein have never been adequately addressed by the proponents of key recovery -- who state merely that such schemes can be implemented. Implementability is not what is in doubt. The most important question is whether such schemes can be implemented with sufficiently few risks, despite a computer-communication infrastructure that is riddled with vulnerabilities. Until that question is answered carefully, openly, and without emotional arguments that attempt to obfuscate the underlying problems, key recovery must be considered intrinsically dangerous -- and especially third-party surreptitious key recovery.
[Q9] The Commission's definition of national security related infrastructures seems to include just about everything. The report also indicates that the private sector should take the lead in securing these systems. Do you think there is a conflict between the designation of infrastructures as "national security related" and retaining civilian control of security for these infrastructures?
[PGN] The balance between the public sector and private interests is a very challenging one. Some government oversight may be appropriate, especially in ensuring minimum standards of infrastructure survivability in an era of increased deregulation. However, in general, I am not in support of strict government regulation. Incidentally, my experience several years ago in working with the OMNCS (Office of the Manager of the National Communication System) on threats to the telecommunications infrastructure was that the infrastructure providers were reluctant to increase the security of their telephone switches, switch controllers, and information systems -- perhaps because they did not admit any threats. At that time, many of the providers were publicly denying that there had ever been any intrusions into their systems -- which was clearly not true. Denials of the existence of unreported incidents were heard in testimony at your hearing relating to the banking community, despite various evidence of cases that are not publicly known. Unfortunately, remedial action tends to be taken only when there is a widely publicized disaster. Although there may be conflicts between public and private sectors, cooperation is absolutely essential, as the Commission has noted.
[Q10] Do you support the Commission's contentions that the Federal Government should increase funding for computer security?
[PGN] I can interpret this question in two ways: first, what funding should be allocated to Federal Government agencies, and second, what funding should be given to the private sector. My response to Q6 addresses the question of increased R&D funding to both sectors. The remainder of my response to the present question addresses non-R&D funding within the U.S. Government.
[PGN] Chapter 11 of the Commission's report recommends increasing R&D funding for infrastructure assurance, which by implication seems to be interpreted as increasing funding for computer security. However, "computer security" by itself is much too narrow. I would broaden any funding within the U.S. Government (including funded R&D) to encompass funding relating to reliability, fault tolerance, availability, system performance, as well as computer and network security, and the integral relationship of all of these requirements to the survivability of the information infrastructure. Although I have been involved in research in these topics for most of my 44-year professional career, I do not consider this answer to be a conflict of interest. I believe that fundamentally new directions are essential, because the old directions are not adequate.
[PGN] To some extent, the absence of meaningful security in the existing information infrastructure is due to the perceived lack of profitability by system purveyors. If the concept of security were broadened to include system and network survivability, availability, reliability, fault tolerance, and generally dependable system behavior, this might enhance the customer demand for such systems. These requirements would be more easily linked to profitability in the private sector than security by itself. And profitability of the private sector is ultimately the primary viable driving force.
[PGN] One of the most significant missing links related to computer security and more generally to survivability is the absence of definitive practical criteria for systems and networks. The history of the TCSEC (the Orange Book and other Rainbow volumes) indicates that good criteria are absolutely essential, but that the Rainbow series was not adequate. It also indicates that NSA and NIST are not chartered to represent the needs of consumers. Nevertheless, NSA and NIST should continue to represent the U.S. in the development of the international Common Criteria; in addition, I recommend that they be chartered with establishing further criteria that generalize the notions of security to encompass survivability of the information infrastructures and their constituent systems and networks, for the benefit of both the public and private sectors. This recommendation is a considerable extension of something suggested in Chapter 6 of the Commission's report.
[PGN] In addition, government agencies should make much more serious efforts to get their own houses in order with respect to the security and survivability of their own systems and networks. The fact that existing government systems are so vulnerable sets a poor example for the private sector.
[Q11] Can the overall goal of protection of critical infrastructures be met without increasing the use of encryption technologies? Will more widespread use of encryption occur under a policy framework that is focused on key recovery as the only appropriate technology?
[PGN] Cryptography is only one piece of the puzzle. Strong cryptography is essential, for authentication and integrity as well as for confidentiality. However, that cryptography must be nonsubvertibly embedded in our systems and networks, which in turn relies on secure operating systems. Nonsubvertible cryptography is absolutely essential for digital commerce and many other applications.
[PGN] As time goes on, developers and users will gradually realize that they need cryptographically based solutions in more situations, and they will use whatever encryption systems they can obtain. (And some excellent crypto systems are freely available, worldwide.) However, if the policy framework focuses only on systems with key recovery, that focus will delay and inhibit the use of encryption, and may drive the marketplace to other countries. Some users will require systems that do not provide key recovery, but they will not have a standard system and key-management framework on which to build.
[Q12] The Commission recommends extensive information sharing regarding cyber threats and vulnerabilities among the private sector and the government. In particular, the Commission indicates the need to build a "trusted environment" that will allow for the exchange of information without, in Gen. Marsh's words, "fear of regulation, loss of public confidence, or damage to reputation."
[Q12a] How realistic is this recommendation given the past reluctance of companies to reveal such information?
[PGN] This is a particularly sticky wicket, as I note in my written testimony. Industry cooperation and information sharing are absolutely essential. There are many problems, as suggested by the computer emergency response teams (CERTs) that tend to divulge information only when fixes are well known, but often long after the knowledge of the vulnerabilities has become widespread through other channels. Mr. Katz assured you most emphatically that banks report all of their internal fraud and related computer security problems (because they are required by law to do so). However, various financial-community insiders may tell you (if you really push them) of cases that remained unreported until they slowly became public knowledge.
[Q12b] What steps would be involved in building a "trusted environment" that could facilitate information exchange?
[PGN] Once again, if we had a much more secure information infrastructure, with good system security, good authentication, and good crypto, it would help significantly. There are problems in sanitizing that information and protecting sensitive items, and some of these can be addressed technologically. However, we must always recognize and admit the risks of untrustworthy individuals who are in trusted positions, with the ability to subvert the intent of computer system protections. Thus, some monitoring of the activities of those trusted individuals may be necessary, along with laws governing misuse and any resulting liabilities. (Note that similar remarks apply to the trusted individuals involved in key recovery.)
[Q13] Mr. Katz in his testimony recommends the establishment of a certification organization to assess the security functionality in commercial computer and software products. Do the other panelists concur on the need for such an entity, and should the Federal agency involved be the National Institute of Standards and Technology?
[PGN] Mr. Katz's recommendation raises some serious questions, for which history does not give encouraging answers. The experience of NSA's National Computer Security Center in establishing evaluation criteria (e.g., the DoD Trusted Computer System Evaluation Criteria, or Orange Book) and conducting evaluations has been a mixed bag. On one hand, that experience has codified a few barebones minimum requirements and has succeeded in ratcheting up the minimal security of compliant computer systems. On the other hand, the TCSEC criteria are inherently incomplete (especially with respect to distributed systems and networks), and therefore completed evaluations can be very misleading -- suggesting that something is secure when it is actually seriously flawed. There are today no adequate criteria against which to measure systems and networks; there may never be such criteria that can be used rather mechanically by people who do not understand the hidden risks. The TCSEC effort has demonstrated the fundamental difficulty of establishing sufficient conditions, and of evaluating systems that tend to change frequently against static requirements that do not reflect reality. The TCSEC evaluation process was so time-consuming, and so bound (necessarily) to particular hardware configurations, that evaluated systems lagged badly behind the market. Indeed, by the time a configuration was approved, the software, if not also the hardware, was often obsolete. Thus, I have grave reservations about Mr. Katz's recommendation in practice, although in principle it is a nice idea.
[Q14] What role should the National Security Agency have in increasing the security of the national infrastructures considered by the commission?
[PGN] NSA once played a significant role in establishing the 1985 TCSEC security criteria for isolated systems (see my answer to the previous question), but adequate criteria for highly distributed systems and networks have never emerged. That approach now seems mired in the political efforts to establish international Common Criteria. NSA's computer security people could play a major role in encouraging serious security and survivability of the infrastructure. However, NSA's signals intelligence people have in the past not been supportive, and have in fact impeded the progress of security and applications of good cryptography.
[Q15] Is there general agreement on the need for the R&D initiative recommended by the Commission, on the proposed research priorities, and on the size of the proposed initiative?
[PGN] The six bulleted items relating to the information infrastructures (pages 89 and 90) of the Commission's report are for the most part worthy topics for R&D efforts. Risk management is very important, but I have some strong reservations about funding risk management decision support tools (their fourth bulleted item), because there are many risks in risk analysis (see Section 7.10 of my Computer-Related Risks book). My comments on the size of the R&D initiative are given in my answer to Q6. It depends on how the money is spent.
[Q16] What are the views of the panel on the new organizational structures proposed by the commission?
[PGN] The Commission's seven proposed entities are listed in my response to Q5. My view is that all of these could play potentially valuable roles -- if they recognize the inherent vulnerabilities of our infrastructures today, the need for dramatic improvements, and the need for serious sharing of information relating to vulnerabilities, threats, and risks, and if they can avoid many of the risks that arise from factors such as commercial self-interest, partisan politics, and ignoring the basic fact that the Internet is an international rather than a national resource.
[Q17] How are other countries dealing with the security issue? France has laws dealing with the use of encryption and Germany has legislation in this area. Is the proliferation of different international laws going to impede the development of a global information infrastructure?
[PGN] The question is not quite accurate. As far as I know, France has just one relevant law, to establish trusted third parties. Germany has no such laws at present, although it does have laws that promote anonymity in on-line transactions.
[PGN] A very misleading statement was made by at least one Government employee in oral testimony at an earlier Senate hearing in which I participated, to the effect that the Europeans love the American proposals for key recovery. The European Union stated nearly the opposite view on July 7, 1997, and the OECD more recently strongly recommended that market forces drive crypto policy, without key escrow or key recovery. I do not see Europe as a real impediment to a global information infrastructure, although the U.S. government could be the major impediment!
[Q18] The Commission suggests revising the legal structure to address cyber security threats. Do the panelists believe that our current legal structure needs to be revised? If so, could you give some specific examples such as in the area of liability?
[PGN] The current legal structure does not properly address all the problems. Some existing laws are fuzzy enough that they can be interpreted as making it illegal to read, write, alter, or delete information in a computer -- which is what we all do on-line! However, any new laws must be written with a deeper understanding of the problems. Attempts to craft legislation on such matters as digital telephony, cryptography, key recovery, controls on Internet access, protection of children, anonymity, censorship and restrictions on free speech, unsolicited junk e-mail (spamming), and Internet gambling suggest the complexity of the issues that must be addressed. The consequences of foreign and off-shore operations must also be considered, as well as secondary uses of personal information -- such as the use of Social Security Numbers for authentication rather than identification purposes -- and the increasing problems of identity theft. Simplistic legal measures are not likely to work when the technology itself is incapable of protecting the resources and controlling access.
[PGN] There are various liability issues that must be considered -- for example, regarding liability for the effects of attacks on our infrastructures (the information infrastructures as well as the national critical infrastructures identified by the Commission). There are also serious liability and responsibility questions raised by misuses of sensitive databases. One example involves misuse of key-recovery systems by authorized individuals and organizations. (The McCain-Kerrey legislation attempts to absolve from liability all certificate authorities that link key-recovery with certificates, and not absolve any others. This has some serious consequences that have apparently not been recognized by the framers of the legislation, in an attempt to coerce the key-recovery linkage.) A second example involves both insider and outsider misuse of Federal information systems such as have occurred in law enforcement, the IRS, the Social Security Administration, and state DMV systems. Although some existing laws apply to the more obvious cases of fraud and abuse, there are many other cases that are not adequately covered.
[PGN] I recommend the commissioning of a National Research Council study chartered to evaluate the adequacy of existing computer-related laws and to recommend new legislative, policy, and technological directions.