Communications of the ACM

Inside Risks

Risks of Technology-Oblivious Policy

Many readers of this column have tried to influence technology policy and had their advice ignored. Politics is frequently a factor, but another reason for our failure is that we don't do a good job of explaining the roots of computing-related security and usability issues to non-technical people.

People who have never written code do not understand how difficult it is to avoid bugs in software, or to find them afterward. Consequently, they don't understand why software patches are so dangerous. They have a difficult time believing that it is possible to conceal malicious code in a large program, or to insert malware via a software patch. They don't see why it is so difficult even to detect that malicious code is present, let alone locate it in a large body of code.
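How little it takes to conceal malicious code can be shown with a small sketch. The example below is hypothetical, not drawn from any real system, but it echoes a real 2003 incident in which someone attempted to slip a backdoor into the Linux kernel by changing a single `==` to `=` inside an innocuous-looking condition. Here, a one-token difference between two nearly identical login checks silently disables password verification:

```python
def check_login(user, password, stored_passwords):
    """Intended check: grant access only on an exact password match."""
    return stored_passwords.get(user) == password

def check_login_backdoored(user, password, stored_passwords):
    """Almost identical at a glance, but the comparison against the
    stored password has quietly become a comparison against None:
    any known user now gets in with any password."""
    return stored_passwords.get(user) != None and password != None

stored = {"alice": "s3cret"}
print(check_login("alice", "guess", stored))             # rejected
print(check_login_backdoored("alice", "guess", stored))  # accepted
```

A reviewer skimming thousands of such functions has essentially no chance of spotting the difference, which is why detecting malicious code in a large body of software is so hard.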

The Digital Millennium Copyright Act (DMCA), which became U.S. law in 1998, is illustrative. The most controversial portions of the DMCA, the anti-circumvention and anti-dissemination provisions, did not come into effect until 2000. It was only by chance that we learned why the delay occurred. (Stop reading, and see if you can guess why.)

The delay was written into the DMCA because lawmakers feared that aspects of the DMCA might criminalize work on securing software against Y2K problems. Y2K is hardly the only software security issue that requires the kind of code analysis needed to repair Y2K-related problems, but Congress either didn't know or didn't care. Computer security experts and cyberlaw professors had not been quiet about the risks of the DMCA. There were several letters, including one signed by a large number of experts in computer security and encryption, warning that the anti-circumvention provisions could criminalize some standard computer security techniques. But our warnings were ignored.

One consequence of the poorly drafted DMCA is that its anti-circumvention provisions can bar independent experts from inspecting and testing voting machine software for bugs and malware without the vendor's permission. Who would have thought that a law pushed by Hollywood would be used to protect the insecure and secret software deployed in voting machines?

When the computing community started warning about the risks of current paperless electronic voting machines, we encountered outright hostility from some election officials and policymakers. We were accused of being "fear mongers" and Luddites. On Election Day 2004, a lobbyist for voting machine vendors claimed that "electronic voting machine issues that have been cited are related to human error, process missteps, or unsubstantiated reports." How could the lobbyist make such a claim on Election Day itself? And why would anyone believe him, rather than the experts?

To counter unrealistic claims about the safety or robustness of software, we need analogies that help people gain insight into the complexity of large programs. Analogy is a poor tool for reasoning, but a good analogy can be very effective in developing intuition.

One possibly useful analogy is the U.S. Tax Code. Americans have some sense of its complexity and of the large number of people employed in its interpretation. Tax loopholes are analogous to hidden malicious code or Trojan horses in software.

The tax code resembles software in other ways as well:

  • It is intended to be precise, yet must interface with the messy realities of the world.
  • It has been developed in multiple iterations, responding to changing circumstances and requirements.
  • The people who wrote the original version are no longer around.
  • No one understands it in its entirety.
  • It can be difficult to infer intent simply by reading a section.
  • There are people who actively seek to subvert it.

Of course, there are also major differences between the tax code and software. The tax code is relatively "small": it runs to several thousand printed pages, whereas Windows XP has 40 million lines of source code. And while there is software that attempts to interpret parts of the tax code, most of the interpretation is done by people, which introduces the possibility both of commonsense intervention and of human error.

We have failed to effectively explain the risks of inappropriate, careless, or poorly designed software to the general public, the press, and policymakers. But good analogies can help us communicate. The issues are too critical for us to be shut out of the debate.

Barbara Simons is a former president of ACM.

Jim Horning is a chief scientist at SPARTA, Inc.

©2005 ACM  0001-0782/05/0900  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.