
Communications of the ACM

Inside Risks

Risks in Features vs. Assurance


Essentially all commercial computer system development and deployment has been driven by concerns for time-to-market, novel features, and cost, with little if any concern for assurance, reliability, or the avoidance of security vulnerabilities in systems and networks. Products retrofitted for the new connected world expose new vulnerabilities because the environment changes: the security features of an existing product may not address the new risks, new security policies come into effect, and new features introduce unforeseen interactions among components.

Risks in software and systems are currently governed by contract law rather than by any more demanding liability law. The contracts are typically inequitable, with purchasers assuming all liabilities despite the practical impossibility of their assessing the security, reliability, or survivability characteristics of the products. Indeed, most software products come with an anti-warranty: the producer warrants nothing, and the customer assumes all liabilities.

There is a serious lack of understanding among developers and development managers that security and survivability are not features. Self-promoted and self-appointed "security experts" often produce security features that are promised but poorly conceived and poorly understood. It has been to the industry's advantage to position "security" as a feature added to systems and computing complexes, rather than primarily as a characteristic of thoughtful architecture, careful design, and meticulous engineering, coding, testing, and operation. Security does not result from modules bolted on after the fact; it must be engineered in from the beginning. The CERT/CC database records numerous vulnerabilities, such as buffer overflows and password sniffing, that are consequences of basic system architecture and design, some of which cannot easily be retrofitted. Products ship with many features, but assurance at best receives lip service as part of the vendor's marketing campaign.
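To make the buffer-overflow point concrete, consider the following C sketch (an illustration of the vulnerability class, not an example drawn from the CERT/CC database; the function names are hypothetical). The flaw lies in the interface itself: nothing in the function's contract bounds the length of its input, so safety cannot be retrofitted without changing the design.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical routine illustrating the classic unchecked copy.
     * The fixed-size buffer and unbounded strcpy() reflect a design
     * decision: the interface says nothing about how long 'username'
     * may be, so callers cannot simply be patched into safety. */
    void print_banner(const char *username) {
        char buf[32];
        strcpy(buf, username);          /* overflows when username needs >= 32 bytes */
        printf("Welcome, %s\n", buf);
    }

    /* A safer variant changes the contract, not just the code:
     * the length limit is enforced inside the interface. */
    void print_banner_safe(const char *username) {
        char buf[32];
        snprintf(buf, sizeof buf, "%s", username);  /* truncates instead of overflowing */
        printf("Welcome, %s\n", buf);
    }

    int main(void) {
        print_banner_safe("guest");
        return 0;
    }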

The lack of commercial demand for security favors feature-laden software and frequent release schedules; in the commercial world, shipping a product with the promised features too often takes precedence. Although this may appear to be a failure or ignorance on the part of the industry, these features are often requested by customers who lack an in-depth understanding of security. If a vendor cannot deliver on time, or declines to offer a feature for sound security reasons, the customer finds another vendor that will. The industry is not interested in research and development without payoffs. As long as the customer bears the risk, there is a strong incentive to offer "nifty" features, even when those features increase vulnerability to compromise. Evaluating a product against a Common Criteria protection profile, for example, is not on most customers' wish lists.

Feature-dominated development ignores the warnings of security experts, which are the first thing sacrificed in development organizations; customers demanding functionality make the same sacrifice.

Although they are gaining interest, courses in cryptography, computer security, and survivability are not yet widespread. The coverage of security in most operating-system and software-engineering courses could be improved. It may not be possible to expand existing courses and squeeze more concepts into the same time frame; instead, separate computer security courses might be added, as some universities already do. One way or the other, security and software engineering need to be thoroughly integrated into the curriculum.

Testing of features and their myriad interactions is generally inadequate, relegated to a final screen before shipment. Hardware designers, by contrast, have long applied design-for-testability rule sets and built-in test hardware (which may occupy more than 5% of a chip). Test engineers should be involved from the inception of development to ensure that testability and reliability are properly addressed.
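Software can borrow the hardware practice of designing for testability from the start. The sketch below is a hypothetical C illustration (the names and scenario are invented, not taken from this column): a module that receives its clock as a parameter can be exercised deterministically by a test harness, whereas one that calls the system clock directly cannot.

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical design-for-testability sketch: the module takes its
     * clock as an injected function instead of calling time() directly,
     * so a test harness can control "now". */
    typedef time_t (*clock_fn)(void);

    /* Returns nonzero if the session, started at 'started', is older
     * than 'max_age' seconds according to the supplied clock. */
    int session_expired(time_t started, time_t max_age, clock_fn now) {
        return (now() - started) > max_age;
    }

    /* Production clock: real system time. */
    static time_t system_clock(void) { return time(NULL); }

    /* Test clock: a fixed instant, making the test deterministic. */
    static time_t fake_clock(void) { return 1000000; }

    int main(void) {
        /* Test harness: a session begun 600s before the fake "now",
         * with a 300s limit, must report expired (prints 1). */
        printf("expired: %d\n", session_expired(1000000 - 600, 300, fake_clock));
        /* Production code would pass system_clock instead. */
        (void)system_clock;
        return 0;
    }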

The software and systems industry has been allowed to develop without substantial legal oversight, under the assumption that its customers were sophisticated and could manage their risk exposure appropriately. Unfortunately, even sophisticated customers cannot know their security exposure. Under such conditions, liability law may be held to override unjust contract disclaimers. If the industry will not clean up its act, it must expect the tort bar to do so.


Authors

Tolga Acar (tacar@novell.com) is a senior software engineer at Novell, Inc.

John Michener (jrmichener@ieee.org) is a consulting engineer at Enterprises, Inc.


©2002 ACM  0002-0782/02/0800  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.



 
