
Communications of the ACM

Computing ethics

Respecting People and Respecting Privacy


Credit: Alicia Kubista / Andrij Borys Associates

People conflate privacy and security. For many there are no trade-offs, choices, or decisions: guarantee security and you guarantee privacy. But computer designers know it is more complicated than that. This column argues that starting with respect for people who desire privacy will help guide good security design. For example, to help mitigate the security threat of identity theft, one must consider how private information is lost in the first place.

Good design practice is a responsibility. The ACM Code of Ethics requires that designers "respect the privacy of others" and provides two paragraphs of best practice. Many users are too busy, and insufficiently proficient technically, to watch out for themselves. They understand neither technical minutiae nor the basics of privacy. A recent survey asked about Internet public key infrastructure (PKI) certificates. Most people said they did not know what such certificates are, or believed that PKI certificates provide more protection than they actually do. Many thought PKI certificates ensure privacy, prevent tracking, provide legal accountability, or certify that a site is protected from "hackers."2 They confuse security goals with privacy, and seldom understand the related risks well.

The survey included a range of respondents: shoppers at the Bloomington Farmers Market; people who attend Indiana University's "Mini University" (active retirees who can be vulnerable and need to protect financial assets); attendees at the Dashcon convention for Tumblr enthusiasts (predominantly young people familiar with the Internet); and college students from psychology and computer science, who are typically "digital natives." Privacy and security practices were not well understood even by these intelligent, educated, and cognitively flexible people, although those with significant expertise came closer to understanding the purpose of certificates.

Mobile device permissions can be even more confusing.3,4 Few people understand how routinely devices are tracked geographically (pinpointing the device's location). Intensive tracking might betray a pathological appetite for data collection, but it might also result simply from the easiest default to set. Some designers have difficulty mastering permissions, and extensive data collection can be the consequence. I argue for data minimization: collecting the least data needed, to help ensure both privacy and security. Failure to minimize data collection can be an ethical issue, yet many developers fail to grasp its importance. Here, I offer three reasons to seek data minimization to protect privacy, and thereby security.
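
To make data minimization concrete, consider location permissions. The following is a minimal sketch on Android, assuming an app built with the AndroidX support libraries; the activity name and request code are hypothetical, and this column does not prescribe any particular platform. A feature that needs only a rough position requests the coarse location permission rather than defaulting to fine-grained tracking:

    import android.Manifest
    import android.content.pm.PackageManager
    import androidx.appcompat.app.AppCompatActivity
    import androidx.core.app.ActivityCompat
    import androidx.core.content.ContextCompat

    // Hypothetical activity illustrating data minimization: request the
    // least-precise permission the feature actually needs.
    class NearbyStoresActivity : AppCompatActivity() {

        private val locationRequestCode = 42 // arbitrary request identifier

        fun ensureCoarseLocationOnly() {
            val granted = ContextCompat.checkSelfPermission(
                this, Manifest.permission.ACCESS_COARSE_LOCATION
            ) == PackageManager.PERMISSION_GRANTED
            if (!granted) {
                // Ask only for coarse location; request ACCESS_FINE_LOCATION
                // solely when pinpoint accuracy is essential to the feature.
                ActivityCompat.requestPermissions(
                    this,
                    arrayOf(Manifest.permission.ACCESS_COARSE_LOCATION),
                    locationRequestCode
                )
            }
        }
    }

Making the minimal permission the default, and justifying anything more precise explicitly, turns data minimization from an aspiration into a routine design habit.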

First, collecting and transmitting data exacerbates the risk of data exfiltration, a leading element of security attacks. Attackers can subvert a computer or mobile phone to acquire credentials. An outward-facing mobile phone camera can allow reconstruction of sensitive office space. Attackers can examine an account before deciding how to exploit it.1 Traditional email attacks (for example, the stranded traveler scam; see footnote a) can be mounted through phone-based clients, and URLs for malware diffusion can be sent by short message service (SMS), email, and other phone-based mechanisms. SMS is already used to obtain funds directly through messages that charge per text. Reducing the amount of data that can be exfiltrated reduces both privacy and security risks.

Second, data minimization can reduce business risk. Flashlight apps that send phone information to advertisers can exfiltrate far more data than the app needs. Such over-collection contributed to the demise of standalone flashlight apps once iPhone and Android devices included free integrated flashlights. Exfiltration can bring short-term profit but long-term losses to the companies involved. Apps can destroy markets when they weaken privacy controls. The app Girls Around Me used Foursquare data to display Facebook profiles of women near a given location, including pictures. The app was making money, but Foursquare blocked it because the privacy violation was costing Foursquare users. An immensely popular and valuable application became a liability. Superfish on Lenovo computers hurt Lenovo: what Lenovo considered advertising, others considered malware.

Data minimization can also reduce legal problems. Assistant Attorney General Leslie Caldwell of the Justice Department's Criminal Division said, "Selling spyware is not just reprehensible, it's a crime." Superfish on Lenovo computers is being investigated as possible wiretapping. The commercially successful app StealthGenie has been deemed a kind of spyware, and its CEO is now under indictment. Thinking in advance about privacy can help both designers and users.


Why do people accept privacy-violating products in the first place? Perhaps they do not care much about privacy, or do not understand the implications. Or they care, but accept the convenience in exchange for the risk of privacy violation. Perhaps they cannot use the tools intended to provide protection, so designers think users are protected when in fact they are not. There are good economic reasons to support those people who care about privacy. A market with perfect information enables price discrimination, in which each person pays the amount he or she is willing to pay, no less and no more. Some customers will pay money for privacy, while others will pay with a possible loss of privacy to avoid spending money. But the presence or absence of privacy protections should be clear.

When people do not understand the privacy risks of their own choices, there is not only a business process failure but also an ethical one. Usability analysis should indicate whether tools are effective for people trying to protect themselves. People can be bad at risk analysis, so care must be taken to help them understand risks. People might choose to take risks online for the same reasons they choose risks offline. Of course, designs cannot force people to make careful risk decisions. But designs often stop far short of that point. Lack of information about privacy risk can lead to consumer regret and unwillingness to reinvest. Or users might refuse products that carry too much privacy risk, leaving a market untapped. Users expect computer designers to follow the profession's code of ethics on privacy. To the extent that participation in the network is a form of enfranchisement, poor design for privacy is a kind of disenfranchisement.

Privacy risks and losses should not be invisible. If someone trades privacy for a lower price, that trade should be abundantly clear. People do not expect their televisions to listen to every word in the house. They want transparency and risk communication. Some users will ignore a manufacturer that includes hidden surveillance capability, while others will be furious. Transparency is expected by the U.S. Federal Trade Commission and is common in many markets. The market for computing devices, from computers to mobile phones, should be one such market. This does not mean everyone has the same expectations of privacy. It merely means that customers should know about privacy risks and be able to handle those risks as they see fit. Choices about privacy and security are important. The choices that designers provide should help surgeons, air traffic controllers, and other highly skilled individuals with responsibility for the security and privacy of others, as well as those less skilled (and less empowered), make sensible decisions about their own privacy and security needs. When security and privacy technologies fail, those whose knowledge, role, and skills put them in the best position to prevent the failures bear much of the responsibility.

References

1. Bursztein, E. et al. Handcrafted fraud and extortion: Manual account hijacking in the wild. In Proceedings of the 2014 Conference on Internet Measurement Conference, ACM, 2014.

2. Camp, L.J., Kelley, T., and Rajivan, P. Instrument for measuring computing and security expertise. Technical Report TR715, Indiana University Department of Computer Science; http://www.cs.indiana.edu/cgi-bin/techreports/TRNNN.cgi?trnum=TR715.

3. Felt, A.P. et al. Android permissions: User attention, comprehension, and behavior. In Proceedings of the Eighth Symposium on Usable Privacy and Security, ACM, 2012.

4. Kelley, P.G. et al. A conundrum of permissions: Installing applications on an Android smartphone. In Financial Cryptography and Data Security. Springer Berlin Heidelberg, 2012, 68–79.

Author

L. Jean Camp (ljeanc@gmail.com) is a professor of informatics at Indiana University Bloomington.

Footnotes

a. The "stranded traveler" scam uses a subverted account to send a message to all account contacts asking for emergency help. The attacker pretends to be the account owner, claims to be desperately stranded, and requests money be sent immediately to a specific (attacker-owned) bank account.


Copyright held by author.
