
Communications of the ACM

Privacy and Security

Summing Up



The Privacy and Security Viewpoints column had its start six years ago. Though our 25 columns have had 23 different sets of authors, the fact is that an editor puts an imprint on a column simply by selecting potential contributors. No one should hold sway for too long, and so this column is my last as the section board member responsible for Privacy and Security. It is a good time to ask how privacy and security have fared over the last half-dozen years.

The short answer is that for privacy and security, this is not the best of times. If privacy and security were students in a course, right now their grades would be somewhere between an F and an F-minus (for those readers unfamiliar with the U.S. academic system, an "F" for fail is as low as you can go). Consider just the events of the past two years. We have had the Snowden leaks, which show the U.S. National Security Agency (NSA) partnering with the intelligence agencies of the other "Five Eyes" nations (Australia, Canada, New Zealand, and the U.K.) to vacuum up metadata (signals about communications)—and, in many instances, a great deal of content as well. The Snowden leaks also demonstrate a breach of security within the NSA and its contractors. Under legal requirements, the British telecommunications company Vodafone must provide the governments of six unnamed nations with direct access to all communications.7 U.S. retailer Target ignored the warnings of breach behavior from its security firm, FireEye, resulting in the theft of 40 million credit and debit cards (more correctly, data sufficient to forge 40 million credit and debit cards).13 Meanwhile, Facebook has a new product that records ambient sounds at the user's request—thus capturing not only a great deal of information about the enrolled user but also the conversations of unsuspecting bystanders. Google is strongly pressing users to have an always signed-on experience, particularly on their smartphones.11

The long answer is considerably more nuanced. Yes, the Snowden leaks demonstrated a level of collection that only the more paranoid among us had anticipated. The continuing revelations, including those about the amount of collection by the U.K.'s Government Communications Headquarters, France's DGSE, and Chinese and Russian signals intelligence organizations, demonstrated that the U.S. was far from alone in conducting such massive surveillance. Yes, the wide use of Facebook, the absence of any drop-off in usage in response to changes in Google's privacy policies, and the willingness of users to install apps that collected their location whether or not the app needed that information all demonstrated that users continued to be generally unconcerned about protecting their privacy. But subtle signs indicate these signals are not all there is to the present privacy and security stories.

Consider the fact that Gregg Steinhafel, Target's CEO, was forced to resign within months of the breach. This is significant. Companies have had major breaches for years, but cybersecurity had never before played a role in a senior executive's resignation. Now it has. Cybersecurity has suddenly become a lot more important in the C-suite.

Consider the sudden success of Snapchat, the app that—incorrectly, as it turned out5—claimed it deleted photos seconds after their appearance on a recipient's screen. Someone—actually lots of someones—wanted such an app. Share a photo, but have it be ephemeral. There was interest in having email and text messages be ephemeral too. This is a desire to return communications largely to the way they were before digital transmission and storage made the most trivial utterance permanent.

Finally, consider the impact of the Snowden leaks on the U.S. government, which is now having serious, somewhat public, discussions about limits on collection. In January, President Obama announced he would move bulk metadata collection out of the government's hands.a The president also committed to restraining the use of communications collected for foreign-intelligence purposes in criminal cases and to restricting the length of time a National Security Letter, which "require[s] companies to provide specific and limited information to the government without disclosing the orders to the subject of the investigation," can be kept secret. There are some who would say the president did not go far enough in curbing intelligence collection. But these are substantive changes, and there may be even more forthcoming.

Thus that failing grade may not be an accurate description of where we are on privacy and security, though we certainly do appear to be at a nadir. Where we are headed is more than a little difficult to discern. Before I respond to that issue, we must first understand how we arrived at the present point. As security and privacy present substantively different issues, I will address them separately.

For security, the problem arose partly from the mistaken belief that security was all about confidentiality, integrity, and availability, and partly from the belief that security was merely a technical issue. But although it was obvious at least a dozen years ago that both of these assumptions were wrong, developing more complex responses to computer security has not been easy. It may be difficult to design a secure cryptosystem; it has proven much harder to develop incentives that cause companies to adopt secure systems. The former is purely a technical problem;b the latter requires developing an economic model, perhaps including government regulations or laws, to pressure an entity into adopting a more secure—and likely more expensive—system.

For privacy, the situation is even more complicated. The first need was developing an understanding of privacy in the electronic age. In the 1960s, Alan Westin defined privacy as the ability "to determine for themselves when, how, and to what extent information about them is communicated."17 In the 1970s, the "Ware Report" operationalized and extended this into the Fair Information Practice Principles, which have been widely adopted throughout the world.16 But the principles of "Collection Limitation, Data Quality, Purpose Specification, Use Limitation, Security Safeguards, Openness, Individual Participation, and Accountability"8 do not cover a world in which a user may install hundreds of apps on her smartphone, each of which requests multiple permissions. And that is only the phone: the user must also confront her work desktop and its browsers, her shared family tablet, her laptop, and ...




This brief reprise of the current communication and computing world tells us that handling security and privacy involves understanding economics (for developing incentive schemes to build security into applications and to help users manage their privacy if they so desire), anthropology (for understanding the different ways people approach their electronic gadgets), psychology (for developing models that give people the computer interactions they want), design (for getting the colors, sizes, and signals right for people to achieve what they want), law (for balancing different interests in society), and so forth. While human-computer interaction has been an established field for at least three decades, the Workshop on the Economics of Information Security is only in its 13th year and the Symposium on Usable Privacy and Security is just 10 years old. And while there is a Privacy Law Scholars Conference that focuses, to a large extent, on digital privacy, there is no equivalent conference for cybersecurity legal scholars. The point? Scholarship in these human aspects of privacy and security is nascent. Its translation into actual use is even more so.

Where does that put us? Always-on connectivity is increasing and we are moving toward the Internet of Things; both will create additional privacy and security risks. It is likely that for many people—especially those living under repressive regimes and those with less opportunity, whether as a result of economics or a lack of capabilities—privacy and security incursions will mount. For them, digital communications technologies appear to be leading to increasingly oppressive and invasive conditions.




On the other hand, there are changes in the air. danah boyd describes how American teens have learned to hide their actions in plain sight;1 the U.S. government is seriously considering certain limitations on its collection of communications, both domestically and abroad; and the Europeans are attempting to put limits on what information is available about individuals (though the European Court's "right to be forgotten" decision4 seems to me to confuse data being accessible on the network—the real issue—with data being linked through search engines). There also appears to be considerably more interest among private individuals in securing and protecting their data; Tor usage, for example, is approximately twice what it was in May 2013.c,15

Whether the latter will be a lasting change is, of course, crucial. It is a question I cannot answer. But I can assess how well this column has covered the broad issues surrounding privacy and security over the last six years. And there I think the answer is, "Reasonably well."

Long before the Snowden leaks, we had columns on electronic surveillance in Sweden9 and on a proposed code of ethics for U.S. intelligence officers.14 We had more technical columns focused on de-anonymizing anonymized networks12 and on why air gaps do not work for SCADA networks.3 We have had columns on the social science side, including on the role of emotions in making complex security decisions10 and on the impact that Fear, Uncertainty, and Doubt play in determining how to fight back against computer crime.6 We have covered the value, or lack thereof, of professionalizing the cybersecurity workforce.2 We have had researchers, implementers, engineers, computer scientists, lawyers, social scientists of many flavors, and management experts write—and there have been authors from three continents and at least 27 different policy persuasions. I think we have also done a reasonable job covering a broad and complex set of topics.

With this, I hand the mantle of the Communications Privacy and Security column to Carl Landwehr, for 30 years a leading cybersecurity researcher. Carl is a National Cyber Security Hall of Fame inductee and a former editor-in-chief of IEEE Security & Privacy. Many thanks for the pleasurable run.


References

1. boyd, d. It's Complicated. Yale University Press, 2014.

2. Burley, D., Eisenberg, J., and Goodman, S. Would cybersecurity professionalization help address the cybersecurity crisis? Commun. ACM 57, 2 (Feb. 2014), 24–27.

3. Byres, E. The air gap: SCADA's enduring security myth. Commun. ACM 56, 8 (Aug. 2013), 29–31.

4. European Court of Justice, Judgment of the Court (Grand Chamber). Google Spain SL, Google Inc. v. Agencia Española de Protección de Datos (AEPD), Mario Costeja González, May 13, 2014; http://bit.ly/1prYAfk.

5. Federal Trade Commission. Snapchat Settles FTC Charges That Promises of Disappearing Messages Were False. Press release, May 8, 2014.

6. Florencio, D., Herley, C., and Shostack, A. FUD: A plea for intolerance. Commun. ACM 57, 6 (June 2014), 31–33.

7. Garside, J. Vodafone reveals existence of secret wires that allow state surveillance. Guardian (June 5, 2014).

8. Gellman, R. Fair Information Practices: A History, Version 2.11 (Apr. 2014); http://bobgellman.com/rg-docs/rg-FIPShistory.pdf.

9. Irion, K. Communications networks tapped for intelligence-gathering. Commun. ACM 52, 2 (Feb. 2009), 26–28.

10. McDermott, R. Emotion and security. Commun. ACM 55, 2 (Feb. 2012), 35–37.

11. Mirani, L. Google's sneaky new privacy change affects 85% of iPhone users—but most of them won't have noticed. Quartz (Apr. 3, 2014); http://bit.ly/1fKRDB2.

12. Narayanan, A. and Shmatikov, V. Myths and fallacies of 'personally identifiable information'. Commun. ACM 53, 6 (June 2010), 24–26.

13. Riley, M., Elgin, B., Lawrence, D., and Matlack, C. Missed alarms and 40 million stolen credit card numbers: How Target blew it. Bloomberg Businessweek Technology (Mar. 14, 2014).

14. Snow, B. and Brooks, C. An ethics code for U.S. intelligence officers. Commun. ACM 52, 8 (Aug. 2009), 30–32.

15. Tor Metrics Portal, Users; http://bit.ly/1oMOzJA.

16. U.S. Department of Health, Education, and Welfare, Secretary's Advisory Committee on Automated Personal Data Systems. Records, Computers, and the Rights of Citizens, 1973.

17. Westin, A. Privacy and Freedom. Atheneum, 1967.


Author

Susan Landau (susan.landau@privacyink.org) is a professor of cybersecurity policy in the Department of Social Science and Policy Studies at Worcester Polytechnic Institute in Worcester, MA.


Footnotes

a. "Remarks by the President on Review of Signals Intelligence," January 17, 2014; http://1.usa.gov/1awEWY8.

b. I am describing this as purely technical because I am focusing on the mathematical algorithm and not the protocol that implements it.

c. The reason for this is not entirely clear; a major spike occurred in August 2013, and numbers are substantially down from that high point.


Copyright held by author.



 
