
Communications of the ACM

Computing ethics

Surrounded By Machines


Credit: Viktor Koen

I predict that in the near future a low-budget movie will become a phenomenon. It will be circulated on the Internet, shared in the millions via mobile telephones, and dominate Facebook for a full nine days. It will show ordinary people going about their everyday lives as slowly, subtly, everything starts to go wrong.

A middle-aged man notices the advertisements that pop up with his Web searches are no longer related to the search, but to odd and random products and services—hair replacement, sports cars, retirement homes, second career counseling, fantasy vacations, divorce lawyers, treatments for depression and impotence.

A young professional woman, recently laid off because of the bad economy, posts an Internet ad to sell her piano. The ad doesn't mention that she needs the money to pay her rent. None of the offers are as high as her asking price, but two offer exactly what she owes for rent (to the penny) and one offers exactly $100 more.

The seven most troublesome students at a high school notice that wherever they go, they run into one or more of their three meanest teachers.

An elderly couple starts hearing voices in their assisted-living apartment, faint whispers they can barely discern:

"He's awake."

"She just got out of bed."

"The coffee machine is on."

These merely perplexing events become ever more ominous as thousands of people, then millions, realize they are always being watched, they have no privacy, and their every decision is controlled by some unseen force. Four-sevenths of moderate Americans who are likely to vote begin to slide from the middle to the extreme right or left, not knowing why. It gets worse and worse. It seems like Armageddon.

Just as the 1956 film Invasion of the Body Snatchers encapsulated the Red Scare zeitgeist with its depiction of ordinary people being replaced by exact replicas who are actually aliens bent on taking over the world—as many feared that Communist spies and sympathizers were taking over America—The Invasion of the Info Snatchers will play on our high-tech anxiety as our online lifestyles, position-broadcasting cellphones, and protective monitoring devices are inexorably compromised, exploited, and joined by ever more subtle devices.

The preceding descriptions are intended to be satirical, but all of these scenarios are possible today, though their probability varies. What seems most unlikely to me, though, is that people are, or will become, nervous about being swept away in the rising tide of pervasive information technology.

Pervasive computing, ubiquitous computing, ambient intelligence, and everyware are all commonly used terms for generally similar phenomena—the increasing presence in our everyday lives of information and communication technologies (ICT) that are too small to notice, or integrated into appliances or clothing or automobiles, or are aspects of services we use willingly. Some technologists have professed a goal for such technologies to be invisible to the users, taken-for-granted or simply unnoticed by most people, continuous with our background environment, existing and operating below and behind the realm of real-time human intervention or awareness.


These technologies were the focus of a two-day workshop held last year: Ethical Guidance for Research and Application of Pervasive and Autonomous Information Technology (PAIT). The workshop was funded by the National Science Foundation (grant number SES-0848097), with additional support from the Poynter Center for the Study of Ethics and American Institutions at Indiana University Bloomington, and hosted by the Association for Practical and Professional Ethics. Thirty-six scholars, including ethicists, engineers, social scientists, lawyers, and geographers, participated in the meeting, discussed ethical issues in pervasive IT, and began crafting approaches to ethical guidance for the development and use of such devices, including public policy guidance. The workshop schedule, a list of participants, and more can be found online. In this space I cannot hope to do justice to the rich and wide-ranging conversations we had at the workshop, so I will focus on three significant topics we discussed there.


Machines on the go...

When presented properly, the benefits of pervasive IT are obvious. The popularity and benefits of the Internet and cellphones need not be defended or even described, but the amount of personal information in circulation over the former is tremendous and increasing, as are the unsanctioned uses of personal data. The position-broadcasting function of the latter is not nefarious in intent. Both can be used in knowledge creation, but also for stalking of various sorts.

At the workshop, Katie Shilton, a doctoral candidate in the Department of Information Studies and a researcher at the Center for Embedded Network Sensing (CENS) at the University of California at Los Angeles, described three intriguing CENS projects using mobile phones; I'll describe two.

Participants in the Personal Environmental Impact Report (PEIR) program record and submit a continuous location trace using their mobile devices. A participant's location is captured every few seconds, allowing the system to trace the participant's route and infer the most likely mode of locomotion—foot, car, or bus. The participant's travel profile is then correlated with Southern California air quality and weather data, allowing PEIR to estimate the participant's carbon footprint, as well as her or his exposure to air pollution. The accuracy of the data gives an unprecedented look into the environmental harms people create and suffer.
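The column does not describe PEIR's actual classifier, but the kind of inference involved can be sketched. The following is a minimal illustration that infers a travel mode purely from average speed between GPS fixes; the thresholds, function names, and the foot/vehicle distinction are illustrative assumptions, not CENS's method:

```python
# Illustrative sketch only: infer a likely mode of travel from a GPS trace.
# PEIR's real classifier is more sophisticated; speed thresholds here are
# rough assumptions for illustration.
from math import radians, sin, cos, asin, sqrt

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (p[0], p[1], q[0], q[1]))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371000 * 2 * asin(sqrt(a))

def infer_mode(trace, interval_s=5):
    """trace: list of (lat, lon) fixes captured every interval_s seconds."""
    if len(trace) < 2:
        return "stationary"
    dist_m = sum(haversine_m(a, b) for a, b in zip(trace, trace[1:]))
    speed_kmh = dist_m / ((len(trace) - 1) * interval_s) * 3.6
    if speed_kmh < 1:
        return "stationary"
    if speed_kmh < 7:        # typical walking pace
        return "foot"
    return "vehicle"         # distinguishing car from bus needs more signal
```

Even this toy version shows why the data is sensitive: a few seconds of fixes reveal not just where someone is, but how they are moving.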

Through the Biketastic project, bicyclists carrying GPS-enabled mobile phones transmit data on their routes through L.A. The information is not limited to position, but also includes data on the volume of noise and, using the device's accelerometer, the roughness of the path. The data is transmitted to a Web site and displayed. Future improvements could display existing data about the route, such as air quality, traffic conditions, and traffic accidents. The Biketastic riders can also share their information with other riders to create a detailed profile of the rideability of numerous routes.

Shilton described not only the benefits and uses of these projects, but also their potential downsides and abuses. Summarizing the latter, she asked, "How do we build and shape such pervasive sensing systems without slipping into coercion, surveillance, and control?"

...in the home...

Kalpana Shankar, an assistant professor in the School of Informatics and Computer Science and an adjunct assistant professor in the School of Library and Information Science at Indiana University Bloomington, distributed a case study to participants before the workshop. The case study, "Sensing Presence and Privacy: The Presence Clock," was developed as part of an NSF-funded research project, "Ethical Technology in the Homes of Seniors," or ETHOS.

An increasing number of people want to live in their own homes as they age and begin to become less self-reliant due to slowly increasing frailty of various sorts. Their offspring want to ensure they are safe and that responses to mishaps are rapid and certain. The ETHOS project investigates how Internet-connected devices that alert responsible parties to troubling changes in routine—such as a person falling in the living room and not getting up—can provide life-saving interventions, give care providers peace of mind, and give elders more autonomy than they would enjoy in an assisted-living facility, all in an ethical manner acceptable to both elders and their offspring.

The Presence Clock case can be found at the PAIT blog, along with commentary. Briefly described, the Presence Clock is an analog clock that comes in pairs. One clock is installed in the elders' living space and the second in the living space of their caretakers. The two clocks are connected via the Internet. Both clocks sense movement and presence, and lights on the remote clock show roughly how much time someone spent at any given hour in the room with the local clock; the time spent is indicated by the brightness of a light next to the relevant hour marker (for example, a dull light at 1 and a bright light at 4 indicate someone spent little time near the clock at 1 and a good deal of time there at 4). A different-colored light blinks next to the hour marker when someone most recently entered the room.
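The case study does not specify the clock's internals, but the presence-to-brightness mapping it describes is simple enough to sketch. In this illustrative fragment (the linear scale and 0–255 brightness range are assumptions, not ETHOS's design), each hour marker's light level is proportional to minutes of sensed presence during that hour:

```python
# Illustrative sketch only: map minutes of sensed presence per hour to the
# brightness of the light next to each hour marker on the remote clock.
# The linear scale and 0-255 range are assumptions for illustration.
def brightness_levels(minutes_by_hour, max_level=255):
    """minutes_by_hour: dict mapping hour marker (1-12) to minutes of
    sensed presence; returns a dict of brightness levels per marker."""
    return {hour: round(min(minutes, 60) / 60 * max_level)
            for hour, minutes in minutes_by_hour.items()}
```

So a caretaker glancing at the remote clock sees a dull glow at hours the elder passed through briefly and a bright one at hours of sustained presence; the separate blinking light (not modeled here) would simply track the most recent entry.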

The goal of the Presence Clock is to give the elder and caregiver a sense of mutual presence, even at a distance. A glance at the clock can give either party a sense of what the other has been doing; a change in routine might prompt a telephone call. It is less intrusive than a true surveillance system with an audio or visual feed, but could afford a great deal of comfort to both parties and enable an elder to stay in her or his own home longer than the caretakers would otherwise feel comfortable.

But it could also feel intrusive to the elder: a trade-off between privacy and security that he or she does not want to make, but that the caretaker insists upon. Responsible development and deployment of the Presence Clock is not just a technical and marketing challenge, but also a challenge in human relations and customer/user education.


...and thinking for themselves.

Perhaps even more troubling than machines that hide or drop from our awareness are machines that make choices without direct human intervention—generally termed "autonomous systems." Keith W. Miller, the Louise Hartman Schewe and Karl Schewe Professor of Computer Science at the University of Illinois at Springfield (to whom I owe the title of this column) highlighted one concern about autonomous systems in a presentation called "The problem of many hands when some of the hands are robotic," in which he revisited Helen Nissenbaum's 1994 article, "Computing and accountability."1 The problem of many hands lies in discerning or assigning responsibility when something goes wrong. The more people involved in a project, the more people there are to whom the blame can be shifted. When the technology is believed to be able to learn on its own and make its own decisions, there can arise a temptation—perhaps even a compulsion—to blame the machine itself, allowing the humans involved in its design, construction, and deployment to wash their hands of it.


I think it's fair to say that most of the workshop participants deplored this tendency. Determining moral responsibility is a serious endeavor, and dodging or shifting blame (if that's all one does) is irresponsible in itself. At the workshop Miller began advocating for an effort to make a strong statement about the importance of accepting moral responsibility even in circumstances of complicated causality. Our time was too short to make much progress, but Miller has pushed the project forward in the interim. As I write this column in early 2011, Miller is working on the 27th draft of "Moral Responsibility for Computing Artifacts: Five Rules" (The Rules, for short) and has assembled a 50-member, international Ad Hoc Committee for Responsible Computing to improve drafts. It's remarkable to get such cooperation and consensus from scholars solely over email; even more remarkable is that the document is only four pages long.



I have only touched on what happened at the workshop itself, and mentioned only one of the ongoing projects the workshop inspired. More is happening and still more will be accomplished thanks to the enthusiasm of this remarkable group of scholars, the ripple effect that will let this workshop touch many more people than those who attended, and a small grant from the National Science Foundation.



References

1. Nissenbaum, H. Computing and accountability. Commun. ACM 37, 1 (Jan. 1994), 72–80.



Kenneth D. Pimple is Director of Teaching Research Ethics Programs at the Poynter Center for the Study of Ethics and American Institutions, an endowed center at Indiana University-Bloomington, and on the Affiliate Faculty of the Indiana University Center for Bioethics.




Copyright held by author.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2011 ACM, Inc.

