Communications of the ACM

ACM Opinion

Human-Like Programs Abuse Our Empathy

[Illustration: a person conversing with an AI/robot.]

"Mimicking human behavior is a clear boundary not to be crossed in computer software development." -Emily Bender

Credit: Getty Images

Google engineer Blake Lemoine's misconception about the LaMDA chatbot's sentience shows the risks of designing systems in ways that convince humans they see real, independent intelligence in a program. If we believe that text-generating machines are sentient, what actions might we take based on the text they generate?

That is why we must demand transparency, especially in the case of technology that uses human-like interfaces such as language. For any automated system, we need to know what it was trained to do, what training data was used, who chose that data, and for what purpose.

From The Guardian

