Robots aren’t coming – they’re already here, and have been for quite some time, though mostly behind the scenes in manufacturing and factory settings. Today, they’re entering different and more personal facets of our lives: hospitals, transportation, the military, the workplace and even our homes.
Robots will be everywhere, predicts Dr. Kate Darling, MIT Media Lab researcher and fellow at the Harvard Berkman Center for Internet & Society. And despite fear-mongering headlines and doomsday predictions from high-profile scientists and entrepreneurs, a robotic takeover is far from reality. In fact, people are increasingly – and willingly – interacting with and even personifying these intelligent machines. Whether society can draw up rules for how we treat them, however, remains to be seen.
Robot ethics and law haven’t kept pace with the technology, according to Darling, a leading expert in both fields. But she believes the issues that must be addressed – from empathy and violence to privacy and data security – are less about the robots, and more about humans and “our relationships with each other.”
“It’s nothing new that we can emotionally relate to objects,” says Darling, who focuses her research on the emotional connection between people and life-like inventions, and seeks to influence technology design and policy direction. “People have always had the tendency to fall in love with cars and gadgets and stuffed animals. The new thing about robots is this effect tends to be more intense.”
In a recent talk at The Conference 2015, she cited the interplay of three factors to explain why: physicality, movement and a new category of robots specifically designed to elicit emotion – and even built to understand human emotions. Take the Roomba vacuum: the mere fact that it moves causes people to feel bad for it when it gets stuck under the couch. More extreme examples come from the U.S. military, where soldiers become emotionally attached to the robots they work with – naming them, giving them medals, risking human lives to save them, and even holding funerals for those damaged beyond repair.
Those are robots engineered to be tools; but those designed for human engagement are on an entirely different plane. There’s the Nao next-generation robot that works with autistic children, weight-loss coach robots, and the Paro seal used in elderly care and with dementia patients, among many others. These are the robots – and the interactions, relationships and attachment they spur – that are raising concerns about the “dark side” of such advanced, human-like technology.
But Darling is convinced “we can talk about privacy and consumer protection and all of the ethical issues without dismissing the potential.” In the meantime, she says, “It’s a fascinating area to study because when we look at this anthropomorphization [attributing human form or personality] of robots, we’re actually learning more about human psychology in the process.”
While Darling’s background is in intellectual property law and economics, her passion for technology and robots led her to interdisciplinary fields. After co-teaching a robot ethics course at Harvard Law School with renowned Professor Lawrence Lessig, she now increasingly works at the intersection of law and robotics, with a focus on legal and social issues.