Want to Interact with Robots? Pretend They Are Animals


NAGLINGKIT: I want to talk about navigating relationships with robots in the home, or companion robots, especially when it comes to empathy and actually developing complex relationships. What can we learn from the thousands of years we have spent living with pets?

KD: One of the things we learn by looking at the history of pets and other emotional relationships we have with animals is that there isn’t anything inherently wrong with it, which is something people often jump to immediately with robots. They say, “It’s wrong. It’s fake. It will take away from human relationships.” So I think comparing robots to animals immediately shifts the conversation, so that people think, “If it’s more like a pet, it probably won’t replace my child’s friends.”

The other thing we know is that animals, including in social contexts, are genuinely useful for health and education. There are therapy methods that have really improved people’s lives through the emotional connection with animals. And that suggests robots may have the potential to help in a similar, but different, way, again as a new category of their own. It’s a new tool, a new thing that we can use to our advantage.

One of the things that was important for me to get across in the book is that robots and animals are not the same. Unlike animals, robots can share your secrets with others. And robots are made by corporations. There are a lot of issues that I think we are blind to, or forget about, because we’re so focused on the question of robots replacing humans. There are many issues with putting this technology into the capitalist society we live in, and leaving companies free rein over how they use these emotional connections.

NAGLINGKIT: Say you have a robot at home for a child. To unlock a special feature, you have to pay extra. But the child has already developed a relationship with that robot, so a company could plausibly exploit those emotions, exploit the bond the child has formed with the robot, to get people to pay more.

KD: It’s like the in-app purchase scandals that happened a while back, but on steroids. Because you have this emotional connection, where it’s not just that the child wants to keep playing an iPad game; the child has a real relationship with the robot.

For kids, I’m actually not that worried, because we have a lot of child advocacy organizations out there watching for new technologies that try to exploit kids. And there are laws that protect children in many countries. But what interests me is that it’s not just kids: you can exploit anyone this way. We know that adults will freely give more personal information to a robot than they would willingly enter into a database. Or if your sex robot offers in-app purchases, that could be a way to really exploit buyers’ willingness to pay. And so I think there should be broad consumer protections. For privacy reasons, and for emotional manipulation reasons. I think it’s very plausible that people will pay money to keep a robot “alive,” for example, and that companies will try to take advantage of that.

NAGLINGKIT: What will the relationship between robots and humans look like in the near future?

KD: The Roomba is one of the simplest examples: a robot that isn’t very sophisticated, but it’s in people’s houses and it moves around on its own. And people name their Roombas. There are many other cases, such as military robots. Soldiers working with bomb disposal robots began treating them like pets: giving them names, giving them medals of honor, holding funerals with gun salutes for them, and actually relating to them in ways similar to how animals have been an emotional support for soldiers in the worst situations throughout history.


