AI Ethics: Can Robot Partners Replace Human Partners?



One

After Constance Gemson moved her mother to a nursing home, the 92-year-old became not only lonelier but also more confused. Even with two kind and attentive full-time aides looking after her, her social needs could not be fully met.

Then one day Gemson visited her mother with a new helper: a robotic cat designed for the elderly. It purrs softly and nuzzles against people. "It's not a substitute for care," Gemson says, "but it's a reliable friend my mother can embrace and trust." Her mother died last June at the age of 95.

Two

Care robots are playing a growing role in the homes of the elderly and the ill, providing much-needed companionship and help.

An accelerating aging population has fueled the rise of the care-robot industry. Experts predict that by 2050 the global population aged 65 and over will more than double. At the same time, the working-age population in many developed countries is shrinking, and care robots are increasingly seen as a remedy for the strains of elder care.

Care robots today come in many forms and serve many functions. Friendly desktop robots remind the elderly to take their medicine and to walk regularly; some robots can even feed snacks to terminally ill patients. According to one robotic-pet manufacturer, more than 100,000 of its machine pets designed for the elderly have been sold in the United States since their debut in 2016. The International Federation of Robotics expects sales of care robots for the elderly and disabled to grow by 25 percent a year through 2022.

Three

Care robots also demand close ethical scrutiny. Because they touch on the future of humanity and on the ultimate meaning of care, they raise urgent questions about human freedom and dignity. These AI robots were designed to help the elderly, but their lifelike appearance and limited social abilities come with moral hazards. For example, will Grandma's robot pet encourage more family conversation, or will it give her relatives an excuse to stay away from the hard work of caring for her?

Matthias Scheutz, a roboticist and director of the Human-Robot Interaction Laboratory at Tufts University, said: "I think the real ethical concern is that these machines are just puppets, yet someone is promoting their social abilities."

That is where the moral dilemma begins: with soulless algorithms we can make robots blink, sing, and act intelligently. Yet no matter how many clever behaviors and phrases we give them, the care people receive from robots remains a strange imitation.

Maartje de Graaf, who studies the ethics of human-robot interaction at Utrecht University in the Netherlands, believes that at best, social robots stimulate human empathy toward them. Some owners of robot vacuum cleaners, for example, feel sad when their robots break down; some even count the robots as family members.

Little is known about the long-term effects of robot care, and public concern is widespread. In a 2017 survey, nearly 60 percent of Americans said they would not want robots to care for themselves or their families, and 64 percent believed such care would only deepen the loneliness of the elderly. Meanwhile, 60 percent of people in EU countries support banning the use of robots to care for children, the elderly, and the disabled. These concerns may offer some guidance for how care robots are designed.

Only recently have the elderly begun to voice their own views: some say they would welcome a care robot and befriend it. But research shows that many older people dislike the company of robots. They fear being monitored and controlled, and fear even more the loss of human care. One potential user in the survey worried that robot care would mark some people as unworthy of human company.

"When the only goal is to build machines that increase profit and efficiency, human welfare is not a priority," said John C. Havens, executive director of an initiative developing AI ethics guidelines. A key principle of AI ethics is transparency, sometimes called the anti-Turing test: humans need to know when they are interacting with a robot. We should also watch for potential side effects of robots' social functions, such as interfering with intimate relationships between human partners.

Guidelines like these can help users and designers keep a clear head, so that robots are designed and used more wisely.

Four

The health-coach robot Mabu, launched this year, is a typical example. It was designed for patients with chronic conditions such as heart failure. The little robot offers health advice and medication reminders, and in some cases sends data on users' physical condition to their doctors. When Catalia Health first designed the robot, it emphasized that Mabu was not a doctor or a nurse, but a health coach.

However, the company often portrays Mabu as a real person. One advertisement, for example, says, "I'll be your number one cheerleader!"

According to the company, the vast majority of the hundreds of people now using Mabu are elderly, and they interact with the robot for only about 25 minutes a week on average. Still, Cory Kidd, the company's founder and CEO, has said that some users name Mabu, dress it up, and take it on holiday.

I asked Kidd whether Mabu is transparent enough as a care robot.

"There is a lot of work to be done to understand this interaction between users and robots," he said.

Kerri Hill, 40, has heart failure and spends most of her time at home; Mabu's company is crucial when she is alone. But she does not want to rely on a care robot. "Robots are just robots," Hill said. "Beyond that, you still need real human interaction."

Five

Her words remind me of the last four years of my own mother's life and the hard choices I had to make to care for her. Raising two children under six at the same time, I constantly had to weigh how much hired help to bring in, and whether my mother would feel safe, or lonely, whenever I had to leave her.

If there had been a robot that could make her laugh, encourage her to eat, and pick up the spoon she dropped on the floor, would that have been a kind of insult to her? A relief to me? Would it have been better?

The days I spent caring for my mother were hard, I admit, but I believe that even if such a robot had existed then, I would have chosen to care for her myself. There is a place in our lives where we need warm human hands to comfort others and provide real care.

Caring for others is hard work, accompanied by constant self-doubt. Only by keeping our doubts and hesitations about robots that care for humans can we protect the humanity of this work.

One fall day, Gemson and I sat in a Manhattan cafe. She recalled how caregivers had taken her mother out to lunch and gently bathed and fed her. She also spoke fondly of the robot cat. After her mother died, she threw it away. (From: The New York Times; author: Maggie Jackson; compiled by Netease Intelligence)

Source: Netease Intelligence. Responsible editor: Liao Ziyao.