Social robots for people caring – People feel emotions such as happiness, anger, sadness, or fear; machines are getting smarter and smarter, but will they ever be capable of feeling emotions as humans do?

Pepper, the humanoid robot developed by the Japanese company SoftBank and launched on the market in 2016, is able to recognize and interpret human emotions by analyzing facial expressions and tone of voice. It is currently employed as a receptionist in some companies in the United Kingdom and Japan, as well as in schools and universities for research on human-robot interaction.

A few years ago, robots were still perceived as machines built to replace human workers; today, research aims to deliver “social robots” designed to assist people, with roles such as tutoring, child and elderly care, and virtual coaching.

In the RPA field, for instance, Crafter.ai is working on the development of conversational chatbots that will vocally assist elderly people in physical rehabilitation programs, within the Re-hub-ility project, financed by Regione Lombardia, in cooperation with Istituto Maugeri and a pool of innovative local companies.

social robots for people caring: personal robots

Social robots for people caring – Mabu is a smiling little yellow stationary robot, developed by Catalia Health, that works as a “smart home companion”: it reminds users to take their medicines, acting as a sort of “doctor extension”, and can even suggest low-sodium diet options, recommend calling a doctor, or make small talk.

To those who argue that the same function could be performed by a tablet, Cory Kidd, Catalia Health's CEO, answers that Mabu's anthropomorphic interface helps establish a more stable relationship with patients.

Milo, developed by RoboKind, assists children diagnosed with autism. The robot can display different facial expressions and interact verbally while showing symbols on its chest to help children better understand what is being said.

Milo's ability to repeat something again and again in the same tone is particularly suited to helping children with ASD, but also children with Down syndrome and other emotional or learning disorders.

Affectiva, the first company to market “artificial emotional intelligence”, developed algorithms trained on more than 6 million facial expressions from 87 different countries, able to detect emotions such as anger, contempt, disgust, fear, joy, sadness, and surprise.

Personal robotics researchers from the MIT Media Lab integrated Affectiva's emotion set into Tega, a robot developed to interact with children and support their learning: it understands and replies on the basis of children's facial expressions, while reinforcing its own learning algorithm in parallel.

Tega – Social assistive robot
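As a rough illustration of the feedback loop just described for Tega, here is a minimal Python sketch in which a detected facial expression is turned into a reward signal that updates the robot's choice of learning activities. The emotion labels follow Affectiva's seven categories, but the detector stub, the reward values, and the bandit-style update are our own illustrative assumptions, not the actual Tega system.

```python
import random

# Illustrative affect-aware learning loop: the robot picks an activity,
# "reads" the child's facial expression, converts it to a reward, and
# updates its estimate of how well each activity works. The detector
# below is a random stub standing in for a real classifier.

EMOTIONS = ["anger", "contempt", "disgust", "fear", "joy", "sadness", "surprise"]
REWARD = {"joy": 1.0, "surprise": 0.5, "anger": -1.0, "contempt": -0.5,
          "disgust": -0.5, "fear": -0.5, "sadness": -0.5}

def detect_emotion() -> str:
    return random.choice(EMOTIONS)  # stub: replace with a real detector

activities = {"story": 0.0, "song": 0.0, "quiz": 0.0}  # estimated value per activity
counts = {a: 0 for a in activities}

for step in range(100):
    # epsilon-greedy: mostly exploit the best-rated activity, sometimes explore
    if random.random() < 0.1:
        choice = random.choice(list(activities))
    else:
        choice = max(activities, key=activities.get)
    reward = REWARD[detect_emotion()]
    counts[choice] += 1
    # incremental mean update of the chosen activity's estimated value
    activities[choice] += (reward - activities[choice]) / counts[choice]

print(activities)
```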

robotics and emotions

The progress of robotics research in the field of emotional intelligence raises an ethical question: ethics is not only a matter of rationality, but also of empathy, the capability that allows us to understand how other people are feeling and to care for them.

To act ethically, a robot should not only be able to recognize emotions, but also to “feel” them.

There are three main theories of emotion, based on appraisal, physiology, and social construction.

According to the cognitive appraisal theory, emotions are judgements about the relevance of a given situation or change to a person's life; according to the physiological theory, emotions are tied to physical changes such as heartbeat, breathing rate, and the levels of hormones such as cortisol; and according to the social construction theory, emotions depend on language and cultural context, which shape how people perceive and react to situations or changes.

In the human brain these three theories are complementary, but only the first can currently be applied in robotics.
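To give an idea of how an appraisal-based model might look in code, here is a minimal Python sketch: an event is appraised along a few dimensions and mapped to a coarse emotion label. The dimensions, thresholds, and labels are simplified assumptions of ours, not the model used by any of the robots mentioned in this article.

```python
# Minimal sketch of a cognitive-appraisal emotion model (illustrative only).
# The appraisal dimensions and the mapping rules are simplified assumptions.

from dataclasses import dataclass

@dataclass
class Appraisal:
    relevance: float    # how much the event matters to the agent's goals (0..1)
    congruence: float   # -1 = blocks goals, +1 = furthers goals
    certainty: float    # how predictable the event was (0..1)

def appraise(event: Appraisal) -> str:
    """Map an appraised event to a coarse emotion label."""
    if event.relevance < 0.2:
        return "indifference"        # irrelevant events trigger no emotion
    if event.congruence > 0:
        return "joy"                 # relevant and goal-congruent
    # goal-incongruent events: certainty separates fear (unpredictable)
    # from sadness (a predictable loss)
    return "fear" if event.certainty < 0.5 else "sadness"

if __name__ == "__main__":
    # A relevant, unexpected, goal-blocking event -> fear
    print(appraise(Appraisal(relevance=0.9, congruence=-1.0, certainty=0.2)))
```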

Robots are already being built with neuromorphic chips, computer chips that mimic the brain by implementing millions of artificial neurons. So robots might one day approximate human emotions through a combination of appraisals, rough physiological approximations, and linguistic and cultural sophistication.
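To show the kind of computation these chips perform, here is a minimal sketch of a leaky integrate-and-fire neuron, the basic spiking unit that many neuromorphic chips emulate in hardware; the parameter values are arbitrary, chosen only for illustration.

```python
def simulate_lif(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: integrate input current over time,
    emit a spike (1) when the membrane potential crosses the threshold,
    then reset the potential."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = leak * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A constant weak input produces a periodic spike train.
print(simulate_lif([0.3] * 15))
```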

can robots' capacity to recognize human emotions bring us to a new level of man-machine interaction?

By 2050, 1.6 billion people in the world will be over 65 years of age, and artificial intelligence will be one form of support for the elderly.

ENRICHME, short for “ENabling Robot and assisted living environment for Independent Care and Health Monitoring of the Elderly”, is developing robots that assist elderly people with physical exercise or with remembering where they put objects. ENRICHME robots were tested in retirement homes in three European countries, and early results showed that users accepted the robot, which helped them stay more cognitively and physically active.

Welcome, then, social robots that care for elderly people; the hope is that we are not teaching robots to recognize emotions because we are too absorbed in our own business to care for others…

Sources:

https://www.psychologytoday.com/us/blog/hot-thought/201712/will-robots-ever-have-emotions

https://www.pbs.org/wgbh/nova/article/robots-emotional-intelligence/

https://www.affectiva.com/success-story/mit-media-lab/
