A robot to care for the elderly is a very real prospect.
With advancements in technology, and the possible connectivity of everyday items or ‘The Internet of Things’, the idea of a robotic companion to assist the elderly is not as futuristic as it first appears.
The technology is here, and now an approved set of ethics is too.
Professor Tom Sorell, Professor of Politics and Philosophy at the University of Warwick and a member of ACCOMPANY – the project behind the robot carer ‘Care-O-bot 3’ – has defined six values that will set the ethical tone of the relationship between robot and human.
The Six Values
- Autonomy – being able to set goals in life and choose means;
- Independence – being able to implement one’s goals without the permission, assistance or material resources of others;
- Enablement – having, or having access to, the means of realizing goals and choices;
- Safety – being able readily to avoid pain or harm;
- Privacy – being able to pursue and realize one’s goals and implement one’s choices unobserved;
- Social Connectedness – having regular contact with friends and loved ones and safe access to strangers one can choose to meet.
The values have been designed with elderly people in mind and are intended to be part and parcel of every engineering decision in the software and hardware of care bots.
With pilot care-bot projects already in place, rules are needed to define the ethics of the relationship between the care bot and the older person.
The biggest concern, according to Professor Sorell, is that robot carers should be used for the benefit of the elderly person, rather than for the benefit of their carers. He comments: “there are moral reasons why autonomy should be promoted before the alleviation of burdens on carers”.
Autonomy should always win, he argues: “Older people deserve to have the same choices as other adults, on pain otherwise of having an arbitrarily worse moral status. And where the six values conflict, there is reason for autonomy to be treated as overriding”.
Who is the boss of the bot?
There may be exceptions to the primacy of autonomy: “Exceptions might be where older people lack ‘capacity’ in the legal sense (in which case they would not be autonomous), where they are highly dependent, or where leading life in one’s own way is highly likely to lead to the need for rescue” (Professor Tom Sorell).
A bot with values offers the elderly person more than a bot with monitoring capacity alone. Care-bots designed around the six values are “better than robots designed merely to monitor the vital signs and warn of risks and dangers”, Professor Sorell argues.
Sneak bots are not welcome either. The ethics put in place are intended to prevent care-robots from acting as a ‘snitch’ on behalf of worried relatives: “Robots designed to let the user control information about their own routines and activities (including mishaps) are also to be preferred to those engaged in data-sharing with worried relations or health care workers” (Professor Tom Sorell).
One aspect of the bot is to provide the elderly person with a ‘presence’, a sad nod to the anticipated future loneliness of the elderly –
‘Being there—in the minimal sense of being co-located with a person—is open to a mop, a broom, or a newspaper. What is meant by ‘presence’ is the kind of co-location of a thing with a person that brings it about that the person no longer feels alone.’
(excerpt from ‘Robot Carers, Ethics, and Older People’).