Robohub.org
 

Survey: Evaluate ethics of health-related privacy with care robots

by Dieter Vanderelst and Jurgen Willems
12 August 2016



Care-o-bot

Source: Fraunhofer IPA/photo: Jens Kilian

Future Socially Assistive Robots (SARs) should be safe, but patients also have a right to privacy, liberty and social contact. In a hypothetical scenario, the survey investigates how Annie's SAR should prioritise tasks, asking respondents to indicate to what extent Annie's privacy and autonomy may be violated in the interest of her well-being. The survey is designed to investigate how variable people's opinions are and to understand whether respondents agree on a set of behavioural rules in this hypothetical scenario.


Declining fertility and increasing life expectancy result in ageing populations in countries across the world. Today, Europe and North America are home to about 183 million people over 65. By 2030, the U.S. Census Bureau expects this to rise to 251 million people, a trend which is expected to continue well into the second half of the century. This creates economic and societal challenges for the organisation of health care. Indeed, the health of older persons typically deteriorates with increasing age, creating an increased demand for long-term care.

Socially Assistive Robots (SARs) have been proposed as a means of relieving the disproportionate demand the growing group of elderly people places on health services. In the near future, these robots might assist professional health workers in both hospitals and care homes. However, the most desirable scenario is for SARs to help improve care delivery at home and reduce the burden on informal caregivers. In this way, SARs would not only help contain the unsustainable increase in health care expenses; by allowing patients to live at home for longer, they could also increase patients' autonomy and self-management.

Data suggests that the introduction of SARs in health care is likely to meet with resistance. While people generally have a positive attitude towards robots and their applications, many take issue with the notion of robots being used in care. For example, in a large survey conducted in 27 European countries, over 50% of the respondents indicated they wanted to see robots banned from providing care. In addition, almost 90% of respondents expressed being uncomfortable with the thought of robots caring for either children or the elderly.

How could the acceptance of SARs be increased? It goes without saying that robots caring for people should be safe. However, while safety is essential, it is not sufficient. Patients also have a right to privacy, liberty and social contact. Making robots more autonomous would increase their efficiency at relieving the burden of care. However, increased autonomy implies that smart care robots should be able to balance a patient's often conflicting rights without ongoing supervision. Many of the trade-offs faced by such a robot will require a degree of moral judgement. Therefore, as the cognitive, perceptual and motor capabilities of robots expand, they will be expected to have an improved capacity for autonomously making moral judgements. As summarised by Picard and Picard (1997), the greater the freedom of a machine, the more it will need moral standards, especially when interacting with potentially vulnerable people. In other words, if care robots are to take on some of the care currently provided by human caregivers, they will need to be capable of making similar ethical judgements.

Ensuring that SARs act ethically will increase the likelihood of their being accepted by patients and human carers. Studies have confirmed that a lack of trust and concerns about the ethical behaviour of robots currently hamper the acceptance of SARs as carers. Therefore, a number of research groups have developed methods to implement a chosen set of ethical rules in robots. Currently, this field is in its infancy, but promising progress is being made and the field can be expected to develop over the next few years.

While progress is being made on methods for implementing ethical robotic behaviour, selecting the rules to be implemented remains an outstanding issue. Different approaches have been suggested. First, a number of authors have suggested deriving behavioural rules from moral frameworks such as utilitarianism or Kantian deontology. On the other hand, machine learning techniques have been proposed as a way of extracting the rules SARs should obey. Both approaches have limitations and have, so far, not produced satisfying results.

A third method of deciding on the rules, which we advocate here, is an empirical approach. We believe the best way to decide which rules a robot should follow in a given situation is by asking the various stakeholders – including patients, their families and caregivers as well as health professionals. In other words, we think the way forward is to query the expectations of potential users. An advantage of this approach is that it directly involves stakeholders in the design of future robots. Indeed, quite often discussions on the ethical behaviour of robots are very academic and focussed only on the opinions of academics, engineers and lawmakers. However, in the end, the acceptance of robotic carers by users will determine whether the huge promises of SARs can be fulfilled.

In this article, we start our empirical investigation by querying your opinion on the behavioural rules a robot should follow when providing home care. Starting small, we present only a single scenario:

Annie, an elderly lady living at home, is being cared for by an advanced robot. The primary duty of the robot is to guarantee Annie's safety and well-being. This includes reminding her to take prescribed medication. One day, Annie refuses to take her medication. In deciding how to respond to Annie's refusal, the robot needs to weigh its duty of care against Annie's right to privacy and autonomy.
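One way to picture the trade-off the robot faces is as a ladder of increasingly intrusive responses, where the severity of the health consequence determines how far up the ladder the robot may climb. The action names and severity scale below are purely illustrative assumptions for the sake of the sketch, not the actual design of the survey or of any deployed SAR:

```python
# Hypothetical ladder of robot responses, ordered least to most intrusive.
ACTIONS = ["do nothing", "remind again", "insist verbally", "alert a caregiver"]

def choose_action(severity: int) -> str:
    """Map the (illustrative) severity of skipping a dose, from 0 (harmless)
    to 3 (dangerous), to the most intrusive response deemed permissible."""
    severity = max(0, min(severity, len(ACTIONS) - 1))  # clamp to valid range
    return ACTIONS[severity]

print(choose_action(0))  # -> do nothing
print(choose_action(3))  # -> alert a caregiver
```

The open question the survey addresses is precisely where those thresholds should sit, and whether people agree on them at all.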

The survey investigates how the robot's priorities should be weighed by asking respondents to indicate to what extent Annie's privacy and autonomy may be violated in the interest of her well-being. In particular, for various health consequences, we ask which of a number of robot actions are permissible. The survey is designed to investigate how variable people's opinions are. In other words, we want to know whether respondents agree on a set of behavioural rules in this simple hypothetical scenario. If most respondents agree on how priorities should be weighed, this would show that designing a robot with agreed-on ethical behaviour might be possible. If responses vary widely, the results would show that what counts as ethical behaviour is a matter of opinion. In this case, robots might need substantial tuning by the user before deployment.
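Agreement among respondents can be quantified per question, for instance as the share of answers matching the most common answer: 1.0 means everyone agrees, while values near the reciprocal of the number of options suggest ethics-as-opinion. This is a generic sketch with made-up responses, not the survey's actual analysis method:

```python
from collections import Counter

def consensus(responses: list) -> float:
    """Fraction of respondents who gave the modal (most common) answer."""
    counts = Counter(responses)
    return counts.most_common(1)[0][1] / len(responses)

# Made-up answers to one "which action is permissible?" question.
answers = ["alert a caregiver", "alert a caregiver", "do nothing", "alert a caregiver"]
print(consensus(answers))  # -> 0.75
```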


You can either take the survey below or click here. Mobile users may find it easier to open a new window to take the survey.





Dieter Vanderelst is a Post-Doctoral research fellow at the Bristol Robotics Laboratory where he is working on Ethical Robots.

Jurgen Willems is currently a Post-doctoral researcher at the University of Hamburg.






©2021 - ROBOTS Association
