188 B. C. Stahl
10.4.2 Ethical Risk Assessment and Management in BS 8611
The practice and implementation of ethical risk management for robots is described in
the text following the overview table. Section 4.2 then spells out how ethical hazard
identification should be put into practice. This starts with a reminder that the hazards need to be
reviewed with the people and animals potentially affected by them. It is also pointed out
that new developments in robotics may lead to new ethical hazards and risks.
The next steps of the risk management process are then discussed and related to
existing risk management approaches described in other standards. It is suggested
that ethical hazards can be treated in a similar way to ergonomic hazards. In line with BS
EN ISO 14971 it is made clear that ethical hazards and risks for medical technologies
will always have to be balanced with the benefits for users or beneficiaries.
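The balance between ethical risk and user benefit described above can be made concrete with a small sketch. This is purely illustrative and not taken from BS 8611 or BS EN ISO 14971; the class name, the 1-5 severity scale and the numeric comparison are all assumptions made for the example.

```python
# Illustrative sketch (assumed model, not the standard's method): a hazard
# is tolerable only if the expected benefit to users or beneficiaries
# outweighs its residual risk (severity weighted by probability).

from dataclasses import dataclass


@dataclass
class EthicalHazard:
    name: str
    severity: int       # assumed scale: 1 (negligible) .. 5 (catastrophic)
    probability: float  # likelihood of occurrence, 0.0 .. 1.0

    @property
    def risk(self) -> float:
        # Crude risk estimate: severity weighted by probability.
        return self.severity * self.probability


def benefit_outweighs_risk(hazard: EthicalHazard, benefit: float) -> bool:
    """Risk-benefit balance in the spirit of BS EN ISO 14971."""
    return benefit > hazard.risk
```

For example, a privacy hazard with severity 3 and probability 0.2 yields a risk of 0.6, which a benefit estimate of 1.0 would outweigh but one of 0.5 would not. Real assessments would of course use the standard's own qualitative categories rather than a single scalar.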
Ethical risk assessment, as described in Section 4.3, should be undertaken with regard
to various human-robot interaction scenarios. This includes unauthorised use, reasonably
foreseeable misuse, the uncertainty of situations to be dealt with, psychological effects of
failure in the control system, possible reconfiguration of the system and ethical hazards
associated with specific robot applications. As a rule of thumb, it is suggested that the
risk of a robot performing an operation should not be higher than the risk of a human
performing the same operation.
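The scenario list and the rule of thumb above can be sketched together as a simple per-scenario check. The scenario labels are paraphrased from the text; the dictionary-based risk estimates and the function name are assumptions for illustration only.

```python
# Minimal sketch (assumed structure, not the standard's procedure):
# iterate the human-robot interaction scenarios named in Section 4.3 and
# apply the rule of thumb that the robot's risk for an operation should
# not exceed the risk of a human performing the same operation.

SCENARIOS = [
    "unauthorised use",
    "reasonably foreseeable misuse",
    "uncertain situations",
    "control system failure (psychological effects)",
    "system reconfiguration",
    "application-specific hazards",
]


def assess(robot_risk: dict, human_risk: dict) -> dict:
    """Per scenario: is the robot's estimated risk acceptable?

    Missing entries default to 0.0, i.e. no identified risk.
    """
    return {
        s: robot_risk.get(s, 0.0) <= human_risk.get(s, 0.0)
        for s in SCENARIOS
    }
```

A scenario where the robot's estimated risk exceeds the human baseline (e.g. 0.4 versus 0.3 for unauthorised use) would fail the check, flagging it for mitigation before deployment.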
The final subsection of Section 4 focuses on learning robots. It offers a categorisation
of learning based on the robot's degree of autonomy, covering three stages:
environmental, performance enhancement and strategy. Learning robots, it is pointed
out, raise additional ethical hazards because they can perform differently from otherwise
identical robots.
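The three learning stages form an ordered categorisation, which can be modelled as an enumeration. The ordering by autonomy reflects the text above; the numeric values and type name are assumptions for this sketch.

```python
# Hedged sketch: the three learning stages of the standard's
# categorisation, modelled as an enum ordered by increasing autonomy.

from enum import IntEnum


class LearningStage(IntEnum):
    ENVIRONMENTAL = 1             # learning about the environment
    PERFORMANCE_ENHANCEMENT = 2   # improving how tasks are performed
    STRATEGY = 3                  # learning new strategies autonomously

# IntEnum makes the stages comparable, so a risk assessment can scale
# its scrutiny with the robot's degree of autonomy.
```

An `IntEnum` is used here so that stages can be compared directly (`LearningStage.STRATEGY > LearningStage.ENVIRONMENTAL`), matching the idea that higher-autonomy learning warrants closer ethical scrutiny.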
Section 5 of the standard covers ethical guidelines and measures. Starting with general
societal ethical guidelines, the document lists a number of norms that need
to be taken into account when designing and building robots. These include that robots
should not be designed solely or primarily to kill or harm humans, that humans and not
robots are responsible agents, and that robots should be secure and not deceptive. The section
refers to other norms, such as the precautionary principle and privacy by design.
Roboticists are recommended to work responsibly by engaging with the public, addressing
public concerns, demonstrating commitment to best practice, working with experts from other
disciplines and the media, and providing clear instructions. The document provides a list
of groups that can help with engagement with various stakeholders. This is followed by
a number of sections that spell out in more detail some of the high level ethical topics of
concern, including privacy and confidentiality, respect for human dignity, human rights,
cultural diversity and pluralism, dehumanisation of humans, legal issues, the balancing
of risks and benefits, individual, organisational and social responsibility, informed
consent, informed command, robot addiction and dependence on robots, anthropomorphisation
of robots, and robots and employment. Section 5 concludes by pointing to a number
of application areas and specific issues these may raise. The areas include rehabilitation,
medical use, military use, commercial and financial guidelines and a reference to
environmental and sustainability issues.