Page - 187 - in Pflegeroboter

10 Implementing Responsible Research …

… and risk management. Physical risks posed by robots are well documented, and standards exist to help industry and developers to deal with these. BS 8611 therefore cites as "normative references" (section 2 of the standard) a set of existing standards in this area: BS EN ISO 12100:2010, Safety of machinery—General principles for design—Risk assessment and risk reduction (ISO 12100:2010); BS ISO 8373, Robots and robotic devices—Vocabulary; BS ISO 31000, Risk management—Principles and guidelines.

Unlike the existing body of standardisation, the focus of BS 8611 is on "ethical harm". This is defined in section 3 (terms and definitions) of the standard as "anything likely to compromise psychological and/or societal and environmental well-being". An explanatory note elaborates: "Examples of ethical harm include stress, embarrassment, anxiety, addiction, discomfort, deception, humiliation, being disregarded. This might be experienced in relation to a person's gender, race, religion, age, disability, poverty or many other factors." Ethical hazards are defined as sources of ethical harm, and ethical risks as the "probability of ethical harm occurring from the frequency and severity of exposure to a hazard". It is interesting to note that RRI is also defined in the document, as the "process that seeks to promote creativity and opportunities for science and innovation that are socially desirable and undertaken in the public interest".

The subsequent substantive sections of BS 8611 follow the order of ethical risk assessment (section 4), ethical guidelines and measures (section 5), ethics-related system design recommendations (section 6), verification and validation (section 7) and information for use (section 8). The ethical risk assessment in section 4 starts with a table that lists ethical issues, ethical hazards and ethical risks.
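The standard's definition of ethical risk, as the probability of ethical harm arising from the frequency and severity of exposure to a hazard, can be sketched as a simple scoring function. BS 8611 does not prescribe numeric scales; the 1–5 ordinal scales and the multiplicative combination below are illustrative assumptions only:

```python
# Illustrative sketch only: BS 8611 defines ethical risk qualitatively;
# the 1-5 scales and the product used here are our own assumptions.

def ethical_risk_score(frequency: int, severity: int) -> int:
    """Combine frequency and severity of exposure to a hazard
    into a single ordinal risk score (higher = riskier)."""
    for value in (frequency, severity):
        if not 1 <= value <= 5:
            raise ValueError("frequency and severity must be on a 1-5 scale")
    return frequency * severity

# e.g. a rarely occurring but severe deception hazard
score = ethical_risk_score(frequency=2, severity=5)
print(score)  # 10
```

A multiplicative frequency-times-severity matrix is a common convention in conventional (physical) risk assessment, which is why it is borrowed here; a real BS 8611 assessment would record the judgement qualitatively in the table described below.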
For each of these there is a suggested mitigation, space for comments and a validation mechanism. The ethical issues are the top-level concerns. They are broken down into societal, application, commercial/financial and environmental issues. The largest group is that of societal issues, which includes loss of trust, intentional or unintentional deception, anthropomorphisation, privacy and confidentiality, lack of respect for cultural diversity and pluralism, robot addiction, and employment. The application issues listed are misuse, unsuitable divergent use, dehumanisation of humans in the relationship with robots, inappropriate "trust" of a human by a robot, and self-learning systems exceeding their remit. Under commercial/financial issues the BS document lists the appropriation of legal responsibility and authority, employment issues, equality of access, learning by robots that have some degree of behavioural autonomy, and informed consent. The environmental issues section, finally, lists the hazards of environmental awareness (robots and appliances) and environmental awareness (operations and applications). The ethical risk column spells out how these hazards translate into ethical risks. In each case this is followed by a suggested mitigation, comments and validation mechanisms.
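The row structure of the section 4 table, an ethical issue in one of four categories, its hazard and risk, plus mitigation, comments and validation, can be modelled as a small record type. The category and field names follow the description above, but the class itself is a hypothetical sketch, not anything defined by the standard:

```python
from dataclasses import dataclass

# Top-level categories of ethical issues named in BS 8611.
CATEGORIES = ("societal", "application", "commercial/financial", "environmental")

@dataclass
class EthicalRiskEntry:
    """One row of a BS 8611-style ethical risk assessment table (sketch)."""
    issue: str            # top-level ethical issue, e.g. "loss of trust"
    category: str         # one of CATEGORIES
    hazard: str           # source of potential ethical harm
    risk: str             # how the hazard translates into ethical risk
    mitigation: str = ""  # suggested mitigation
    comments: str = ""    # free-form notes
    validation: str = ""  # how the mitigation is verified/validated

    def __post_init__(self) -> None:
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")

# Hypothetical example row for a care-robot assessment
entry = EthicalRiskEntry(
    issue="anthropomorphisation",
    category="societal",
    hazard="users attribute human qualities to a care robot",
    risk="misplaced trust and emotional dependence",
    mitigation="make the machine nature of the robot clear in design and documentation",
    validation="user testing",
)
print(entry.category)  # societal
```

Keeping mitigation, comments and validation as optional fields mirrors the table's layout, where the assessor fills these in after the issue, hazard and risk have been identified.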
Pflegeroboter
Title: Pflegeroboter
Author: Oliver Bendel
Publisher: Springer Gabler
Date: 2018
Language: German
License: CC BY 4.0
ISBN: 978-3-658-22698-5
Size: 17.3 x 24.6 cm
Pages: 278
Category: Technik