© 2020, Vandenhoeck & Ruprecht GmbH & Co. KG, Göttingen
ISBN Print: 9783847111658 – ISBN E-Lib: 9783737011655
eryone in every household”.'9 In the hands of authoritarian governments these powerful technologies amplify existing threats to freedom and equality. However, the ethical threats these technologies pose transcend their potentially immoral use by authoritarian governments. Rather, the extent of data collected and the facility to model and triangulate, together with their predictive capacity, also pose significant risks to civil and political liberties in democratic societies, and in particular to marginalised individuals and groups. As noted by the Berkman Klein Center in its assessment of 'Artificial Intelligence and Human Rights', 'in relation to the rights to equality, free expression, association, assembly and work (…) the impact of AI (…) has been more negative than positive to date.'10
The predictive capacity of deep learning algorithms raises further ethical challenges. Predictive technologies are already extensively deployed in financial, health and criminal justice systems, and assessments of their equity relative to traditional human-based systems vary. For example, the Berkman Klein Center report notes a number of arenas where the deployment of AI has had positive outcomes. Regarding access to the financial system, it has found that 'compared to the status quo credit scoring algorithms, the introduction of AI into the lending process is likely to have an overall positive impact on the ability of low-income borrowers to access credit'. This is because the wide variety of data sources used seems to 'improve the ability of well-qualified individuals from marginalised communities to access credit.'11 However, determining the ethical and human rights impacts of the predictive dimension of AI technologies is complex, in part because these technologies are being implemented in existing social institutions that already discriminate to a greater or lesser extent. A further complexity relates to the nature, quality and extent of the data on which the predictive capacity of AI depends. In Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, Cathy O'Neil argues that predictive analytics based on algorithms tend to punish the poor.12 She uses algorithmic hiring practices as one example, but her analysis extends to a range of social and economic contexts, arguing that in most cases, on account of the partial, discriminatory and decontextualized nature of the data on which AI is trained, or trains itself, it is inevitable that machine learning accentuates existing biases and exclusions. Moreover, she points out that as AI is embedded further
9 Zak Doffman, China Is Using Facial Recognition To Track Ethnic Minorities, Even In Beijing, in: Forbes, May 3, 2019.
10 Filippo Raso / Hannah Hilligoss / Vivek Krishnamurthy / Christopher Bavitz / Levin Kim, Artificial Intelligence and Human Rights. Opportunities and Risks, Cambridge MA: Berkman Klein Center 2018.
11 Ibid., p. 31.
12 Cathy O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, USA 2016.
Designing an Ethical Future: Can Artificial Intelligence Help?
Open-access publication under the terms of the CC BY 4.0 licence
Title: Menschenrechte und Gerechtigkeit als bleibende Aufgaben
Subtitle: Beiträge aus Religion, Theologie, Ethik, Recht und Wirtschaft
Authors: Irene Klissenbauer / Franz Gassner / Petra Steinmair-Pösel
Editor: Peter G. Kirchschläger
Publisher: Vandenhoeck & Ruprecht GmbH & Co. KG
Location: Vienna
Date: 2020
Language: German
License: CC BY 4.0
ISBN: 978-3-7370-1165-5
Size: 15.5 × 23.2 cm
Pages: 722
Category: Recht und Politik (Law and Politics)