Page 45 in Austrian Law Journal, Volume 1/2019

ALJ 2019, Digital Single Market – Towards Smart Regulations, 45

trained or how single criteria are weighted.53 Art. 22 thus cannot guarantee full transparency of the machine-learning algorithmic system behind an automated decision.54 Since algorithmic decisions generally represent only a special type of data processing,55 Art. 35 GDPR, which governs the data protection impact assessment, applies as well. Accordingly, whenever the processing is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall carry out an assessment of the impact of the envisaged algorithmic processing operations. Such a high risk exists if a systematic and extensive evaluation of personal aspects relating to natural persons is carried out based on automated processing (including profiling), which may then result in decisions that produce legal effects concerning the natural person or similarly significantly affect him or her.56

Based on the above elaboration, it can be assumed that Art. 22 GDPR, in connection with the information obligations of Arts. 13 and 14 GDPR, already covers a large part of algorithmic decision-making. These provisions ensure the possibility of a human review or – potentially even more desirable – that the final decision is made by a human being. Regulation of algorithmic decision-making via data protection law is a valuable first step, but does not, as we have demonstrated, cover all relevant aspects thereof. As a consequence, it seems necessary to assess on a case-by-case basis whether the current regulatory framework is sufficient to meet other challenges of algorithmic decision-making or whether new regulation is needed. We will now turn to Art. 101 TFEU as an example of the first category.

2. Art. 101 para. 1 TFEU as an example of an implicit regulation on the use of algorithms

While the provision does not explicitly address the topic, several aspects of the use of machine-learning algorithms have already been discussed in the light of Art. 101 para. 1 TFEU.57 For example, the use by online platforms of machine-learning algorithms for pricing ("digital pricing"58

---

53 Cf. Ernst, supra note 39, at 1033. For instance, a model description is also available for the above-mentioned example of the Austrian AMS. Regardless of the ambition to defeat human prejudices with the help of the new algorithms and thus to guarantee objective and well-founded decisions, the documentation of the methodological considerations reveals above all disadvantages for structurally disadvantaged groups. Cf. Holl, Kernbeiß & Wagner-Pinter, Das AMS-Arbeitsmarktchancen-Modell: Dokumentation zur Methode (2018), Konzeptunterlage der Synthesisforschung Gesellschaft m.b.H., available at http://www.forschungsnetzwerk.at/downloadpub/arbeitsmarktchancen_methode_%20dokumentation.pdf.

54 Cf. the basic principle of transparency outlined in Art. 5 para. 1 (a) GDPR. Initiatives such as Algo-Aware (cf. COM, algoaware, available at https://www.algoaware.eu/), a platform procured by the European Commission that provides information on the opportunities and challenges of algorithmic decision-making in commercial, cultural and civil society settings, cannot replace the necessity of transparency in individual cases (which has to take the specific circumstances of the individual into account).

55 Cf. Art. 4 No. 2 GDPR.

56 Art. 35 para. 3 (a) GDPR.

57 This has also been stressed by Commissioner for Competition Margrethe Vestager. Cf. Vestager, Speech at the Bundeskartellamt's 18th Conference on Competition, Berlin, Mar. 16, 2017, available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/bundeskartellamt-18th-conference-competition-berlin-16-march-2017_en. In this speech, Vestager said that "we need to make it very clear that companies can't escape responsibility for collusion by hiding behind a computer program".

58 On "pricing algorithms" cf. OECD, Algorithms and Collusion – Background Note by the Secretariat (DAF/COMP, 2017), 4 para. 26.
Title: Austrian Law Journal
Volume: 1/2019
Author: Karl-Franzens-Universität Graz
Editors: Brigitta Lurger, Elisabeth Staudegger, Stefan Storr
Location: Graz
Date: 2019
Language: German
License: CC BY 4.0
Size: 19.1 x 27.5 cm
Pages: 126
Keywords: Recht, Gesetz, Rechtswissenschaft, Jurisprudenz (law, statute, legal science, jurisprudence)
Categories: Zeitschriften (journals), Austrian Law Journal