Article

Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information

Details

Citation

Noto La Diega G (2018) Against the Dehumanisation of Decision-Making – Algorithmic Decisions at the Crossroads of Intellectual Property, Data Protection, and Freedom of Information. JIPITEC - Journal of Intellectual Property, Information Technology and E-Commerce Law, 9 (1). https://www.jipitec.eu/issues/jipitec-9-1-2018/4677/?searchterm=noto%20la%20diega

Abstract
This work presents ten arguments against algorithmic decision-making. These revolve around the concepts of ubiquitous discretionary interpretation, holistic intuition, algorithmic bias, the three black boxes, psychology of conformity, power of sanctions, civilising force of hypocrisy, pluralism, empathy, and technocracy. Nowadays algorithms can decide if one can get a loan, is allowed to cross a border, or must go to prison. Artificial intelligence techniques (natural language processing and machine learning in the first place) enable private and public decision-makers to analyse big data in order to build profiles, which are used to make decisions in an automated way. The lack of transparency of the algorithmic decision-making process does not stem merely from the characteristics of the relevant techniques used, which can make it impossible to access the rationale of the decision. It depends also on the abuse of and overlap between intellectual property rights (the “legal black box”). In the US, nearly half a million patented inventions concern algorithms; more than 67% of the algorithm-related patents were issued over the last ten years and the trend is increasing. To counter the increased monopolisation of algorithms by means of intellectual property rights (with trade secrets leading the way), this paper presents three legal routes that enable citizens to ‘open’ the algorithms. First, copyright and patent exceptions, as well as trade secrets, are discussed. Second, the EU General Data Protection Regulation is critically assessed. In principle, data controllers are not allowed to use algorithms to take decisions that have legal effects on the data subject’s life or similarly significantly affect them. However, when they are allowed to do so, the data subject still has the right to obtain human intervention, to express their point of view, as well as to contest the decision. Additionally, the data controller shall provide meaningful information about the logic involved in the algorithmic decision. Third, this paper critically analyses the first known case of a court using the access right under the freedom of information regime to grant an injunction to release the source code of the computer program that implements an algorithm. Only an integrated approach – which takes into account intellectual property, data protection, and freedom of information – may provide the citizen affected by an algorithmic decision with an effective remedy as required by the Charter of Fundamental Rights of the EU and the European Convention on Human Rights.

Keywords
Algorithmic decision-making; algorithmic bias; right not to be subject to an algorithmic decision; GDPR; software copyright exceptions; patent infringement defences; freedom of information request; algorithmic transparency; algorithmic accountability; algorithmic governance; Data Protection Act 2018

Journal
JIPITEC - Journal of Intellectual Property, Information Technology and E-Commerce Law: Volume 9, Issue 1

Status: Published
Funders: Northumbria University
Publication date: 31/05/2018
Date accepted by journal: 01/05/2018
URL: http://hdl.handle.net/1893/30673
Publisher URL: https://www.jipitec.eu/…oto%20la%20diega
ISSN: 2190-3387
