
Vrije Universiteit Brussel, from the SelectedWorks of Serge Gutwirth, January 17, 2008
Profiling the European Citizen
Serge Gutwirth and Mireille Hildebrandt
Available at: https://works.bepress.com/serge_gutwirth/13/

Profiling the European Citizen
Serge Gutwirth and Mireille Hildebrandt
CPDP 2009. Text of the presentation given by Serge Gutwirth on 17 January 2009.

As you see, I'm standing here alone but, as the program mentions, I speak for two: my presentation has been prepared with two heads and four hands, those of Mireille Hildebrandt and mine. Mireille and I have had the chance to be the editors of the book Profiling the European Citizen, published last year with Springer, which proposes a rich cross-disciplinary collection of contributions and replies on the issue of profiling, an issue we believe to be quintessential for privacy and data protection, today and in the near future. What follows is thus also indebted to the contributors to the book and to their common work within the European Network of Excellence on the Future of Identity in the Information Society (FIDIS). It is, indeed, not possible to do justice to the richness of the research in 15 minutes, but Mireille Hildebrandt and I will try to present its most striking challenges, conclusions and outstanding issues. We will also suggest some paths to cope with them.

*

As you know, profiling can pertain to one individual person, to a group or groups of persons, but also to animals, to objects and to the relations between all of those. It can be used, on the one hand, to classify, describe and analyze what happened, which is not particularly new or problematic. In such cases profiling permits a structuring of what was already known. On the other hand, and this is what we target today, profiling is used to cluster data in such a way that information is inferred and predictions or expectations can be proposed. Such profiling activity thus produces a particular sort of knowledge. The knowledge produced is non-representational: it does not represent a current state of affairs. Profiles are patterns resulting from a probabilistic processing of data. They do not describe reality, but are detected in databases through the aggregation, mining and cleansing of data. Taken to a more abstract level: by mining machine-readable data, profiling leads to the identification of patterns in the past, which can develop into probabilistic knowledge about individuals, groups of humans and non-humans in the present and in the future. In a way, our view of present and future is then shaped by what the data mining makes visible.
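To make the notion of a detected, non-representational pattern concrete, here is a minimal sketch in Python; the records, attribute names and lift threshold are all invented for illustration and are not taken from the book:

```python
# A toy illustration of pattern detection in a database: find attribute
# pairs that co-occur more often than chance (their "lift"). The resulting
# "profile" describes no single record; it is a probabilistic pattern.
from collections import Counter
from itertools import combinations

records = [  # hypothetical behavioural records, one set of attributes each
    {"pays_late", "urban", "young"},
    {"pays_late", "urban"},
    {"urban", "young"},
    {"pays_late", "urban", "young"},
    {"rural", "young"},
]

n = len(records)
singles = Counter(attr for r in records for attr in r)
pairs = Counter(frozenset(p) for r in records for p in combinations(sorted(r), 2))

for pair, count in pairs.items():
    a, b = tuple(pair)
    support = count / n                                     # how often the pair occurs
    lift = support / ((singles[a] / n) * (singles[b] / n))  # vs. chance co-occurrence
    if lift > 1.2:  # arbitrary threshold for "a pattern has emerged"
        print(f"{a} & {b}: support={support:.2f}, lift={lift:.2f}")
```

Whatever this prints is a group profile in the sense used above: it asserts a correlation found in past data, not a fact about any identified person.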

But indeed, even if the profiling process shows that a pattern occurs every time some conditions are met, one cannot be 100% sure it will happen today and tomorrow as well. Based on its experience, an animal may associate a situation with a danger as a result of the recognition of a certain pattern and act accordingly, even if the situation, in reality, is not a dangerous one: the bad human smell and the shuffling footsteps were not those of a bloodthirsty hunter, but those of a sweet animal rights observer. The example demonstrates that profiling is not a new phenomenon; it is as old as life itself. It is a kind of knowledge that has always supported the behavior of living beings, humans included. It might well be that the insight that we often intuitively know something before we understand it can be explained by the role profiling spontaneously plays in our minds.

The point is, however, that in recent decades profiling capacities have grown exponentially as a result of both the advances in technology and the increasing availability of readily processable data and traces. The use and convergence of the web, mobile phones, electronic financial systems, biometric identification systems, RFIDs, GPS, ambient intelligence and so forth all participate in the automatic generation of data, which become available for ever more pervasive and powerful data mining and tracking systems. In sum, an enormous and permanently inflating cloud of electronic dust is up for grabs, enabling not only extensive data mining and profiling, but also providing for real-time and autonomic applications which impact upon ongoing actions and their environment.

To us these evolutions represent more than mere quantitative changes. On the contrary, they represent a significant qualitative shift compared to more classical statistical approaches, which aim at validating or invalidating already proposed correlations believed to be relevant and pertinent to answer preceding questions. There, the correlations are the result of an oriented questioning: they are measurements. Today, however, such preceding questions are disappearing. Very differently, the emergence of a correlation has itself become the pertinent information, and will in its turn launch questions and suppositions. Things are going the other way around now: the detection of the correlation is the information. Detections, however, are much wider than measurements; they don't have a specific meaning by themselves, but they will have an impact if used or applied, and their meaning is produced by their application. In other words, the qualitative shift lies in the fact that correlations and profiles are generated before any preceding interest or question. This is why it can be said that humans have become detectable far beyond their control: their actions have become the resources of an extensive, if not unlimited, network of possible profiling devices generating knowledge affecting and impacting upon them.
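The shift from question-driven measurement to question-free detection can likewise be caricatured in a few lines; the data and the threshold below are invented, and real mining systems scan vastly larger spaces of candidate correlations:

```python
# Classical statistics: start from one hypothesis and measure it.
# Exhaustive mining: scan all pairs and let any strong correlation "emerge".
from itertools import combinations
from statistics import correlation  # Python 3.10+

data = {  # hypothetical per-person measurements
    "age":       [23, 35, 47, 51, 62],
    "km_driven": [9, 14, 18, 21, 25],
    "web_hours": [30, 22, 15, 12, 8],
    "shoe_size": [40, 44, 38, 43, 41],
}

# Question first: "does age correlate with kilometres driven?"
print("measured:", round(correlation(data["age"], data["km_driven"]), 2))

# Detection first: no preceding question; the emergence is the information.
for x, y in combinations(data, 2):
    r = correlation(data[x], data[y])
    if abs(r) > 0.9:
        print(f"emerged: {x} ~ {y} (r={r:+.2f})")
```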

Indeed, such a shift demands careful monitoring from the perspective of the democratic constitutional state, because it likely entails a number of threats to it, such as:
- the surreptitious influencing, formatting and customisation of individual behavior;
- the sharpening of power inequalities between those who possess the profiles and those who are being profiled;
- the making of wrong decisions as a result of false positives and false negatives (a toy calculation below illustrates the scale of this risk);
- the making of unfair decisions based on correct profiles that allow for unwarranted and invisible discrimination;
- and, last but not least, the taking of unmotivated and unilateral decisions about individuals.
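To see why the false-positive threat in this list is more than theoretical, consider a toy base-rate calculation; all numbers are invented, but the arithmetic shows how even an accurate profile, applied to a rare trait, flags mostly the wrong people:

```python
# Hypothetical numbers: a profile that catches 95% of true cases and wrongly
# flags only 2% of everyone else, targeting a trait 0.1% of people have.
population = 1_000_000
prevalence = 0.001          # 0.1% actually match the targeted trait
sensitivity = 0.95          # share of true cases the profile catches
false_positive_rate = 0.02  # share of non-cases wrongly flagged

true_pos = population * prevalence * sensitivity
false_pos = population * (1 - prevalence) * false_positive_rate

print(f"flagged: {true_pos + false_pos:,.0f}")
print(f"wrongly flagged: {false_pos:,.0f} "
      f"({false_pos / (true_pos + false_pos):.0%} of all flagged)")
```

With these assumed numbers, roughly 95% of the flagged individuals would be false positives, and, as argued below, they would typically have no way of knowing that a profile was applied to them at all.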

Next to these threats, profiling is also the precondition for autonomic computing, which allows for a new socio-technical infrastructure that runs autonomically, that is, by taking a number of decisions without human intervention. Autonomic computing will involve distributed intelligence that emerges from networked objects engaged in continuous, real-time machine-to-machine communication, and it is not clear how, in the case of harm, liability could be attributed to one of the nodes of such networks. Decisions taken, then, are not intentional in the traditional sense of the word, and they are not taken by one particular human or non-human node. Civil liability can of course be based on strict liability, but when it comes to attributing criminal liability in cases where neither cause nor blame can be attributed, we seem to have a problem.

Another issue worth mentioning relates to the legal status of profiles: who has rights to this machine-generated knowledge? And if someone does, which rights?

A crucial additional point is that the process of data mining, and the way profiles are built, is mostly invisible and uncontrollable for the citizens to whom they are applied. Citizens whose data are being mined do not have the means to anticipate what the algorithms will come up with, and hence they do not have a clue what knowledge about them exists, how they are categorized and evaluated, and what effects and consequences this entails. For individual citizens to regain some control, access is needed to the profiles applied to them. This will require both legal tools (rights to transparency) and technological tools (the means to exercise such rights). Under the name of Ambient Law, some, and Mireille to start with, defend the idea that law should be embodied in the socio-technical infrastructure it aims to protect against.

*

From a legal point of view, profiling makes it necessary to clearly distinguish between privacy on the one hand and data protection on the other.

Privacy is recognized as a fundamental right in various major international legal instruments and in many national constitutions. In short, it protects a number of fundamental political values of democratic constitutional states, such as the freedom of self-determination of individuals, their right to be different, their autonomy to engage in relationships, their freedom of choice, and so on. By default privacy prohibits interferences by the state and by private actors with individuals' autonomy: it shields them from intrusions and provides them with a certain degree of opacity and invisibility. The scope and reach of privacy are underdetermined, and it is up to the judges to decide when privacy interests are at stake and when protection can rightfully be invoked. Legislators can also intervene to protect particular privacy interests, for example through the enactment of professional secrecy, the secrecy of communications or the inviolability of the home.

Data protection is both broader and more specific than the right to privacy. It is broader because data protection also protects other fundamental rights, such as the freedom of expression, the freedom of religion and conscience, the free flow of information, liberty, the principle of non-discrimination and individual self-determination. But data protection is also more specific than privacy, since it simply and only applies when personal data are processed. The application of data protection rules does not presuppose a privacy issue: data protection applies whenever the statutory conditions are met. By default, and contrary to privacy, data protection rules are not prohibitive; rather, they organize and control the way personal data are processed: such data can only be legitimately processed if conditions pertaining to the transparency of the processing, the participation of the data subject and the accountability of the data controller are met.

With regard to profiling, this specificity entails that data protection law only applies when profiling activities involve personal data. Protection beyond personal data is not foreseen, and that actually leaves out the situations wherein profiling techniques make it possible to impact upon a person's behavior and autonomy without rendering this person identifiable, which will happen frequently, particularly in applications of ambient intelligence. Nevertheless, in such cases privacy interests are still under pressure and privacy protection can be called upon, which significantly implies that the non-applicability of data protection does not mean that there is no existing protection, since privacy can still be invoked. This is, indeed, not to say that there is no need for better protection, considering especially the invisibility of the profiling process and the ensuing profiles. The problem is also that threats to non-discrimination and due process are not really met in the present legal framework. That is why we think that profiling calls for a system of protection of individuals against the processing of data that impact upon their behavior, even if those data cannot be considered personal data, which implies a shift from the protection of personal data to the protection of data tout court! It might seem so, but in fact this is not a revolutionary step, since it just picks up the thread opened by Directive 2002/58 which, in order to protect privacy, provided for the protection of location and traffic data (which are not necessarily personal data).

Also, the same Directive 2002/58 gave us the inspiration to plead for a regulation of unsolicited adjustments similar to the existing regulation of unsolicited communications, or spam, providing for an opt-in system. No adjustments without explicit prior consent would then be the rule. I will not elaborate upon this idea, since Gloria Gonzalez Fuster will pick it up later during the 11 o'clock panel.

More generally speaking, we should perhaps explore the possibility of a new legal approach to profiling, focusing on the way profiles can affect our behavior and decisions. Such a shift would emphasize the issues of discrimination and manipulation of conduct through the use of profiles, as well as the transparency and controllability of profiles. Furthermore, even if data protection law theoretically applies to many facets of profiling, many problems subsist, because its techniques remain a technological black box for citizens, making data protection ineffective and unworkable. Where data protection demands transparency and controllability, data mining and profiling tend to remain opaque, incomprehensible and evasive. That is why the integration of legal transparency norms into technological devices that can translate, for the citizen, what profiling machines are doing is a priority.

*

If we want to anticipate and/or change the way machines profile us, we will need what we have called transparency-enhancing technologies, or TETs. Where privacy-enhancing technologies, or PETs, aimed at technologically enforcing the individual's invisibility, TETs would involve the integration of legal transparency into the technological infrastructure it aims to protect against. As such they would empower citizens to unfurl the profiling operations they are subject to. TETs, however, are still to be invented, and their application may run counter to the intellectual property rights of the owners of databases, while the question remains how humans could effectively communicate with the machines that provide transparency about the proliferation of profiles. This is a topic presently under investigation within the FIDIS research network.
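Since TETs are still to be invented, any concrete example is necessarily speculative. The sketch below is merely one possible reading of the idea, in Python, with every class, field and identifier hypothetical: a log that records each profile applied to a person and embodies the legal transparency right as a technical query.

```python
# Speculative sketch of a transparency-enhancing technology (TET): an audit
# log that records every profile applied to a data subject and lets that
# subject inspect the record. Names and fields are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AppliedProfile:
    profile_id: str   # identifier of the profile that was applied
    inferred: str     # human-readable gloss of what was inferred
    effect: str       # the decision or adjustment it triggered
    at: datetime      # when it was applied

@dataclass
class TransparencyLog:
    _entries: dict[str, list[AppliedProfile]] = field(default_factory=dict)

    def record(self, subject_id: str, entry: AppliedProfile) -> None:
        self._entries.setdefault(subject_id, []).append(entry)

    def disclose(self, subject_id: str) -> list[AppliedProfile]:
        """The legal right to transparency, embodied as a query."""
        return self._entries.get(subject_id, [])

log = TransparencyLog()
log.record("citizen-42", AppliedProfile(
    "p-07", "likely to churn", "shown a retention offer",
    datetime.now(timezone.utc)))
print(log.disclose("citizen-42"))
```

The open questions named above survive the sketch: who would be obliged to write to such a log, and how a citizen could meaningfully read what it discloses.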

Profiling is a powerful technique that renders visible what is invisible to the naked human eye. This, however, concerns patterns in databases that must not be mistaken for reality. By making visible what is aggregated in the database, profiling also makes invisible what cannot be translated into machine-readable data. Insofar as the governance of people and things becomes dependent on these advanced profiling technologies, new risks will emerge in the shadow of the real-time models and simulations these technologies make possible. What has been made invisible can grow like weeds. Threats to privacy, liberty, due process and non-discrimination may in fact hide under the surface of what has been called hidden complexity.

We hope that this conference will help to re-visualize what is happening under the sheets of autonomically interacting networks of things and other applications of ambient intelligence, and that it will put profiling on the agenda of policy makers, academics and activists as one of the most powerful and invisible techniques shaping our present and futures.