TechAmerica Europe comments for DAPIX on Pseudonymous Data and Profiling as per 19/12/2013 paper on Specific Issues of Chapters I-IV


Brussels, 14 January 2014

TechAmerica Europe represents leading European high-tech operations with US parentage. Collectively we invest EUR 100 bn in Europe and employ approximately 500,000 Europeans. TechAmerica Europe member companies are active throughout the technology spectrum, from software, semiconductors and computers to Internet technology, advanced electronics and telecommunications systems and services. Our parent company, TechAmerica, is the leading tech association in the US. This paper has been put together in response to the 19 December 2013 Presidency paper summarising the discussion on the concepts of pseudonymous data and profiling within DAPIX. Please contact us for more details.

Introduction

TechAmerica Europe (TAE) welcomes the opportunity to comment further on the concepts of pseudonymous data and profiling, currently under discussion within DAPIX. The June 2013 Irish Presidency draft of Chapters I-IV of the Regulation made significant progress on both these concepts, but the 19 December Lithuanian Presidency paper highlights a number of ongoing issues that need addressing before both concepts can contribute to an improved legal framework for data protection in Europe. We believe much of the ongoing confusion about the concept of pseudonymous data derives from the fact that while some stakeholders - e.g. internet-based industries - see pseudonymous data as a concept that describes a particular state of personal data, other stakeholders, particularly in the medical and scientific research community, prefer to speak of pseudonymisation to describe a process of de-identification.
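The two readings can be illustrated with a minimal sketch, purely for explanation: pseudonymisation as a *process* (here, replacing direct identifiers with keyed hashes) whose output is pseudonymous data as a *state*. The field names, secret key and truncation below are illustrative assumptions, not drawn from the Regulation or any Presidency text.

```python
# Illustrative sketch only: pseudonymisation as a process of de-identification.
# Direct identifiers are replaced with stable keyed hashes (HMAC-SHA256);
# the "additional data" needed to link back is the key, held separately.
import hashlib
import hmac

SECRET_KEY = b"held-separately-by-the-controller"  # illustrative assumption

def pseudonymise(record: dict, direct_identifiers: set) -> dict:
    """Return a copy of `record` whose direct identifiers are replaced
    by pseudonyms; the result is pseudonymous data (a state)."""
    out = {}
    for field, value in record.items():
        if field in direct_identifiers:
            digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256)
            out[field] = digest.hexdigest()[:16]  # truncated for readability
        else:
            out[field] = value
    return out

patient = {"name": "Jane Doe", "email": "jane@example.org", "blood_type": "A+"}
pseudo = pseudonymise(patient, direct_identifiers={"name", "email"})

# The same input always yields the same pseudonym, so records about one
# person can still be linked to each other, but not to the person,
# without the separately held key.
assert pseudo["blood_type"] == "A+"
assert pseudo["name"] != "Jane Doe"
assert pseudonymise(patient, {"name", "email"}) == pseudo
```

Note that an internet service assigning a random identifier to an unauthenticated visitor reaches a comparable state without ever performing such a process, which is precisely why the two perspectives diverge.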
Understanding these differences and resolving the confusion that these approaches cause will be essential if Council is to deliver on its promise to develop an effective risk-based approach within the new Regulation that can effectively calibrate controllers' and processors' data protection obligations while maintaining protection levels.

Commentary

Should delegations support the compromise reached in the Irish Presidency document (11013/13), namely the definition of "pseudonymous data" and corresponding calibrations of specific provisions of the draft Regulation?

Yes, the definition of pseudonymous data is an essential concept in a modernised data protection framework; however, there is room for improving the compromise text and the possibility of including a notion of pseudonymisation, understood as a representation of a "privacy by design" technique. The June 2013 Irish text, while not perfect, contained a number of significant advances over the original proposal which merit the support of Council. We welcome, in particular, the inclusion for the first time of a definition of pseudonymous data, establishing the principle that not all identifiers may identify a data subject to the same degree and that different obligations might apply to the processing of such data. This is an absolutely critical insight and one of the cornerstones of any meaningful risk-based approach. We also welcome the reference to the use of pseudonymous data as an example of a relevant technique in the context of data protection by design (Article 23(1)). However, there are areas where we believe the Irish compromise requires further development. In an information economy, data controllers need to be able to target content and differentiated offerings in response to unique identifiers without the law concluding that the controller has "identified" the person just by treating them differently. Pseudonymous data, in other words, should serve as a safe means of allowing data controllers to treat different data subjects differently without linking that data to conventional identifiers that allow for direct identification. But it is not clear from the Presidency's use of the phrase "specific data subject" whether "specific" means an identified data subject or simply a data subject that has been distinguished from another without identification. If the definition were taken to mean the latter, then this wording would appear to preclude the attribution of information to any singled-out individual, even absent conventional identifiers.
In practical terms this distinction is important because, for example, websites aim to respond to information they gather about each unique, but unidentified, visitor to customise the experience for that visitor. The ability to safely tailor experiences on the basis of unique identifiers, as a means of encouraging visitors to voluntarily identify themselves directly to a website, is critical to the dynamics of e-commerce. But there is a risk that the Presidency definition of pseudonymous data would not allow for this. An amendment to clarify that the definition covers unauthenticated users to allow differential treatment (subject to appropriate controls) would be beneficial in terms of stimulating data minimisation.

Elsewhere, Recital 45 and Article 10 of the Irish Presidency text actually contradict each other, with the recital focused on processing that permits identification while the Article considers processing that requires identification. The change of focus in the Article is a significant change to the original European Commission drafting which, we believe, contradicts the original intention of that Article, which was to help controllers who wish to treat unique users differently even if they could not then isolate the real person behind the online profile. The notion that processing should not "require" (as opposed to permit) identification is problematic because the notion of identification covers both direct and indirect identification. It is likely that much processing might require some form of indirect identification (i.e. to allow for customised experiences for two unique but unidentified visitors) but without permitting direct identification. If processing requires indirect identification then the controller could in theory be required, under the Presidency draft, to acquire more information to identify a visitor so that they are distinct from every other visitor. A return to the original Commission drafting, or the extension of the language of Recital 45 into Article 10, is required.

We also have concerns with the implications of Recital 39 (which allows that the processing of personal data for the purposes of anonymising or pseudonymising personal data can be considered as a legitimate interest of the controller). On the face of it this attempts to provide legal coverage for situations where a controller collects but quickly anonymises or pseudonymises personal data. While in theory this could help avoid splitting legal hairs over how fast anonymisation / pseudonymisation must take place (is 1 millisecond too slow?), the approach risks creating further legal uncertainty that undermines the legitimate interest clause. Pseudonymisation is not a purpose in itself. It may be a feature of a privacy by design approach, as the Irish Presidency correctly identifies. And it may be a step in the processing of data for some other purposes (which may be pursued on various legal bases including legitimate interest in certain contexts). But pseudonymisation says nothing of the lawfulness of data collection in the first place, or of the lawfulness of subsequent processing.

Article 30(1) should refer to pseudonymisation rather than pseudonymous data. As with privacy by design, pseudonymisation is a technique that can support the secure processing of data.

Delete the reference to pseudonymous data in Article 32(2)(a). The derogation from breach notification requirements for pseudonymous data overstates the protection offered by pseudonymous data.
It should be remembered that this is not the same as anonymous data (which implies that someone would require a disproportionate amount of time or effort to identify an individual) or even encrypted personal data, but simply data that cannot identify an individual without additional data. When a breach occurs it cannot be known whether the party that obtains the pseudonymised data has the necessary data to re-identify the individuals.

Should the definition of "pseudonymous data" be replaced by a reference to a process supporting compliance with data protection requirements of the Regulation ("pseudonymisation")?

No. The concept of pseudonymous data cannot and should not be replaced by that of pseudonymisation. The legitimate business models of many stakeholders are not well served by a framework that fails to accommodate both pseudonymous data as a state and pseudonymisation as a process. While the personal data definition proposed by the European Commission largely repeats concepts which exist within the existing Directive (reasonableness test, ability to identify, direct and indirect identification), it does increase uncertainty on the status of an extended range of identifiers which may or may not, depending on the circumstances, be personal data. This has been done, inter alia, to respond to EU jurisprudence, e.g. Sabam v Scarlet, which states that, for example, IP addresses are "protected personal data", albeit referring only to the specific context of a data controller allocating that IP address to a known subscriber.
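The breach-notification concern above can be made concrete with a minimal sketch: pseudonymous data is only as protective as the separation of the additional data. The record fields, pseudonym values and mapping table below are illustrative assumptions, not drawn from the Regulation.

```python
# Illustrative sketch only: why pseudonymous data is weaker than anonymous data.
# A party that obtains both a pseudonymised record and the controller's
# "additional data" can re-identify the individual with a simple lookup,
# not a disproportionate effort.

# Pseudonymised record as it might appear after a breach:
leaked = {"user": "a91f3c20", "diagnosis": "hypertension"}

# "Additional data" held separately by the controller. If this table is
# also obtained, the pseudonym reverses trivially:
mapping = {"a91f3c20": "Jane Doe", "7be20d11": "John Smith"}

reidentified = mapping.get(leaked["user"])
assert reidentified == "Jane Doe"
```

This is why, at the moment of a breach, a controller cannot assume the recipient lacks the data needed for re-identification.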

TAE members agree that such identifiers are worthy of protection and are rightly in scope for the Regulation. However, we believe the Regulation needs to provide more efficient mechanisms for distinguishing between all types of personal data across the entire spectrum, from clear direct identifiers to data which can only hypothetically, or with significant effort and cost, be linked to a data subject. The introduction of pseudonymous data as a subset of personal data would allow for an injection of a risk-based approach where it is most needed, and in a way which is entirely consistent with ECJ jurisprudence and the EU principles of necessity and proportionality. Failure to do so would leave data controllers confronted with the lack of legal certainty of Recital 24. A workable definition of pseudonymous data must therefore be flexible enough to cover both (i) data that has once directly identified an individual and has undergone a process to render it less likely to identify that person, and (ii) a series of unique online identifiers gathered about a data subject which may never have reached the point of actually allowing a controller to identify the person behind the data. Medical research tends to fall into (i) while internet industries rely on (ii). If a single definition cannot be found that covers both of these, then a separate additional definition of pseudonymisation (to describe a process of de-identification) is needed to provide legal support for the further processing of data for purposes including medical and scientific research. This definition could in turn be referenced in Article 24 on privacy by design as an example of a process which can help data controllers protect the interests of data subjects.

Profiling

Should delegations support the current compromise text on the issue of "profiling"?
Should the definition of "profiling" be kept as in the current compromise, be identical to that of the Council of Europe, or remain in line with the logic of Directive 95/46/EC?

We believe that the alternative definition of profiling outlined by the Presidency in paragraph 9, which seeks to align the definition of profiling with the logic of Directive 95/46/EC, should be supported. This point aside, the overall approach to Article 20 in the Irish Presidency compromise text should be supported. On the definition of profiling, we agree with the argument outlined in paragraph 9 that this definition offers protection to a broader range of data by not requiring the creation of a profile. This approach is also more technology neutral and is hence less likely to be rendered obsolete by technological development. Data processing has substantially advanced since 1995. The ability to process data to extract new actionable insights is now absolutely essential to a knowledge-based economy. So any changes in Article 20 need to effectively protect the data subject against automated decisions that impact their legitimate interests whilst allowing legitimate and beneficial business activities that use advanced data processing techniques to continue and contribute to growth, jobs, entrepreneurship, innovation and competitiveness in Europe.

One of the ways that the Irish Presidency compromise draft does this is by addressing a broader range of aspects than Article 15 of the 1995 Directive, such that location, behaviour and personal preferences are all now included. This means that data subjects who are subject to automated decisions based on these aspects, and which affect their interests in a particular way, are now entitled to some form of human intervention. The Irish compromise text ensures further welcome protections for data subjects compared to the current Directive by outlawing the use of profiling based on sensitive categories of data without suitable safeguards. It also effectively considers profiling as a separate category of data processing by limiting the legal bases available for such processing, compared to normal data processing. (NB: this is in addition to the standard right to object to any data processing that is based on the legitimate interests of the data controller, enshrined in Article 19.) These are all measures that ensure strong protection for data subjects and represent welcome extensions of Article 15 of Directive 95/46/EC. Furthermore, we welcome in particular the fact that the compromise text has strengthened the legal threshold for impact on a data subject from "significant effect" to "severely affects". The proposed threshold also reflects a harm-based approach encompassing the requirement for a negative effect on the data subject. This is a pragmatic move that should reduce the likelihood of many legitimate business practices (particularly from the e-commerce space) being subject to unnecessary restriction. Some commentators have suggested, for example, that the targeting of content to unique but not identified website users should be considered as having a "significant effect" and thus be subject to Article 20.
We reject this idea, and believe that the intention of Article 20 is to focus on clearly unfair or discriminatory practices such as the denial of insurance cover. A lower threshold could easily result in prohibiting many beneficial data processing techniques and enabling technologies across sectors that are clearly not intended to be covered by this provision. Absent this change in threshold, it would be necessary to introduce the legitimate interests of the data controller as an allowable legal basis into the text of Article 20(2) to ensure a better balance of interests between data controllers and data subjects. Overall, when considering Article 20, it is important to reflect on the issues that the provision is attempting to address. Article 15 of the current Directive does not prohibit automated decisions, but rather seeks to ensure that human intervention is possible when automated decisions are taken that affect individuals in a particular way. Decisions that have a legal effect or significant effect are still permitted provided safeguards, such as the right to obtain human intervention, are in place. It is important that this concept is maintained in Article 20.

For further information please contact:
Rue de Namur 16
1000 Brussels