Information Commissioner's response to the House of Commons Science and Technology Committee inquiry on "The big data dilemma"

The Information Commissioner's role

1. The Information Commissioner has responsibility for promoting and enforcing the Data Protection Act 1998 ("DPA"), the Freedom of Information Act 2000 ("FOIA"), the Environmental Information Regulations ("EIR") and the Privacy and Electronic Communications Regulations 2003 ("PECR"). He also deals with complaints under the Re-use of Public Sector Information Regulations 2015 ("RPSI") and the INSPIRE Regulations 2009.

2. He is independent from government and upholds information rights in the public interest, promoting openness by public bodies and data privacy for individuals. The Commissioner does this by providing guidance to individuals and organisations, solving problems where he can, and taking appropriate action where the law is broken.

3. The Commissioner welcomes the opportunity to respond to this consultation. We have been looking at issues to do with big data for the last two years, as we have a strong regulatory interest in the privacy and data protection risks that arise when big data analytics involve the processing of personal data. In July 2014 we published our discussion paper on Big data and data protection.[1] We invited feedback on this, and in April 2015 we published a summary of that feedback, together with our response and our plans for future work.[2] Our submission to this inquiry reflects our findings from that work and takes account of more recent developments. Our key points are as follows:

- Big data is the evolution of large-scale analytics that have been used for many years, rather than a new concept. However, the new capabilities in scale, speed and the ability to combine data do represent a step change. This step change introduces new incentives and opportunities for organisations to use data and test boundaries, and therefore increases the risk to privacy.
- The benefits of big data should not simply be traded for privacy rights. Data protection principles still apply to big data: they challenge organisations to be innovative and they promote good practice. The benefits of big data will be best achieved through gaining public trust, and a high level of data protection compliance is an important component of gaining this trust.
- Different big data scenarios will present different risks. Privacy Impact Assessments (PIAs) should be used as a tool to assess data protection risks, ensure use of personal data is proportionate and mitigate the risks identified.
- Government strategy should emphasise the benefits of carrying out PIAs. It should also recognise that data protection supports data quality.
- The HE curriculum for data scientists should include education in applying a risk-based approach to data protection. The school curriculum should raise young people's awareness of how data is used and of their rights.
- An assurance scheme, such as the ICO's planned privacy seals programme, can raise standards of compliance, build trust and demonstrate good practice.
- It can be difficult for organisations to provide privacy information about big data, but there are ways to do this effectively; organisations will need to innovate.
- When obtaining meaningful consent is problematic, organisations can also consider other conditions in the DPA to legitimise the processing.
- Further research on anonymisation techniques should be supported.
- There is scope for considering the introduction of a criminal offence of deliberate re-identification.

[1] Information Commissioner's Office. Big data and data protection. ICO, July 2014. https://ico.org.uk/media/for-organisations/documents/1541/big-data-and-data-protection.pdf
[2] Information Commissioner's Office. Summary of feedback on Big data and data protection and ICO response. ICO, April 2015. https://ico.org.uk/media/for-organisations/documents/1043723/summary-of-feedback-on-big-data-and-data-protection-and-ico-response.pdf

The opportunities for big data, and the risks

4. We recognise the potential benefits of big data, not only in terms of the commercial opportunities for companies but also in offering timely and relevant products and services to consumers, in improving the delivery of public services and in supporting scientific and medical research. However, these benefits should not simply be traded for privacy rights. In our big data paper we identified a number of areas in which big data challenges the established principles of data protection and gives rise to privacy concerns.

Data protection and privacy risks

5. Fairness is a key principle in the Data Protection Act. The complexity of big data analytics, the use of algorithms to find correlations, the increased ability to profile individuals and to make inferences about them, and the use of those inferences to make decisions can raise questions about the fairness of the processing. For personal data processing to be fair, individuals must have relevant information. It can be challenging to explain to people, in clear and meaningful terms, how their data is being used. This is compounded by the fact that few people are willing to read detailed privacy notices. However, this important component of data protection should not be ignored.

6. A key characteristic of big data is the ability to bring together data of different types and use it for new purposes. In some cases this may challenge the principle that personal data collected for one purpose should not be used for a further, incompatible purpose.

7. Data protection requires that organisations process only the minimum data needed for a particular purpose. However, one of the driving forces behind big data is to collect as much data as possible, and this may incentivise organisations to keep long runs of historical data. Organisations must still be able to justify this retention.

8. Big data analytics is fed by the increasing volume of data being produced. This comes not only from data that people generate through web searches, social media posts or making purchases, but also from data that may be collected without people being directly aware of it, for example from connected devices as part of the Internet of Things. Furthermore, although more and more personal information is published on social media, this does not necessarily mean that it can be re-used for any purpose. This was illustrated by the example of the Samaritans Radar app,[3] which alerted users to references in social media postings suggesting that people were at risk of depression or suicide. The Samaritans withdrew the app as a result of privacy concerns.

9. The propensity of big data to push against the boundaries of data protection is not an abstract issue. In certain applications it means increased risks of discrimination through profiling, of intrusion into private life, and of people being subject to decisions on the basis of processing that is opaque to them. Different risks arise from different big data scenarios: use for research purposes, to learn about trends and broader patterns, presents different data protection risks compared to when profiles are generated and used in a targeted way on individuals. Compliance solutions can be proportionate and matched to these different risks.

[3] Orme, Jamie. Samaritans pulls suicide watch Radar app over privacy concerns. Guardian online, 7 November 2014. http://www.theguardian.com/society/2014/nov/07/samaritans-radar-app-suicide-watch-privacy-twitter-users (accessed 26 August 2015)

Data protection promotes good practice in managing data

10. The data protection principles in the DPA still apply in the context of big data, if personal data is being processed; big data is not a game that is played by different rules. Rather than being a barrier to progress, the principles can enable good practice in managing data. Organisations should be clear about what they are trying to achieve and make a realistic assessment of the benefits of the analytics, and they should carry out a Privacy Impact Assessment (PIA) to identify and mitigate privacy risks. They should also seek innovative ways in which to deliver privacy notices to people when they are collecting and processing their personal data.

11. Whilst different risks and challenges are emerging that make anonymisation harder, we still think that anonymisation can be an important tool in big data analytics; where data is being used to detect general trends and correlations, wherever possible and appropriate it should be anonymised so that it no longer constitutes personal data. The key is to ensure that anonymisation techniques are effectively risk assessed and only used in appropriate scenarios. Techniques such as penetration testing will be increasingly important to demonstrate the effectiveness of anonymisation.

12. In our big data paper we noted that some organisations were developing their own ethical approaches to big data, based on being as transparent as possible, taking account of the customer's point of view and building a relationship of trust with customers.[4] Since then, there is more evidence of this trend, for example in the Information Accountability Foundation's big data ethical framework initiative[5] and in the work of the Cabinet Office on data science and ethics.[6] We welcome this trend, as it supports compliance with the data protection requirements of fairness and transparency.

The forthcoming EU Data Protection Regulation

13. The forthcoming EU General Data Protection Regulation (EU GDPR) is still to be finalised,[7] but the drafts under discussion contain provisions that seek to address some of the big data issues we have identified and to strengthen the rights of data subjects. These include stronger provisions on processing only the minimum data needed, requirements on clear privacy notices, and explicit requirements for data protection by design and default and for carrying out PIAs.

14. The Information Commissioner has supported the strengthening of data subjects' rights whilst highlighting the overly prescriptive approach in certain aspects of the text and the need for the text to support a risk-based approach. The Commissioner's latest views on the GDPR are set out in a recent blog on the ICO website.[8]

[4] E.g. Johnson, David and Henderson-Ross, Jeremy. The new data values. Aimia, 2012. http://www.aimia.com/content/dam/aimiawebsite/casestudieswhitepapersresearch/english/whitepaperukdatavaluesfinal.pdf (accessed 26 August 2015)
[5] http://informationaccountability.org
[6] Cabinet Office. Open policy making toolkit. Cabinet Office, 20 March 2015. https://www.gov.uk/open-policy-making-toolkit-data-science (accessed 26 August 2015)
[7] Negotiations on the GDPR are currently at trilogue stage. The timetable proposes that agreement is reached by the end of 2015. There would then be a two-year implementation period.

Whether the Government has set out an appropriate and up-to-date path for the continued evolution of Big Data and the technologies required to support it

15. Security of personal data is a key data protection principle, and we welcome the Government's support for research centres of excellence as part of the National Cyber Security Programme. We also welcome the development of secure research facilities, such as the Administrative Data Research Network, which enables administrative data that may in its original form identify individuals to be used for analysis in a de-identified form by accredited researchers.

16. We would also stress the role of PIAs and the need to use them at an early stage in planning big data projects in order to mitigate privacy risks to individuals. This theme emerged strongly in the feedback we received on our big data paper. We have published a code of practice on conducting PIAs[9] and we are currently doing further work with selected industry sectors on the role of PIAs in big data, to assess how specific PIA guidance for big data can be developed. We have also published research illustrating how PIAs can be integrated with risk and project management methods already used by organisations (for example, the agile project methodology).
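The idea of integrating PIA screening into existing project workflows can be pictured with a short sketch. The threshold questions and outcomes below are hypothetical illustrations, not the ICO's own screening criteria from the code of practice; they simply show how a screening step might gate a project at its start, for instance as a task on an agile backlog.

```python
# Illustrative sketch of a PIA screening gate in a project workflow.
# The questions and outcomes are hypothetical examples, not the ICO's
# screening criteria.

# Threshold questions: a "yes" to any of them suggests a full PIA is needed.
SCREENING_QUESTIONS = [
    "Does the project collect new personal data?",
    "Will data be combined from multiple sources?",
    "Will individuals be profiled or have inferences drawn about them?",
    "Could the processing be opaque or unexpected to the people concerned?",
]

def pia_required(answers: dict[str, bool]) -> bool:
    """A full PIA is triggered if any threshold question is answered 'yes'."""
    return any(answers.get(q, False) for q in SCREENING_QUESTIONS)

def screen_project(name: str, answers: dict[str, bool]) -> str:
    """Return the screening outcome recorded at the start of a project."""
    if pia_required(answers):
        return f"{name}: full PIA required before development starts"
    return f"{name}: no full PIA needed; record the screening outcome"

# Example: a project that combines data sources and profiles individuals.
answers = {
    "Does the project collect new personal data?": True,
    "Will data be combined from multiple sources?": True,
    "Will individuals be profiled or have inferences drawn about them?": True,
    "Could the processing be opaque or unexpected to the people concerned?": False,
}
print(screen_project("Customer analytics pilot", answers))
```

Running the screening as early as this, rather than after a system is built, is what makes the PIA a planning tool rather than retrospective justification.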
It is important that PIAs are not seen as an afterthought to justify a controversial project but are used as a tool to assess risk from the start. This integrated approach counters the view that PIAs are additional bureaucracy or red tape.

17. There could be more explicit recognition by the Government that complying with data protection principles not only helps to protect privacy rights but also supports good practice in data management. In industry discussions of big data there is an increasing emphasis on information governance and data quality. Data protection principles map to many of the issues being considered as part of data quality, for example the provenance of the data, its sensitivity, the permissions that come with it and retention periods.[10] We believe that getting data protection right can help big data organisations to improve data quality.

[8] The EU Regulation: approaching the home straight? ICO, 26 August 2015. https://iconewsblog.wordpress.com/2015/08/26/the-eu-regulation-approaching-the-home-straight/
[9] Information Commissioner's Office. Conducting privacy impact assessments: code of practice. ICO, February 2014. https://ico.org.uk/media/for-organisations/documents/1595/pia-code-of-practice.pdf
[10] See for example: Information lifecycle governance in a big data environment. IBM, January 2015. http://public.dhe.ibm.com/common/ssi/ecm/wv/en/wvw12356usen/wvw12356usen.pdf (accessed 27 August 2015)

Where gaps persist in the skills needed to take advantage of the opportunities, and be protected from the risks, and how these gaps can be filled

18. Organisations deploying big data techniques need to think about the effect of their processing from the customer's or citizen's point of view, and consider the impact of the analytics and expectations about how their data is used. This means that data protection is not simply an issue for the compliance department. It is important that big data analysts and those responsible for data management are aware of data protection and privacy issues and are able to integrate these into their practice. This suggests a need to build awareness of these issues into graduate and postgraduate education in data science and information management. The aim is not to turn data scientists into data protection experts, but to help them to develop risk assessment skills so that they are able to identify privacy risks and mitigation measures.

19. To support this aim, we are planning a project looking at how to embed awareness of information rights, including data protection, into the HE curriculum. A tender for this project will be launched in the autumn of 2015.

How public understanding of the opportunities, implications and the skills required can be improved, and informed consent secured

Improving public understanding

20. We welcome the emphasis on privacy and security in the previous government's 2013 Information Economy Strategy and its aim of helping consumers to understand the value of their data, how privacy risks are managed and the benefits from permitting wider use. However, helping consumers to understand should not simply be translated into reassuring them that there is nothing to worry about. It should be recognised that people can have legitimate concerns about the opacity and the impact of big data analytics.

21. The view that people, and in particular younger people, are not really concerned about how their data is used, because they appear to be free in sharing it, is too simplistic. There is evidence of common privacy concerns across demographics. People's attitudes also depend on how they perceive the sensitivity of different types of data and how much they trust the organisation concerned. The research published by Sciencewise in 2014[11] is a good overview of the public's concerns and how they differ depending on various circumstances.

22. Helping consumers to understand should also include informing the public of their rights as data subjects, including the right to request their own data, with the option of receiving it in an open, portable format where appropriate, and the right to object to certain processing. This is part of helping people to become informed and confident digital citizens. It also means highlighting instances where big data is helping to make life easier for consumers, for example if analytics are used to make the process of applying for insurance easier.

23. Education plays an important role in this and in making people aware of how their data may be used in a big data context. To assist with this, we have produced a set of lesson plans for primary and secondary schools dealing with data protection issues.[12] In a similar vein, the iRights initiative[13] promotes information rights for young people in the digital world.

Privacy seals

24. A further way to improve public understanding is to provide a visible stamp of approval, or kite mark, to show that particular instances of data processing demonstrate good practice in data protection compliance.
The ICO is actively working to introduce a privacy seals scheme,[14] under which we would endorse scheme operators who would award an ICO privacy seal to organisations that meet the assessment criteria and can demonstrate that they are following the highest data protection standards. There has been considerable interest in this idea since we first proposed it, and the forthcoming GDPR is likely to include a provision to encourage the use of similar schemes. In the context of big data, we think privacy seals have an important role to play in improving transparency, building trust and promoting good practice. The first ICO-endorsed schemes are likely to be operational in 2017.

[11] Sciencewise. Public views on big data. 2014. http://www.sciencewise-erc.org.uk/cms/public-views-on-big-data/
[12] Information Commissioner's Office. Resources for schools. https://ico.org.uk/for-organisations/resources-for-schools/
[13] http://irights.uk/
[14] Farmer, Gemma. What you need to know about ICO privacy seals. ICO, 28 January 2015. https://iconewsblog.wordpress.com/2015/01/28/what-you-need-to-know-about-ico-privacy-seals/

Securing informed consent

25. The issue of how to secure informed consent in the context of big data (or indeed whether it is possible to do so) has been much discussed, and it has been suggested that the concept of notice and consent does not work in relation to big data. We do not accept that view, but we do agree that it is more challenging to secure consent in big data, and we point out that there are alternatives.

26. Providing information about the processing through a privacy notice is a requirement of data protection, and it is essential to obtaining consent. In the section above on the opportunities for big data and the risks, we noted the difficulties that can arise in providing privacy notices for big data. To address these, we would firstly emphasise that the DPA requires data controllers to explain the purposes of the processing, rather than the detail of the analytics. Secondly, we consider that big data challenges organisations to seek innovative ways to deliver privacy information, for example by using graphics and videos, and by providing layered and just-in-time notices. Our guidance on Privacy in mobile apps[15] gives examples of these. We are also in the process of updating our code of practice on privacy notices and will consult on a new version in October 2015.

27. Consent is one of the conditions which allow an organisation to process personal data. The consent must be freely given, specific and informed. If an organisation is collecting personal data to use in big data analytics, and it is relying on consent to legitimise this, then it has to make people aware of all the intended uses of the data, including, for example, whether it is going to share the data with other organisations. Similarly, if an organisation is acquiring data from elsewhere, it has to satisfy itself that the original consent covers that further use of the data. Given the complex and sometimes unforeseen uses of data in big data analytics, this can of course be problematic. Furthermore, freely given means that people can also withdraw their consent. Records of consent should be held in open formats to enable consent to be easily understood and respected by different systems. As part of our work on nuisance calls we are exploring how standards can be developed to support this.[16] Open standards should also be developed to revoke consent across different systems.

28. Consent is not the only possible condition, and, given the difficulties with it, big data organisations should consider whether other conditions would be more appropriate. In particular, an alternative condition is that the processing is necessary for the legitimate interests of the organisation processing the data. The organisation may have a number of relevant legitimate interests, such as profiling customers in order to target its marketing, preventing fraud or the misuse of services, or physical or IT security. However, in that case the processing must be necessary for those legitimate interests; if there is another way to meet them which would not involve processing personal data, then the processing is not necessary. Furthermore, if the processing represents an unwarranted interference with people's rights, freedoms and legitimate interests, then the condition does not apply. This means that the organisation must assess the impact of the processing on people's privacy and balance this against the interests it has identified. This condition therefore places the responsibility on the organisation to take account of all these factors and reach a balanced decision, rather than making the individual responsible for deciding whether to give their consent.

[15] Information Commissioner's Office. Privacy in mobile apps. ICO, December 2013. https://ico.org.uk/media/for-organisations/documents/1596/privacy-in-mobile-apps-dp-guidance.pdf

Any further support needed from Government to facilitate R&D on Big Data, including to secure the required capital investment in Big Data research facilities and for their ongoing operation

29. We consider that anonymisation is a key issue in big data analytics, and part of the research effort could usefully be directed to this. In many cases, big data analytics does not need to use data that identifies individuals, if the aim is to detect general trends and characteristics.
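Detecting trends without identifying data can be pictured with a short sketch: direct identifiers are dropped, attributes are coarsened, and only group-level counts are reported, with any group smaller than a minimum size suppressed (a basic k-anonymity style threshold). This is an illustration only; the field names and the threshold of 3 are hypothetical, and real anonymisation needs the kind of risk assessment described above.

```python
# Illustrative sketch: reporting a general trend without identifying data.
# Identifiers are dropped, ages are coarsened into bands, and any group
# with fewer than k members is suppressed. Field names and the default
# threshold are hypothetical examples.

from collections import Counter

records = [
    {"name": "A. Smith", "age": 34, "region": "North", "spend": 120},
    {"name": "B. Jones", "age": 37, "region": "North", "spend": 95},
    {"name": "C. Khan",  "age": 31, "region": "North", "spend": 60},
    {"name": "D. Lee",   "age": 29, "region": "South", "spend": 200},
]

def age_band(age: int) -> str:
    """Coarsen an exact age into a broad band, e.g. 34 -> '30s'."""
    return f"{(age // 10) * 10}s"

def aggregate(records: list[dict], k: int = 3) -> dict:
    """Count records per (age band, region) group, dropping names and
    other identifiers, and suppress any group with fewer than k members."""
    groups = Counter((age_band(r["age"]), r["region"]) for r in records)
    return {group: n for group, n in groups.items() if n >= k}

# Only groups large enough to avoid singling anyone out are reported.
print(aggregate(records))  # -> {('30s', 'North'): 3}
```

The single-member groups ('50s' would-be outliers, the lone 'South' record) never appear in the output, which is the property that lets trends be reported without the underlying records remaining personal data.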
If the data is anonymised so that individuals can no longer be identified from it, either alone or in combination with other data, then it is not personal data and is not covered by the DPA. Doing this removes an area of risk for the organisation, and it also means that they can assure people that data which identifies them is not being used.

30. However, this is a complex and indeed a controversial area. Some studies have suggested that anonymisation does not work, and that it is not possible to completely anonymise a dataset; however, other studies have challenged those findings. Although there are examples of inadequate anonymisation, this does not mean that it cannot be done effectively.

31. The UK Anonymisation Network,[17] which was initiated by the ICO, provides an expert resource on anonymisation techniques for practitioners. There may be scope for more work on this, in order to develop techniques that can be applied by practitioners in different sectors and big data applications.

32. The development of big data increases the risk that individuals may be re-identified from apparently anonymised datasets. If an organisation or an individual does this, then in DPA terms they become the data controller for that data. They take on all the responsibilities of a data controller, including telling the individuals concerned that they are processing their personal data. If they process personal data without individuals' knowledge, and there is a risk of harm to those individuals, then the Commissioner may take regulatory action, including the imposition of a civil monetary penalty of up to £500,000. However, we propose that there is merit in considering whether the introduction of a specific criminal offence would be more appropriate and provide a stronger deterrent for those who deliberately seek to re-identify individuals.

[16] Which? nuisance calls taskforce on consent and lead generation. Which?, 2014. http://www.which.co.uk/documents/pdf/nuisance-calls-task-force-recommendations-388317.pdf
[17] http://ukanon.net

Information Commissioner's Office, 3 September 2015
Version 1.0 (final)