Societal and Ethical Challenges in the Era of Big Data: Exploring the emerging issues and opportunities of big data management and analytics


Societal and Ethical Challenges in the Era of Big Data: Exploring the Emerging Issues and Opportunities of Big Data Management and Analytics
June 28, 2017, 11.00-12.45, ICE/IEEE Conference, Madeira, Portugal
e-sides Workshop: Ethical and Societal Implications of Data Sciences

Agenda:
- Presentation of e-sides
- Presentation of workshop objectives
- Ethical and legal issues overview
- Interactive session on ethical and legal issues
- Presentation of results and discussion
- Societal and economic issues overview
- Interactive session on societal and economic issues
- Presentation of results and discussion
- How to get involved with e-sides
- Next steps and conclusions

Speakers:
- Richard Stevens, Research Director, IDC European Government Consulting
- Gabriella Cattaneo, Associate VP, IDC European Government Consulting
- Daniel Bachlechner, Senior Researcher, Fraunhofer ISI
- Stefania Aguzzi, Senior Consultant, IDC European Government Consulting

About e-sides

e-sides in a nutshell
- Involve the complete value chain of big data stakeholders to reach a common vision for an ethically sound approach to processing big data
- Improve the dialogue between data subjects and big data communities (industry, research, policy makers, regulators) and thereby strengthen citizens' confidence in big data technologies and data markets

e-sides partnership roles:
- Coordinator
- Communication
- Community engagement
- Technical partner for socio-economic research
- Technical partner for legal and ethics-related research

e-sides key objectives
- Identify, discuss and validate ethical and societal implications of privacy-preserving big data technologies
- Liaise with the big data community (researchers, business leaders, policy makers and society) through events
- Provide ethical-legal and societal-economic advice to facilitate responsible research and innovation on big data technologies
- Provide a collective community position paper with recommendations for responsible research and innovation on big data

Workshop: Societal and Ethical Challenges in the Era of Big Data

Workshop objectives
- Introduce e-sides
- Present our initial work
- Validate results and collect your inputs

How to interact


Ethical and Legal Issues Overview
Gabriella Cattaneo, IDC European Government Consulting

Ethical versus legal issues
Focus on ethical and legal issues:
- Privacy and security
- Discrimination, stigmatization, polarization
- Consent, autonomy, self-determination
- Transparency, integrity, trust

Ethics asks: how should we behave? Law asks: how must we behave?

Rapid technological developments may cause ethical issues:
- Violations of moral principles (e.g., human dignity)
- Conflicting moral principles (e.g., privacy vs. security)
- New moral principles (e.g., the right to be forgotten, the right not to know)

Legislation may not be up to date to address these issues; ethical issues may therefore become legal issues.

Inherent conflicts between big data and personal data protection
Personal data protection principles vs. big data risks:
- Purpose limitation (legitimate and specified before collection): big data is often used for secondary purposes not yet known at collection time
- Consent (simple, specific, informed and explicit): if the purpose is not clear, consent cannot be asked for
- Lawfulness (consent obtained and/or processing needed for legitimate purposes): without purpose limitation and consent, lawfulness is doubtful
- Necessity and data minimization: big data relies on accumulating data for potential future use
- Transparency and openness: individuals cannot keep track of their data

Why do we care?
Personal data protection principles vs. big data risks:
- Individual rights (to access, rectify, erase/be forgotten): lacking transparency, individuals have difficulty exercising their rights
- Information security: collecting large quantities of data increases the risk of violations and abuse
- Accountability (compliance with principles): compliance does not hold and hence cannot be demonstrated
- Data protection by design and by default

Anonymise data! But: too much anonymization makes the data useless for big data analytics, while too little leaves re-identification possible.
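The anonymization trade-off on this slide can be made concrete with k-anonymity: a dataset is k-anonymous if every combination of quasi-identifier values is shared by at least k records, and generalizing attributes raises k at the cost of precision. The sketch below is purely illustrative; the toy records, attribute names and decade-wide age generalization are assumptions, not part of the e-sides material.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level: the size of the smallest group of
    records sharing identical values on the quasi-identifiers."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

def generalize_age(record, bucket=10):
    """Coarsen the 'age' attribute into decade-wide bands (34 -> 30)."""
    r = dict(record)
    r["age"] = (r["age"] // bucket) * bucket
    return r

records = [
    {"age": 34, "zip": "1010", "diagnosis": "A"},
    {"age": 36, "zip": "1010", "diagnosis": "B"},
    {"age": 35, "zip": "1010", "diagnosis": "C"},
]

print(k_anonymity(records, ["age", "zip"]))  # raw data: k = 1, re-identifiable
generalized = [generalize_age(r) for r in records]
print(k_anonymity(generalized, ["age", "zip"]))  # generalized: k = 3
```

The trade-off is visible directly: generalizing the age raises k from 1 to 3, but an analysis that needed exact ages can no longer be performed on the generalized records.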

Legal issues in the implementation of data protection principles
- Ineffective purpose limitation
- Lack of fully informed consent
- The volume, variety and velocity of big data sets may have unintended negative consequences even when neutral data is processed with legitimate processes
- Blurring of the concept of sensitive data
- Harmful consequences from data processing

Ethical issues: discrimination, stigmatization, polarization
Big data profiling, categorising and classifying of individuals may:
- Weaken solidarity and social cohesion
- Weaken individual autonomy by manipulating choices
- Create bias about some groups of people

Ethical issues: trust, autonomy, self-determination
Information asymmetry, dependency on big data holders, and BDA-driven algorithms making automated choices (selection, pricing) may lead to:
- Loss of trust
- Lack of moral responsibility

Interactive Session on Ethical and Legal Issues
Gabriella Cattaneo, IDC European Government Consulting


Societal and Economic Issues Overview
Daniel Bachlechner, Fraunhofer ISI

Sources
FP7 and H2020 projects assessing the impacts of different big-data-related technologies:
BIG, CANVAS, CAPITAL, CLARUS, Coco Cloud, CONSENT, CRISP, DwB, ENDORSE, ENFORCE, Inter-Trust, SIAM, Socialising Big Data, SURVEILLE, SysSec

Unequal access
Unequal access to data and technology, and information asymmetry, leading to unequal chances
- Not everybody or every organization is in the same starting position with respect to big data
- The digital divide, for instance, refers to inequalities between those who have computers and online access and those who don't
- Access to contact data, a privacy policy or information about data collection, processing and sharing depends on capabilities
- Relevant inequalities also exist between organizations of different industries, sizes and regional contexts
Example: Privacy policies on websites typically require a thorough legal and technological understanding to be fully understood, if they are found at all

Normalization
The classification of people and organizations into broad categories, leading to restricted access to information and services
- People are put into broad categories whose characteristics are determined by what is most common and thus expected to be most likely
- Filter bubbles result when an algorithm selectively guesses what information somebody wants to see based on information about the individual as well as other, similar individuals
- The breadth of choices is restricted and pluralism pushed back
- Normalization also happens at an organizational level but seems to be less critical
Example: Product recommendations in online shops such as Amazon
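The filter-bubble mechanism described above can be sketched with a minimal overlap-based recommender: candidate items are scored by how many tastes their fans share with the target user, so the output gravitates toward what similar people already consume. Everything here (function name, toy users and items) is a hypothetical illustration, not a description of any real recommender.

```python
def recommend(user_likes, all_likes, top_n=2):
    """Score items liked by users whose tastes overlap with the target
    user's; higher overlap means more influence on the ranking."""
    scores = {}
    for other_items in all_likes.values():
        overlap = len(user_likes & other_items)
        if overlap == 0:
            continue  # dissimilar users contribute nothing
        for item in other_items - user_likes:
            scores[item] = scores.get(item, 0) + overlap
    # items backed by the most similar users come first
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

user = {"news_a", "news_b"}
all_likes = {
    "u1": {"news_a", "news_b", "news_c"},  # very similar user
    "u2": {"news_a", "news_d"},            # mildly similar user
    "u3": {"news_e"},                      # no overlap: ignored
}
print(recommend(user, all_likes, top_n=1))  # ['news_c']
```

The narrowing effect is visible in the toy data: an item liked only by a dissimilar user ("news_e") can never surface, however many recommendations are requested.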

Discrimination
Unfair treatment of people and organizations based on certain characteristics, leading to immediate disadvantages and unequal chances
- People or groups are treated differently depending on certain characteristics, including age, disability, ethnicity or gender
- Big data technologies to some extent allow initially unknown characteristics to be inferred from others in the same or other datasets
- Discriminating against people or groups might make economic sense and is difficult to detect
- The data or algorithms on the basis of which people are discriminated against may be incorrect or unreliable
Example: Predictive policing, no-fly lists and personalized pricing are examples where discrimination in the context of big data becomes visible
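One common way to make such discrimination measurable is the disparate impact ratio: the favourable-outcome rate of the worse-off group divided by that of the better-off group, often compared against the "four-fifths" rule of thumb from US employment law. The sketch below, with made-up decisions and group labels, only illustrates the metric; it is not part of the e-sides analysis.

```python
def disparate_impact(outcomes, groups):
    """Ratio of favourable-outcome rates between the two groups in
    `groups` (parallel to `outcomes`, where 1 = favourable decision)."""
    def rate(g):
        decisions = [o for o, grp in zip(outcomes, groups) if grp == g]
        return sum(decisions) / len(decisions)
    a, b = sorted(set(groups))
    r_a, r_b = rate(a), rate(b)
    return min(r_a, r_b) / max(r_a, r_b)

# toy lending decisions: group A is favoured 3 times out of 4, group B once
outcomes = [1, 1, 1, 0, 1, 0, 0, 0]
groups   = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(disparate_impact(outcomes, groups))  # ~0.33, far below the 0.8 threshold
```

A ratio this far below 0.8 would flag the toy decision process for closer scrutiny, though the metric alone cannot say whether the disparity is unjustified.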

Dependency
The dependency of people and organizations on other organizations and on technology, leading to limited flexibility
- People and organizations depend on others collecting or processing data, or providing access to data
- Switching from one organization to another often involves high costs, if it is possible at all
- For many types of data or data-related services there is a limited number of providers, and a considerable share of them is based outside the EU
- Business practices as well as security measures usually cannot be influenced by outsiders
- Organizations are also highly dependent on the data as well as on the big data technologies they use
Examples:
- Data-intensive organizations such as NHS hospitals in the UK had to stop operating after being attacked with ransomware
- The EU Commission has led several competition cases against Google's dominance in several markets (e.g., online advertising)
- Many online services can only be used after providing the requested data (take it or leave it)

Intrusiveness
The intrusion into people's privacy and organizations' business practices, leading to a reduction of freedom
- Big data has integrated itself into nearly every part of people's online and, to some extent, offline experience
- Data is stored for long periods of time, and the potential to analyze it or to integrate it with other data grows
- General suspicion on the part of public authorities and an insatiable organizational appetite for ever more data infringe people's freedom
- The behavior of people, including how they live, work and interact, is affected by intrusive big data applications
Examples:
- The integration of big data and video surveillance is considered to have particular potential for intrusiveness
- CCTV, body cameras and drones are increasingly used without the consent of the people observed
- Supermarkets use video data with face recognition to classify and guide customers

Non-transparency
The lack of transparency of organizational algorithms and business practices, leading to loss of control
- Algorithms are often black boxes: not only opaque but also mostly unregulated and thus perceived as uncontestable
- People and organizations cannot be sure who is collecting, processing or sharing which data
- There are limited means to check whether an organization has taken suitable measures to protect sensitive data
- Law enforcement is often constrained by a lack of resources at public authorities
- There is a lack of practical experience with audits, including privacy impact assessments
Examples:
- Data subjects' right to information is often impossible to exercise
- The right to information is limited to data storage

Abusiveness
The potential for abuse of data and technologies, leading to loss of control and deep mistrust
- Data as well as big data technologies may be used for illegal purposes or for purposes that fall into a legal grey zone
- It is difficult to check the validity of the results of data analyses if they look plausible
- Data or algorithms can be manipulated in order to reach desired results
- The border between data use and abuse is blurry at times
Example: Data collected to remove security flaws may be used by criminals to take over vulnerable systems

Summary
- Unequal access: unequal access to data and technology, and information asymmetry, leading to unequal chances
- Normalization: the classification of people and organizations into broad categories, leading to restricted access to information and services
- Discrimination: unfair treatment of people and organizations based on certain characteristics, leading to immediate disadvantages and unequal chances
- Dependency: the dependency of people and organizations on other organizations and on technology, leading to limited flexibility
- Intrusiveness: the intrusion into people's privacy and organizations' business practices, leading to a reduction of freedom
- Non-transparency: the lack of transparency of organizational algorithms and business practices, leading to loss of control
- Abusiveness: the potential for abuse of data and technologies, leading to loss of control and deep mistrust

Interactive Session on Societal and Economic Issues
Daniel Bachlechner, Fraunhofer ISI


Next Steps and Conclusions

- Attend our upcoming conferences, workshops and webinars (next e-sides event: November 2017, TBD)
- Get access to relevant, novel and continually updated content (next e-sides report presenting the issues: August 2017)
- Contribute to our online community paper on the website at www.e-sides.eu
- Online community initiatives: e-sides.eu

Contact: @esides_eu / esides_eu / www.e-sides.eu / info@e-sides.eu