PEER REVIEW EVALUATION PROCESS OF MARIE CURIE ACTIONS UNDER EU'S FP7


PEER REVIEW EVALUATION PROCESS OF MARIE CURIE ACTIONS UNDER EU'S FP7
David Pina, Research Executive Agency, European Commission, Brussels, Belgium
Darko Hren, Department of Psychology, School of Humanities and Social Sciences, University of Split, Split, Croatia
Ana Marušić, Department of Research in Biomedicine and Health, School of Medicine, University of Split, Split, Croatia

Marie Curie Actions
- EU fellowship programmes for researcher mobility since 1990, under the Marie Curie name since 1996
- Aim: structuring training, mobility and career development for researchers
- Budget under FP7 (2007-2013): EUR 4.75 billion

Marie Curie Actions
FP7 actions: ITN, IEF, IOF, IIF, IAPP. Under Horizon 2020 these are regrouped into three action lines:
- Action 1: ITN, Innovative Training Networks (early-stage researchers). Support for doctoral and early-stage training: European Training Networks, European Industrial Doctorates, European Joint Doctorates.
- Action 2: IF, Individual Fellowships (experienced researchers). Support for experienced researchers undertaking international and inter-sector mobility (European Fellowships and Global Fellowships), with dedicated support for career restart and reintegration.
- Action 3: RISE, Research and Innovation Staff Exchange (exchange of staff). International and inter-sector cooperation through the exchange of staff.

Marie Curie Actions: budget distribution by scientific panel in FP7
- Life Sciences 27%, Information Science and Engineering 17%, Physics 13%, Environmental and Geo-sciences 11%, Chemistry 10%, Social Sciences and Humanities 9%, COFUND 8%, Mathematics 3%, Economics 2%

Key figures:
- 60 000 researchers financed since the creation of the Marie Curie Actions
- More than 10 000 PhDs supported in FP7
- Marie Curie researchers come from all over the world (around 130 nationalities)
- Marie Curie host organisations in more than 80 countries
- 46% of researchers coming to the EU from industrialised countries stay in Europe after the end of their IIF fellowship
- 38% women participation in FP7 MCA, close to the 40% target

Marie Curie Actions: scoring scale
- 5.0: Excellent. Successfully addresses all relevant aspects of the criterion in question; any shortcomings are minor.
- 4.0-4.9: Very good. Addresses the criterion very well, although certain improvements are still possible.
- 3.0-3.9: Good. Addresses the criterion well, although improvements would be necessary.
- 2.0-2.9: Fair. Broadly addresses the criterion, but there are significant weaknesses.
- 1.0-1.9: Poor. The criterion is addressed in an inadequate manner, or there are serious inherent weaknesses.
- 0: Fails to address the criterion, or the criterion cannot be judged due to missing or incomplete information.

Marie Curie Actions: evaluation criteria
- S&T Quality
- Training (ITN, IEF) or Transfer of Knowledge (IAPP)
- Researcher (IEF)
- Implementation
- Impact

Marie Curie Actions: criteria weighting (ITN example)
- S&T Quality 30%, Training 20%, Implementation 30%, Impact 20%
- Example: 4.2 × 0.3 + 4.7 × 0.2 + 3.8 × 0.3 + 4.4 × 0.2 = 4.22
- Final score: 4.22 × 20 = 84.40 (out of a maximum of 100)
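
To make the arithmetic explicit, here is a minimal Python sketch of the weighted-score calculation above. The criterion names and scores are the example values from the slide; the helper function itself is illustrative, not part of the official procedure.

    WEIGHTS = {"S&T Quality": 0.3, "Training": 0.2, "Implementation": 0.3, "Impact": 0.2}

    def final_score(scores):
        # Weighted average of the criterion scores (0-5 scale), rescaled to 0-100.
        weighted = sum(scores[criterion] * weight for criterion, weight in WEIGHTS.items())
        return round(weighted * 20, 2)

    example = {"S&T Quality": 4.2, "Training": 4.7, "Implementation": 3.8, "Impact": 4.4}
    print(final_score(example))  # 84.4, as in the slide example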

Aim of the study
- To examine the peer-review evaluation process in three Marie Curie Actions (ITN, IEF, IAPP)
- To assess the agreement among raters in the different phases of the evaluation workflow

Data sources
- IAPP: 2007 to 2009 and 2011 (4 calls)
- ITN: 2008 and 2010 to 2012 (4 calls)
- IEF: 2007 to 2013 (7 calls)
Total: n=24 897 proposals; n=74 691 individual evaluation reports (reviews)

Agreement among reviewers: the Average Deviation (AD) index
(Burke MJ, Finkelstein LM, Dusig MS. On average deviation indices for estimating interrater agreement. Organizational Research Methods. 1999;2:49-68.)
- A measure of disagreement obtained by averaging the absolute differences between individual raters' scores and the mean score of all raters
- Does not require the specification of a null distribution
- Estimates inter-rater disagreement in the units of the original scale
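
As an illustration of the definition above, a minimal Python sketch of the AD index for a single proposal, following Burke, Finkelstein and Dusig (1999). The three scores are invented for the example and are not data from the study.

    def ad_index(scores):
        # Mean absolute deviation of the raters' scores from their common mean.
        mean = sum(scores) / len(scores)
        return sum(abs(s - mean) for s in scores) / len(scores)

    print(ad_index([82.0, 78.5, 90.0]))  # three hypothetical IER scores on the 0-100 scale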

Results: mean score (±SD, n) by panel and level of rater disagreement

Panel | All proposals | All raters agree | One rater differs | All raters differ | AVIER vs CR difference
Total | 79.8±11.0 (n=24897) | 81.0±10.1 (n=21398) | 74.0±13.1 (n=1424) | 70.9±12.8 (n=2075) | 69.3±19.8 (n=368)
Chemistry | 81.0±9.8 (n=2665) | 81.9±9.2 (n=2362) | 75.3±13.2 (n=132) | 73.2±10.0 (n=171) | 70.6±19.9 (n=32)
Economic and Social Sciences/Humanities | 78.1±12.9 (n=4677) | 79.8±12.4 (n=3646) | 74.6±13.1 (n=431) | 70.7±12.9 (n=600) | 73.1±19.5 (n=142)
Information Science/Engineering | 76.9±11.9 (n=2983) | 78.3±11.1 (n=2478) | 70.9±13.7 (n=199) | 69.2±12.7 (n=306) | 62.7±18.0 (n=50)
Environment | 80.4±10.4 (n=3243) | 81.5±9.4 (n=2860) | 74.5±13.3 (n=153) | 70.1±13.8 (n=230) | 66.1±20.9 (n=42)
Life Sciences | 80.9±10.3 (n=7658) | 82.0±9.4 (n=6785) | 74.5±13.3 (n=354) | 71.4±13.2 (n=519) | 65.8±20.4 (n=71)
Mathematics | 78.2±10.2 (n=731) | 79.6±8.6 (n=623) | 71.1±15.2 (n=41) | 69.2±13.6 (n=67) | 79.1±9.6 (n=5)
Physics | 80.8±9.2 (n=2940) | 81.6±8.5 (n=2644) | 75.3±11.7 (n=114) | 72.4±12.0 (n=182) | 72.4±17.9 (n=26)

ICC (one-way random) range across panels: 0.46-0.64; overall ICC = 0.67, 95% CI = 0.66-0.68
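
For readers unfamiliar with the statistic quoted above, a minimal Python sketch of a one-way random-effects ICC. The ratings matrix is dummy data (rows are proposals, columns are the three raters), not the study data set.

    import numpy as np

    def icc_oneway(ratings):
        # ICC(1): (MSB - MSW) / (MSB + (k - 1) * MSW) for an n x k ratings matrix.
        ratings = np.asarray(ratings, dtype=float)
        n, k = ratings.shape
        grand_mean = ratings.mean()
        row_means = ratings.mean(axis=1)
        msb = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)          # between-proposal mean square
        msw = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))  # within-proposal mean square
        return (msb - msw) / (msb + (k - 1) * msw)

    dummy = [[82, 79, 85], [60, 55, 58], [92, 90, 88], [70, 74, 69]]  # rows = proposals, columns = raters
    print(icc_oneway(dummy))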

Results: number of proposals (row %) with disagreement, by action type

Action | One rater differs | All raters differ | AVIER vs CR difference
IAPP (n=759) | 71 (9.4%) | 124 (16.3%) | 23 (3.0%)
ITN (n=3545) | 280 (7.9%) | 415 (11.7%) | 104 (2.9%)
IEF (n=20593) | 1073 (5.2%) | 1536 (7.5%) | 241 (1.2%)

Results: distribution of differences between Consensus Report (CR) and average Individual Evaluation Report (AVIER) scores
- Mean = -0.3, SD = 3.19
- 61.4% of all proposals had less than 2 points difference between the AVIER and CR scores
(IER = individual evaluation report; AVIER = average of the IERs from remote evaluation; CR = consensus report)

Results
- Overall median AD index = 5.4 points (on a scale of 0-100)
- For three quarters of all proposals, the AD index was equal to or below 8.3 points

Results
- More disagreement for proposals with lower scores
(IER = individual evaluation report; AVIER = average of the IERs from remote evaluation; CR = consensus report; AD = average deviation)

Results: number of proposals (row %) with disagreement, by scientific panel

Panel | One rater differs | All raters differ | AVIER vs CR difference
Chemistry (n=2665) | 132 (5.0) | 171 (6.4) | 32 (1.2)
Economic and Social Sciences/Humanities (n=4677) | 431 (9.2) | 600 (12.8) | 142 (3.0)
Information Science/Engineering (n=2983) | 199 (6.7) | 306 (10.3) | 50 (1.7)
Environment/Geosciences (n=3243) | 153 (4.7) | 230 (7.1) | 42 (1.3)
Life Sciences (n=7658) | 354 (4.6) | 519 (6.8) | 71 (0.9)
Mathematics (n=731) | 41 (5.6) | 67 (9.2) | 5 (0.7)
Physics (n=2940) | 114 (3.9) | 182 (6.2) | 26 (0.9)
Total (n=24897) | 1424 (5.7) | 2075 (8.3) | 368 (1.5)

Scenario 1 (one rater differs): one rater scores a proposal in a completely different way than the other two: (a) the other two agree, i.e. the difference between their scores is at most 5 points (5.4 was the median AD for all proposals), and (b) the remaining rater differs by at least 10 points, which puts the difference above the 3rd quartile of all AD indices for IER scores.

Scenario 2 (all raters differ): the difference between each pair of IER scores is at least 10 points (on a scale of 0-100).

Scenario 3 (AVIER vs CR difference): the absolute difference between the CR and AVIER scores is at least 10 points (scale 0-100). Positive and negative differences were equally distributed (180 or 48.9% positive and 188 or 51.1% negative), and these proposals had significantly lower CR scores than other proposals (69.3±19.8 vs 79.8±11.0; p<0.001).
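
The three scenarios can be summarised as a simple classification rule per proposal. The Python sketch below is an illustrative reading of the thresholds quoted on the slides (5 and 10 points), not the study's actual implementation, and the example scores are invented.

    def disagreement_flags(ier_scores, cr_score, close=5.0, far=10.0):
        # Returns (one rater differs, all raters differ, AVIER vs CR difference) for one proposal.
        low, mid, high = sorted(ier_scores)
        avier = sum(ier_scores) / len(ier_scores)
        one_rater_differs = (mid - low <= close and high - mid >= far) or \
                            (high - mid <= close and mid - low >= far)
        all_raters_differ = (mid - low >= far) and (high - mid >= far)
        avier_vs_cr = abs(avier - cr_score) >= far
        return one_rater_differs, all_raters_differ, avier_vs_cr

    print(disagreement_flags([82, 80, 65], cr_score=70))  # (True, False, False)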

Results: Pearson inter-correlations of IER criteria scores of the three raters (upper triangle; Train = Training/ToK, Res = Researcher, Impl = Implementation)

 | R1 S&T | R1 Train | R1 Res | R1 Impl | R1 Impact | R2 S&T | R2 Train | R2 Res | R2 Impl | R2 Impact | R3 S&T | R3 Train | R3 Res | R3 Impl | R3 Impact
R1 S&T | 1 | 0.698 | 0.600 | 0.668 | 0.693 | 0.291 | 0.279 | 0.231 | 0.278 | 0.274 | 0.296 | 0.290 | 0.231 | 0.289 | 0.282
R1 Train | | 1 | 0.582 | 0.718 | 0.740 | 0.282 | 0.361 | 0.248 | 0.319 | 0.324 | 0.270 | 0.357 | 0.236 | 0.324 | 0.320
R1 Res | | | 1 | 0.582 | 0.646 | 0.217 | 0.231 | 0.293 | 0.230 | 0.241 | 0.234 | 0.246 | 0.306 | 0.249 | 0.251
R1 Impl | | | | 1 | 0.740 | 0.281 | 0.330 | 0.247 | 0.360 | 0.328 | 0.282 | 0.335 | 0.254 | 0.367 | 0.330
R1 Impact | | | | | 1 | 0.278 | 0.325 | 0.251 | 0.318 | 0.341 | 0.277 | 0.327 | 0.260 | 0.328 | 0.341
R2 S&T | | | | | | 1 | 0.694 | 0.590 | 0.668 | 0.685 | 0.295 | 0.286 | 0.230 | 0.285 | 0.276
R2 Train | | | | | | | 1 | 0.583 | 0.713 | 0.734 | 0.287 | 0.369 | 0.250 | 0.335 | 0.328
R2 Res | | | | | | | | 1 | 0.564 | 0.639 | 0.228 | 0.240 | 0.294 | 0.244 | 0.244
R2 Impl | | | | | | | | | 1 | 0.730 | 0.282 | 0.332 | 0.245 | 0.367 | 0.330
R2 Impact | | | | | | | | | | 1 | 0.275 | 0.322 | 0.256 | 0.329 | 0.342
R3 S&T | | | | | | | | | | | 1 | 0.695 | 0.606 | 0.665 | 0.690
R3 Train | | | | | | | | | | | | 1 | 0.589 | 0.710 | 0.737
R3 Res | | | | | | | | | | | | | 1 | 0.573 | 0.645
R3 Impl | | | | | | | | | | | | | | 1 | 0.733
R3 Impact | | | | | | | | | | | | | | | 1

- Low correlations between different raters' scores for the same criterion on the same proposal
- High correlations between the same rater's scores on different criteria of the same proposal
- Raters scored proposals in a more holistic way and, generally, assessed each criterion in relation to the other criteria of the same proposal
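
A minimal Python sketch of the correlation analysis above: Pearson correlations among the 15 rater-criterion score columns. The data frame is filled with random dummy values purely to show the computation, not to reproduce the study's figures.

    import numpy as np
    import pandas as pd

    criteria = ["S&T quality", "Training/ToK", "Researcher", "Implementation", "Impact"]
    columns = [f"Rater {r}: {c}" for r in (1, 2, 3) for c in criteria]

    rng = np.random.default_rng(0)
    scores = pd.DataFrame(rng.uniform(0, 5, size=(100, 15)), columns=columns)  # dummy proposal-level scores

    corr = scores.corr(method="pearson")  # 15 x 15 Pearson correlation matrix
    print(corr.round(3))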

Results
- Principal components analysis of the evaluation criteria was used to investigate the latent structure underlying the set of items (criteria scored by three raters)
- It yielded three components, each representing a single rater
- This confirmed our conclusion that criteria scores reflected the rater's global score rather than specific aspects of the proposal
- The three-component solution explained a large portion of the variance (73%), and component loadings were very high (all above 0.7)
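
A minimal Python sketch of the principal components analysis described above, using simulated scores in which each rater applies a single global impression to all five criteria; with such data the analysis recovers roughly one component per rater, mirroring the study's finding. The simulation parameters are illustrative assumptions, not estimates from the study.

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    global_impression = rng.normal(size=(500, 3))  # one simulated "holistic" score per rater and proposal
    scores = np.concatenate(
        [np.repeat(global_impression[:, [r]], 5, axis=1) for r in range(3)], axis=1
    ) + 0.3 * rng.normal(size=(500, 15))           # five criteria per rater, plus noise

    pca = PCA(n_components=3)
    pca.fit(scores)
    print(pca.explained_variance_ratio_.sum())     # share of variance captured by three components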

Conclusions
- Good internal consistency and overall high agreement among expert reviewers
- Disagreement was greater for proposals with lower scores
- At least for some proposals, the remote assessments and their average score (AVIER) can provide a reliable final judgment of the proposal (especially for IF)

Conclusions
- About 15% of the proposal population may need more discussion in order to reach consensus on the final score
- IAPP and ITN calls had a greater proportion of proposals with disagreements, demonstrating that the evaluation of complex proposals, involving partnerships of several research groups with multidisciplinary and inter-sectoral features, requires a more elaborate review procedure