International comparison of education systems: a European model? Paris, November 2008


International comparison of education systems: a European model?
Paris, 13-14 November 2008
Workshop 2 - Higher education: Type and ranking of higher education institutions

Interim results of the Expert Group on the Assessment of University-Based Research (AUBR), convened by the European Commission's DG for Research
Wolfgang Mackiewicz (Freie Universität Berlin, DE)
Paris, 13-11-2008

Mandate of the AUBR Expert Group
- Identify the various types of users and potential users of measurements of the quality of university-based research.
- Take stock of the main methodologies for assessing the quality of university-based research with a view to understanding their purpose, scope, uses, merits, limitations, and impact.
- Propose a consolidated multidimensional methodological approach, based on robust, relevant and widely accepted methods, addressing users' needs and interests, and identifying data and indicator requirements.

Mandate of the AUBR Expert Group (cont.)
- The AUBR EG is not to develop a methodology for ranking.
- The AUBR EG is not to deal with assessment of university-based teaching; however, the AUBR EG is aware of the relevance of the quality of research to the quality of teaching.
- Point of departure: different user groups approach assessments of UBR with different purposes, needs, and interests in mind; hence the need for a multidimensional methodological approach to the assessment of UBR.

POLICY CONTEXT
(i) Communication of May 2006, «Delivering on the Modernisation Agenda for Universities: Education, Research and Innovation»
Select key points:
- call for higher investment in university-based research (UBR)
- universities should be funded more for what they do than for what they are
- call for robust quality assurance of UBR

POLICY CONTEXT (cont.)
- «Competitive funding should be based on institutional evaluation systems and on diversified performance indicators with clearly defined targets and indicators supported by international benchmarking for both inputs and economic and societal outputs.»
- call for focusing less on scientific disciplines and more on research domains; hence the importance of cross-disciplinarity
- universities need to communicate the relevance of their research activities to society / stakeholders
- excellence emerges mainly at faculty / department level

POLICY CONTEXT (cont.)
- relevance of UBR to the Lisbon goals
- the overarching issues of QUALITY, TRANSPARENCY and COMPARABILITY
(ii) Council Resolution of December 2007, «Modernising universities for Europe's competitiveness in a global knowledge economy»
Select key points:
- globalisation => Europe's universities should aim to become worldwide competitive players

POLICY CONTEXT (cont.)
- Member States invited to promote the internationalisation of HEIs by encouraging quality assurance through independent evaluation and peer review of universities

COMMISSION CONTEXT (2008-9)
Three Groups:
1) «Impact of external research funding on financial management in universities» (12/2008)
- universities should adapt themselves to competitive project-based research funding, which is becoming an increasingly important stream of public funding for research
2) CREST Member States Working Group on «Mutual learning on approaches to improve the excellence of research in universities» (01/2009)
- universities have to enhance the quality and relevance of their research

COMMISSION CONTEXT (2008-9) (cont.)
CREST Group to:
- take into account the needs concerning the measurement of the excellence of UBR and what role the various university rankings play in this context
- consider various approaches to the funding of UBR and related methodologies to assess the quality of research
- identify needs for further improving assessment methodologies of research performance as input for research funding
3) Expert Group on «Assessment of University-Based Research» (07/2009)

AUBR
Identification and analysis of five interrelated key elements:
- USERS
- RESEARCH
- DISCIPLINES
- METHODS
- IMPACT

Anticipated users
- HE management and governance
  o Governing bodies / councils
  o HE executives / management
  o HE research groups
- Governments
  o European Commission
  o Member State governments
  o HE ministries
  o Local and regional governments
  o HE agencies
- Public funding organisations
- Peer review committees

Anticipated users (cont.)
- Individuals
  o Academics and researchers
  o Graduates
- Peer HEIs
- Industry partner organisations
  o Private companies and entrepreneurs
  o Public organisations
  o Employers
- Sponsors and private investors
  o Benefactors / philanthropists
  o Alumni
- Public opinion

User groups and uses of research assessment (i): HE executives/management
For what purpose do they require research assessment data?
- Policy and planning
- Strategic positioning
- Research development / management strategy
- Investor confidence / value-for-money and efficiency
- Quality assurance
- Publicity
- Graduate and academic recruitment
What data is required?
- Discipline / field data re level of intensity, expertise, quality and competence
- Benchmarking against peer institutions
- Efficiency level: how much output vis-à-vis funding
- Quality of academic staff and PhD students
- Attraction capacity: recruitment of graduates/academics/researchers from outside the region / internationally

User groups and uses of research assessment (ii): Academics and researchers
For what purpose do they require research assessment data?
- Identify career opportunities
- Identify research partners
- Identify best research infrastructure and support for research
What data is required?
- Institutional / field data re level of intensity, expertise, quality, competence, and sustainability
- Performance of individual institution benchmarked against peers in field of interest
- Impact of research on teaching
- Institutional research support, incl. infrastructure

Research
The AUBR subscribes to an inclusive concept of research, ranging from blue-sky / curiosity-driven research to user-led / practice-based research.
General definition adopted (HEFCE/RAE): "original investigation undertaken in order to gain knowledge and understanding".
Research is not identical with research output. The following dimensions should be distinguished: input, process, output, outlet, and impact/outcome. Different dimensions may be of specific interest to different user groups.

Disciplines
AUBR should cover the whole range of disciplines, from the natural sciences to arts and design. The methodology to be proposed should facilitate the assessment of trans-, multi- and interdisciplinary research, and of research carried out in emerging new disciplines.
Different groups of (sub)disciplines produce different types of output. For example, peer-reviewed journal articles are a typical output of specific (sub)disciplines only.

Methods / Indicators / Impact
N.B. The Group has not yet discussed these elements in detail.
(i) Productivity indicators (how many? how much?)
- research publications and other outputs
- completions of research training degrees
- research-active academics
- research income
(ii) Quality and scholarly impact (how good? how significant? what impact on the body of knowledge in the field?)
- publications in top-ranked, high-impact journals and other outlets (ranking of outlets is discipline-specific)
- citations (of limited use in a number of fields)
- peer esteem

Methods / Indicators / Impact (cont.)
(iii) Innovation and socio-economic benefit (what contribution is made to the economy and broader society?)
N.B. There may be a significant time-lag between the conduct of the research and the impact.
- demonstrated benefits
- likelihood of impact: (i) engagement through research collaboration or funding research; (ii) uptake of research to generate new policies / products / processes / attitudes / behaviours / outlooks
- research income (disadvantage: lack of demonstrated correlation between funding source and eventual actual impact)
- industry employment of PhD graduates
- commercialisation revenue and equity
- end-user esteem

(iv) Sustainability and scale of research enterprise
- sustainability (postgraduate research student load; involvement of early-career researchers; accessibility of research infrastructures and facilities)
- scale (number of collaborations and partnerships)
- inter- and transdisciplinarity

A few key messages
- Units of assessment = knowledge clusters, not entire universities; the methodology proposed should allow aggregation to institutional level.
- Information needs to be provided on all the factors used in a given assessment. This way, users may decide for themselves how the indicators used should be weighted (a minimal illustration follows below).
- Indicators must be useful, relevant, comparable, reliable, and feasible.
- Use should be made of audited and verifiable data whenever possible.
- Critical test of the assessment methodology: accommodation of diversity in university research.
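To make the idea of user-chosen weights concrete, here is a minimal sketch in Python, assuming hypothetical per-dimension scores already normalised to a 0-1 scale. The dimension names follow the indicator families listed above; the scores, the weights and the simple weighted average are illustrative assumptions, not an AUBR methodology.

    # Illustrative sketch only (not an AUBR method): users choose their own weights
    # for the indicator dimensions and combine already-normalised scores.
    from typing import Dict

    def weighted_assessment(scores: Dict[str, float], weights: Dict[str, float]) -> float:
        """Combine per-dimension scores (0-1) using user-chosen weights."""
        total_weight = sum(weights.values())
        if total_weight == 0:
            raise ValueError("At least one weight must be positive")
        return sum(scores[dim] * w for dim, w in weights.items()) / total_weight

    # Hypothetical example: a user group that emphasises scholarly quality and
    # socio-economic benefit over the other dimensions.
    scores = {
        "productivity": 0.7,
        "quality_and_scholarly_impact": 0.8,
        "innovation_and_socio_economic_benefit": 0.6,
        "sustainability_and_scale": 0.5,
    }
    weights = {
        "productivity": 1.0,
        "quality_and_scholarly_impact": 2.0,
        "innovation_and_socio_economic_benefit": 3.0,
        "sustainability_and_scale": 1.0,
    }
    print(round(weighted_assessment(scores, weights), 3))  # 0.657

Publishing the underlying factor scores alongside any composite, as the key messages require, is what allows each user group to rerun such a calculation with its own weights.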

A few key messages (cont.)
- Not all European institutions want to be global players, but among those that do not there may well be institutions that wish to excel in research of one kind or another.
- Assessment not just of past performance, but also of potential for future performance.
- Need for common terminology; hence the AUBR EG is to create a glossary.

From complexity to feasibility
The analysis of the various elements has provided evidence of the complexity of the task at hand. A way out of this: make PURPOSE / OBJECTIVE a determining factor in a given assessment exercise.
- If you want to use assessment to allocate resources inside a HEI, then use ...
- If you want to use assessment to improve performance, then use ...
- If you want to use assessment to attract talent, then use ...
Examples like these would be made available in a kind of toolbox (a sketch of the idea follows below). Also, advice on when to combine quantitative and qualitative metrics. We will hopefully have a typology of research assessments.
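In the simplest reading, such a purpose-driven toolbox is a lookup from assessment purpose to a recommended indicator set. The Python sketch below illustrates that idea only; the purpose labels come from the slide, while the indicator choices are placeholder assumptions (the slide deliberately leaves these blanks open) and do not represent the AUBR EG's eventual toolbox.

    # Hypothetical sketch of a purpose-driven toolbox: purpose -> indicator set.
    # The indicator choices below are arbitrary examples drawn from the indicator
    # lists earlier in the presentation, not AUBR recommendations.
    from typing import List

    TOOLBOX = {
        "allocate resources inside a HEI": [
            "research income",
            "research-active academics",
            "completions of research training degrees",
        ],
        "improve performance": [
            "citations",
            "peer esteem",
            "benchmarking against peer institutions",
        ],
        "attract talent": [
            "publications in top-ranked, high-impact journals",
            "institutional research support, incl. infrastructure",
        ],
    }

    def indicators_for(purpose: str) -> List[str]:
        """Return the indicator set suggested for a given assessment purpose."""
        try:
            return TOOLBOX[purpose]
        except KeyError:
            raise ValueError(f"No toolbox entry for purpose: {purpose!r}") from None

    print(indicators_for("attract talent"))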

From complexity to feasibility: next steps
- preparation and analysis of case studies of current AUBR practices
- thorough discussion of the complex of data, indicators and methods, with a view to producing a prototype toolbox
- presentation for discussion of preliminary outcomes at a workshop attended by a substantial number of external experts and stakeholder representatives
- Final Report: Towards a European Framework for the Assessment of University-Based Research
- Follow-on activities: piloting, and further elaboration of the multidimensional methodology proposed