Research Excellence Framework


Research Excellence Framework. CISG 2008, 20 November 2008. David Sweeney, Director (Research, Innovation, Skills), HEFCE.

Outline: The Policy Context & Principles; REF Overview & History; Bibliometrics; User-Valued Research.

Introduction: the policy context. Covers national policy and HEFCE research funding, the principles of research assessment, and an overview of the REF.

National policy. Government policy: a strong and innovative national research base is essential to support national prosperity in a globalised, knowledge-based economy, and we need to strengthen links between undertaking research and developing new products and services (Science and innovation investment framework, 2004-14).

HEFCE research funding. The HEFCE research grant is part of the dual support system, working together with parallel funding streams from other sources. It is also one of several funding streams from HEFCE (including HEIF, for example) which together support policy aims for HE to contribute to economic growth and innovation.

HEFCE research funding (2). Our grant for research is allocated to enable universities collectively to: maintain a research base of world-leading quality across the full range of disciplines; and create a sustainable and flexible national baseline capacity which enables the sector to respond strategically to a changing external environment, and on which research and other activity funded from other sources can build.

HEFCE research funding (3). Our funding creates the capacity for HEIs to undertake innovative research, including in new fields and opening new lines of enquiry. The potential to make connections across subjects with a technological, business and social focus is particularly important in this context. This must be fully reflected in our assessment and allocation processes.

HEFCE research funding (4). We therefore allocate our funding selectively to support high-quality research of all kinds, wherever it is found. We also strive to avoid both micromanaging the system through targeted allocations or detailed incentives, and taking so broad an approach that we create undesired incentives or fail to support important activity.

Research assessment: principles. HEFCE undertakes research quality assessment: as an assurance mechanism; to provide information and benchmarking for HEIs, research users and others; and as an essential element in our funding allocation process.

REF: overview. REF is an integrated framework for funding and assessment within which we can support activity and create incentives through either or both of these elements. Its primary focus is still excellence, but in the context of the policy aims above, and always taking account of the breadth and diversity of research done in HE.

REF: overview (2). REF is a work in progress: following previous consultations, we have already moved some way from where we started. REF is not only about basic research, nor only about bibliometrics. We are developing a range of assessment approaches which can be combined in different ways to fit particular subject fields and types of research.

REF: overview (3). We see REF as a further development of what we do now, not something completely new. We aim to avoid unhelpful perturbations in funding, and to build upon the lessons of the RAE, including examples of good practice in the RAE that we wish to carry forward into REF.

REF: working with stakeholders. We are developing REF in continuing consultation with partners and stakeholders, and have already heard and acted on views on some aspects. We are committed to reducing the burden on HEIs, but we need to discuss how to achieve this without losing responsiveness to sector concerns.

Directions. We remain committed to supporting an excellent, diverse and innovative research base that responds to policy aims and user needs, and in particular to supporting research that is both excellent and valued by users. REF as it is developing offers us a range of approaches to bring this about.

Background: HEFCE research funding. HEFCE allocates £1.5 billion for research in England. Under dual support, HEFCE funding supports curiosity-driven research and underpins work funded by the research councils, the NHS, charities and others. It is a block grant to be spent at the discretion of the HEI, allocated selectively using measures of quality (RAE results) and volume.

Background: the RAE. The RAE was introduced in 1986 and the 2008 exercise is now in its final stages. The RAE has had a strong positive impact: showing that the block grant is well spent; driving up research quality; and providing a respected source of information about research quality. But there are negative impacts too, and the exercise has arguably run its course in this form.

Background: reform of research assessment and funding. 2006: the DfES proposed moving to a metrics-based system of allocating research funding. Following consultation, HEFCE was asked to develop proposals for a system based on: a bibliometric indicator of quality, research income and research student data for the science-based subjects; and light-touch peer review informed by metrics for the other subjects. 2007-08: HEFCE consulted on proposals for the Research Excellence Framework (REF), and announced some changes in May 2008.

The Research Excellence Framework (REF): a unified framework for research assessment and funding which accommodates differences between disciplines; robust research quality profiles for all subjects across the UK, benchmarked against international standards; an emphasis on identifying and encouraging excellent research of all kinds; greater use of metrics than at present, including bibliometric indicators for all disciplines where these are meaningful; and a reduced burden on HEIs.

Broad approach to assessment. Assessment will be through: bibliometric indicators of quality or expert review of outputs (possibly a combination of these); other quantitative indicators; and supplementary qualitative information. Which of these elements are employed, and the balance between them, will vary as appropriate to each subject. For all subjects, expert panels will advise on the selection, interpretation and combination of the assessment components to produce overall quality profiles.
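
To make the combination step concrete, here is a minimal sketch, in Python, of how per-component results might be merged into an overall quality profile. The component names, weights and five-band scale are illustrative assumptions only; the actual components and their weightings were still subject to consultation, so this is not HEFCE's method.

def combine_profiles(component_profiles, weights):
    """Weighted average of component profiles.

    Each profile is a distribution over five quality bands
    (shares summing to 1.0); weights are per component.
    """
    total_weight = sum(weights[name] for name in component_profiles)
    return [
        sum(weights[name] * profile[band]
            for name, profile in component_profiles.items()) / total_weight
        for band in range(5)
    ]

# Example: a subject where bibliometrics carries half the weight.
components = {
    "bibliometrics":    [0.05, 0.15, 0.30, 0.35, 0.15],
    "expert_review":    [0.10, 0.20, 0.30, 0.30, 0.10],
    "other_indicators": [0.10, 0.25, 0.35, 0.20, 0.10],
}
weights = {"bibliometrics": 0.5, "expert_review": 0.3, "other_indicators": 0.2}
print(combine_profiles(components, weights))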

Timetable:
Up to spring 2009: bibliometrics pilot and other development work.
Mid-2009: consult on all main features of the REF, including operational details of the bibliometrics process.
Autumn 2009: decide on the main operational features of the framework.
Calendar year 2010: undertake the bibliometrics exercise in appropriate subjects; establish expert panels for all subjects; consult on the choice and use of assessment components for each subject group.
2011-12: metrics begin to inform an element of HEFCE funding in some subjects.
2013: undertake the full assessment process for all subjects, including light-touch peer review.

Bibliometrics. Building on expert advice and consultation, we have identified the following key features: bibliometrics have the potential to provide robust proxy indicators of quality across a number of subjects; they are to be used alongside other data and information, with advice on interpretation from expert panels; indicators are to be based on citation rates per paper, benchmarked against worldwide norms for the field (and year and type of publication); and results are to be aggregated for substantial bodies of work and presented as a citation profile.
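
The sketch below illustrates the normalisation and aggregation just described: each paper's citation count is divided by an assumed worldwide average for its field, year and publication type, and the resulting scores for a body of work are binned into a citation profile. The baseline figures, band thresholds and function names are invented for illustration, not drawn from the pilot.

from collections import Counter

# Assumed worldwide mean citations per paper, keyed by
# (field, publication year, document type). Figures are invented.
WORLD_BASELINES = {
    ("clinical medicine", 2005, "article"): 12.4,
    ("civil engineering", 2005, "article"): 3.1,
}

def normalised_score(citations, field, year, doc_type):
    """Citation rate for one paper relative to the worldwide norm
    for papers of the same field, year and type."""
    return citations / WORLD_BASELINES[(field, year, doc_type)]

def citation_profile(scores, thresholds=(0.5, 1.0, 2.0, 4.0)):
    """Bin normalised scores into bands; these thresholds are
    placeholders for whatever the pilot settles on."""
    profile = Counter()
    for score in scores:
        band = sum(score >= t for t in thresholds)  # 0 = lowest band
        profile[band] += 1
    return dict(profile)

papers = [  # (citations, field, year, type) for one body of work
    (25, "clinical medicine", 2005, "article"),
    (2, "civil engineering", 2005, "article"),
]
scores = [normalised_score(c, f, y, t) for c, f, y, t in papers]
print(citation_profile(scores))  # -> {3: 1, 1: 1}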

The bibliometrics pilot. The pilot aims to develop and test a number of issues: which disciplines? (all disciplines with at least moderate citation coverage are included in the pilot); which staff and papers should be included: universal or selective coverage?; are papers credited to the researcher or the institution?; how to collect data, and the implications for institutions; which citation database(s)?; refining the methods of analysis, including normalisation fields and the handling of self-citation; thresholds for the citation profile; and interpretation and use by expert panels.
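
As one concrete reading of the self-citation question listed above, the sketch below counts a paper's citations while discarding those from papers that share an author with it. This is only one possible rule the pilot might test, and the data structures here are assumed for illustration.

def non_self_citation_count(paper, citing_papers):
    """Count citations, excluding citing papers that share any author
    with the cited paper (one simple definition of self-citation)."""
    cited_authors = set(paper["authors"])
    return sum(1 for citing in citing_papers
               if not cited_authors & set(citing["authors"]))

paper = {"title": "Example output", "authors": ["Smith, J", "Patel, R"]}
citing_papers = [
    {"authors": ["Smith, J", "Lee, K"]},  # shares an author: self-citation
    {"authors": ["Jones, A"]},            # independent citation
]
print(non_self_citation_count(paper, citing_papers))  # -> 1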

Bibliometrics pilot institutions: Bangor University; London School of Hygiene and Tropical Medicine; University of Bath; University of Birmingham; Bournemouth University; University of Cambridge; University of Durham; University of East Anglia; University of Glasgow; Imperial College London; Institute of Cancer Research; University of Leeds; University of Nottingham; University of Plymouth; University of Portsmouth; Queen's University Belfast; Robert Gordon University; Royal Veterinary College; University of Southampton; University of Stirling; University of Sussex; University College London.

The bibliometrics pilot: timetable.
May-June 2008: select HEIs and contractors.
August-October 2008: data collection.
November 2008-March 2009: data analysis.
Spring 2009: pilot results.

The pilot so far. Evidence Ltd has been commissioned to run the pilot, mainly using Web of Science data, though we will also explore Scopus. The institutions were asked to provide data about all known research staff and outputs for the period January 2001 to December 2007 (in relevant disciplines); this will be supplemented by records identified in the Web of Science. A JISC project will produce case studies of the pilot institutions' data collection systems. We are currently tendering for a project to identify lessons learned by the pilot HEIs and disseminate these to the wider sector.

Further information: www.hefce.ac.uk/research/ref; the REF-NEWS mailing list; queries to ref@hefce.ac.uk. Events to be organised by King's College London.

Key issues for further work (1): developing a robust method of producing citation indicators, and determining how and in which subjects they will be used; options for a lighter-touch approach to peer review; and defining a common family of other types of indicators and supplementary information. The challenge is to define a common set of elements that can capture key aspects of quality for all types of research and across all disciplines, while minimising burden and complexity.

Key issues for further work (2): how to combine metrics and expert input to form quality profiles; how to group disciplines and configure expert panels; the role and constitution of expert panels; reducing burden on the sector; promoting equality and diversity; and detailed plans for phasing in and implementing the REF (including further consultations to determine which elements will be employed, and the weightings between them, in each subject group).

Progress so far. The pilot of the bibliometrics process is underway. We are exploring how best to assess the quality of user-valued research, and identifying available sources of data for potential metrics. We are establishing a series of expert groups to advise on the key issues for the design of the REF, drawing in particular on the experience of the RAE. Work has been commissioned to gather evidence about the accountability burden and the equality and diversity implications of the move from the RAE to the REF. Informal discussions with a range of stakeholders on the key questions have begun.

Light-touch peer review. We need to consider the workload on institutions and on panels. Selection of staff and outputs: are there any realistic alternatives? We will take advice from RAE panel members on the options for reducing the burden of reviewing outputs. There are also opportunities for greater use of metrics.

Metrics. We aim to define a common family of metrics for the REF that subject groups can draw on: bibliometrics; research income; research students; esteem indicators?; user-valued research? We will take advice from expert groups on the choice and use of metrics, and seek to draw on existing sources of data as far as possible.

User-valued research. Consultation responses revealed a variety of views. Should user-valued research be promoted through the REF or through other funding streams? Some supported simple proxy metrics; some supported more complex, subject-specific metrics; some felt alternatives to metrics are needed. A workshop will identify how best to assess the quality of user-valued research within the framework.

What is user-valued research? It is widely held that the RAE does not deal well with certain research activity which is valued by users but which is not recognised for excellence within the academic community in the usual way for its field. This body of work resists definition in terms of basic and applied: it may equally well be either curiosity-driven or specifically commissioned.

A typology. We suggest that UVR has one or more of the following features: it is undertaken using different approaches, or in a different environment, to most work in cognate fields (might this include practice-based research in the arts, or STEM research done in a clinical or user setting?); it leads to non-standard outputs, including non-text or non-academic-text outputs (civil engineering, or policy advice to government?); or it is published in traditional form but less cited by other researchers (certain journals in engineering?).

A typology (2). In other words, we seek a definition which captures the full body of work that we need to consider, across all disciplines (crucially including multi-disciplinary work), and which fits constructively into the overall conceptual framework of the REF, helping us to identify and assess excellence in fields and forms that we might otherwise miss.

Framework for assessment (1). Our starting points are: national policy imperatives; responses to last year's consultation (unanimous that there may be a problem, but much less so on the solution); and lessons to be learnt from the RAE experience, especially from the enhanced provision made by panels in 2008.

Framework for assessment (2). The RAE defines excellence in terms of significance, originality and rigour. In the REF, as in the RAE, we are ready to work with a broad definition of significance, recognising the different ways in which the influence and impact of a piece of research may be seen and judged. Originality and rigour remain essential criteria for all research too.

Framework for assessment (3). The RAE assesses a body of research activity in terms of: quality of outputs; research environment; and esteem. We are willing to consider using indicators in the REF that may capture how a research environment is supportive of UVR, and evidence that specific work is esteemed for its quality by particular audiences.

Tools and approaches for assessment. The toolkit: bibliometric analysis; expert review of outputs; other available statistical indicators; and submission of information by HEIs. All of these would be collected and interpreted with advice and input from expert panels as required.

Tools: bibliometric analysis. We aim to develop bibliometric approaches which are accepted to be both robust and responsive to the full range of approaches within the disciplines covered, and which use appropriate normalisation tools so that outputs are compared to others with similar citation patterns. We are looking at possible evidence that user-valued research is generally less cited by other academics, and would welcome guidance on how to tackle this.

Tools: expert review. We see a particular role for expert review: where peer review is the primary approach, to ensure that UVR is assessed by experts against suitable criteria; where bibliometrics is the dominant approach, to assess, if necessary, outputs and activities for which citation indices are not a good tool; and to advise on interpreting all other information. We welcome advice on which outputs and activities particularly require expert review.

Tools: other statistics. We already have access to a range of other statistical information (primarily HESA and HEBCI). We are reluctant to introduce additional returns, but this is not absolutely ruled out. Interpreting statistics can be challenging, and mechanistic approaches may not be good enough (e.g. how far does industrial income indicate quality, and how far does it indicate that the work was expensive?). Which indicators are promising in your field, and how should we interpret them?

Tools: other information. There is potential to collect a wealth of other information, but this can be hard to interpret and we would expect to need expert input. This has already been debated in several contexts (the RAE, engineering, the Worton group in the arts). What may be promising indicators that outputs and activities enjoy user esteem, or that a research environment is supportive?

Thank you for listening