Bibliometrics between ambition and responsibility

Bibliometrics between ambition and responsibility. Nordic Workshop on Bibliometrics and Research Policy, 25-26 September 2014, Reykjavik. Dr. Hinze, ifq - Institute for Research Information and Quality Assurance, 10117 Berlin, www.research-information.de

Structure of the presentation: What is bibliometrics / scientometrics? Why bibliometrics / scientometrics? Institutionalization. Application and use. Towards guiding principles.

What is bibliometrics / scientometrics? First attempts to measure science date back as early as the 18th century, but the systematic development of quantitative and evaluative analysis of science began in the mid-20th century. Present-day bibliometrics is based on several scientific fields, each with its own concepts, that were more or less combined into an interdisciplinary understanding: history of science (D. de Solla Price), philosophy (V.V. Nalimov), information science (E. Garfield), sociology of science (R.K. Merton), mathematics (S.D. Haitun, A.I. Yablonsky). Based on Hornbostel / Glänzel, esss 2014, Vienna

What is bibliometrics / scientometrics? Robert K. Merton represents the sociologist's view of scientometrics. Among his most famous ideas related to science and its measurement are the Matthew effect and his notion of citation as a reward system (the currency of science). According to the sociologist's view, communication in science is not merely linked to cognitive processes (cf. information science) but is also characterized by the position scientists hold in the community. Based on Hornbostel / Glänzel, esss 2014, Vienna

What is bibliometrics / scientometrics? In his book Little Science, Big Science (1963), Derek J. de Solla Price analyzed the system of science communication of his time and thus presented the first systematic approach to the structure of modern science, applied to science as a whole. His pioneering work also laid the foundation of modern research evaluation techniques. His ideas were timely, since the development of science had reached a stage where traditional information retrieval, evaluation, and funding mechanisms were becoming more and more difficult and expensive. Based on Hornbostel / Glänzel, esss 2014, Vienna

What is bibliometrics / scientometrics? Questions he addressed: Why should we not turn the tools of science on science itself? Why should we not measure and generalize, make hypotheses, and derive conclusions? He paved the way for scientometric research by showing how to move away from methods and models adopted from other fields towards the development of a scientometric-specific methodology; proposing the growth model and studying scientometric transactions, e.g. the network of citations between scientific papers; finding that a paper that is frequently cited will probably attract more citations than one cited less often, and creating a model for this phenomenon; and conducting scientometric studies for policy implications and research evaluation, thus opening the door for present-day evaluative bibliometrics. Based on Hornbostel / Glänzel, esss 2014, Vienna
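Price's cumulative-advantage finding, that already well-cited papers tend to attract further citations, is easy to illustrate with a toy simulation. The sketch below is a generic "success breeds success" mechanism under assumed parameters (100 papers, 5000 citations), not Price's exact model:

```python
import random

# Toy cumulative-advantage simulation: each new citation goes to
# paper i with probability proportional to (citations_i + 1), so
# well-cited papers accumulate further citations ever faster.
random.seed(42)
citations = [0] * 100                 # 100 papers, initially uncited

for _ in range(5000):                 # distribute 5000 citations
    weights = [c + 1 for c in citations]
    winner = random.choices(range(len(citations)), weights=weights)[0]
    citations[winner] += 1

citations.sort(reverse=True)
share = 100 * sum(citations[:10]) / sum(citations)
print(f"The 10 most-cited papers hold {share:.0f}% of all citations")
```

The strongly skewed outcome of such a process mirrors the skewed citation distributions observed in real bibliographies.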

What is bibliometrics / scientometrics? Eugene Garfield, founder and chairman of the Institute for Scientific Information (now part of Thomson Reuters), developed the Science Citation Index (SCI) in the early 1960s for advanced information retrieval and science information services; it has since become an important source for scientometric studies. The SCI was not originally created to conduct quantitative studies, calculate impact factors, or facilitate the study of the history of science. (Garfield, From information retrieval to scientometrics - is the dog still wagging its tail?, 2009) Based on Hornbostel / Glänzel, esss 2014, Vienna

What is bibliometrics / scientometrics? Examples: The Journal Impact Factor (JIF) was first used as a measure for comparing journals independently of size and to help select journals for the SCI (Garfield & Sher, American Documentation, 1963). The co-citation-based Atlas of Science, developed and issued by the Institute for Scientific Information (ISI), was considered a new kind of review literature, also suited to helping students in their choice of careers in science (Garfield, Current Comments, 1975). Only later did Garfield recognize the power of the JIF for journal evaluation and consider it a journal performance indicator. Based on Hornbostel / Glänzel, esss 2014, Vienna
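For orientation, the size independence mentioned above comes from dividing citation counts by the number of published items. The JIF in its standard two-year form:

```latex
\mathrm{JIF}_{y}(J) =
  \frac{\text{citations received in year } y \text{ to items published in } J \text{ in years } y-1 \text{ and } y-2}
       {\text{number of citable items published in } J \text{ in years } y-1 \text{ and } y-2}
```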

What is bibliometrics / scientometrics? Pritchard (1969) explained the term bibliometrics as the application of mathematical and statistical methods to books and other media of communication. Nalimov and Mulchenko (1969) defined scientometrics as the application of those quantitative methods which deal with the analysis of science viewed as an information process. Otto Nacke (1979) defined informetrics in "Informetrie: Ein neuer Name für eine neue Disziplin" [Informetrics: a new name for a new discipline], Nachrichten für Dokumentation 30, no. 6 (1979): 219-226. Based on Hornbostel / Glänzel, esss 2014, Vienna

Why bibliometrics / scientometrics? Its early goals: monitoring, describing, and modelling the production, dissemination, and use of knowledge, including information seeking, library circulation, and scholarly communication; optimizing library access and circulation, improving bibliographic databases, and extending information services. Based on Hornbostel / Glänzel, esss 2014, Vienna

Why bibliometrics / scientometrics? The need for metrics in scientific research arises from: the growth of scientific literature; the challenges of big science; the economic and societal use of science; the internationalization of the scientific community. Bibliometrics / scientometrics represented a statistical approach to mastering the growing flood of scientific information and to analyzing and understanding the underlying cognitive processes of communication in science, by measuring quantitative aspects of these processes and by providing the results to scientists and to users outside the scientific community. Based on Hornbostel / Glänzel, esss 2014, Vienna

Why bibliometrics / scientometrics? The science indicators movement in the US, with its discussion about the possible use of bibliometrics in science policy in the 1970s, marked the beginning of a new era in bibliometrics: research evaluation using quantitative methods; distribution of funding on the basis of performance indicators. This also came with consequences for bibliometrics: re-interpretation of prior bibliometric concepts; new fields of application and new challenges opened to bibliometrics. But many tools were still designed for use in scientific information, information retrieval, and libraries; they came to be used in a context for which they were not designed. Based on Hornbostel / Glänzel, esss 2014, Vienna

Institutionalization The institutionalization process started in the 1970s and gained major momentum in the 1980s.
Structured scientific research, service activities, higher education: Germany: Institut für Dokumentation und Information über Sozialmedizin und öffentliches Gesundheitswesen; Center for Science Studies (later IWT), Univ. Bielefeld. Hungary: ISSRU. Netherlands: CWTS (2014: 25th anniversary); Univ. Amsterdam, Dept. Science Dynamics. France: École des Mines, OST. Spain: CINDOC (now IEDCYT)...
Documented scholarly communication: journals Scientometrics (1978), Research Evaluation (1991), Journal of Informetrics (2007); relevant books, e.g. Handbook of Quantitative Science and Technology Research (first edition 1988; 2004)...
Public perception and visibility: conference series ISSI (1987), STI (1988), Nordic Workshop (1996), CollNet (1998), et al.; international societies / organizations: ISSI (1993), ENID (2008).
Training: mainly Library and Information Science; tailored courses (CWTS, esss).

Institutionalization Source: https://www.youtube.com/watch?v=abnz66zlbju&list=plfd2dcd0c1f06b795

Institutionalization Source: https://twitter.com/bibliometrics

Measuring science Indicators = proxies: they represent a highly complex reality; they are empirically ascertainable variables and factors used to reflect aspects that cannot be measured directly; and the underlying terms are rather vaguely defined (quality, performance, progress, usefulness, importance). Indicators accommodate the need for objective data, but also the interest in better understanding the developmental processes and contexts of science itself: they are used as analytical tools, e.g. to better understand the complicated system of knowledge production and knowledge exchange itself, but also to inform science policy decisions.

Measuring science
Input indicators: human resources; financial resources; infrastructure (equipment, laboratory space etc.); third-party funding.
Output indicators: prizes; bibliometric indicators (activity, performance, structure; authorship, citation, reception, cognitive, collaboration); patent indicators; others (PhDs, Habilitations, presentations, grants etc.).
Efficiency: various input/output relations; inter- and intra-institutional comparisons.
Based upon Hornbostel 1999, p. 59

Application and use Assessment and evaluation; formula-based funding systems; foresight processes; monitoring of public funding; strategic decision-making processes.

Application and use Macro level: global developments, national R&D systems, policies. Meso level: cross-sectional fields, research and grant programs, academic fields; universities, research institutes, funding agencies. Micro level: university institutes/departments, target/status groups, research groups, individuals.

Application and use: Reporting

Application and use: performance-based funding Examples of PBF with a bibliometric component: funding of Flemish universities via the Bijzonder Onderzoeksfonds (BOF), with part of the allocation key being based on publication and citation data (Debackere & Glänzel, Scientometrics, 2004); distribution of basic research funding in Norwegian higher education as well as in Denmark, Finland, the Czech Republic, New Zealand, and partly in Germany et al. (OECD 2010). In Australia, bibliometric data are used for ERA but not to distribute basic funding to HEIs.
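To illustrate the mechanics of such an allocation key (a minimal sketch only, with invented weights and figures, not the actual BOF or Norwegian model parameters):

```python
# Minimal sketch of a performance-based funding allocation key:
# part of a fixed budget is distributed in proportion to weighted
# publication and citation shares. All numbers are hypothetical.

universities = {
    # name: (publications, citations)
    "Univ A": (1200, 18000),
    "Univ B": (800, 9500),
    "Univ C": (450, 7000),
}

PUB_WEIGHT = 0.6      # hypothetical weight on publication output
CIT_WEIGHT = 0.4      # hypothetical weight on citation impact
BUDGET = 10_000_000   # fixed sum to distribute (EUR)

total_pubs = sum(p for p, _ in universities.values())
total_cits = sum(c for _, c in universities.values())

for name, (pubs, cits) in universities.items():
    # Each university's share combines its share of all publications
    # and its share of all citations, weighted as above.
    share = PUB_WEIGHT * pubs / total_pubs + CIT_WEIGHT * cits / total_cits
    print(f"{name}: {share:.1%} of budget = {share * BUDGET:,.0f} EUR")
```

Even this toy version shows the steering effect discussed below: whatever enters the key (publication counts, citations) becomes what institutions optimize for.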

Application and use: performance-based funding Germany: average no. of indicators applied, by discipline (figure). Source: Böhmer / Neufeld / Hinze / Klode / Hornbostel (2011): Wissenschaftler-Befragung 2010: Forschungsbedingungen von Professorinnen und Professoren an deutschen Universitäten. ifq Working Paper No. 8, p. 91

Application and use Indicators used (share of respondents answering "yes"; remaining answers: no / don't know / no answer): third-party funding 63.5%; publications 40.7%; PhDs 40.1%; evaluation results 26.2%; citations / JIF 22.7%; other 6.4%. Source: Böhmer / Neufeld / Hinze / Klode / Hornbostel (2011): Wissenschaftler-Befragung 2010: Forschungsbedingungen von Professorinnen und Professoren an deutschen Universitäten. ifq Working Paper No. 8

Application and use: performance-based funding Source: Butler et al. 2002: Impact of evaluation-based funding on the production of scientific knowledge: What to worry about, and how to find out, p. 13.

Application and use Bibliometrics is mainly applied at the macro level (global developments, national R&D systems, policies) and the meso level (cross-sectional fields, research and grant programs, academic fields; universities, research institutes, funding agencies), while peer review is the method of choice at the micro level (university institutes/departments, target/status groups, research groups, individuals).

Application and use Source: ERC: Information for applicants to the Starting and Consolidator Grant 2014 Call, p. 25, and Advanced Grant 2014 Call, p. 22

Application and use Did you seek out more information in addition to the information provided in the proposal and by the DFG, respectively? What kind of information did you use? (Yes / No, in %)
Complete lists of publications by the researchers participating in the proposal (N=394): 56.6 / 43.4
Performance indicators for the researchers participating in the proposal (N=394): 55.1 / 44.9
Information from colleagues about the researchers participating in the proposal (N=394): 38.1 / 61.9
Self-promotion by the universities or Institutions of Excellence on their websites (N=394): 32.0 / 68.0
Position of the host universities in rankings (N=393): 24.4 / 75.6
Project presentations on the DFG video portal on the Excellence Initiative (N=391): 8.7 / 91.3
Media reports about the researchers participating in the proposal, the Institutions of Excellence or the host universities: 7.1 / 92.9
Source: Möller, T. / Antony, P. / Hinze, S. / Hornbostel, S. (2012): Exzellenz begutachtet. Befragung der Gutachter in der Exzellenzinitiative. ifq Working Paper No. 11. Berlin. http://www.forschungsinfo.de/publikationen/download/working_paper_11_2012.pdf

Application and use Not least due to the availability and accessibility of the underlying data, the application of bibliometric methods is expanding, as is the group of actors providing respective services. This expansion may also be accompanied by inappropriate or uninformed use of bibliometric information: insufficient knowledge of the data source and methodology; misinterpretation due to insufficient contextualization. Hence the need to actively contribute to informed and cautious use, and to strive for standardization and the implementation of guiding principles, picking up a discussion that started in 1995 at the ISSI conference: workshops at ISSI 2013, STI 2013, and STI 2014, and the Paris Workshop on Guidelines and Good Practices on Quantitative Assessments of Research. Objective: develop standards for accountability and expert advice on good scientometric practices.

Guiding Principles Leiden Manifesto Drafted and discussed during STI 2014, Leiden, September 4-6 (based on Diana Hicks): Metrics, properly used, support assessments; they do not substitute for judgment, and everyone retains responsibility for their assessments. Accurate, high-quality data require considerable time and money to produce; it is easy to underestimate the difficulty of constructing accurate data, and those mandating the use of metrics should be able to provide assurance that the data are accurate. Metrics should be transparent: the construction of the data should follow a clearly stated set of rules, and everyone should have access to the data. Data should be verified by those evaluated, who should be offered the opportunity to contribute explanatory notes if they wish.

Guiding Principles Leiden Manifesto Different metrics suit different fields; sensitivity to field differences is important. Humanists will not be able to use citation counts; computer scientists will need to ensure conference papers are included; and chemists will look the best in raw metrics constructed from Web of Science data. The state of the art is to select a suite of possible indicators and allow fields to choose among them. Data must be normalized to account for variation in citation and publication rates by field and over time. Metrics should align with strategic goals.
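As an illustration of the normalization principle, a minimal sketch computing field- and year-normalized citation scores (in the spirit of mean-normalized indicators such as CWTS's MNCS); the paper records are invented for the example:

```python
from collections import defaultdict

# Invented example records: (paper_id, field, publication_year, citations)
papers = [
    ("p1", "chemistry", 2010, 30),
    ("p2", "chemistry", 2010, 10),
    ("p3", "mathematics", 2010, 3),
    ("p4", "mathematics", 2010, 1),
]

# Mean citation rate per (field, year) reference set.
totals = defaultdict(lambda: [0, 0])   # (field, year) -> [citation sum, paper count]
for _, field, year, cites in papers:
    totals[(field, year)][0] += cites
    totals[(field, year)][1] += 1

# Normalized score: citations divided by the mean of the paper's own
# field-year reference set. A value of 1.0 means "reference-set average".
for pid, field, year, cites in papers:
    s, n = totals[(field, year)]
    print(pid, field, round(cites / (s / n), 2))
```

Note how the sparsely cited mathematics paper p3 scores the same 1.5 as the heavily cited chemistry paper p1: normalization removes the field-level differences in raw citation rates that the manifesto warns about.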

Thank you very much for your attention! hinze@forschungsinfo.de