Altmetrics could enable scholarship from developing countries to receive due recognition.

10 March 2014

The Web of Science and its corresponding Journal Impact Factor are inadequate for understanding the impact of scholarly work from developing regions, argues Juan Pablo Alperin. Alternative metrics offer the opportunity to redirect incentive structures towards problems that contribute to development, or at least to local priorities. But the altmetrics community needs to actively engage with scholars from developing regions to ensure the new metrics do not continue to cater to well-known and well-established networks.

A significant group of scholars from around the world love to hate the Journal Impact Factor (JIF). An incredible amount of ink has been spilled describing its methodological limitations, its abuse and misuse, and its pervasive effects on science. But while the loathing of the JIF is (I hazard to guess) distributed fairly equally around the world, the scholars who are affected by its use are not. It is scholars from developing regions who suffer the most egregious consequences.

The problems for developing regions stem from the under-representation of developing world research in Thomson Reuters' Web of Science (WoS), from which the JIF is calculated. In a seminal piece from fifteen years ago, Cetto & Alonso-Gamboa (1998) laid out the disheartening situation of Latin American journals in international information systems such as the WoS. As can be seen from the figure below, which shows the relative number of works authored by scholars from around the world in WoS, this situation has not significantly changed over time.

The shortage of research from developing regions is not for a lack of research. In Latin America, to draw on the region I am most familiar with, only 4% of peer-reviewed journals were included in WoS in 2012 (242 out of over 5,000; see them in the Latindex Catalog). To give another example, two initiatives, SciELO and RedALyC, working with only a subset of these 5,000+ journals, have indexed over half a million articles in regional journals, primarily by Latin American authors. Thomson Reuters recently announced a partnership with SciELO, whereby journals in SciELO will be indexed and appear in the Web of Knowledge. SciELO also calculates an Impact Factor based on its collection of over 1,100 journals. However, even with SciELO, only a fraction of Latin America's research can receive an Impact Factor.
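
To see why indexing matters so much, it helps to recall how the JIF is calculated. Roughly, it is the standard two-year ratio (shown here for a 2013 JIF):

\[
\mathrm{JIF}_{2013} = \frac{\text{citations in 2013 (from WoS-indexed sources) to items the journal published in 2011 and 2012}}{\text{citable items the journal published in 2011 and 2012}}
\]

Both sides of the ratio are drawn from WoS: a journal that is not indexed cannot receive a JIF at all, and citations made in non-indexed journals never enter the numerator of those that are.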

There is an abundance of locally produced and published research in developing regions, just not in WoS (see here for an interactive map going back to 1990). The argument for this bias has always been that research from the developing world does not form part of mainstream or international science. Although I take exception to this argument, regardless of its rationale the end result is the same: the WoS is an inadequate dataset for understanding the impact of scholarly communications from developing regions, or for otherwise studying it. To serve scholars from developing regions, it is imperative to find an alternative.

The proposal

The scholarly community is abuzz with altmetrics and the related (but different) term Article Level Metrics. Ian Mulvany, Head of Technology for eLife, drew a nice Venn diagram depicting the distinction between the two. These metrics, derived primarily from the social Web, have been purposely constructed as alternatives to the JIF. Since the drafting of the altmetrics manifesto, there has been a special issue, a PLOS collection, a Mendeley group, several annual workshops, an increasing number of research papers, and several altmetric start-ups.

During a few months in the last 12, the term "altmetrics" has even been more popular on Google than "bibliometrics" and "bibliometric" (although the term "Journal Impact Factor" still dwarfs both); or see what this looks like today. All of these signs indicate that altmetrics may not remain "alternative" for long. Whether they supplant or complement the JIF, they bring with them a promise, but no guarantees, for developing regions.

The promise

Altmetrics are captured from the Web (e.g., social media, blogs, Wikipedia), and thus are (somewhat) more democratic: one reader, one vote. More precisely: one reader, several potential votes. Unlike citations, which can only be counted if the citing document is in a select group of journals, altmetrics are counted regardless of where in the world they originate, with one important consequence: they open the possibility of tracking impact in new segments, both within and beyond the academy. The corollary to this consequence is that altmetrics enable scholars to be incentivized and rewarded for impact among these new audiences. The JIF only captures impact from one audience, those publishing in WoS journals; altmetrics, on the other hand, can capture diverse audiences including scholars, but also practitioners, clinicians, educators, and the general public (Piwowar, 2013). And, given that developing world scholars were systematically under-represented in the WoS, new audiences could mean developing world audiences, interested in local and regional academic or public interests. Of all the potential benefits of altmetrics, this is the true promise for the developing world: an opportunity to redirect incentive structures towards problems that contribute to development, or at least to local priorities, be it through academic, policy, personal, or professional-practice impact.
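
As a purely illustrative sketch (the event data, source names, and identifiers below are hypothetical, not any particular provider's API), the "one reader, several potential votes" idea amounts to counting mentions of a work across many independent channels, rather than citations filtered through a single journal index:

```python
from collections import defaultdict

# Hypothetical altmetric events gathered from different channels.
# Each event is (identifier of the work, source of the mention).
# Where in the world the reader sits does not matter: every event
# counts, unlike citations, which must come from an indexed journal.
events = [
    ("10.1234/example-article", "twitter"),
    ("10.1234/example-article", "mendeley"),
    ("10.1234/example-article", "blog"),
    ("10.5678/otro-articulo", "facebook"),
    ("10.5678/otro-articulo", "wikipedia"),
]

def tally(events):
    """Aggregate event counts per work and per source."""
    counts = defaultdict(lambda: defaultdict(int))
    for work, source in events:
        counts[work][source] += 1
    return counts

for work, per_source in tally(events).items():
    print(work, dict(per_source))
```

The point of the sketch is only that each channel contributes its own "votes" for the same work, so a single reader can leave several traces across several sources.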

The problem

This promise, however, is not by any means guaranteed. To realize the promise, the altmetrics community needs to actively engage with scholars from developing regions. As the field advances, it will become essential to further understand the ways in which altmetrics differ from citations, and how these new metrics might shape research agendas. So far, much of the research has found that these new metrics capture a different dimension, flavour, or type of impact than citations (Torres-Salinas et al., 2013; Costas et al., 2014; Haustein & Peters, 2013; Eysenbach, 2011), but it has not theorized or explored what else they might mean. There is reason to be optimistic: most altmetric research ends with a call for further study of the reliability, validity, and context of the available metrics. But there is also a risk that, without this understanding, altmetrics will only be used as a proxy for traditional citation impact.

There is also a risk of altmetrics becoming yet another method for ranking scholars. If this happens, it will once again turn the attention of developing world scholars to audiences in the United States and Europe, for this is where social media has most deeply penetrated. At the ALM workshop in October 2013, I presented a series of maps showing the varying levels of penetration of the Internet, Twitter, Facebook, and Mendeley (all common altmetric sources). A new map from the Oxford Internet Institute now shows us the uneven geography of Wikipedia. If the name of the game becomes increasing altmetric scores, it will still be a better strategy to cater research to places where the sources of altmetrics are more heavily used (read: not in the developing world). Read more here.

The prognosis

The current focus on assessment (and, to a lesser degree, filtering and discovery) will give authors a new tool for demonstrating impact beyond citations, and it may help connect researchers with research. It is an important and necessary first step, for the reasons I note above. If the first years of altmetrics/ALMs saw the shift from WHAT (are ALMs and should we care) to HOW, then beneficial directions for developing regions would be to ask WHO (is behind the metrics) or to again ask WHAT, but this time, WHAT (do we need metrics of). The WHAT should be to expand the offerings of altmetrics to the non-traditional research products of scholars from the developing world, like those proposed by Piwowar (2013), but also to arbitrary URLs of reports, policy briefs, and program evaluations, and to the many journal articles published without DOIs. The WHO would necessitate building tools that help scholars identify and connect with their audiences, which should create a feedback loop that encourages authors to direct their research to areas of relevance to their primary audience, be it academic or public.
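
To make the WHAT slightly more concrete, here is a minimal, entirely hypothetical sketch (the helper name and behaviour are illustrative, not any existing tool's API) of how a metrics collector might accommodate works that lack DOIs by falling back to a normalized URL as the tracking key:

```python
from typing import Optional
from urllib.parse import urlsplit

def tracking_key(doi: Optional[str], url: Optional[str]) -> str:
    """Pick an identifier to collect metrics under: prefer the DOI,
    otherwise fall back to a normalized URL so that reports, policy
    briefs, and DOI-less journal articles can still be tracked."""
    if doi:
        return "doi:" + doi.lower()
    if url:
        parts = urlsplit(url)
        return "url:" + parts.netloc.lower() + parts.path.rstrip("/")
    raise ValueError("a work needs at least a DOI or a URL to be tracked")

print(tracking_key("10.1234/EXAMPLE", None))
print(tracking_key(None, "https://repositorio.example.org/informe-2013/"))
```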

It is early days and these are only early ideas. Further research on altmetrics, especially in contexts beyond well-known and well-established journals (e.g., Science and Nature) and outside of the so-called global North (e.g., the United States or Western Europe), is desperately needed. However, this alone is not enough. The field and the tools of altmetrics must be crafted with the participation of a diverse group of scholars, so that their development can be inclusive of multiple perspectives and needs. In this direction, the Public Knowledge Project (where I am a researcher) is now working with the PLOS ALM application to provide a free altmetrics service to journals using the OJS platform (primarily in developing regions; see our OJS Map), and working to explore available indicators of Open Access in developing regions (project report forthcoming). The idea is to bring altmetrics to developing regions now, not after scholars elsewhere have established the methods and norms of how altmetrics will be used.

As the field begins to consolidate, I remain optimistically pro-altmetrics for developing regions, and I have faith in the altmetrics community to serve all scholars. Which directions altmetrics should go, how they should be used, or how the tools should be implemented is not for me to prescribe, but if we exclude (or do not seek to include) scholars from developing regions, altmetrics will become another measure from the North, for the North. And we already know that story.

Note: This article gives the views of the author, and not the position of the Impact of Social Science blog, nor of the London School of Economics. Please review our Comments Policy if you have any concerns about posting a comment below.

About the Author

Juan Pablo Alperin is a PhD candidate in the Stanford School of Education as well as a researcher and systems developer with the Public Knowledge Project. He is currently involved in several research initiatives aimed at improving the quality, impact, and reach of scholarly publishing in Latin America, and continues to contribute to the award-winning software Open Journal Systems (although admittedly less often). He can be found at juan@alperin.ca and on Twitter at @juancommander.