Why and how science has gone wrong
Science in Transition, started January 2013
Frank Miedema (UMC Utrecht), Huub Dijstelbloem (WRR, UvA), Frank Huisman (UMC Utrecht, Descartes Centre, UU), Wijnand Mijnhardt (Descartes Centre, UU)
October 2013
Today: roughly 1.5 million scientific publications annually, growing at 3% per year. [Chart: annual publication counts, 1940-2020]
2006: 1.35 million publications
Biomedical research suffers from major systemic flaws. Women's Health, CVD and Oncology: only about 25% of published preclinical studies could be validated to the point at which projects could continue (Prinz et al., Nature Reviews Drug Discovery, 2011). Amgen: only 6 of 53 (11%) landmark oncology papers could be scientifically confirmed (Begley and Ellis, Nature, 2012).
Much biomedical research does not lead to worthwhile achievements: 85% of research investment, equating to $200 billion in 2010, is wasted because of significant problems with reliability, quality, reproducibility and publication bias. Few identified biomarkers have been confirmed by subsequent research, and few have entered routine clinical practice. Lancet, January 2014 (Ioannidis, Altman, Chalmers, Glasziou, Horton et al.).
METRICS is a research-to-action center focused on transforming research practices to improve the quality of scientific studies in biomedicine and beyond.
How Scientists Get Credit (image: Volkskrant)
Pierre Bourdieu, Science of Science, 2004; Hessels et al., Science and Public Policy, 2009
Credibility cycle + Big Science = dominant bibliometric quality assessment:
- Publish-or-perish culture where quality and relevance are subordinate to quantity
- Open Science, FAIR data and Open Access are treated as merely nice to have
- Fields that are not intrinsically less relevant suffer
- Short-termism and risk aversion
- Universities outsource talent management
Problems of the system:
- Problem choice is determined by risk reduction and risk avoidance
- Relation with society: because of science ideology and overselling, the image and understanding of science become problematic
- Problem choice in democracy: who is setting the agenda?
- The mores and behaviour of administrators, scientists and students have changed
- Quantity dominates over quality and CONTENT in research evaluation (New Public Management)
Why, where and how has science gone wrong? And what to do about it?
A Toolbox for Science in Transition 2. Impact
- The evaluation of research impact needs to change. The simple use of journal impact factors in funding, appointment and promotion must be abandoned; see the San Francisco Declaration on Research Assessment.
- Meaningful metrics are needed both for science-for-science and for science-for-society (societal impact); parameters of value attributed by potential users in society must be developed with stakeholders outside academia.
- The use of case studies must be developed.
A Toolbox for Science in Transition 3. Incentives and Rewards
- Grants from government, the EU, charities, etc. should explicitly be awarded based on evaluation of a set of composite indicators of impact. This must result in a mix of basic and targeted research.
- Team science: we will have to take teamwork* and academic duties into account and provide career opportunities for the different types of researchers.
*CWTS Leiden, Authorship in Transition, February 2015
A Toolbox for Science in Transition 5. PhD Talent Management and Education
- Our graduate school should also coach PhDs and postdocs who want to make a career outside academia.
- How many PhDs should we produce? Do MDs need a PhD in such high numbers? What are the (optimal) criteria for PhD theses?
- We have to promote scientific literacy among our Master's and PhD students.
Paula Stephan, How Economics Shapes Science, 2012
Why Science in Transition? [Diagram: from research to cure and care innovations, with the "citation dip" in between]
Science in Transition in the UMC Utrecht: Open Science
- Multidisciplinary research facilitated
- Stakeholders involved in prioritizing research
- Diversify talent management
- Evaluation by hybrid panels
- Recognize academic duties & good citizenship
- Develop and use novel indicators for quality and impact
Inclusive set of indicators for research impact (from SEP):
Structure
- Leadership & culture
- Collaborations with stakeholders
- Continuity and infrastructure
Process
- Setting research priorities
- Posing the right questions
- Incorporation of next steps
- Design, conduct, analysis
- Regulation and management, FAIR data sharing
Outcomes
- Research products for peers and for societal groups
- Use of research products by peers and by societal groups
- Marks of recognition from peers and from societal groups
Change scientific incentives and rewards for broader impact and Open Science, incorporated in research evaluations:
- Provide incentives and rewards for academics to work on and use Open Science and Open Access.
- Funders (public and private) want us to work according to Open Science and Open Access.
How to improve biomedical science? Towards a nationwide Personalized Medicine & Personalized Health Research Infrastructure: Health RI
Prof. dr. Frank Miedema, Dean UMCU
Building an integrated research infrastructure: incentivizing data sharing, providing an open platform.
FAIR data: Findable, Accessible, Interoperable, Reusable
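The FAIR principles above can be illustrated with a minimal dataset metadata record. This is only a sketch: the field names, DOI, URL and vocabulary below are hypothetical examples, not part of the Health RI infrastructure described in these slides.

```python
# Minimal sketch of a FAIR-style dataset metadata record.
# All identifiers, URLs and field names are hypothetical examples.

fair_record = {
    # Findable: a globally unique, persistent identifier plus rich metadata
    "identifier": "doi:10.0000/example-dataset",   # hypothetical DOI
    "title": "Example biomedical cohort dataset",
    "keywords": ["cohort", "personalized medicine"],
    # Accessible: retrievable via a standard, open protocol
    "access_url": "https://repository.example.org/datasets/1",  # hypothetical
    "access_protocol": "HTTPS",
    # Interoperable: formal, shared formats and vocabularies
    "format": "text/csv",
    "vocabulary": "SNOMED CT",
    # Reusable: clear licence and provenance
    "license": "CC-BY-4.0",
    "provenance": "Collected 2015-2016 under a hypothetical protocol",
}

def check_fair(record: dict) -> list:
    """Return the FAIR aspects for which all required fields are present."""
    required = {
        "Findable": ["identifier", "title", "keywords"],
        "Accessible": ["access_url", "access_protocol"],
        "Interoperable": ["format", "vocabulary"],
        "Reusable": ["license", "provenance"],
    }
    return [aspect for aspect, fields in required.items()
            if all(record.get(field) for field in fields)]

print(check_fair(fair_record))  # all four aspects are covered
```

A record missing, say, its licence would drop "Reusable" from the returned list; real FAIR assessments are of course richer than this field checklist.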
HANDS: Handbook for Adequate Natural Data Stewardship
- Cross-UMC guideline on data stewardship
- Provides pointers to local expertise
- First version will be continuously updated based on input from the community
27 February 2017
1. Harmonize data stewardship guidelines
2. Harmonize IT architecture
3. Access to data and samples
4. Biomedical data sharing and analysis
5. Electronic health records for research
6. Good research practice
7. Facilities for high-throughput data processing
9. Coordinate access to experts and support
[Closing image: the past versus the 21st century]