
CV Magne Jørgensen

Personal data
Date of birth: October 10, 1964
Nationality: Norwegian
Present position: Professor, University of Oslo; Chief Research Scientist, Simula Research Laboratory
Home page: www.simula.no/people/magnej

Education
1994 Ph.D. in Software Engineering, University of Oslo
1988 Master of Science in Economics and Computer Science, Universität Karlsruhe (TH)

Work experience
2001-present Chief Research Scientist, Simula Research Laboratory
2002-present Professor, University of Oslo (80% leave)
2009-present Guest Professor, Kathmandu University, Nepal
2014-present Advisor, Scienta
1999-2002 Associate Professor, University of Oslo
1998-1999 Head of the Software Process Improvement Department, Storebrand IT
1997-1998 Head of the Software Process Improvement Department, Telenor Telecom Solutions
1995-1997 Senior Adviser, IT Strategy, Telenor ONP
1989-1995 Programmer and scientific researcher, Telenor Research and Development

Fields of research:
Management of software projects
Software development methods
Evidence-based software engineering
Judgment and decision making in software development
Empirical methods for software engineering

Publication record, research impact, funding, collaborations, other activities:
h-index (Google Scholar): 45. Number of citations (Google Scholar): >8000.
Most cited paper reported to be the 18th most cited software engineering paper, based on average annual citations, by a review paper published in Information and Software Technology in 2016.
Ranked as the top scholar (most productive researcher) in systems and software engineering for the periods 2001-2005, 2002-2006, 2003-2007 and 2004-2008. The rankings, published in the Journal of Systems and Software, are based on the number of publications in the top systems and software engineering journals and cover about 4000 researchers.
Published ca. 80 journal papers, 4 sections in edited books, ca. 70 refereed conference papers, ca. 25 keynote talks, and ca. 170 invited talks in software engineering, psychology, forecasting and project management.
Transfer of research results to software professionals through a monthly column in Computerworld (Norway); about 100 articles published since 2004.
Founder of evidence-based software engineering, together with Prof. Barbara Kitchenham and Prof. Tore Dybå. Received in 2014 the ACM SIGSOFT award for the most influential paper of the last ten years, for the initial paper on evidence-based software engineering.

Paper on expert judgment ranked as the most cited paper on human factors in software development from the previous decade, according to a review published in Information and Software Technology in 2014.
Supervision of ten PhD students to completion (seven in software engineering, two in psychology and one in education).
Started/supported, with former PhD students, two consultancy companies based on evidence-based software engineering principles.
Member of the editorial board of the Journal of Systems and Software.
Member of the editorial board of Simula Springer Briefs on Computing.
Ranked by Computerworld Norway as one of the fifty most influential professionals within ICT in Norway in 2012, 2013 and 2014 (ranking not done for 2015 or 2016).
Member of the Norwegian Digitization Council (Digitaliseringsrådet) since 2016.
Leader of a national industry-research collaboration network (the HIT network) with regular information transfer, research studies, seminars and experience sharing with software industry participants.
Simula Researcher of the Year 2004 and Simula Outreach Award 2016.

Selected research results of the last ten years:

Software project governance: In several studies I examine how clients' strategies for selecting software development companies affect the likelihood of project success. Of special interest are the results on how much selecting a bidder with a low price reduces the likelihood of success, even when the selected company is among those with high competence. Also of great interest to the software industry is the demonstration of how much software companies with seemingly similar skills may differ in productivity and quality. Based on the studies, I propose an improved method for the selection of software providers. The results have been published in academic journal papers and presented at industry arenas (seminars, conferences, newspaper articles). A new selection process, trialsourcing, has been introduced and has been successfully used by several clients.

Results in judgment-based effort estimation: Together with my PhD student, I wrote the currently most comprehensive knowledge summary on judgment-based effort estimation. The review is published in Psychological Bulletin (ISI impact factor 12). One of my recent studies showed that selection bias, i.e., the tendency towards selecting among the most over-optimistic bidders, may explain much of the tendency towards cost overruns reported in software project surveys. A study (invited paper in the International Journal of Forecasting) found that judgment-based effort estimates are typically more accurate than model-based ones in the domain of software development, a surprising result given that the opposite is the case in most other domains. The most important results of the review relate to the insight into when we can expect formal estimation models to produce more accurate estimates than expert judgment, and when not.
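
The selection-bias mechanism can be illustrated with a small Monte Carlo sketch (illustrative only; the cost level, noise level and number of bidders below are assumptions, not data from the studies):

# Illustrative sketch of selection bias in bidding: even when every
# bidder's estimate is unbiased, selecting the lowest bid systematically
# picks an over-optimistic one. All numbers are assumed, not measured.
import random

random.seed(1)

TRUE_COST = 1000.0   # actual effort of each project (arbitrary units)
NOISE_SD = 150.0     # spread of individual bidders' estimation errors
N_BIDDERS = 5        # bids solicited per project
N_PROJECTS = 10_000  # simulated procurements

overruns = []
for _ in range(N_PROJECTS):
    bids = [random.gauss(TRUE_COST, NOISE_SD) for _ in range(N_BIDDERS)]
    winning_bid = min(bids)  # client selects the lowest bid
    overruns.append((TRUE_COST - winning_bid) / winning_bid * 100)

print(f"mean cost overrun of winning bids: {sum(overruns) / len(overruns):.1f}%")
# The mean overrun comes out clearly positive (around 20% with these
# numbers) and grows with the number of bidders, although no individual
# bidder is biased.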

Improved research design in empirical software engineering: In two studies I show how and why software engineering studies documenting economies of scale, and increased cost overrun with increased project size, cannot be trusted. In a third study I show the need for improvement in the design of software engineering experiments, which most likely include many incorrect results due to low statistical power, a high degree of publication bias and questionable use of statistical analyses.
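
The low-power argument can be made concrete with a standard back-of-the-envelope calculation (my sketch; the power, significance level and prior below are assumed values, not estimates from the studies):

# How low statistical power plus publication bias inflates the share of
# false positives among published "significant" findings. Inputs assumed.
ALPHA = 0.05        # significance level (false-positive rate per test)
POWER = 0.30        # assumed typical power of software engineering experiments
PRIOR_TRUE = 0.25   # assumed fraction of tested hypotheses that are true

true_positives = PRIOR_TRUE * POWER          # true effects detected
false_positives = (1 - PRIOR_TRUE) * ALPHA   # null effects "detected"
ppv = true_positives / (true_positives + false_positives)

print(f"share of significant findings that are real effects: {ppv:.0%}")
print(f"share that are false positives: {1 - ppv:.0%}")
# With these assumptions, roughly one in three published significant
# results is a false positive, before counting questionable analyses.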

In a more recent paper (accepted for publication, not yet published) I empirically evaluate the validity of results in empirical software engineering. These results have been presented (keynotes, invited talks) on several occasions and have started to affect empirical practices. I have also published several papers in which I disclose myths and misinterpretations in software engineering. Particularly much cited is the paper where I show that the main source documenting the software crisis, the Standish Group's CHAOS reports, is severely flawed.

Improved effort estimation of software projects: A (not yet published) review by researchers at Brunel University finds that I am involved in about 10% of all journal papers on software development effort estimation. Of particularly strong impact are the studies where I document how easily software project cost and effort estimates are misled by irrelevant factors; one example is the finding on how estimation is misled by anchoring and priming effects. My results on estimation biases are regularly presented to the software industry and, as a result, there seems to be an increasing awareness among software companies about these effects. One of the studies resulted in a new method for judgment-based estimation, based on contrasting the ideal and the most likely use of effort to complete a task.

Improved methods for assessment of uncertainty in the use of effort in software projects: I have documented a high level of over-confidence in the accuracy of effort estimates. Several new uncertainty assessment methods have been suggested and evaluated in real-life contexts, some of them with documented good effect on judgment realism and implemented in software companies. In a study on risk assessment, I found that there were situations where more work on risk assessment led to a higher degree of over-confidence. I explain this surprising effect and suggest ways to avoid this potentially harmful side effect of extensive risk analysis.
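
Judgment realism of this kind is typically measured as the hit rate of stated prediction intervals: the proportion of projects whose actual effort falls inside the estimator's, say, 90% minimum-maximum interval. A hit rate far below 90% signals over-confidence. A minimal sketch (the project data below are invented for illustration):

# Hit rate of judgment-based 90% effort prediction intervals.
# Tuples are (estimated minimum, estimated maximum, actual effort) in
# work-hours; all values are made up for illustration.
projects = [
    (80, 120, 120),
    (150, 200, 260),
    (300, 400, 520),
    (40, 60, 55),
    (500, 700, 690),
    (90, 130, 210),
]

hits = sum(1 for low, high, actual in projects if low <= actual <= high)
hit_rate = hits / len(projects)

print(f"stated confidence: 90%, observed hit rate: {hit_rate:.0%}")
# Observed hit rates around 50-70% for intervals stated with 90%
# confidence are the kind of over-confidence documented above (cf.
# 'When 90% confidence intervals are only 50% certain').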

Selected journal publications:
1. M. Jørgensen, P. Mohagheghi, and S. Grimstad. Indirect and direct connections between type of contract and software project outcome, International Journal of Project Management, 35(8): 1573-1586, 2017.
2. M. Mäntylä, M. Jørgensen, P. Ralph, and H. Erdogmus. Guest editorial for special section on success and failure in software engineering, Empirical Software Engineering, 1-17, 2017.
3. M. Jørgensen. Better selection of software providers through trialsourcing, IEEE Software, 33: 48-53, 2016.
4. M. Jørgensen. A survey on the characteristics of projects with success in delivering client benefits, Information and Software Technology, 78: 83-94, 2016.
5. M. Jørgensen. Unit effects in software project effort estimation: Work-hours gives lower effort estimates than workdays, Journal of Systems and Software, 117: 274-281, 2016.
6. M. Jørgensen, T. Dybå, K. Liestøl, and D. I. K. Sjøberg. Incorrect results in software engineering experiments: How to improve research practices, Journal of Systems and Software, 116: 133-145, 2016.
7. E. Løhre and M. Jørgensen. Numerical anchors and their strong effects on software development effort estimates, Journal of Systems and Software, 116: 49-56, 2016.
8. M. Jørgensen. Failure Factors of Software Projects at a Global Outsourcing Marketplace, Journal of Systems and Software, 92: 157-169, 2014.
9. M. Jørgensen. Relative Estimation of Software Development Effort: It Matters With What and How You Compare, IEEE Software (March): 74-79, 2013.
10. M. Jørgensen. The Influence of Selection Bias on Effort Overruns in Software Development Projects, Information and Software Technology, 55(9): 1640-1650, 2013.
11. T. Halkjelsvik and M. Jørgensen. From origami to software development: A review of studies on judgment-based predictions of performance time, Psychological Bulletin, 138(2): 238-271, 2012.
12. M. Jørgensen and S. Grimstad. Software Development Estimation Biases: The Role of Interdependence, IEEE Transactions on Software Engineering, 38(3): 677-693, 2012.
13. M. Jørgensen and B. Kitchenham. Interpretation problems related to the use of regression models to decide on economy of scale in software development, Journal of Systems and Software, 85(11): 2494-2503, 2012.
14. T. Halkjelsvik, M. Jørgensen, and K. H. Teigen. To Read Two Pages, I Need 5 Minutes, but Give Me 5 Minutes and I Will Read Four: How to Change Productivity Estimates by Inverting the Question, Applied Cognitive Psychology, 25(2): 314-323, 2011.
15. M. Jørgensen. Contrasting Ideal and Realistic Conditions as a Means to Improve Judgment-based Software Development Effort Estimation, Information and Software Technology, 53(12): 1382-1390, 2011.
16. M. Jørgensen and S. Grimstad. The Impact of Irrelevant and Misleading Information on Software Development Effort Estimates: A Randomized Controlled Field Experiment, IEEE Transactions on Software Engineering, 37(5): 695-707, 2011.
17. M. Jørgensen. Selection of Effort Estimation Strategies, Journal of Systems and Software, 83(6): 1039-1050, 2010.
18. M. Jørgensen. Identification of More Risks Can Lead to Increased Over-Optimism of and Over-Confidence in Software Development Effort Estimates, Information and Software Technology, 52(5): 506-516, 2010.

19. M. Jørgensen and T. Halkjelsvik. The Effects of Request Formats on Judgment-based Effort Estimation, Journal of Systems and Software, 83(1): 29-36, 2010.
20. M. Jørgensen. How to Avoid Selecting Providers with Bids Based on Over-Optimistic Cost Estimates, IEEE Software (May/June), 26(3): 79-84, 2009.
21. M. Jørgensen and T. Gruschke. The Impact of Lessons-Learned Sessions on Effort Estimation and Uncertainty Assessments, IEEE Transactions on Software Engineering, 35(3): 368-383, 2009.
22. T. M. Gruschke and M. Jørgensen. The role of outcome feedback in improving the uncertainty assessment of software development effort estimates, ACM Transactions on Software Engineering and Methodology, 17(4): 20-35, 2008.
23. J. E. Hannay and M. Jørgensen. The Role of Artificial Design Elements in Software Engineering Experiments, IEEE Transactions on Software Engineering, 34(2): 242-259, 2008.
24. M. Jørgensen and S. Grimstad. Avoiding Irrelevant and Misleading Information When Estimating Development Effort, IEEE Software (May/June): 78-83, 2008.
25. S. Grimstad and M. Jørgensen. Inconsistency in Expert Judgment-based Estimates of Software Development Effort, Journal of Systems and Software, 80(11): 1770-1777, 2007.
26. M. Jørgensen. Estimation of Software Development Work Effort: Evidence on Expert Judgment and Formal Models, International Journal of Forecasting, 23(3): 449-462, 2007.
27. M. Jørgensen and M. Shepperd. A Systematic Review of Software Development Cost Estimation Studies, IEEE Transactions on Software Engineering, 33(1): 33-53, 2007.
28. M. Jørgensen, B. Faugli, and T. M. Gruschke. Characteristics of Software Engineers with Optimistic Predictions, Journal of Systems and Software, 80(9): 1472-1482, 2007.
29. S. Grimstad, M. Jørgensen, and K. J. Moløkken-Østvold. Software Effort Estimation Terminology: The Tower of Babel, Information and Software Technology, 48(4): 302-310, 2006.
30. M. Jørgensen. The Effects of the Format of Software Project Bidding Processes, International Journal of Project Management, 24(6): 522-528, 2006.
31. M. Jørgensen and K. J. Moløkken-Østvold. How Large Are Software Cost Overruns? Critical Comments on the Standish Group's CHAOS Reports, Information and Software Technology, 48(4): 297-301, 2006.
32. T. Dybå, B. Kitchenham, and M. Jørgensen. Evidence-based Software Engineering for Practitioners, IEEE Software, 22(1): 58-65, 2005.
33. M. Jørgensen. Practical guidelines for better support of expert judgement-based software effort estimation, IEEE Software, 22(3): 57-63, 2005.
34. M. Jørgensen. Evidence-Based Guidelines for Assessment of Software Development Cost Uncertainty, IEEE Transactions on Software Engineering, 31(11): 942-954, 2005.
35. A. Karahasanovic, B. C. D. Anda, E. Arisholm, S. E. Hove, M. Jørgensen, D. I. K. Sjøberg, and R. Welland. Collecting Feedback during Software Engineering Experiments, Empirical Software Engineering, 10(2): 113-147, 2005.
36. K. J. Moløkken-Østvold and M. Jørgensen. Expert Estimation of the Effort of Web-Development Projects: Are Software Professionals in Technical Roles More Optimistic Than Those in Non-Technical Roles?, Empirical Software Engineering, 10(1): 7-30, 2005.
37. K. J. Moløkken-Østvold and M. Jørgensen. A Comparison of Software Project Overruns: Flexible vs. Sequential Development Models, IEEE Transactions on Software Engineering, 31(9): 754-766, 2005.
38. K. H. Teigen and M. Jørgensen. When 90% confidence intervals are only 50% certain: On the credibility of credible intervals, Applied Cognitive Psychology, 19(4): 455-475, 2005.
39. M. Jørgensen. Top-Down and Bottom-Up Expert Estimation of Software Development Effort, Information and Software Technology, 46(1): 3-16, 2004.
40. M. Jørgensen. Regression Models of Software Development Effort Estimation Accuracy and Bias, Empirical Software Engineering, 9(4): 297-314, 2004.
41. M. Jørgensen. A Review of Studies on Expert Estimation of Software Development Effort, Journal of Systems and Software, 70(1-2): 37-60, 2004.
42. M. Jørgensen. Increasing Realism in Effort Estimation Uncertainty Assessments: It Matters How You Ask, IEEE Transactions on Software Engineering, 30(4): 209-217, 2004.

43. M. Jørgensen and G. J. Carelius. An Empirical Study of Software Project Bidding, IEEE Transactions on Software Engineering, 30(12): 953-969, 2004.
44. M. Jørgensen and K. J. Moløkken-Østvold. Reasons for Software Effort Estimation Error: Impact of Respondents' Role, Information Collection Approach, and Data Analysis Method, IEEE Transactions on Software Engineering, 30(12): 993-1007, 2004.
45. M. Jørgensen and D. I. K. Sjøberg. The impact of customer expectation on software development effort estimates, International Journal of Project Management, 22(4): 317-325, 2004.
46. M. Jørgensen, K. H. Teigen, and K. J. Moløkken-Østvold. Better sure than safe? Overconfidence in judgment based software development effort prediction intervals, Journal of Systems and Software, 70(1-2): 79-93, 2004.
47. K. J. Moløkken-Østvold and M. Jørgensen. Group Processes in Software Effort Estimation, Empirical Software Engineering, 9(4): 315-334, 2004.