Open access

Rigour and reproducibility in Canadian research: call for a coordinated approach

Publication: FACETS
6 January 2022

Abstract

Shortcomings in the rigour and reproducibility of research have become well-known issues and persist despite repeated calls for improvement. A coordinated effort among researchers, institutions, funders, publishers, learned societies, and regulators may be the most effective way of tackling these issues. The UK Reproducibility Network (UKRN) has fostered collaboration across various stakeholders in research and is creating the infrastructure necessary to advance rigorous and reproducible research practices across the United Kingdom. Other Reproducibility Networks, modelled on UKRN, are now emerging in other countries. Canada could benefit from a comparable network to unify the voices around research quality and maximize the value of Canadian research.
For research to maximally benefit society, the methods and findings must be available, interpretable, and trustworthy. Nonetheless, research often remains unpublished (Glasziou and Chalmers 2017; EBM DataLab 2018), and results are often nonreproducible or potentially false (e.g., Ioannidis 2005; Open Science Collaboration 2015; Klein et al. 2018; Scheel et al. 2020; Errington et al. 2021). This state of affairs wastes resources and constitutes an ethical failing towards research participants and the animals used in research. These shortcomings can lead to useless or even harmful applications of research findings (e.g., in health care: Prasad and Cifu 2019).
While researchers have identified several practices that lead to research waste and low reproducibility1 over the past few decades, these problems largely remain unresolved. For example, an estimated 85% of clinical research funding is wasted (Chalmers and Glasziou 2009; Glasziou and Chalmers 2016) through nonpublication (Riedel et al. 2021; Ross et al. 2012; Wieschowski et al. 2019); lack of clarity, completeness, and accuracy in published reports (Glasziou et al. 2014); and flaws in study design (Yordanov et al. 2015). Beyond clinical research, large-scale replication attempts have revealed low reproducibility of methods and results in disciplines as diverse as psychology (Open Science Collaboration 2015), cancer biology (Errington et al. 2021), economics (Camerer et al. 2016), and water resource management (Stagge et al. 2019). Research waste and low reproducibility arise in part because sharing protocols, data, and analysis scripts remains extremely rare (e.g., in psychology (Hardwicke et al. 2020a), the social sciences (Hardwicke et al. 2020b), and biomedicine (Iqbal et al. 2016)), and because researchers regularly misunderstand (Gigerenzer 2004; Hoekstra et al. 2014; Lyu et al. 2020) and misapply (e.g., Nieuwenhuis et al. 2011) common statistical methods. In contrast to research misconduct—which occurs when an individual fabricates data, falsifies the research record, or plagiarizes the work of others—research waste and irreproducibility are widespread systemic issues with pernicious impacts. Fortunately, improvements in reproducibility also make misconduct more difficult to commit and easier to detect. Taken together, these issues undermine the trustworthiness and value of research. In a largely publicly funded research environment, such as Canada’s, this waste is all the more problematic.
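As a concrete illustration of the statistical misapplication highlighted by Nieuwenhuis et al. (2011), where researchers conclude that two effects differ because one reaches statistical significance and the other does not rather than testing the difference directly, the following minimal Python sketch (hypothetical data; NumPy and SciPy assumed) contrasts the flawed reasoning with a direct test of the difference.

```python
# Minimal sketch (hypothetical data) of the "interaction fallacy" described by
# Nieuwenhuis et al. (2011): "significant in condition A, not significant in
# condition B" does not demonstrate that the two conditions differ.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20
effect_a = rng.normal(0.5, 1.0, n)   # treatment effect measured in condition A
effect_b = rng.normal(0.3, 1.0, n)   # treatment effect measured in condition B

# Common (flawed) reasoning: A is significant, B is not, therefore A differs from B.
p_a = stats.ttest_1samp(effect_a, 0).pvalue
p_b = stats.ttest_1samp(effect_b, 0).pvalue

# Appropriate approach: test the difference between the two effects directly.
p_diff = stats.ttest_ind(effect_a, effect_b).pvalue

print(f"Effect in A: p = {p_a:.3f}")
print(f"Effect in B: p = {p_b:.3f}")
print(f"Difference between A and B (the test that matters): p = {p_diff:.3f}")
```

With hypothetical data such as these, both effects can point in the same direction while only one crosses the significance threshold; only the direct comparison (or, in a factorial design, the interaction term) licenses a claim that the effects differ.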
Researchers are not the only ones responsible for rigour2 and reproducibility. These issues are embedded in a complex research ecosystem that includes various parties with similar end-goals but diverging incentives and proximate goals. Universities want to rank high in league tables and tend to hire and promote researchers based on journal Impact Factor and grant funding, whilst overlooking open and reproducible research practices such as data sharing and protocol registration (Rice et al. 2020). Publishers want to attract readers and citations and generally prefer striking findings (SAGE Publishing 2015; Mlinarić et al. 2017; Nature Publishing Group 2021), which can encourage questionable research practices (Fiedler and Schwarz 2016; Fraser et al. 2018; Gopalakrishna et al. 2021) and spin (Jellison et al. 2020). Some regulators develop policies to counter these issues (e.g., FDA 2016), but they often fail to monitor for compliance or to provide the infrastructure necessary to meet the requirements (EBM DataLab 2018; Scaffidi et al. 2021; TARG Meta-Research Group & Collaborators 2021). Simply telling researchers how to do rigorous and reproducible research is not enough. A network approach that coordinates the incentives and proximate goals of researchers, funders, publishers, institutions, learned societies, regulators, and other stakeholders towards the common end-goal of maximizing the value of research could work best. This approach requires training for researchers (e.g., workshops in open science), resources to implement best practices (e.g., data management staff or data champions), the development and use of user-friendly tools (e.g., data registries, experimental design assistants), regulations that come with audits and feedback, and a common understanding of the importance of rigour and reproducibility.
In a few countries, these coordinated efforts are underway. The United Kingdom has led the charge with several national-level reports on research culture (Nuffield Council on Bioethics 2014; Wilsdon et al. 2015; House of Commons Science and Technology Committee 2018; Vitae UK 2020) and a recent parliamentary inquiry into reproducibility and research integrity (UK Parliament 2021). In parallel, British researchers founded the UK Reproducibility Network (UKRN; ukrn.org). Launched in 2019, this network now comprises local networks at more than 50 universities, over 20 institutions that have formally joined by creating a senior academic lead role focused on research improvement, and external stakeholders including funders (e.g., UK Research and Innovation, Research England, Wellcome), learned societies (e.g., British Psychological Society), and publishers (e.g., Nature Publishing Group, Wiley). The UKRN has developed and delivered training programs on open research across the United Kingdom and has worked with researchers, institutions, and stakeholders to coordinate efforts to improve research quality. Its unified voice for reproducibility led to the recent award of £4.5M by Research England—a “major strategic investment” intended to drive the uptake of open research practices. These achievements speak to the power of a coordinated approach that provides a voice for researchers themselves.
Other countries, including Australia, Finland, Germany, Italy, Portugal, Slovakia, and Switzerland, have developed their own Reproducibility Networks. A handful of countries also have organizations that serve as hubs for research rigour and reproducibility, such as the Association for Interdisciplinary Meta-Research and Open Science (AIMOS) in Australia, the Center for Open Science (COS) and the Meta-Research Innovation Center at Stanford (METRICS) in the United States, the QUEST Center for Responsible Research in Germany, the Research on Research Institutes in the United Kingdom and the Netherlands, and the BRIGHTER Meta-Research Group in Brazil. Canadian researchers and organizations have expressed interest in these topics (e.g., the Centre for Journalology at the Ottawa Hospital Research Institute), but we lack a more formal structure to tackle these issues as a nation.
Canada punches above its weight in terms of the quantity of research output (Nature Index 2021; World Bank 2018), but we, like other countries, remain susceptible to the shortcomings discussed earlier. Canadian funders and institutions lag behind in reporting clinical trial results (Cobey et al. 2017), and Canadian universities use hiring and promotion criteria that overlook practices such as data sharing, open access publishing, study registration, and the use of reporting guidelines (Rice et al. 2021). Some organizations aim to address these problems; for example, the recent Tri-Agency data management policy will soon require grant recipients to deposit their data in a digital repository (Government of Canada 2021). While this policy is a step forward, a clear roadmap for how to implement it is absent: it will require training, resources, financial support, and auditing for compliance (Moher and Cobey 2021). At the moment, Canadian funders and universities lack publicly available data on compliance with their own policies on open access publishing, study registration, and data management. This shortcoming is all the more conspicuous given the Canadian Government’s dedication to Open Government, which includes a specific commitment to open science (Government of Canada 2016). Meta-research specific to the Canadian research environment remains limited and would help elucidate the best paths forward.
By emulating national organizations such as the UKRN and the other initiatives mentioned above, we can accelerate Canada’s progress towards more rigorous and reproducible research. We can increase our attractiveness for international collaborations and international funding competitions. We can create a research culture that aligns stakeholders in the Canadian research ecosystem towards the common good of available, interpretable, and trustworthy research. The Canadian public, including patients and other end-users of research findings, would surely welcome such advances.
If you or your organization is interested in being part of such a network, please email the corresponding author.

Footnotes

1. In this paper, we use the term reproducibility broadly. Some researchers distinguish replicating the methods of an experiment from reproducing the results using data that already exist (Barba 2018). Others recommend distinguishing between methods reproducibility, results reproducibility, and inferential reproducibility (Goodman et al. 2016). All of these concepts relate to the overarching issue we discuss in this paper: making methods and findings available, interpretable, and trustworthy. Thus, we use the term reproducibility to encompass the many activities of replicating or reproducing any aspect of research.

2. We use the term rigour in line with the National Institutes of Health definition of ensuring “robust and unbiased experimental design, methodology, analysis, interpretation, and reporting of results” (NIH n.d.).

Funding

Robert Thibault is supported by a general support grant awarded to METRICS from the Laura and John Arnold Foundation and a postdoctoral fellowship from the Fonds de recherche du Québec – Santé, Canada. The funders had no role in the preparation of the manuscript or decision to publish.

References

Barba LA. 2018. Terminologies for Reproducible Research. arXiv, 1802.03311 [cs]. [online]: Available from arxiv.org/abs/1802.03311.
Camerer CF, Dreber A, Forsell E, Ho TH, Huber J, Johannesson M, et al. 2016. Evaluating replicability of laboratory experiments in economics. Science, 351: 1433–1436.
Chalmers I, and Glasziou P. 2009. Avoidable waste in the production and reporting of research evidence. The Lancet, 374: 86–89.
Cobey KD, Fergusson D, and Moher D. 2017. Canadian funders and institutions are lagging on reporting results of clinical trials. CMAJ, 189: E1302–E1303.
EBM DataLab. 2018. FDAAA TrialsTracker. [online]: Available from fdaaa.trialstracker.net/.
Errington TM, Mathur M, Soderberg CK, Denis A, Perfito N, Iorns E, et al. 2021. Investigating the replicability of preclinical cancer biology. eLife, 10: e71601.
FDA. 2016. 42 C.F.R. § 11 – Clinical trials registration and results information submission. [online]: Available from ecfr.gov/current/title-42/chapter-I/subchapter-A/part-11.
Fiedler K, and Schwarz N. 2016. Questionable research practices revisited. Social Psychological and Personality Science, 7: 45–52.
Fraser H, Parker T, Nakagawa S, Barnett A, and Fidler F. 2018. Questionable research practices in ecology and evolution. PLoS ONE, 13: e0200303.
Gigerenzer G. 2004. Mindless statistics. Journal of Socio-Economics, 33: 587–606.
Glasziou P, Altman DG, Bossuyt P, Boutron I, Clarke M, Julious S, et al. 2014. Reducing waste from incomplete or unusable reports of biomedical research. Lancet, 383: 267–276.
Glasziou P, and Chalmers I. 2016. Is 85% of health research really “wasted”? The BMJ.
Glasziou P, and Chalmers I. 2017. Can it really be true that 50% of research is unpublished? The BMJ.
Goodman SN, Fanelli D, and Ioannidis JPA. 2016. What does research reproducibility mean? Science Translational Medicine, 8: 341ps12.
Gopalakrishna G, Riet G, Vink G, Stoop I, Wicherts J, and Bouter L. 2021. Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands.
Government of Canada. 2016. Third biennial plan to the open government partnership. [online]: Available from open.canada.ca/en/content/third-biennial-plan-open-government-partnership.
Government of Canada. 2021. Tri-Agency research data management policy. [online]: Available from science.gc.ca/eic/site/063.nsf/eng/h_97610.html.
Hardwicke TE, Thibault RT, Kosie JE, Wallach JD, Kidwell M, and Ioannidis J. 2020a. Estimating the prevalence of transparency and reproducibility-related research practices in psychology (2014–2017) (preprint). MetaArXiv.
Hardwicke TE, Wallach JD, Kidwell MC, Bendixen T, Crüwell S, and Ioannidis JPA. 2020b. An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017). Royal Society Open Science, 7: 190806.
Hoekstra R, Morey RD, Rouder JN, and Wagenmakers EJ. 2014. Robust misinterpretation of confidence intervals. Psychonomic Bulletin and Review, 21: 1157–1164.
House of Commons Science and Technology Committee. 2018. Research integrity sixth report of session 2017-19. [online]: Available from publications.parliament.uk/pa/cm201719/cmselect/cmsctech/350/350.pdf.
Ioannidis JPA. 2005. Why most published research findings are false. PLOS Medicine, 2: e124.
Iqbal SA, Wallach JD, Khoury MJ, Schully SD, and Ioannidis JPA. 2016. Reproducible research practices and transparency across the biomedical literature. PLOS Biology, 14: e1002333.
Jellison S, Roberts W, Bowers A, Combs T, Beaman J, Wayant C, et al. 2020. Evaluation of spin in abstracts of papers in psychiatry and psychology journals. BMJ EBM, 25: 178–181.
Klein RA, Vianello M, Hasselman F, Adams BG, Adams RB, Alper S, et al. 2018. Many Labs 2: Investigating Variation in Replicability Across Samples and Settings. Advances in Methods and Practices in Psychological Science, 1: 443–490.
Lyu XK, Xu Y, Zhao XF, Zuo XN, and Hu CP. 2020. Beyond psychology: Prevalence of p value and confidence interval misinterpretation across different fields. Journal of Pacific Rim Psychology.
Mlinarić A, Horvat M, and Šupak Smolčić V. 2017. Dealing with the positive publication bias: Why you should really publish your negative results. Biochem Med (Zagreb), 27: 030201.
Moher D, and Cobey KD. 2021. Ensuring the success of data sharing in Canada. FACETS, 6: 1534–1538.
Nature Index. 2021. 2021 tables: Countries/territories. [online]: Available from natureindex.com/annual-tables/2021/country/all.
Nature Publishing Group. 2021. Editorial criteria and processes. Nature. [online]: Available from nature.com/nature/for-authors/editorial-criteria-and-processes.
Nieuwenhuis S, Forstmann BU, and Wagenmakers E. 2011. Erroneous analyses of interactions in neuroscience: A problem of significance. Nature Neuroscience, 14: 1105–1109.
NIH. n.d. Rigor and Reproducibility. National Institutes of Health (NIH). [online]: Available from nih.gov/research-training/rigor-reproducibility.
Nuffield Council on Bioethics. 2014. The culture of scientific research in the UK. [online]: Available from nuffieldbioethics.org/publications/the-culture-of-scientific-research.
Open Science Collaboration. 2015. Estimating the reproducibility of psychological science. Science, 349: aac4716.
Prasad V, and Cifu AS. 2019. Ending Medical Reversal: Improving Outcomes, Saving Lives. Johns Hopkins University Press, Baltimore, Maryland.
Rice DB, Raffoul H, Ioannidis JPA, and Moher D. 2021. Academic criteria for promotion and tenure in faculties of medicine: A cross-sectional study of the Canadian U15 universities. FACETS, 6: 58–70.
Rice DB, Raffoul H, Ioannidis JPA, and Moher D. 2020. Academic criteria for promotion and tenure in biomedical sciences faculties: Cross sectional analysis of international sample of universities. BMJ, 369: m2081.
Riedel N, Wieschowski S, Bruckner T, Holst MR, Kahrass H, Nury E, et al. 2021. Results dissemination from completed clinical trials conducted at German university medical centers remains delayed and incomplete. The 2014–2017 cohort (preprint). Public and Global Health.
Ross JS, Tse T, Zarin DA, Xu H, Zhou L, and Krumholz HM. 2012. Publication of NIH funded trials registered in ClinicalTrials.gov: Cross sectional analysis. BMJ, 344: d7292.
SAGE Publishing. 2015. Increasing Citations and Improving Your Impact Factor. SAGE Publications Inc. [online]: Available from us.sagepub.com/en-us/nam/increasing-citations-and-improving-your-impact-factor.
Scaffidi MA, Elsolh K, Li J, Verma Y, Bansal R, Gimpaya N, et al. 2021. Do authors of research funded by the Canadian Institutes of Health Research comply with its open access mandate?: A meta-epidemiologic study. PLoS ONE, 16: e0256577.
Scheel AM, Schijen M, and Lakens D. 2020. An excess of positive results: Comparing the standard psychology literature with registered reports. Advances in Methods and Practices in Psychological Science, 4(2): 25152459211007467.
Stagge JH, Rosenberg DE, Abdallah AM, Akbar H, Attallah NA, and James R. 2019. Assessing data availability and research reproducibility in hydrology and water resources. Scientific Data, 6: 190030.
TARG Meta-Research Group & Collaborators. 2021. Estimating the prevalence of discrepancies between study registrations and publications: A systematic review and meta-analyses (preprint). medRxiv.
UK Parliament. 2021. Reproducibility and research integrity. [online]: Available from committees.parliament.uk/work/1433/reproducibility-and-research-integrity/.
Vitae UK. 2020. Research integrity: A landscape Study. [online]: Available from vitae.ac.uk/vitae-publications/reports/research-integrity-a-landscape-study.
Wieschowski S, Riedel N, Wollmann K, Kahrass H, Müller-Ohlraun S, Schürmann C, et al. 2019. Result dissemination from clinical trials conducted at German university medical centers was delayed and incomplete. Journal of Clinical Epidemiology, 115: 37–45.
Wilsdon J, Allen L, Belfiore E, Campbell P, Curry S, Hill S, et al. 2015. The metric tide: Independent review of the role of metrics in research assessment and management.
World Bank. 2018. Scientific and technical journal articles | Data. [online]: Available from data.worldbank.org/indicator/IP.JRN.ARTC.SC?most_recent_value_desc=true&year_low_desc=true.
Yordanov Y, Dechartres A, Porcher R, Boutron I, Altman DG, and Ravaud P. 2015. Avoidable waste of research related to inadequate methods in clinical trials. BMJ, 350: h809.

Information & Authors

Published In

FACETS
Volume 7, Number 1, January 2022
Pages: 18–24
Editor: Iain E.P. Taylor

History

Received: 13 October 2021
Accepted: 9 December 2021
Version of record online: 6 January 2022

Data Availability Statement

All relevant data are within the paper.

Key Words

  1. reproducibility
  2. rigour
  3. replication
  4. meta-research
  5. metascience
  6. open research
  7. transparency
  8. data sharing
  9. evidence-based medicine

Authors

Affiliations

Robert T. Thibault [email protected]
Meta-Research Innovation Center at Stanford (METRICS), Stanford University, California, 94305, United States
School of Psychological Science, University of Bristol, Bristol, BS8 1TH, United Kingdom
MRC Integrative Epidemiology Unit at the University of Bristol, Bristol, BS8 1TH, United Kingdom
Marcus R. Munafò
School of Psychological Science, University of Bristol, Bristol, BS8 1TH, United Kingdom
MRC Integrative Epidemiology Unit at the University of Bristol, Bristol, BS8 1TH, United Kingdom
David Moher
Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, ON, K1Y 4E9, Canada
School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, ON, K1N 6N5, Canada

Author Contributions

All the authors conceived this article from their conversations.
RTT wrote the article with feedback from MRM and DM.

Competing Interests

All authors have a current interest in improving research rigour and reproducibility. Marcus Munafò is a co-founder of the UK Reproducibility Network and currently chairs its Steering Group.
