Academic criteria for promotion and tenure in faculties of medicine: a cross-sectional study of the Canadian U15 universities
Abstract
Background: The objective of this study was to determine the presence of a set of prespecified criteria used to assess scientists for promotion and tenure within faculties of medicine among the U15 Group of Canadian Research Universities.
Methods: Each faculty guideline for assessing promotion and tenure was reviewed, and the presence of five traditional (peer-reviewed publications, authorship order, journal impact factor, grant funding, and national/international reputation) and seven nontraditional (citations, data sharing, publishing in open access mediums, accommodating leaves, alternative ways of sharing research, registering research, and using reporting guidelines) criteria was recorded by two reviewers.
Results: Among the U15 institutions, four of five traditional criteria (80.0%) were present in at least one promotion guideline, whereas only three of seven nontraditional incentives (42.9%) were present in any promotion guideline. When assessing full professors, a median of three traditional criteria were listed, versus a median of one nontraditional criterion.
Conclusion: This study demonstrates that faculties of medicine among the U15 Group of Canadian Research Universities base assessments for promotion and tenure on traditional criteria. Some of these metrics may reinforce problematic practices in medical research. These faculties should consider incentivizing criteria that can enhance the quality of medical research.
Introduction
Recent national-level commitments have been made to support research excellence and the recognition of Canada as a global leader in research (Canada’s Fundamental Science Review 2016). Appropriately conducting medical research requires adhering to best practice approaches to produce high-quality, replicable, and publishable research with accessible findings. Currently, important deficiencies in medical research exist and contribute to the ongoing waste of resources (Moja et al. 2005; Collier et al. 2016; Ioannidis et al. 2017; Moher et al. 2017).
Assessing the quality of research has been highlighted as the cornerstone of adjudication that should be applied when evaluating Canadian research and researchers (Advisory Panel on Federal Support for Fundamental Science and Naylor 2017). Academic institutions may influence and improve research through the evaluation process used for hiring, promotion, and tenure of their faculty (Moher et al. 2016; Flier et al. 2017). Academics tailor their publication practices to the evaluation criteria applied in their institution (Wolff et al. 2016). The current incentives being applied to assess researchers, however, may include problematic metrics that reinforce the limitations of medical research (Hammarfelt et al. 2017; Quan et al. 2017). Many universities incentivize the quantity of publications rather than the reliability of findings (Rice et al. 2020). This can inadvertently result in a focus on research output that sacrifices accuracy and transparency, with important consequences in the health sciences, where clinical decision-making relies on research findings (Collier et al. 2016). It has been recommended to provide incentives and rewards (e.g., promotions) for research that is conducted appropriately, adheres to best publication practices, and produces results that more meaningfully impact society (Flier et al. 2017; Rice et al. 2020). In 2017, an international expert panel composed of academic leaders, funders, and scientists was convened to review key documents about promotion and tenure and to discuss redesigning the current approach to assessing scientists. Six progressive principles, including rewarding researchers for open science practices and for the transparent and complete reporting of research, were highlighted (Moher et al. 2018). Incentivizing the complete and transparent publishing of all research was identified as a basis for establishing best practices for adapting the current approach used to assess scientists (Moher et al. 2018).
Within Canada, the U15 Group of Canadian Research Universities is a collective of the nation’s prominent research-intensive universities. The U15 conducts approximately $8.5 billion worth of research annually and receives 80% of all competitively allocated research funding in Canada (Group of Canadian Research Universities 2019). The U15 collaborates with federal policy makers and prestigious funding bodies, such as the Canadian Institutes of Health Research and the Social Sciences and Humanities Research Council, to foster scientific investigation in Canada (Group of Canadian Research Universities 2019). Given the influence of the U15 in Canadian research, understanding the criteria used to incentivize and reward academics through promotion and tenure is necessary. Ensuring that the reward systems applied will encourage best publication practices can help Canada remain an international leader in research and ensure reliable evidence for health care. Therefore, we aimed to identify and document the presence of a set of prespecified criteria used to assess scientists for promotion and tenure within U15 faculties of medicine.
Methods
The protocol for this study was registered in the Open Science Framework (OSF) repository (osf.io/26ucp/?view_only=b80d2bc7416543639f577c1b8f756e44) prior to data collection. The search for criteria, the definitions of criteria, and the data collection procedures are similar between this protocol and another protocol under which we evaluated the same criteria in institutions around the world (Moher et al. 2018). The STROBE checklist for cross-sectional studies (Von Elm et al. 2007) was used to ensure that methods and findings are clearly reported (Supplementary Material 1).
Eligible university institutions
Members of the U15 were eligible. The U15 include (alphabetically): Dalhousie University (Halifax, Nova Scotia), McGill University (Montreal, Quebec), McMaster University (Hamilton, Ontario), Queen’s University (Kingston, Ontario), Université de Montréal (Montreal, Quebec), Université Laval (Quebec City, Quebec), University of Alberta (Edmonton, Alberta), University of British Columbia (Vancouver, British Columbia), University of Calgary (Calgary, Alberta), University of Manitoba (Winnipeg, Manitoba), University of Ottawa (Ottawa, Ontario), University of Saskatchewan (Saskatoon, Saskatchewan), University of Toronto (Toronto, Ontario), University of Waterloo (Waterloo, Ontario), and Western University (London, Ontario).
Searching of institution criteria
Searching for institutional criteria involved an iterative process. Two reviewers (DBR, HR) independently searched institution websites for reported guidelines and policies used for evaluation, promotion, and tenure in the faculty of medicine or relevant biomedical sciences faculty. Keywords searched on the institution websites included “academic performance”, “career mobility”, “criteria”, “evaluation”, “guidelines”, “policy”, “promotion”, and “tenure”, as recommended by a medical information specialist.
Approach to selecting list of criteria
Twelve criteria were selected to enable a comparison between traditional (e.g., quantity of publications) and nontraditional (e.g., reproducibility of research) criteria used to assess scientists for promotion and tenure (Supplementary Material 2). We divided criteria into two groups: traditional and nontraditional. This characterization was ultimately subjective, but we based our decisions on evidence and policy initiatives from several sources (e.g., McKiernan et al. 2017; Rice et al. 2020), and the criteria used were applied in a recently published study (Moher et al. 2018). Traditional criteria are those that were proposed many decades ago, whereas nontraditional criteria are those whose advent has been more recent. An early version of the criteria included 10 items; however, two additional items were added after pilot testing on a set of five institutions. The final set of criteria included five traditional criteria (peer-reviewed publications, authorship order, journal impact factor, grant funding, national or international reputation) and seven nontraditional criteria (citations, data sharing, publishing in open access mediums, registration of research, adherence to reporting guidelines, alternative approaches to sharing research, accommodations or adjustments for employment leave).
Data collection
Faculty of medicine guidelines were reviewed to determine whether any of the 12 items from our list of criteria for faculty promotion and tenure were present. We extracted this information for the evaluation of assistant professor, associate professor, full professor, and the granting of tenure. This information was extracted for tenure-track positions rather than nontenure-track or clinical professor positions. We did not extract promotion and tenure criteria for aspects of career advancement related to teaching or clinical duties, or for positions comprising more educational or clinical activities than research activities. The faculty name, the year that the criteria were published, the associated URL of the criteria, and the date that the website was searched were also extracted. We also extracted the rank of the relevant faculty of medicine or biomedical sciences for each institution as reported by the Centre for Science and Technology Studies (CWTS) Leiden Ranking of world universities, from the list available in 2018 (leidenranking.com/ranking/2018/list), using the CWTS default settings of indicators. Under the “impact” indicator type, these include the total number of publications (P), the number of publications among the top 10% of most frequently cited publications in the same field and in the same year (P(top 10%)), and the proportion of total publications in the top 10% (PP(top 10%)); collaborative publications were counted fractionally. Rankings were extracted from the list of top “Biomedical and Health Sciences” faculties, with the minimum publication output left at the default of 100, both restricted to Canada and across all international universities listed (Centre for Science and Technology Studies 2019). The Leiden Ranking list was selected because it is a well-known list and to align with a related study that focused on a sample of international institutions (Rice et al. 2020). Two reviewers (DBR, HR) independently extracted all data, and the results were compared for consistency. Where consensus was not achieved between reviewers after discussion, a third team member (DM) was consulted to address discrepancies. Data collection was performed using a standardized electronic data collection form in Distiller Systematic Reviewer (Evidence Partners, Ottawa, Ontario, Canada). All data extraction forms are available on OSF (osf.io/9cgws/?view_only=b80d2bc7416543639f577c1b8f756e44).
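The Leiden impact indicators described above reduce to a simple computation. The sketch below is a minimal illustration, assuming invented publication records; it mirrors the definitions above (fractional counting of collaborative publications) but is not the CWTS implementation.

```python
# Minimal sketch of the CWTS Leiden Ranking impact indicators described above.
# The publication records are invented for illustration; CWTS derives these
# values from bibliometric databases, not from inputs like this.

# Each record: (in_top_10_percent_most_cited, number_of_collaborating_institutions)
publications = [
    (True, 1),   # single-institution paper in the top 10% most cited
    (False, 2),  # two-institution collaboration, not in the top 10%
    (True, 4),   # four-institution collaboration in the top 10%
]

# Fractional counting: a paper shared by k institutions contributes 1/k
# to each collaborating institution's totals.
P = sum(1 / k for _, k in publications)                 # total publications
P_top10 = sum(1 / k for top, k in publications if top)  # publications in top 10%
PP_top10 = P_top10 / P                                  # proportion in top 10%

print(f"P = {P:.2f}, P(top 10%) = {P_top10:.2f}, PP(top 10%) = {PP_top10:.1%}")
```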
Approach to synthesis
We present the number and percentage of institutions that listed each criterion for each hiring or promotion level. We also sum the number of traditional criteria (out of five) and the number of nontraditional criteria (out of seven) that were present.
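As a minimal sketch of this synthesis, assuming a presence matrix of the kind our extraction forms produced (the institution names and presence values below are invented for illustration):

```python
# Tally of traditional (out of five) and nontraditional (out of seven)
# criteria per guideline. All values below are invented placeholders.

TRADITIONAL = {"publications", "authorship_order", "journal_impact_factor",
               "grant_funding", "reputation"}
NONTRADITIONAL = {"citations", "data_sharing", "open_access", "registration",
                  "reporting_guidelines", "altmetrics", "leave_accommodations"}

# presence[institution] = criteria mentioned at a given promotion level
presence = {
    "Institution A": {"publications", "grant_funding", "reputation", "citations"},
    "Institution B": {"publications", "grant_funding", "leave_accommodations"},
}

for inst, criteria in presence.items():
    n_trad = len(criteria & TRADITIONAL)
    n_nontrad = len(criteria & NONTRADITIONAL)
    print(f"{inst}: {n_trad}/5 traditional, {n_nontrad}/7 nontraditional")

# Percentage of institutions listing a given criterion, as reported in Table 2
pct = 100 * sum("publications" in c for c in presence.values()) / len(presence)
print(f"Publications criterion present in {pct:.0f}% of guidelines")
```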
Results
Each of the U15 institutions had a department or faculty of medicine or applied health sciences and had faculty- or department-level guidelines for promotion and tenure publicly available. Faculty- and institution-level guidelines were last updated between 2004 and 2018 (median = 2016, interquartile range = 2014–2017). Guidelines were available for the evaluation of assistant professor, associate professor, full professor, and the granting of tenure in 9, 14, 15, and 7 institutions, respectively. Three institutions (McGill, Laval, and McMaster) had criteria that evaluated promotion and tenure combined (e.g., promotion to professor with tenure); these instances were included once at the level of professor. Among Canadian institutions, rankings of the U15 faculties of medicine ranged from 1 to 15 (among 28 ranked institutions, Table 1).
Table 1.
University | Name of faculty or department | Leiden ranking (world) | Criteria type used for extraction | Years that criteria were published or updated |
---|---|---|---|---|
Dalhousie University | Faculty of Medicine | 12 (262) | Faculty level | 2017 |
McGill University | Department of Medicine | 3 (34) | Faculty and institution level | 2016 |
McMaster University | Faculty of Medicine | 6 (104) | Faculty and institution level | 2011, 2012 |
Queen’s University | School of Medicine | 13 (285) | Faculty and department level | 2004, 2017
Université de Montréal | Faculty of Medicine | 7 (106) | Faculty level | 2004 |
Université Laval | Faculty of Medicine | 10 (227) | Faculty and department level | 2014 |
University of Alberta | The Faculty of Medicine and Dentistry | 4 (73) | Faculty level | 2016 |
University of British Columbia | Faculty of Medicine | 2 (30) | Faculty and institution level | 2008 |
University of Calgary | Faculty of Medicine | 8 (109) | Faculty level | 2016 |
University of Manitoba | Faculty of Medicine | 11 (231) | Faculty level | 2016 |
University of Ottawa | Faculty of Medicine | 5 (102) | Faculty and institution level | 2018 |
University of Saskatchewan | College of Medicine | 14 (355) | Faculty level | 2008, 2015, 2017 |
University of Toronto | Faculty of Medicine | 1 (2) | Faculty and institution level | 2016 |
University of Waterloo | Faculty of Applied Health Sciences | 15 (385) | Faculty and institution level | 2014 |
Western University | Schulich School of Medicine | 9 (113) | Faculty and institution level | 2017 |
Four of five (80.0%) traditional incentives were present in at least one level of the promotion guidelines of some or all 15 faculties of medicine. Specifically, the guidelines always mentioned publications and the receipt of grant funding, and they also commonly mentioned a specific authorship order within publications (7/15 guidelines) and professors and (or) their research being recognized at a national or international level (10/15 guidelines). The journal impact factor was not specifically referred to in any guidelines; however, many guidelines described assessing professors based on whether their publications appeared in “prestigious”, “high-impact”, or “quality” journals, without specifying how these descriptions are assessed. Three institutions (20%) made at least one mention of a minimum number of peer-reviewed publications (ranging from two papers in the last 5 years to 4–8 papers per year, the latter applying at the rank of full professor). No institutions (0%) mentioned a specific amount of funding. Institutions varied in the authorship order they encouraged: senior author (one institution), corresponding author (one), sole author (one), lead or corresponding author (one), senior or corresponding author (one), and first or senior author (one). One institution did not specify a preference but requested that publications as senior author, principal author, co-principal author, or collaborator be noted; the remaining institutions did not specify requirements.
Three of seven nontraditional items (42.9%) were present in at least one level of at least one promotion guideline: citations of research, alternative metrics for sharing research, and adjustments to expectations when professors go on leave (mentioned in 5/15, 1/15, and 10/15 guidelines, respectively). Mentions of data sharing, publishing in open access mediums, registering studies or reviews, and adhering to reporting guidelines were absent from all institutions (see Table 2).
Table 2.
No. | Criteria | Presence of criteria for assistant professor (n = 9), n (%) | Presence of criteria for associate professor (n = 14), n (%) | Presence of criteria for full professor (n = 15), n (%) | Presence of criteria for tenure (n = 7), n (%) | Example of quantitative information if present | Example of relevant quote from university website |
---|---|---|---|---|---|---|---|
Traditional incentives | |||||||
1. | Is any quantitative or qualitative mention made about publications required? If quantitative, please specify the requirement. | 6 (67) | 14 (100) | 15 (100) | 7 (100) | At least eight peer-reviewed publications | “there should be evidence of successful peer-reviewed publication and strong promise of more to come.” |
2. | Is any quantitative or qualitative mention made about the specific authorship order in publications? If so, please specify order (e.g., first, senior, single) required. | 2 (22) | 7 (50) | 7 (47) | 2 (29) | 1–2 papers per year as senior or corresponding author | “he/she may be first or senior author, […] but should have served as the senior author on a substantial number of manuscripts from the study.” |
3. | Is any mention made of journal impact factors? If quantitative, what are the minimum thresholds? | 0 (0) | 0 (0) | 0 (0) | 0 (0) | N/A | N/A |
4. | Is any mention made of grant funding? If quantitative, what are the minimum thresholds (i.e., amount of funding and (or) number of grants as principal investigator)? | 5 (56) | 14 (100) | 14 (93) | 6 (86) | N/A | “An individual seeking promotion on the basis of achievement in research must also have a strong and continuing record of external funding commensurate with the type and area of research.” |
5. | Is any mention made requiring that research is recognized at a national or international level? If so, please specify the requirement. | 2 (22) | 7 (50) | 10 (67) | 3 (43) | N/A | “In general, the candidate must have an international reputation as a leading researcher in the field.” |
Nontraditional incentives | |||||||
6. | Is any mention made of citations? If quantitative, what are the thresholds of minimum requirement? Are specific citation databases mentioned? | 1 (11) | 4 (29) | 5 (33) | 1 (14) | N/A | “consistent or repeated positive citation in the relevant literature for one’s field.” |
7. | Is any mention made of data sharing? If quantitative, what are the minimum thresholds (e.g., percentage of data that is to be made available)? | 0 (0) | 0 (0) | 0 (0) | 0 (0) | N/A | N/A |
8. | Is any mention made of publishing in open access mediums? If quantitative, what are the minimum thresholds (e.g., percentage of studies to be published in open access journals)? | 0 (0) | 0 (0) | 0 (0) | 0 (0) | N/A | N/A |
9. | Is any mention made of registration (including preregistration challenge) of studies? If yes, are there thresholds of minimum requirement (e.g., percentage of studies that are to be registered)? | 0 (0) | 0 (0) | 0 (0) | 0 (0) | N/A | N/A |
10. | Is any mention made of adherence to reporting guidelines for publications? If so, are specific guidelines mentioned? | 0 (0) | 0 (0) | 0 (0) | 0 (0) | N/A | N/A |
11. | Is any mention made of alternative metrics for sharing research (e.g., social media and print media)? If so, are specific metrics mentioned? | 1 (11) | 1 (7) | 1 (7) | 1 (14) | N/A | “In evaluating research productivity, the volume of published work will be judged in accordance with its impact, quality and significance. Applicable metrics will necessarily vary from specialty to specialty: if used, their relevance should be identified and explained in the case file.” |
12. | Is any mention made of accommodations or adjustments to expectations due to employment leave? If so, please specify the description of accommodations (e.g., an extra year to defer tenure consideration) and the type of eligible circumstances (e.g., parental leave, medical leave)? | 6 (67) | 9 (64) | 10 (67) | 4 (57) | N/A | “Time limits can be extended for sick leaves or other relevant leaves.” |
When considering the faculty of medicine guidelines for promotion to full professor, all institutions had at least one traditional criterion in their guidelines (median = 3), with six of 15 institutions (40.0%) including four traditional criteria. Eleven of 15 institutions (73.3%) included at least one nontraditional criterion for promotion to full professor (median = 1), and no faculty had more than two of the seven (28.6%) nontraditional criteria. Results were similar across the other levels of promotion, including the evaluation of assistant professors (see Fig. 1).
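The summary statistics reported above follow directly from such tallies. As a hedged illustration, the sketch below computes a median and a percentage from a hypothetical vector of per-institution counts (not the study data), chosen so the outputs resemble the figures reported for full professors:

```python
# Illustration only: summary statistics of the kind reported above, computed
# from hypothetical per-institution counts of traditional criteria (out of 5).
from statistics import median

trad_counts = [3, 4, 2, 3, 4, 3, 1, 4, 3, 2, 4, 3, 4, 2, 4]  # NOT the study data

print(f"median number of traditional criteria = {median(trad_counts)}")
n_four = sum(c == 4 for c in trad_counts)
print(f"{n_four} of {len(trad_counts)} institutions "
      f"({100 * n_four / len(trad_counts):.1f}%) listed four traditional criteria")
```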
Fig. 1.
Discussion
Among Canada’s leading research-intensive institutions, faculties of medicine were found to assess professors on traditional aspects of research and rarely used nontraditional criteria. The written guidelines reviewed mostly incentivize the quantity of publications and the receipt of funding, and they never consider data sharing, following reporting guidelines, registering studies, or publishing in open access journals. These findings are in line with a recent study that applied the same set of criteria when reviewing 92 guidelines for biomedical sciences faculties from an international sample of institutions (Rice et al. 2020). More traditional items were present in that international sample than in the U15 (five of five, 100.0%), as approximately one-third of international guidelines considered the journal impact factor when evaluating scientists. One additional nontraditional item was also present among the international sample, resulting in four of seven items (57.1%) compared with three of seven items (42.9%) in the U15. In that study, institutions in North America and Australia were found to report more traditional criteria than institutions in Europe, Asia, and South America. The U15 schools are consistent with this pattern: a higher percentage of the Canadian institutions noted requirements for publications (100%) and specific authorship order (47%) than in the international sample, where 95% and 37% of institutions, respectively, required these criteria.
There is substantial evidence that researchers fail to register their studies, including randomized trials (Zarin et al. 2007; Zarin et al. 2017). This is particularly problematic in Canada, where taxpayer dollars fund a substantial amount of biomedical research. Data sharing is gaining considerable momentum as an essential prerequisite to tackling the reproducibility crisis (Munafò et al. 2017). Data sharing is associated with higher citation rates (Piwowar et al. 2007) and is supported by patients (Mello et al. 2018). With respect to data sharing, Canada’s Roadmap for Open Science will require full implementation of the FAIR (findability, accessibility, interoperability, reusability) data principles by 2025 (Office of the Chief Science Advisor of Canada 2020). Our results suggest that U15 faculties of medicine have considerable work to do to meet Canada’s roadmap for open science. While societal value needs to be judged on a case-by-case basis and all-encompassing criteria are difficult to set for all disciplines, several of the nontraditional criteria are also more likely to be aligned with societal value than the traditional assessment criteria.
Traditional criteria vary in their strengths, weaknesses, and appropriateness for assessing the performance of scientists, and the way they are defined and operationalized may also be important in this regard. Some traditional criteria are widely discredited as problematic. Of particular relevance, the journal impact factor has been criticized, for multiple reasons, for its inappropriate use in assessing scientists, since only a small set of papers in a journal accounts for the journal’s impact factor (Seglen 1997; The PLoS Medicine Editors 2006; Callaway 2016; Lariviere et al. 2016). In this regard, it is reassuring that the U15 criteria do not specifically require particular impact factors; however, they often used synonymous concepts such as “high-impact” or “prestigious” journals that closely align with the journal impact factor (McKiernan et al. 2019). In the current system, journal metrics often serve as a promotional tool for publishers. What is needed instead is a range of evidence-based criteria that focus assessments on the scientific content and quality of an article rather than on the publication metrics of the journal in which it was published. Shortcomings of traditional criteria for incentivizing and assessing scientists have driven international efforts to identify more progressive and useful ways to accomplish this goal. The San Francisco Declaration on Research Assessment (DORA), for example, is an initiative that involves funders, publishers, professional societies, institutions, and researchers supportive of the development and dissemination of best practice approaches in the assessment of scientists (Bladek 2014). DORA explicitly recommends that organizations not use the journal impact factor to assess researchers. Nearly 2000 institutions and 16 000 individuals have signed DORA; however, none of the U15 have signed to date, although a few university-affiliated research institutes and other organizations have. Other stakeholders, such as journals and funding institutions, have started to support several nontraditional criteria, many of which relate to the open science movement. These efforts have been encouraged by early-career researchers who are committed to improving the usability of research (McKiernan et al. 2016, 2017; Bell 2017; Rowhani-Farid 2018), and these practices could have a positive impact for Canadian universities.
Previous studies have reviewed promotion and tenure guidelines used in universities across disciplines. Studies have identified an overemphasis placed on research criteria as opposed to teaching and service criteria (Green et al. 2008; Alperin et al. 2018). Our work provides a focused assessment of the research criteria being applied in faculties of medicine and the presence of traditional and nontraditional criteria among Canada’s 15 top-rated faculties of medicine. Although faculties of medicine among the U15 currently rely on many traditional criteria, Canada is well-positioned to be at the forefront of change (Advisory Panel on Federal Support for Fundamental Science and Naylor 2017) and to lead a movement towards shaping, testing, and evaluating the use of evidence-based criteria for assessing scientists.
There are limitations that should be considered when interpreting our results. We extracted information from relevant documents that were publicly available on institution and department websites; it is possible that additional documents we could not access are used when assessing professors for promotion and tenure. Moreover, some criteria may not be explicitly listed in written guidelines but nevertheless permeate the philosophy and everyday practice of an entire institution, or be highly influential in specific microenvironments within institutions. These would be very difficult to capture unless extensive surveys of faculty experiences were performed, and even then the accuracy of such surveys would be uncertain. An additional limitation is that incentivizing can also occur in the form of financial or other bonuses (e.g., based on the number of papers published or impact factor metrics) (Zauner et al. 2018; Quan et al. 2019), and we did not capture those. It is also possible that promotion and tenure committees use their in-person meetings to discuss criteria not formally included in the documents we examined; a qualitative analysis of such meetings was outside the remit of our examination. Finally, positions in medical faculties often involve clinical work, teaching, and supervisory duties, which may be highly important and which we did not assess.
We should also caution that we selected the terms traditional and nontraditional for the examined criteria a priori, and our choices involved unavoidable subjectivity. In addition, the specific way that nontraditional criteria are operationalized can make a difference to their usefulness; many of these criteria could end up being gamed, limiting their ability to meaningfully improve research and health outcomes.
Systemic-level efforts and changes to the status quo for evaluating academics could encourage large-scale improvements in research quality and in the development and dissemination of rigorous evidence-based medicine. We are at a crucial time in the research reform movement, one that is crossing disciplinary and national borders. There is a window of opportunity now to make changes that can drive better research in medicine. Canadian university promotion and tenure committees should consider which assessment criteria would be best in this regard, as well as which are of highest value to society and ethically sustainable.
Acknowledgements
We would like to thank Ms. Becky Skidmore for her input on appropriate key terms to search on institution websites. We would also like to thank Dr. Juan Pablo Alperin for providing us access to tenure and promotion documents from Canadian and American institutions.
References
Advisory Panel on Federal Support for Fundamental Science, and Naylor CD. 2017. Investing in Canada’s future: strengthening the foundations of Canadian research. Canada’s Fundamental Science Review.
Alperin JP, Fischman GE, McKiernan EC, Nieves CM, Niles MT, and Schimanski L. 2018. Meta-Research: how significant are the public dimensions of faculty work in review, promotion and tenure documents? eLife, 8: e42254.
Bell V. 2017. Open science in mental health research. The Lancet Psychiatry, 4(7): 525–526.
Bladek M. 2014. DORA: San Francisco declaration on research assessment (May 2013). College & Research Libraries News, 75(4): 191–196.
Callaway E. 2016. Beat it, impact factor! Publishing elite turns against controversial metric. Nature, 535: 210–211.
Canada’s Fundamental Science Review. 2016. Canada’s Fundamental Science Review—questions and answers. Innovation, Science and Economic Development Canada, Ottawa, Ontario.
Centre for Science and Technology Studies. 2019. CWTS Leiden Ranking 2018. Centre for Science and Technology Studies, Leiden University, the Netherlands [online]: Available from leidenranking.com/ranking/2018/list.
Collier R. 2016. Transparency poor in academic medical research. Canadian Medical Association Journal, 188: E133.
Flier J. 2017. Faculty promotion must assess reproducibility. Nature, 549(7671): 133.
Green RG. 2008. Tenure and promotion decisions: the relative importance of teaching, scholarship, and service. Journal of Social Work Education, 44(2): 117–128.
Group of Canadian Research Universities. 2019. About us [online]: Available from u15.ca/about-us.
Hammarfelt B. 2017. Recognition and reward in the academy: valuing publication oeuvres in biomedicine, economics and history. Aslib Journal of Information Management, 69(5): 607–623.
Ioannidis JPA. 2017. Acknowledging and overcoming nonreproducibility in basic and preclinical research. JAMA, 317(10): 1019–1020.
Lariviere V, Kiermer V, MacCallum CJ, McNutt M, Patterson M, Pulverer B, et al. 2016. A simple proposal for the publication of journal citation distributions. bioRxiv.
McKiernan EC. 2017. Imagining the “open” university: sharing scholarship to improve research and education. PLoS Biology, 15(10): e1002614.
McKiernan EC, Bourne PE, Brown CT, Buck S, Kenall A, Lin J, et al. 2016. How open science helps researchers succeed. eLife, 5: e16800.
McKiernan EC, Schimanski LA, Munoz Nieves C, Matthias L, Niles MT, and Alperin JP. 2019. Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations. eLife, 8: e47338.
Mello MM, Lieou V, and Goodman SN. 2018. Clinical trial participants’ views of the risks and benefits of data sharing. The New England Journal of Medicine, 378(23): 2202–2211.
Moher D, Goodman SN, and Ioannidis J. 2016. Academic criteria for appointment, promotion and rewards in medical research: where’s the evidence? European Journal of Clinical Investigation, 46(5): 383–385.
Moher D, Shamseer L, Cobey KD, Lalu MM, Galipeau J, Avey MT, et al. 2017. Stop this waste of people, animals and money. Nature, 549(7670): 23–25.
Moher D, Naudet F, Cristea IA, Miedema F, Ioannidis JPA, and Goodman SN. 2018. Assessing scientists for hiring, promotion, and tenure. PLoS Biology, 16(3): e2004089.
Moja LP, Telaro E, D’Amico R, Moschetti I, Coe L, and Liberati A. 2005. Assessment of methodological quality of primary studies by systematic reviews: results of the metaquality cross sectional study. BMJ, 330(7499): 1053.
Munafò MR, Nosek BA, Bishop DVM, Button KS, Chambers CD, Du Sert NP, et al. 2017. A manifesto for reproducible science. Nature Human Behaviour, 1(1): 0021.
Office of the Chief Science Advisor of Canada. 2020. Roadmap for open science [online]: Available from science.gc.ca/eic/site/063.nsf/vwapj/Roadmap-for-Open-Science.pdf/$file/Roadmap-for-Open-Science.pdf.
Piwowar HA, Day RS, and Fridsma DB. 2007. Sharing detailed research data is associated with increased citation rate. PLoS ONE, 2(3): e308.
Quan W, Chen B, and Shu F. 2017. Publish or impoverish: an investigation of the monetary reward system of science in China (1999–2016). Aslib Journal of Information Management, 69(5): 486–502.
Quan W, Chen B, and Shu F. 2019. Publish or impoverish: an investigation of the monetary reward system of science in China (1999–2016) [online]: Available from arxiv.org/ftp/arxiv/papers/1707/1707.01162.pdf.
Rice DB, Raffoul H, Ioannidis JP, and Moher D. 2020. Academic criteria for promotion and tenure in biomedical sciences faculties: cross sectional analysis of international sample of universities. BMJ, 369: m2081.
Rowhani-Farid A. 2018. Towards a culture of open science and data sharing in health and medical research. Ph.D. thesis, Queensland University of Technology.
Seglen PO. 1997. Why the impact factor of journals should not be used for evaluating research. BMJ, 314(7079): 498.
The PLoS Medicine Editors. 2006. The impact factor game. It is time to find a better way to assess the scientific literature. PLoS Medicine, 3(6): e291.
Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP, et al. 2007. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Medicine, 4(10): e296.
Wolff C. 2016. UK Survey of Academics 2015: Ithaka S+R | Jisc | RLUK.
Zarin DA, Ide NC, Tse T, Harlan WR, West JC, and Lindberg DB. 2007. Issues in the registration of clinical trials. JAMA, 297(19): 2112–2120.
Zarin DA, Tse T, Williams RJ, and Rajakannan T. 2017. Update on trial registration 11 years after the ICMJE policy was established. The New England Journal of Medicine, 376(4): 383–391.
Zauner H, Nogoy NA, Edmunds SC, Zhou H, and Goodman L. 2018. Editorial: We need to talk about authorship. GigaScience, 7(12): giy122.
Supplementary materials
Supplementary Material 1 (DOCX)
Supplementary Material 2 (DOCX)
Published In
FACETS
Volume 6 • Number 1 • January 2021
Pages: 58–70
Editor: Yann Joly
History
Received: 4 June 2020
Accepted: 28 October 2020
Version of record online: 21 January 2021
Copyright
© 2021 Rice et al. This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Data Availability Statement
Study protocol, assessment criteria, extraction forms, and all data associated with this study can be found on Open Science Framework: osf.io/9cgws/?view_only=b80d2bc7416543639f577c1b8f756e44.
Author Contributions
JPAI and DM conceived and designed the study.
DBR and HR performed the experiments/collected the data.
DBR analyzed and interpreted the data.
DM contributed resources.
All authors drafted or revised the manuscript.
Competing Interests
DBR, HR, and DM are each affiliated with one university included in this review (McGill University, University of Waterloo, and University of Ottawa, respectively). DBR and HR extracted all data and are not affiliated with the health or medical departments that were the focus of the present study. DM is on the editorial board for FACETS.
Funding Information
This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. DBR is supported by a Canadian Institutes of Health Research Vanier Graduate Scholarship. DM is supported by a University Research Chair. JPAI is supported by a grant from the Laura and John Arnold Foundation.
Cite As
Danielle B. Rice, Hana Raffoul, John P.A. Ioannidis, and David Moher. 2021. Academic criteria for promotion and tenure in faculties of medicine: a cross-sectional study of the Canadian U15 universities. FACETS, 6(1): 58–70. https://doi.org/10.1139/facets-2020-0044