
The impact of local health professions education grants: is it worth the investment?

Susan Humphrey-Murto1, Kyle Walker1, Simran Aggarwal2, Nina Preet Kaur Dhillon3, Scott Rauscher4, Timothy J Wood5.   

Abstract

BACKGROUND: Local grants programs are important since funding for medical education research is limited. Understanding which factors predict successful outcomes is highly relevant to administrators. The purpose of this project was to identify factors that contribute to the publication of local medical education grants in a Canadian context.
METHODS: Surveys were distributed to previous Department of Innovation in Medical Education (DIME) and Department of Medicine (DOM) grant recipients (n = 115) to gather information pertaining to PI demographics and research outcomes. A backward logistic regression was used to determine the effects of several variables on publication success.
RESULTS: The overall publication rate was 64/115 (56%). Due to missing data, 91 grants were included in the logistic regression. Variables associated with a higher rate of publication were: cross-departmental compared to single-department funding, OR = 2.82 (p = 0.04); having been presented, OR = 3.30 (p = 0.01); and multiple grant acquisition, OR = 3.85 (p = 0.005).
CONCLUSION: Although preliminary, our data suggest that increasing research publications from local grants may be facilitated by pooling funds across departments, making research presentations mandatory, and allowing successful researchers to re-apply.
© 2021 Humphrey-Murto, Walker, Aggarwal, Preet Kaur Dhillon, Rauscher, Wood; licensee Synergies Partners.

Year:  2021        PMID: 34249190      PMCID: PMC8263034          DOI: 10.36834/cmej.71357

Source DB:  PubMed          Journal:  Can Med Educ J        ISSN: 1923-1202


Background

Medical education is evolving. The last several decades have seen changes in assessment, for example more emphasis on narrative comments and workplace-based assessment,[1] an increased reliance on simulation for teaching and assessment,[2] and now, the paradigm shift to competency-based medical education.[3] While there has been a significant increase in medical education research over the last four decades, several authors have raised concerns regarding the quality of that research.[4] Authors have called for greater methodological rigor, larger multicenter studies, and more meaningful objectives such as patient outcomes.[5] It is imperative that rigorous research inform our educational practices, as it shapes how we train and assess physicians, which ultimately affects patient care. Lack of funding has been suggested to be the greatest barrier to undertaking high-quality research,[6],[7] and an association has been established between the amount of funding obtained for medical education research and the methodological quality of the corresponding studies.[8] However, despite the need to improve the rigor of education research, funding remains limited.[9] Due to this scarcity, many organizations have developed local grants programs, and some outcomes have been documented.
A grant program at the Medical College of Pennsylvania, which funded a total of 13 projects, reported outcomes including immediate student-centered changes to the curriculum, presentations, and publications.[10] The University of California San Francisco has awarded $2.2 million to 103 projects over 11 years and noted significant impact, such as accelerated faculty promotion, expanded networking opportunities, more scholarly publications and presentations, and increased external funding.[11] The Duke Graduate Medical Education Quasi-Endowment fund boasts impressive innovations that have resulted in enduring instructional strategies, but it has been less successful in disseminating findings.[12] Another US study examining a small collaborative research grant program compared funded versus unfunded projects and concluded that funded projects had increased collaboration and a higher output of scholarly products, including papers, posters, and presentations.[13] Other studies have described similar findings.[14],[15] While some positive outcomes of local grants programs have been demonstrated, there is remarkably limited literature on the factors that play a role in their failures and successes, and the literature is primarily from the US, which may limit generalizability to other countries or contexts. Compared to larger grants, local grants programs have scarcer resources. Identifying factors that increase the chance of recipient "success", which arguably equates to publication, can help grant administrators develop a more rewarding program and allocate limited funds. In the general literature, several factors have been identified that predict success in research or grant acquisition.
Supporting factors include training in grant writing,[8] the number of previously published studies,[8] mentorship,[16] research training,[17] and fellowships.[18] Barriers include lack of dedicated time for research, lack of help, poor motivation, no personal goal to publish, lack of confidence, less post-fellowship mentoring, ineffective relationships with co-authors, and manuscript rejection.[18] While there is literature on the factors that predict success in grant acquisition and in research generally, it is not clear which factors allow grant recipients to be more successful. In this era of accountability, it is imperative to demonstrate to payers that local grants are valuable to research dissemination and that their money is well spent. Medical education research is a key component of advances in healthcare; however, its significance is not matched by available funding. Several American studies have shown that financial support led to increased medical education research and dissemination, and supported local curricular innovations and faculty promotion.[10]–[12],[14],[15] However, little is known about which factors lead to success among funded projects. In addition, there are no studies examining local medical education research grants in Canada, although they have been identified in four of eight medical schools.[19] The purpose of this project is to identify factors that contribute to the success of local medical education grants by describing and analyzing the outcomes of local medical education grants in a Canadian context.

Methods

To address funding needs locally, two departments at the University of Ottawa established local grants programs in 2008: the Department of Innovation in Medical Education (DIME), previously named the Academy for Innovation in Medical Education (AIME), and the Department of Medicine (DOM). The programs were offered annually and were established to encourage faculty to pursue research in medical education or to design, implement, and evaluate their educational innovations. The DIME grants are open to faculty members and medical education fellows from multiple departments, while DOM grants are available only to DOM members. The applicant was considered the PI, and junior faculty and fellows were supervised by faculty who were listed as senior authors. As of 2015, only junior faculty (less than five years on faculty) were allowed to apply as PI. Both programs are peer reviewed by internal and external reviewers, and final selection decisions are made by a committee. Both committees include MDs and PhDs from various departments within the university. These local grants programs are the focus of this study. The maximum allocation per grant varied between $20,000-25,000 for DIME grants and was $25,000 for DOM grants.

Using Survey Monkey, we developed a 46-question survey (Appendix A). Survey questions were informed by the literature search outlined in the background section. They explored respondent demographics (e.g., postgraduate training in research or education), number of publications at the time of grant receipt, awarded grant details (dollars awarded), and research production and dissemination (poster or oral presentations completed at local, national, and international meetings). Responses were either yes/no, dropdown menu selections, or free text. A draft survey was piloted by two individuals who had previously been grant recipients and was reviewed by a research assistant. Based on their feedback, amendments were made for clarity.
The individuals who completed the pilot were asked to participate in the study at a later date. Surveys were distributed to all past DIME and DOM grant recipients (principal investigators, PIs) (n = 115) funded between January 1, 2008 and July 1, 2017. Since the median time from abstract presentation to publication has been documented at 10-30 months, a cut-off of 2017 was selected.[20] Recipients were asked to complete one survey for each grant received. A second request was distributed to grant recipients in an attempt to increase the response rate. Because it was anticipated that response rates might be low given the large number of questions, a small financial incentive ($10 gift card from Tim Hortons Coffee Shop) was offered. Recipients who did not complete surveys, or who completed some but not all of their surveys, were asked to share their curriculum vitae. In addition, to ensure the completeness of the data on research dissemination, PubMed was searched by PI name, as were online abstract archives of meetings where available. We searched for peer-reviewed articles and for posters and oral presentations at local, national, and international meetings completed by May 2019. The online abstract archives included national meetings (Canadian Conference on Medical Education, CCME, 2012-2019; Association for Medical Education in Europe, AMEE, 2008-2018; International Conference on Residency Education, ICRE, 2014-2018) and our local medical education day. The last PubMed search was completed in December 2019. A few instances were encountered where survey data did not coincide with meeting abstract archives; these were clarified using the provided CVs. Ethics approval was obtained from the Ottawa Health Science Network Research Ethics Board. Data analysis includes summary statistics.

Statistical analysis

Descriptive statistics were used to describe funding source (DOM, DIME), rank (Assistant, Associate, Full, Other (trainee/scholar)), post-graduate training (yes, no), mentorship (yes, no), number of publications (<5, 5-10, >10), money awarded (<$10,000, $10,000-$20,000, >$20,000), presentations (yes, no), and whether the study was published (yes, no). For these latter two variables, if we could not find a publication or presentation, they were coded as no. Only oral and poster presentations were counted; workshops were not. To explore whether PIs who had received more than one grant in this dataset were more likely to publish, we also added a variable called multiple grants (one, or more than one). A backward logistic regression model was used, as this type of analysis is appropriate when one is exploring data and there is no theoretical reason why one variable might be more important than another. Dummy coding was used for variables with multiple categories (i.e., rank, number of publications, money awarded). Within each variable, the sub-category with the highest count was used as the base reference for dummy coding.
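The dummy-coding scheme described above can be sketched in a few lines of Python (the function name and the toy data are illustrative, not taken from the study):

```python
from collections import Counter

def dummy_code(values):
    """Dummy-code a categorical variable, using the most frequent
    category as the base reference (as described above).
    Returns (reference_category, {category: 0/1 indicator list})."""
    counts = Counter(values)
    reference = counts.most_common(1)[0][0]
    return reference, {
        cat: [1 if v == cat else 0 for v in values]
        for cat in sorted(c for c in counts if c != reference)
    }

# Toy example: academic rank for six hypothetical grants.
ranks = ["Assistant", "Assistant", "Assistant", "Associate", "Full", "Associate"]
ref, cols = dummy_code(ranks)
# "Assistant" is most frequent, so it becomes the base reference;
# "Associate" and "Full" each become a 0/1 indicator column.
```

In the regression, each indicator column then receives its own coefficient, interpreted relative to the base reference category.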

Results

Grants characterization

In total, 115 grants were awarded money between January 1, 2008 and July 1, 2017. For DIME grants, since the call would have been in the Fall of 2017, the year 2017 was not included. During that time, the DIME funding program distributed $1,013,232.80 to 66 (57%) research projects (2008-2016), and $661,459.75 was distributed to 49 (43%) projects through the DOM funding program (2008-2017). PIs received a survey for each funded grant (115 surveys sent); PIs who received more than one grant were sent a separate survey for each project. The response rate for the surveys was 53/115 (46%). Missing information was supplemented through data searches and review of CVs, such that 91 of the 115 (79%) grants could be used for complete analysis. Table 1 presents frequency data for the collected variables on all 115 grants as well as the 91 grants included in the logistic regression analysis. A total of 60 PIs received the 91 grants: 46 PIs received one grant each (46 grants), and 14 PIs received two or more grants, for a total of 45 grants (nine PIs received two grants each, three received five, and two received six).
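The PI and grant tallies in this paragraph can be reconciled with a few lines of arithmetic (all figures are taken from the text above):

```python
# Reconcile the grant counts reported above.
single_pis = 46                     # PIs who received exactly one grant
multi_pis = {2: 9, 5: 3, 6: 2}      # grants per PI -> number of PIs
multi_grant_total = sum(g * n for g, n in multi_pis.items())  # 18 + 15 + 12 = 45
total_pis = single_pis + sum(multi_pis.values())              # 46 + 14 = 60
total_grants = single_pis + multi_grant_total                 # 46 + 45 = 91
```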
Table 1

Characteristics of all the Grants and those included in the logistic regression

Variable | Category | All grants (n = 115): n (%) | Published: n (%) | Regression (n = 91): n (%) | Published: n (%)
Total | | 115 (100) | 64 (56) | 91 (100) | 53 (58)
Funding source | DIME | 66 (57) | 39 (59) | 56 (62) | 36 (64)
Funding source | DOM | 49 (43) | 25 (51) | 35 (38) | 17 (49)
Rank | Assistant | 51 (44) | 33 (65) | 49 (54) | 32 (65)
Rank | Associate | 20 (17) | 12 (60) | 18 (20) | 10 (56)
Rank | Full | 7 (6) | 4 (57) | 7 (8) | 4 (57)
Rank | Other (trainee/scholar) | 20 (17) | 10 (50) | 17 (19) | 7 (41)
Rank | Missing | 17 (15) | 5 (29) | N/A | N/A
Post-grad training | Yes | 48 (42) | 30 (63) | 44 (48) | 29 (66)
Post-grad training | No | 48 (42) | 24 (50) | 47 (52) | 24 (51)
Post-grad training | Missing | 19 (17) | 10 (53) | N/A | N/A
Mentorship | Yes | 28 (24) | 15 (54) | 28 (31) | 15 (54)
Mentorship | No | 14 (12) | 7 (50) | 14 (15) | 7 (50)
Mentorship | Missing | 73 (63) | 42 (58) | 49 (54) | 31 (63)
Number of publications | 0 | 15 (13) | 9 (60) | 14 (15) | 8 (57)
Number of publications | <5 | 18 (16) | 9 (50) | 18 (20) | 9 (50)
Number of publications | 5-10 | 15 (13) | 10 (67) | 15 (16) | 10 (67)
Number of publications | >10 | 18 (16) | 11 (61) | 18 (20) | 11 (61)
Number of publications | Missing | 49 (43) | 25 (51) | 26 (29) | 15 (58)
Money awarded | <$10,000 | 36 (31) | 20 (56) | 31 (34) | 15 (48)
Money awarded | $10,000-$20,000 | 32 (28) | 18 (56) | 22 (24) | 16 (73)
Money awarded | >$20,000 | 47 (41) | 26 (56) | 38 (42) | 22 (58)
Presentations | Yes | 66 (57) | 44 (67) | 52 (57) | 36 (69)
Presentations | No | 49 (43) | 20 (41) | 39 (43) | 17 (44)
Multiple grants | Yes | 57 (50) | 39 (34) | 46 (51) | 39 (43)
Multiple grants | No | 58 (50) | 25 (22) | 45 (49) | 25 (27)
Of the 91 grants, 49 (54%) were received by assistant professors, 18 (20%) by associate professors, 7 (8%) by full professors, and 17 (19%) by trainees and scholars. About equal percentages reported having post-graduate training (48%) versus no post-graduate training (52%), while 28 (31%) participants reported having a mentor versus 14 (15%) who reported having no mentor. In terms of publication experience, 18 (20%) of the 91 surveys reported <5 previous publications, 15 (16%) reported 5-10 publications, and 18 (20%) reported >10 publications. For the 91 grants, 31 (34%) were for <$10,000, 22 (24%) for $10,000-$20,000, and 38 (42%) for >$20,000. In total, 53 of 91 (58.2%) led to publications: 36/56 DIME grants (64%) and 17/35 DOM grants (49%).

Logistic regression analysis

A backward logistic regression was performed to determine the effects of funding source, academic rank, post-graduate training, money awarded, being awarded more than one grant, and presentation on publication success. Mentorship and number of previous publications were not included in the analysis because of the volume of missing data. Of the predictor variables, three were statistically significant: funding source, presentation, and being awarded more than one grant. The logistic regression model was statistically significant, χ2(3) = 15.63, p = .001. The model explained 24% (Nagelkerke R2) of the variance in publication success and correctly classified 70% of cases. A DIME grant had 2.82 times (95% CI 1.06-7.51, p = 0.04) higher odds of being published than a DOM grant. A grant that had been presented had 3.30 times (1.29-8.48, p = 0.01) higher odds of being published than one that had not. Having multiple local grants was associated with a 3.85-times (1.49-9.94, p = 0.005) increase in the odds of being published (see Table 2). For the remaining variables that were not included in the model, p-values associated with their Wald statistics ranged from p = .22 (trainee (other)) to p = .99 (post-grad training).
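Assuming the confidence intervals were computed in the usual Wald fashion, exp(b ± 1.96·SE), the reported odds ratios, intervals, and p-values can be cross-checked with a short script (a sketch for readers, not the authors' code):

```python
import math

def wald_stats(odds_ratio, ci_lower, ci_upper):
    """Recover the log-odds coefficient b, its standard error, and the
    two-sided Wald p-value from a reported odds ratio and 95% CI.
    Assumes the CI was computed as exp(b +/- 1.96*SE)."""
    b = math.log(odds_ratio)
    se = (math.log(ci_upper) - math.log(ci_lower)) / (2 * 1.96)
    z = b / se
    p = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))  # two-sided normal tail
    return b, se, p

# Check the funding-source row of Table 2: OR 2.82, 95% CI 1.06-7.51.
b, se, p = wald_stats(2.82, 1.06, 7.51)
lo, hi = math.exp(b - 1.96 * se), math.exp(b + 1.96 * se)
```

Running this on the funding-source row reproduces the reported bounds (1.06, 7.51) and a p-value that rounds to the reported 0.04, confirming that the table's figures are internally consistent.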
Table 2

Logistic regression predicting publication status

Variable | Odds Ratio | P value | 95% CI lower | 95% CI upper
Funding source (DIME vs DOM) | 2.82 | .04 | 1.06 | 7.51
Presentation (Yes vs No) | 3.30 | .01 | 1.29 | 8.48
Multiple grants (Yes vs No) | 3.85 | .005 | 1.49 | 9.94

Discussion

The aim of this project was to identify factors that contribute to the publication of locally funded medical education grants by studying the outcomes of two local grants programs at a single Canadian university. We collected information on respondent demographics, awarded grant details, and research outcomes. Of the analyzed variables, three significantly predicted publication success: presentations, funding source, and multiple grant acquisition. We found no significant effects of academic rank (Assistant vs Associate vs Full professor), amount of money awarded, trainee status, or post-graduate training in education or research. The odds ratio for presentation was 3.30, indicating that presenting the work orally or as a poster was associated with an increased likelihood of publication. This correlation has been observed before,[20] and there are several reports in the literature attempting to identify characteristics of abstracts presented at national meetings that subsequently lead to publication. Even for the largest North American medical education conferences (the combined Canadian Conference on Medical Education and Research in Medical Education), a combined American/Canadian pediatric education meeting (Council on Medical Student Education in Pediatrics, COMSEP), and more recent data from the Canadian Conference on Medical Education, the publication rates of presented abstracts were only 34.7%, 34%, and 30.5%, respectively.[20],[21],[22] This raises the question: why do some presentations lead to publication while others do not? In one study, publication was found to be more likely for abstracts with a PhD as the last author, for oral compared to poster presentations, and for those outlining completed work (Walsh et al.).[20] In our study, neither the rank of the PI nor having post-graduate training in health professions education or a related field had a significant effect on the likelihood of publication.
This suggests that advanced degrees in medical education may not, in isolation, be a substantial factor in research success.[23] In 2016, Wyatt et al. retrospectively evaluated the productivity of educational research fellowship graduates and identified three key themes that contributed to their fellows' academic success: dedicated time for conducting educational research, opportunities to engage with others, and understanding the difference between educational and clinical research.[24] Several obstacles to publication success have also been identified, with lack of time and lack of mentorship reported as the most important barriers, and lack of skills and training also playing a notable role.[21] Due to missing data, our study could not analyze the impact of mentorship on publication success. Although not included in the analysis as a factor, it is of note that 34/69 PIs had fewer than five publications when they submitted a grant. This makes sense if local grants are targeting newer, more junior researchers to encourage engagement in research. Furthermore, while funding is an important and under-emphasized part of medical education research, our study found no significant association between the amount of funding received and publication success. Notably, however, over two-thirds of the publications (44/64; 69%) had received $10,000-25,000 in grants. We speculate that the difference between less than $10,000 and $25,000 is simply too small to matter; comparing $10,000 to $100,000 might demonstrate a difference. This speaks to one of the limitations of this study: the lack of data for a multitude of variables made it difficult to identify their significance in predicting the likelihood of publication. More research is needed to properly elucidate their impact. It is also interesting that grants awarded to PIs who had secured more than one local grant had higher odds of being published.
Presumably, junior faculty committed to medical education scholarship would have been applying for grants more consistently and dedicating the time required to complete manuscripts. Those with a single grant may have been "dabblers," less committed to publishing. This might suggest that local grant funds should not be denied to previous recipients if publication is the end goal. Another significant finding of this study was that the odds of publication also depended on funding source: a DIME-funded grant had 2.82 times higher odds of being published than a DOM grant. One of the largest differences between DIME and DOM is that DIME grants are open to multiple departments, presumably making for a more competitive process. The selection committee for DIME grants is also more diverse, with more departments, PhDs, and external reviewers. This may make the peer review process more rigorous, and hence more likely to select higher-quality grants that go on to be successful. This hypothesis needs to be tested. At the University of Ottawa, DIME includes a core research unit specialized in providing support for professional development, technological development, and inter-disciplinary collaboration.[25] This targets some of the key barriers to publication success: mentorship, dedicated time for conducting educational research, and opportunities to engage with others.[21],[24] The very same factors have been shown to be key perceived supports for medical education scholars: research support (consultation, data analysis, grant writing), colleague interactions, and ongoing faculty development have been identified as important contributors to the success of medical education researchers.[23],[26] The local research support unit (RSU) is accessible to all faculty members, including DOM members, so it should not account for any differences found.
Analyzing the differences between how the two grants are run could inform other local grants developing their own strategies to target publication success.

Conclusions

There were two main limitations to this study. First, it was conducted at a single Canadian university, and while the results can inform future local grants programs, they cannot be broadly generalized. Second, the lack of data for multiple variables precluded our ability to properly identify their significance in affecting the likelihood of publication.

In summary, we identified three factors associated with publication success: presenting the work orally or as a poster, cross-departmental funding, and having received multiple grants all increased the odds of publishing. Although our findings are preliminary, we would suggest the following if publication is the goal. Departments wishing to offer grants for education research may want to consider collaborating with other departments and pooling funds. The departments could also share in the peer review process, thus maximizing scarce resources. When offering grants, departments could make presentation at a meeting a mandatory deliverable prior to releasing all of the funds; this may be a strong incentive for PIs to present the work. Finally, PIs who were awarded multiple grants over the years were more likely to publish, making an argument for allowing successful PIs to re-apply. Clearly, further research to confirm and expand these results is needed, but we hope this study will inform other centers that currently have, or are considering, a local grants program. Ultimately, supporting rigorous research and thoughtful innovation in medical education has the potential to benefit trainees and, more importantly, our patients.
References (10 of 25 shown)

1.  Factors contributing to success in surgical education research.

Authors:  S Dutta; G L Dunnington
Journal:  Am J Surg       Date:  2000-03       Impact factor: 2.565

2.  The need for evidence in medical education: the development of best evidence medical education as an opportunity to inform, guide, and sustain medical education research.

Authors:  W Dale Dauphinee; Sharon Wood-Dauphinee
Journal:  Acad Med       Date:  2004-10       Impact factor: 6.893

3.  Funding medical education research: opportunities and issues.

Authors:  Jan D Carline
Journal:  Acad Med       Date:  2004-10       Impact factor: 6.893

4.  Education scholarship: it's not just a question of 'degree'.

Authors:  Mark A Goldszmidt; Elaine M Zibrowski; W Wayne Weston
Journal:  Med Teach       Date:  2008-02       Impact factor: 3.650

5.  Factors related to publication success among faculty development fellowship graduates.

Authors:  Mindy A Smith; Henry C Barry; John Williamson; Carole W Keefe; William A Anderson
Journal:  Fam Med       Date:  2009-02       Impact factor: 1.756

6.  Without proper research funding, how can medical education be evidence based?

Authors:  Julian Archer; Chris McManus; Katherine Woolf; Lynn Monrouxe; Jan Illing; Alison Bullock; Trudie Roberts
Journal:  BMJ       Date:  2015-06-26

7.  The impact of intramural grants on educators' careers and on medical education innovation.

Authors:  Shelley R Adler; Anna Chang; Helen Loeser; Molly Cooke; Jason Wang; Arianne Teherani
Journal:  Acad Med       Date:  2015-06       Impact factor: 6.893

8.  Further dissemination of medical education projects after presentation at a pediatric national meeting (1998-2008).

Authors:  Sherilyn Smith; Terry Kind; Gary Beck; Jocelyn Schiller; Heather McLauchlan; Mitchell Harris; Joseph Gigante
Journal:  Teach Learn Med       Date:  2014       Impact factor: 2.414

9.  Reading between the lines: faculty interpretations of narrative evaluation comments.

Authors:  Shiphra Ginsburg; Glenn Regehr; Lorelei Lingard; Kevin W Eva
Journal:  Med Educ       Date:  2015-03       Impact factor: 6.251

10.  Will I publish this abstract? Determining the characteristics of medical education oral abstracts linked to publication.

Authors:  Jean-Michel Guay; Timothy J Wood; Claire Touchie; Chi Anh Ta; Samantha Halman
Journal:  Can Med Educ J       Date:  2020-12-07
