Carrie Lee1, Theresa L Werner2, Allison M Deal1, Cassandra J Krise-Confair3, Tricia Adrales Bentz4, Theresa M Cummings5, Stefan C Grant6, Ashley Baker Lee7, Jessica Moehle2, Kristie Moffett8, Helen Peck9, Stephen Williamson10, Aleksandar Zafirovski11, Kate Shaw12, Janie K Hofacker12. 1. UNC Lineberger Comprehensive Cancer Center, University of North Carolina at Chapel Hill, Chapel Hill, NC. 2. Huntsman Cancer Institute, University of Utah, Salt Lake City, UT. 3. Starzl Network for Excellence in Pediatric Transplantation, Pittsburgh, PA. 4. Hollings Cancer Center, Medical University of South Carolina, Charleston, SC. 5. University of Maryland Marlene and Stewart Greenebaum Comprehensive Cancer Center, Baltimore, MD. 6. Wake Forest Baptist Comprehensive Cancer Center, Winston-Salem, NC. 7. City of Hope Comprehensive Cancer Center, Duarte, CA. 8. Moffitt Cancer Center, Tampa, FL. 9. Wilmot Cancer Institute, University of Rochester Medical Center, Rochester, NY. 10. University of Kansas Cancer Center, Kansas City, KS. 11. Robert H. Lurie Comprehensive Cancer Center of Northwestern University, Chicago, IL. 12. Association of American Cancer Institutes, Pittsburgh, PA.
Abstract
PURPOSE: Cancer clinical trials offices (CTOs) support the investigation of cancer prevention, early detection, and treatment at cancer centers across North America. CTOs are a centralized resource for clinical trial conduct and typically use research staff with expertise in four functional areas of clinical research: finance, regulatory, clinical, and data operations. To our knowledge, there are no publicly available benchmark data sets that characterize the size, cost, volume, and efficiency of these offices, nor whether the metrics differ by National Cancer Institute (NCI) designation. The Association of American Cancer Institutes (AACI) Clinical Research Innovation (CRI) steering committee developed a survey to address this knowledge gap. METHODS: An 11-question survey that addressed CTO budget, accrual and trial volume, full-time equivalents (FTEs), staff turnover, and activation timelines was developed by the AACI CRI steering committee and sent to 92 academic cancer research centers in North America (n = 90 in the United States; n = 2 in Canada), with 79 respondents completing the survey (86% completion rate). RESULTS: The number of FTE employees working in the CTOs ranged from 4.5 to 811 (median, 104). The median number of analytic cases (ie, newly diagnosed or received first course of treatment) reported by the main center was 3,856. Annual CTO budgets ranged from $250,000 to $23,900,000 (median, $8.2 million). The median trial activation time, based on 61 centers, was 167 days. The median number of accruals per center was 480 (range, 5-6,271) and median number of trials per center was 282 (range, 31-1,833). Budget and FTE ranges varied by NCI designation. CONCLUSION: The response rate to the survey was high. These data will allow cancer centers to evaluate their CTO infrastructure, funding, portfolio, and/or accrual goals as compared with peers. 
A wide range in each of the outcomes was noted, in keeping with the wide variation in size and scope of cancer center CTOs across the United States and Canada. These variations may warrant additional investigation.
Cancer clinical trials play a crucial role in prevention, early detection, cancer treatment, and, ultimately, cancer cures. The vast majority of academic cancer centers that support a large number of clinical trials include a clinical trials office (CTO) in their organizational structure. CTOs are centralized offices that support the various pillars of clinical trial conduct (eg, finance, regulatory, clinical, and data management). A CTO is not required to open clinical trials, but centralized infrastructure allows investigators to focus on novel science rather than personnel management and ensures that gaps in research staffing are covered by a shared resource. National Cancer Institute (NCI) designation requires that centers establish a clinical protocol and data management system that provides centralized management and oversight of functions for coordinating, facilitating, and reporting on the cancer clinical trials of the institution.

It is widely acknowledged that the execution of clinical trials is fraught with challenges, including administrative burdens, staffing barriers, regulatory constraints, rising costs, and low patient accrual. An NCI-ASCO Trial Accrual Symposium concluded that sites need to benchmark and monitor their accrual performance against similar sites to realistically plan for staffing, workload, and the number and complexity of trials.[1,2] A National Comprehensive Cancer Network (NCCN) Clinical Research Benchmarking Survey has been conducted six times over the past 12 years in an effort to develop best practices for conducting the most effective and efficient clinical trials for patients with cancer. However, the survey results are available only to NCCN members.
There are no large, publicly available benchmark data sets that characterize the volume of work involved in cancer clinical trials, the costs and funding sources to support this work, the time to achieve clinical trial activation, or the workforce characteristics necessary to carry out clinical trial activities.

The Association of American Cancer Institutes (AACI) comprises 102 of the leading academic and freestanding cancer research centers in North America. AACI advances the objectives of cancer centers by facilitating interaction among the centers, educating policymakers,[3] and fostering partnerships between cancer centers and other cancer organizations to improve cancer care. AACI Clinical Research Innovation (CRI) was established as an AACI initiative in 2009 to address the shared administrative challenges in clinical trial conduct. CRI is guided by a member-elected steering committee. Steering committee members are cancer center CTO medical directors and administrators who represent the various pillars of clinical trial conduct.

The objective of AACI and the CRI steering committee was to develop and widely disseminate a benchmarking survey to allow centers to compare their performance and to use the data to promote efficient clinical research operations. These practical data can help CTOs better understand how they compare with peers and whether they are “right-sized” and appropriately funded to meet their cancer center clinical trial goals.
METHODS
Participants
At the time of the survey, AACI consisted of 98 academic cancer center members. The survey was distributed to 90 centers in the United States and two in Canada that provide clinical care and have a CTO. Of the six centers that did not receive the survey, five are basic science research centers and do not provide clinical care or have a CTO, and one center was newly established and not treating patients at the time of the survey. For consistency of data collection, the cancer centers were asked to report interventional treatment clinical trial activity for a 12-month period after 2016 and to use the same period for all survey questions. Centers could determine a consistent 12-month reporting period to allow for flexibility of reporting according to their institutional standards (ie, calendar or fiscal year). On May 1, 2018, the survey was sent to the cancer center director, administrative director, and CTO administrative director of these 92 AACI cancer centers. Nonresponders were sent three reminder e-mails. There was no incentive for participation. The survey closed on January 15, 2019.
Design
The 11-question survey was crafted using Qualtrics assessment software (Salt Lake City, UT). Questions were designed by the AACI CRI steering committee; institutional review board (IRB) approval was not deemed necessary given the survey objectives. The survey addressed cancer center demographics, CTO sources of financial support, interventional treatment trial volume, accrual by trial sponsor type, and staff turnover. To standardize survey answers, a dictionary of terms was included (Appendices A and B). The term “matrix” refers to a cancer center that is intertwined with and dependent on a university structure. Freestanding cancer centers are entities unto themselves and not part of a larger organization.

Interventional treatment trials were defined using the following NCI definition: trials designed to evaluate one or more interventions for treating a disease, syndrome, or condition. A standardized definition was used to enhance the validity of responses; the NCI definition was chosen given site familiarity with this definition for reporting to ClinicalTrials.gov and for NCI funding opportunities.
All centers reported budget data in US dollars; the two Canadian centers converted budget data to US dollars using the applicable exchange rate.

Categories and definitions of funding sources were as follows: national: NCI National Clinical Trials Network or other NCI/National Institutes of Health (NIH)-supported national trial networks; industry: pharmaceutical company–controlled trial design and implementation; externally peer-reviewed: supported by the NIH or by organizations with a peer-review funding system (eg, R01, SPORE, U01, U10, P01, CTEP); institutional: in-house clinical research study conceptualized, designed, and implemented by cancer center investigators, with scientific peer review provided solely by the protocol review and monitoring system of the cancer center; industry or other entities may provide support (eg, drug, device, other funding), but the trial should be the intellectual product of the center investigator.

In addition, information was sought on the number of analytic cases (defined as newly diagnosed or receiving the first course of treatment at the center, as reported by the tumor registry) reported by the main cancer center and by sites outside the main center.

Trial activation data were measured in calendar days and collected as time intervals between activation milestones, including receipt of protocol to approval by the scientific review committee (SRC)/protocol review committee (PRC); SRC/PRC approval to IRB approval; IRB approval to study activation date (when consent to enroll participants is released); contract draft receipt to execution; SRC/PRC approval to first patient accrued; and initial budget review to budget approval by sponsor.
To best estimate the overall activation time, we summed (1) the time from receipt of protocol to SRC/PRC approval, (2) the time from SRC/PRC approval to IRB approval, and (3) the time from IRB approval to study activation, with the caveat that this may overestimate the total because some centers submit to the PRC and IRB simultaneously. The number of full-time equivalent (FTE) positions supported by the CTO budget included approved vacancies.

Descriptive statistics, including medians and ranges, are provided for survey responses. Differences between centers, based on NCI designation, were compared using the Fisher exact test. For each center, the percentage of its total annual budget that came from each source was calculated; the median and range of this percentage across all centers are reported. The accrual-to-FTE ratio was calculated as median accruals/FTEs and rounded to the nearest whole number. The accrual-to-trial ratio was calculated as median accruals/trials (all sponsors). Institutional funds included school of medicine, central university, health system, investigator-initiated trial support, philanthropic, and state-appropriated funds. The correlation between time to activation and number of accruals was estimated using the Spearman correlation coefficient. Analyses were performed using SAS statistical software, version 9.4 (SAS Institute, Cary, NC).
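The per-center activation estimate and the Spearman correlation described above can be sketched as follows. This is an illustrative reconstruction in Python with made-up center records and interval values, not the survey data, and the actual analyses were performed in SAS.

```python
from statistics import median

# Hypothetical per-center activation intervals in calendar days; field
# names and values are illustrative assumptions, not survey data.
centers = [
    {"protocol_to_src": 30, "src_to_irb": 50, "irb_to_activation": 60, "accruals": 500},
    {"protocol_to_src": 45, "src_to_irb": 70, "irb_to_activation": 40, "accruals": 300},
    {"protocol_to_src": 20, "src_to_irb": 40, "irb_to_activation": 90, "accruals": 450},
]

def activation_time(c):
    # Overall activation estimate: sum of the three sequential intervals.
    # This may overestimate for centers that submit to PRC and IRB in parallel.
    return c["protocol_to_src"] + c["src_to_irb"] + c["irb_to_activation"]

times = [activation_time(c) for c in centers]
summary = {"median": median(times), "range": (min(times), max(times))}

def ranks(xs):
    # Average-rank transform (handles ties) for the Spearman correlation.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    # Spearman coefficient = Pearson correlation of the rank-transformed data.
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

rho = spearman(times, [c["accruals"] for c in centers])
```

With these toy values, longer activation times coincide with fewer accruals, so `rho` is negative, mirroring the direction (though not the magnitude) of the −0.21 reported in the RESULTS.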
RESULTS
Of the 92 eligible AACI centers, 79 (86%) completed the survey. There were no duplicate responders. All centers reported on a 12-month period between January 2016 and July 2017. Survey participant demographics and geographic distribution are shown in Table 1.
TABLE 1.
Demographics of Cancer Center Benchmarking Survey Participants
Characteristics of Survey Participants
Approximately three-quarters of responding centers (77%) were NCI designated (Appendix C). Of the NCI-designated responding centers, 85% were matrix type. There were no significant differences in NCI designation by geographic location of responding centers (P = .72).
Analytic Cases
The median number of analytic cases reported by the main center was 3,856 (range, 635-22,255).
Budget Range and Budget Sources
Annual CTO budgets ranged from $250,000 to $23,900,000; the median was $8.2 million. The number of centers in each budget category is shown in Table 1. Fifty-eight of the 75 responding centers had an annual CTO budget of < $12 million. The median numbers of trials and accruals for centers within each budget range are shown in Table 2. Twenty-one cancer centers had a CTO budget of $8-$12 million; the median number of trials supported within this budget range was 245, and the median number of accruals to these trials was 498. Seventy-six percent of non-NCI-designated centers had budgets of < $4 million, whereas NCI-designated centers operated across all budget ranges (P = .0002). For each center, the percentage of the total annual CTO budget from each source was calculated, and the median of this percentage across all centers is described (Table 3). The largest share of the total annual CTO budget came from industry-sponsored clinical trials (45%; range, 0%-96%), followed by institutional (40%; range, 0%-100%), national cooperative group (4%; range, 0%-31%), and external (3%; range, 0%-36%) sources. The NCI Cancer Center Support Grant accounted for 2% (range, 0%-24%) of the annual CTO budget. Seventy-six percent of centers received ≥ 23% of their funding from institutional sources, and 75% of all centers received < 7% of their annual budget from national cooperative group sources.
TABLE 2.
Median Number of Trials and Accruals for Centers Within Each Budget Range
TABLE 3.
Total Annual Budget Sources
Accrual and Trial Volume
The centers reported 55,573 accruals to 27,493 trials over a 12-month period. The median number of accruals per center was 480 (range, 5-6,271), and the median number of trials per center was 282 (range, 31-1,833). The median number of accruals per trial (all sponsors) was 1.5 (range, 0.2-26.6). The estimated ratio of median accruals to FTEs was five.

The median percentages of trials, accruals, and annual CTO budget from each sponsor type were as follows: industry: 43% trials, 39% accruals, 45% budget; institutional: 15% trials, 33% accruals, 0% budget; national cooperative group: 33% trials, 19% accruals, 4% budget; and external: 4% trials, 5% accruals, 3% budget.

Clinical trial activity primarily took place at the main cancer center rather than at network (ie, all other) sites. Respondents reported a median of 282 (range, 31-1,833) interventional treatment trials open at the main cancer center and a median of 22 (range, 0-710) trials open at a network site. Of the trials open at a network site, most were sponsored by national cooperative groups.
CTO FTEs
The number of FTE workers housed within the CTOs ranged from 4.5 to 811 (median, 104). Of non-NCI-designated centers, 71% had < 50 FTEs compared with 10% of NCI-designated centers (P < .0001; Table 1). The median numbers of CTO FTEs at NCI-designated and non-NCI-designated centers were 114 and 33, respectively. Twenty-four cancer center CTOs reported 100-149 FTEs; 20 of these centers were NCI designated, three were not designated, and one was a Canadian center. Eighteen centers had ≥ 150 FTEs; all but one (a Canadian center) were NCI designated. FTEs by budget range are shown in Table 2. The median number of FTE workers brought on in a 12-month period was 22 (range, 0-216). The median number of vacancies was 6.5 (range, 0-89).
Activation Timelines
The median time from receipt of the protocol to SRC/PRC approval was 36 days (range, 7-140 days). The median time from SRC/PRC approval to IRB approval was 58 days (range, 0-195 days). The median time from full IRB approval to study activation (when participants may be consented) was 55 days (range, 1-270 days). The median time from receipt of draft contract to execution was 94 days (range, 14-283 days). The median time from SRC/PRC approval to first patient accrued was 167 days (range, 14-327 days).

The median activation time, based on 61 centers, was 167 days (range, 53-322 days). There was no significant difference in activation times by NCI designation (median, NCI-designated v non-NCI-designated: 166 v 167 days; P = .64) or budget range (P = .9). Longer activation time was associated with fewer accruals (Spearman correlation coefficient, −0.21).
Impact of NCI Designation
Of the 22 centers with a CTO budget of < $4 million (n = 13 NCI-designated centers; n = 9 non-NCI-designated centers), the median number of trials at NCI-designated centers was 358 (range, 95-491) versus 137 (range, 31-269) at non-NCI centers (P = .01). The median number of accruals was 289 (range, 138-925) for NCI-designated versus 96 (range, 5-2,100) for non-NCI centers (P = .02). The median number of FTEs was 60 (range, 28-125) for NCI-designated versus 25 (range, 5-535) for non-NCI centers (P = .01). There was no significant difference by NCI designation in the number of new FTE workers hired (NCI v non-NCI center: 13 v 8 FTE workers; P = .24).
DISCUSSION
Cancer centers are hungry for data to help them better understand the business of clinical trials and how they compare with peer institutions. The landscape of clinical trials has changed dramatically over the past decade.[4,5] In years past, large clinical trials with broad eligibility criteria and minimal fresh-tissue requirements were the norm. Today, clinical trials are increasingly multifaceted in design, with complex tissue sampling, molecular, and processing requirements. These trials, particularly those involving biologics, are subject to higher levels of regulatory monitoring because of increased risk and complexity. CTO leaders strive to maintain past levels of productivity (measured by trials and accruals) in this highly specialized environment, a task further complicated by the expectation of rapid trial activation, high clinical research staff turnover driven by industry competition, and the intent to reach more patients in rural and underserved areas. These elements are essential to cancer care delivery but are costly and push the boundaries of adequate clinical trial oversight.

The median CTO budget was $8.2 million, but there was wide variation (range, $250,000-$23,900,000). The majority of CTOs had budgets of < $12 million. Industry-funded trials accounted for similar percentages of all trials, accruals, and budget sources (43% trials, 39% accruals, 45% budget). Institution-sponsored trials accounted for 15% of trials and 33% of accruals, in keeping with the growing emphasis on accrual to investigator-initiated, home-grown science as comprehensive cancer centers share in this mission. In contrast, cooperative group and externally funded trials showed a tremendous gap between their share of trials and accruals and the budget support they provide; these trials have long been woefully underfunded.
Cancer centers must offset the cost of running these trials with other industry-funded trials and budget sources. Estimating from the benchmark data, the median accrual-to-FTE ratio was five, and the median cost per accrual was $17,363. These metrics have plagued CTO administrative directors because organizational leaders often equate them with operational efficiency. That notion is oversimplified, however, because the ratio includes all CTO staff required to support an accrual, not just the enrolling coordinator. The benchmark data support the expansion of CTOs and the commensurate increase in financial support needed to manage the work; it truly takes a village. Health care systems have been reluctant to support clinical research,[6,7] given the cost and the difficult-to-quantify return on investment, but clinical trials are essential to fulfilling an institution’s academic mission and attracting patients to centers for cutting-edge care, which, in turn, supports the health care system.

The accrual-to-trial ratio was 1.5. This number highlights the challenge of using resources efficiently. Much time and effort are required to run a clinical trial, and opening trials with little accrual potential places financial strain on a center and has negative implications for sponsors and patients. The cancer community can address this pitfall in many ways, for example, by designing trials to be more inclusive while maintaining patient safety. Eligibility criteria can be broadened without compromising safety or efficacy, specifically for patients with treated brain metastases, well-controlled HIV, and prior or concurrent malignancies.[2] Selecting trials that have high accrual potential and reflect the populations centers serve is also challenging; the use of technology, including electronic health records, search engines, and artificial intelligence, holds promise here.
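The headline ratios can be approximated directly from the medians reported in the RESULTS. The quick Python check below uses ratios of medians, which is only a back-of-envelope proxy: the published figures (eg, the $17,363 cost per accrual and the accrual-to-trial ratio of 1.5) reflect center-level distributions rather than these crude quotients.

```python
# Survey medians as reported in the RESULTS section.
median_budget = 8_200_000    # annual CTO budget, USD
median_accruals = 480        # accruals per center
median_ftes = 104            # CTO FTEs per center
median_trials = 282          # open interventional treatment trials per center

# Accrual-to-FTE ratio, rounded to the nearest whole number as in METHODS.
accrual_per_fte = round(median_accruals / median_ftes)

# Cost per accrual from a ratio of medians; this only approximates the
# published $17,363, which is derived from center-level data.
cost_per_accrual = median_budget / median_accruals

# Accrual-to-trial ratio from medians; likewise an approximation of the
# published 1.5.
accrual_per_trial = median_accruals / median_trials
```

Even this rough arithmetic reproduces the reported accrual-to-FTE ratio of five and lands within a few percent of the published cost per accrual, which suggests the headline metrics are robust to how the aggregation is done.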
Careful trial selection and performance monitoring are necessary given the high cost of maintaining low-accruing trials. The median activation time, based on 61 centers, was 167 days. Longer activation time was associated with lower accrual, supporting the finding that prolonged activation undermines accrual potential.[3] Reasons for slow activation include budget and contract negotiation, regulatory start-up processes, PRC review, and IRB review.[1] True critical-path analyses are important for mapping the activation timeline, identifying rate-limiting steps, and determining the highest-yield intervention points.

Staff turnover is an ongoing challenge for CTOs. The median number of FTE workers brought on over a 12-month period was 22, and the median number of vacancies was 6.5. Unfortunately, turnover begets turnover, as the loss of experienced staff translates into extra work and job dissatisfaction for those remaining. Reasons for turnover include competition from CROs and industry, lack of career growth opportunities, departure for professional school, heavy workload, poor fit, and normal life events. Strategies to mitigate turnover include comprehensive orientation for new staff and continuing education; competitive salary; career development; flexible and remote work options, including job sharing; leadership and management training; support for wellness; role clarity; work-life balance; and a positive office culture.

Among the smaller centers (CTO budget < $4 million), NCI designation was associated with larger numbers of trials, accruals, and FTEs. This finding is hypothesis generating; it cannot be concluded that the relationship is causal.
AACI provides support to centers considering and pursuing NCI designation; a deeper understanding of this relationship and of the value of NCI designation warrants more investigation.

Transparent sharing of these benchmark data is essential for helping centers determine whether their offices are “right-sized” for their accrual goals and for justifying the cost of oncology clinical trials. The data may also assist smaller centers in understanding the associations between clinical trial metrics and NCI designation. They shine a light on common problems and may be used as a baseline for cancer centers to collectively develop solutions, such as how to systematically address the gap between trial selection and trial accrual (ie, improve the accrual-to-trial ratio) and how to more collaboratively address the problems of slow activation and underfunded trials, to ultimately benefit patients with cancer.[8,9]
REFERENCES
Vose JM, Levit LA, Hurley P, et al: J Clin Oncol, 2016
Bennett CL, Stinson TJ, Vogel V, et al: J Clin Oncol, 2000
Dilts DM, Cheng SK, Crites JS, et al: Clin Cancer Res, 2010
Cosgrove DM, Fisher M, Gabow P, et al: Health Aff (Millwood), 2013
Alfano CM, Mayer DK, Beckjord E, et al: JCO Clin Cancer Inform, 2020
Kim ES, Bruinooge SS, Roberts S, et al: J Clin Oncol, 2017
Denicoff AM, McCaskill-Stevens W, Grubbs SS, et al: J Oncol Pract, 2013
Mascaro JS, Palmer PK, Ash MJ, et al: Int J Environ Res Public Health, 2021