Cancer diagnostic assessment programs: standards for the organization of care in Ontario.

M Brouwers1, T K Oliver, J Crawford, P Ellison, W K Evans, A Gagliardi, J Lacourciere, D Lo, V Mai, S McNair, T Minuk, L Rabeneck, C Rand, J Ross, J Smylie, J Srigley, H Stern, M Trudeau.   

Abstract

BACKGROUND: Improving access to better, more efficient, and rapid cancer diagnosis is a necessary component of a high-quality cancer system. How diagnostic services ought to be organized, structured, and evaluated is less understood and studied. Our objective was to address this gap.
METHODS: As a quality initiative of Cancer Care Ontario's Program in Evidence-Based Care, the Diagnostic Assessment Standards Panel, with representation from clinical oncology experts, institutional and clinical administrative leaders, health service researchers, and methodologists, conducted a systematic review and a targeted environmental scan of the unpublished literature. Standards were developed based on expert consensus opinion informed by the identified evidence. Through external review, clinicians and administrators across Ontario were given the opportunity to provide feedback.
RESULTS: The body of evidence consists of thirty-five published studies and fifteen unpublished guidance documents. The evidence and consensus opinion consistently favoured an organized, centralized system with multidisciplinary team membership as the optimal approach for the delivery of diagnostic cancer assessment services. Independent external stakeholders agreed (with higher mean values, maximum 5, indicating stronger agreement) that diagnostic assessment program (DAP) standards are needed (mean: 4.6), that standards should be formally approved (mean: 4.3), and importantly, that standards reflect an effective approach that will lead to quality improvements in the cancer system (mean: 4.5) and in patient care (mean: 4.3).
INTERPRETATION: Based on the best available evidence, standards for the organization of DAPs are offered. There is a clear need to integrate formal and comprehensive evaluation strategies with the implementation of the standards to advance this field.

Keywords:  Diagnostic assessment; cancer; organizational; standards; systematic review

Year:  2009        PMID: 20016744      PMCID: PMC2794680          DOI: 10.3747/co.v16i6.400

Source DB:  PubMed          Journal:  Curr Oncol        ISSN: 1198-0052            Impact factor:   3.677


1. INTRODUCTION

The provision of efficient and rapid cancer diagnosis is a necessary component of a high-quality cancer system, but how diagnostic services ought to be organized, structured, and evaluated is less understood and studied. The inefficient and inappropriate use of diagnostic imaging procedures (test duplication, inappropriate tests ordered) can have substantial resource implications and can delay patient treatment, a serious health care concern. One prospective Canadian study found that median wait times to diagnosis were 37 days, 71 days, and 81 days for patients with lung, colorectal, and prostate cancer respectively 1. In relation to lung cancer, Liberman et al. 2 reported mean and median wait times of 208 and 109 days respectively between initial contact with a physician or first onset of symptoms and diagnostic surgery. Similarly, data from seven Canadian provinces measuring the time from an abnormal breast screen to diagnosis showed a median time to diagnosis of 3.7 weeks; 10% of women waited 9.6 weeks or longer for a diagnosis 3. Diagnostic assessment programs (DAPs) are one component of an overall rapid-access strategy for diagnosis. DAPs may be either actual or virtual entities characterized by facilitated access to comprehensive diagnostic services, multidisciplinary consultative expertise, patient information resources, and psychosocial supports. Programs of this type have been associated with high patient satisfaction 4–7, a reduction in time from diagnosis to the initiation of treatment for various disease sites 5,8, and potentially, improvements in clinical outcomes 9. However, less clear are the organizational and practice-setting features that define a high-quality DAP, the role of a DAP in a comprehensive rapid-access strategy, the defining features of a DAP that lend themselves to unique geographic or jurisdictional situations, and the indicators that should be used to measure quality and impact.
In the province of Ontario, the population of approximately 12 million people is spread over more than 1 million square kilometres, and the distribution of new cancer cases varies considerably across the various regions serving that population 10. Population size and geographic spread are important considerations in strategizing about quality improvement actions meant to increase access and to reduce wait times to diagnosis. At the same time, it must be acknowledged that solutions for one region may or may not be generalizable to another. In Ontario, approximately 65,000 new cases of cancer per year are predicted 11, with most patients presenting with lung, breast, colorectal, or prostate cancer 11. These patients will require a high standard of care, starting with their entry into the cancer system. As opposed to current opportunistic systems, an organized entry into the cancer system and diagnostic processes has the potential to reduce duplication of tests, to improve efficiency, to reduce costs and waiting times, to enhance the overall quality of care for patients throughout the cancer system, and conceivably, to improve the outcome of treatment. The objectives of the Ontario standards for the organization of care for cancer DAPs are to provide advice to administrators, planners, and government on the optimal strategic planning and investment options required to provide the highest standard of care for patients with cancer. The Diagnostic Assessment Standards Panel was convened to work with the Program in Evidence-Based Care (PEBC) to develop recommendations that could guide the design, implementation, and evaluation of DAPs in Ontario.

2. METHODS

The Diagnostic Assessment Standards Panel, composed of clinical oncology experts, regional vice presidents, clinical administrative leaders, health service researchers, and methodologists (Table I), conducted a systematic review and environmental scan of the literature to help inform the development of provincial standards. The standards were then validated through an external review by relevant practitioners and administrators throughout the province of Ontario.
TABLE I

Membership of the Diagnostic Assessment Programs Standards Panel

Melissa Brouwers, MD (facilitator): Director, PEBC, CCO, and Associate Professor (PT), Department of Clinical Epidemiology and Biostatistics, McMaster University, Hamilton, ON
Terry Minuk, MD (diagnostic radiology specialist): Hamilton, ON
Joanne Crawford, MSc (nursing): McMaster University, Hamilton, ON
Tom Oliver: Research Coordinator, PEBC, CCO, McMaster University, Hamilton, ON
Phil Ellison, MD (family medicine): Liaison from the Ontario College of Family Physicians to CCO, Toronto, ON
Linda Rabeneck, MD (gastroenterologist): Vice President, Regional Cancer Services, CCO, Toronto, ON
William K. Evans, MD (medical oncologist, lung cancer specialist): Co-Chair, Lung Disease Site Group, PEBC, CCO, and Vice-President, Regional Cancer Services, CCO, Hamilton, ON
Carol Rand: Regional Director, Systemic, Supportive and Palliative Care, Juravinski Cancer Centre, Hamilton, ON
Anna Gagliardi, PhD: Scientist, Sunnybrook Research Institute, and Assistant Professor, Departments of Surgery and of Health Policy, Management and Evaluation, Faculty of Medicine, University of Toronto, Toronto, ON
Jill Ross: Director, Clinical Programs, CCO, Toronto, ON
Joanne Lacourciere: Manager, Northwest Regional Cancer Program, Thunder Bay Regional Health Sciences Centre, Thunder Bay, ON
Jennifer Smylie: Clinical Manager, Regional Assessment Centre for Lung, Colorectal and Prostate Cancers, The Ottawa Hospital Regional Cancer Centre, Ottawa, ON
Dorothy Lo, MD (medical oncologist): Medical oncology resident, University of Toronto, and Master of Health Sciences student, University of Toronto, Toronto, ON
John Srigley, MD (pathologist): Provincial Head, Laboratory Medicine/Pathology, CCO, Kingston, ON
Verna Mai, MD: Director, Screening Program, CCO, Toronto, ON
Hartley Stern, MD (surgeon): Provincial Head, Surgical Oncology, and Vice-President, Regional Cancer Services, CCO, Ottawa, ON
Sheila McNair, PhD: Assistant Director, PEBC, CCO, McMaster University, Hamilton, ON
Maureen Trudeau, MD (medical oncologist, breast cancer specialist): Co-Chair, Breast Disease Site Group, PEBC, and Provincial Head, Systemic Therapy Program, CCO, Toronto, ON

PEBC = Program in Evidence-Based Care; CCO = Cancer Care Ontario.

2.1 Search Strategy

A systematic review published by Gagliardi et al. in 2004 12 served as the evidence foundation for the current standards for practice. In that systematic review, the authors identified twenty relevant studies, published up to 2002, that evaluated both the clinical and the economic components of DAPs for suspected cases of breast, colorectal, lung, head-and-neck, prostate, and other cancers. The search of the literature was conducted using MeSH and keyword terms “ambulatory care facilities/ OR community health centers/ OR outpatient clinics, OR hospital/Ambulatory Care/ OR cancer care facilities/ OR (keywords: rapid or same day or one stop or multidisciplinary AND clinic AND diagnosis) AND Breast Neoplasms/di OR Prostatic neoplasms/di OR Lung neoplasms/di OR Exp colorectal neoplasms/di or Exp head and neck neoplasms/di OR prostatic neoplasms OR breast neoplasms OR lung neoplasms OR exp colorectal neoplasms OR Exp head and neck neoplasms”. The search was limited to English-language citations. The original literature search, which spanned 1985–2002, was updated to October 2006 using MEDLINE (Ovid: 2002 through October 2006), EMBASE (Ovid: 2002 to October 2006), the Cochrane Library (Ovid; Issue 3, 2006), the Canadian Medical Association Infobase, and the National Guideline Clearinghouse. Reference lists of related papers and recent review articles were also scanned for additional citations.
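The strategy above combines two facets, care-setting terms and disease-site diagnosis terms, with OR within each facet and AND between them. That boolean structure can be sketched programmatically; the following is a minimal illustration only (helper names are invented for this sketch, and the Ovid syntax is simplified relative to the verbatim strategy quoted above):

```python
# Sketch of the two-facet boolean structure of the search strategy.
# Facet 1: care-setting terms; facet 2: disease-site diagnosis terms.
facility_terms = [
    "ambulatory care facilities/",
    "community health centers/",
    "outpatient clinics, hospital/",
    "Ambulatory Care/",
    "cancer care facilities/",
    "((rapid or same day or one stop or multidisciplinary) AND clinic AND diagnosis)",
]
disease_terms = [
    "Breast Neoplasms/di",
    "Prostatic Neoplasms/di",
    "Lung Neoplasms/di",
    "exp Colorectal Neoplasms/di",
    "exp Head and Neck Neoplasms/di",
]

def or_block(terms):
    """OR the terms together and parenthesize the whole block."""
    return "(" + " OR ".join(terms) + ")"

# A citation must match at least one term from EACH facet.
query = or_block(facility_terms) + " AND " + or_block(disease_terms)
print(query)
```

Keeping the facets as separate lists makes it easy to see why the search is both sensitive (many synonyms ORed within a facet) and specific (the AND requires a care-setting match and a disease-site match).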

2.2 Selection Criteria

Articles were included in the systematic review of the evidence if they met these criteria:

- Study design: randomized controlled trials (RCTs), case–control studies, and prospective or retrospective cohort studies (letters, editorials, and comments were excluded)
- Publication date: January 2002 through October 2006
- Setting: diagnostic assessment programs or units, or one-stop, fast-track, or rapid-access clinics, with a focus on care provision for patients with suspected cancer and encompassing the diagnostic assessment of patients with a potential malignancy
- Language: English

Quality of the primary studies was assessed using common appraisal tools, including domains from the Jadad scale 13 (for RCTs) and the Downs and Black checklist 14 (for observational studies). The environmental scan involved two processes. First, inquiries were made directly to key cancer leaders and contacts in Ontario, Canada, and to selected groups outside of Canada. Second, a targeted Internet search was undertaken of key sites, including professional associations, guideline registries, and health care organizations (Table II). Any reports detailing models, frameworks, descriptions, and evaluations of DAPs (including quality improvement initiatives) from these targeted individuals, organizations, or information sources were considered eligible for inclusion. No specific quality evaluation criteria were applied, because no scales or quality domains have been evaluated using traditional health measurement principles.
TABLE II

Environmental scan of the literature

Target | Source | Method
Local jurisdictions | Ontario regions | Direct inquiry
Local jurisdictions | British Columbia | Direct inquiry
Local jurisdictions | Alberta | Direct inquiry
Local jurisdictions | Saskatchewan | Direct inquiry
Local jurisdictions | Manitoba | Direct inquiry
Local jurisdictions | Quebec | Direct inquiry
Local jurisdictions | Nova Scotia | Direct inquiry
Local jurisdictions | Newfoundland | Direct inquiry
Guideline directories | Ontario Guidelines Advisory Committee | Internet search
Other | American Society of Clinical Oncology | Internet search
Other | American College of Radiologists | Direct inquiry, Internet search
Other | Canadian Association of Radiologists | Direct inquiry, Internet search
Other | Canadian Strategy for Cancer Control | Direct inquiry, Internet search
Other | National Health Service, United Kingdom | Internet search
Other | Scottish Intercollegiate Guidelines Network, Scotland | Internet search
Other | Standards, Options, Recommendations, France | Internet search
Other | Veterans Affairs, United States | Internet search
Other | New Zealand | Internet search
Other | Australia | Direct inquiry, Internet search

3. RESULTS

The evidence base comprises thirty-four published studies 4,7,15–46 and fifteen unpublished guidance documents 47–61. The present report provides a brief overview of that evidence; the reader is referred to the full systematic review and environmental scan report, published elsewhere, for a complete description 62.

3.1 Systematic Review

3.1.1 Search Results

The original systematic review by Gagliardi et al. 12 included twenty articles that described outcomes related to specific disease-site assessment units: eleven for breast cancer 4,15–24, three for colorectal cancer 7,25,26, and six for head-and-neck cancer 31–36. There were seventeen case series that involved 38–3119 patients, two RCTs that included 478 and 791 patients, and one case–control study that included 177 cases and 162 controls 4,7,15–26,31–36. The update of the literature search identified 823 citations, among which fourteen studies described patient outcomes related to diagnostic assessment units: colorectal cancer in four studies 27–30, head-and-neck cancer in two studies 37,38, lung cancer in two studies 39,40, gynecologic cancers in three studies 41–43, neurologic cancers in one study 44, lymph node cancers in one study 45, and upper gastrointestinal cancers in one study 46. Study designs included one small RCT (88 patients), seven prospective cohort studies (359–3637 patients), and six retrospective studies (69–930 patients) 27–30,37–46. Elements of the Downs and Black quality assessment scale for observational studies 14 were used to assess the quality of the relevant studies included in the updated review; four key domains were evaluated: comparability of subjects, exposure or intervention, outcome measure, and statistical analysis. The quality of the studies was variable but generally modest, with approximately half the studies not using a comparative control group, thus increasing the risk of selection bias.

3.1.2 Outcomes

The overall findings from Gagliardi et al. 12 included the benefits of diagnostic assessment services in terms of reduced wait times for specific diagnostic procedures, increased patient satisfaction, and reduced anxiety for patients with negative findings. Most patients were diagnosed at the initial visit, and most diagnoses were confirmed by a pathology determination. A number of studies reported increased anxiety in women diagnosed with breast cancer at one-stop clinics, and one study measured clinical outcomes for breast cancer patients. For the updated systematic review, all studies but one were undertaken in the United Kingdom and included the National Health Service referral guidelines as a quality performance indicator for improving timely access 27–30,37–41,43–46. Only one study evaluated cost: an RCT comparing a centralized two-stop rapid assessment unit against conventional routine diagnostic evaluation, which assessed the cost of follow-up visits to general practitioners 39. Ten studies defined cancer-specific risk criteria for general practitioners to utilize in their risk assessment and decision-making to expedite high-risk referrals to rapid diagnostic units 27,28,30,38,41–46. Numerous studies evaluated or addressed a multidisciplinary team approach for the rapid diagnostic assessment of cancer 37–40,43,45. The findings from the update of the literature were similar to those reported by Gagliardi et al. 12: most of the studies evaluating rapid diagnostic assessment for suspected cases of cancer demonstrated a reduced time from first referral to specialist visit and a reduced time to first treatment in that setting. The studies that evaluated patient satisfaction found greater patient satisfaction with service provision and with the personal care given by medical staff 30,35,42. Studies assessing multidisciplinary care found that it translated into a more comprehensive patient assessment and might contribute to better care overall 35,37,38,40,43,45.
Various studies reported that specific referral criteria for individual cancer types aided in decision-making for general practitioners and might assist in ensuring appropriate referral for high-risk suspected cases of cancer to rapid DAPs 27,28,30,38,41–46.

3.2 Environmental Scan

3.2.1 Search Results

The environmental scan found fifteen guidance documents on the organization of cancer diagnostic services. Although addressing DAP organization was not the specific stated purpose of many of the documents, each addressed some organizational elements, for example, mandate, centralized access, scope of diagnostic activity, team criteria, linkages and collaborations, volume prerequisites, and quality indicators. In most cases, the conclusions derived from the guidance documents were supported by consensus-level evidence.

3.2.2 Outcomes

A consistent message was that coordinated and organized diagnostic assessment services managed by multidisciplinary teams with operational links to other specialty services resulted in reduced wait times and improved services, and possibly in improved patient outcomes. The guidance documents also outlined many of the requirements for a DAP, including centralized access to diagnostic assessment services, multidisciplinary team criteria, and the diagnostic services needed to successfully operate a DAP. Centralized access was most commonly characterized as a one-stop clinic, with integrated and coordinated cancer services, that provides seamless diagnostic assessment services. The composition of the disease-specific multidisciplinary team included not only the appropriate spectrum of disease-specific professionals needed to perform a diagnostic assessment, along with the appropriate disease-specific support personnel, but also coordinators and directors or chairs who were recommended to ensure the coordination of services. The common clinical examination, imaging, diagnostic, and staging procedures and surgical consultation procedures were listed in the guidance documents. Also reported were the pathology services, disease-specific tests, and supportive services that might be needed as part of the spectrum of diagnostic care. There was a general indication in the documents that the appropriate diagnostic investigations and procedures would lead to improved services and patient outcomes. Several of the guidance documents reported the need for linkages to maintain communication between primary health care providers and the coordinated diagnostic and treatment services as patients navigate the system. It was suggested that, in low-volume or underserviced areas, smaller programs should have formal collaborative links with larger programs. There was little evidence to indicate the patient volumes required to maintain one-stop DAPs; each jurisdiction would need to determine the appropriate volume requirements for each type or model of DAP implemented. Several documents established indicators of quality, with wait times being the most common indicator reported. Other documents recommended that the time from signs or symptoms suggestive of cancer to diagnosis should not exceed 4 weeks. A more thorough analysis of benchmarking is warranted. The development of quality assurance through performance measurement and audit programs was also recommended.

4. CONSENSUS PROCESS AND EXTERNAL REVIEW

The Diagnostic Assessment Standards Panel used the evidence that was available from the published literature, the environmental scan, and their expert opinion to reach consensus for standards on the organization and delivery of diagnostic assessment services in Ontario. The process of developing standards included the formation of the Diagnostic Assessment Standards Panel with a subset working group responsible for writing the draft standards. The panel met often through teleconferences and once in person to draft and approve the standards for practice before the standards were sent for external review. Approval was obtained through informal consensus at the meetings and also through an e-mail survey with 10 questions asking about the level of agreement with the completeness of the evidentiary base and the recommendations as stated. Conflicting views were noted and discussed, and it was agreed that the majority opinion of the panel would be adopted. Upon final approval of the draft by the Diagnostic Assessment Standards Panel, the document underwent internal review by the Report Approval Panel and the Scientific Manager of the PEBC. The draft standards were then distributed for review to 74 external Ontario stakeholders: 24 primary care providers, 17 chairs of provincial disease site groups, 25 regional vice presidents of cancer programs and senior administrators, and 8 cancer screening program experts. External review included the opportunity for written feedback and a survey on level of agreement with the manner of evidence collection, with the process used to derive recommendations, and with the recommendations themselves. Responses were received from 11, 3, 12, and 5 participants in each of the respective groups (41% overall return rate). The written feedback from both the clinical and the administrative experts was similar in nature. Feedback was extremely positive.
Most stakeholders agreed (with higher mean values, maximum 5, indicating stronger agreement) that there was a need for DAP standards (mean: 4.6), that the standards were clear (mean: 4.1), that the draft standards as stated were acceptable (mean: 4.2), that the standards should be formally approved (mean: 4.3), and importantly, that the standards reflect an effective approach that will lead to quality improvements in the cancer system (mean: 4.5). There was also some indication that the standards would be challenging to implement (mean: 3.9), but that the draft standards for the organization of care were achievable (mean: 4.0) and would reflect a more desirable system than current practice for improving the quality of patient care (mean: 4.3). No major modifications to the draft standards were deemed necessary after external review; however, several minor modifications that had been suggested were discussed and incorporated into the draft. Upon final review, the standards were presented to the Executive Team and the Board of Cancer Care Ontario, and the final version of the standards was formally approved by the Diagnostic Assessment Standards Panel. The final approved standards are set out in Appendix A.
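The survey summaries reported above are simple descriptive statistics: per-group response counts pooled into an overall return rate, and item means on a 1-to-5 agreement scale. A minimal sketch of those calculations follows; the response counts are taken from the text, but the per-item ratings are illustrative only, not the panel's actual data:

```python
# Responses received per external stakeholder group (out of 74 surveys sent).
responses = {
    "primary care providers": 11,
    "disease site group chairs": 3,
    "regional VPs and senior administrators": 12,
    "screening program experts": 5,
}
surveys_sent = 74

returned = sum(responses.values())       # 31 responses in total
return_rate = returned / surveys_sent    # ~0.419 (the text reports a 41% return rate)

def mean_agreement(ratings):
    """Mean of 1-5 Likert ratings, rounded to one decimal as in the text."""
    return round(sum(ratings) / len(ratings), 1)

example_item = [5, 4, 5, 4, 5]           # hypothetical ratings for one survey item
print(returned, mean_agreement(example_item))
```

This makes explicit how the headline figures relate to the raw counts: the return rate pools all four stakeholder groups, while each reported mean summarizes one survey item across respondents.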

5. CONCLUSIONS

It is clear that organized, centralized systems with multidisciplinary team membership are considered the optimal organization for the delivery of diagnostic cancer assessment services. Even though much of the available literature is limited in quality, and expert consensus opinion was often used to inform the guidance documents, the evidence across studies, the statements of credible guidance organizations, and the expert consensus opinion of the Diagnostic Assessment Standards Panel all deliver a consistent message. There are, however, significant and frequently cited challenges associated with the implementation of DAPs. There is a general consensus that implementation of the standards would not be cost-neutral and that additional resources (that is, human resources, new equipment, equipment replacement, and appropriate fees and incentives) would likely be necessary. The reallocation of scarce resources would likely impose hardship on other components of the cancer system, not only in terms of cost, but also in terms of demand for services beyond diagnostic assessment, that is, moving patients at a faster rate into treatment, with the associated potential for backlogging at that juncture. The transition protocol between diagnostic assessment and treatment management with multidisciplinary team membership would need to be carefully mapped out according to service and jurisdictional demands. The reorganization of care would also require strong and collaborative leadership among clinicians, clinical administrators, hospital CEOs, IT leaders, and the local health integration networks across a variety of settings. The confluence between cancer and non-cancer diagnostic care agendas was also seen as a barrier to implementation.
The ability to effect change is limited in a system defined by multiple stakeholders representing many types of diseases, with cancer being only one; competition with other non-cancer programs could create barriers to accessing clinicians and equipment. In addition, there may be challenges with the communication required to facilitate buy-in by all providers. There is also concern regarding the need for adequate IT systems and connectivity, particularly in regions with a large rural demographic, where the “virtual program” model and a single central registry are particularly relevant. These are daunting challenges. However, successful models emerging in Ontario show that the implementation of a DAP can be achieved without undue burden to the health system. In Ottawa, a collaborative model of surgical cancer care was developed, with the primary tertiary centre anchoring a virtual model with eight partnering hospitals in the region. An integral part of this model was the development of diagnostic assessment units (for patients with thoracic, colorectal, breast, and prostate cancers) that have been opened under the umbrella of a central cancer assessment clinic. The cancer assessment clinic was developed to act as a central access point offering coordinated and streamlined multidisciplinary care, where a patient with a suspicion of cancer enters a system (organized by the four disease sites) that acts as the gateway and triage centre for access to coordinated cancer services. Under this system, important collaborative linkages, known as “communities of practice,” have been established across the region, and improvements in patient and system outcomes, such as reductions in wait times, have been observed (Fung-Kee-Fung M. The Ottawa Hospital. Personal communication). It is hoped that the organizational standards will be a useful tool in the development of diagnostic assessment models across various jurisdictions.
It is also hoped that, regardless of the model chosen, coordinated rapid access to care in a multidisciplinary team environment will result in a “raising of the bar” in the provision of timely diagnostic assessment services to patients. The standards concerning DAPs were generated to meet the demand for cancer diagnostic assessment services in Ontario, but the structure and organization of a DAP will be influenced by the regional and geographic realities of each jurisdiction, the diagnostic tests necessary to assess an organ system (symptom complexity or physical abnormalities, for instance), and the anticipated volume of cases. Because the standards accommodate such variation, it is reasonable to suggest that they will also be generalizable to jurisdictions outside Ontario. Regardless of the DAP structure implemented in any given jurisdiction, there will be an ongoing need for a comprehensive and formal evaluation strategy, not only to refine existing and future diagnostic assessment services in Ontario, but also to help develop a more complete evidence base concerning the value of organized DAPs across many jurisdictions.

6. CONFLICT OF INTEREST

No conflicts of interest were declared. This project was sponsored by the Ontario Ministry of Health and Long-Term Care through Cancer Care Ontario’s Program in Evidence-Based Care.
