Literature DB >> 35710666

Skin Cancer Education Interventions for Primary Care Providers: A Scoping Review.

Ashley E Brown1, Maleka Najmi2, Taylor Duke1, Daniel A Grabell1, Misha V Koshelev1, Kelly C Nelson3.   

Abstract

Primary care physicians (PCPs) are often the first line of defense against skin cancers. Despite this, many PCPs do not receive comprehensive training in skin conditions. Educational interventions aimed at skin cancer screening instruction for PCPs offer an opportunity to detect skin cancer at earlier stages, with subsequent improvements in morbidity and mortality. A scoping review was conducted to collect data about previously reported skin cancer screening interventions for PCPs. A structured literature search found 51 studies describing 37 unique educational interventions. Curriculum elements utilized by the interventions were divided into categories that would facilitate comparison, including curriculum components, delivery format, delivery timing, and outcome measures. The interventions varied widely in design, including literature-based interventions, live teaching sessions, and online courses, with durations ranging from 5 min to 24 months. While several interventions demonstrated improvements in skin cancer knowledge and competency on written exams, only a few revealed positive clinical practice changes by biopsy review or referral analysis. Examining successful interventions could aid in developing a skin cancer detection curriculum for PCPs that produces positive clinical practice and population-based changes in the management of skin cancer.
© 2022. The Author(s) under exclusive licence to Society of General Internal Medicine.

Entities:  

Keywords:  Cancer screening; Dermoscopy; Detection of cancer; Early detection; Family doctor; GP; General practitioner; Melanoma; PCP; Primary care physician; Primary care provider; Secondary prevention; Skin cancer screening

Mesh:

Year:  2022        PMID: 35710666      PMCID: PMC9202989          DOI: 10.1007/s11606-022-07501-9

Source DB:  PubMed          Journal:  J Gen Intern Med        ISSN: 0884-8734            Impact factor:   6.473


INTRODUCTION

Notwithstanding massive community dermatology efforts in primary prevention of skin cancer via sun protection, skin cancer incidence continues to rise. Healthcare estimates forecast 101,280 new melanoma cases in 2021 and national melanoma care costs of $1.6 billion by 2030.[1,2] Secondary prevention, or early detection, offers benefits for melanoma and non-melanoma skin cancers, as early diagnosis significantly improves morbidity and mortality.[3] Additionally, visual examination to detect melanoma serves as one of the most rapid, safe, and cost-effective interventions in medicine, particularly when compared to screening for internal malignancies such as colorectal and lung cancer.[4] For these reasons, more effort should be allocated towards secondary prevention of skin cancers.
Primary care physicians (PCPs) are often the first line of defense against patient mortality due to skin conditions. This is especially true in rural populations, where patients rely primarily on PCPs for disease management because of a lack of local medical specialists. In dermatology, this reflects a workforce shortage compounded by an increasing urban-rural disparity in dermatologist density.[5,6] Underserved, under-insured, and uninsured patients are disproportionately affected by lack of specialty access. In these populations, access is limited by specialist and referral coordinator shortages, lack of insurance or insurance acceptance by providers, lack of clinic-hospital affiliations, transport or clinic location factors, and poor communication between primary and specialty providers.[7] This is significant, as studies have indicated that socio-economic disparities may be associated with advanced-stage melanoma diagnosis in minority, low-income, and/or uninsured populations.[8,9] Therefore, PCPs serve an important role in diagnosing and managing skin cancer in populations where dermatology access gaps exist.
However, most PCPs do not receive comprehensive training in skin conditions, which may lead to reduced diagnostic accuracy compared to dermatologists, unnecessary tests, or inappropriate specialist referrals.[10] Studies have reported that many PCPs do not perform full-body skin exams, even in patients at high risk for skin cancer.[11,12] Barriers to skin cancer screening by PCPs include lack of confidence in diagnostic ability in addition to reimbursement, time, and patient-related barriers.[11] Educational interventions offer an opportunity to address PCPs' diagnostic abilities and thus lessen disparities in skin cancer morbidity and mortality. Several interventions instructing skin cancer detection and management have been attempted and published in the literature. A 2011 systematic review[15] evaluated 20 studies and 13 interventions according to five outcome measures: knowledge, competence, confidence, diagnostic performance, and systems outcomes. These interventions were compared against components of curriculum and delivery format. Curriculum criteria included diagnosis, epidemiology, counseling, management, dermoscopy, and detection algorithm, while delivery formats involved live projection, literature, multimedia, feedback, interactive, and web-based. Ninety percent of studies in that review showed significant improvement in at least one of the five outcome categories, with competence being the most frequently measured outcome. However, the review did not establish a correlation between outcomes and intervention characteristics.[15] We therefore provide an updated scoping review to address the effectiveness of all previously attempted interventions used to train PCPs.

METHODS

This scoping review followed the methodological frameworks of Arksey and O'Malley[13] and Levac et al.[14] Scoping reviews are exploratory studies that aim to examine the extent, range, and nature of a research activity.[13] They resemble systematic reviews in that they use rigorous, transparent methods that allow the study to be replicated; however, they differ in being broader in scope, without quality appraisal of individual studies or synthesis via meta-analysis. Scoping reviews are useful for providing a contextual map of the available literature, especially when that literature is heterogeneous.[14] The framework includes (1) identifying the research question, (2) identifying relevant studies, (3) study selection, (4) charting the data, and (5) collating, summarizing, and reporting the results. These steps are detailed in the subsequent sections.

Step 1: Identifying the Research Question

The research question that guided this study was: What is known from the literature about skin cancer educational programs for PCPs?

Step 2: Identifying the Relevant Studies

A medical librarian assisted in developing a search protocol to identify English-language articles using PubMed (MEDLINE), EMBASE, and Scopus through October 2020. Search terms identified any combination of educational intervention, primary care provider, and skin cancer. A combination of Medical Subject Headings (MeSH; MEDLINE) and Emtree (EMBASE) terms was used with text word search terms. Terms used to capture educational intervention included education, curriculum, continuing medical education, interprofessional education, course, training, learning, and professional education. Primary care provider was captured with the terms general practitioner, general provider, general physician, GP, family medicine, family doctor, family physician, primary care physician, primary care provider, and PCP. Finally, search terms for skin cancer included melanoma, skin cancer, skin neoplasm, cutaneous neoplasm, basal cell carcinoma, squamous cell carcinoma, and cancer of the skin. Cited reference searching was performed via Scopus on articles that reached the full-text review stage during study selection.
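The three concept groups described above can be combined into a single Boolean query. The sketch below is illustrative only, assembled from the listed text-word terms; the actual protocol strings, which also incorporated MeSH and Emtree headings, were developed with the medical librarian and may differ.

```text
(education OR curriculum OR "continuing medical education" OR
 "interprofessional education" OR course OR training OR learning OR
 "professional education")
AND
("general practitioner" OR "general provider" OR "general physician" OR GP OR
 "family medicine" OR "family doctor" OR "family physician" OR
 "primary care physician" OR "primary care provider" OR PCP)
AND
(melanoma OR "skin cancer" OR "skin neoplasm" OR "cutaneous neoplasm" OR
 "basal cell carcinoma" OR "squamous cell carcinoma" OR "cancer of the skin")
```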

Step 3: Study Selection

Three independent reviewers (AEB, TD, DAG) conducted a title/abstract review with results blinded using the review engine Rayyan (https://rayyan.qcri.org), followed by a full-text review. The three reviewers used pre-determined inclusion and exclusion criteria (Table 1). In the event of a disagreement in title/abstract review, the article was referred to full-text review; verbal discussion and agreement resolved full-text discrepancies.
Table 1

Inclusion and exclusion criteria for article selection

Inclusion criteria:

• Studies examining some aspects of skin cancer educational training aimed at PCPs.

• Participants, or intended participants, were primarily (>50%) PCPs, including family doctors, family medicine residents, general practitioners, internal medicine physicians in primary care, and nurse practitioners or physician assistants who practice in primary care.

• Skin cancer was defined to include melanoma, basal cell carcinoma, and squamous cell carcinoma; studies did not have to instruct on all three listed skin cancers for inclusion.

Exclusion criteria:

• General reviews of dermatology, with less than 50% dedicated to skin cancer.

• Participants were primarily (>50%) medical students, dermatologists, patients, or residents in specialties other than the above; studies that use dermatologists as a control cohort were an exception to this criterion.

• Studies utilizing decision-making software (artificial or augmented intelligence).

• Teledermatology studies in which dermatologists interpreted clinical or dermoscopic pictures.

• Duplicate publication in the form of a conference abstract.

For studies existing only as a conference abstract (i.e., no corresponding full publication), corresponding authors were contacted for the full presentation, or the Internet was searched for conference proceedings. Studies did not need to be evaluated or assessed for success for inclusion, as this scoping review provides a description of past educational efforts rather than just a synthesis of successful educational components.

Step 4: Charting the Data

A data extraction spreadsheet was created using Google Sheets. Data extracted included authors, year of publication, country of origin, study design, curriculum components, delivery format, delivery length, assessment type, and outcome measures. If a data element was unclear, corresponding authors were contacted for further information. Previous studies guided the data extracted regarding curriculum components, delivery format, and outcome measures, based on the criteria most useful for intervention comparison.[15,16] In addition, information regarding the timing of the intervention, including delivery over a single day or multiple days and synchrony of instruction, was collected, as well as the nature of assessment for outcome measures. Study variables are defined in Table 2 and were derived from a 2011 systematic review of skin cancer educational interventions for PCPs.[15]
Table 2

Definitions of study variables

Criteria | Specifics | Definition
Curriculum | Epidemiology | Provided background information on skin cancer, trends in incidence or mortality, risk factors (skin types, family history, sun exposure, etc.)
 | Pigmented lesions | Taught basic principles of recognizing melanoma and differentiating benign pigmented lesions
 | Non-pigmented lesions | Taught basic principles of recognizing squamous cell carcinoma or basal cell carcinoma and differentiating benign non-pigmented lesions
 | Dermoscopy | Instructed participants on use of dermoscopy in recognizing skin cancer and/or addition of dermoscopy to the skin exam
 | Algorithm | Used a novel or pre-existing clinical (e.g., ABCDE) or dermoscopic (e.g., 3-point checklist) algorithm to aid in triage of skin lesions
 | Management | Instructed participants on determining a plan of action for a skin lesion (biopsy, observation, referral, etc.)
 | Counseling | Instructed participants on prevention strategies for patients including photoprotection, skin self-examination, and/or follow-up
Delivery format | Live | Participants attended a training session in person; included large lectures or small group sessions
 | Literature | Provision of educational books, pamphlets, posters, cards, etc.
 | E-learning | Use of computer software, multimedia, or the internet; ranged from video lectures to interactive training curricula
 | Feedback | Simultaneous or delayed feedback given to participants; included review of biopsies or review of written assessments with comments provided to the learner
 | Interactive | Requires cognitive engagement for participation; ranges from intermittent practice quizzes to participant-guided learning
 | Patient interaction | Included interaction with real or standardized patients, either as demonstration or for procedure clinics
Delivery timing | Synchrony | Synchronous interventions are delivered at the same time to an audience, while asynchronous interventions vary in timing of delivery based on an individual completing a task
 | Day | Training delivered over one or multiple days; if training took place individually, based on the minimum time to finish the intervention or the average reported by the paper
 | Length | Cumulative length of intervention if available; if over multiple days, the total span is included; if training took place individually, based on the minimum time to finish the intervention or the average completion time if provided by the paper
Assessment type | Pre-test | Exam given before the intervention takes place, either immediately or at some time interval before
 | Immediate post-test | Exam given immediately after completion of an intervention
 | Spaced post-test | Exam given at a spaced time interval following intervention completion; either a set or averaged time interval is specified
 | Biopsy review | Biopsies performed by participants were audited to determine diagnostic accuracy
 | Other clinical measure | Included patient or physician interviews, electronic medical record (EMR) review, referral analysis, and dermoscopic image comparison
Outcome measures | Knowledge | Objective report of conceptual understanding of skin cancer (e.g., risk factor identification) determined via a written exam
 | Competence | Objective report of clinical skills (e.g., diagnostic accuracy) determined via a written exam
 | Self-efficacy | Subjective report of confidence in, attitude towards, or beliefs about skin cancer diagnosis and management
 | Diagnostic performance | Objective assessment of diagnostic abilities in a clinical practice setting through biopsy review or referral analysis with expert evaluation
 | Systems outcomes | Subjective or objective assessment of behaviors in practice and/or effects on patients (e.g., number of TBSEs performed, referral patterns)

Step 5: Collating, Summarizing, and Reporting the Results

A flowchart adapted from the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) guidelines was implemented to present the literature search and study selection.[17] Levac et al.’s[14] framework of scoping reviews guided the presentation of data including a numerical summary of descriptive study components (number of studies, type of study designs, years of publication, characteristics of countries of origin) and organizing results into thematic elements as related to the research question. As our research question aimed to present the breadth of literature describing skin cancer education interventions for PCPs, study components were divided into categories that would facilitate comparison including curriculum components, delivery format, delivery timing, and outcome measures.

RESULTS

As depicted in Figure 1, 894 records were identified from the literature search and citation list searching. Of 523 unique records screened against the eligibility and inclusion criteria, 51 studies were ultimately included (Fig. 1). One of the identified studies existed solely as a conference abstract and was included because the presentation is available on YouTube.[18,19] The 51 studies described 37 unique educational interventions. Given the broad nature of the individual studies, in addition to the descriptive tables in this text, the full data extraction table with specific comments is available as an online supplement (Supplement 1).
Fig. 1

PRISMA flowchart of literature search and study selection process. aPubMed yield = 142, Scopus yield = 442, EMBASE yield = 187.


Study Characteristics

Study characteristics are detailed in Table 3.[18-69] The first study was published in 1995 and the last in 2020, with a steady rise in publications since the first intervention was described (Fig. 2). Only 13 studies (25%) had a randomized controlled trial (RCT) design, while 28 studies (55%) had a before/after intervention design with or without controls, 6 studies (12%) examined after-intervention effects only, and 4 studies (8%) solely described an intervention without assessing outcome measures. The USA contributed the highest number of included publications with 17 studies (33%), followed by Australia with 10 (20%) and the UK with 5 (10%). The remaining publications came from Italy, Switzerland, Belgium, Canada, France, the Netherlands, Spain, Germany, Ireland, Portugal, and Sweden.
Table 3

Study characteristics. Interventions are listed chronologically from the first year described. If the intervention name was specified in the paper, it is included, and all interventions were assigned a numerical value based on chronological order. aIf participants were divided into intervention and control groups, the intervention group is listed first. bDesign categories: B/A before and after, A after only, C controlled, RCT randomized controlled trial, I intervention only

Intervention | Author, year | Location | Participantsa | Designb
1. Newcastle Melanoma Unit GP Training | Girgis, 1995[78] | Australia | 24, 17 | B/A; C
 | Burton, 1998[30] | Australia | 31, 32 | A; C
2. Algorithm and instant camera | Del Mar, 1995[33] | Australia | 53, 52 | RCT
 | English, 2003[38] | Australia | 245, 228 | RCT
3. NSW Cancer Council seminar | Ward, 1995[66] | Australia | 147 | B/A
4 | Laidlaw, 1996[50] | UK | 980 | I
5 | Dolan, 1997[34] | USA | 46, 36 | RCT
6. Skin cancer triage | Gerbert, 1998[79] | USA | 26, 26 | RCT
 | Gerbert, 2002[39] | USA | 39, 32 | RCT
7. Melanoma education for primary care | Harris, 1999[80] | USA | 17 | B/A
 | Harris, 2001[45] | USA | 354 | B/A
 | Harris, 2001[46] | UK | 150 | B/A
8. SkinWatch | Raasch, 2000[58] | Australia | 23, 23 | RCT
 | Youl, 2007[69] | Australia | 16 | B/A
9 | Westerhoff, 2000[68] | Australia | 37, 37 | RCT
10 | Brochez, 2001[29] | Belgium | 146 | B/A
11 | Bedlow, 2001[26] | UK | 17 | B/A
12. Basic skin cancer triage | Mikkilineni, 2001[54] | USA | 22 | B/A
 | Mikkilineni, 2002[55] | USA | 23 | B/A
 | Markova, 2013[51] | USA | 21, 30 | RCT
13 | De Gannes, 2004[81] | Canada | 10, 17 | RCT
14 | Carli, 2005[31] | Italy | 41 | B/A
15 | Dolianitis, 2005[35] | Australia | 61 | A
16 | Argenziano, 2006[21] | Italy, Spain | 36, 37 | RCT
17 | Menzies, 2009[53] | Australia | 63 | B/A
18 | Peuvrel, 2009[57] | France | 210 | A
19 | Shariff, 2010[82] | UK | 94 | B/A
20. MinSKIN | Badertscher, 2011[24] | Switzerland | N/A | I
 | Badertscher, 2013[23] | Switzerland | 78 | B/A
 | Badertscher, 2015[25] | Switzerland | 39, 39 | RCT
21 | Bradley, 2012[28] | USA | 6 | B/A
22. INFORMED | Shaikh, 2012[63] | USA | N/A | I
 | Eide, 2013[37] | USA | 54 | B/A
 | Weinstock, 2016[67] | USA | 101, 21, Unknown | B/A, C
 | Swetter, 2017[65] | USA | 5 | B/A
23 | Grange, 2014[42] | France | 398 | B/A, C
24 | Koelink, 2014[83] | The Netherlands | 27, 26 | RCT
25. GP Skin Cancer Referral toolkit | Gulati, 2015[43] | UK | 8163 | B/A
26 | Hartnett, 2016[48] | USA | 10 | B/A
27 | Anders, 2017[20] | Germany | 573 | B/A
28 | Secker, 2017[61] | The Netherlands | 293 | B/A
29 | Beecher, 2018[27] | Ireland | 23 | B/A
30 | Duarte, 2018[36] | Portugal | Unknown | A
31. Longitudinal curriculum with procedure clinic | Rivet, 2018[18,19] | Canada | 60 | B/A, C
32. Mastery learning | Robinson, 2018[59] | USA | 44, 45 | RCT
 | Robinson, 2018[84] | USA | 44, 45 | A
33 | Augustsson, 2019[22] | Sweden | 27, 16 | B/A, C
34. Triage Amalgamated Dermoscopic Algorithm (TADA) | Seiverling, 2019[85] | USA | 59 | B/A
35. Five-point checklist for skin cancer detection in primary care | Moscarella, 2019[56] | Italy | N/A | I
36 | Harkemanne, 2020[44] | Belgium | 56 | B/A
37. Suspicious Skin Lesions | Marra, 2020[52] | The Netherlands | 83, 102 | A; C
Fig. 2

a Number of publications per year, b cumulative number of publications.


Curriculum

Table 4 displays curricular elements of the individual programs. All 37 programs included melanoma diagnosis instruction, while only 23 (62%) addressed non-melanoma skin cancer diagnosis. Additional instruction included epidemiology in 21 programs (57%), management in 24 (65%), and counseling in 12 (32%). Fourteen programs (38%) included dermoscopy instruction, with increasing prevalence in more recent studies. Nineteen (51%) described instruction on a clinical or dermoscopic algorithm in their training program.
Table 4

Curriculum elements

InterventionEpidemiologyPigmented lesionsNon-pigmented lesionsDermoscopyAlgorithmManagementCounseling
1[78,30]XXX
2[33,38]XXX
3[66]XXXX
4[50]XXX
5[34]XXXX
6[79,39]XXXXX
7[80,45,46]XXXXX[45,46]
8[58,69]X[69]XX[58]X[69]X[69]X
9[68]XXX
10[29]XXX
11[26]XX
12[54,55,51]XXXXXX
13[81]XXXXX
14[31]XXXX
15[35]XXX
16[21]XXXX
17[53]XX
18[57]XXXXX
19 [82]XX
20[24,23,25]XXXX[24,25]X
21[28]XXXXX
22[63,37,67,65]XXXX[37,63,67]XXX
23[42]XXXX
24[83]XXXXX
25[43]XXX
26[48]XXX
27[20]XXXXX
28[61]XXX
29[27]XXX
30[36]XXXX
31[18,19]XXXXX
32[59,84]XXXXX
33[22]XXXX
34[85]XXXX
35[56]XXXXX
36[44]XXX
37[52]XXXXX

Delivery Format

Delivery format and timing are displayed in Table 5. The most widely used teaching format was live (68%), followed by interventions that utilized literature (54%), e-learning (38%), interactive formats (38%), feedback (14%), and patient interaction (8%). Thirteen interventions (35%) used only one modality, another 13 (35%) used two modalities, five (14%) used three, four (11%) used four, and one intervention each used five and six modalities.
Table 5

Delivery format and timing. aSynchrony: A asynchronous, S synchronous. bFor events that finished in variable time, the shortest length to finish is listed; others are estimated by CME hour credit or given as an averaged time. cIntervention had 3 groups with different teaching styles, one of which was interactive

InterventionLiveLiteratureE-learningFeedbackPatient interactionInteractiveSynchronyaDaysLengthb
1[78,30]XXAMultiple>6 h
2[33,38]XAMultiple

24 months[33]

10 months[38]

3[66]XXSSingle~8 h
4[50]XAUntimedUntimed
5[34]XSSingle2 h
6[79,39]X[79]XX[39]XXAMultiple

>3 h [79]

>1 h[39]

7[80,45,46]XXXASingle or multiple

1 h[80]

6 h[45]

18 h[46]

8[58,69]XAMultiple

3 weeks[58]

6 months[69]

9[68]XXSSingle1 h
10[29]XXSSingle2 h
11[26]XXSSingleNot specified
12[54,55,51]X[54,55]X[54,55]X[51]X[54,55]SSingle2 h
13[81]XSSingle12 min
14[31]XSSingle4 h
15[35]XXXASingleUntimed
16[21]XSSingle4 h
17[53]XXXXAMultiple>2 h
18[57]XXSSingle2 h
19 [82]XAUntimedUntimed
20[24,23,25]XX[24,25]XS

Multiple[24,25]

Single[23]

12 months[24,25]

~8 h[23]

21[28]XXSSingle45 min
22[63,37,67,65]XXASingle1–2 h
23[42]XXXXSSingle2.5 h
24[83]XSMultiple10 h
25[43]XXASingle or multiple>5 min
26[48]XSSingle15 min
27[20]XXXXSSingle8 h
28[61]XXXASingle1 day
29[27]XSSingle1 h
30[36]XXSSingle3 h
31[18,19]XXXXXXAMultiple8 months
32[59,84]XXXAMultiple9 weeks
33[22]XXSSingle5 h
34[85]XXcSSingle75 min
35[56]XN/AN/AN/A
36[44]XSSingle2 h
37[52]XXASingle or multiple>2 h

Delivery Timing

The length of the interventions ranged from 5 min to 24 months. Excluding interventions that lasted longer than 1 day, the average length was 3.5 h and the median 2 h. Twenty-two interventions were delivered on a single day (59%), while eight spanned multiple days (22%). Four interventions (10%) could be single or multiple days. Twenty-one interventions were conducted synchronously (57%), while 15 were asynchronous (41%).

Outcome Measures

Outcome measures can be seen in Table 6. The most commonly used assessment method was pre- and post-intervention exams. Clinical outcomes were assessed through biopsy review, patient exit interviews, physician telephone interviews or surveys, referral analysis, and EMR review. Of the exam-based outcomes, 25 of 29 (86%) interventions that assessed competence exhibited improvement in test scores, 11 of 13 (85%) studies measuring knowledge showed improvement, and 16 of 18 (89%) studies examining self-efficacy demonstrated improvement. For clinical measures, 8 of 17 studies (47%) showed an improvement in diagnostic accuracy, and 18 of 21 studies (86%) showed an improvement in at least one systems outcome (e.g., identifying risk factors, performing more total body skin exams, including a diagnosis on referrals).
Table 6

Assessment types and outcome measures. aKey: + = statistically significant improvement; − = no statistical improvement; +* = improvement, no statistics performed; −* = no improvement, no statistics. bStatistically significant difference in physician reported TBSE after 1 month, difference not present at 12 months post-intervention

InterventionAssessment typeOutcome measures a
Author, yearPre-testImmediate post-testSpaced post-testBiopsy reviewOther clinical measuresKnowledgeCompetenceSelf-efficacyDiagnostic performanceSystems outcomes
1Girgis, 1995[78]XXXX+++
Burton, 1998[30]X
2Del Mar, 1995[33]X+
English, 2003[38]+
3Ward, 1995[66]X3 monthsX+++
4Laidlaw, 1996[50]
5Dolan, 1997[34]X1 monthX++
6Gerbert, 1998[79]X3 weeks+
Gerbert, 2002[39]XX8 weeks++*
7Harris, 1999[80]XX++
Harris, 2001[45]XX+++
Harris, 2001[46]XX+++
8Raasch, 2000[58]X+
Youl, 2007[69]X+
9Westerhoff, 2000[68]X23 days+
10Brochez, 2001[29]XX++*
11Bedlow, 2001[26]X2 weeks+
12Mikkilineni, 2001[54]XXX++
Mikkilineni, 2002[55]X1 month+++
Markova, 2013[51]X++/−b
13De Gannes, 2004[81]X6 monthsX
14Carli, 2005[31]XX+
15Dolianitis, 2005[35]X+
16Argenziano, 2006[21]X+
17Menzies, 2009[53]XXXX+++
18Peuvrel, 2009[57]15 monthsX+*+*
19Shariff, 2010[82]X
20Badertscher, 2011[24]XX1 year
Badertscher, 2013[23]XX+
Badertscher, 2015[25]XX1 year
21Bradley, 2012[28]XXX+++*+
22Shaikh, 2012[63]
Eide, 2013[37]XX6 monthsX++*+
Weinstock, 2016[67]X++
Swetter, 2017[65]X+
23Grange, 2014[42]XX+++*+
24Koelink, 2014[83]XX+*
25Gulati, 2015[43]XX+*−*+
26Hartnett, 2016[48]XX+++*
27Anders, 2017[20]XX+++
28Secker, 2017[61]X3 months+
29Beecher, 2018[27]X3 months+
30Duarte, 2018[36]X+*
31Rivet, 2018[18,19]XXX+*+*
32Robinson, 2018[59]XXX++
Robinson, 2018[84]X+*
33Augustsson, 2019[22]XX6 months+
34Seiverling, 2019[85]XX+
35Moscarella, 2019[56]
36Harkemanne, 2020[44]XX+
37Marra, 2020[52]X++

DISCUSSION

This scoping review demonstrates that several interventions have been implemented to instruct primary care providers in skin cancer screening, with varying teaching styles, intervention lengths, and methods of evaluation. A prior systematic review performed in 2011 reported similar findings across thirteen interventions, concluding that a lack of uniformity across interventions prevents direct comparison of efficacy.[15] The present review updates that literature with the inclusion of 24 additional educational interventions. Moreover, additional variables were recorded, including delivery timing, synchrony of instruction, and assessment type. Most studies utilized more than one format to deliver curriculum content; however, the most utilized was the "live" delivery format. This format is familiar to all learners, which could explain its popularity. E-learning, or education delivered in an online format, allows for the dissemination of education to a wider audience than typical in-person education. Additionally, asynchronous teaching supports a learner-friendly environment, allowing users to complete the training at the time, location, and pace convenient for them.[70] However, e-learning interventions require access to and familiarity with web-based educational platforms, and platforms with more interactive features may carry annual or per-user fees, which may limit use among educators and researchers.
In past post-intervention completion focus groups, PCPs who completed an e-learning intervention cited the need for assistance with challenging cases encountered during patient care.[37,67] Thus, creating an e-learning formatted educational intervention will likely support the greatest practice change when paired with provider-to-provider e-consultations or telementoring through frameworks such as Project ECHO (Extension for Community Healthcare Outcomes).[71] Delivery timing of interventions varied wildly, but a majority of programs focused on short programs with the aim of allowing busy PCPs time to attend their course. Only one study performed a longitudinal study.[18] This study’s participants consisted of family medicine residents in an 8-month rotation, and the participants received a 1-h interactive teaching session followed by a procedural clinic every 2 weeks.[19] While many programs opted for immediate post-test examinations, other interventions assessed the durability of their education with the use of spaced post-tests. The timing of spaced post-tests ranged from 2 weeks to 15 months. Only 5 manuscripts discussed the use of immediate and spaced post-tests.[22,24,25,37,39] The online-based curriculums described by Gerbert et al.[39] and Eide et al.[37] found statistically significant improvement in post-intervention exams immediately after the intervention that persisted at 8 weeks and 6 months respectively, while Badertscher et al.[25] did not show improvement in either exam. Augustsson et al.[22] also showed persistent improvement in immediate and spaced post-test exams following their dermoscopy course. All programs instructed on the diagnosis of melanoma which is prudent as it is responsible for the vast majority of skin cancer mortality; however, a majority also included instruction on keratinocytic skin cancer diagnosis. The instruction of management and counseling varied by intervention and in scale. While some programs instructed on referral vs. 
watchful waiting as management strategies, others envisioned a more involved role for the PCP, with training in biopsies, procedures, and other treatments.

Dermoscopy was included in 38% of the programs. Dermoscopy training for family physicians has been shown to increase sensitivity for melanoma detection compared with naked-eye examination, with no decrease in specificity.[72] This clinical diagnostic skill has the potential to improve patient comfort and satisfaction, improve clinicians' self-efficacy regarding non-invasive diagnosis, and reduce costs to the healthcare system. While skillful use of dermoscopy reduces false positives and false negatives in early melanoma detection, restricting this education to dermatologists alone fails to benefit the majority of at-risk individuals who lack access to dermatologic care.[3,73,74] Several studies in this review noted positive knowledge and clinical responses to dermoscopy instruction.

Most studies evaluated the success of their program by pre- and post-intervention written exams and/or evaluated clinical application by biopsy review, patient/physician interviews, or referral analysis. For this discussion, the authors define "success" of a program as demonstrating one or more statistically significant improvements in an outcome measure. Most studies demonstrated improvement on competence exams (25 of 29 analyzed studies), but fewer than half (8 of 17 analyzed studies) demonstrated improvement in clinical diagnostic performance. The authors of the earliest reported program observed this discrepancy and concluded that while it is feasible to impart knowledge via training programs, it is much more difficult to translate that knowledge into clinical change.[41] While many studies were unable to produce practice change, others were successful. 
Grange et al.[42] conducted a population-based study evaluating regional Breslow thickness of melanomas before and after training, and found that the intervention region showed a decreased incidence of very thick melanomas as well as a decrease in mean Breslow thickness; this change was not seen in the control region. The training program comprised multiple modalities, including a live teaching session, interactive quizzes and clinical scenarios, literature distribution with clinical pictures, and a website with all the course information.[42] This study supports the premise that training PCPs could decrease melanoma morbidity and mortality through enhanced detection of earlier-stage cancer. The most recent intervention, a study by Marra et al.,[52] compared trained and untrained PCPs and found that the trained group had better diagnostic accuracy and demonstrated a clinical change in the quality of referrals, including fewer unnecessary referrals. This study utilized both an online course and an optional live course on dermoscopy. It aimed not only to improve skin cancer detection and management by PCPs, but ultimately to transfer the management of low-risk non-melanoma skin cancer to PCPs.[52] A study by Weinstock and colleagues[67] examined the downstream effects of the online curriculum INFORMED (INternet course FOR Melanoma Early Detection) by comparing patient outcomes of PCPs who had participated in the curriculum and those who had not. 
The trained group showed an increased percentage of melanoma diagnoses among patients receiving an annual physical exam; this increase came without an increase in skin surgeries or dermatology visits.[67] Shaikh et al.[63] originally described the INFORMED curriculum in 2012, and three subsequent studies evaluating its success were generally positive.[37,65,67] The curriculum was designed with input from PCPs to aid early detection of skin cancers; it includes a large number of clinical photographs and information to improve decision-making about skin cancer management and referrals.[67] Finally, another notable successful intervention is the online course described by Robinson and colleagues.[59] This course was designed by dermatologists, PCPs, and medical educators and utilized a mastery learning format, a form of competency-based education in which learners acquire knowledge and skills in a self-paced course, with a focus on deliberate practice and advancement contingent on meeting a minimum passing standard. Compared to controls, PCPs who took the course referred fewer benign lesions and significantly more melanomas.[59] These successful interventions demonstrate that online delivery, interactive elements, PCP input into curriculum design, and instruction on management are advantageous components of a curriculum.

A few patterns emerge when examining the programs that failed to show a significant positive change. 
Three programs had unsuccessful outcomes in both knowledge and clinical application.[32,43,58] Two of these were e-learning delivery formats: one consisted of a short 12-min video, and the other was a website with helpful links on which users spent an average of 5 min in total.[32,43] The third was a 3-week audit of PCP biopsies in which dermatologists provided feedback on diagnosis.[58] This is notable, as another intervention relying solely on feedback as a training tool also failed to show improvement in PCPs' diagnostic competency on written exams.[25] Additionally, although the successful mastery learning intervention by Robinson et al.[60] included a portion in which dermatologists gave feedback on PCPs' diagnoses, the diagnostic accuracy agreement between PCP and dermatologist did not change after the feedback sessions. Two programs relied on participants reading mailed literature and did not demonstrate a significant change in clinical practice.[38,64] From these unsuccessful interventions, we can conclude that passive, brief interventions are unlikely to produce clinical practice change. One intervention demonstrated the importance of interactive elements. 
The Basic Skin Cancer Triage curriculum was originally shown to produce improvements in knowledge and clinical practice,[54,55] but when it was redeveloped as a web-based module, these effects were no longer apparent.[51] The authors acknowledged that this may reflect the lack of interactive design elements, including practice exercises, repetition, and feedback.[51] This reflects the principles of adult learning theory: formative assessments are key for long-term retention of knowledge because they invoke active retrieval of information.[75] Effective adult learning strategies in medical education include identifying baseline skills, knowledge, and attitudes; beginning instruction with a problem relevant to the participants; incorporating collaborative, problem-solving activities; having the learner do the work of learning (e.g., limited didactic time, with more group discussion and practice scenarios); engaging motivation, attachment, and emotion by applying learned topics to clinical practice (e.g., discussing how the skills learned would have changed prior experiences); and reflecting on the learning experience via survey assessments (e.g., pre- and post-tests demonstrating gains in knowledge or confidence).[75,76]

This scoping review is limited by the number and quality of studies reporting skin cancer screening interventions for PCPs. Furthermore, it is possible that successful programs are more likely to be published, skewing our narrative results toward interventions demonstrating positive outcomes. Another limitation lies in the wide variability in intervention design, which makes accurate comparison of variables difficult. 
Additionally, most of the reported data come from unvalidated knowledge instruments developed by the individual studies around their own curricula, and these data may overrepresent the enduring educational benefit to the training recipients; validating the metrics that quantify knowledge gains is essential to measuring the impact of educational efforts.[77]

CONCLUSION

This study highlights not only the variety of skin cancer educational interventions for PCPs, but also the difficulty of translating gains in knowledge and self-efficacy into practice change and, ultimately, improved patient care. Implementation of a successful PCP training program in skin cancer could help decrease the morbidity and mortality of skin cancer, especially in populations with significant dermatology access gaps, such as rural, underserved, and uninsured populations. Interventions such as the one by Grange et al.[42] show that a successful intervention can affect skin cancer outcomes at a population level. Among the interventions that produced practice change, successful elements included online delivery, dermoscopy instruction, interactive formats, PCP input into curriculum design, and instruction on management, while unsuccessful interventions tended to be brief and passive. Additionally, assessments should include clinically relevant outcomes and endpoints, including providers' intent to change practice and/or actual practice change. Instruction in dermoscopy is becoming more prevalent in skin cancer curricula, as skillful dermoscopy use has been shown to be superior to naked-eye inspection. The online format allows dissemination to a wide audience and is well suited to interactive and competency-based learning. An ideal intervention should produce positive clinical practice change and reach a large audience of PCPs to achieve the ultimate goal of decreasing skin cancer morbidity and mortality on a population level.
REFERENCES

1. Losina E, Walensky RP, Geller A, Beddingfield FC, Wolf LL, Gilchrest BA, Freedberg KA. Visual screening for malignant melanoma: a cost-effectiveness analysis. Arch Dermatol. 2007.

2. Gershenwald JE, Scolyer RA. Melanoma Staging: American Joint Committee on Cancer (AJCC) 8th Edition and Beyond. Ann Surg Oncol. 2018.

3. Feng H, Berk-Krauss J, Feng PW, Stein JA. Comparison of Dermatologist Density Between Urban and Rural Counties in the United States. JAMA Dermatol. 2018.

4. Harvey VM, Patel H, Sandhu S, Wallington SF, Hinds G. Social determinants of racial and ethnic disparities in cutaneous melanoma outcomes. Cancer Control. 2014.

5. Hu S, Sherman R, Arheart K, Kirsner RS. Predictors of neighborhood risk for late-stage melanoma: addressing disparities through spatial analysis and area-based measures. J Invest Dermatol. 2013.

6. Siegel RL, Miller KD, Fuchs HE, Jemal A. Cancer Statistics, 2021. CA Cancer J Clin. 2021.

7. Fleischer AB, Herbert CR, Feldman SR, O'Brien F. Diagnosis of skin disease by nondermatologists. Am J Manag Care. 2000.

8. Kimball AB, Resneck JS. The US dermatology workforce: a specialty remains in shortage. J Am Acad Dermatol. 2008.

9. Guy GP, Thomas CC, Thompson T, Watson M, Massetti GM, Richardson LC. Vital signs: melanoma incidence and mortality trends and projections - United States, 1982-2030. MMWR Morb Mortal Wkly Rep. 2015.

10. Ezeonwu MC. Specialty-care access for community health clinic patients: processes and barriers. J Multidiscip Healthc. 2018.
