Nehal Khamis, Richard Satava, David E Kern.
Abstract
BACKGROUND AND OBJECTIVES: In 2016, we published a stepwise, evidence-based model (subsequently named SimSteps) for curriculum development (CD) of simulation-based courses. The current study aimed to assess the uses, user-friendliness, and perceived effectiveness of this model and its worksheet and to obtain suggestions for improvement.
Keywords: assessment; curriculum development; model; simulation; six step
Year: 2020 PMID: 32273671 PMCID: PMC7134544 DOI: 10.4293/JSLS.2019.00060
Source DB: PubMed Journal: JSLS ISSN: 1086-8089 Impact factor: 2.172
Summary of Model Guidelines*
• Description of the problem being addressed by the simulation curriculum
• Performance of gap analysis to identify the difference between current training activities and the ideal
• Done at the international, national, or regional rather than the institutional level

• Collection of data on learners' existing competencies and needs at the institutional level
• Coordination with other curricula to integrate the simulation training into the overall curriculum
• Identification of stakeholders and involving them early in the curriculum design process

• Identification of cognitive prerequisites
• Defining psychomotor/technical and nontechnical skills/competencies at the individual and team levels
• Developing objectives (outcome measures) with their metrics: specific quantifiable values (e.g., centimeters) or unambiguously defined nonnumeric values (e.g., cross-check, defined as the assistant repeating the surgeon's request verbatim)

• Knowledge posttest to determine eligibility to start psychomotor training
• Procedure/skill deconstruction into key steps, including common and important errors
• Setting criteria for expected levels of proficiency
• Choice of the most appropriate simulation and level of fidelity, and determination of proficiency benchmark values
• Training through cycles of practice with increasing complexity, with recording and review of performance
• Planning for faculty development to ensure expertise in teaching and assessment using simulation

• Development of the assessment tool and inclusion of space for open-ended comments
• Establishment of inter- and intra-rater reliability and documentation of validity evidence
• Use for both formative and summative assessment
• Consideration of repeating assessments to ensure maintenance of proficiency

• The Kirkpatrick pyramid hierarchy is applicable: clinical outcomes > clinical behaviors in practice > knowledge, skills, and attitudes gained > satisfaction with the learning experience
• Use of aggregated learner assessments to evaluate the success of the program
• Subjective assessment of curricular components, areas of strength and improvement, and the practical value of the course
• If feasible, long-term follow-up of learners

• Paying attention to simulation methodology and setting to ensure fidelity, and planning for interprofessional training sessions
• Seeking political and administrative support for allocation of resources and addressing barriers
• Introduction of the curriculum, with consideration of piloting or phasing in as appropriate
A copy of the full guidelines is available upon request from the first (corresponding) author
Template Worksheet for SimSteps (the Stepwise Model for Simulation-Based Curriculum Development for Clinical Skills, a Modification of the Six-Step Approach)
| 1. Problem identification: |
| - |
| - |
| - |
| 2. Current approach: |
| - |
| - |
| - |
| 3. Ideal approach: |
| - |
| - |
| - |
| 4. Needs identified (use gap analysis, which is the difference between the current and ideal approaches): |
| - |
| - |
| - |
| 1. Targeted learners: |
| - |
| - |
| - |
| 2. Needs assessment/targeted learners: |
| - |
| - |
| - |
| 3. Needs assessment/targeted learning environment: |
| - |
| - |
| - |
| -Competency(ies): |
| Course title: __ |
| - |
| - |
| -For each competency: |
| -Goal(s): |
| - |
| - |
| -Outcome measures (specific objectives) and their metrics (quantifiable numeric values or a clear definition for nonnumeric values; include correct knowledge/actions and ERRORS, based on the task deconstruction reached in the general needs assessment): |
| -By the end of the program, learners will be able to: |
| -Knowledge (cognitive prerequisites): |
| - |
| - |
| - |
| - Technical (psychomotor) skills: |
| - |
| - |
| - |
| -Nontechnical (team performance, communication skills, professionalism, etc.): |
| - |
| - |
| - |
| Competencies and Goals Outcome Measures and Metrics |
| Based on the ideal approach identified in Step 1, your targeted needs (Step 2), your objectives (Step 3), and available resources, identify: |
| -Content to be taught: |
| - |
| - |
| - |
| Educational methods: |
| 1. Orientation |
| 2. Syllabus material (e.g., textbook, handouts, online learning modules) |
| 3. Cognitive (didactic) component: |
| a. Knowledge pretest |
| b. Teaching and learning methods (e.g., interactive computer tutorials, video-recorded tutorials, or interactive live tutorials): |
| - |
| - |
| - |
| c. Knowledge posttest (to determine eligibility to begin the psychomotor component): |
| 4. Psychomotor and nontechnical components: |
| a. Simulation method appropriate for outcome measures and learner level of expertise: |
| b. For technical procedures: |
| -Deconstruction into key components (e.g., steps, tasks, subtasks, skills). Inclusion of common and critical errors and how to identify/prevent/correct them if they occur. These are based on consensus among clinical subject matter experts, medical educators, and behavioral psychologists and on existing evidence of effectiveness, when available. |
| - |
| - |
| - |
| c. For nontechnical skills (e.g., team training, communication and professionalism): |
| -Deconstruction into key components: steps/tasks/subtasks/skills for team-related skills (e.g., TeamSTEPPS), for relevant professionalism components, etc. |
| - |
| - |
| - |
| d. Metrics to be used to quantify the steps/task/subtasks/skills and errors: |
| (quantitative, e.g., time in seconds, distance in millimeters, number of errors; or qualitative, in the form of a distinctive attribute or characteristic possessed, e.g., repeats commands, inserts the trocar). |
| - |
| - |
| - |
| e. Designing the training tool (which will also be used as the assessment tool, e.g., checklist, rating scale) based on the output from task deconstruction and metrics identification. |
| Item | Done | Not done |
| f. Set the benchmark value for the proficiency to be acquired. Simulation exercises then increase gradually in complexity in a proficiency-based progression; the learner must reach the 100% proficiency benchmark value at each level before progressing to the next. |
| - |
| - |
| - |
| g. Review of individual learner-recorded performance at the end of a training trial. |
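Items f and g describe a proficiency-based progression: each level's 100% benchmark must be met before the next, more complex exercise is attempted. A minimal sketch of that gating logic, with hypothetical exercise names and scores (not taken from the published worksheet):

```python
def proficiency_progression(trials, benchmark=1.0):
    """Gate a trainee's advancement: every exercise must be passed at the
    benchmark (here 100%) before the next one may be attempted."""
    cleared = []
    for name, score in trials:
        if score < benchmark:
            return cleared, name  # progression halts at the first unmet benchmark
        cleared.append(name)
    return cleared, None  # all levels cleared

# Hypothetical trial scores on simulation exercises of increasing complexity
trials = [("basic suturing", 1.0), ("knot tying", 1.0), ("anastomosis", 0.85)]
cleared, stuck_at = proficiency_progression(trials)
print(cleared, stuck_at)  # the trainee repeats "anastomosis" until the benchmark is met
```

The design mirrors the worksheet's rule that advancement is gated per level rather than by an overall average.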
| 5. Faculty development: |
| -Faculty development for simulation (to ensure expertise in use of simulation method), for example, in feedback, small-group facilitation, or other relevant teaching skills |
| -Total duration of training: |
| -Topics of training sessions: |
| -Number of sessions: |
| -Duration of each training session: |
| -Educational methods: |
| - |
| - |
| - |
| -Development: |
| 1. Assessment tools: |
| a. Pre- and posttests for cognitive component (pretest only needed for research trials) |
| b. For psychomotor and nontechnical skills, edit the training tool developed in Step 4 to develop the assessment tool: |
| i. Add task and subtask benchmark values (as set by experts) |
| ii. Include cells for total scores and global rating scores |
| iii. Include space for open-ended comments (helpful for formative assessment/feedback). |
| iv. Establish inter- and intra-rater reliability evidence for assessment tools (coefficients >0.80); consider internal structure (homogeneity) and alternate-form reliability evidence when appropriate. |
| v. Plan for other forms of evidence of validity when feasible (e.g., concurrent, predictive, or discriminant validity). |
| - |
| - |
| - |
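Item iv above sets a reliability threshold (coefficients >0.80). As an illustrative sketch only (not part of the published worksheet), inter-rater agreement between two raters scoring the same checklist items can be quantified with Cohen's kappa and compared against that threshold; the ratings below are hypothetical.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters
    who scored the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c]
                   for c in set(rater_a) | set(rater_b)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical "done"/"not done" checklist ratings from two raters
a = ["done", "done", "not", "done", "not", "done", "done", "not", "done", "done"]
b = ["done", "done", "not", "done", "done", "done", "done", "not", "done", "done"]
kappa = cohens_kappa(a, b)
print(round(kappa, 2), kappa >= 0.80)  # below the 0.80 threshold: keep rater training
```

With more than two raters or ordinal rating scales, Fleiss' kappa or an intraclass correlation coefficient would be the usual choices instead.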
| 2. Pass scores: |
| (As noted above, it is recommended that each learner achieve a 100% score before progressing to the next level.) |
| -Cognitive posttest: |
| -Psychomotor and nontechnical skills: |
| 3. Use: |
| -Formative assessment with feedback (informing the learner of errors committed during the training trial) until the benchmark value is achieved |
| -Summative assessment (final grade/certification of level of proficiency) |
| -Reassessment, often at 6–8 weeks, and retraining if necessary to ensure maintenance of proficiency. |
| 1. What questions are you trying to answer with your program evaluation? |
| - |
| - |
| - |
| 2. For learner perspectives on the curriculum, what questions are you trying to answer? Quality of faculty performance? Satisfaction with the content of the course? Perceived effectiveness of the educational methods? Technical problems? Etc. What method will you use (e.g., questionnaire, focus group)? This type of evaluation usually uses a posttest-only design. |
| - |
| - |
| - |
| 3. For effectiveness of the course in achieving desired learner outcomes (often aggregates of individual assessments): |
| a. What is/are your evaluation design(s) (e.g., posttest only, pre-posttest, control/comparison group, randomization or not)? |
| - |
| - |
| - |
| b. Evaluation methods: these will usually be the methods used for individual assessments. Will this be supplemented by other methods, such as video-review? |
| - |
| - |
| - |
| 4. Revision/improvement of curriculum based on evaluation. How will you decide on revisions in the curriculum based upon evaluation results? |
| 5. Review/analysis of evaluation results. What is your plan for preparing and distributing evaluation reports? |
| - |
| - |
| - |
| Because simulation is a resource-intensive educational methodology, the curriculum developer wants to ensure that necessary support and adequate resources for the curriculum can be obtained, that requested resources are justified, and that they are efficiently used. Curricular plans may need to be adapted based on available resources and support. |
| 1. Political and administrative support: who are the stakeholders whose support you need (e.g., dean, department head)? How will you secure their support? |
| - |
| - |
| - |
| 2. Resources needed: e.g., personnel, time, facilities, equipment, funding |
| - |
| - |
| - |
| 3. Administration of curriculum: what needs to be done, e.g., developing and distributing schedules and reports, collecting information, collating data, communicating information to learners and faculty; who will be responsible for each task? |
| - |
| - |
| - |
| 4. Identification of barriers and solutions: |
| - |
| - |
| - |
| 5. Introduction of the curriculum (consider a pilot study first, then phasing in of the full curriculum): |
| - |
| - |
| - |
Characteristics of Survey Respondents
| Characteristics | n | (%) |
| Location | ||
| United States | 9 | (56%) |
| Saudi Arabia | 2 | (13%) |
| Denmark | 1 | (6%) |
| Netherlands | 1 | (6%) |
| South Africa | 1 | (6%) |
| South Korea | 1 | (6%) |
| United Kingdom | 1 | (6%) |
| Profession | ||
| Physician | 8 | (50%) |
| Nurse | 4 | (25%) |
| Physiotherapist | 1 | (6%) |
| Physician assistant | 1 | (6%) |
| Social work professor | 1 | (6%) |
| Education scientist | 1 | (6%) |
| Foci of work | ||
| Education | 15 | (94%) |
| Clinical care | 7 | (44%) |
| Research | 7 | (44%) |
| Leadership and administration | 8 | (50%) |
| Safety systems design and improvement | 1 | (6%) |
| Other (did not specify) | 1 | (6%) |
Percentages are rounded to the nearest whole number.
Multiple choices were allowed.
Use of the Stepwise Model in Course Design and Faculty Development (n = 16)
| Variables | Yes | Intend to use | No |
| Use in course design | 9 (56%) | 5 (31%) | 2 (13%) |
| Use in faculty development | 7 (44%) | 7 (44%) | 2 (13%) |
Percentages are rounded to the nearest whole number.
Total number of users of the model and worksheet among respondents is 10 of 16 (63%) because six of 10 users (60%) used them for both course design and faculty development.
Examples of the Courses That the Respondents Used the Model to Develop
| Learners | Courses |
| Undergraduate HPE students | |
| Year 1 and 2 medical students | Simulations for the preclinical (basic) sciences for the health professions |
| Year 1 nurse practitioner students | Emergency medicine rotator simulation course |
| Fourth-year medical students | |
| Graduate HPE students/trainees: | |
| Categorical emergency medicine residents, PGY1–5 | Emergency resident's simulation course |
| Medical residents | Thoracentesis |
| Paracentesis | |
| Surgical, anesthesiology, medical, family medicine residents | Central line insertion |
| Senior anesthesia trainees | Crisis resource management |
| Gastroenterology fellows | Fundamentals of gastrointestinal endoscopy |
| Master of medical education students | Module on simulation-based education |
| Attending physicians, practitioners and faculty | |
| Anesthetic practitioners and consultants | Crisis resource management |
| Attending emergency physicians | Emergency medicine attending maintenance of certification resuscitation course |
| Health professions education faculty | Procedural skills |
| Curriculum development of simulation-based courses | |
| Others | |
| Graduate social work students | Assessment and intervention skills in psychosocial oncology |
| Social workers | |
HPE, health professions education; PGY, postgraduate year.
Respondents' Level of Agreement on the Helpfulness of the Model and Worksheet in Guiding Curriculum Developers (n = 16)
| Item | Strongly Agree, n (%) | Agree, n (%) | Not Sure, n (%) | Disagree, n (%) | Strongly Disagree, n (%) |
| Model helps/guides curriculum developers to: | |||||
| 1. Use a systematic and comprehensive approach for curriculum development | 10 (63%) | 6 (38%) | 0 (0%) | 0 (0%) | 0 (0%) |
| 2. Integrate criteria of educational effectiveness of simulation into the curriculum development process | 8 (50%) | 7 (44%) | 1 (6%) | 0 (0%) | 0 (0%) |
| 3. Use general needs assessment and problem identification to better focus the course and target it to meet an educational goal | 9 (56%) | 7 (44%) | 0 (0%) | 0 (0%) | 0 (0%) |
| 4. Argue for the need for the course and gain educational management support | 5 (31%) | 6 (38%) | 5 (31%) | 0 (0%) | 0 (0%) |
| 5. Assure the inclusion of relevant cognitive background in the developed course | 5 (31%) | 9 (56%) | 2 (13%) | 0 (0%) | 0 (0%) |
| 6. Assure the inclusion of technical skills (e.g., joint injection, chest tube insertion) | 7 (44%) | 6 (38%) | 3 (19%) | 0 (0%) | 0 (0%) |
| 7. Assure the inclusion of nontechnical skills (e.g., communication skills, teamwork skills) | 5 (31%) | 5 (31%) | 6 (38%) | 0 (0%) | 0 (0%) |
| 8. Include common errors in simulation-based training and assessment | 4 (25%) | 8 (50%) | 4 (25%) | 0 (0%) | 0 (0%) |
| 9. Develop more objective assessment tools of learners' performance (e.g., checklists with clear outcome measures and metrics) | 5 (31%) | 10 (63%) | 1 (6%) | 0 (0%) | 0 (0%) |
| 10. Introduce the proficiency-based progression process | 5 (31%) | 9 (56%) | 2 (13%) | 0 (0%) | 0 (0%) |
| 11. Plan for course implementation as an essential step of curriculum development | 10 (63%) | 5 (31%) | 1 (6%) | 0 (0%) | 0 (0%) |
| 12. Enhance the validity of their developed simulation-based training course(s) | 6 (38%) | 9 (56%) | 1 (6%) | 0 (0%) | 0 (0%) |
Percentages are rounded to the nearest whole number.