Selma Omer¹, Sunhea Choi¹, Sarah Brien², Marcus Parry¹.
Abstract
BACKGROUND: For an increasingly busy and geographically dispersed faculty, the Faculty of Medicine at the University of Southampton, United Kingdom, developed a range of Web-based faculty development modules, based on Kolb's experiential learning cycle, to complement the faculty's face-to-face workshops.
Keywords: computer simulation; computer-assisted instruction; education, medical; models, educational; staff development
Year: 2017 PMID: 28954718 PMCID: PMC5637066 DOI: 10.2196/mededu.7939
Source DB: PubMed Journal: JMIR Med Educ ISSN: 2369-3762
Figure 1. The design of “The Role of the OSCE Examiner” Medical Education Staff Access (MEDUSA) module based on Kolb’s experiential learning cycle.
Figure 2. Screenshot from the “Have a go” activity in the module, showing a video of a student undergoing an OSCE and the online marking sheet that users complete, together with activity feedback comparing their score with those of expert examiners and peers.
Overall satisfaction ratings (evaluation data from seven Medical Education Staff Access (MEDUSA) modules: Assessment of Clinical Competence, Planning and Delivering Lectures, The Role of the OSCE Examiner, Giving Constructive Feedback, The Student Assistantship, Diversity, and From Classroom to Clinical Learning; satisfaction ratings were completed by 283 MEDUSA users).
| Item | Median rating (1-5 scale) | Rating ≥ 4ᵃ |
| --- | --- | --- |
| Amount of interaction | 4 | 86.7% (241/278) |
| Ease of navigation | 4 | 82.8% (231/279) |
| Maintenance of interest | 4 | 81.0% (226/279) |
| Meeting learning outcomes | 4 | 76.5% (216/282) |
| Overall structure | 4 | 84.9% (236/278) |
| Relevance | 4 | 87.8% (244/278) |
| Type of interaction | 4 | 83.9% (234/279) |
ᵃ% (n/N), where n is the number of ratings with a score of ≥ 4 and N is the total number of ratings (scored between 1 and 5) reported for that item.
Themes identified from the Medical Education Staff Access (MEDUSA) features that participants liked (qualitative data from a total of 368 comments reported by 225 participants who completed the evaluation survey).
| Themes | Codes |
| --- | --- |
| Content | Cases and examples used |
| | Opportunity to practice |
| | Feedback on activities |
| | Practical tips |
| | Key concepts and models |
| | Relevant, informative, and realistic |
| | Relates student and examiner views |
| | Thought provoking |
| | Resources and references |
| Delivery | Animations/video design |
| | Use of multimedia |
| | Ease of use, access, and navigation |
| Presentation | Engaging and interactive |
| | Appropriate length |
| | Clear |
| | Concise |
| | Simple language |
| | Organized |
Medical Education Staff Access (MEDUSA) features that were liked by users mapped to Kolb’s experiential learning cycle (qualitative data from a total of 368 comments reported by 225 participants who completed the inbuilt evaluation survey).
| Kolb’s experiential learning cycle | Instructional strategies | Technologies/design solutions | Sample quotes |
| --- | --- | --- | --- |
| Experience | Building understanding through an experience | Simulations to showcase an experience | “...good to see a video of an examination.” |
| | Engaging learners in meaningful and relevant tasks so they can apply knowledge in real-world situations | Videos and graphics to bring a case to life or to demonstrate a situation | “...simulated student.” |
| | | Case scenarios describing challenging situations | “...seeing student patient interaction.” |
| | | | “...real life example shown.” |
| Reflection | Promoting reflection | Reflection activities and thought-provoking questions | “...space to consider as well as do.” |
| | | | “...it made me think.” |
| | | | “...could relate to other examiners description of problems they encountered.” |
| Conceptualization | Generating new knowledge and concepts | Presenting knowledge and theoretical models through engaging animations | “...animation of Millar’s pyramid.” |
| | Promoting authentic learning tasks | Interactive case scenarios, video demonstrations for practical tips and guidance | “...examples of constructive feedback.” |
| | Supporting multiple perspectives | Videos showcasing different perspectives | “...trouble shooting strategies.” |
| | | | “...the views of a variety of very experienced lecturers on how to prepare for them and deal with stress involved.” |
| | | | “...good to get students views and experiences of feedback.” |
| | | | “...video recordings of difficult situations people have encountered.” |
| Experimentation | Testing concepts through active experimentation | Performing tasks and receiving feedback on task | “...being able to score an actual ACC and compare with peers and experienced examiners.” |
| | Promoting collaboration and encouraging dialogue between teachers and other learners | Discussion forum | “...opportunity to upload my lecture for review.” |
Figure 3. Kolb’s-cycle-based design framework applied in MEDUSA modules, showing instructional strategies and design solutions used (blue), learning management functions used to meet learner needs (brown), and reporting functions used to meet the needs of faculty developers/administrators (pink).
Ways in which users reported that Medical Education Staff Access (MEDUSA) modules would change their work as educators (qualitative data from a total of 189 comments reported by 174 participants who completed the inbuilt evaluation survey).
| Category | Description (percentage of comments) | Sample quotes | Kolb’s experiential learning cycle |
| --- | --- | --- | --- |
| Awareness | Raising awareness, reminding and reinforcing concepts (18%) | “Reinforced some things I knew but do not always focus on and a good opportunity to reflect on own skills and course design.” | Reflection |
| Knowledge | Gaining knowledge; improved understanding (17%) | “I feel more informed, and have a better idea of standard required.” | Conceptualization |
| Change | Changing practices—shifting in the focus or method (13%) | “Encouraged me to get students to discuss with each other their feedback after they get it and to offer more opportunity to discuss feedback they get on an assignment.” | Experimentation |
| Reflection | Making the user reflect on their practice (12%) | “I think it will help me to consider again how I present things to students, to enable as wide an inclusion as possible.” | Reflection |
| Performance | Building skill, improving performance (11%) | “It will improve how I deliver lectures and help me to keep my audience engaged throughout so that I can maximize how much the students get out of it.” | Experimentation |
| Confidence | Improving confidence (8%) | “I have more confidence that I’m on the right track!” | Conceptualization |
| Application | Applying learning into practice (8%) | “I took away some useful ideas to try out with my next student...” | Experimentation |