Charlotte E Eckhardt, Jadbinder Seehra, Stephen M Chadwick, Kim Voerman, Alex Landau, Fiona S Ryan, Padhraig S Fleming, Matthew Garrett, Martyn T Cobourne.
Abstract
Introduction: The Royal College of Surgeons of England (RCSEng) and the Royal College of Physicians and Surgeons of Glasgow (RCPSG) offer the bi-collegiate Membership in Orthodontics (MOrth) examination, a summative assessment of specialist knowledge, skill and behaviour in orthodontics. The COVID-19 pandemic has had a profound global effect on almost every facet of normal life, including the conduct of face-to-face examinations. We describe the development, implementation and feedback for the bi-collegiate MOrth Part 2 examination delivered remotely by RCSEng/RCPSG to a cohort of candidates in September 2020.

Methods: Two anonymised online surveys (Google Forms) were distributed via electronic mail following completion of the examination diet. Forty-two candidates were sent a survey covering four domains and comprising a total of 31 questions; the 20 examiners were sent a survey containing eight questions. In both surveys, free-text responses were also collected. A rating system was used to categorise responses, and all survey responses were summarised in an online data collection sheet.

Results: The response rate was 78.5% (33/42) for candidates and 75% (15/20) for examiners. Overall, candidates gave favourable responses in relation to all sections of the assessment, with the majority (mean 79.8%; range 75.8-81.9%) reporting that the online examination format worked well. Examiners were equally favourable: notably, 80% felt that the online examination style did not affect the mark a candidate would receive, and 100% were confident that the marks candidates received reflected their ability and were not affected by the online delivery of the assessment.

Conclusions: Feedback from both candidates and examiners on the remote online assessment of the bi-collegiate MOrth Part 2 was generally positive. Based on the survey responses, this format of high-stakes examination was acceptable to all stakeholders and demonstrated a high level of perceived validity and reliability in terms of content.
Year: 2021 PMID: 34686818 PMCID: PMC8531912 DOI: 10.1038/s41415-021-3535-5
Source DB: PubMed Journal: Br Dent J ISSN: 0007-0610 Impact factor: 2.727
Traditional and reframed formats of the Bi-MOrth examination. Prior to the pandemic, the MOrth examination took place over several days in a single examination centre, with candidates sitting the written paper and meeting examiners face to face during the presented cases, unseen cases and OSCE sections. Candidates would travel from across the UK and internationally to attend the examination.
| RCSEng and RCPSG intercollegiate membership examination in orthodontics (Bi-MOrth) | | | |
|---|---|---|---|
| Traditional Bi-MOrth format (face-to-face assessment) | Reframed Bi-MOrth format (remote assessment) | Design and adjustment | Domains assessed |
| Combined MCQ and SAQ written paper | SAQ written paper only | Existing SAQ and knowledge-based OSCE questions from the question bank were modified or used as a template for question design | Knowledge; application of knowledge |
| OSCE circuit: knowledge stations; communication (four stations with independent actors); practical stations | Knowledge stations re-purposed into SAQ written paper; communication station replaced by two communication scenarios; practical stations not included | Examiners acted the roles of actors in each communication scenario; all other components of the communication stations were maintained as per the face-to-face assessment | Assimilation of information; application of knowledge; communication |
| Presented cases | Maintained | Candidates submitted their cases electronically in advance; cases were accepted if they were nearing completion +/- incomplete records; if uncompleted cases were presented, the viva included questions and discussion concerning potential finishing procedures, approaches to analysis of outcomes using cephalometric superimposition and other techniques, and approaches to retention | Treatment planning; practical assimilation of information; application of knowledge; communication |
| Structured clinical reasoning/unseen cases: candidates provided with clinical photographs, representations of study models, radiographs and cephalometric tracings | Maintained | Physical models were not available for review and direct measurements on related models could not be undertaken; case templates included photographs of models and documentation of key inter-arch measurements; all other components of the structured clinical reasoning/unseen cases were maintained as per the face-to-face assessment | Assimilation of information; application of knowledge; communication; treatment planning |
| Key: MCQ = multiple-choice questions; SAQ = short-answer questions | | | |
Attributes assessed in the OSCE component of Part 2 Bi-MOrth
| Attributes assessed in OSCE | Re-purposed in reframed Part 2 Bi-MOrth |
|---|---|
| Good communication skills | Communication stations, unseen and presented cases |
| Ability to analyse and interpret diagnostic information and material | Assessed in the unseen cases, presented cases and SAQs |
| Demonstration of practical skills normally undertaken as part of clinical practice | Assessed in WBA (not formally assessed in overseas candidates) |
| Interpret and appraise data from publications | SAQ |
| Apply appropriate decision-making in clinical situations | Unseen cases, presented cases and SAQs |
| Key: SAQ = short-answer questions; WBA = workplace-based assessments | |
Percentage of candidate responses for the SAQ section (n = 33)
| SAQ | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| The exam seemed fair | 21.2%* | 45.5%* | 27.3%** | 3.0%† | 3.0%† |
| Were the questions asked expected from what you had been taught? | 30.3%* | 51.5%* | 18.2%** | - | - |
| Enough time was given to complete the exam | 12.1%* | 24.2%* | 30.3%** | 15.2%† | 18.2%† |
| The online exam software was easy to use | 15.2%* | 33.3%* | 27.3%** | 12.1%† | 12.1%† |
| Did you feel dissatisfied with the question format? | 9.1%† | 12.1%† | 27.3%** | 39.4%* | 12.1%* |
| Did you feel the staff had properly briefed you on the exam format? | 36.4%* | 27.3%* | 24.2%** | 9.1%† | 3.0%† |
| Did you feel adequately supported by the staff during the exam? | 51.5%* | 24.2%* | 15.2%** | 6.1%† | 3.0%† |
| Key: * = favourable responses; ** = neutral responses; † = unfavourable responses | | | | | |
Percentage of candidate responses for the case presentations section (n = 33)
| Case presentations | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| The examiners seemed friendly | 39.4%* | 42.4%* | 9.1%** | 9.1%† | - |
| Were the questions from the examiners clear? | 33.3%* | 51.5%* | 6.1%** | 3.0%† | 6.1%† |
| Were you asked questions that you felt were relevant to your presented material? | 36.4%* | 60.6%* | - | 3.0%† | - |
| Enough time was given for the discussion | 24.2%* | 69.7%* | 3.0%** | - | 3.0%† |
| Did you feel comfortable during the discussion? | 30.3%* | 42.4%* | 18.2%** | 9.1%† | - |
| The sound quality was to a good standard | 33.3%* | 54.5%* | 12.1%** | - | - |
| The video quality was to a good standard | 39.4%* | 39.4%* | 15.2%** | 3.0%† | 3.0%† |
| The online examination format worked well | 36.4%* | 45.5%* | 9.1%** | 6.1%† | 3.0%† |
| Key: * = favourable responses; ** = neutral responses; † = unfavourable responses | | | | | |
Percentage of candidate responses for the communication section (n = 33)
| Communication | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| Did you understand the actor? | 42.4%* | 45.5%* | 6.1%** | 6.1%† | - |
| Were the questions from the actors clear? | 33.3%* | 54.5%* | 3.0%** | 9.1%† | - |
| Were you asked questions you felt were relevant to the presented material? | 24.2%* | 60.6%* | 15.2%** | - | - |
| Enough time was given for the discussion | 39.4%* | 51.5%* | 3.0%** | 6.1%† | - |
| Did you feel comfortable during the discussion? | 27.3%* | 48.5%* | 15.2%** | 3.0%† | 6.1%† |
| The sound quality was to a good standard | 33.3%* | 48.5%* | 15.2%** | 3.0%† | - |
| The video quality was to a good standard | 36.4%* | 45.5%* | 9.1%** | 6.1%† | 3.0%† |
| The online examination format worked well | 30.3%* | 45.5%* | 15.2%** | 3.0%† | 6.1%† |
| Key: * = favourable responses; ** = neutral responses; † = unfavourable responses | | | | | |
Percentage of candidate responses for the unseen cases section (n = 33)
| Unseen cases | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| The examiners seemed friendly | 27.3%* | 39.4%* | 24.2%** | 6.1%† | 3.0%† |
| Were the questions from the examiners clear? | 33.3%* | 48.5%* | 9.1%** | 9.1%† | - |
| Were you asked questions you felt were relevant to the presented material? | 33.3%* | 54.5%* | 12.1%** | - | - |
| Enough time was given for the discussion | 24.2%* | 54.5%* | 12.1%** | 9.1%† | - |
| Did you feel comfortable during the discussion? | 21.2%* | 42.4%* | 24.2%** | 12.1%† | - |
| The sound quality was to a good standard | 30.3%* | 54.5%* | 12.1%** | 3.0%† | - |
| The video quality was to a good standard | 24.2%* | 57.6%* | 9.1%** | 3.0%† | 6.1%† |
| The online examination format worked well | 33.3%* | 48.5%* | 15.2%** | 3.0%† | - |
| Key: * = favourable responses; ** = neutral responses; † = unfavourable responses | | | | | |
Thematic grouping of candidates' free responses (SAQ = short-answer questions)
| Theme | Responses |
|---|---|
| Technical issues with internet | 'My connection was disrupted, and I had to restart my laptop and phone twice' (SAQ) 'As long as a good internet connection it was good quality' (case presentation) 'Minor sound issues - had some feedback noise as I was talking' 'Sometimes I missed words that had been said due to internet quality, but I just asked for the question to be repeated or confirmed and therefore was not a major issue' 'Lost connection and had to relocate mid-discussion' 'I did have a short drop out internet connectivity on my end for one minute - it didn't really cause any major problems though' 'One of the examiner's video and sound froze during the exam, which was off-putting, but the other examiner dealt with the situation well' 'Video quality - I had an issue on the last day of the examinations and I was not able to see the examiners. I found this made the exam harder and it would have been nice to see the examiners. I appreciate that it isn't a fault of the examination and is a problem with the online format' |
| Software issues | 'Help from the proctor exam online messenger service was very prompt and reassuring when my page hadn't loaded' (SAQ) 'Unable to click directly to the question flagged' (SAQ) 'Unable to scroll through numbers, mine kept reverting back to screen share option so wasted time and increased stress' (SAQ) 'When examiners blurred their background, it was a little off-putting to only see half their head at times' 'I had a bit of a technical issue during one of my communication scenarios, which meant that the conversation was quite stilted' 'Needing to maximise and minimise the screen took up time during the preparation' (unseen cases) 'Some difficulties navigating the screen to see the case and examiners, but managed to overcome it in subsequent cases' (unseen cases) |
| Support from staff | 'The support up to the exam was good as this was all explained to us and having the testing sessions was great help to put us at ease' 'RCS staff were all very friendly and helpful' 'Examiners were very friendly and encouraging in general' 'RCS support were great and practice sessions prior were very helpful' |
| Overall | 'Well-run exam, very friendly and supportive administrative team, kept us informed of everything' 'Worked well' 'Felt strange but workable alternative given our circumstances' 'Overall, I am very happy that I got the opportunity to sit MOrth in September and appreciate how much work must have gone into setting this all up. Thank you' |
Percentage of examiners' responses (n = 15)
| Question/statement | Strongly agree | Agree | Neutral | Disagree | Strongly disagree |
|---|---|---|---|---|---|
| Enough time was given for the discussion element of the assessment | 80.0%* | 13.3%* | - | 6.7%† | - |
| The sound quality was to a good standard | 53.3%* | 40.0%* | - | 6.7%† | - |
| The video quality was to a good standard | 40.0%* | 53.3%* | - | 6.7%† | - |
| The online examination format worked well | 66.7%* | 26.7%* | 6.7%** | - | - |
| Did you feel that the online exam style affected the mark a candidate would receive? | - | 6.7%† | 13.3%** | 20.0%* | 60.0%* |
| Did you feel as if proper precautions were taken with the exam in response to the coronavirus pandemic? | 93.3%* | 6.7%* | - | - | - |
| Do you think a face-to-face examination would have been more advantageous for the candidate? | 6.7%† | 13.3%† | 20.0%** | 26.7%* | 33.3%* |
| I am confident that the marks the candidates received were a reflection of their ability and were not affected by the online delivery of the assessment | 66.7%* | 33.3%* | - | - | - |
| Key: * = favourable responses; ** = neutral responses; † = unfavourable responses | | | | | |
Thematic grouping of examiners' free responses
| Theme | Responses |
|---|---|
| Technical issues with internet | 'The most stressful part was getting online' 'A few candidates had connection difficulties' 'Sound and video quality were variable and not always consistent' |
| Support from staff | 'Occasional IT dropouts but this was very well managed by the examinations team' 'All worked very well with excellent back-up from the examination team' 'Very well organised. Smooth movement between rooms as the detailed timetable was very helpful' 'It ran better than I thought due to excellent team at RCS' |
| Overall | 'The exam was excellent under the circumstances. In-person assessment does, however, remain the gold standard with better interaction, responsiveness and lower risk of problems in normal times' 'The online exam was extremely well set up and managed. However, it was definitely not the same as a face-to-face exam. A lot of non-verbal cues were lost and this sometimes disrupted the flow of the vivas' 'Worked extremely well and allowed all aspects that would usually be examined to be so' 'Score collation was more difficult and time consuming' 'The fact we used formats already established by other examinations gave us a lot of confidence we could make it work for our assessment' 'A thoroughly well-run and equivalent exam process. Allowed for the full range usually examined to be assessed in a similar manner. A fair, well-run and robust examination that safe-guarded the standards of the IMOrth and put the candidates at the heart of the process. In a time of such uncertainty, it was a great achievement to be able to supply an appropriate examination to allow candidates to progress in their careers' 'I think we have been part of revolution in assessment; however, the future may involve the integration of the face-to-face assessment with the online assessment rather than the replacement of one form of assessment with the other' 'Fantastic job by all. The right thing to have done in the circumstances but I feel the exam lost the special sense of occasion due to the online format. I don't think this is something that could ever be recreated without a face-to-face format - ideally in the college' 'Ran really well' |