Samar A Ahmed1, Nagwa N Hegazy2, Archana Prabu Kumar3,4, Enjy Abouzeid5, Nourhan F Wasfy5, Komal Atta6, Doaa Wael7, Hossam Hamdy8.
Abstract
BACKGROUND: This is a practice guide for an evaluation tool created to objectively evaluate longitudinal faculty development programs (FDPs) using the "5×2-D backward planning faculty development model". The tool was needed because existing evaluation methods are designed for linear faculty development models with a specific endpoint. The backward planning approach, by contrast, is a cyclical model without an endpoint, consisting of 5 dynamic steps that are flexible and interchangeable. It can therefore serve as the basis for an evaluation tool that is objective and accounts for all domains of the FDP, in contrast to existing, traditional, linear evaluation tools, which focus on individual aspects of a program. The developed tool targets the evaluation of longitudinal faculty development programs regardless of how they were planned.
Keywords: Evaluation; Faculty development; Indicator
Year: 2022 PMID: 35248032 PMCID: PMC8898439 DOI: 10.1186/s12909-022-03208-x
Source DB: PubMed Journal: BMC Med Educ ISSN: 1472-6920 Impact factor: 2.463
Fig. 1: 5×2-D cycle Backward Planning Model
Delphi Scores in Rounds 1 and 2
| | Round 1 Delphi | | | Round 2 Delphi | | |
|---|---|---|---|---|---|---|
| Questions | Number of experts agreed on the question | Mean | Percentage of consensus | Number of experts agreed on the question | Mean | Percentage of consensus |
| Domain A | ||||||
| A1- Has the context of the training been well defined? a | 17 | 4.4 | 89.5 | 17 | 4.6 | 94.5 |
| A2- Is it mentioned in the faculty development program description? b | 14 | 4 | 73.7 | 16 | 4.2 | 88.8 |
| A3-Does the context identify the potential target audience? a | 17 | 4.3 | 89.5 | 16 | 4.6 | 89.8 |
| A4-Does the context identify the specific need or situation necessitating the training? a | 17 | 4.3 | 89.4 | 17 | 4.7 | 94.5 |
| A5-Does the context identify the physical attributes of the needed training? b | 15 | 4 | 79 | 16 | 4.4 | 88.9 |
| A6-Is the program aligned with emerging trends in faculty development like blended learning, online learning, competency-based education.... etc.? a | 17 | 4.5 | 89.5 | 18 | 4.7 | 100 |
| Domain B | ||||||
| B1-Are the faculty selected for the program identified? a | 16 | 4.4 | 84.2 | 15 | 4.2 | 83.3 |
| B2-Are the faculty selected for the program stratified according to their knowledge? a | 14 | 3.9 | 73.6 | 15 | 4.2 | 83.3 |
| B3-Are the faculty selected for the program stratified according to interest? a | 14 | 3.9 | 73.7 | 14 | 4.1 | 77.7 |
| B4-Are the faculty selected for the program homogenous in terms of knowledge and interest? b | 10 | 4 | 52.6 | 16 | 4.1 | 88.7 |
| B5- Is there a degree of heterogeneity employed in the selection of the trainees? d | | | | 16 | 4.1 | 88.7 |
| Domain C | ||||||
| C1-Have the trainee needs been studied? a | 17 | 4.5 | 89.5 | 17 | 4.5 | 94.5 |
| C2-Have the identified needs been prioritized? b | 16 | 4.3 | 84.2 | 17 | 4.5 | 94.5 |
| C3- Have the needs been reflected on the content or methods of training? b | 16 | 4.5 | 84.2 | 17 | 4.5 | 94.5 |
| C4- Have the institutional needs been studied? b | 16 | 4.3 | 84.2 | 18 | 4.6 | 100 |
| C5- Have the identified needs been prioritized? e | 16 | 4.3 | 84.3 | |||
| C6- Have the needs been reflected on the content or methods of training? e | 16 | 4.4 | 84.2 | |||
| Domain D | ||||||
| D1-Are there defined objectives for the training? a | 16 | 4.5 | 84.2 | 18 | 4.8 | 100 |
| D2-Are the objectives SMART? a | 16 | 4.3 | 84.2 | 18 | 4.8 | 100 |
| D3-Are the objectives aligned with any of the identified needs? a | 15 | 4.3 | 79 | 18 | 4.8 | 100 |
| D4- Are there objectives that deal with trainee soft skills? c | | | | 15 | 4.4 | 83.3 |
| Domain E | ||||||
| E1-Are there materials for the training? a | 15 | 4.2 | 79 | 18 | 4.6 | 100 |
| E2-Are the materials authentic? a | 15 | 4.1 | 78.9 | 17 | 4.5 | 94.4 |
| E3-Are the materials in proper format? a | 15 | 4.1 | 79 | 18 | 4.5 | 100 |
| E4-Are the materials adequate for the training content? a | 16 | 4.2 | 84.2 | 18 | 4.7 | 100 |
| Domain F | ||||||
| F1-Are the instruction methods planned? a | 16 | 4.3 | 84.2 | 18 | 4.6 | 100 |
| F2-Are there proper guides for instruction? a | 17 | 4.5 | 89.5 | 18 | 4.7 | 100 |
| F3-Are they suitable for the content/ objectives? c | 17 | 4.5 | 89.5 | 18 | 4.6 | 100 |
| F4-Are they suitable for the trainees? c | 16 | 4.1 | 84.2 | 18 | 4.5 | 100 |
| F5-Are they innovative? a | 14 | 3.9 | 73.6 | 16 | 4.6 | 88.9 |
| F6-Are they feasible? c | 17 | 4.5 | 89.5 | 18 | 4.7 | 100 |
| F7-Is the program longitudinal? a | 15 | 4 | 78.9 | 15 | 4.4 | 83.4 |
| Domain G | ||||||
| G1-Is there a proper structure to enable follow up of the learning? c | 15 | 3.9 | 78.9 | 16 | 4.4 | 88.9 |
| G2-Is this structure adequate to the objectives? c | 14 | 4.1 | 73.6 | 17 | 4.4 | 94.4 |
| G3-Is this structure known to everyone in the program (management, faculty, learners, administration)? c | 16 | 4.1 | 84.2 | 17 | 4.7 | 94.4 |
| G4-Are there proper follow up tools for the learning? e | 14 | 3.9 | 73.7 | |||
| G5-Have the program ILOs been reached? a | 16 | 4.4 | 84.2 | 18 | 4.7 | 100 |
| G6-Is there a method to assess the ILOs? a | 16 | 4.4 | 84.2 | 18 | 4.7 | 100 |
| G7-Is there a methodology to deal with the non-attaining learners? b | 15 | 3.9 | 78.9 | 15 | 4.3 | 83.3 |
| Domain H | Domains H, I, J & K were not included in Delphi round 2 | |||||
| H1-Is there a platform to allow for building the community? | 17 | 4.3 | 89.5 | |||
| H2-Is there time allocated in the program to allow for building the community? | 16 | 4.1 | 84.2 | |||
| H3-Are there designated activities to allow for building the community? | 16 | 4.3 | 84.2 | |||
| H4-Do trainees have enough knowledge of other trainees? | 16 | 4.1 | 84.2 | |||
| H5-Are there collaborative efforts between trainees? | 17 | 4.4 | 89.5 | |||
| H6-Are there enough collaborative project outcomes with trainees as project members (publications, conferences, workshops…etc.) | 16 | 4 | 84.2 | |||
| Domain I | ||||||
| I1- Has the program achieved growth over the years? (Number of attendees, learner satisfaction, learner attainment, measurable impact on teaching/ learning/ assessment…etc.) | 16 | 4.4 | 84.2 | |||
| I2- Are there established methods to measure the KPIs? | 15 | 4.2 | 79 | |||
| I3- Is there a dedicated team for measuring the KPIs? | 15 | 4.3 | 79 | |||
| I4- Is there enough data collected? | 15 | 4.2 | 78.9 | |||
| I5- Is the data properly analyzed? | 15 | 4.1 | 79 | |||
| I6- Is the information deduced from the data properly reported/ discussed? | 15 | 4.3 | 79 | |||
| I7-Are there corrective actions taken based on the information deduced? | 15 | 4.2 | 79 | |||
| Domain J | ||||||
| J1- Has the feedback improved over the years? (Student satisfaction/ faculty satisfaction/ student attainment) | 15 | 4.2 | 78.9 | |||
| J2- Are there established methods to measure the learner and trainer feedback? | 15 | 4.3 | 79 | |||
| J3- Is there a dedicated team for measuring the learner and trainer feedback? | 16 | 4.4 | 84.2 | |||
| J4- Is there enough data collected? | 15 | 4.1 | 78.9 | |||
| J5- Is the data properly analyzed? | 16 | 4.3 | 84.2 | |||
| J6- Is the information deduced from the data properly reported/ discussed? | 16 | 4.2 | 84.2 | |||
| J7- Are there corrective actions taken based on the information deduced? | 16 | 4.3 | 84.2 | |||
| Domain K | ||||||
| K1- Are there decisions and or practices signifying non-linear training plan methods? E.g. Revising content while directing the learning… etc. | 15 | 4 | 79 | |||
a Same in Rounds 1 and 2
b Reformulated after Delphi round 1
c Reformulated wording after Delphi round 1
d Newly added in Delphi round 2
e Discarded after Delphi round 1
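The consensus percentages in the Delphi table are simple proportions: the number of experts agreeing on a question divided by the panel size. A minimal sketch of that calculation, assuming panel sizes of 19 in round 1 and 18 in round 2 (inferred from the reported values, e.g. 17 of 19 yielding the 89.5% shown for A1, rather than stated in this excerpt):

```python
def consensus_percentage(agreed: int, panel_size: int) -> float:
    """Percentage of panel experts agreeing on a question, to one decimal place."""
    if panel_size <= 0:
        raise ValueError("panel_size must be positive")
    return round(100 * agreed / panel_size, 1)

# Assumed panel sizes: 19 experts in round 1, 18 in round 2.
print(consensus_percentage(17, 19))  # 89.5  (matches A1, round 1)
print(consensus_percentage(18, 18))  # 100.0 (matches A6, round 2)
```

Small discrepancies in the table (e.g. 94.5 vs. 17/18 = 94.4) likely reflect rounding choices in the original analysis.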
Evaluation guide for faculty development program in educational effectiveness
| Evaluation question | Indicators | Data Sources | Data Collection Method |
|---|---|---|---|
| Domain A: Context | | | |
| A1- Has the context of the training been well defined? | • Provide a description of the training context in printed and/or online format | • Program specification/ Faculty guide/ Brochures • Surveys/ Website | • Document review • Website Review • Survey Review |
| A2- Is the context described in the faculty development program description? | • Provide an orientation of the program context to the trainees | • Faculty guide/ Brochures/ Program specification/ | |
| A3-Does the context identify the potential target audience? | • The context is specifically designed with the target audience in mind. • There is a description of the intended target audience in the program specifications. • Percentage of trainees that see that the program meets their needs. | • The FDP mission and vision and objective statements • Preamble of the course/ Program specs/ Brochures/ Faculty guide | |
| A4-Does the context identify the specific need or situation necessitating the training? | • There is a description of the specific need or situation in the program specifications | • Survey for a needs assessment. | |
| A5-Does the context identify the place and time? | • Description of the place and the program’s timeframe in the program specification | • Preamble of the course/ Program specs/ Brochures/ Faculty guide | |
| Domain B: Faculty | |||
| B1-Are the faculty selected for the program identified? | • Presence of admission criteria with a clear description of the target audience | • Program specifications | • Document review • Surveys |
| B2-Are the faculty selected for the program stratified according to their knowledge? | • Presence of training program pre-requisite | • Faculty guide/ Brochures | |
| B3-Are the faculty selected for the program stratified according to interest? | • Survey the trainees and trainers’ interests upon admission/registration. | • Compare group allocation form with the registration forms | |
| B4- Is the selection of the trainees for the program homogenous in terms of knowledge and interest? | • Review attendance sheets (Registered Vs attended) | • Compare the attendance list and registration form | |
| B5- Is there a degree of heterogeneity employed in the selection of the trainees? | • Presence of training program pre-requisites indicating a wide range of variables (sex, race, country, specialty) | • Program Specifications | |
| Domain C: Needs | |||
| C1-Have the trainee needs been studied? | • Trainees’ knowledge gaps and training requirements were identified as per the literature review. • Percentage of trainees expressing willingness to attend FDP in the ‘identified topic’. • Percentage of trainees mentioning this topic in their personal development plan | • Relevant literature articles • Documentation of faculty needs assessment • Questionnaire • Faculty members’ personal development plans | • Document review • Review of media files • Survey • FGD / Interview with trainees |
| C2- Have the institutional needs been studied? | • Quality assurance report suggests that this topic needs improvement. • A review of the literature reveals that institutions need to train their faculty in the ‘identified domain’. • Leadership/ administrators/ curriculum committee/ medical education/ quality assurance believe that there is scope for improvement in the ‘identified domain’ and recommend FDP | • Quality / accreditation report • Documentation of relevant literature review/ Soft or hard copy of relevant journals • Documentation of institutional needs assessment questionnaire • Expressed oral and written opinions of leadership/ administrators/ curriculum committee/ medical education/ quality assurance • Documentation of ‘learner’ needs assessment questionnaire | |
| C3-Have the identified trainee and their institutional needs been prioritized? | • Percentage of dissatisfaction from trainees regarding this identified topic/domain. • Percentage of trainees and administrators who believe that these tasks/contents/training in the ‘identified topic’ should be given high priority | • Documentation of ‘prioritization’ based on the Data sources of C1 and C2/ FDP schedule/ Brochure. • Trainee and administrators’ feedback/satisfaction questionnaire | |
| C4-Have the identified trainee and their institutional needs been reflected on the content and methods of training? | • Percentage of identified trainee and institutional needs added as contents with appropriate tasks and methods for training in the FDP schedule. • The proportion of experts who agree that trainee and institutional needs have been reflected in the content and methods of training | • Teaching materials/ handouts • Recording of FGD with experts • External reviewer report | |
| Domain D: Objectives | |||
| D1-Are there defined objectives for the training? | • Expected outcomes/contents of the FDP are mentioned as well-defined objectives. • The proportion of experts who agree that objectives are well defined for the training. | • FDP schedule/ Brochure/ Reading materials / Handouts • Recording of FGD with experts/ External reviewer report. | • Document review • FGD with experts (comparison of FDP schedule with results of faculty needs assessment/ literature review/ institutional needs assessment) |
| D2-Are the objectives SMART? | • The proportion of experts who agree that objectives are specific, measurable, achievable, (or agreeable), realistic (or relevant) and time-bound, (or timely) • Percentage of program organizers who agree that the objectives were SMART. | • FDP schedule/ Brochure/ Reading materials / Handouts/ External reviewer report/ Recording of FGD with experts • Analysis of Feedback questionnaire. | |
| D3-Are the objectives aligned with any of the identified needs? | • Percentage of trainees/administrators who agree that identified objectives are aligned with either trainee or their institutional needs. • The proportion of experts who agree that trainee and institutional needs have been reflected in the content and methods of training | • Trainee and administrator questionnaire with analysis reports • FDP schedule/ Brochure/ Documentation of faculty needs assessment questionnaire with analysis/ Documentation of institutional needs assessment/ Recording of FGD with experts/ Inter-rater analysis of experts. | |
| D4- Are there objectives that deal with trainee soft skills? | • Percentage of identified objectives that are dedicated to soft skills of the trainees (under regular circumstances) • Percentage of adapted objectives that deal with trainee soft skills (under special circumstances) | • Analysis of surveys from trainees/ resource faculty/ administration • FDP schedule with contents/ Teaching materials/ handouts • Analysis of expert opinion | |
| Domain E: Materials | |||
| E1-Are there materials for the training? | • There is the availability of pre-reading materials, timetables and schedules are provided. | • FDP content/ Lesson outlines/ Brochures/ Timetables | • Interview • Document review • Survey |
| E2-Are the materials authentic? | • Materials are tailored to the institution’s and trainees’ demands. • Materials are suitable to the context of the institute, culture, and country. | • Literature review (on authentic resource material)/ Guidelines of the institute/ Needs assessment report • FDP program content | |
| E3-Are the materials in proper format? | • The program is well structured with proper learning objectives and timelines. | • Lesson outlines/ Study guides/ Trainee interviews/ Guideline from literature review/accreditation bodies | |
| E4-Are the materials adequate for the training content? | • Materials are found sufficient to cover the domain of FDP e.g., Teaching and learning /Leadership/ Workplace-based assessment etc. | • FGD of facilitators • External reviewer report/ End of program trainee survey/ End of program trainer survey | |
| Domain F: Methods | |||
| F1-Are the instruction methods planned? | • Instruction methods are well described. | • Lesson plans/ Questionnaires to the trainees/ Peer observation/ Trainee interviews • FDP program syllabus | • Document review • Survey • Observation • Digital data review • Interviews |
| F2-Are there proper guides for instruction? | • There is a document guiding students about the outline of the instruction. | • Lesson outlines/ Study guides | |
| F3-Are the instruction methods suitable for the content and objectives? | • There is a variety of instruction material that delivers the content most efficiently in the opinion of experts, trainers and trainees | • Lesson outlines/ Study guides | |
| F4-Are the instruction methods suitable for the trainees? | • Percentage of trainees who pass the attainment level of the program. • More than 70% of the trainees are satisfied with the instruction methods | • Student assessment/ assignment results • Trainees’ expectations collected at the beginning of the session and matched with the objectives detailed throughout the session (surveys/ discussions/ teaching-learning conversations) • A study of constructive alignment across planned, delivered and assessed material • Student end-of-program reports • Student satisfaction surveys | |
| F5-Are there innovative instruction methods in the program? | • Innovative methods such as different approaches like gamification, TBL, role play, Case-based learning etc. are present. | • FDP brochure/ promotion from the institute / Software used. • Interview the participants. • Comparison study of innovation and previous program methodology • Brainstorming and group discussion | |
| F6-Are the instruction methods feasible? | • Instruction methods are found feasible by external reviewers. • Percentage of instruction methods reported that are performed | • Report of external reviewers/ • Trainee and trainer feedback | |
| F7-Is the program longitudinal? | • The program runs longitudinally for more than 3 months with an opportunity for self-study and structured assignments. | • Syllabus/ Faculty guides | |
| Domain G: Learning oversight | |||
| G1- Is there a functional process to enable follow up of the learning? | • Percentage of the trainees passing the formative assessment. • Two-three formative assessment exams are conducted each module. • Improvement of the student performance • Trainee reflections are collected at fixed intervals | • Records of the training sessions • Reflection reports • Mentor report and self-assessment report • Pre and post-test results | • Document review • Surveys • Observation • Statistical analysis • Assessor evaluation checklist • Questionnaire • Focus group • website review |
| G2-Is this mechanism adequate to the objectives? | • Trainees’ perception of the concepts indicates that the mechanism is adequate. • Percentage of the non-attaining trainees diagnosed annually. • Percentage of the procedural defects detected by this mechanism. | • Trainees’ feedback • Audit report | |
| G3-Is this mechanism known to everyone in the program (management, faculty, learners, administration)? | • Percentage of trainees and administrators who received the announcement and program details. • Percentage of the student accessing the website and knowing the mechanism. • Percentage of students’ satisfaction with the mechanism. • Use of all the available communication channels emails, brochures, social media platforms. | • Emails and brochures • Website metrics • Questionnaire results • Emails, brochures, social media groups | |
| G4-Are there functional measurement tools to evaluate the learning and skill acquisition? | • There are differentiating assessment tools to assess learning | • Ensure the validity and reliability (Psychometrics measures) through: • Multiple tools • Multiple occasions • Multiple assessors (external assessors) | |
| G5-Have the program ILOs been reached? | • The student success rate in assessments and post-tests • Student satisfaction feedback questionnaires and percentage of students agreeing that the ILOs have been achieved. | • Post evaluation quiz Statistical analysis report • Questionnaire results | |
| G6-Is there a method to assess the ILOs? | • There is a program post-test or program evaluation that demonstrates learners’ achievement | • Post-test results • Program Evaluation report | |
| G7-Is there a methodology to deal with the non-attaining learners? | • Percentage of the non-attaining learners that have undergone a remedial procedure. • Percentage of the trainees informed and aware of the remedial policy • An authorized policy is announced to the trainees • Frequency of evaluation measures to detect the non-attaining learners | • Mark list • Learner feedback • PDF brochure/ website • Evaluation reports | |
| Domain H: Community of practice | |||
| H1-Is there a platform to allow for building the community? | • There is a platform that is user friendly, flexible and allows for communication between trainees. | • Platform dashboard • Trainee and trainer feedback | • Observation • Surveys • Document review • Digital review |
| H2-Is there time allocated in the program to allow for building the community? | • Percentage of time allocated for activities established to promote community building | • Program specifications/ schedules | |
| H3-Are there designated activities to allow for building the community? | • Presence of activity moderators/ Facilitator to help them build the community | • Program report | |
| H4-Do trainees have enough knowledge of other trainees? | • Activities allocated for community building are innovative. • Availability of trainee information on platforms and/or in printed format | • Website | |
| H5-Are there collaborative efforts between trainees? | • Percentage of trainees that built a relationship with other trainees (Projects, publications, social media friendship or social activities). | • Survey • Publications • Social media • Project proposals | |
| H6-Are there enough collaborative project outcomes with trainees as project members (publications, conferences, workshops…etc.) | • The number of collaborative projects established between members in each group. • The number of joint activities between trainees yearly (conferences, publications etc.) • Impact evaluation of joint activities | • Surveys • Annual alumni reports • Impact evaluation report | |
| Domain I: KPI | |||
| I1- Has the program achieved growth over the years? (Number of attendees, learner satisfaction, learner attainment, measurable impact on teaching/ learning/ assessment…etc.) | • An annual increase in the number of trainees attending the program • An annual increase in the number of trainees applying to attend the program • Percentage increase in the number of trained trainees compared to non-trained faculty members annually • Trainee satisfaction: average trainee satisfaction rate with the activities of the training program on a five-point scale in the program evaluation survey • Trainee attainment: increase in the proportion of trainees who complete the program in minimum time; increase in the proportion of trainees passing the program annually; improvement in trainees’ post-program assessment scores over pre-program scores • Dropout rate per program • Number of complaints per year • Recommendation of the program • Measurable impact on teaching, learning and assessment: percentage of program graduates appointed to leadership positions; percentage of graduates promoted; improvement of graduates’ skills in the workplace | • An official document with the number of trainees entering the program annually • An official document with the number of trainees graduating from the program for one batch • FG recordings • Feedback from colleagues and students | • Observation • Self-assessment questionnaires • Trainee survey • FGD / Interview with trainees • Document review • Statistical data analysis |
| I2- Are there established methods to measure the KPIs? | • Valid and reliable established methods for measuring KPI • Timely and continuous measuring of the KPI. | • Evaluation Reports • Annual reports • Data collection tools | |
| I3- Is there a dedicated team for measuring the KPIs? | • A dedicated and professional team for measuring each of the KPI is appointed | • Appointment decree for the team | |
| I4- Is there enough data collected? | • Adequate data collection for measuring each of the KPIs | • Documents • Records • Statistical data | |
| I5- Is the data properly analyzed? | • Proper analysis of the data using suitable statistical methods for all KPIs | • Documents • Records • Statistical data | |
| I6- Is the information deduced from the data properly reported/ discussed? | • 80% of the information deduced from the data is properly reported/ discussed • Increase in the number of scientific council meetings that properly discuss the deduced information | • Meeting minutes of the scientific councils | |
| I7-Are there corrective actions taken based on the information deduced? | • Presence of proof of corrective action taken in response to assessment results. This can be a change in the scope, structure or content of the program. | • Program report | |
| Domain J: Feedback | |||
| J1- Has the feedback improved over the years? (Student satisfaction/ faculty satisfaction/ student attainment) | • An annual increase of 10% in the satisfaction rate of trainees, faculty and administration • A 10% improvement in the percentage of trainees who passed the course | • Surveys • FG and interviews • Post-training quizzes | • Focus groups • Interviews • Statistical analysis • Document review • Observation |
| J2- Are there established methods to measure the learner and trainer feedback? | • There are valid and reliable established methods for measuring feedback (end of program surveys, focus groups, reflection meetings) • Timely and continuous measuring of the feedback | • Report from external program reviewers • Data sets available from the feedback | |
| J3- Is there a dedicated team for measuring the learner and trainer feedback? | • A dedicated and professional team for measuring each of the feedback | • Appointment decree for the team | |
| J4- Is there enough data collected? | • There exists at least one type of data set for each KPI | • Data repositories for the program | |
| J5- Is the data properly analyzed? | • Data is analyzed using a well-established data analysis program | • Programs existing on the computers where data repositories are present • Data repository formats | |
| J6- Is the information deduced from the data properly reported/ discussed? | • The information deduced from the data is properly reported/ discussed in the relevant scientific committees | • Minutes of meetings of relevant scientific committees | |
| J7- Are there corrective actions taken based on the information deduced? | • At least one annual corrective action can be demonstrated | • Program report • Program specification of the upcoming training round | |