Mari van Wyk, Linda van Ryneveld.
Abstract
Academics often develop software for teaching and learning purposes with the best of intentions, only to be disappointed by the low acceptance rate of the software by their students once it is implemented. This study focuses on software that was designed to enable veterinary students to record their clinical skills. A pilot of the software clearly showed that the program had not been received as well as had been anticipated, and the researchers therefore used a group interview and a questionnaire with closed-ended and open-ended questions to obtain the students' feedback. The open-ended questions were analysed with conceptual content analysis, and themes were identified. Students made valuable suggestions about what they regarded as important considerations when a new software program is introduced. The most important lesson learnt was that students cannot always predict their needs accurately when they are asked for input prior to the development of software. For that reason, student input should be obtained on a continuous and regular basis throughout the design and development phases.
Keywords: Student voice; ease of use; educational software; software development; user experience; veterinary education
Year: 2017 PMID: 28678678 PMCID: PMC5508648 DOI: 10.1080/10872981.2017.1345575
Source DB: PubMed Journal: Med Educ Online ISSN: 1087-2981
Figure 1. Level of ease of the login process.
Figure 2. Attempts to log procedures.
Figure 3. Ease of use relating to the logging of procedures.
Figure 4. Progress bar as motivator.
Figure 5. Devices used to log procedures.
Figure 6. Usefulness of the training session and the online manual.