
A survey tool for assessing student expectations early in a semester.

Karl R B Schmitt1, Elise A Larsen1, Matthew Miller1, Abdel-Hameed A Badawy2, Mara Dougherty1, Artesha Taylor Sharma1, Katie Hrapczynski1, Andrea Andrew1, Breanne Robertson1, Alexis Williams1, Sabrina Kramer1, Spencer Benson3.   

Abstract

Year:  2013        PMID: 24358393      PMCID: PMC3867767          DOI: 10.1128/jmbe.v14i2.581

Source DB:  PubMed          Journal:  J Microbiol Biol Educ        ISSN: 1935-7877



INTRODUCTION

Quality learning is fostered when faculty members are aware of and address student expectations for course learning activities and assessments. However, faculty often have difficulty identifying and addressing student expectations given variations in students’ backgrounds, experiences, and beliefs about education. Prior research has described significant discrepancies between student and faculty expectations that result from cultural backgrounds (1), technological expertise (2), and ‘teaching dimensions’ as described by Trudeau and Barnes (4). Such studies illustrate the need for tools to identify and index student expectations, which can be used to facilitate a dialogue between instructor and students. Here we present the results of our work to develop, refine, and deploy such a tool.

PROCEDURE

Tool development

In developing the student expectations assessment survey tool, we focused on two objectives: 1) to optimize the assessment tool’s length and 2) to make the tool applicable to a variety of course types. In optimizing the length, our goal was to provide sufficient information to faculty without being burdensome to students or faculty. With this in mind, we developed a pilot survey that collects basic demographic data (e.g., course, college, student year), five questions to aid the instructor in making decisions about classroom time, assignments, and student interactions, and three questions asking students to rank various course components. Specifically, we identified five pedagogical and learning components that are addressed by the survey: technology use, learning assessments, learning activities, faculty-student interactions, and timeliness of an instructor’s actions (Table 1). These components were assessed by having students select item(s) from a predetermined set of answers (Table 1). In addition, we asked students to rank the value of the various course components with respect to their learning. The specific elements included in the list were carefully chosen to address our second objective, with the understanding that some aspects of the tool would not be applicable to every class.
TABLE 1.

Primary survey component summary.

Component | Questions | Selections
Technology | Which of the following do you expect in this course? Rank the three most important components in this course for your learning. | Clickers; Electronic learning management systems; E-textbooks; PowerPoint; Social media
Learning activities | Which of the following do you expect in this course? Rank the three most important components in this course for your learning. | Chalkboard/whiteboard; Demonstrations; In-class discussions; Non-textbook readings; Small discussion groups; Textbooks
Learning assessments | Which of the following do you expect in this course? Rank the three most important components in this course for your learning. | Class participation points; Essay-based exams; Group projects; Homework; Individual projects; Multiple-choice exams; Written papers
Faculty-student interactions | Which of the following do you expect from the instructor of this course? | Hold office hours; Interact with students in class; Be accessible outside office hours; Know students’ names; Other; None of the above
Timeliness of action | How soon do you expect your instructor to: respond to email, post grades, return assignments, be available to meet with you, respond to phone calls? | Immediately; Within 24 hours; Within 2 days; Within a week; Never; NA

Tool refinement

In the spring 2012 semester, we piloted the survey tool and collected 816 responses from undergraduates in 25 STEM courses at the University of Maryland (UMD). We then refined the survey tool based on the pilot results and faculty feedback. Specifically, we clarified the wording of several questions and made minor changes to the available response options. For example, in the survey question related to the timeliness of an instructor’s actions, we added a new category, “longer than a week,” to address the gap between “within a week” and “never” in the options originally provided. The refined survey tool consists of three demographic questions and six teaching-related questions (Appendix 1). It has been distributed for implementation across the UMD campus community in a format that can be easily customized for a given class to better suit individual instructors’ needs.

DISCUSSION

Assessing student expectations is appealing at a conceptual level, but can it succeed in shaping or evaluating practices in a course? Our pilot survey provided instructors of 25 courses with constructive information on student expectations. As an example, we received 167 responses from a sophomore-level General Microbiology course, indicating which technologies, activities, and assessments students expected in the course. In addition, students were asked to identify the three classroom components they valued most for their learning. The data shown in Figure 1 were collected from microbiology students after they had received the syllabus and the class had met for several weeks. Even after having seen the syllabus and attended class, no survey element was expected or not expected by 100% of the students, indicating that a significant number of students were unclear about, or unable to recall, parts of the course. As a general trend, students placed greater value on learning tools available to them during their independent study time, such as study guides and textbooks, while discounting the value of in-class activities (such as discussion groups and in-class participation) for learning. In particular, the majority of students expected classroom response systems (“clickers”) to be used in class, but few placed any value on clickers for learning. This is contrary to data showing the effectiveness of clickers for learning (3) and suggests there is an expectation gap between faculty and students. Student learning may benefit from bridging this gap by providing students with information about how learning activities such as the use of clickers can help them reach their learning goals. Instructors may also use this survey to assess the potential impact of any changes they are considering in the course by using the tool in a longitudinal fashion.
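The expectation-versus-value comparison behind Figure 1 amounts to a simple tabulation: for each course component, compute the percentage of respondents who expected it and the percentage who listed it among their most valued. The sketch below illustrates that tabulation with hypothetical response data; the component names and the `component_percentages` helper are our own illustration, not part of the published survey instrument.

```python
from collections import Counter

# Hypothetical responses: each student marks the components they expect
# in the course and ranks up to three as most valued for their learning.
responses = [
    {"expects": {"Clickers", "PowerPoint", "Textbooks"}, "values": {"Textbooks"}},
    {"expects": {"Clickers", "Textbooks"}, "values": {"Textbooks", "PowerPoint"}},
    {"expects": {"PowerPoint"}, "values": {"In-class discussions"}},
]

def component_percentages(responses):
    """Return {component: (% of students expecting it, % valuing it)}."""
    n = len(responses)
    expected = Counter(c for r in responses for c in r["expects"])
    valued = Counter(c for r in responses for c in r["values"])
    components = set(expected) | set(valued)
    return {c: (100 * expected[c] / n, 100 * valued[c] / n) for c in components}

for comp, (exp_pct, val_pct) in sorted(component_percentages(responses).items()):
    print(f"{comp}: expected by {exp_pct:.0f}%, valued by {val_pct:.0f}%")
```

With real data, a large gap between the two percentages for a component (as with clickers in Figure 1) flags a mismatch worth raising with the class.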
FIGURE 1.

Survey results for a 200-level General Microbiology class. The black bars show the percentage of students who expected the pedagogical tool to be used in the class. The gray bars show the percentage of students reporting that the tool was important (valued) for their learning.


CONCLUSION

In our pilot, we found this survey could provide useful information for faculty on what students expect and value in the classroom. The question of whether and how faculty might use the tool is fodder for future studies, the first step of which is to make the tool widely available—the purpose of this manuscript. The revised assessment tool is publicly available as a customizable survey for the entire instructional community at UMD through the UMD Qualtrics instance (http://www.cte.umd.edu/Resource/Surveys/) and is downloadable for any instructor (see Appendix 1). We believe that faculty who use this tool in the first week or two of a class will be better able to identify and address misconceptions students might have about what will occur in a course, even after the syllabus has been distributed. As one faculty member responded on the feedback form, “I thought this survey was great at getting a cross-section of what my students expected from the class. I was surprised at some of the expectations.” In addition, the tool can be used to help students better appreciate the importance of learning tools and activities. It can also provide data for widespread analysis on what types of resources should be available to faculty (e.g., e-textbooks or demonstration materials). This tool provides instructors an opportunity to improve classroom learning—by engaging with students about what they expect, and starting a dialogue to better address these expectations.

Appendix 1: Survey tool
References in this record:

1. Foral PA, Turner PD, Monaghan MS, Walters RW, Merkel JJ, Lipschultz JH, Lenz TL. Faculty and student expectations and perceptions of e-mail communication in a campus and distance doctor of pharmacy program. Am J Pharm Educ. 2010.
2. Suchman E, Uchiyama K, Smith R, Bender K. Evaluating the impact of a classroom response system in a microbiology course. Microbiol Educ. 2006.
