
Uncovering Barriers to Teaching Assistants (TAs) Implementing Inquiry Teaching: Inconsistent Facilitation Techniques, Student Resistance, and Reluctance to Share Control over Learning with Students.

Cara Gormally, Carol Subiño Sullivan, Nadia Szeinbaum

Abstract

Inquiry-based teaching approaches are increasingly being adopted in biology laboratories. Yet teaching assistants (TAs), often novice teachers, teach the majority of laboratory courses in US research universities. This study analyzed the perspectives of TAs and their students and used classroom observations to uncover challenges faced by TAs during their first year of inquiry-based teaching. Our study revealed three insights about barriers to effective inquiry teaching practices: 1) TAs lack sufficient facilitation skills; 2) TAs struggle to share control over learning with students as they reconcile long-standing teaching beliefs with newly learned approaches, consequently undermining their fledgling ability to use inquiry approaches; and 3) student evaluations reinforce teacher-centered behaviors as TAs receive positive feedback conflicting with inquiry approaches. We make recommendations, including changing instructional feedback to focus on learner-centered teaching practices. We urge TA mentors to engage TAs in discussions to uncover teaching beliefs underlying teaching choices and support TAs through targeted feedback and practice.


Year:  2016        PMID: 27158302      PMCID: PMC4858357          DOI: 10.1128/jmbe.v17i2.1038

Source DB:  PubMed          Journal:  J Microbiol Biol Educ        ISSN: 1935-7877


INTRODUCTION

Many college laboratory courses have adopted inquiry-based learning, designed to mimic scientists’ practices; students learn problem-solving by testing hypotheses (1, 20). Benefits include improved science process skills (3), reasoning (2), and engagement (14, 10). However, learning gains depend on skilled execution of inquiry teaching practices. Teaching assistants (TAs), often new to inquiry approaches, teach 90% of biology laboratories (26) and need coaching to develop these skills. Yet we know little about the challenges TAs face in implementing inquiry teaching practices, so predicting effective coaching approaches is difficult.

Inquiry-based teaching challenges TAs because the instructor’s role shifts from information transmitter to facilitator. Effective inquiry facilitation requires TAs to execute numerous skills simultaneously (12, 35). TAs need to use questioning to unlock students’ curiosity, deciding which intervention is most appropriate in the moment (35). However, didactic teaching beliefs may affect TAs’ teaching practices; for example, TAs may expect inquiry learning to frustrate students (6, 11). Thus, TAs may believe that it is their duty to provide students with correct, easy-to-understand information (11) or to tell students exactly what to do (15). As a result, students are prevented from wrestling with discovering what to do, the hallmark of inquiry-based learning.

We hypothesized that TAs’ beliefs create contradictions between what inquiry ideally should be and how TAs implement it within the real constraints of the classroom. Literature in the K–12 context has shown that teaching beliefs and views about science play a critical role in instructional decision-making (9, 33). Beliefs are deeply rooted, as “preservice teachers are insiders,” acculturated through their own education (22).
Consequently, if TAs’ beliefs are not aligned with inquiry teaching, even extensive pedagogical preparation may not translate into effective inquiry teaching (8, 9, 13, 18, 21, 31, 34). TAs and K–12 teachers differ in important ways (e.g., motivation, teacher identity, research experience, pedagogical content knowledge, institutional culture, mandatory curriculum), and investigating how TAs implement inquiry is therefore necessary (33). To uncover TAs’ beliefs, our study was guided by the following research question: What challenges do TAs encounter in the process of learning to teach science as inquiry? We used multiple data sources to analyze TAs’ teaching practices and beliefs qualitatively throughout their first year of teaching.

METHODS

Study context and participants

The study, approved by the Institutional Review Board (Protocol Number H12281), was conducted at a US public research university with high research activity (5). All new biology TAs complete a course (“TA Prep course”) about teaching strategies, including a unit about inquiry teaching (29). In fall 2012, TAs in the TA Prep course were invited to participate in this study. Six TAs consented to participate and are identified with pseudonyms (Table 1).
TABLE 1

Background about the teaching assistant (TA) research participants, including their teaching assignments.

TA Pseudonym | Year in School | Nationality and Undergraduate Education | Gender | Lab Course Taught, Fall 2012 | Lab Course Taught, Spring 2013
Ana | 3rd-year undergraduate | US | Female | Organismal biology | Introductory biology
Bharat | 1st-year graduate student | India | Male | Organismal biology | Organismal biology
Daniel | 1st-year graduate student | US | Male | Introductory biology | Introductory biology
Elisa | 1st-year graduate student | US | Female | Introductory biology | Organismal biology
Gita | 1st-year graduate student | India | Female | Introductory biology | Organismal biology
Hai | 1st-year graduate student | China | Male | Introductory biology | Introductory biology
All TAs taught two sections of one biology laboratory course (Table 1), in pairs, and no study participants taught together. Students’ grades are based on carrying out their own experiments and reporting findings; grades do not depend on acquiring particular results. TAs were supported by weekly laboratory preparation meetings (“lab prep”). Lab prep sessions (taught by CG) focused on inquiry-based teaching strategies contextualized for each unit.

Data collection

Our qualitative study was exploratory and descriptive, using multiple sources of evidence. Our data captured three perspectives on TAs’ emerging inquiry teaching practices (Table 2): 1) Direct observation – to evaluate TAs’ inquiry teaching practices, we video-recorded each participant teaching once in each semester (fall 2012, spring 2013). 2) TAs’ perspectives – to capture TAs’ understanding of inquiry and their experiences using inquiry teaching practices, we administered surveys about their pedagogical knowledge and beliefs at the beginning and end of the TA Prep course (fall 2012), analyzed their class learning portfolios, and interviewed each TA at the end of their second semester teaching (spring 2013). 3) Student perspectives – to capture student perspectives, we analyzed responses to open-ended questions about teaching effectiveness on mid-semester and end-of-semester student evaluations (fall 2012, spring 2013). Together, these three perspectives provided insights about barriers to inquiry teaching.
TABLE 2

Timeline of data collection over the course of two semesters.

Semester | Pre-Semester | During Semester | Post-Semester
Fall 2012 | Knowledge & ATI Surveys | Classroom observation; mid-semester teaching evaluations | Knowledge & ATI Surveys; end-of-semester teaching evaluations; learning portfolios
Spring 2013 | — | Classroom observations; mid-semester teaching evaluations | End-of-semester teaching evaluations; interviews

ATI = Approaches to Teaching Inventory.

Direct observation of teaching practice

We used the Electronic Quality of Inquiry Protocol (EQUIP) to assess inquiry-based instruction (17). The EQUIP evaluates four categories related to inquiry: Instruction, Discourse, Assessment, and Curriculum, rated on scales from 1 (pre-inquiry) to 4 (exemplary inquiry). During fall 2012, CG video-recorded the classes; during spring 2013, all researchers video-recorded in rotating pairs (Appendix A, EQUIP scoring process).

TA perspectives

TAs completed two pre- and post-semester surveys: the Knowledge Survey and the Approaches to Teaching Inventory (ATI). The Knowledge Survey measured familiarity with teaching skills, including active learning, engaging explanations, and managing group work (Appendix B). TAs indicated their confidence in their understanding of each item on a three-point Likert scale adapted from Wirth and Perkins (36). To determine shifts in TAs’ beliefs about teaching, we compared changes in the 20 inquiry-focused items (Appendix B).

The ATI evaluates instructors’ approaches to teaching college science along two dimensions: a conceptual change/student-focused approach (CCSF) and an information transmission/teacher-focused approach (ITTF), each measured on a five-point Likert scale (1 = only rarely/never true; 5 = almost always/always true; 27). The CCSF-item and ITTF-item responses are summed separately and each sum is averaged, yielding the relative weight of each approach. We used the ATI to determine whether TAs shifted their beliefs about teaching (23, 28).

TAs assembled learning portfolios, selecting four learning activities (e.g., case studies, responses to reading questions, active learning exercises) to showcase TA Prep course experiences. TAs also wrote an essay explaining their portfolio selections and reflecting on their learning from the course and their first TA experience. We recorded the artifacts included in portfolios, interpreting an item’s inclusion as an indication of value, and calculated the ratio of each teaching skill’s representation in portfolios to its representation in the course. We conducted a qualitative analysis of the reflective essays. Because IRB restrictions prevented CSS from knowing the participants’ identities in fall 2012, we could not conduct interviews then.
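The ATI subscale scoring described above (sum each set of item responses, then average) can be sketched in a few lines. Note that the item numbers below are illustrative placeholders, not the actual ATI item assignments.

```python
# Sketch of ATI subscale scoring: average the Likert responses (1-5)
# within each subscale. Item numbers here are illustrative, not the
# real ATI item keys.
CCSF_ITEMS = (2, 4, 5, 8, 10, 12, 14, 16)   # conceptual change/student-focused
ITTF_ITEMS = (1, 3, 6, 7, 9, 11, 13, 15)    # information transmission/teacher-focused

def ati_scores(responses):
    """responses: dict mapping item number -> Likert rating (1-5).
    Returns (average CCSF score, average ITTF score)."""
    ccsf = sum(responses[i] for i in CCSF_ITEMS) / len(CCSF_ITEMS)
    ittf = sum(responses[i] for i in ITTF_ITEMS) / len(ITTF_ITEMS)
    return ccsf, ittf
```

An instructor whose averaged CCSF score exceeds the ITTF score would be characterized as relatively more student-focused, and vice versa.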
Reflection essays provide information about TAs’ perceptions of their learning about inquiry-based teaching (Appendix C, coding process, code book, and frequency counts). We conducted semi-structured interviews in rotating pairs to understand how TAs perceived changes in their teaching from the first to the second semester. We constructed the interview questionnaire to learn more about how TAs learn to teach inquiry (Appendix D, Interview Questionnaire). During interviews, participants viewed three video-clips: one of a peer teaching and two of themselves, one from each semester. We used EQUIP scores to select representative video-clips. First, we asked TAs to watch a clip of a peer teaching (another study participant) so that we could ascertain their understanding of inquiry from a more objective perspective. Next, we asked TAs to evaluate clips of their own teaching to gauge whether they recognized gaps in their inquiry teaching practices. Interviews were recorded, transcribed, and coded (Appendix E, coding process, code book, and frequency counts) (7).

Student perspectives

We coded students’ responses to two open-ended questions on mid-semester and end-of-semester evaluations: 1) “Please give a couple of examples of specific things your TA did that really helped you learn in biology lab,” and 2) “Please suggest one or two specific changes your TA could make that would help improve your learning in the lab.” We analyzed responses to identify what students valued and the pedagogical strategies they highlighted (Table 3). Mid-semester evaluations were likely to impact TAs’ teaching approaches since they read and addressed them during the semester (Appendix F, coding, code book, and frequency counts).
TABLE 3

Numbers of completed mid-semester and end-of-semester student evaluations of teaching for fall 2012 and spring 2013.

Teaching Assistant | Fall 2012 Mid-Semester % (Count) | Fall 2012 End-of-Semester % (Count) | Spring 2013 Mid-Semester % (Count) | Spring 2013 End-of-Semester % (Count)
Ana | 50.0% (14) | 50.0% (14) | 64.7% (22) | 61.8% (21)
Bharat | 63.2% (24) | 36.8% (14) | 95.7% (22) | 60.9% (14)
Daniel | 41.0% (16) | 38.5% (15) | 73.5% (25) | 55.9% (19)
Elisa | 26.7% (12) | 28.9% (13) | 71.4% (15) | 57.1% (12)
Gita | 54.3% (25) | 50.0% (23) | 80.0% (24) | 70.0% (21)
Hai | 59.6% (28) | 63.8% (30) | 70.4% (19) | 66.7% (18)
Total/Average | 49.1% (119) | 44.7% (109) | 75.9% (127) | 62.1% (105)

Ethics and credibility

We made efforts to ensure ethical, credible results. One study author (CSS) taught TA Prep but did not know participants’ names until after the course. A colleague recruited participants and collected informed consent forms, keeping participants’ names confidential until grades were submitted. The other two authors collected data during fall 2012; however, we did not analyze these data until the completion of TA Prep. To ensure study credibility, we triangulated multiple data sources. Prior to submitting the manuscript for publication, we asked two colleagues to critically review our work so that we could address potential concerns and incorporate their feedback. Finally, we shared our manuscript with the research participants to check their understanding of our conclusions and incorporate their feedback.

RESULTS

Cross-case comparisons provided evidence that TAs made some gains in their conceptual understanding of inquiry teaching, with five of the six gaining confidence in their conceptual understanding (Knowledge Survey, Fig. 1). TAs valued learning about inquiry (Fig. 2). All six TAs’ essays included comments that reflected perspective shifts about TAs’ and students’ roles (Table 4). For example, TAs realized how important it is not to give students the answers, seeing this as an opportunity for students to seek answers. However, the reflections also highlight how TAs initially approached inquiry-based teaching from teacher-centered perspectives, causing them to struggle. For example, Daniel feared losing control and credibility when shifting to being a discussion leader rather than a lecturer. TAs needed more practice to master these new roles. In her reflection essay, Gita wrote, “I need to improve a lot in this aspect as I tend to spill the answers, if I find the student is stuck.”
FIGURE 1

Comparison of pre and post scores from the Knowledge Survey (KS): inquiry-related items only. Teaching assistants (TAs) indicated their confidence for each item on a scale of 1 to 3, with 3 indicating the most confidence. The scores shown represent averages for the 20 inquiry-related items. The pre and post Knowledge Survey questions were identical. TAs completed the pre-test prior to beginning the TA Prep course and the post-test on the final day of the course (Appendix B, Knowledge Survey questions and instructions).

FIGURE 2

Learning Portfolio Artifact Analysis. We calculated the representation of assignment artifacts that teaching assistants (TAs) chose to include in their learning portfolios as compared with the actual representation of the items in the class. For example, while Active Learning was approximately 14% of the actual assignments offered during the class, TAs selected Active Learning assignments for 19% of the items in the learning portfolios. TAs chose to include more items about Inquiry, Active Learning, and Grading, indicating that TAs valued these content units. Alternatively, TAs rarely chose to include artifacts related to Policies/Professionalism.
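The comparison in this legend amounts to computing, for each content unit, its share of portfolio selections versus its share of course assignments. A minimal sketch, with made-up unit labels and counts chosen only to reproduce the 14%-vs-19% example:

```python
from collections import Counter

def representation(portfolio_items, course_items):
    """For each content unit, return (share of portfolio selections,
    share of course assignments). Inputs are lists of unit labels."""
    sel = Counter(portfolio_items)
    off = Counter(course_items)
    return {unit: (sel[unit] / len(portfolio_items),
                   off[unit] / len(course_items))
            for unit in off}
```

A unit whose portfolio share exceeds its course share (e.g., 19% vs. 14% for Active Learning) is one TAs selected disproportionately often, which we read as an indication of value.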

TABLE 4

Percent of inquiry-related statements in the learning portfolio reflection essays.

Teaching Assistant | % of Statements Related to Inquiry
Ana | 33.3%
Bharat | 20.0%
Daniel | 25.0%
Elisa | 16.7%
Gita | 14.3%
Hai | 20.0%
Average | 21.6%
Other data confirmed that TAs’ use and understanding of inquiry approaches were still developing. TAs made slight shifts toward both more student-centered and more teacher-centered beliefs (Fig. 3, ATI). This is unsurprising, as teachers new to inquiry often hold both teacher-centered and student-centered beliefs (16). TAs showed a developing level of inquiry in the first observations; however, we did not see improvement in the second observation (Fig. 4). These results support our hypothesis that there is a gap between TAs’ beliefs and their use of inquiry. Which barriers, then, held TAs back from making progress? Our analysis of the interviews and student evaluations elaborates descriptively on this question. Three themes emerged from the analysis, each described below (Table 5).
FIGURE 3

Average pre- and post-semester scores from the Approaches to Teaching Inventory (ATI) (27). The ATI measures items on two key dimensions of teaching on a five-point Likert scale (1 = only rarely/never true; 5 = almost always/always true): Information Transmission/Teacher-focused (y-axis) and Conceptual Change/Student-focused (x-axis). Each teaching assistant’s average score is shown, as well as the group average. Higher numbers indicate a stronger orientation toward the respective set of beliefs. Teaching assistants (TAs) made slight shifts, becoming slightly more learner-centered and slightly more teacher-centered in their beliefs.

FIGURE 4

Overall EQUIP scores from classroom observations made in fall 2012 and spring 2013. No score was available for Hai in fall 2012 due to poor video quality.

TABLE 5

Barriers identified through analysis and integrated subthemes.

Barrier | Source | Subthemes
TAs did not fully develop the facilitation skills required for inquiry teaching in their first year | Direct observation (EQUIP) | Guiding; experience; trusting students to think and answer questions; TA/student roles; role of information and information seeking; ensuring success; confidence; group participation; peer learning; communication: phrasing questions
TAs felt a responsibility to protect and control student learning experiences | Interview codes | Experience; trusting students to think and answer questions; authority; confidence; ensuring success; significance/big picture; role of failure; peer learning; time management impacts authenticity; role of creativity; student motivation; enjoying biology
Students explicitly or implicitly resist inquiry-based teaching | Interview codes | Time management impacts authenticity; experience; confidence
(same barrier) | Student evaluation codes | Positive feedback from students: pressure against inquiry; negative feedback: pressure against inquiry
TAs did not fully develop the facilitation skills required for inquiry teaching in their first year

In interviews, all TAs recognized the importance of all group members participating in designing experiments. Watching video-clips, TAs identified moments when another TA was not effectively promoting participation. In her interview, Elisa explained why group dynamics are important: “It is important because it is a proxy to let us know what the students understand … it brings more ideas to the table.”

Generally, TAs made few changes in their proficiency with inquiry approaches from fall 2012 to spring 2013. Frequently, we saw a mix of questioning strategies: failing to probe for student understanding during follow-up questioning, using numerous step-wise questions, or failing to understand the instructor’s role in a student-centered discussion, patterns consistent with an information-transmission teaching orientation (28). For example, during classroom observations Daniel used questioning to encourage genuine group discussion about how to improve their experimental design. However, when students asked him a question, Daniel often simply answered it.
TAs wrestled with one central question: how to avoid telling the students the answers. This struggle manifested in weak facilitation skills. For example, Bharat did not trust his students to problem-solve, while Daniel struggled with wanting to help students. In his interview, Bharat expressed needing to change in order to challenge students to take responsibility for learning: “I tend to include the answer in my question. I reframe it so that it’s easier for them to answer… that would be a leading question. I realized during this semester… that’s not a good way because they figure out ‘oh, this seems to be the answer.’”

TAs felt responsible for protecting and controlling student learning experiences

TAs struggled to allow students to experience the failure inherent in doing science, providing information and directions rather than giving students ownership of seeking information and making experimental decisions. Interviews revealed that TAs had difficulty reconciling doing science in the classroom as a student versus doing science in the laboratory as a researcher (cf. 30). Gita best articulated this difference between science in the classroom and the laboratory: “[In class] I write sources of error [on the lab report] and I’m done with this…it’s not affecting my grades, it’s not affecting me. But when I’m failing my experiment…and I give my lab report and don’t have controls, my boss says ‘how do I trust your experiments without controls?’ So that’s the impact you get there.”

TAs perceived different consequences of failure in scientific research versus failure as a student. In research, there are opportunities to conduct experiments repeatedly, depending on the cause of failure. For students, however, failure is often a summative measure of performance rather than feedback for further experimental iteration, since students typically have only one opportunity to conduct an experiment.
As a result, some TAs hesitated to allow students the full experience of learning from failure. For example, Daniel was concerned that failure would dampen students’ enjoyment of biology. He worked to protect students from failure, even though he acknowledged that he himself had enjoyed grappling with similar challenges as an undergraduate. Additionally, some TAs recognized the inherent time pressure in the classroom, leading them to devalue learning from mistakes. Instead, TAs pushed students to account for basic information, so that their experiments would be meaningful rather than derailed by “dumb mistakes” and procedural complications. TAs provided information in the form of leading questions rather than giving students time and ownership to seek out relevant information. At times, TAs tried to steer student experiments too strongly through over-involvement in student deliberations. Watching a video clip during the interviews, Daniel identified this problem: “I could have asked them a few questions and then stepped back and evaluate … I was too involved in the conversation.” Finally, some TAs made creativity a requirement. While promoting creativity sounds positive, it is not helpful if it does not push students to think deeply about biology or does not align with inquiry. For example, Elisa demanded that groups develop unique experiments that were not necessarily student-determined. Elisa elaborated on her motivations during her interview: “It makes it more fun for us to read rather than reading the same thing over and over. It is sort of selfish but it benefits them so when they talk to each other they get new ideas.” In reality, this creativity is teacher-centered; students are not empowered to make their own decisions. Instead of helping students experience being a scientist, TAs remained the information authorities.
Students explicitly or implicitly resist inquiry-based teaching

Student evaluations revealed a range of opinions about pedagogical approaches (Table 6, Table 7). Our analysis focused on comments pertaining to what students found helpful (or not) about how TAs’ explanations of biological concepts guided their inquiry-learning process. Effective explanations helped students base experimental designs on their understanding of biological concepts. Other comments described how TAs used questions to guide students in discovering answers. Sometimes students praised the way TAs allowed them to test out ideas. TAs might be more likely to make teaching decisions supporting inquiry learning when students respond positively to probing questions about the scientific reasoning behind their experimental designs (Table 7). However, TAs’ perceptions of student satisfaction may also discourage them from inquiry teaching practices.
TABLE 6

Student evaluation frequency counts.

Theme | Fall 2012 Mid-Semester Frequency (# TAs) | Fall 2012 End-of-Semester Frequency (# TAs) | Fall 2012 Total Code Counts | Spring 2013 Mid-Semester Frequency (# TAs) | Spring 2013 End-of-Semester Frequency (# TAs) | Spring 2013 Total Code Counts
Positive feedback from students: reinforces inquiry | 18 (5) | 8 (4) | 26 (6) | 26 (6) | 10 (5) | 36 (6)
Positive feedback from students: pressure against inquiry | 33 (6) | 9 (6) | 42 (6) | 25 (6) | 6 (3) | 31 (6)
Negative feedback from students: reinforces inquiry | 1 (1) | 1 (1) | 2 (2) | 2 (2) | 1 (1) | 3 (3)
Negative feedback from students: pressure against inquiry | 14 (5) | 4 (4) | 18 (6) | 3 (3) | 3 (2) | 6 (4)

TA = teaching assistant.

TABLE 7

Representative comments from student evaluations.

Positive feedback from students (reinforces inquiry):
“Our TA comes around and will discuss the design of our experiment and ask us why we made the decisions that we made and if we have scientific information that we used to make those decisions. Also, he will ask questions that will encourage us to think ahead to the next lab.” (Bharat, spring 2013 mid-semester)
“Able to involve the students effectively in group activities to help each other to learn key concepts.” (Hai, fall 2012 end of semester)
“Discusses different portions of the lab (i.e., abstract, figures, etc.) very well – goes step by step and goes into great detail; is good at asking directed questions rather than divulging the answer – makes the student work for the answer.” (Elisa, fall 2012 mid-semester)

Positive feedback from students (pressure against inquiry):
“She comes over at the start of each lab and asks us about our ideas, and then proceeds to explain any flaws or obstacles that we may encounter.” (Gita, spring 2013 mid-semester)
“Made it clear what he wanted us to do.” (Bharat, spring 2013 end of semester)
“He is always monitoring us so he helps us catch any mistakes we are doing in our experiments.” (Hai, fall 2012 mid-semester)

Negative feedback from students (reinforces inquiry):
“Maybe have some thought provoking questions for our lab groups to get started on the right track or to make sure our data is relevant. Or talk with our lab section about how what we are doing in our lab may be relevant in the real world.” (Gita, spring 2013 mid-semester)

Negative feedback from students (pressure against inquiry):
“He could speak less quickly, and seem like he is in less of a hurry. There are times where he asks a question with a seemingly obvious answer that nobody replies to – he could ask less of those, as it only seems awkward for all parties when there’s silence in reply.” (Hai, spring 2013 mid-semester)
“Ana could explain the experiment in more detail before we begin [sic] it, so that we could ask less questions about the experiment while doing it.” (Ana, fall 2012 mid-semester)
“Warn us against certain experiment constructions (for example, if it will be very difficult to statistically analyze).” (Bharat, fall 2012 mid-semester)
“The only suggestion would be to address in detail what possible errors may occur in an experiment before they actually happen. This way, the experimental design can be based around those factors and in turn, produce more efficient results.” (Daniel, spring 2013 mid-semester)

TA = teaching assistant.

In comments resisting inquiry, students wanted TAs to connect biological concepts to their experiments for them, rather than providing expansive explanations that required students to construct these connections. Positive comments resisting inquiry pressure TAs to provide explicit instructions so students do not have to ask so many questions. Importantly, students did not state outright that TAs should provide clear instructions about their experimental designs; rather, students pressured TAs more subtly to use teacher-centered approaches. Thus, TAs often faced positive pressure to adopt non-inquiry strategies (Table 6, Table 7).

Negative comments resisting inquiry revealed students’ desire for authoritative instruction (Table 7). In these comments, students appreciated when TAs: 1) provided clear, detailed steps for finishing the laboratory; 2) helped students avoid mistakes or meaningless data by monitoring groups closely and asking questions to identify potential experimental problems, sometimes even telling students how to avoid them; or 3) helped students learn the “best” way to do something. The second case is insidious: TAs appeared to guide students, as opposed to bluntly supplying answers, but their guiding directed students toward a set goal, limiting students’ decision-making.

Study limitations

Our study had a small sample size, which may amplify the impact of TAs’ personal backgrounds and teaching beliefs. However, the sample size is typical for qualitative studies (4, 9, 18, 30, 31, 33). We chose analysis techniques, mindful of study limitations, to leverage study strengths and minimize misleading readers or overextending findings. For example, we highlighted how common particular themes were, but we did not claim statistical significance for these findings. Another potential study limitation is that participants taught with co-TAs. However, this concern never arose in data sources, nor did we find evidence that co-TAs’ influences impacted TAs’ use of inquiry teaching strategies. Additionally, we conducted one classroom observation for each TA each semester. Additional observations may have provided a more representative picture of TAs’ use of inquiry.

DISCUSSION

We triangulated data sources to capture three perspectives on TAs’ emerging inquiry teaching practices. We identified gaps between TAs’ teaching beliefs and their implementation of inquiry. Findings reveal three persistent barriers to using inquiry. Here, we make specific recommendations for improving pedagogical development.

The first barrier is that TAs did not fully develop the facilitation skills required for inquiry teaching in their first year. TAs were beginning to develop as inquiry facilitators, but their teaching practices were not always aligned with established definitions of inquiry. This suggests that TAs hold misconceptions about inquiry, or have not yet developed a complete understanding of how to use it. Reviewing the manuscript, Bharat noted that there was a “big gap in what I thought I should do and what actually happened in class.” In their learning portfolio essays, TAs emphasized the value of practicing teaching. We recommend that TAs practice more and receive direct feedback about facilitating inquiry (e.g., 37). The EQUIP could be a tool to guide novice TAs in reflecting on inquiry teaching practices. Additionally, in reviewing the manuscript, Elisa suggested that “seeing how a pro does it might give context on how to teach in an inquiry style.”

A second barrier was that TAs felt a responsibility to protect and control student learning experiences. TAs described what they perceived as obstacles to enacting inquiry, including feeling responsible for protecting students from failure, for guaranteeing creativity, and for providing information rather than letting students seek it. Taken together, these obstacles reveal TAs’ reluctance to share control with students. This is evidenced by the ways in which TAs use their power: whether in teacher-centered ways, to maintain status as the “information authority,” or in learner-centered ways, empowering students to take responsibility for decision-making.
In reviewing the manuscript, Elisa noted that this mindset can be tough to change. In his review, Bharat noted that TAs’ diverse educational backgrounds influence their attitudes toward sharing power in the classroom. Further, students exert subtle pressure on TAs to “take control,” as others have also noted (25). Pedagogical development efforts need to examine TAs’ beliefs about control of learning. We suggest constructing discussion exercises around case studies or video clips to problematize teacher-centeredness, with the goal of uncovering the beliefs that underlie ineffective use of inquiry teaching practices. Furthermore, we recommend improving the curriculum itself. Gita’s description of failure in scientific research versus failure in an inquiry classroom offers a legitimate criticism of the authenticity of inquiry learning experiences: typically, students do not have opportunities to confront failure. We suggest providing opportunities for students to repeat an experiment to clarify or expand their results, much as a researcher would, including guided reflection about the role of failure in science.

A third barrier was positive student evaluation comments that reinforced non-inquiry practices. We argue that when students respond positively to didactic teaching approaches, TAs may be pressured away from inquiry approaches; this positive resistance feeds TAs’ teacher-centered beliefs. Other researchers implicitly acknowledge the power of positive resistance. For example, research about student evaluations in a university biology course includes a cautionary message “to organize classes so as to achieve a judicious balance of student-centered activities and presentation-style instruction” in order to receive more positive student reviews (32). Instead, we suggest that TAs be prepared to anticipate positive resistance.
Training to critically interpret student evaluations can help TAs balance their reactions and explain inquiry approaches to students. Furthermore, we recommend developing teaching evaluation instruments that measure inquiry-based teaching behaviors.

Student evaluations revealed the importance of TA approachability for creating an environment conducive to inquiry. An approachability mindset gives TAs a proactive strategy for their role rather than a focus on what to avoid. TAs could be coached to use approachability to engage students through actions that students interpret as helpful, including encouraging students to brainstorm with them; guiding students with questions that promote reflection and complex understanding; and crafting explanations that push students to connect their emerging understanding of biological concepts with their experiments. TAs become approachable in students’ eyes, and minimize student frustration, by discussing students’ ideas seriously rather than refusing to share answers. Leveraging an approachability mindset might help TAs navigate student expectations while persisting in creating inquiry-based learning experiences.

Recognizing that TAs struggled to implement inquiry, we were motivated to explore what deeper barriers might prevent its full realization. We found that TAs held unexplored, unchanged beliefs that remained barriers to inquiry even after training. Few guidelines are available for preparing TAs in higher education; this study provides insights to inform that gap. The literature on K–12 inquiry classrooms makes clear that inquiry teaching is challenging even for expert instructors (9). However, TAs differ significantly from K–12 teachers in teacher preparation, belief systems, and institutional context; most K–12 science teachers have no experience doing research (33), unlike most TAs.
These differences should be considered when adopting professional development models for TAs. Several avenues exist for investigating interventions that target the barriers identified in this study. Possible considerations include, for example: how interventions expose gaps between TAs’ teaching beliefs and practices; whether TAs are trained to use the EQUIP as a reflective tool to critique video-recorded classes; whether TAs are coached to share control over learning with students; whether TAs are trained to recognize positive pressure against inquiry, both from student evaluations and in class; and how each intervention improves TAs’ teaching decision-making.

Appendix 1: EQUIP scoring process
Appendix 2: Knowledge survey
Appendix 3: Reflection essay themes
Appendix 4: Interview questionnaire
Appendix 5: Interview themes
Appendix 6: Student evaluation themes
