
CREATE Two-Year/Four-Year Faculty Workshops: A Focus on Practice, Reflection, and Novel Curricular Design Leads to Diverse Gains for Faculty at Two-Year and Four-Year Institutions.

Sally G Hoskins1, Alan J Gottesman1, Kristy L Kenyon2.   

Abstract

Improving STEM education through the propagation of highly effective teaching strategies is a major goal of national reform movements. CREATE (Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment) is a transformative teaching and learning strategy grounded in evidence-based science pedagogy. CREATE courses promote both cognitive (e.g., critical thinking) and affective (e.g., attitudinal and epistemological) student gains in diverse settings. In this study, we look more deeply into the faculty development workshop used to disseminate CREATE pedagogy to instructors at two-year and four-year institutions. We hypothesized that an immersive experience would positively shift faculty participants' views on teaching/learning, build their understanding of CREATE pedagogy and develop their confidence for course implementation. Internal and external assessments indicate that faculty participants did achieve gains within the timeframe of the CREATE workshop. We discuss the workshop training outcomes in the context of designing effective dissemination models for innovative practices.


Year:  2017        PMID: 29854053      PMCID: PMC5976048          DOI: 10.1128/jmbe.v18i3.1365

Source DB:  PubMed          Journal:  J Microbiol Biol Educ        ISSN: 1935-7877


INTRODUCTION

Changing how science is taught and learned in college entails overcoming deep and often unhelpful traditions. Historically, graduate curricula in science, technology, engineering, and mathematics (STEM) fields have not included training in pedagogy. Many scientists thus begin teaching with little or no classroom experience, and often are unaware of core teaching and learning research (1–3). The need for systemic change in science teaching has been highlighted by STEM reform efforts (4, 5). Significant effort has focused on enhancing the training of science faculty in evidence-based teaching practices (6–11). Nevertheless, most undergraduates still experience traditional, teacher-focused science instruction (12). The single or multiday workshop has been a popular venue for training science faculty to become more effective teachers (8, 13, 14). Well-known examples include the Center for the Integration of Research, Teaching, and Learning (14), the Summer Institutes (SI) (15, 16), the Workshop for New Physics and Astronomy Faculty (17, 18), and the Faculty Institutes for Reforming Science Teaching (FIRST) (10, 11). Ideally, faculty development programs facilitate the learning of new teaching approaches, which participants then apply in their courses in ways that improve student learning. Recent studies have reported some gains made by science faculty participating in professional development workshops (e.g., STAR (8), FIRST IV (11)). Yet, despite its widespread use, the workshop model has been criticized for the lack of data demonstrating its efficacy in propagating meaningful change across STEM education (19–21). While participants may learn about innovative practices in workshops, their ability to successfully apply training has been questioned (22), and overall, there is a paucity of information on “downstream” outcomes in the classroom (11, 23).
Achieving the long-term goals of dissemination requires a deeper understanding of the factors that influence teaching practices. Acquiring knowledge of research-based instructional strategies may be a foundational step for developing teaching skills (22, 23), but personal beliefs and attitudes can dominate the decision-making process of faculty (24, 25). Efforts to train faculty through professional development workshops may be stymied if the training experiences do not foster changes in mindset or sufficiently prepare instructors for the realities of implementing new methods. Our firsthand experiences as participants suggested to us that it is rare for workshop designers to probe pre-workshop beliefs and expectations or examine how post-workshop views align with training. That is, given that participants likely seek such training because of a high interest in teaching and science-education issues, do their viewpoints nonetheless mature or change during the workshop? Do participants feel confident they “got what they came for” in terms of mastering the pedagogical innovations and instructional practices that were the focus of a given workshop? It may be that many workshop leaders investigate such issues through either internal or external assessments, but few studies report outcomes specific to the training period. Such information could be quite useful for those seeking to design effective faculty development experiences. In this study, we address the proximate effects of training faculty in CREATE pedagogy through a multiday, residential workshop. CREATE (Consider, Read, Elucidate hypotheses, Analyze data, Think of the next Experiment) uses intensive analysis of scientific literature as an inroad into building scientific thinking skills, improving attitudes toward science/scientists, and fostering deep understanding of the research process (26).
Built upon well-established principles from the learning sciences (e.g., constructivism and active engagement; see 27, 28), CREATE combines novel and adapted pedagogical tools (concept mapping, sketching, figure annotation, experimental design) and active engagement approaches (small group work, debates, and grant panels) that facilitate learning (29–35). CREATE courses have been shown to stimulate cognitive and affective gains in a wide range of students (9, 36–39). The CREATE strategy leverages the research expertise of instructors, allowing them to apply skills and knowledge in ways they may not typically bring to the classroom. For this study, we designed a faculty development model of dissemination with the goal of expanding the use of CREATE into new academic environments. Instructors from two-year and four-year institutions across the US participated in workshops of 4.5 days’ duration. This design allowed us to test the hypothesis that workshop training would positively change two-year and four-year faculty participants’ 1) beliefs about teaching/learning; 2) intended practices; and 3) self-rated understanding, skills, and attitudes specific to CREATE. The principal investigators (SH, KK) gauged the impacts of the workshops through internal assessments complemented by external evaluation by an independent outside evaluator (OE). Outcomes indicate that the workshops significantly shifted aspects of participants’ views of teaching, course design and curricula, and also led to significant shifts in their post-workshop pedagogical plans compared with pre-workshop approaches. Participants also reported gains in their self-assessed confidence, understanding and ability to teach with CREATE methods. Shifting beliefs, attitudes, and intended practices in the short-term may be critical for achieving long-term success with post-workshop CREATE implementation.

METHODS

Recruitment of faculty participants

We held four multi-day workshops at Hobart and William Smith Colleges (Geneva, NY) in June 2012 and June 2013. We worked with 33 two-year faculty from colleges across 12 states, 63 four-year faculty from colleges/universities across 30 states, and a single four-year Canadian participant. Faculty came from a wide range of institutions including large public universities, smaller liberal arts colleges, and two-year colleges (urban and rural). Sixty-four participants self-identified as female and 32 as male. (See Appendix 1 for details regarding recruitment, selection, and participant demographics.)

Workshop design

The workshops were designed to: 1) impart deep understanding of the principles underlying the CREATE strategy and the strategy itself; 2) provide direct experience with the CREATE toolkit through individual and group hands-on activities; 3) provide insight into the challenges of changing teaching styles, both for teachers and students; and 4) guide faculty in designing CREATE modules for their own courses. Table 1 provides the overall workshop schedule. Workshop design was conceived by the principal investigators (PIs). We were influenced by a pilot CREATE faculty development workshop (9), our teaching experiences in CREATE courses, and prior experiences in other professional development programs. Formative feedback from the outside evaluator (see below) allowed the PIs to further refine and adapt activities to address gaps or weak areas between workshops.
TABLE 1

CREATE faculty workshop: schedule and summary of activities.

Day 1
AM: Pre-Workshop Assessment. Mapping the Learning Environment (CREATE Toolkit Activity). Short Article Activity I (“Babies Recognize Faces” paper (a)): Consider / Read / Cartoon Experimental Design / Analyze Data.
PM: Short Article Activity I (cont’d): “Think of the Next Experiment.” Overview of CREATE: introduction to evidence-based methods and published studies. Experimental Design Activity: develop hypotheses and design experiments using the “Stoplight” Online Reaction Time Test (http://gywh.com/archives/online-reaction-time-test).
Evening: Homework: prepare presentations of “Stoplight” experiments; identify short articles for the “Teach It” Activity.

Day 2
AM: Participant presentations: “Stoplight” experiment proposals, followed by discussion of experiences as “students” and sharing of ideas for course adaptation. Short Article Activity II: Testing/Writing/Rat Empathy (b). Tools: cartoon study design / annotate figures / transform data.
PM: Design “Teach It” Activity: small groups select one short article to teach with the CREATE toolkit and design a short teaching unit using specific (assigned) tools. “Teach It” Activity I: participants teach units; a debrief discussion follows each unit (reflection by the “teacher,” followed by reflection by workshop participant “students”).
Evening: Discussion: Faculty Experiences in Discipline-Based Education Research.

Day 3
AM: Implementer Experiences with CREATE: previous workshop trainees share experiences with designing, implementing, and teaching with the strategy. “Designing CREATE Curricular Modules and Roadmaps”: discussion of finding scientific literature and other media sources; planning class sessions and applying CREATE pedagogy.
PM: Curricular Development Session: participants develop CREATE roadmaps and teaching materials for their courses; PIs provide individual feedback and assistance. “Teach It” Activity II: seek volunteers to develop a 45-minute session to teach part of a roadmap for a specific course; participants work with PIs to design mock sessions, develop pre-class tasks for participating “students,” and prepare teaching materials.
Evening: Participant “teachers” assign tasks to be completed by participant “students” for the “Teach It” sessions.

Day 4
AM/PM: “Teach It” Mock Class Sessions: individuals teach a 45-minute “CREATE class”; other participants act as “students” for each session (total = five student teaching sessions).

Day 5
AM: Discussion: assessment of courses and ways to evaluate student learning and skill development. Post-Workshop Assessment and wrap-up. Workshop ends at the morning break.

Shading key from the published table: Green = workshop participants were challenged to use the CREATE toolkit and apply the CREATE strategy in literature reading and data analysis; Pink = workshop participants in the role of “teacher,” designing CREATE curricular material; Blue = workshop participants in the role of “students.”

(a) Activity described in Hoskins, 2010 (40).

(b) Teaching notes in Gottesman and Hoskins, 2013 (38).

CREATE = Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment; DBER = discipline-based education research; PI = principal investigator.

The PIs led activities in the first day and a half; by late in day 2, individual participants led some CREATE activities. On day 3, an experienced CREATE instructor gave a talk on their use of the CREATE strategy. During this early phase, we addressed the rationale for developing the CREATE strategy, alignment of CREATE tools with the pedagogy literature, and examples of the strategy’s use in a variety of classroom situations. Activities were designed to model specific CREATE class experiences, allowing participants to act as naïve students by working in small groups or completing homework assignments that mimicked those routinely used in the PIs’ CREATE courses (Table 1). We remixed groups often, as in our own CREATE classes, which allowed participants to discuss teaching challenges common to all, as well as those unique to two-year or four-year campuses.
Throughout the workshop, we encouraged metacognitive reflection from participants, both from a student perspective, “How do I feel being a student in a class taught like this?” and from a professor perspective, “How does it feel to teach in this alternative way?” This aspect of training is important because: 1) Student reaction to new methods can discourage faculty from pursuing such methods, even when ample research indicates the methods strongly support learning (41); and 2) Faculty discomfort with new methods can discourage persistence with their implementation (42). The final days of the workshops were designed to provide faculty with the time, support, and feedback for individually developing their own CREATE materials. Each participant developed a “CREATE Roadmap” for a specific course of their choosing. These roadmaps were lesson plans built around a particular set of readings selected by workshop participants, along with instructions for applying the CREATE strategy. Roadmaps designed for introductory level courses often use a popular-press article linked to one or two related primary articles, while roadmaps for intermediate/advanced courses typically comprise three to five linked primary articles (see www.teachcreate.org/ for examples). We also provided workshop participants with an opportunity to practice teaching with their roadmaps (Table 1). Each “teacher” participant created a lesson plan and devised a short pre-class homework assignment for other workshop participants, who subsequently acted as the “students” for the teaching session. Workshop “teachers” taught a 45-minute mock class, followed by a debriefing discussion. We first asked the “teacher” to reflect on the experience of leading a CREATE class; then asked their workshop peers to provide feedback about their experiences as first-time CREATE students.

Workshop evaluation by principal investigators

We assessed participants using two surveys in a pre/post format; responses on all surveys were anonymous. Participants created personal code numbers (unknown to the PIs), allowing comparison of pre- and post-workshop responses from individuals. Surveys were given on the morning of the first day (pre) and on the final day (post) (Table 1). The PIs created and administered an assessment called the Survey of Teachers’ Beliefs, Practices, and Intentions (TBPI) as there were no published instruments available that addressed our research questions. The TBPI survey probed participants’ beliefs about multiple aspects of teaching/learning and teaching practices on a five-point Likert-style scale (1 = strongly disagree; 2 = disagree; 3 = I’m not sure; 4 = agree; 5 = strongly agree; Appendix 3). Twelve belief statements centered on CREATE pedagogy, such as whether a focus on critical thinking must be preceded by emphasis on content, and the importance of considering the human side of science. Thirteen statements examined the extent to which participants already employed particular elements of CREATE pedagogy (e.g., using primary literature with first year students; emphasizing experimental design). Pre-workshop responses on the TBPI provided a snapshot of participants’ beliefs about teaching and learning, as well as their pre-workshop practices. Post-workshop, we adapted the survey in order to assess potential changes in beliefs across the 4.5 day period and whether participants’ future teaching plans differed from their pre-workshop practices. The TBPI was designed to be brief and specific to the CREATE strategy. We analyzed numeric Likert scores, reversing scores on negatively phrased statements, as in previous work (38). We used Cronbach’s alpha to assess the internal consistency of related statements.
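The reverse-scoring and internal-consistency steps described above can be sketched in Python. This is a minimal illustration of the standard formulas, not the authors’ actual analysis scripts; the respondent data and item grouping below are hypothetical:

```python
import numpy as np

def reverse_score(scores, low=1, high=5):
    """Reverse-score a negatively phrased Likert item (1<->5, 2<->4, 3 unchanged)."""
    return low + high - np.asarray(scores)

def cronbach_alpha(item_matrix):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of Likert scores."""
    x = np.asarray(item_matrix, dtype=float)
    k = x.shape[1]                              # number of items in the factor
    item_variances = x.var(axis=0, ddof=1)      # variance of each item
    total_variance = x.sum(axis=1).var(ddof=1)  # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses to a three-statement factor (rows = respondents)
factor = np.column_stack([
    [4, 5, 3, 4, 2, 5],                 # positively phrased statement
    [4, 4, 3, 5, 2, 5],                 # positively phrased statement
    reverse_score([2, 1, 3, 2, 4, 1]),  # negatively phrased statement, reversed
])
print(round(cronbach_alpha(factor), 2))  # prints 0.97
```

Values near the 0.67–0.90 range reported for the four factors would similarly indicate that the grouped statements measure a common construct.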
Based on post-workshop responses, we defined four factors in this analysis: Factor 1, beliefs about curriculum (three statements; e.g., “Students need to have completed introductory coursework in science before they can read and understand primary scientific literature”; alpha = 0.67); Factor 2, beliefs about students (three statements; e.g., “Only the most talented students can learn how to think critically about science”; alpha = 0.73); Factor 3, course design (seven statements, e.g. “I have/will have students work in small groups….”; alpha = 0.73); Factor 4, instructional practices (six statements, e.g., “I explicitly teach/will teach students metacognitive strategies”; alpha = 0.90). Several statements did not associate with any factor. One statement was eliminated due to error on the post-workshop survey. The percentage of respondents who agreed (score of 4 or 5) with each statement was determined. To test for pre- vs. post-workshop differences on the percentage in agreement, Chi-squared analysis was used to compare outcomes for three categories (all participants, two-year only, and four-year only) using web-based tools (www.medcalc.org/calc/comparison_of_proportions.php). The PIs also administered surveys derived from the Student Assessment of their Learning Gains (SALG) website, a free online tool (www.salgsite.org; 43). The SALG survey statements are given in Table 2. The pre/post SALG survey probed: 1) whether workshop participants felt they had learned the pedagogical basis for the CREATE toolkit and how/when/why the tools are used (“Understanding”); 2) participants’ sense of whether they had developed the ability to effectively apply CREATE tools (“Skills”); and 3) the extent to which the workshop had sustained participants’ initial interest in CREATE and made them confident in their ability to apply what they had learned (“Attitudes”). 
Post-workshop SALGs included two open-ended questions: “Did you get what you came for?” and “Would you recommend the workshops to another faculty member? Why or why not?” We tracked the “yes,” “no,” and qualified (“yes, but….”) responses to these prompts.
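The TBPI pre/post comparison of agreement percentages used a chi-squared comparison of proportions; a two-proportion z-test is mathematically equivalent (z squared equals the 1-df chi-squared statistic). A sketch with hypothetical counts, not the study’s data:

```python
from math import sqrt, erfc

def compare_proportions(agree_pre, n_pre, agree_post, n_post):
    """Two-proportion z-test; z**2 equals the 1-df chi-squared statistic."""
    p1, p2 = agree_pre / n_pre, agree_post / n_post
    pooled = (agree_pre + agree_post) / (n_pre + n_post)   # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n_pre + 1 / n_post))
    z = (p2 - p1) / se
    p_value = erfc(abs(z) / sqrt(2))   # two-sided p from the normal distribution
    return z, p_value

# Hypothetical: 40 of 96 participants agreed pre-workshop vs. 78 of 96 post-workshop
z, p = compare_proportions(40, 96, 78, 96)
print(f"z = {z:.2f}, p = {p:.2g}")
```

The web calculator cited in Methods reports the same conclusion for a 2x2 pre/post table; this sketch just makes the underlying computation explicit.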
TABLE 2

SALG survey statements.

Understanding: “Presently, I understand …”
Concept mapping
Using sketching (cartooning) to clarify methods and experimental designs
Annotating figures
How to build an effective module of articles
How to survey paper authors by email with student questions
How to assess multiple aspects of students’ potential gains from my course
How ideas we will explore in this class relate to science teaching in general

Skills: “Presently, I can …”
Easily find articles that I think will work well with the CREATE approach
Effectively guide students in critically reading articles
Effectively guide students to identify patterns in data, without just showing them the patterns
Guide students to use evidence to support ideas
Facilitate small-group discussion in my classes
Anticipate student reaction to new learning strategies

Attitudes: “Presently, I am …”
Interested in discussing the CREATE strategy with colleagues on my home campus
Interested in applying CREATE approaches in one or more classes
Confident that I understand the research base for the CREATE tools (e.g., why concept maps, cartooning are effective)
Confident that I can teach CREATE effectively
Comfortable working in a “student centered” classroom rather than lecturing

Workshop participants completed a “self-assessed learning gains survey” (SALG, www.salgsite.org) adapted by the PIs to probe aspects of participants’ self-rated skills development and understanding as these were affected by the 4.5-day workshop experience. A fourth category contained a single statement regarding exam design, on which no significant changes were seen (data not shown).

SALG = student assessment of their learning gains; CREATE = Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment.


Workshop evaluation by external expert

All four workshops were assessed independently by Outside Evaluator (OE) Marlene Hurley, PhD (Hurley and Associates), who has been associated with the CREATE project since 2007. Dr. Hurley attended every workshop and tracked all aspects of the experience using a modified version of the Weiss Observation Protocol for Science Programs, which had been developed for a previous CREATE study (9). For the first year (2012), several weeks in advance of both workshops, the OE independently contacted accepted applicants. She asked each individual to complete a short pre-workshop survey (four questions) regarding the application process and their expectations for the upcoming workshop (Appendix 2). Post-workshop, the OE administered a different survey (eight questions) focused on reactions to the workshop experience (Appendix 2). At the conclusion of the 2012 workshops, the OE provided her formative assessment information (including the OE survey data) to the PIs, which allowed us to make adjustments in the second year (2013). For the second year of workshops (2013), the OE again observed and tracked all aspects of both workshops. We note that Dr. Hurley also conducted the evaluation of course implementations by a subset of workshop participants in the academic year following each workshop. She compiled a comprehensive final report describing all workshops and faculty implementations (Hurley, 2014; unpublished).

RESULTS

Principal investigator evaluation

As outcomes of individual workshops were very similar (data not shown), we present analyses of pooled assessment data from the four workshops.

TBPI outcomes

Overall, two-year outcomes were quite similar to four-year outcomes on the TBPI (Fig. 1). Both groups shifted significantly on three of the four factors: beliefs about curriculum, instructional practices, and course design. These findings suggest that the CREATE workshops had an impact on faculty views on teaching and learning and inspired faculty to broaden their classroom approaches. On the factor addressing “beliefs about students,” we saw no significant change across the workshop period in either cohort (Fig. 1). Several statements on the survey did not associate with any of the four factors. One addressed whether faculty were comfortable reading published education literature; one addressed potential concerns about the amount of time needed for course redesign. Pre-workshop, the majority response for both two-year and four-year participants was agreement (4 or 5) that education literature was difficult, and uncertainty about time demands. Both groups agreed strongly (pre-workshop) with a statement suggesting that prerequisite courses did not prepare students effectively for subsequent courses, and expressed uncertainty on a statement suggesting that lab courses taught students about research. None of these views changed significantly across the workshop period (data not shown).
FIGURE 1

Shifts in participants’ practices/intentions and beliefs based on categories defined for the TBPI survey. A) two-year only, n = 30; B) four-year only, n = 59; C) all participants. Data were combined for all four workshops. Histograms show the percentage of respondents who agreed (score of 4 or 5) with category statements at the start or conclusion of the workshop. See Table S1A for full statements and Methods for discussion of how statements were grouped statistically. Significance (Chi-squared) determined via www.medcalc.org/calc/comparison_of_proportions.php (comparison of proportions calculator). * = p < 0.02; ** = p < 0.01; *** = p < 0.001; **** = p < 0.0001. TBPI = teachers’ beliefs, practices, and intentions.


SALG outcomes

The surveys allowed participants to self-rate their understanding, skills, and attitudes regarding CREATE pedagogy (Table 2). Both pooled two-year and pooled four-year participants made significant gains, with effect sizes (ES) of 0.9 or above in each of the SALG survey’s overarching “Understanding,” “Skills,” and “Attitudes” categories (Table 2, Fig. 2). These data support the idea that faculty participants achieved their pre-workshop goals (see Fig. S2, Appendix 4). The “attitudes” results suggest that, as they learned the pedagogical underpinnings of CREATE and how to apply it in their classes, participants maintained their enthusiasm for the strategy and increased their sense that: 1) they could apply the strategy successfully; and 2) CREATE tools would be effective in their own courses. When participants were asked in open-ended SALG questions whether workshops met expectations, well over 90% of responses were positive (Fig. S1, Appendix 4). A small subset (8%) qualified their “yes” responses to indicate either a desire for more time for CREATE roadmap development or concern about their ability to implement. Taken together, the significant gains on both the SALG and TBPI surveys argue that the workshops had strong and positive effects on both two-year and four-year faculty.
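The effect sizes reported for the SALG categories follow the standard Cohen’s d formula for two independent groups (pooled standard deviation). A minimal sketch with hypothetical 1–5 self-ratings, not the study’s data:

```python
import numpy as np

def cohens_d(post, pre):
    """Cohen's d for two independent groups, using the pooled standard deviation."""
    post, pre = np.asarray(post, float), np.asarray(pre, float)
    n1, n2 = len(post), len(pre)
    pooled_sd = np.sqrt(((n1 - 1) * post.var(ddof=1) + (n2 - 1) * pre.var(ddof=1))
                        / (n1 + n2 - 2))
    return (post.mean() - pre.mean()) / pooled_sd

# Hypothetical pre- and post-workshop self-ratings for one SALG category
pre = [2, 3, 2, 3, 3, 2, 4, 3]
post = [4, 5, 4, 3, 5, 4, 5, 4]
print(round(cohens_d(post, pre), 2))  # prints 2.12
```

By convention, d of about 0.8 or more is a large effect, so the 0.9–2.6 range reported in Figure 2 represents large pre/post shifts.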
TABLE 3

Participants’ responses to OE post-workshop survey.

Statement Prompt (Response, avg)
My learning about the C.R.E.A.T.E. method: 4.75
My knowledge gained about teaching: 4.63
My knowledge gained about how students learn: 3.90
The instruction in this workshop: 4.76
The overall workshop rating: 4.85

Participants were asked to respond to these prompts by ranking on a Likert-style 5-point scale (1 = lowest; 5 = highest; intermediate numbers not defined); n = 45 respondents.

CREATE = Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment; OE = outside evaluator.

FIGURE 2

SALG outcomes from four workshops. Pooled average data in each broad category from SALG respondents (4-year faculty: 55 pre, 46 post; 2-year faculty: 27 pre, 27 post). Each category represents a set of 5 or 6 individual statements, to which faculty responded on a 1 to 5 scale (1 = not at all; 2 = just a little; 3 = somewhat; 4 = a lot; 5 = a great deal). See Table 2 for SALG statements. *** = p < 0.001; analysis by non-paired t-test (Excel). Effect sizes (Cohen’s d) are 0.9–1.9 (2-year); 1.6–2.6 (4-year). Error bars = standard deviations. We also looked at combined data from the entire cohort; response patterns and effect sizes were very similar (data not shown). As with the TBPI survey (Fig. 1), outcomes for two-year participants and for four-year participants were quite similar. SALG = student assessment of their learning gains; TBPI = teachers’ beliefs, practices, and intentions.


External evaluation of the 2012 workshops

The OE formative assessment of the two workshops provided data derived independently of the PIs’ assessments. Participants in the first two workshops gave high marks overall to the training (OE post-workshop survey; Likert-style scale; Table 3), suggesting that pre-workshop expectations were met (Table 4). The OE determined that “working with peers” and “teaching with CREATE” were prominent among the participants’ “favorite aspects of the workshops” (Table 4). There was little consensus among participants’ responses regarding “least-favorite aspects” of the workshop experience; logistical issues and certain activities (e.g., lectures; ineffective group work) were most often mentioned. When asked to comment on their “favorite aspects of the CREATE strategy,” participants emphasized active learning, use of primary literature and emphasis on higher-level thinking (Table 5). In response to the OE post-workshop question: “Do you still want to teach using the CREATE method?” 100% of respondents from the 2012 workshops said “yes” (n = 23 for session 1, n = 22 for session 2; Appendix 2).
TABLE 4

Participants’ expectations for, and reactions to, the 2012 workshops – OE formative evaluation.

Expected Gains (Pre): Literature-centered concepts (15); Teaching-centered concepts (14); CREATE-centered concepts (11); Curriculum design-centered concepts (7); Science content-centered concepts (5); Colleague-centered concepts (4); Learning-centered concepts (4); Assessment-centered concepts (4).

Favorite Aspects of Workshops (Post): Interactions with peers (16); Teaching CREATE (10); Applying new methods (7); Learning about tools for teaching (6); Being a student (6); Interactive learning (3); Instructors (3); Materials (3); Developing materials (2); Engaging (2).

Least Favorite Aspects of Workshops (Post): No least favorite aspects (11); Long days (2); Various room problems (4); Sessions that were lectures (2); Small group work not effective (2).

Outcomes from the two cohorts of year 1 were combined; number of individuals indicated in parentheses. Responses supported by only one person are not shown.

Left panel: Responses to OE pre-workshop survey (question 4) on participants’ expectations regarding the workshops; (Appendix 1; n = 45).

Center and right panel: Responses to OE post-workshop survey (question 7) on participants’ reactions to the workshops; (Appendix 1; n = 45).

Participants’ reactions aligned well overall with their pre-workshop expectations.

CREATE = Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment; OE = outside evaluator.

TABLE 5

Participants’ favorite aspects of the CREATE strategy – OE 2012 formative evaluation.

Categories
Active learning (11)
Understanding primary literature (8)
Higher level thinking tool (7)
Understanding science process (4)
Student-centered (4)
Discussions (4)
Cartooning (5)
All of it! (3)
Practical teaching (2)
Concept mapping (2)
Flexibility (3)
Teaching examples (2)

Responses to question 6 of the OE post-workshop survey; results were combined for the two 2012 workshops (n = 45). Many respondents mentioned more than one aspect (see Appendix 1 for full survey).

CREATE = Consider, Read, Elucidate the hypotheses, Analyze and interpret the data, and Think of the next Experiment; OE = outside evaluator.

The OE observed and evaluated the 2013 (year 2) workshops but did not provide survey data to the PIs. In the final summative report on the project, the OE declared: “While not totally ‘glitch-free,’ these new workshops [2013] were excellent as supported by workshop participant respondents in their [2013] post-workshop surveys and by PE [Program Evaluator] observations. The instruction was excellent for both summers, but improved with each successive workshop.”

DISCUSSION

We tested the hypothesis that a multi-day, intensive workshop would positively affect two-year and four-year faculty participants’ beliefs about teaching and learning with the CREATE strategy while preparing them to teach CREATE with confidence. Participants were selected in a competitive process in which we considered factors such as self-described desire to learn new ways of teaching, teaching load, and institutional affiliation (see Appendix 1). Because they had committed to an in-residence workshop, these participants were arguably highly interested in pedagogy, and many had completed other faculty development programs. Changing faculty beliefs and/or intentions might have been unlikely if participants already held progressive views about teaching and learning pre-workshop, and we anticipated the possibility of a ceiling effect on surveys. Instead, our data suggest that the workshop experience propelled faculty to reformulate some of their ideas in response to CREATE pedagogical approaches.

CREATE workshops shift faculty views on teaching and learning

Our assessments indicate that participants made important gains within the timeframe of the workshop. On the TBPI survey, both two-year and four-year faculty changed significantly in beliefs about science curricula as well as in their intended practices relative to course design and instruction (Fig. 1). Outcomes from the SALG surveys reflected similar shifts: enthusiasm for CREATE approaches increased (Attitudes scores) and participants felt that the workshop activities prepared them to teach CREATE (Understanding and Skills scores). Throughout the workshop, participants experienced both the cognitive dissonance that their students were likely to feel when learning with the CREATE toolkit and the collaborative, discussion-based CREATE learning environment. Repeated use of reflection in workshops promoted metacognition, a key component of deep learning (44–46). In a previous pilot study, we used monthly meetings to train faculty; that format also evoked faculty gains while allowing more time for reflection (9). The present findings argue that a compressed but intensive workshop format is sufficient to engender positive shifts in faculty viewpoints.

Participants provided valuable feedback on workshop assessments about issues we had not considered. In open-ended comments on the SALG survey (2012), anonymous respondents noted that not all two-year faculty have PhDs and that their experience using primary literature, especially in content-heavy courses, could be different from that of other participants. A few individuals pointed out that particular constraints (departmental curricula, lack of colleague support) were issues that the PIs did not address adequately. Some felt that the PIs should have presented examples differently given that course foci and/or students’ backgrounds may vary depending on the academic environment. In subsequent workshops, we devoted more time to questions of how to apply CREATE for students at different curricular levels.
Remixing groups promoted conversation between two-year and four-year faculty, which benefitted both cohorts. For example, few of the four-year participants knew of challenges experienced by students when transferring from two-year institutions (47); discussions among participants often led to creative ways to better engage all students. With each iteration of the workshop, we spent more effort guiding participants in developing CREATE materials and teaching plans targeted specifically for their own students. We also made a point of addressing participants’ concerns about student resistance to novel teaching methods. Student pushback in response to new approaches may decrease instructors’ willingness to learn, introduce, and persist with innovative methods (48, 49). Providing opportunities for participants to teach during the workshop allowed the equivalent of a beta-test implementation of CREATE. Those who taught a mock class received formative feedback from the PIs and workshop peers. By acting as “students” for these sessions, other participants experienced using CREATE tools, giving them an authentic perspective of a learner-centered CREATE class. Thus, all participants could be alerted to challenges (e.g., potential resistance to changing expectations, the dynamics of group work) that could affect their future implementation of CREATE. Recent work from Andrews and Lemons (25) suggests that faculty willingness to change pedagogical approaches is driven more by personal beliefs than by published evidence. The design of our workshops may have helped evoke change in participants by affecting their personal views about teaching and learning.

Improving the workshop dissemination model

The faculty workshop is a popular medium for disseminating new pedagogical approaches (8, 13–18), but its effectiveness has been criticized as insufficiently supported by data (19–23). As stated by Ebert-May et al. (11, p. 2), while thousands of STEM faculty and/or postdoctoral fellows have participated in workshops, “[e]ven with the continued availability of and interest in teaching development opportunities, there is little evidence of resulting widespread impact on teaching practices and even less about the impact on student learning.” We note that there is wide variation in the design, structure, and participant demographics of faculty development programs (e.g., duration of workshop, nature of activities, inclusion of diverse faculty groups). For example, studies involving faculty from community colleges have been limited in scope and number. The STAR (Scientific Teaching, Assessment and Resources) 2.5-day workshop (8) invited participants from both two-year and four-year institutions to workshops aimed at extending lessons of the Summer Institutes and FIRST programs (8, 14, 50). Participants completed post-workshop assessments addressing their plans to apply lessons learned, and a subset responded to an online follow-up survey 0.3–4 years post-workshop regarding their use of workshop tools and their sense of their students’ engagement and performance. While participants reported positive impacts, the authors noted that faculty self-assessment must be interpreted with caution because faculty tend to report inaccurately the extent to which their classrooms are student-centered (51; see also 10). In this regard, a recent report that students and faculty may view the same course quite differently (52) also highlights the importance of moving beyond self-report, either by faculty or students, in tracking post-workshop implementation or learning.
Ebert-May and colleagues (11) recently studied the effects of a faculty development program (FIRST IV) on the teaching/learning beliefs and teaching practices of postdoctoral fellows (PDs). The authors trained PDs in two consecutive summer workshops, and PDs participated in an extensive post-workshop program with experienced faculty mentors. The study produced new findings on pedagogical training of early-career faculty indicating faculty gains (11, 53). It is not clear whether faculty cohorts currently underrepresented in science-reform efforts (e.g., two-year faculty with heavy teaching loads who may lack departmental or institutional support for professional development) could easily apply the model. In the context of these studies and others, we consider our experiences with CREATE dissemination. Our workshop fits the “interactive dissemination” model described by Khatri and colleagues (54). Based on extensive survey data from National Science Foundation (NSF)-funded science education researchers and NSF STEM program officers, the researchers proposed that sustainable propagation of effective teaching strategies requires dissemination efforts that are “interactive” and “immersive” (54). Dissemination must also address situational factors that may influence whether faculty will initiate and/or continue use of new practices (21, 22, 24, 25). We suggest that assessing attendees’ beliefs and practices pre-workshop and then again post-workshop, rather than only or primarily retrospectively (8, 21), is an essential first step in evaluating workshop impact and determining whether participants have built the skills, beliefs, and confidence that could, in principle, support their eventual application of workshop-acquired concepts. The workshops discussed herein are one phase of the three-part CREATE dissemination model, which encompasses faculty development, course implementation, and student learning, with multiple assessments at each phase.
We suggest that this model may offer a viable way to improve propagation of innovative practice. While we cannot know whether all workshop participants later applied CREATE pedagogy in their teaching, we followed, in a research study, a subset of two-year and four-year trainees who implemented CREATE at their home institutions. The OE (Hurley) who observed the workshops also evaluated each implementation. In addition, we (the PIs) independently assessed students in the implementers’ courses using cognitive tests and affective surveys (pre/post). Evaluations by the OE supported the fidelity of the implementations with respect to CREATE pedagogy, and teaching practices aligned with student gains in both cognitive and affective domains (two-year, 39; four-year, submitted). Students were enthusiastic about CREATE courses: more than 75% of student comments, at both two-year and four-year institutions, were positive regarding the CREATE style of teaching (Hurley 2014; 39 and submitted). These findings provide evidence of positive “downstream” impacts of workshop training.

CONCLUSIONS

Our findings support the efficacy of the workshop phase of the CREATE dissemination model. The workshop training: 1) improved participants’ views about teaching/learning; 2) augmented participants’ intentions to change their instructional practices; and 3) increased participants’ confidence that they understood the pedagogical basis of particular CREATE strategies and had acquired the skills needed to teach with CREATE. Thus, the low-cost CREATE strategy can be mastered in an intensive multiday faculty development workshop. We suggest that dissemination models like this one have the potential to make a deep impact on undergraduate education. Data captured through evaluation and assessment at each phase of dissemination can provide critical knowledge for improving teaching practices and student learning outcomes using innovative practices such as CREATE.