
Predictors of students' participation in a learning environment survey with annual follow-ups.

Elaina DaLomba1, Astrid Gramstad2,3, Susanne G Johnson4, Tove Carstensen5, Linda Stigen6, Gry Mørk7, Trine A Magne5, Tore Bonsaksen7,8.   

Abstract

BACKGROUND: Longitudinal research is one effective way to gauge changes in a student cohort over time; however, attrition in these studies is typically high, which can result in study bias. This study explored learning environment factors, approaches to studying, and academic performance as predictors of occupational therapy students' consistent participation in data collection conducted over three years of their professional program.
METHOD: A longitudinal study of Norwegian occupational therapy students (analyzed n = 240) was conducted. Logistic regression analysis was used to explore occupational therapy students' perceptions of the learning environment, their approaches to studying, and exam grades as they related to the likelihood of consistent participation at three annual surveys.
RESULTS: Annual response rates varied between 55.1% and 65.6%, and consistent participation was observed among 49.2%. The fully adjusted regression models showed that higher strategic approach scores increased the odds of consistent participation (adjusted OR: 1.04, p < 0.01), whereas higher surface approach scores decreased the odds of consistent participation (adjusted OR: 0.95, p < 0.05). Neither sociodemographic factors, learning environment factors, nor academic performance predicted participation over time.
CONCLUSIONS: Researchers can anticipate relatively high levels of attrition in longitudinal studies of occupational therapy students, but attrition seems to be largely proportional between groups. However, completers in longitudinal studies may be somewhat more well-organized and academically oriented than drop-outs.


Year:  2021        PMID: 34170940      PMCID: PMC8232400          DOI: 10.1371/journal.pone.0253773

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.240


Introduction

Originating in Marton and Säljö’s [1] work and expanded by Entwistle [2], approaches to studying have been characterized as deep, such as analyzing and relating ideas through critical thinking, and surface, such as rote learning and memorization in which broader connections are not made nor deeper meaning sought [3]. With further exploration, a third, strategic approach that addresses student organization and self-regulation around work was added [4]. Recent evidence from occupational therapy student studies showed that students who adopt deep and strategic approaches tend to have higher reported self-efficacy and positive mental health [5] and are more inclined to adopt a positive outlook on the broader learning outcomes of the study program. However, whether these students are more inclined to participate in educational research is not known.

Longitudinal research follows a particular group of individuals over time, allowing researchers to capture changes, and the amount of change, with exposure to different variables [6]. Longitudinal studies can be effectively conducted in higher education because most students are present over a period of semesters and years. However, maintaining participant engagement in research and reducing participant attrition (dropout) are crucial for the value and generalizability of the data obtained [7]. Some attrition is to be expected in longitudinal studies and may be due to participants’ loss of interest or failure to appreciate the relevance of their individual situation to the phenomena being studied [8]. Repeated data collection may be experienced as burdensome by some participants, which can lead to incomplete responses and attrition [8]. Teague and colleagues [9] noted that making participation as easy and convenient as possible (“barrier reduction”, such as providing reminders and keeping items to a minimum) was most impactful in increasing retention.
Longitudinal research is used frequently to study college student behaviors and characteristics, as well as to evaluate the development of students’ learning. The purpose of such studies can be to identify program needs or to confirm and support the use of existing methods. However, as with many studies, results and their meaning are often impacted by attrition. For example, in their study of occupational therapy students in team-based learning, Carson and Mennenga [10] found limited retention of students from the first to the second time of measurement, decreasing the generalizability of the results. Similarly, in their study of emotional intelligence in diverse student groups, Gribble, Ladyshewsky, and Parsons [11] noted that their high attrition rate (50%), particularly in the control group, could have led to erroneous comparisons between the groups. To the best of our knowledge, there is no research that addresses the impact of specific learner or learning environment characteristics on the likelihood of student participation and retention in longitudinal studies. However, when considering conducting intervention studies in higher education, it is crucial to have knowledge of these characteristics, since the recruitment and retention of participants can increase the validity and generalizability of such studies [12]. Moreover, if dropouts from longitudinal studies are systematically different from completers, this will decrease the validity of the results obtained from such studies. It seems possible that student background factors, academic abilities, perceptions of the learning environment, and approaches to studying may differ between those who are interested and willing to participate in longitudinal research in higher education and those who are not.
Therefore, more knowledge about factors predictive of retention and dropout from longitudinal studies of higher education students can give some indication of the external validity of such studies and the biases they may have. The aim of the longitudinal study was to examine (i) the rates of student participation in the research study across time, and (ii) learning environment factors, approaches to studying, and academic performance as predictors of occupational therapy students’ consistent participation in the three waves of data collection conducted during the course of the study program.

Methods

Design and study context

The study is based on a longitudinal study of Norwegian occupational therapy students’ perceptions of the learning environment and approaches to studying. All occupational therapy education programs in Norway are three-year undergraduate programs. In the current study, the students’ participation in each year of study was examined, and predictors of consistent participation (participation in all three years of the study) were explored. The data were collected from a single cohort of students followed across the program; thus, the year 1, year 2, and year 3 data come from the same students in their first, second, and third year of study, respectively.

Procedure and eligible participants

About midway through each of the three study years (i.e., in December and January), occupational therapy students at six higher education institutions in Norway were approached for possible participation in the study. The purpose and design of the study were outlined in a classroom session at each of the education institutions, followed by an invitation to participate. The self-administered questionnaires were completed within the same session or (in a few cases) later, at a time and place of the students’ convenience. Across the six education programs, 305 students were eligible participants.

Measurement

Sociodemographic variables

Age (in years) was registered as a continuous variable. Gender (male = 0, female = 1), having prior experience from higher education (no = 0, yes = 1) and having occupational therapy as the highest prioritized line of education at the time of enrolment (no = 0, yes = 1) were registered as categorical variables. The variables measuring prior experience from higher education and educational priority at enrolment were included due to their associations with academic performance and approaches to studying, as established in previous research [13-15].

The learning environment

The extended Course Experience Questionnaire (CEQ) [16-19] consists of 37 items distributed across six scales: clear goals and standards, emphasis on independence, good teaching, appropriate workload, appropriate assessment, and generic skills. In its original version, reliability estimates (Cronbach’s α) for the employed scales ranged between 0.71 (appropriate assessment) and 0.87 (good teaching) [24]. In addition, one item assesses the students’ general satisfaction with the course. The validated Norwegian translation of the CEQ [20] was used in the present study. Higher scores on the scales indicate that the respondent perceives the course to have (1) clearly established and disseminated goals; (2) high levels of student autonomy and independence; (3) teaching that engages and involves the students; (4) a workload that is not too high; (5) assessment forms that promote and support learning; and (6) to support the transfer of content knowledge and skills to the relevant work context. Among the participants in the first wave (first-year students), internal consistency of the scales was 0.73 (clear goals and standards), 0.63 (emphasis on independence), 0.70 (good teaching), 0.69 (appropriate workload), 0.45 (appropriate assessment), and 0.83 (generic skills) [21]. In view of these internal consistency results, subsequent analyses of the ‘appropriate assessment’ scale were not pursued.
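The internal consistency figures reported here are Cronbach’s α estimates. For reference, α can be computed from a respondents-by-items score matrix as in this generic sketch (illustrative only; this is not the authors’ analysis code, and the data are made up):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sample variance per item
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Perfectly consistent (identical) items yield alpha = 1.0
scores = np.array([[1, 1], [2, 2], [3, 3], [4, 4]])
alpha = cronbach_alpha(scores)
```

Scales with α well below 0.70, like the ‘appropriate assessment’ scale here, are conventionally treated as too unreliable for further analysis.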

Approaches to studying

Study approaches were measured with the Approaches and Study Skills Inventory for Students [22] and the students used a previously validated Norwegian translation of the instrument [23]. The ASSIST consists of 52 statements to which the respondent is asked to rate his or her level of agreement (1 = disagree, 2 = disagree somewhat, 3 = unsure, 4 = agree somewhat, 5 = agree). The instrument has a three-factor structure, a structure recently replicated in a cross-cultural study of undergraduate occupational therapy students [24]. The items are organized accordingly into three main scales (the deep, strategic, and surface approaches to studying). Scale scores are calculated by adding the scores on the relevant items. Entwistle and colleagues reported very good reliability measures for the deep (0.82) and strategic (0.83) approach scales, while reliability measures have been in the lower range (0.65) for the surface approach scale [22]. In this study, among the participants in the first wave (first-year students), internal consistency estimates (Cronbach’s α) for the study approach scales were 0.71 (deep approach), 0.84 (strategic approach), and 0.76 (surface approach) [25].
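Scale scores on the ASSIST are simple sums of the relevant item ratings, as described above. A minimal scoring sketch (the item-to-scale indices below are placeholders for illustration, not the published ASSIST key):

```python
# Placeholder item-to-scale mapping; the real ASSIST key assigns each of
# the 52 items to one of the three scales.
DEEP_ITEMS = (0, 5, 9)
STRATEGIC_ITEMS = (1, 4, 8)
SURFACE_ITEMS = (2, 3, 7)

def scale_score(responses, item_indices):
    """Sum the 1-5 Likert ratings of the items belonging to one scale."""
    return sum(responses[i] for i in item_indices)

responses = [5, 4, 1, 2, 3, 5, 2, 1, 4, 5]  # one student's ratings (toy data)
deep = scale_score(responses, DEEP_ITEMS)   # 5 + 5 + 5 = 15
```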

Academic performance

Exam grades were collected from registries kept at each of the education institutions. As the education institutions used a different number of exams in each study year (ranging between two and four exams in the first study year; between two and three exams in the second year; and between one and three in the third year), the students’ average exam performance (based on the completed number of exams) was first calculated within each study year, and then across the whole study period. The students’ average exam grade scores were based on the qualitative descriptors related to the students’ exam grades: fail = 1, sufficient = 2, satisfactory = 3, good = 4, very good = 5, and excellent = 6 [26].
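The two-step averaging described above (first within each study year, then across years) can be sketched as follows (illustrative only; the grades shown are hypothetical):

```python
from statistics import mean

# Qualitative grade descriptors mapped to points, as in the study
GRADE_POINTS = {"fail": 1, "sufficient": 2, "satisfactory": 3,
                "good": 4, "very good": 5, "excellent": 6}

def average_grade(exams_by_year):
    """Average within each study year first, then across years, so that
    years with more exams do not dominate the overall mean."""
    yearly_means = [mean(GRADE_POINTS[g] for g in exams)
                    for exams in exams_by_year]
    return mean(yearly_means)

# A hypothetical student: two exams in year 1, one in each later year
grades = [["good", "very good"], ["satisfactory"], ["excellent"]]
overall = average_grade(grades)  # (4.5 + 3 + 6) / 3 = 4.5
```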

Consistent participation

The outcome measure in this study was a binary categorical variable indicating whether or not the student had participated in the study at each of the three time points. Students who had not participated in the data collection at any of the time points were not participants in the study. Students who participated one or two times are conveniently referred to as ‘dropouts’, irrespective of which of the data collections they had not participated in. Students who had participated in the data collection at all three time points are referred to as ‘consistent participants’.

Data analysis

The sample was described with descriptive statistics; i.e., means and standard deviations for continuous variables and frequencies and percentages for categorical variables. Scores on continuous measures, such as grades and the learning environment and study approach scales, were averaged across the number of measurement occasions on which the students had participated. Consistent participation in the study was defined as having completed and returned the questionnaires on all three measurement occasions. Single and multiple binary logistic regression analyses were performed using consistent participation as the outcome variable. The independent variables included in the single logistic regression analyses were age, gender, educational priority, and prior higher education (representing the background variables); scores on clear goals and standards, student autonomy, good teaching, appropriate workload, and generic skills (representing the learning environment variables); scores on deep approach, strategic approach and surface approach scales (representing the approaches to studying); and average exam grade across all study years. In the adjusted (multiple) logistic regression analysis, background variables were included only if they had a statistically significant bivariate association with the outcome. Grades, all learning environment scales, and study approach scales were included in the adjusted analysis, regardless of their unadjusted association with the outcome. Effect sizes were reported as odds ratio (OR) with corresponding 95% confidence interval (95% CI).
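As a rough illustration of the modeling step: binary logistic regression relates the predictors to the log-odds of consistent participation, and exponentiating a coefficient yields the odds ratio per unit increase. A minimal pure-NumPy sketch on toy data (the study itself used standard statistical software; this is not the authors’ code):

```python
import numpy as np

def fit_logistic(X, y, iters=5000, lr=0.1):
    """Gradient-ascent logistic regression. Returns coefficients with the
    intercept first; np.exp(coef) is the odds ratio per unit increase."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, dtype=float)])
    y = np.asarray(y, dtype=float)
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted probabilities
        beta += lr * X.T @ (y - p) / len(y)   # log-likelihood gradient step
    return beta

# Toy data: higher predictor scores loosely track participation (y = 1)
X = [[1], [2], [3], [4], [5], [6]]
y = [0, 0, 1, 0, 1, 1]
beta = fit_logistic(X, y)
odds_ratio = np.exp(beta[1])  # > 1: odds of participation rise per unit
```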

Research ethics

Approval for collecting, storing and utilizing the de-identified data was granted on October 12, 2017 by the Norwegian Center for Research Data (project no. 55875).

Results

Response rates

Of the 305 eligible students, 187 students participated in the first study year (response rate 61.3%). In the second year of study, 168 students participated, representing a response rate of 55.1%. In the third year of study, 200 students participated, representing a response rate of 65.6% and an increase in response rate compared to both previous years. A total of 263 students (86.2%) participated at least once during the three-year study period. We were unable to track the exam grades of 23 students, and these were removed from the analyses. Thus, 240 students constituted the final study sample. Of these, 118 students (49.2%) participated consistently at all three measurement occasions.

Participant characteristics

The sample mean age was 22.6 years, and 190 of the 240 participants (79.2%) were women. At enrollment, occupational therapy was the highest priority line of education for 63% of the students, while 41% had previous experience from university level studying. Table 1 displays the students’ background characteristics, averaged grade and averaged perceptions of the learning environment and approaches to studying.
Table 1

Sample characteristics (n = 240).

Variables | Scale range | Values
Sociodemographic variables | | M (SD)
    Age (years) | — | 22.6 (4.4)
 | | n (%)
    Female gender | — | 190 (79.2)
    Occupational therapy was priority line of study | — | 152 (63.3)
    With prior higher education experience | — | 98 (40.8)
Learning environment | | M (SD)
    Clear goals and standards | 5–25 | 16.8 (3.2)
    Student autonomy | 6–30 | 18.3 (3.5)
    Good teaching | 8–40 | 26.3 (5.1)
    Appropriate workload | 5–25 | 15.2 (3.3)
    Generic skills | 6–30 | 23.8 (3.6)
    Satisfaction with the study program¹ | 1–5 | 3.8 (0.8)
Approaches to studying | | M (SD)
    Deep approach | 16–80 | 57.1 (8.0)
    Strategic approach | 20–100 | 71.8 (8.8)
    Surface approach | 16–80 | 46.3 (8.5)
Academic performance | | M (SD)
    Average exam grade | 1–6 | 4.0 (0.8)

Note. Scale range is the possible scale range. ¹Satisfaction with the study program is a single item; one participant had a missing score on this variable.

Predictors of consistent participation in the research study

Table 2 displays the results from the unadjusted regression analysis (left side) and the adjusted regression analysis (right side), using consistent participation as the dependent variable. None of the sociodemographic variables or the learning environment variables significantly predicted change in the odds for consistent participation. However, each unit increase in strategic approach score increased the odds of consistent participation, even in the fully adjusted model (adjusted OR: 1.04, p < 0.01). Conversely, each unit increase in surface approach score decreased the odds of consistent participation, even in the fully adjusted model (adjusted OR: 0.95, p < 0.05). Higher grades were associated with higher odds of consistent study participation in the unadjusted model, but the association was no longer statistically significant in the adjusted model.
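Because these per-unit odds ratios look small, it can help to scale them to a one-standard-deviation difference in the predictor (sample SDs from Table 1: 8.8 for the strategic and 8.5 for the surface approach scale). A back-of-envelope computation:

```python
# Per-unit adjusted odds ratios from Table 2
or_strategic, or_surface = 1.04, 0.95

# Sample SDs of the scale scores from Table 1
sd_strategic, sd_surface = 8.8, 8.5

# Cumulative OR over a one-SD difference: OR_per_unit ** SD
per_sd_strategic = or_strategic ** sd_strategic  # ~1.41: about 41% higher odds
per_sd_surface = or_surface ** sd_surface        # ~0.65: about 35% lower odds
```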
Table 2

Single and multiple binary logistic regression analyses showing associations with consistent participation in the research study.

Independent variables | Unadjusted OR | 95% CI | p | Adjusted OR | 95% CI | p
Sociodemographic variables
    Age | 1.00 | 0.94–1.06 | 0.94 | | |
    Gender | 0.87 | 0.47–1.62 | 0.87 | | |
    Priority line of study | 1.15 | 0.68–1.95 | 0.60 | | |
    Prior higher education | 0.74 | 0.44–1.24 | 0.25 | | |
Learning environment
    Clear goals and standards | 1.05 | 0.97–1.14 | 0.24 | 1.01 | 0.91–1.13 | 0.81
    Student autonomy | 1.02 | 0.95–1.10 | 0.61 | 1.03 | 0.93–1.13 | 0.60
    Good teaching | 1.00 | 0.95–1.05 | 0.91 | 0.99 | 0.92–1.07 | 0.85
    Appropriate workload | 1.03 | 0.95–1.11 | 0.50 | 0.97 | 0.88–1.07 | 0.56
    Generic skills | 0.99 | 0.92–1.06 | 0.75 | 0.93 | 0.83–1.03 | 0.15
    Satisfaction with study program | 1.11 | 0.82–1.52 | 0.50 | 0.99 | 0.63–1.55 | 0.96
Approaches to studying
    Deep approach | 1.00 | 0.97–1.03 | 0.97 | 1.00 | 0.96–1.03 | 0.86
    Strategic approach | 1.04 | 1.01–1.07 | 0.01 | 1.04 | 1.01–1.08 | 0.02
    Surface approach | 0.96 | 0.93–0.99 | 0.004 | 0.95 | 0.92–0.99 | 0.02
Academic performance
    Mean exam grade | 1.48 | 1.05–2.09 | < 0.05 | 1.25 | 0.86–1.82 | 0.24

Note. In the adjusted model, all variables are entered into the equation in one block. Parameters for the adjusted model: Model χ2 = 19.6, p < 0.05. Nagelkerke R2 = 0.11, Cox-Snell R2 = 0.08. In the analyses using ‘satisfaction with study program’ as predictor, 239 participants with valid scores were included.


Discussion

The aim of the study was to examine learning environment factors and approaches to studying as predictors of occupational therapy students’ consistent participation in three waves of data collection in a learning environment survey. The students’ scores on the strategic and surface approach scales were significantly associated with higher and lower odds for consistent participation in the research study, respectively. None of the background variables, learning environment variables, or academic performance significantly predicted consistent participation in the study.

Response rates across time

In this study, student response rates varied at each point of measurement, but were modest overall. This is consistent with the findings of others who studied allied health students longitudinally [10,11,27,28]. Comparisons beyond this are difficult, as many studies capturing data at multiple points in time do not report when participants dropped out, or their reasons for non-participation/drop-out, offering only overall outcomes. Of course, participants have a right to discontinue their involvement in research studies, so trying to ascertain reasons for dropping out may not be appropriate. Even the term “response rate” is defined and derived in a variety of ways, often not clearly described in studies [29], further decreasing the effectiveness and potential value of comparisons between studies. Our study found the lowest level of participation in the second study year, which may reflect some unknown factor, such as the intensity of the workload at that time in the program. Conversely, the highest participation rate occurred in the third study year, possibly indicating stronger inclinations towards contributing to the knowledge base of a profession the students are soon to enter. Moreover, at the time of the data collection in the third year, the students were preparing for their bachelor’s thesis, which involves planning and conducting a small-scale research project. Thus, the students’ involvement in thesis work at the time may also have served to increase their interest in and willingness to participate in the study in the third year. All of these connections, however, would require further investigation. Nonetheless, this study provides novel information about occupational therapy students’ participation rates over three years of data collection, and these appear to be consistent with longitudinal study norms [9]. There is some evidence that intervention studies, versus observational studies, may show higher retention rates among allied health students.
For example, DaLomba et al. [30] studied the impact of an embedded librarian on the information literacy skills (ILS) of occupational therapy students and reported a 92% retention rate. It is plausible that the novel addition of the librarian’s presence in the course, providing direct teaching, mentoring, and consistent communication of her purpose (to enhance ILS), acted as a prompt for students to complete study protocols at both collection times (beginning and end of semester). The high participation rate in that study may also be attributed to study goals being highly related to course goals, thus increasing the relevance and meaning of study participation for the students. Understanding the purpose and relevance of course work correlates with increased academic engagement [31,32]; therefore, it seems reasonable that student participation in study procedures could have been enhanced by these factors.

The present study found that only two features of student approaches to studying and learning impacted persistent participation. Students who had higher scores on strategic learning were more likely, whereas students who had higher scores on surface learning were less likely, to participate in all three data collection procedures. Since strategic learners are those who actively observe the dynamics of their environment and seek ways to meet the standards of those who assess their learning [2], it may be that these students responded to requests to participate more frequently due to a perceived benefit of doing so, such as enhancing their chances of getting better grades in the course. Alternatively, or in addition, strategically oriented students are often able to manage a higher workload [2]. If setting aside time to complete the survey was perceived as increasing the workload, students with higher scores on the strategic approach may have been better prepared, and thus more inclined, to respond to it consistently over the three time points.
Conversely, surface learners tend to operate from a fear of failure, often feeling overwhelmed by work and expectations, and are often inclined to spend little effort beyond what they believe is necessary to pass exams [33]. Therefore, higher dropout rates among students with higher scores on surface learning might have been expected if the addition of three surveys compounded these feelings. Strategic approach behaviors have been associated with desirable states and outcomes, such as higher self-efficacy [5] and better academic results [34], while surface approach behaviors have been linked with poorer self-efficacy and poorer mental health [5], as well as poorer academic outcomes [35]. Thus, the results indicate that consistent study participation may be somewhat more commonplace among higher functioning, academically oriented students with productive study approaches. However, this needs to be further researched. In view of the very weak associations found, however, this study suggests that variations in students’ study approaches have a negligible impact on the validity of findings in longitudinal studies with this population.

In this study, variations in the students’ scores on the learning environment variables did not significantly co-vary with consistent participation in the research study. While aspects of the learning environment may lead to increased engagement [31] and satisfaction with the education program [36], they appear to be irrelevant to retention in, or drop-out from, a research study conducted during the course of study. While there is some evidence that females are more likely to remain engaged in longitudinal studies [29], none of the sociodemographic variables was associated with consistent participation across the three time points in this study.
Notably, participation in studies is on the decline overall, possibly due to the volume of research being done and the amount and frequency with which people are being asked to contribute [29]. This could be a factor impacting the response rates of this study. Moreover, there is evidence that higher education may be becoming a more commercialized process, in which students are viewed as consumers whose feedback and experiences are routinely collected and used for course improvement, as well as for recruitment purposes [37]. This, too, may have impacted students’ willingness to participate in repeated data collection.

Study strengths and limitations

This research was carried out at all (six) academic institutions offering occupational therapy education in Norway, thus providing a fairly comprehensive perspective of Norwegian occupational therapy students. Nonetheless, the results require careful interpretation, as the study sample is rather small, restricted to undergraduate students in one country, and the characteristics of students opting out at all time points remain unknown. We have no information about the eligible students who did not take part in the study on any of the measurement occasions. The results of the study may therefore not be fully applicable to the general occupational therapy student population. The study did not capture reasons for dropping out, nor was information obtained about the workload or other environmental factors that might explain the change in participation between the three time points. For students participating at two or three measurement occasions, their scores on the study approach and learning environment scales were averaged across the relevant number of measurement occasions. While this was necessary in order to use these scales as independent variables in the analysis (given that participants had participated in the data collection on a different number of occasions), it also means that the study has not taken possible variations in these measures over time into consideration. One of the learning environment scales with particularly low reliability was not used in this study. However, several of the other scales that were used also had lower than desired reliabilities.

Conclusion

The study found that rates of student participation fluctuated across time; however, 86% of the eligible students participated in at least one of the three measurement times. Forty-nine percent completed the questionnaires at all time points, demonstrating that a relatively high dropout rate may be expected in a longitudinal observational study such as this. Students with higher scores on strategic approach, and lower scores on surface approach, were more likely to complete the questionnaires at each time point. There was no evidence that sociodemographic variables, features of the learning environment, or student academic performance were related to consistent participation. These findings imply that while substantial dropout is expected, dropout was relatively evenly distributed across sample subgroups. Thus, dropout appears not to introduce substantial sample bias in longitudinal analyses, as completers were similar to dropouts in most respects. However, students participating at all time points seem to be somewhat more well-organized and academically oriented than students who did not participate at all time points. These findings may be of value for researchers and educators who are planning longitudinal studies, be they observational or experimental, among occupational therapy students.
If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript: A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'. A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'. An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Jenny Wilkinson, PhD Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. 
Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf 2. We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. In your revised cover letter, please address the following prompts: a) If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent. b) If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. We will update your Data Availability statement on your behalf to reflect the information you provide. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. 
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Partly

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g. participant privacy or use of data from a third party), those must be specified.

Reviewer #1: Yes
Reviewer #2: Yes

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

5. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: How were the data collected? Was it self-administered or by interview? Please specify. Was the tool standardized?
If yes, have you obtained permission to translate it into Norwegian? Also mention its reliability (original tool). The explanation of the learning environment (CEQ) questionnaire is a little confusing for the reader: somewhere it is mentioned as 30 items and later as 37 items. Re-write it.

Reviewer #2: This was a study that looked at factors that may predict whether occupational therapy students would consistently participate in a three-year longitudinal study. Overall, the writing and presentation of this study was clear. Comments I have regarding each of the sections are listed below:

Introduction: I feel that the opening to your introduction is a little confusing. You introduce your study with a discussion of engagement, which seems out of place to me. When I read this the first time, I thought you were referring to engagement as participation in research, which I didn’t agree with (but you then did make this reference in the discussion, so maybe you were?). I believe you are using engagement to represent learning environment, but then wonder why you don’t refer to it as the learning environment? Rows 88-90 – I find this sentence confusing: “…team-based learning, [18] found… limited to a degree that is decreased the generalizability…” You talk about this study as looking at factors that predict drop-out, but does it also not refer to students who may not complete the first year (or second year), but opt in at a later year? I think this needs to be clarified.

Methods: Participants – just to clarify, the data you collected were from a single cohort of students – so year 1 was year 1 students, year 2 was year 2 students, year 3 was year 3 students? Or was it all year 1, year 2, and year 3 students over a 3-year period? I think this could be made clearer. Variables – why were prior experience in higher education and prioritized occupational therapy selected as sociodemographic variables? I would like a little more information regarding academic performance; for example, the number or range of exams each year and why you used the qualitative descriptions as opposed to keeping this as a continuous variable.

Data Analysis – In the analysis section, you state that if participants completed the surveys multiple times, you averaged their scores for the variables used. I think you need to clarify your rationale for doing this and the possible limitations of doing this. For example, I would be concerned that perceptions of the learning environment may be different at different years of the program.

Results: You have used the summed scores for your learning environment and approaches to studying variables. Did you have any missing data for these scales and how did you deal with it? Table 2 – for your statistically significant results, I would suggest that you report the actual p-value instead of the <.05/<.01.

Discussion: At the end of the section on response rates across time, you refer to research participation as engagement, and I would suggest that you use a different term. You found statistically significant results for strategic and surface learning. In looking at the results and the confidence intervals, both results are very close to 1. I think it would be relevant to include a bit of a discussion regarding the practical significance of these findings.

Limitations: I agree that it was a good call to not analyze the appropriate assessment variable. I think it should also be noted as a limitation that many of the other scales also had lower than desired reliabilities (<.80). I also think it should be noted as a limitation that you do not know about the characteristics of those who chose not to take part at all in your study.

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files.
If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: Yes: Nirmala Pradhan
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

Submitted filename: PONE-D-21-11288_reviewer.pdf

7 Jun 2021

Dear Editor and Reviewers,

Thank you for your comments on the manuscript. All comments have been addressed in this response letter, and all changes have been performed using track changes for Word. We look forward to hearing from you.

Best wishes,
The Authors

*************************************************************************

Editor (E): When submitting your revision, we need you to address these additional requirements. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming.
The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

Authors: The additional requirements have been addressed.

E: We note that you have indicated that data from this study are available upon request. PLOS only allows data to be available upon request if there are legal or ethical restrictions on sharing data publicly. For more information on unacceptable data access restrictions, please see http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions.

Authors: An updated Data Availability Statement has been provided. Please note that the preliminary URL to the dataset will be replaced with a permanent URL by the time of acceptance of the manuscript.

E: In your revised cover letter, please address the following prompts: If there are ethical or legal restrictions on sharing a de-identified data set, please explain them in detail (e.g., data contain potentially sensitive information, data are owned by a third-party organization, etc.) and who has imposed them (e.g., an ethics committee). Please also provide contact information for a data access committee, ethics committee, or other institutional body to which data requests may be sent.

Authors: No ethical or legal restrictions apply; see revised cover letter.

E: If there are no restrictions, please upload the minimal anonymized data set necessary to replicate your study findings as either Supporting Information files or to a stable, public repository and provide us with the relevant URLs, DOIs, or accession numbers. For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories.

Authors: The data have been stored at INN Open Research Data; URL: https://dataverse.no/privateurl.xhtml?token=91353718-e3c2-4530-9b77-4b03f45e1ab3.
Reviewer 1 (R1): How were the data collected? Was it self-administered or by interview? Please specify.

Authors: All questionnaires were self-administered (see Procedure section), whereas exam grades were collected from registries at each of the education institutions (see Measures/Academic Performance section).

R1: Was the tool standardized? If yes, have you obtained permission to translate it into Norwegian? Also mention its reliability (original tool).

Authors: The study used two tools: the ASSIST and the CEQ. Both are standardized (although several shorter and modified versions exist) and have been previously translated into Norwegian (Diseth, 2001; Pettersen, 2007). Thus, translation was not performed by the authors. Currently, both measures are in the public domain and there is general permission to use them. Reliability estimates for both measures are included; see Measures section.

R1: The explanation of the learning environment (CEQ) questionnaire is a little confusing for the reader. Somewhere it is mentioned as 30 items and later as 37 items. Re-write it.

Authors: We agree, and the section has been re-written; see Measures section.

Reviewer 2 (R2): This was a study that looked at factors that may predict whether occupational therapy students would consistently participate in a three-year longitudinal study. Overall, the writing and presentation of this study was clear. Comments I have regarding each of the sections are listed below:

Introduction: I feel that the opening to your introduction is a little confusing. You introduce your study with a discussion of engagement, which seems out of place to me. When I read this the first time, I thought you were referring to engagement as participation in research, which I didn’t agree with (but you then did make this reference in the discussion, so maybe you were?). I believe you are using engagement to represent learning environment, but then wonder why you don’t refer to it as the learning environment?

Authors: The initial paragraph was meant to address broader areas which ultimately we were not able to address in this manuscript; therefore, the first paragraph has been eliminated to allow specific focus on student approaches to learning and study participation over time.

R2: Rows 88-90 – I find this sentence confusing: “…team-based learning, [18] found… limited to a degree that is decreased the generalizability…”

Authors: We agree, and we have modified the sentence.

R2: You talk about this study as looking at factors that predict drop-out, but does it also not refer to students who may not complete the first year (or second year), but opt in at a later year? I think this needs to be clarified.

Authors: We agree, and we have clarified the point; see the new section ‘Consistent participation’ in the Methods chapter.

R2: Methods: Participants – just to clarify, the data you collected were from a single cohort of students – so year 1 was year 1 students, year 2 was year 2 students, year 3 was year 3 students? Or was it all year 1, year 2, and year 3 students over a 3-year period? I think this could be made clearer.

Authors: We have clarified the issue; see revised Design section.

R2: Variables – why were prior experience in higher education and prioritized occupational therapy selected as sociodemographic variables?

Authors: Prior experience from higher education has predicted academic performance (Bonsaksen, 2016) as well as scores on study approach measures (Bonsaksen, Sadeghi, & Thørrisen, 2017). Similarly, having occupational therapy as the top-priority line of education at the time of enrolment has been associated with scores on study approach scales (Thørrisen et al., 2020). Given the possibility of covariance between these independent variables, all were included in the multivariate analysis. See revised ‘Sociodemographic variables’ section.

R2: I would like a little more information regarding academic performance; for example, the number or range of exams each year and why you used the qualitative descriptions as opposed to keeping this as a continuous variable.

Authors: Information about the range of the number of exams in the education institutions is provided; see Academic Performance section. The grade measure was indeed a continuous measure; the text in this section merely refers to the document within which the grades (i.e., A-F) are explained in brief, qualitative statements. See the explanation in the document ‘Karaktersystemet’ (uhr.no).

R2: Data Analysis – In the analysis section, you state that if participants completed the surveys multiple times, you averaged their scores for the variables used. I think you need to clarify your rationale for doing this and the possible limitations of doing this. For example, I would be concerned that perceptions of the learning environment may be different at different years of the program.

Authors: These issues have been addressed in the revised Study Limitations section.

R2: Results: You have used the summed scores for your learning environment and approaches to studying variables. Did you have any missing data for these scales and how did you deal with it?

Authors: One of the 240 participants did not complete information about satisfaction with the study programme. This person is therefore not included in any analyses where this variable is used. The issue is commented on in the notes beneath Tables 1 and 2.

R2: Table 2 – for your statistically significant results, I would suggest that you report the actual p-value instead of the <.05/<.01.

Authors: Performed as requested; see revised Table 2.

R2: Discussion: At the end of the section on response rates across time, you refer to research participation as engagement, and I would suggest that you use a different term.

Authors: Agreed. The wording has been changed to reflect participation in the study.

R2: You found statistically significant results for strategic and surface learning. In looking at the results and the confidence intervals, both results are very close to 1. I think it would be relevant to include a bit of a discussion regarding the practical significance of these findings.

Authors: In the revised Discussion, we have included a brief statement about the significance of the findings. The Conclusion section is aligned with this interpretation.

R2: Limitations: I agree that it was a good call to not analyze the appropriate assessment variable. I think it should also be noted as a limitation that many of the other scales also had lower than desired reliabilities (<.80). I also think it should be noted as a limitation that you do not know about the characteristics of those who chose not to take part at all in your study.

Authors: We agree, and we have included this in the Study Limitations section.

References used in the response to reviewers:

Bonsaksen, T. (2016). Predictors of academic performance and education programme satisfaction in occupational therapy students. British Journal of Occupational Therapy, 79(6), 361-367. doi:10.1177/0308022615627174

Bonsaksen, T., Sadeghi, T., & Thørrisen, M. M. (2017). Associations between self-esteem, general self-efficacy, and approaches to studying in occupational therapy students: A cross-sectional study. Occupational Therapy and Mental Health, 33(4), 326-341. doi:10.1080/0164212X.2017.1295006

Diseth, Å. (2001). Validation of a Norwegian version of the Approaches and Study Skills Inventory for Students (ASSIST): Application of structural equation modelling. Scandinavian Journal of Educational Research, 45(4), 381-394. doi:10.1080/0031380120096789

Pettersen, R. C. (2007). Students' experience with and evaluation of teaching and the learning environment: Presentation of the Course Experience Questionnaire (CEQ) and validation of three Norwegian versions [in Norwegian: Studenters opplevelse og evaluering av undervisning og læringsmiljø: Presentasjon av Course Experience Questionnaire (CEQ) og validering av tre norske versjoner, Erfaringer med studiet (EMS)]. Halden, Norway: Østfold University College. Report no. 4.

Thørrisen, M. M., Mørk, G., Åsli, L. A., Gramstad, A., Stigen, L., Magne, T. A., . . . Bonsaksen, T. (2020). Student characteristics associated with dominant approaches to studying – comparing a national and an international sample (early online). Scandinavian Journal of Occupational Therapy. doi:10.1080/11038128.2020.1831056

Submitted filename: Response to reviewers.docx

14 Jun 2021

Predictors of students’ participation in a learning environment survey with annual follow-ups
PONE-D-21-11288R1

Dear Dr. Bonsaksen,

We’re pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you’ll receive an e-mail detailing the required amendments. When these have been addressed, you’ll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact.
If they’ll be preparing press materials, please inform our press team as soon as possible, no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Jenny Wilkinson, PhD
Academic Editor
PLOS ONE

Additional Editor Comments (optional): Thank you for your responses to reviewer comments and manuscript revisions. These have satisfactorily addressed the review comments.

17 Jun 2021

PONE-D-21-11288R1
Predictors of students’ participation in a learning environment survey with annual follow-ups

Dear Dr. Bonsaksen:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr Jenny Wilkinson
Academic Editor
PLOS ONE
References (13 in total)

1.  Participation rates in epidemiologic studies.

Authors:  Sandro Galea; Melissa Tracy
Journal:  Ann Epidemiol       Date:  2007-06-06       Impact factor: 3.797

2.  Factor analysis of the Approaches and Study Skills Inventory for Students in a cross-cultural occupational therapy undergraduate student sample.

Authors:  Tore Bonsaksen; Milada C Småstuen; Mikkel M Thørrisen; Kenneth Fong; Hua Beng Lim; Ted Brown
Journal:  Aust Occup Ther J       Date:  2018-07-31       Impact factor: 1.856

3.  A longitudinal study of health professional students' attitudes towards interprofessional education at an American university.

Authors:  Risa Liang Wong; Deborah Bain Fahs; Jaideep S Talwalkar; Eve R Colson; Mayur M Desai; Gerald Kayingo; Matthew Balanda; Anthony G Luczak; Marjorie S Rosenthal
Journal:  J Interprof Care       Date:  2016       Impact factor: 2.338

4.  Team-Based Learning and the Team-Based Learning Student Assessment Instrument (TBL-SAI): A Longitudinal Study of Master of Occupational Therapy Students' Changing Perceptions.

Authors:  Ron Carson; Heidi Mennenga
Journal:  Am J Occup Ther       Date:  2019 Jul/Aug

5.  Student perceptions of the learning environment in Norwegian occupational therapy education programs.

Authors:  Björg Thordardottir; Linda Stigen; Trine A Magne; Susanne G Johnson; Astrid Gramstad; Adrian W Gran; Lene A Åsli; Gry Mørk; Tore Bonsaksen
Journal:  Scand J Occup Ther       Date:  2020-10-12       Impact factor: 2.611

6.  Student characteristics associated with dominant approaches to studying: Comparing a national and an international sample.

Authors:  Mikkel M Thørrisen; Gry Mørk; Lene A Åsli; Astrid Gramstad; Linda Stigen; Trine A Magne; Tove Carstensen; Susanne G Johnson; Ted Brown; Hua B Lim; Kenneth N K Fong; Tore Bonsaksen
Journal:  Scand J Occup Ther       Date:  2020-10-11       Impact factor: 2.611

7.  Using multiple imputation to deal with missing data and attrition in longitudinal studies with repeated measures of patient-reported outcomes.

Authors:  Karin Biering; Niels Henrik Hjollund; Morten Frydenberg
Journal:  Clin Epidemiol       Date:  2015-01-16       Impact factor: 4.790

8.  Retention strategies in longitudinal cohort studies: a systematic review and meta-analysis.

Authors:  Samantha Teague; George J Youssef; Jacqui A Macdonald; Emma Sciberras; Adrian Shatte; Matthew Fuller-Tyszkiewicz; Chris Greenwood; Jennifer McIntosh; Craig A Olsson; Delyse Hutchinson
Journal:  BMC Med Res Methodol       Date:  2018-11-26       Impact factor: 4.615

9.  The impact of clinical placements on the emotional intelligence of occupational therapy, physiotherapy, speech pathology, and business students: a longitudinal study.

Authors:  Nigel Gribble; Richard K Ladyshewsky; Richard Parsons
Journal:  BMC Med Educ       Date:  2019-03-27       Impact factor: 2.463

10.  Attrition and generalizability in longitudinal studies: findings from a 15-year population-based study and a Monte Carlo simulation study.

Authors:  Kristin Gustavson; Tilmann von Soest; Evalill Karevold; Espen Røysamb
Journal:  BMC Public Health       Date:  2012-10-29       Impact factor: 3.295

