Page 97 - Teaching Innovation for the 21st Century

where only a pre- or post-test may have been administered (Alinea and Naylor 2017; Alinea and Naylor 2015; Alinea 2020). A ‘polarisation effect’ had been observed for questions 5, 11, 13, 18, 29 and 30, where student responses were divided between the correct answer and a ‘mostly correct’ answer that carries a misleading statement. These misleading statements intentionally target a misconception (i.e., that an object in motion is driven by an active force). In a study focused on the pre-test data from first-year physics students, we observed this same polarisation effect, most pronounced in algebra-based introductory physics classes (Chrysostomou et al. forthcoming). We observed that the proportion of students polarised towards the wrong choice remained fairly high and even overtook the correct answer. Students seemed to struggle most in understanding the interplay of forces for objects in motion.
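The polarisation effect described above can be made concrete with a short sketch. The function, thresholds and data below are purely illustrative assumptions, not the study's actual analysis pipeline: a question is flagged as ‘polarised’ when both the correct answer and a single distractor each attract a sizeable share of responses.

```python
# Illustrative sketch (hypothetical data and threshold): flag questions
# whose responses split between the correct answer and one popular distractor.
from collections import Counter

def polarised_questions(responses, answer_key, threshold=0.35):
    """responses: {question: [choice, ...]}; answer_key: {question: correct choice}.
    Flags a question when the correct answer AND the most popular distractor
    each attract at least `threshold` of all responses."""
    flagged = []
    for q, choices in responses.items():
        counts = Counter(choices)
        n = len(choices)
        correct_frac = counts.get(answer_key[q], 0) / n
        distractor_counts = [k for c, k in counts.items() if c != answer_key[q]]
        top_distractor_frac = max(distractor_counts, default=0) / n
        if correct_frac >= threshold and top_distractor_frac >= threshold:
            flagged.append(q)
    return flagged

# Hypothetical response lists for two questions ("A" correct in both)
data = {
    "Q5":  ["A"] * 40 + ["B"] * 45 + ["C"] * 15,  # split A vs B: polarised
    "Q12": ["A"] * 80 + ["B"] * 10 + ["C"] * 10,  # dominated by A: not
}
key = {"Q5": "A", "Q12": "A"}
print(polarised_questions(data, key))  # ['Q5']
```

A per-question summary of this kind is one way the division between a correct answer and a misconception-driven distractor can be surfaced in large cohorts.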
In the present study, our research aim was to investigate whether this polarisation effect could be observed after a semester’s worth of introductory mechanics taught remotely at a South African university. To do so, we performed a question-by-question analysis on post-test responses to classical mechanics questions, with a total of N = 139 students. These are also compared (question-by-question) to the pre-test cohort.2 We begin with a description of the technology used to collate the responses in Section 2 and then present our results in Section 3. Concluding remarks and future avenues of research are presented in Section 4.
2 This is compared to the pre-test cohort numbers: N = 353.
Data collection and processing in large cohorts
Due to the ongoing nature of the Covid-19 pandemic, South African universities were expected to function mostly, if not entirely, online. The delayed release of the 2020 Grade 12 results further derailed the 2021 academic programme, with first-year academic activities only beginning on 8 March 2021 at the University of Johannesburg (UJ), leading to a first semester shortened by several weeks.3
The FCI pre-test was administered in the second and third week of April 2021. The classes involved in this testing included students majoring in physics, life science, earth science and physics (high school) education. Second semester activities began on 15 July, with the post-test administered in the first and second week of August 2021. Engineering students and earth science majors were included in this cohort. Since only 7 of the 139 students involved in the post-test phase wrote both the pre- and post-test, it was not possible to perform a detailed analysis capable of extracting statistically significant conclusions on the effectiveness of instruction from the data. For this reason, a commentary on the observed gains (and the extent to which the polarisation effect was influenced by a semester’s worth of mechanics classes) is beyond the scope of this study. The pre- and post-tests were set up via the Blackboard interface and made available to students for a total of five days, using the module page corresponding to each course. Through this platform, we were able to track student activity (to determine whether test-takers left the browser page during the course of the test), time their test attempt and force submission precisely 30 minutes after the test was initiated.
3 For results at UJ during the 2020 academic year, see Carleschi et al. 2021.
Teaching Innovation for the 21st Century | Showcasing UJ Teaching and Learning 2021