Date of Award


Degree Name

Doctor of Philosophy



First Advisor

Dr. Stephanie Peterson

Second Advisor

Dr. Jonathan Baker

Third Advisor

Dr. Wayne Fuqua

Fourth Advisor

Dr. Nancy Neef


Keywords
Online instruction, active student responding, response cards, fill-in-the-blank, synchronous


Abstract
As of 2016, approximately 28% of college students in the United States were taking at least one online course (U.S. Department of Education, 2016), and the percentage of students enrolled in online courses was projected to continue increasing by 33% each year (Pethokoukis, 2002). The COVID-19 pandemic hastened further shifts from in-person to virtual learning at many institutions of higher education. Given this rapid shift to online instruction, it is critical to evaluate the effectiveness of online instructional procedures. Providing students with multiple opportunities to respond during instruction has proven effective across most educational settings (Archer & Hughes, 2011; Moore Partin et al., 2010), using various active student response systems, including response boards and personal response systems (i.e., clickers). While a robust body of literature supports the effectiveness of embedding opportunities to respond during in-person instruction, to date there are limited data on the effects of embedding opportunities to respond in synchronous online formats in post-secondary settings. Using an alternating treatments design, this study evaluated the effects of two active student response modalities (i.e., response cards and written responses in the chat forum) on response accuracy during a synchronous online graduate course. The results suggest that students responded more accurately to post-lecture queries following conditions that required written responses in the chat forum. Moreover, accurate responding was maintained across the exams and the cumulative final exam. Limitations and implications for future research are discussed.

Access Setting

Dissertation-Open Access