Last month, I considered the strategy of including reading quizzes. Essentially, while I hated pop quizzes as a student, I thought I might be shortchanging students who do well as test takers. I decided to try quizzes in the online technical writing course during Virginia Tech’s Winter Session.

Now that the course is over, I have to admit that the quizzes seemed useful and effective. Logistically, the system was simple to set up. The companion website for the textbook included quizzes that were ready to import into Scholar (our campus installation of Sakai). I had to edit the quizzes to randomize answers where possible and to remove the requirement for written rationales on some questions. Otherwise, they were ready to go; I simply used what was available.
In course materials, I described the assessments as “reading quizzes.” I explained that my goal was to help students find the key information in each chapter. The quizzes were open book, and students had as much time as they needed to complete them. The only restrictions were that they had to complete the quiz in one sitting and that they could not retake the quizzes.
Scholar automatically graded students’ work, giving them immediate feedback. Their averages on these quizzes were high. Nearly everyone earned 100% on every quiz; occasionally, someone missed one question. The only truly low grades belonged to students who received zeros for not taking a quiz at all before the deadline. Scholar’s built-in analysis tools let me review statistics and item analysis for each question, so it was easy to spot any questions that gave students trouble.
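For readers who haven’t used a tool like this, here is a rough sketch of the core calculation behind the item analysis Scholar reports. This is not Scholar’s actual code, just an illustration of the idea: an item’s “difficulty” is simply the proportion of students who answered it correctly, so a question with an unusually low proportion stands out as one students had trouble with. The sample response data below is invented for the example.

```python
# Hypothetical illustration of item-difficulty analysis, the kind of
# per-question statistic a quiz tool like Scholar/Sakai can report.
# Each row is one student's results: True = correct, False = incorrect.
responses = [
    [True, True, True, False],   # student 1
    [True, True, False, False],  # student 2
    [True, True, True, True],    # student 3
    [True, False, True, False],  # student 4
]

def item_difficulty(responses):
    """Return the proportion of correct answers for each question."""
    num_students = len(responses)
    num_items = len(responses[0])
    return [
        sum(row[q] for row in responses) / num_students
        for q in range(num_items)
    ]

for q, p in enumerate(item_difficulty(responses), start=1):
    print(f"Question {q}: {p:.0%} correct")
# Question 4 (25% correct) would flag itself for a closer look.
```

In a real class, an instructor would scan this output for outliers: a question most students missed may be poorly worded, or it may point to material worth revisiting.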
Anecdotally, students seemed to include more key terms from the text in discussions than in my previous classes. I haven’t analyzed the data to see if the numbers bear out that observation, but I think it makes sense. Previously, I relied on students reading the text and learning the material. I typically pointed out the key terms and guidelines in class in an attempt to help them focus on the significant details.
With these quizzes, I asked students to take an active role in finding, and often applying, the information from the text. They weren’t in the passive role of absorbing a reading or listening to me point out the details. They had to find the key information themselves and use it to pass the quiz. That difference between active and passive learning may be significant in this case.
So back to the title question: “Quizzes Work: True or False?” I’m going with True. In my circumstance, the quizzes seemed to help students identify key concepts, and I was able to set things up in a way that didn’t punish students. I am convinced enough that I am using reading quizzes again in my spring-semester technical writing course and adding them to my writing and digital media course (a multimodal composing class). I use Practical Strategies for Technical Communication in the former course and Writer/Designer in the latter, which has no ready-made quizzes on its companion site. I will have to write my own, but my experience during Winter Session was good enough that I think the effort will pay off. We’ll see as the term unfolds, and I’ll share the details.