Quizzes are opportunities to reinforce key concepts, check comprehension, and promote deeper thinking. Poorly designed questions can be immensely frustrating for learners, or even worse: a source of entertainment (see my last post for an example). What’s the most frustrating (or entertaining) quiz question you’ve ever seen? In this post, I’ll explore how to create meaningful quiz questions that are fair, defensible, and supportive of learning: how to get started, how to phrase questions, how to write distractors, and how to create feedback that reinforces learning.
How do I start creating quiz questions? (start with the learning outcomes)
Begin by considering the learning outcomes. The learning outcomes should drive all the learning in the training or module, so they are what you want to test your learners on. Unpack each outcome into specific, discrete areas and levels of knowledge; these should emerge naturally from your content. What are the critical messages in the module that you want the participant to take away?
Create a scenario in which a participant will need to apply their learning. The more real-world the scenario, the more your participants will be able to relate to it and the more authentic the question will be for your participants (which leads to more buy-in).
How do I write the questions? (be clear and concise)
When writing questions:
- Be clear and concise. Questions should test understanding, not the ability to read subtext or decipher meaning from superfluous detail (unless reading subtext is your learning outcome, in which case, can we chat? That sounds super cool).
- Use simplified language (but keep discipline-necessary vocabulary). Simplified language reduces barriers for English language learners in quizzes that are not intended to assess reading ability (Haladyna et al., 2002).
- Avoid unnecessary context that could distract from the main purpose of the question.
- Test one concept per question. “Measures of test reliability and validity assume that the items are independent from one another.” (Towns, 2014). In other words, if the questions are connected and build on each other in a way that penalises question 3 because question 1 was not correct, then it is not validly testing these ideas independently.
- Align the question to the verb of the learning outcome. If the verb asks learners to apply, then create a scenario that allows them to apply; don’t design a factual recall question.
How do I write defensible distractors? (make them plausible and nuanced)
Distractors (the incorrect answer options) are often the weak point of multiple-choice questions. To make them defensible:
- Make them plausible. Each distractor should sound like a realistic possibility to someone who hasn’t mastered the material.
- Avoid obvious throwaways. Nonsense options make the correct answer too easy.
- Use common misconceptions. If learners often confuse two concepts, include that misconception as a distractor.
- Use learned misinformation. It’s a great opportunity to clarify the truth and show the evidence.
- Create equal distractors. Make all options a similar length and level of nuance. A defensibly correct option usually needs more words to justify its ‘correctness’ and show nuance, so if the distractors are noticeably shorter or simpler, the correct answer gives itself away.
What does nuance look like in a distractor?
Here’s an example aligned with this learning outcome: “apply key principles of data security and compliance”.
You have a subscription to Articulate Rise to develop online modules as part of your role, and another member of your team is going to help you develop them. Do you:
1. Change the password and give them temporary access, then change it again after they have helped out.
2. Ask your manager for a new Articulate license for the staff member.
3. Give them the password and tell them not to share it with anyone.
4. Share the login details only after getting written approval from your manager.
This is a very common scenario when small teams need annual licenses for tools. The nuance in this scenario highlights common practice, not overtly negligent practice. It’s the authenticity in these questions that makes the participant think twice about the correct response: if it’s common practice, is it correct? (Hint: this is also the super important teachable moment about misinformation).
Looking at each option:
1. Feels plausible, and there is an element of security given that you will be changing the password multiple times. It still breaches the organisation’s IT policies.
2. Feels plausible, though it is doubtful that the manager would want to approve a couple of thousand for a license that isn’t going to get used all that much.
3. Feels less plausible, but it is fairly common practice in my experience. It also breaches the organisation’s IT policies.
4. Feels plausible, with an air of safety because you have written approval. It still breaches the organisation’s IT policies, and you won’t get off lightly just because you have your manager’s written approval.
So, from the perspective of the learning outcome “apply key principles of data security and compliance”, the correct response is option 2. It is the only response that does not breach the organisation’s IT policy, even though it is fairly common practice for staff in small teams to share passwords for paid subscriptions.
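If you draft questions outside your authoring tool, it can help to keep each option paired with its rationale so the nuance doesn’t get lost when you build the quiz. Below is a minimal sketch of how the scenario above could be captured in Python; the structure and field names are my own illustration, not an Articulate Rise format.

```python
from dataclasses import dataclass

@dataclass
class Option:
    text: str        # what the learner sees
    correct: bool    # exactly one option should be True
    rationale: str   # why the option is (or isn't) defensible; becomes the feedback later

# The license scenario above, with the per-option analysis stored alongside
# each option instead of in a separate document.
question = {
    "outcome": "apply key principles of data security and compliance",
    "stem": ("You have a subscription to Articulate Rise to develop online modules, "
             "and another member of your team is going to help you develop them. Do you:"),
    "options": [
        Option("Change the password and give them temporary access, then change it again afterwards.",
               False, "Feels secure, but still breaches the organisation's IT policies."),
        Option("Ask your manager for a new Articulate license for the staff member.",
               True, "The only option that does not breach the IT policy."),
        Option("Give them the password and tell them not to share it with anyone.",
               False, "Common practice in small teams, but a breach of the IT policies."),
        Option("Share the login details only after getting written approval from your manager.",
               False, "Written approval does not make password sharing compliant."),
    ],
}

# Quick sanity check while drafting: exactly one defensibly correct option.
assert sum(opt.correct for opt in question["options"]) == 1
```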
How do I write feedback? (explain and inform)
A really important (and often under-utilised) aspect of training quizzes is feedback. Feedback transforms a quiz from an assessment tool into a learning tool. Quizzes are useful for consolidating learning and checking participant comprehension, but the emphasis should be more on the feedback and less on whether the participant is correct or incorrect. Ultimately, mandatory compliance training is not about whether participants have met a specific level of competency; this is made clear by the fact that participants can repeat the quizzes until they get the correct responses. So, to make the quiz more meaningful, we should focus on the feedback.
Effective feedback should be more than correct/incorrect. Feedback should be:
➼ Explanatory, not just confirmatory. There are usually icons to indicate ‘correctness’, so the written feedback should do more than repeat that: it should explain why.
➼ Specific. Vary the feedback depending on the response chosen (not one global, whole-quiz message). This creates a feedback dialogue.
➼ Informative. Clarify misconceptions and misinformation. Describe why the answer is correct to reinforce understanding and to help those who may have guessed correctly.
➼ Supportive. Encourage learners by framing mistakes as opportunities to learn.
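These principles can even double as a draft-checking routine if you script or QA your quiz content before it goes into the authoring tool. The sketch below is a rough illustration under assumed, home-grown field names (each option carries its own "feedback" string); it is not tied to any particular tool.

```python
# A rough checker for the feedback principles above, assuming each option is a
# dict with its own "feedback" string (hypothetical field names, not from any
# authoring tool).

CONFIRMATORY = {"correct", "incorrect", "wrong", "well done", "try again"}

def check_feedback(options: list[dict]) -> list[str]:
    """Flag feedback that is missing, merely confirmatory, or not response-specific."""
    problems = []
    for i, opt in enumerate(options, start=1):
        fb = opt.get("feedback", "").strip()
        if not fb or fb.lower().rstrip(".!") in CONFIRMATORY:
            problems.append(f"Option {i}: feedback is missing or only confirms correctness.")
    texts = [opt.get("feedback", "") for opt in options]
    if len(set(texts)) < len(texts):
        problems.append("Two or more options share identical feedback; make each one response-specific.")
    return problems

# Example: the second option's feedback explains nothing beyond 'wrong'.
draft_options = [
    {"text": "Ask your manager for a new license.",
     "feedback": "Correct. This is the only option that complies with the IT policy."},
    {"text": "Share the password with your teammate.",
     "feedback": "Incorrect."},
]

for problem in check_feedback(draft_options):
    print(problem)  # -> Option 2: feedback is missing or only confirms correctness.
```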
Creating meaningful quiz questions in online modules is about aligning with learning outcomes, crafting clear and authentic questions, writing defensible distractors, and giving learners the kind of feedback that deepens understanding. If you’ve designed a quiz recently, try rewriting just one question using this framework. See if your learners engage differently. Interested in live quizzes? Check out my blog post on Checking comprehension with quizzes and in-class polling.
References
Haladyna, T. M., Downing, S. M., & Rodriguez, M. C. (2002). A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education, 15(3), 309–333. https://doi.org/10.1207/S15324818AME1503_5
Towns, M. H. (2014). Guide To Developing High-Quality, Reliable, and Valid Multiple-Choice Assessments. Journal of Chemical Education, 91(9), 1426–1431. https://doi.org/10.1021/ed500076x
