BLOG: Top 10 Question Writing Crimes #TBT
A couple of years ago we posted a blog - based on 'The Great Duck Race' game we created as sponsors of the eLearning Awards - detailing the things people get so wrong when setting quiz and assessment questions.
A lot has changed in the digital learning landscape since then, not least the continued evolution of games and technology. But the basics stay the same. Here's why that blog is as relevant today as it was then...
Although we live in a brave new world of super-fast broadband, 4K TV, Oculus Rift and learning ecosystems, the humble multiple-choice question still has an important role to play in the world of learning.
There is considerable skill involved in writing good MCQs and lots of ways of writing bad ones too. No matter how well the graphic designers wrap them up in creative and elegant clothing, those old familiar flaws still emerge blinking into the daylight with remarkable frequency.
We’ve been writing assessment questions for over 25 years and have learnt a lot of simple but important lessons along the way.
So here’s a “top ten” list of common question-writing ‘crimes’, using the Unicorn Great Duck Race as a light-hearted template.
1. Obvious questions. You may not know much about duck racing, but you can probably pick the right answer to this question:
You work in the anti-doping team of the Duck Racing Federation. The latest test results have just come in and a famous East European racing duck is showing a positive test. What should you do?
A. Destroy the results as they are clearly incorrect
B. Pass the results on to a junior member of the team to deal with in case they later prove to be incorrect
C. Speak to the duck concerned to tip him off
D. Follow the DRF standard procedures. Report the positive results to the anti-doping board, complete form DRB6b in triplicate, keep the pink copy for your file and return the other two by recorded delivery to DRF HQ
2. Labelling questions - we still see too many of them and they usually indicate that the writer doesn’t really understand the subject or is being paid by the question! Take this example. Is this really testing a key learning point that’s going to help me do my job better or is it just lazy question writing?
Which clause in the Duck Racing Code of Conduct is concerned with the allowed length of tail feathers?
A. Clause 3
B. Clause 4
C. Clause 7
D. Clause 9
Other examples include: ‘In what year…?’ ‘What do the initials FCA stand for…?’, ’Which Act introduced…?’, ‘How many Data Protection principles are there …?’ and so on.
3. Arithmetical questions - where the distractors are just ‘placeholders’ and not the result of a logical misunderstanding or miscalculation.
If six ducks set out through the rapids and three are eaten by crocodiles, how many are left?
A. 3
B. 23
C. 43
D. 14.75
This ‘crime’ is often committed during the SME updating process when, say, a tax rate changes. The writer updates the question stem and correct answer but ignores the distractors, which then become arithmetically improbable.
On a related point, remember wherever possible to keep the arithmetic simple - unless you are testing mathematical skills. Focus on the principle not the maths.
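As a loose illustration of the principle above (not from the original post, and the numbers are invented), each distractor should be the result of a believable miscalculation rather than an arbitrary placeholder:

```python
# Hypothetical sketch: derive distractors from plausible mistakes a learner
# might actually make, instead of inventing random numbers.
start, eaten = 8, 3

correct = start - eaten        # 5: the right answer
distractors = [
    start + eaten,             # 11: added instead of subtracted
    start,                     # 8: forgot to subtract at all
    eaten,                     # 3: reported the wrong quantity
]

options = sorted(distractors + [correct])
print(correct, options)        # prints: 5 [3, 5, 8, 11]
```

Because every wrong option traces back to a specific misunderstanding, a wrong answer tells you *what* the learner got wrong - and updating the stem forces you to recompute the distractors too.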
4. Negative stems - often compounded with negative answer options, these questions are unnecessarily confusing for the student.
Which of the following is NOT a drawback of having extra large flippers?
A. They are not effective on courses with rapids
B. They get tangled in the weeds
C. They don’t attract crocodiles
D. Not getting caught by overhanging branches
5. “All of (None of) the above” as an answer option - it’s normally the correct answer anyway, and it becomes even more inappropriate when the order of the answers is randomised!
Who is responsible for reporting breaches of the duck federation rules?
A. Competitors
B. The referee
C. The federation inspectors
D. All of the above
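A quick sketch (our own illustration, not from the original post) of why randomisation breaks this option: “All of the above” refers to the answers *above it by position*, so once an authoring tool shuffles the list, the reference can point at nothing.

```python
import random

# The option wording depends on list position...
options = [
    "Competitors",
    "The referee",
    "The federation inspectors",
    "All of the above",
]
random.shuffle(options)  # many quiz engines do this automatically

# "All of the above" may now appear first, above nothing at all.
# Position-independent wording such as "All of these" at least survives
# the shuffle, though the deeper flaw (it's usually the right answer)
# remains.
```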
6. Lack of (or inappropriate) alignment between the stem and answer options - if only one answer (or in extreme cases no answer) aligns with the stem, the question is flawed.
Fast ducks will usually defeat strong ducks because they…
A. have bigger flippers
B. strong ducks slow down in rapids
C. the size of the wings
D. fast ducks can outswim crocodiles
On a related point, a well-written assessment question normally has a ‘focused’ stem - it should be possible to formulate a response before looking at the answer options.
7. Answer options aren’t discrete - in the example below, if option D is correct, then options A and B, which are components of that answer, are also valid responses.
To enter the Great Duck Race, what, if anything, do you need to do?
A. Complete the online entry form
B. Pay the race entry fee
C. Join the DRF
D. Complete the online entry form and pay the race entry fee
8. Subjective question stem - ‘What would you do?’ (‘What do you think?’, ‘How will you respond?’). This may be suitable for exercises within an eLearning module but is not appropriate for objective testing, where you are seeking a correct response rather than an opinion.
While competing in the Great Duck Race, you spot a duck in difficulties in the rapids. What action will you take?
A. Swim on, as you are there to win
B. Laugh and point him out to other competitors
C. Stop and try to help
D. Look for the nearest lifebuoy
Depending on the character of the respondent, any of the answers is possible and all of them might be valid. “What should you do?” is marginally better but still too subjective.
9. Answer options that contain ‘never’ or ‘always’ - giving a clue that these are distractors, especially when included alongside a ‘usually’ option.
Which is true of the Drake’s Gulch racecourse?
A. Strong ducks always win
B. Fast ducks usually win
C. Intelligent ducks never win
D. Stable ducks always get eaten by crocodiles
10. If you can’t think of a tenth question, what should you do?
A. Don’t write one
B. Desperately try to write just one more even though you’ve already run out of learning points
C. Rename your article as the ‘Top Nine’ question writing crimes...