Solving Not Answering. Validation of Guidance for Writing Higher-Order Multiple-Choice Questions in Medical Science Education
2024

Sample size: 80 · Reading time: 10 minutes · Evidence: high

Author Information

Author(s): Maria Xiromeriti, Philip M. Newton

Primary Institution: Swansea University Medical School

Hypothesis

Guidance for writing multiple-choice questions can be used to create questions that effectively assess higher-order learning in medical education.

Conclusion

The study found that questions written using the guidance were significantly harder for novices to answer than lower-order questions, indicating that the guidance produces questions that assess higher-order learning rather than simple recall.

Supporting Evidence

  • Novices could easily answer lower-order questions but struggled with higher-order questions.
  • Experts maintained their ability to answer higher-order questions even under closed-book conditions.
  • Statistical analysis showed significant differences in performance between question formats.

Takeaway

This study shows that following question-writing guidance can help educators create questions that require deeper thinking from students, not just memorization.

Methodology

The study comprised two experiments comparing the performance of novice and expert students on higher-order and lower-order questions.

Potential Biases

Recruiting participants through online labor markets may have introduced selection bias.

Limitations

The study was limited to two subject areas and did not determine which specific elements of the guidance were most effective.

Participant Demographics

Participants included both novices and experts in medical science, with a total of 80 students taking part.

Statistical Information

P-value: 0.0002 (significant at the p < 0.05 threshold)

Digital Object Identifier (DOI)

10.1007/s40670-024-02140-7
