Certiverse Blog

How many reviews does it take to develop a high-quality exam question?

Written by Alan Mead, PhD | Sep 1, 2021 4:34:42 PM

Test developers understand the importance of reviewing test questions (i.e., items) to ensure quality. In the traditional test development process, much time is spent training Subject Matter Experts (SMEs) to write valid questions. The Certiverse platform reduces this training time by using AI and machine learning to guide the item writer through the process, providing prompts when needed to ensure high-quality items.

Despite the advances in machine learning over the last several years, however, human SMEs are still needed to review potential test questions.

The Certiverse peer-review system is designed to require a minimum of two independent reviews, but some items need more review after revision. For example, if a reviewer were to find an issue, they would flag the item and provide feedback to the author; the item would then need to be reviewed at least once more until it was marked “approved.” 
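To make that workflow concrete, here is a minimal sketch of the review loop in Python. It is not Certiverse's implementation; the Item class, the two-approval threshold, and the rule that a flagged item's approvals reset until it is re-reviewed are assumptions drawn only from the description above.

```python
# Minimal sketch of the peer-review loop described above (not Certiverse's
# actual implementation). The Item class, the two-approval minimum, and the
# "a flag resets the approval count" rule are assumptions for illustration.
from dataclasses import dataclass, field

MIN_INDEPENDENT_APPROVALS = 2  # each item needs at least two independent approvals

@dataclass
class Item:
    stem: str
    reviews: int = 0                       # total review cycles so far
    approvals: int = 0                     # approvals collected since the last revision
    feedback: list[str] = field(default_factory=list)

def apply_review(item: Item, issue_found: bool, comment: str = "") -> str:
    """Record one peer review and return the item's resulting status."""
    item.reviews += 1
    if issue_found:
        item.feedback.append(comment)      # feedback goes back to the author
        item.approvals = 0                 # the revised item must be reviewed again
        return "returned to author"
    item.approvals += 1
    if item.approvals >= MIN_INDEPENDENT_APPROVALS:
        return "approved"
    return "awaiting further review"
```

In this sketch an item keeps cycling through apply_review until it collects two approvals after its last revision, which is why some items in the data below accumulate many review cycles.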

To better understand how many reviews are required, we examined review data for one exam. In addition to being interesting in its own right, this analysis serves as a baseline to evaluate the effect of future improvements in our AI-assisted item-writing process. 

As shown in the table below, 58.5% of the items were finalized within two review cycles, but some items needed more reviews. More than 95% of test items were approved within five review cycles, though in a few rare instances an item required as many as ten reviews.

Because most of the remaining items were resolved by a fifth review, these data suggest that the “typical” number of reviews is two to four; fewer than 5% of items required more than five reviews. To support this level of review and approval, most programs will want at least six active SMEs to author and peer-review items. Employing a dozen or more authors not only makes the work go more quickly, but also allows for faster resolution in the rare instance when an item needs additional cycles of review and revision. When there are fewer SMEs, some items may not have enough available reviewers and will need to be manually approved by a “super SME” administrator.

Reviews   Percent of items   Percentile
      2               58.5           29
      3               19.7           68
      4               11.5           84
      5                6.0           93
      6                2.1           97
      7                0.9           98
      8                0.4           99
      9                0.1           99
     10                0.9         99.6
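
The summary numbers quoted above can be reproduced directly from the Percent column. The short Python sketch below assumes the Percentile column is a mid-interval percentile rank (the cumulative percent below a category plus half of that category's percent); that reading is our interpretation, not something the table states.

```python
# Reproduce the summary statistics from the table above. The percentile-rank
# formula (cumulative percent below a category plus half of that category's
# percent) is an assumed interpretation of the Percentile column.
percent_by_reviews = {
    2: 58.5, 3: 19.7, 4: 11.5, 5: 6.0, 6: 2.1,
    7: 0.9, 8: 0.4, 9: 0.1, 10: 0.9,
}

cumulative_below = 0.0
for reviews, pct in percent_by_reviews.items():
    cumulative = cumulative_below + pct            # % of items approved within this many reviews
    percentile_rank = cumulative_below + pct / 2   # assumed mid-interval percentile rank
    print(f"{reviews:>2} reviews: {pct:4.1f}% of items | "
          f"within {reviews} reviews: {cumulative:5.1f}% | "
          f"percentile rank: {percentile_rank:4.1f}")
    cumulative_below = cumulative

# The cumulative column confirms, for example, that about 95.7% of items
# were approved within five review cycles.
```

Under that assumption, the computed ranks match the table's Percentile column to within rounding.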


While these data come from just one exam, and every program and exam is unique, we think they offer a useful point of comparison for those who are considering a move to the Certiverse platform. The real beauty of the system is the speed with which these editing and approval rounds can happen. By automating repetitive training and moderating tasks, Certiverse allows your SMEs to focus on providing more meaningful input to drive high-quality exam content more quickly.