The mission to deliver an insightful, helpful score report
While change is inevitable, progress should never be optional.
It is in that spirit that the National Registry of Emergency Medical Technicians (National Registry) constantly looks for ways to make our examinations the best they can be for the EMS community. Currently, as part of our comprehensive Advanced Life Support redesign, the Expert Panel is reviewing the usefulness and impact of the domain-level feedback provided to candidates, Program Directors and educators, and state EMS agencies.
The National Registry has historically provided domain-level scoring feedback: for each content domain assessed on the examination (e.g., cardio, ARV), a score is estimated that tells candidates how they performed in that specific domain, or how many items/questions they answered correctly in that content domain.
While some may find domain-level feedback helpful, it is important to know that the National Registry uses a compensatory scoring model for all cognitive examinations. A compensatory scoring model uses a total score for the examination rather than a score for each specific domain: lower performance in one content domain can be balanced by higher performance in another content domain on the same test.
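The compensatory idea can be illustrated with a minimal sketch. The domain names, point values, and passing threshold below are hypothetical, chosen only to show how a total-score decision differs from domain-by-domain cutoffs; this is not the National Registry's actual algorithm.

```python
def compensatory_pass(domain_scores, total_needed):
    """Pass/fail depends only on the total score across all domains,
    so a weak domain can be offset by a strong one."""
    return sum(domain_scores.values()) >= total_needed

# Two hypothetical candidates with the same total score of 75:
balanced = {"cardiology": 25, "airway": 25, "trauma": 25}
uneven   = {"cardiology": 15, "airway": 30, "trauma": 30}

print(compensatory_pass(balanced, 70))  # True
print(compensatory_pass(uneven, 70))    # True: weak cardiology is offset
```

Under a non-compensatory model, the second candidate might fail a per-domain cutoff in cardiology despite an identical total; under the compensatory model both outcomes are the same.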
The National Registry’s compensatory scoring model is not designed to provide domain-level scoring. A candidate’s result on an examination is determined by their overall performance on the test, not by their performance in any individual area. While the overall examination is designed to meet high standards of reliability and validity, the information gathered at the individual domain level is not sufficient to provide conclusive and useful results.
In other words, using domain-level feedback as a study guide may not help a candidate pass the examination. Likewise, an EMS program’s success should be judged by the overall examination outcomes of its candidates, not by their individual domain scores.
The question before the ALS Redesign Expert Panel is this: what will the impact on the EMS community be if the National Registry stops reporting domain-level feedback?
Alternatives, including Scaled Scoring, are being discussed. That option could provide more meaningful feedback for candidates and programs. Additionally, more dynamic messages could be provided to candidates who fail the examination, potentially including explanations of the score and suggestions on next steps tailored to each candidate.
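Scaled scoring is commonly implemented as a mapping from a raw or estimated score onto a fixed reporting scale. The linear mapping and the scale endpoints below are illustrative assumptions for a sketch, not the National Registry's actual scale or scoring function.

```python
def scale_score(raw, raw_min, raw_max, scale_min=200, scale_max=800):
    """Linearly map a raw score onto a fixed reporting scale.
    The 200-800 range here is a hypothetical example."""
    frac = (raw - raw_min) / (raw_max - raw_min)
    return round(scale_min + frac * (scale_max - scale_min))

print(scale_score(75, 0, 100))   # 650
print(scale_score(0, 0, 100))    # 200 (scale floor)
print(scale_score(100, 0, 100))  # 800 (scale ceiling)
```

One advantage of reporting on a fixed scale is that scores remain comparable across different test forms, since each form's raw range is mapped onto the same reporting interval.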
The consensus of the panel, regardless of the direction taken, is to focus on delivering a score report that provides the most accurate, helpful, and consistent feedback for candidates and educators alike.