Abstract:
INTRODUCTION : Chart review is central to understanding adverse events (AEs) in medicine. In this article, we describe the process and results of educating chart reviewers assigned to evaluate dental AEs.
METHODS : We developed a Web-based training program, “Dental Patient Safety Training,” which uses both independent and consensus-based curricula, for identifying AEs recorded in electronic health records in the dental setting. Training included (1) didactic education, (2) skills training using videos and guided walkthroughs, (3) quizzes with feedback, and (4) hands-on learning exercises. In addition, novice reviewers were coached weekly during consensus review discussions. TeamExpert was composed of 2 experienced reviewers, and TeamNovice included 2 chart reviewers in training. The McNemar test, interrater reliability, sensitivity, specificity, positive predictive value, and negative predictive value were calculated to compare the 2 teams' accuracy in identifying charts containing AEs at the start of training and again 7 months after consensus-building discussions began.
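The agreement statistics named above can be computed directly from a 2×2 table comparing TeamNovice's chart-level AE calls against TeamExpert's reference review. The sketch below is illustrative only: the cell counts are hypothetical (the abstract does not report the 2×2 table), and PABAK is computed via its standard form, 2 × (percent agreement) − 1.

```python
def agreement_stats(tp, fp, fn, tn):
    """Diagnostic and agreement measures from 2x2 counts, treating
    TeamExpert's review as the reference standard.
    tp: both teams flag an AE; tn: both teams clear the chart;
    fp/fn: TeamNovice disagrees with TeamExpert."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)     # AE charts correctly flagged
    specificity = tn / (tn + fp)     # non-AE charts correctly cleared
    ppv = tp / (tp + fp)             # flagged charts truly containing AEs
    npv = tn / (tn + fn)             # cleared charts truly AE-free
    po = (tp + tn) / n               # overall percent agreement
    pabak = 2 * po - 1               # prevalence- and bias-adjusted kappa
    # McNemar statistic on the discordant cells (continuity-corrected)
    mcnemar = (abs(fp - fn) - 1) ** 2 / (fp + fn) if (fp + fn) else 0.0
    return {"sensitivity": sensitivity, "specificity": specificity,
            "ppv": ppv, "npv": npv, "percent_agreement": po,
            "pabak": pabak, "mcnemar_chi2": mcnemar}

# Hypothetical counts for a 51-chart review set (not the study's data)
stats = agreement_stats(tp=20, fp=5, fn=4, tn=22)
```

PABAK is used here rather than Cohen's κ because, when most charts contain no AE, the prevalence imbalance can depress κ even at high raw agreement.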
RESULTS : TeamNovice completed independent and consensus development training. Initial chart reviews were conducted on a shared set of charts (n = 51), followed by additional training that included consensus-building discussions. From the phase I training program to phase II consensus building, there was marked improvement between the 2 teams in overall percent agreement, prevalence- and bias-adjusted κ, and diagnostic measures (sensitivity, specificity, positive predictive value, and negative predictive value) on the reviewed charts.
CONCLUSIONS : This study detailed the process of training new chart reviewers and evaluating their performance. Our results suggest that standardized training and continuous coaching improve calibration between experts and trained chart reviewers.