E-learning Provides Consensus Among Pediatric Urologists In Assessing Residents Doing Surgery
Max Maizels, MD1, Walid A. Farhat, MD2, Edwin Smith, MD3, Dennis Liu, MD1, Linda C. Lee, MD2, Nicolas Fernandez, MD, PhD2, Yasin Bhanji, BA1.
1Lurie Children's Hospital, Chicago, IL, USA, 2Hospital for Sick Kids, Toronto, ON, Canada, 3Children's Healthcare of Atlanta, Atlanta, GA, USA.
Residency training programs expect Pediatric Urology attendings to train residents to perform surgery. Best training practice requires establishing that attendings agree in their assessments of residents at surgery, yet attendings have no tool that informs them how to make such assessments. The authors therefore collaborated to bridge this gap using e-learning. First, we developed an online tool to teach “how to assess residents doing surgery.” Then, we developed an online tool to test whether attendings who have e-learned agree (i.e., reach consensus) in their assessments of residents at surgery.
Hypothesis. Attendings who e-learn “how to assess residents doing surgery” will show strong consensus with one another.
Materials. First, the assessment method, developed over three iterations, incorporates the psychomotor (i.e., surgical skill) and cognitive (i.e., understanding of surgery) learning domains. Specifically, attendings make Likert ratings (1-4) of resident skill by noting the extent to which the resident needed help at surgery (i.e., Zwisch scale: show & tell, active help, passive help, or supervise only) and of resident understanding by noting the ability to name key surgical features (e.g., surgery plan, tissues handled). Next, the index cases were chosen as the inguinal approach to repair of hernia, hydrocele, and undescended testis. Last, tools for e-learning and testing were built as mixed media.
Study Design. The study was a non-randomized, prospective test, as follows. Pediatric Urology attendings were enrolled as study subjects; they did not communicate with one another. After e-learning the assessment method, they were tested on their assessments of residents performing the index surgeries as shown in 20 video clips. Data were analyzed for consensus in assessments between attendings (i.e., inter-rater reliability).
Results. Eleven Pediatric Urologists enrolled. All completed the e-learning phase in less than 1 hour and agreed or strongly agreed that the e-learning was effective. The testing phase was completed over the next 8 weeks and yielded the following data. Of 89 ratings of skill, 56 (65%) matched exactly and 89 (100%) matched within one level above or below. Similarly, of 264 ratings of understanding, 169 (64%) matched exactly and 253 (96%) matched within one level above or below. In aggregate, these ratings showed exact matches in 64% and matches within one level in 97%. Analysis of inter-rater reliability showed strong consensus among attendings in ratings of both skill (ICC = 0.71, CI 0.46-0.95, p = 0.03) and understanding (ICC = 0.86, CI 0.67-0.96, p < 0.001).
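For readers unfamiliar with the intraclass correlation coefficient (ICC) used above, the following is a minimal sketch of how an ICC for absolute agreement under a two-way random-effects model (ICC(2,1)) can be computed from a ratings matrix. The abstract does not specify which ICC model or software was used, so this is an illustrative assumption; the `demo` data below are hypothetical Likert ratings, not the study's data.

```python
import numpy as np

def icc_2way_random(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: 2-D array-like, rows = subjects (e.g., video clips),
             columns = raters (e.g., attendings).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()    # between-subject
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()    # between-rater
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical Likert (1-4) ratings: 5 clips rated by 3 attendings.
demo = [[1, 1, 2],
        [2, 2, 2],
        [3, 3, 4],
        [4, 4, 4],
        [2, 3, 2]]
print(round(icc_2way_random(demo), 2))
```

An ICC near 1 indicates raters assign nearly identical scores; values above roughly 0.7 are conventionally read as good agreement, consistent with the "strong consensus" interpretation in the results.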
Conclusions. We have demonstrated that attendings who e-learn “how to assess residents doing surgery” show strong consensus with one another. We expect e-learning will provide the basis for generalizing the teaching of how to assess residents doing surgery. We plan to expand use of these tools to the real-time operating room, so as to enable attendings to provide feedback/remediation and residents to track their progress during training.
2016 Fall Congress