Direct Observation vs. Video-Based Assessment in Flexible Cystoscopy

Research output: Contribution to journal › Journal article › Research › peer-review

Objective: Direct observation in the assessment of clinical skills is prone to bias, requires the observer to be present at a specific location and time, and is time-consuming. Video-based assessment could remove the risk of bias, increase flexibility, and reduce the time spent on assessment. This study investigated whether video-based assessment is a reliable tool for cystoscopy and whether direct observers are prone to bias compared with video-raters. Design: This study was a blinded observational trial. Twenty medical students and 9 urologists were recorded during 2 cystoscopies and rated by a direct observer and subsequently by 2 blinded video-raters on a global rating scale (GRS) for cystoscopy. Both intrarater and interrater reliability were explored. Furthermore, direct-observer bias was explored with a paired-samples t-test. Results: Intrarater reliability, calculated by Pearson's r, was 0.86. Interrater reliability was 0.74 for single measures and 0.85 for average measures. A hawk-dove effect was seen between the 2 raters. Direct-observer bias was detected when comparing direct-observer scores with the assessment by an independent video-rater (p < 0.001). Conclusion: This study found that video-based assessment with 2 video-raters was a reliable tool for cystoscopy. There was a significant bias when comparing direct observation with blinded video-based assessment.
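
The sketch below is not the authors' analysis code; it only illustrates the kinds of statistics the abstract reports (Pearson's r for intrarater reliability, single- and average-measures reliability for 2 video-raters, and a paired-samples t-test for direct-observer bias). All scores are hypothetical placeholders, and since the abstract does not name the exact reliability model, a two-way random-effects ICC is assumed for the single/average measures.

```python
# Minimal sketch of the reliability and bias analyses described in the abstract.
# Scores are hypothetical; the ICC model (two-way random effects, absolute
# agreement) is an assumption, as the abstract does not specify it.
import numpy as np
from scipy import stats

# Hypothetical GRS scores for 5 recorded cystoscopies.
rater1_first  = np.array([12, 20, 31, 25, 18], dtype=float)  # video-rater 1, first pass
rater1_repeat = np.array([13, 21, 30, 26, 17], dtype=float)  # video-rater 1, repeat scoring
rater2        = np.array([14, 23, 29, 28, 21], dtype=float)  # video-rater 2
direct        = np.array([16, 25, 34, 30, 22], dtype=float)  # direct observer

def icc_two_way(Y):
    """ICC(2,1) and ICC(2,k): two-way random-effects, absolute agreement."""
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)
    ss_err = ((Y - grand) ** 2).sum() - (n - 1) * ms_rows - (k - 1) * ms_cols
    ms_err = ss_err / ((n - 1) * (k - 1))
    single = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
    average = (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)
    return single, average

# Intrarater reliability: the same rater scoring the same videos twice (Pearson's r).
intra_r, _ = stats.pearsonr(rater1_first, rater1_repeat)

# Interrater reliability between the 2 video-raters (single / average measures).
single, average = icc_two_way(np.column_stack([rater1_first, rater2]))

# Direct-observer bias: paired comparison of direct-observer scores with an
# independent video-rater's scores for the same procedures.
t, p = stats.ttest_rel(direct, rater1_first)

print(f"intrarater r={intra_r:.2f}, ICC single={single:.2f}, average={average:.2f}")
print(f"paired t={t:.2f}, p={p:.3f}")
```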

Original language: English
Journal: Journal of Surgical Education
Volume: 75
Issue number: 3
Pages (from-to): 671-677
Number of pages: 7
ISSN: 1931-7204
DOIs
Publication status: Published - 2018

    Research areas

  • cystoscopy, interrater variability, rater-based assessment, surgical education, video recording
