Setting standards and defining quality of performance in the validation of a standardized-patient examination

Curtis J. Rosebraugh, Alice J. Speer, David J. Solomon, Karen E. Szauter, Michael A. Ainsworth, Mark D. Holden, Steven A. Lieberman, Ernest B. Clyburn

Research output: Contribution to journal › Article › peer-review



Purpose. To evaluate whether written standards increase the reproducibility of a physician-facilitated station in an objective structured clinical examination (OSCE) designed to assess history, physical-examination, and communication skills.

Method. The OSCE at the University of Texas Medical Branch-Galveston consists of ten eight-minute stations. Six of these stations form three History, Physical-examination, Problem-solving, and Plan (HPPP) station pairs. Each existing clinical-problem HPPP station was given to two content experts, who developed standards for faculty rating scales appropriate for the evaluation of third-year medical students. Three pairs of faculty members were used to determine interrater reliability by scoring videotapes of the presentation and problem-solving components of three HPPP stations. The faculty pairs scored tapes of 15 students without using standards and tapes of 15 students using the standards developed. Differences between the reliabilities without and with the standards were tested for significance using Fisher's r-to-z transformation. The reproducibility and standard error of measurement (SEM) were extrapolated for increasing amounts of testing time. The HPPP component scores were also correlated with written-examination scores and preceptors' ratings. Data were obtained from the three HPPP stations used in the 1995-96 internal medicine clerkship standardized-patient examination.

Results. In all, 196 students completed the OSCE. The standards developed improved interrater reliability, reaching statistical significance (p < .01) for one HPPP station. Reproducibility for the presentation and problem-solving components of the HPPP stations exceeded .80 after five hours of testing. The problem-solving component correlated at .37 with written examinations and at .19 with ward grades.

Conclusion. The data from this study suggest that standards increase the reproducibility of the presentation and problem-solving components of an OSCE to a level as high as, or higher than, that associated with the history, physical-examination, and communication components of traditional standardized-patient examinations.

Original language: English (US)
Pages (from-to): 1012-1014
Number of pages: 3
Journal: Academic Medicine
Issue number: 11
State: Published - Nov 1997

ASJC Scopus subject areas

  • Education


