

Objective Structured Clinical Examinations (OSCEs) are resource intensive, impractical as teaching tools, and their reliability depends on the evaluators. Computer-based case simulations (“virtual patients”, VPs) have been advocated as useful and reliable tools for teaching clinical skills and evaluating competence. We have developed an internet-based VP system designed for both practice and assessment of medical students. The system uses interactive dialogue with natural language processing and supports history taking, evaluation of physical examination (including recognition of visual findings and heart and lung sounds), and ordering of laboratory and imaging tests. The system includes a practice modality that provides feedback, and a computerized OSCE. The reliability of our system was assessed over the last three years by comparing the clinical competence of medical students on comparable VP-based and conventional (human) OSCEs. A total of 262 students were evaluated with both exam modalities. The correlation between the scores on the two exams was highly significant (p < 0.001). Cronbach’s alpha for the computerized exam ranged from 0.82 to 0.89 over the three years and was substantially higher than that of the conventional OSCE in each year. We conclude that a computerized VP OSCE is a reliable examination tool, with the added advantage of also providing a training modality.
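
For readers less familiar with the reliability statistics reported above, the sketch below shows how Cronbach's alpha and the between-exam correlation might be computed from station-level score matrices. This is an illustrative example only: the simulated data, the number of stations, and all variable names are assumptions and do not reflect the actual study data.

import numpy as np
from scipy import stats

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_students, n_stations) score matrix."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)      # per-station variance
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 262 students x 10 stations for each exam modality
rng = np.random.default_rng(0)
vp_scores = rng.normal(70, 10, size=(262, 10))
osce_scores = 0.6 * vp_scores + rng.normal(28, 8, size=(262, 10))

# Internal consistency of the computerized (VP) exam
alpha_vp = cronbach_alpha(vp_scores)

# Pearson correlation between total scores on the two exam modalities
r, p = stats.pearsonr(vp_scores.sum(axis=1), osce_scores.sum(axis=1))
print(f"Cronbach's alpha (VP exam): {alpha_vp:.2f}")
print(f"Pearson r = {r:.2f}, p = {p:.3g}")

In practice the input matrices would hold each student's score at each station of the VP exam and the conventional OSCE, and the same computation would be repeated for each year's cohort.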