Journal of Family and Community Medicine

MEDICAL EDUCATION
Year : 2005  |  Volume : 12  |  Issue : 2  |  Page : 101--105

An audit of assessment tools in a medical school in eastern Saudi Arabia


Abdullah M Al-Rubaish1, Khalid U Al-Umran2, Lade Wosornu3 
1 Department of Internal Medicine, College of Medicine, King Faisal University, Dammam, Saudi Arabia
2 Department of Pediatrics, College of Medicine, King Faisal University, Dammam, Saudi Arabia
3 Department of Surgery, College of Medicine, King Faisal University, Dammam, Saudi Arabia

Correspondence Address:
Khalid U Al-Umran
P.O. Box 40140, Al-Khobar 31952
Saudi Arabia

Background : Assessment has a powerful influence on curriculum delivery. Medical instructors must use tools which conform to educational principles, and audit them as part of curriculum review. Aim : To generate information to support recommendations for improving curriculum delivery. Setting : Pre-clinical and clinical departments in a College of Medicine, Saudi Arabia. Method : A self-administered questionnaire was used in a cross-sectional survey to determine whether the assessment tools in use met basic standards of validity, reliability and currency, and whether feedback to students was adequate. Cost, feasibility and tool combinations were excluded. Results : Thirty-one (out of 34) courses were evaluated. All 31 respondents used MCQs, especially one-best (28/31) and true/false (13/31). Test questions were mostly selected by groups of teachers. Pre-clinical departments sourced equally from "new" (10/14) and "used" (10/14) MCQs; clinical departments relied on "banked" MCQs (16/17). Departments decided pass marks (28/31) and chose the College-set 60%; the timing was pre-examination in 13/17 clinical departments but post-examination in 5/14 pre-clinical departments. Of six essay users, five used model answers but only one did double marking. OSCE was used by 7/17 clinical departments; five provided checklists. Only 3/31 used an optical reader. Post-marking review was done by 13/14 pre-clinical but only 10/17 clinical departments. Difficulty and discrimination indices were determined by only 4/31 departments. Feedback was provided by 12/14 pre-clinical and 7/17 clinical departments. Only 10/31 course coordinators had copies of the examination regulations. Recommendations: The MCQ with a single best answer, if properly constructed and adequately critiqued, is the preferred tool for assessing the theory domain. However, there should be fresh questions, item analyses, comparisons with previous results, optical reader systems and double marking. Departments should use the OSCE or OSPE more often. 
Long essays, true/false, fill-in-the-blank and more-than-one-correct-answer formats can be safely abolished. Departments or teams should set test papers and take decisions collectively. Feedback rates should be improved. A Center of Medical Education, including an Examination Center, is required. Fruitful future studies could include a repeat audit, the use of "negative questions" and the number of MCQs per test paper. A comparative audit involving other regional medical schools may be of general interest.


How to cite this article:
Al-Rubaish AM, Al-Umran KU, Wosornu L. An audit of assessment tools in a medical school in eastern Saudi Arabia.J Fam Community Med 2005;12:101-105


How to cite this URL:
Al-Rubaish AM, Al-Umran KU, Wosornu L. An audit of assessment tools in a medical school in eastern Saudi Arabia. J Fam Community Med [serial online] 2005 [cited 2021 May 16];12:101-105
Available from: https://www.jfcmonline.com/article.asp?issn=2230-8229;year=2005;volume=12;issue=2;spage=101;epage=105;aulast=Al-Rubaish;type=0