Journal of Family & Community Medicine
 


 
MEDICAL EDUCATION
Year: 2005 | Volume: 12 | Issue: 2 | Page: 101-105

An audit of assessment tools in a medical school in eastern Saudi Arabia


1 Department of Internal Medicine, College of Medicine, King Faisal University, Dammam, Saudi Arabia
2 Department of Pediatrics, College of Medicine, King Faisal University, Dammam, Saudi Arabia
3 Department of Surgery, College of Medicine, King Faisal University, Dammam, Saudi Arabia

Correspondence Address:
Khalid U Al-Umran
P.O. Box 40140, Al-Khobar 31952
Saudi Arabia

Source of Support: None, Conflict of Interest: None


PMID: 23012084


Background: Assessment has a powerful influence on curriculum delivery. Medical instructors must use tools that conform to educational principles and audit them as part of curriculum review.

Aim: To generate information to support recommendations for improving curriculum delivery.

Setting: Pre-clinical and clinical departments in a College of Medicine, Saudi Arabia.

Method: A self-administered questionnaire was used in a cross-sectional survey to determine whether the assessment tools in use met basic standards of validity, reliability and currency, and whether feedback to students was adequate. Cost, feasibility and tool combinations were excluded.

Results: Thirty-one of 34 courses were evaluated. All 31 respondents used MCQs, especially one-best-answer (28/31) and true/false (13/31) formats. Test questions were mostly selected by groups of teachers. Pre-clinical departments drew equally on "new" (10/14) and "used" (10/14) MCQs; clinical departments relied on "banked" MCQs (16/17). Most departments decided pass marks themselves (28/31) and chose the College-set 60%; the decision was made before the examination in 13/17 clinical departments but after it in 5/14 pre-clinical departments. Of the six essay users, five used model answers but only one practised double marking. The OSCE was used by 7/17 clinical departments; five provided checklists. Only 3/31 used an optical reader. Post-marking review was done by 13/14 pre-clinical but only 10/17 clinical departments. Difficulty and discrimination indices were determined by only 4/31 departments. Feedback was provided by 12/14 pre-clinical and 7/17 clinical departments. Only 10/31 course coordinators had copies of the examination regulations.

Recommendations: The single-best-answer MCQ, if properly constructed and adequately critiqued, is the preferred tool for assessing the theory domain. However, there should be fresh questions, item analyses, comparisons with previous results, optical reader systems and double marking. Departments should use the OSCE or OSPE more often. Long essays, true/false, fill-in-the-blank and more-than-one-correct-answer formats can safely be abolished. Departments or teams should set test papers and take decisions collectively. Feedback rates should be improved. A Center of Medical Education, including an Examination Center, is required. Fruitful future studies could include a repeat audit, the use of "negative questions", and the number of MCQs per test paper. A comparative audit involving other regional medical schools may be of general interest.
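Note on item analysis (illustrative, not from the article): the difficulty index of an MCQ item is conventionally the proportion of examinees who answer it correctly, and the discrimination index is the difference in that proportion between high and low scorers, often taken as the top and bottom 27% by total score. The minimal Python sketch below assumes a 0/1 score matrix (rows are examinees, columns are items); the function name, the 27% split and the sample data are choices made for this example only.

    from typing import List

    def item_analysis(scores: List[List[int]], group_fraction: float = 0.27):
        """Return (difficulty, discrimination) lists, one value per item.

        difficulty     = proportion of all examinees answering correctly
        discrimination = P(correct | upper group) - P(correct | lower group),
                         with groups of size group_fraction by total score.
        (Illustrative convention; not taken from the audited college.)
        """
        n_examinees = len(scores)
        n_items = len(scores[0])
        # Rank examinees by total score, highest first, and split off
        # the upper and lower groups.
        ranked = sorted(scores, key=sum, reverse=True)
        k = max(1, round(group_fraction * n_examinees))
        upper, lower = ranked[:k], ranked[-k:]

        difficulty, discrimination = [], []
        for j in range(n_items):
            p_all = sum(row[j] for row in scores) / n_examinees
            p_up = sum(row[j] for row in upper) / k
            p_low = sum(row[j] for row in lower) / k
            difficulty.append(round(p_all, 2))
            discrimination.append(round(p_up - p_low, 2))
        return difficulty, discrimination

    # Example: 5 examinees, 3 MCQ items (1 = correct, 0 = incorrect).
    scores = [
        [1, 1, 1],
        [1, 1, 0],
        [1, 0, 1],
        [0, 1, 0],
        [0, 0, 0],
    ]
    diff, disc = item_analysis(scores)
    print(diff)  # [0.6, 0.6, 0.4]
    print(disc)  # [1.0, 1.0, 1.0]

Items with extreme difficulty values, or with low or negative discrimination, are the ones such an analysis would flag for revision before the questions are re-banked.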

