OBJECTIVE: To examine the relationship between medical school applicants' performance in the Graduate Australian Medical School Admissions Test (GAMSAT) and structured interviews and their subsequent performance in medical school. DESIGN: Students in Years 2-4 of two graduate-entry medical programs were invited to complete two previously validated tests of clinical reasoning. These results and their Year 2 examination results were compared with their previous performance in GAMSAT and at interview. SETTING: The graduate-entry programs at the Universities of Queensland and Sydney. PARTICIPANTS: 189 student volunteers (13.6% response rate). MAIN OUTCOME MEASURES: Students' test results on a set of Clinical Reasoning Problems (CRPs) and a Diagnostic Thinking Inventory (DTI), and their Year 2 examination results. RESULTS: There was no association between performance in GAMSAT and performance in the CRPs; there was a weak negative correlation between performance in GAMSAT and the DTI (-0.05 > r > -0.31, P = 0.03). The correlation between GAMSAT and examination results was weak (r < 0.24, P = 0.02). The correlation between GAMSAT and interview scores was weakly negative for the University of Queensland (r = -0.34, P < 0.01) and weakly positive for the University of Sydney (r = 0.11), with a combined significance level of P < 0.01. CONCLUSIONS: We did not find evidence that GAMSAT and structured interviews are good predictors of performance in medical school. Our study highlights a need for more rigorous evaluation of Australian medical school admissions tests.
Authors: Jacqueline E McLaughlin; Julia Khanova; Kelly Scolaro; Philip T Rodgers; Wendy C Cox Journal: Am J Pharm Educ Date: 2015-08-25 Impact factor: 2.047