Improving assessment for healthcare training


A research collaboration between the School of Education and the Leeds Institute for Medical Education (Dr Matt Homer, Prof Richard Fuller, Godfrey Pell) has enhanced assessment practice, assessment design and quality-assurance approaches in medical schools and national healthcare examinations, both in the UK and internationally.

Their work has delivered a new model of assessment for the clinical, practical and communication skills of trainee healthcare practitioners, replacing current methods that are resource intensive, costly and prone to error. This approach, called sequential testing, has been found to be more effective than a single exam at assessing these skills, better able to differentiate between trainees who are ready to progress to the next stage of their programme and those who are underperforming, and more cost effective. A significant number of UK medical schools have changed their model of assessment to sequential testing.
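To make the two-stage logic concrete, the following is a minimal Python sketch of how a sequential-testing decision rule could operate, assuming a design in which every candidate sits a short screening sequence, only candidates below a screening pass mark sit a further sequence, and the final decision rests on the combined score. The candidate names, station counts and pass marks are illustrative assumptions, not figures from the Leeds assessments or any particular medical school examination.

    # Sketch of a two-stage (sequential) OSCE decision rule.
    # Assumptions (not from the source): every candidate sits a short screening
    # sequence; only candidates below the screening pass mark sit the second
    # sequence; the final decision uses the mean score across both sequences.
    # All names, scores and pass marks below are illustrative.
    from dataclasses import dataclass
    from statistics import mean
    from typing import List, Optional


    @dataclass
    class Candidate:
        name: str
        screen_scores: List[float]                    # station scores, screening sequence
        sequel_scores: Optional[List[float]] = None   # station scores, second sequence (if sat)


    def needs_second_sequence(candidate: Candidate, screen_pass_mark: float) -> bool:
        """Candidates at or above the screening pass mark stop after stage one."""
        return mean(candidate.screen_scores) < screen_pass_mark


    def final_decision(candidate: Candidate,
                       screen_pass_mark: float,
                       combined_pass_mark: float) -> str:
        """Pass on the screening sequence alone, or on the combined score for
        candidates who were required to sit the second sequence."""
        if not needs_second_sequence(candidate, screen_pass_mark):
            return "pass (screening sequence)"
        if candidate.sequel_scores is None:
            return "second sequence required"
        combined = candidate.screen_scores + candidate.sequel_scores
        return "pass (combined)" if mean(combined) >= combined_pass_mark else "fail (combined)"


    if __name__ == "__main__":
        strong = Candidate("A", screen_scores=[0.78, 0.81, 0.74])
        borderline = Candidate("B", screen_scores=[0.55, 0.60, 0.58],
                               sequel_scores=[0.66, 0.70, 0.63, 0.68])
        for c in (strong, borderline):
            print(c.name, final_decision(c, screen_pass_mark=0.65, combined_pass_mark=0.62))

The design point captured in this sketch is that strong candidates are tested only briefly, while the decision for weaker candidates is based on a longer, combined test, which is where the claimed gains in differentiation and cost come from.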

The Dean of Brighton and Sussex Medical School describes the benefits of sequential testing:

“to reduce unnecessary testing burden on students …to support students who do not reach the passing standard by providing them with a substantial period of remediation rather than the few weeks presently available …[and]…to improve the reliability of the OSCE examination.”

Work at Leeds is also informing the development of the new Medical Licensing Assessment regulated by the General Medical Council, and the assessment of overseas doctors wanting to practise in the NHS. Overall, Leeds research has improved the quality of a range of healthcare practitioners entering and progressing in the workforce, saved resources and, in the long term, will enhance patient safety.

Some of the underlying research is listed below:

  • Pell G, Fuller R, Homer M, Roberts T. 2012. Is short-term remediation after OSCE failure sustained? A retrospective analysis of the longitudinal attainment of underperforming students in OSCE assessments. Medical Teacher 34 (2):146–50. DOI.
  • Pell G, Fuller R, Homer M, Roberts T. 2010. How to measure the quality of the OSCE: A review of metrics - AMEE guide no. 49. Medical Teacher 32 (10):802–11. DOI.
  • Pell G, Fuller R, Homer M, Roberts T. 2013. Advancing the objective structured clinical examination: sequential testing in theory and practice. Medical Education 47 (6):569–77. DOI.
  • Fuller R, Homer M, Pell G, Hallam J. 2017. Managing extremes of assessor judgement within the OSCE. Medical Teacher 39 (1):58–66. DOI.
  • Homer M, Fuller R, Pell G. 2018. The benefits of sequential testing: Improved diagnostic accuracy and better outcomes for failing students. Medical Teacher 40 (3):275–84. DOI.
  • Homer M, Fuller R, Hallam J, Pell G. 2019. Setting defensible standards in small cohort OSCEs: Understanding better when borderline regression can ‘work’. Medical Teacher 0 (0):1–10. DOI.
