Assessment Research Group

What we do

Our mission is to facilitate the delivery of high-quality, meaningful assessments that balance statistical measures of quality with broader evidence of validity. Our approach has been recognised with an ASPIRE award for assessment excellence at Leeds, the first such award in Europe.

Underpinning our success is an active programme of scholarship and research within the Leeds Institute of Medical Education. The Assessment Research Group (ARG) has a national and international reputation, publishing widely and leading workshops and research presentations at medical education conferences.

Our areas of expertise include performance testing (OSCEs and workplace-based assessment), knowledge testing and the assessment of professionalism, as well as broader issues in assuring assessment quality (psychometric analysis, remediation and re-design of assessment). The group supervises a number of student research projects relating to assessment and quality improvement, and welcomes PhD applicants in these areas. We also have expertise in the design, development and evaluation of behavioural interventions (such as nudges) and the application of psychological principles to assessment and feedback design. We work with international partners in this field and have received several funding grants associated with this work.

Our members hold a range of advisory assessment roles, and we provide assessment consultancy across the world. Our scholarship has had a significant impact on student education at Leeds and beyond through the introduction of novel assessment and feedback designs, including sequential testing and workplace assessment apps, which improve assessment decision-making and deliver more personalised assessment feedback to students.

Key publications


Homer, M., Fuller, R., Hallam, J. and Pell, G. 2020. Shining a spotlight on scoring in the OSCE: checklists and item weighting. Medical Teacher.

Homer, M., Fuller, R., Hallam, J. and Pell, G. 2020. Setting defensible standards in small cohort OSCEs: Understanding better when borderline regression can ‘work’. Medical Teacher. 42(3), pp.306–315.

Homer, M., Fuller, R. and Pell, G. 2018. The benefits of sequential testing: Improved diagnostic accuracy and better outcomes for failing students. Medical Teacher. 40(3), pp.275–284.

Homer, M., Pell, G. and Fuller, R. 2017. Problematizing the concept of the “borderline” group in performance assessments. Medical Teacher. 39(5), pp.469–475.

Williams, D.V.H., Reid, A.M. and Homer, M. 2017. Boosting clinical performance: The impact of enhanced final year placements. Medical Teacher. 0(0), pp.1–6.

Fuller, R., Homer, M.S., Pell, G. and Hallam, J. 2017. Managing extremes of assessor judgement within the OSCE. Medical Teacher. 39(1), pp.58–66.

Homer, M. and Darling, J.C. 2016. Setting standards in knowledge assessments: Comparing Ebel and Cohen via Rasch. Medical Teacher. 38(12), pp.1267–1277.

Homer, M., Pell, G., Fuller, R. and Patterson, J. 2016. Quantifying error in OSCE standard setting for varying cohort sizes: A resampling approach to measuring assessment quality. Medical Teacher. 38(2), pp.181–188.

Pell, G., Fuller, R., Homer, M. and Roberts, T. 2013. Advancing the objective structured clinical examination: sequential testing in theory and practice. Medical Education. 47(6), pp.569–577.




Contact us

Dr Matt Homer, Leeds Institute of Medical Education

10.30 Worsley Building

Clarendon Way, Leeds, LS2 9NL

Telephone: +44 (0)113 343 4654


Twitter: @LeedsARG