RadBench: Benchmarking image interpretation skills

Journal article


Wright, C and Reeves, P (2016). RadBench: Benchmarking image interpretation skills. Radiography. 22 (2), pp. 131 - 136 (6).
Authors: Wright, C and Reeves, P
Abstract

Purpose: The key aim of this research was to develop an objective, accurate assessment tool with which to provide regular measurement and monitoring of image interpretation performance. The tool was a specially developed software program (RadBench) designed to measure image interpretation performance objectively, en masse, and to identify development needs. Method: Two test banks were generated (Test 1 and Test 2), each containing twenty appendicular musculoskeletal images, half of which were normal and half of which contained fractures. All images were double reported by radiologists and anonymised. A study (n = 42) was carried out within one calendar month to test the method and analysis approach. The participants included general radiographers (34), reporting radiographers (3), radiologists (2) (all from one UK NHS Trust) and medical imaging academics (3). Results: The RadBench software generated calculations of sensitivity, specificity and accuracy, in addition to a decision-making map for each respondent. Early findings highlighted a 5% mean difference between image banks, confirming that benchmarking must be related to a specific test. The benchmarking option within the software enabled users to compare their score with the highest, lowest and mean scores of others who had taken the same test. Reporting radiographers and radiologists all scored 95% or above for accuracy in both tests. The general radiographer population scored between 60% and 95%. Conclusions: The evidence from this research indicates that the RadBench tool is capable of providing benchmark measures of image interpretation accuracy, with the potential for comparison across populations.
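For context, the three metrics the abstract says RadBench calculates have standard definitions based on a 2x2 confusion matrix. The sketch below is purely illustrative (it is not the RadBench implementation, and the counts used are hypothetical), showing how sensitivity, specificity and accuracy would be derived for a twenty-image test of the kind described:

```python
# Illustrative only: standard confusion-matrix definitions of the
# metrics named in the abstract. Not the actual RadBench code.
def interpretation_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)                # fraction of fractures correctly flagged
    specificity = tn / (tn + fp)                # fraction of normals correctly cleared
    accuracy = (tp + tn) / (tp + fp + tn + fn)  # overall fraction correct
    return sensitivity, specificity, accuracy

# Hypothetical example: a 20-image test (10 abnormal, 10 normal) in which
# the reader scores 9 true positives and 8 true negatives.
sens, spec, acc = interpretation_metrics(tp=9, fp=2, tn=8, fn=1)
print(sens, spec, acc)  # 0.9 0.8 0.85
```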

Keywords: Abnormality detection; Accreditation; Accuracy; Audit; CPD; Development needs
Year: 2016
Journal: Radiography
Journal citation: 22 (2), pp. 131 - 136 (6)
Publisher: W. B. Saunders Co., Ltd.
ISSN: 1078-8174
Digital Object Identifier (DOI): 10.1016/j.radi.2015.12.010
Publication dates
Print: 20 May 2016
Publication process dates
Deposited: 14 Feb 2017
Accepted: 30 Dec 2015
Accepted author manuscript
License: CC BY 4.0
Permalink: https://openresearch.lsbu.ac.uk/item/87413


Related outputs

Degree Classification: Does the Calculation Model Affect the Award?
Sinclair, N, Wright, C, Edwards, G and Keane, P (2017). Degree Classification: Does the Calculation Model Affect the Award? UK Radiological Congress and Radiation Oncology Congress. Manchester 12 - 14 Jun 2017 London South Bank University.
Image Interpretation Performance of Diagnostic Radiographers in Singapore
Wright, C and Xiang, Y (2016). Image Interpretation Performance of Diagnostic Radiographers in Singapore. UK Radiological Congress. Liverpool 06 - 08 Jun 2016 London South Bank University.
Preliminary Clinical Evaluation: The What, Where, How Approach to Scoring
Wright, C and Akimoto, T (2016). Preliminary Clinical Evaluation: The What, Where, How Approach to Scoring. The UK Radiological Congress. Liverpool 06 - 08 Jun 2016 London South Bank University.
Traffic Light: An Alternative Approach to Abnormality Signalling
Wright, C and Higgins, S (2016). Traffic Light: An Alternative Approach to Abnormality Signalling. UK Radiological Congress. Liverpool 06 - 08 Jun 2016 London South Bank University.
Image interpretation performance: A longitudinal study from novice to professional
Wright, C and Reeves, P (2016). Image interpretation performance: A longitudinal study from novice to professional. Radiography. 23 (1), pp. e1-e7.