AI Assisted Detection of Fractures on X-Rays (FRACT-AI)
NCT ID: NCT06130397
Last Updated: 2025-11-24
Study Results
The study team has not yet published outcome measures, participant flow, or safety data for this trial. Check back later for updates.
Basic Information
Get a concise snapshot of the trial, including recruitment status, study phase, enrollment targets, and key timeline milestones.
Status: COMPLETED
Enrollment: 21 participants
Study Type: OBSERVATIONAL
Start Date: 2024-02-08
Completion Date: 2025-06-01
Brief Summary
Review the sponsor-provided synopsis that highlights what the study is about and why it is being conducted.
Related Clinical Trials
Explore similar clinical trials based on study characteristics and research focus.
Boneview-ED - Impact of Artificial Intelligence Detecting Fractures in the Emergency Department: a Pragmatic Prospective Study
NCT06013852
AI Performance for the Detection of Bone Fractures in Children
NCT05538403
Enhancing Diagnostic Accuracy in Fracture Identification on Musculoskeletal Radiographs Using Deep Learning
NCT06644391
Optimization of the Diagnosis of Bone Fractures in Patients Treated in the Emergency Department by Using Artificial Intelligence for Reading Radiological Images in Comparison With Traditional Reading by the Emergency Doctor.
NCT06051682
Assessing AI-Supported Fracture Detection in Emergency Care Units
NCT06754137
Detailed Description
Dive into the extended narrative that explains the scientific background, objectives, and procedures in greater depth.
Conditions
See the medical conditions and disease areas that this research is targeting or investigating.
Study Design
Understand how the trial is structured, including allocation methods, masking strategies, primary purpose, and other design elements.
Observational Model: COHORT
Time Perspective: RETROSPECTIVE
Study Groups
Review each arm or cohort in the study, along with the interventions and objectives associated with them.
Readers/participants
Reader Selection: 18 readers will be selected from the following six clinical specialty groups (3 readers each):
* Emergency Medicine
* Trauma and Orthopaedic Surgery
* Emergency Nurse Practitioners
* Physiotherapy
* General Radiology
* Radiographers
And from the following levels of seniority/experience:
* Consultant/Senior/Equivalent - >10 years' experience
* Middle Grade/Registrar/Equivalent - 5-10 years' experience
* Junior Grade/Senior House Officer/Equivalent - <5 years' experience
Each specialty reader group will include one reader at each level of experience; the resulting 6 x 3 panel is sketched below.
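For illustration only, a minimal sketch (hypothetical Python, not part of the study protocol) of how the six specialty groups and three seniority levels combine into the 18-reader panel:

```python
from itertools import product

# Six clinical specialty groups and three seniority levels, as listed above.
SPECIALTIES = [
    "Emergency Medicine",
    "Trauma and Orthopaedic Surgery",
    "Emergency Nurse Practitioners",
    "Physiotherapy",
    "General Radiology",
    "Radiographers",
]
SENIORITY = [
    "Consultant/Senior/Equivalent",
    "Middle Grade/Registrar/Equivalent",
    "Junior Grade/Senior House Officer/Equivalent",
]

# One reader is recruited for every (specialty, seniority) combination,
# giving the 6 x 3 = 18 readers described in the protocol.
reader_panel = [
    {"specialty": s, "seniority": g} for s, g in product(SPECIALTIES, SENIORITY)
]
assert len(reader_panel) == 18
```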
Readers will be recruited from across the five NHS organisations that comprise the Thames Valley Emergency Medicine Research Network (www.TaVERNresearch.org):
* Oxford University Hospitals NHS Foundation Trust
* Royal Berkshire NHS Foundation Trust
* Buckinghamshire Healthcare NHS Trust
* Frimley Health NHS Foundation Trust
* Milton Keynes University Hospital NHS Foundation Trust
Cases reading
The reading will be done remotely via the Report and Image Quality Control site (www.RAIQC.com), an online platform for viewing and reporting medical images. Participants can work from any location, but the work must be done on a computer with internet access; for the avoidance of doubt, it cannot be performed on a phone or tablet.
The project is divided into two phases, and participants are required to complete both. The estimated total involvement in the project is 20-24 hours.
Phase 1 (time allowed: 2 weeks): Participants review 500 X-rays and record a clinical opinion through a structured reporting template (multiple choice, no free text required).
Rest/washout period (4 weeks): included to mitigate recall bias.
Phase 2 (time allowed: 2 weeks): Participants review 500 X-rays, this time together with an AI report for each case, and record their clinical opinion through the same structured reporting template used in Phase 1.
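A minimal sketch of this two-phase reading schedule (hypothetical Python; the field names are assumptions, not the study's data dictionary):

```python
from dataclasses import dataclass

@dataclass
class ReadingPhase:
    name: str
    duration_weeks: int
    n_xrays: int
    ai_report_shown: bool

# Phase 1 (unaided), a 4-week washout to mitigate recall bias, then
# Phase 2 with an AI report shown alongside each case.
SCHEDULE = [
    ReadingPhase("Phase 1: unaided read", duration_weeks=2, n_xrays=500, ai_report_shown=False),
    ReadingPhase("Washout", duration_weeks=4, n_xrays=0, ai_report_shown=False),
    ReadingPhase("Phase 2: AI-assisted read", duration_weeks=2, n_xrays=500, ai_report_shown=True),
]
```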
Ground truthers
Two consultant musculoskeletal radiologists. A third senior musculoskeletal radiologist (>20 years' experience) will arbitrate where they disagree.
Ground truthing
Two consultant musculoskeletal radiologists will independently review the images to establish the 'ground truth' findings on the X-rays; where consensus is reached, this will be used as the reference standard. In cases of disagreement, a third senior musculoskeletal radiologist (>20 years' experience) will arbitrate. The ground truthers will assign each abnormality a difficulty score on a 4-point Likert scale (1 = easy/obvious to 4 = hard/poorly visualised).
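A minimal sketch of this consensus-and-arbitration rule (hypothetical Python; the function and variable names are illustrative, not taken from the protocol):

```python
def reference_standard(consultant_a: str, consultant_b: str, senior_arbitrator: str) -> str:
    """Return the reference-standard finding for a single abnormality."""
    # Where the two consultant MSK radiologists agree, their consensus is
    # used directly as the reference standard.
    if consultant_a == consultant_b:
        return consultant_a
    # Otherwise the senior MSK radiologist (>20 years' experience) arbitrates.
    return senior_arbitrator

# Each abnormality also receives a ground-truther difficulty score on a
# 4-point Likert scale: 1 = easy/obvious ... 4 = hard/poorly visualised.
```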
Interventions
Learn about the drugs, procedures, or behavioral strategies being tested and how they are applied within this trial.
Cases reading
Two remote reading phases on the RAIQC platform, as described in full under Study Groups above: 500 X-rays read without AI assistance (Phase 1), a 4-week washout, then 500 X-rays read with an AI report for each case (Phase 2), using the same structured reporting template in both phases.
Ground truthing
The reference standard is established by two consultant musculoskeletal radiologists, with arbitration by a senior musculoskeletal radiologist in cases of disagreement and a 4-point difficulty score assigned to each abnormality, as described under Study Groups above.
Eligibility Criteria
Check the participation requirements, including inclusion and exclusion rules, age limits, and whether healthy volunteers are accepted.
Inclusion Criteria
* Currently working in the National Health Service (NHS).
Exclusion Criteria
* Non-radiology physicians with a previous career in radiology
Sex: ALL
Accepts Healthy Volunteers: Yes
Sponsors
Meet the organizations funding or collaborating on the study and learn about their roles.
Gleamer
INDUSTRY
Oxford University Hospitals NHS Trust
OTHER
Responsible Party
Identify the individual or organization who holds primary responsibility for the study information submitted to regulators.
Alex Novak
Principal Investigator
Locations
Explore where the study is taking place and check the recruitment status at each participating site.
Oxford University Hospitals NHS Foundation Trust
Oxford, Oxfordshire, United Kingdom
Countries
Review the countries where the study has at least one active or historical site.
United Kingdom
References
Explore related publications, articles, or registry entries linked to this study.
Hussain F, Cooper A, Carson-Stevens A, Donaldson L, Hibbert P, Hughes T, Edwards A. Diagnostic error in the emergency department: learning from national patient safety incident report analysis. BMC Emerg Med. 2019 Dec 4;19(1):77. doi: 10.1186/s12873-019-0289-3.
Donaldson LJ, Reckless IP, Scholes S, Mindell JS, Shelton NJ. The epidemiology of fractures in England. J Epidemiol Community Health. 2008 Feb;62(2):174-80. doi: 10.1136/jech.2006.056622.
National Clinical Guideline Centre (UK). Fractures (Non-Complex): Assessment and Management. London: National Institute for Health and Care Excellence (NICE); 2016 Feb. Available from http://www.ncbi.nlm.nih.gov/books/NBK344251/
Blazar E, Mitchell D, Townzen JD. Radiology Training in Emergency Medicine Residency as a Predictor of Confidence in an Attending. Cureus. 2020 Jan 9;12(1):e6615. doi: 10.7759/cureus.6615.
Snaith B, Hardy M. Emergency department image interpretation accuracy: The influence of immediate reporting by radiology. Int Emerg Nurs. 2014 Apr;22(2):63-8. doi: 10.1016/j.ienj.2013.04.004. Epub 2013 May 30.
York TJ, Jenkins PJ, Ireland AJ. Reporting Discrepancy Resolved by Findings and Time in 2947 Emergency Department Ankle X-rays. Skeletal Radiol. 2020 Apr;49(4):601-611. doi: 10.1007/s00256-019-03317-7. Epub 2019 Nov 21.
van Leeuwen KG, Schalekamp S, Rutten MJCM, van Ginneken B, de Rooij M. Artificial intelligence in radiology: 100 commercially available products and their scientific evidence. Eur Radiol. 2021 Jun;31(6):3797-3804. doi: 10.1007/s00330-021-07892-z. Epub 2021 Apr 15.
Duron L, Ducarouge A, Gillibert A, Laine J, Allouche C, Cherel N, Zhang Z, Nitche N, Lacave E, Pourchot A, Felter A, Lassalle L, Regnard NE, Feydy A. Assessment of an AI Aid in Detection of Adult Appendicular Skeletal Fractures by Emergency Physicians and Radiologists: A Multicenter Cross-sectional Diagnostic Study. Radiology. 2021 Jul;300(1):120-129. doi: 10.1148/radiol.2021203886. Epub 2021 May 4.
Fenton JJ, Taplin SH, Carney PA, Abraham L, Sickles EA, D'Orsi C, Berns EA, Cutter G, Hendrick RE, Barlow WE, Elmore JG. Influence of computer-aided detection on performance of screening mammography. N Engl J Med. 2007 Apr 5;356(14):1399-409. doi: 10.1056/NEJMoa066099.
Chilamkurthy S, Ghosh R, Tanamala S, Biviji M, Campeau NG, Venugopal VK, Mahajan V, Rao P, Warier P. Deep learning algorithms for detection of critical findings in head CT scans: a retrospective study. Lancet. 2018 Dec 1;392(10162):2388-2396. doi: 10.1016/S0140-6736(18)31645-3. Epub 2018 Oct 11.
Patel MR, Norgaard BL, Fairbairn TA, Nieman K, Akasaka T, Berman DS, Raff GL, Hurwitz Koweek LM, Pontone G, Kawasaki T, Sand NPR, Jensen JM, Amano T, Poon M, Ovrehus KA, Sonck J, Rabbat MG, Mullen S, De Bruyne B, Rogers C, Matsuo H, Bax JJ, Leipsic J. 1-Year Impact on Medical Practice and Clinical Outcomes of FFRCT: The ADVANCE Registry. JACC Cardiovasc Imaging. 2020 Jan;13(1 Pt 1):97-105. doi: 10.1016/j.jcmg.2019.03.003. Epub 2019 Mar 17.
Obuchowski NA, Bullen J. Multireader Diagnostic Accuracy Imaging Studies: Fundamentals of Design and Analysis. Radiology. 2022 Apr;303(1):26-34. doi: 10.1148/radiol.211593. Epub 2022 Feb 15.
Smith BJ, Hillis SL. Multi-reader multi-case analysis of variance software for diagnostic performance comparison of imaging modalities. Proc SPIE Int Soc Opt Eng. 2020 Feb;11316:113160K. doi: 10.1117/12.2549075. Epub 2020 Mar 16.
Novak A, Hollowday M, Espinosa Morgado AT, Oke J, Shelmerdine S, Woznitza N, Metcalfe D, Costa ML, Wilson S, Kiam JS, Vaz J, Limphaibool N, Ventre J, Jones D, Greenhalgh L, Gleeson F, Welch N, Mistry A, Devic N, Teh J, Ather S. Evaluating the impact of artificial intelligence-assisted image analysis on the diagnostic accuracy of front-line clinicians in detecting fractures on plain X-rays (FRACT-AI): protocol for a prospective observational study. BMJ Open. 2024 Sep 5;14(9):e086061. doi: 10.1136/bmjopen-2024-086061.
Related Links
Access external resources that provide additional context or updates about the study.
Clinical negligence claims in Emergency Departments in England. Report 2 of 3: Missed fractures. NHS Resolution. March 2022
The NICE Evidence Standards Framework for digital health and care technologies (ECD7). Last Updated: 9 August
Other Identifiers
Review additional registry numbers or institutional identifiers associated with this trial.
310995-C
Identifier Type: -
Identifier Source: org_study_id