Regulation of Artificial Intelligence in Healthcare
Expert clinicians need to be at the forefront of the AI healthcare revolution, say leading scientific bodies in a joint statement from the Institute of Physics and Engineering in Medicine (IPEM), the Royal College of Radiologists (RCR) and the Society of Radiographers (SCoR)
The Institute of Physics and Engineering in Medicine (IPEM), the Royal College of Radiologists (RCR) and the Society of Radiographers (SCoR) have called for a properly trained and funded workforce, and for clear, consistent regulation across AI developers, healthcare providers and professionals, as they issue their responses to the Medicines and Healthcare products Regulatory Agency (MHRA) consultation on the regulation of artificial intelligence (AI) in healthcare.
Across our professions, there is clear consensus that AI must enhance, not replace, clinical expertise. Our members deploy and assure AI systems daily, and they see both the benefits of these systems and the risks that arise when evidence, governance or workforce capacity is insufficient.
Our shared recommendations
IPEM, RCR and SCoR are aligned on three regulatory priorities:
End-to-end assurance across the AI lifecycle: Regulation must require proportionate pre-market evidence, transparent communication of limitations and mandatory post-market surveillance to detect performance drift and bias, with clinicians retaining oversight throughout.
Workforce capacity as a patient safety requirement: Safe AI deployment depends on a trained, resourced workforce. National workforce planning, funded training pathways, recognised roles and protected time must be integral to regulation.
Clear system-wide accountability: Regulation should be clear on where responsibility lies between manufacturers, healthcare organisations and professionals, including expectations for transparency, training, post-market monitoring and liability.
Together, these organisations represent the scientific, clinical and technical professionals responsible for the safe deployment of AI across diagnostic imaging and radiotherapy services. We stand together as a unified frontline voice on the regulation of AI in healthcare.
AI is already embedded across imaging and radiotherapy services, with use expanding rapidly. Regulation must therefore be grounded in real clinical practice, reflecting patient safety, workforce capacity and NHS delivery realities.
The recently announced Cancer Plan reinforces this urgency: its ambitions, such as 75% of cancer patients surviving for five years, depend on earlier diagnosis, timely treatment and high-quality imaging and radiotherapy services, supported by the safe, evidence-based and regulated deployment of AI. We welcome the MHRA’s engagement and believe the regulatory framework must be strengthened to enable safe, sustainable adoption and maintain public trust.
Statements from the Presidents
Mark Knight, President of IPEM, said:
‘AI must be regulated as a safety-critical technology. That requires clear standards across the AI lifecycle and a workforce with the capability and authority to assure these systems in clinical practice.’
Dr Stephen Harden, President of the RCR, said:
‘Clinical radiologists and clinical oncologists see both the promise and risks of AI every day. Regulation must support professional judgement, be underpinned by robust evidence and provide clear accountability.’
Katie Thompson, President of SCoR, said:
‘Radiographers are central to the safe use of AI in imaging and radiotherapy. Regulation must recognise frontline practice and invest in workforce capacity to ensure patient safety.’
Standing together
IPEM, RCR and SCoR stand together in calling for AI regulation that aligns innovation with patient safety, workforce realities and clinical accountability.