This article was originally published in Legal Futures.
In July 2025, the government published its 10-year health plan for England, Fit for the Future, which relies on the successful integration of technology into a new healthcare system to transform the patient care model.
The plan finds that the current NHS model is not fit for purpose and requires a total overhaul. The aim is to ensure a healthier population through preventative, precision medicine – a developing area that featured in the Kennedys report Healthtech in the future – The legal ramifications.
The government lays out a plan to harness technological advances including artificial intelligence (AI), wearables and robotics to streamline NHS services, improve access to care and deliver better patient outcomes.
The vision is that all NHS hospital trusts will be fully AI-enabled within 10 years to support clinical decision making, wearables will become the accepted standard for continuous patient monitoring, and the adoption of surgical robotics will be greatly expanded.
Changes to staff training will be needed within the next three years to support this tech revolution, with training in the use of AI and digital tools becoming essential. For patients, the NHS App is envisaged as the driving force for change, giving them the ability to manage their own healthcare and upload their own health data.
While these innovations provide opportunities for saving lives, saving money and improving access to leading healthcare for all, they also present significant legal risks for healthcare providers including NHS trusts and those working in primary care.
Innovation in this space, particularly in AI tools and data-sharing, raises legal questions over ownership of the technology and data as between the healthcare provider, the tech supplier and the patient, as well as questions of liability and accountability if the technology goes wrong, alongside ethics and regulatory compliance issues.
Legal landscape
While the legal landscape in this area is complex, the legal conversations around liability, regulation and accountability for healthtech should be prioritised to ensure that risks are anticipated and mitigated as far as possible.
We are already seeing claims against NHS trusts from patients alleging they have suffered harm as a result of AI technical failures, and the number of claims will only increase. The following scenarios may become commonplace:
- A healthcare provider deploys an AI tool to assist in diagnosing skin conditions. A missed melanoma leads to delayed treatment and a claim. Who is responsible?
- A wearable device is distributed to patients managing heart conditions. The data the device collects is stored, interpreted and used, but who owns it? Are UK GDPR standards being met?
- A genomic testing programme flags as high-risk a patient who had not consented to their data being used in the research trial. What are the legal, ethical and accountability implications for the healthcare provider involved?
Mitigating legal risk
The good news is that legal risk in the healthtech space for NHS trusts and other healthcare providers can be mitigated, beginning with contractual negotiation with healthtech suppliers.
Contracts should clearly state the obligations of NHS trusts and healthcare providers regarding healthtech and provide for indemnities in respect of patient harm or breach of applicable laws (such as UK GDPR), if caused by the technology.
Contractual uncertainty around legal obligations and indemnities for NHS trusts and healthcare providers will create issues over liability when patient claims arise, exposing the NHS, healthcare providers and insurers to the risk of compensation payouts with limited prospect of recovery.
In the absence of a clear contractual remedy, trusts and healthcare providers may have to prove their losses were caused by the supplier’s breach of contract or negligence. Such litigation is likely to involve a complex hybrid of tort and contract law.
When establishing who is liable for a tech failure, and the extent of any liability and indemnity, the contract will be the first port of call.
Where responsibility for a technical error is unclear, a split settlement between a number of tortfeasors could result. To avoid such risk exposure, contracts should clarify from the outset who is responsible for each operating feature and component of the technology.
This requires advance consideration of the legal scenarios that could unfold through the use of tech in healthcare settings, to ensure that NHS trusts, their clinicians, other healthcare providers and their employees are protected from claims being brought against them.
We expect that future AI guidance in the healthcare sector will need to emphasise the importance of a human expert checking AI output as a risk mitigation strategy.
However, liability will attach where such oversight is not in place, or where the oversight itself is found to be negligent, most likely judged against the Bolam standard. Teams within the healthcare sector using AI will therefore require specific training to mitigate risk further.
The 10-year plan underlines the need for legal foresight and advice for NHS trusts, healthcare providers and insurers in the healthtech space. Conversations about liability for healthtech are needed now more than ever.