Artificial Intelligence (AI)

January 2026

AI technologies are increasingly being adopted in primary care, with common applications including medical scribes, AI-powered triage systems and document management tools. While these technologies have the potential to reduce administrative burden, improve patient access and enhance efficiency, they also raise important concerns regarding liability, regulation, reliability, accuracy and environmental impact.

RCGP recommendations

In September 2025, RCGP Council discussed the use of AI in general practice, considering the growing use of AI tools in primary care and the potential benefits these offer, as well as the significant concerns surrounding their adoption, particularly in relation to liability, regulation, safety and sustainability.

Members can read the full Council paper on AI in the members’ area of the website.

Following this discussion, it is the RCGP's position that AI has the potential to support general practice, but its safe and effective roll-out requires coordinated action across the UK, at national and regional levels. Our key recommendations in relation to the use of AI in general practice are set out below and are strongly supported by the findings of the subsequent research we carried out with the Nuffield Trust:

  • The regulatory framework for AI devices, due for publication in 2026 by the UK Government, needs to be developed urgently, in consultation with health professionals across the UK.
  • National guidance is needed on the ethical, clinical and environmental dimensions of AI to support responsible adoption in general practice and other NHS settings, and to ensure clarity on key issues such as liability. This guidance should ensure a clear, consistent approach across the UK.
  • Commissioners should play a key role, offering access to clinical safety experts and central procurement, to reduce the burden on practices.
  • Education and training on AI should be embedded across medical education and CPD, with real-world application and regular updates.
  • Clinician involvement in design must be standard practice, with AI developers working alongside healthcare providers and professional bodies.
  • Suppliers of AI tools must provide adequate onboarding and support to clinicians for confident and safe use.
  • Clinicians must be given sufficient time and space to thoughtfully implement, evaluate and adopt AI tools within their practices to ensure safe, effective and context-sensitive integration.
  • Evaluation of AI tools must meet recognised ethical standards, with transparent reporting of both benefits and risks. This includes publishing negative as well as positive findings, and clearly disclosing limitations, unintended consequences and potential harms. Selective reporting undermines trust and risks patient safety; balanced evaluation is essential to informed decision-making and public confidence.

RCGP and Nuffield Trust research report

The report, ‘How are GPs using AI? Insights from the front line’, co-authored by the Nuffield Trust and the RCGP (December 2025), provides important evidence underpinning the RCGP's recommendations. Using a mixed-methods approach, the research combined findings from our annual GP Voice Survey with a series of online focus groups to explore how GPs across the UK are using AI in clinical practice.

How are GPs using AI? Insights from the front line (external PDF)

The report found that more than one in four GPs (28%) reported using AI tools in their work. Among those who specified their use, the most common applications were clinical documentation and note-taking (57%), followed by professional development (45%) and administrative tasks (44%). Fewer of the GPs who said they use AI reported using it to support clinical decision-making (28%).

GPs raised specific concerns relating to professional liability and medico-legal risk. These issues emerged as the most significant barriers to AI adoption, cited by 89% of non-users and by 80% of users of both practice-selected and self-obtained AI tools. A lack of regulatory oversight was the second most commonly cited concern, followed by risks of clinical error and issues of patient privacy and data security.

While AI is already being used by many GPs, progress is constrained by unresolved questions around liability, regulation, safety and governance. Addressing these issues is essential if AI is to be adopted in a way that supports clinicians, protects patients and delivers genuine benefit to general practice.

The RCGP will continue to advocate for the delivery of the recommendations set out above on the use of AI in general practice, and to examine emerging evidence on the benefits and risks of AI technologies.

List of relevant guidance on adoption/use of AI in healthcare settings

There is currently no centralised or national guidance specifically tailored to AI use in general practice, although a new regulatory framework for AI in healthcare is due to be published this year (2026). In the meantime, national and regional bodies have produced a range of relevant documents, some published and others in development; these are listed in the table below.

Organisation | Coverage | Title and remit
GMC | UK | Guidance on how good medical practice applies to the use of AI
BMA | UK | Principles for AI and its application in healthcare, covering the potential benefits and drawbacks of current and emerging AI technologies
CQC | England | GP mythbuster 109: Use of AI in GP services, covering good regulatory, governance and compliance processes
NHS Confederation | England, Wales and Northern Ireland | Jargon buster to demystify the language of AI and help improve AI literacy
NHS England (NHSE) | England | AI knowledge repository to support the responsible adoption of AI within the wider NHS
NHSE | England | Guidance on the use of AI-enabled ambient scribing products in health and care settings
NHSE | England | Ensuring Safe and Assured Adoption of AI Scribe Technology (Priority Notification)
NHSE | England | Digital Primary Care: Good Practice Guidelines for GP electronic patient records (GPGv5), Artificial intelligence (AI) and machine learning. Aimed at those working with digital services in general practice, covering tips and key considerations when planning the use of AI
NHSE | England | DCB0160: Clinical Risk Management: its Application in the Deployment and Use of Health IT Systems, designed to help health and care organisations assure the clinical safety of their health IT software
NHSE | England | Ambient Voice Technology Self-Certified Supplier Registry
NHS Scotland | Scotland | Interim position on the use of Ambient Scribes
Health Education and Improvement Wales (HEIW) | Wales | AI Education and Skills Landscape Review, which provides recommendations to address gaps in AI skills and literacy across the healthcare workforce
HEIW | Wales | A safe place for NHS Wales healthcare professionals to learn AI, linking to a free ‘foundations in AI’ course