Transforming Healthcare – AI scribes: medicolegal issues and potential impact on claims

November 2025

It is well understood by doctors that maintaining clear and accurate medical records is essential for the continuing good care of patients. Good medical practice involves keeping accurate, up to date and legible records and ensuring that medical records are held securely and are protected against unauthorised access.[1]

There is little doubt that, with increasingly busy practices and patient workloads, the obligations relating to clinical documentation and administrative tasks can lead to increased stress and burnout for doctors. The emergence of Artificial Intelligence (AI) scribes[2] has the potential to go a long way towards reducing this burden. The software is designed to convert an audio recording of a consultation (speech to text) into a written transcript of that consultation.

With the proliferation of AI scribes on the market and increasing data breaches across many sectors, doctors must ensure that the software has been tested and is suitable for use. Importantly, doctors should ensure that the software provider has robust systems in place to protect and store a patient’s sensitive health information.

AI scribes make a recording of the consultation. State and Territory surveillance laws[3] prohibit the recording of a conversation without the consent of all parties to the conversation. That consent should include consent to the making and communication of the recording, and the use of clinical notes generated from the recording by the AI scribe.

In addition, Australian privacy laws regulate the collection, use, disclosure and storage of health information.[4] A clinic’s or doctor’s privacy policy should state that patient personal information may be disclosed to the software provider and identify the countries in which the provider will store patients’ information. Clinics and doctors should be satisfied that the software provider will only use patient information for the purpose of generating clinical notes and will not disclose the personal information to any subcontractors or other third parties.

In terms of patient consent and the accuracy of records:

  1. it is recommended to obtain written consent (for example, on new patient forms) and then verbal consent prior to the recording;
  2. consent must be obtained from everyone present in the consultation, not just the patient; and
  3. it is the responsibility of each individual doctor to check that AI-generated medical records are accurate and reflect what happened during a consultation. This check should be done while the consultation is still fresh in the doctor’s mind, to protect against the risk of AI errors and ‘hallucinations’.[5]

Inadequate clinical notes are often an issue when defending legal claims against doctors, or when assisting doctors where Medicare has commenced an investigation into their billing practices. AI scribes are likely to reduce the need for doctors to rely on evidence of their ‘usual practice’ and to narrow factual disputes in cases where their notes previously may not have reflected the entire discussion with a patient.

AI scribes can reduce the administrative burden on doctors, improve the accuracy of notes and allow doctors to spend more time with their patients. However, potential medico-legal issues include: a failure to keep the software updated, leading to system malfunctions or data breaches; clinics that employ doctors without providing adequate training in the software; and privacy complaints from patients if consent is not obtained or the use of an AI scribe is not disclosed.


[1]   Paragraph 10.5, Good medical practice: a code of conduct for doctors in Australia.

[2]   Also referred to as AI transcription software or ambient voice technology.

[3]   See, for example, Surveillance Devices Act 1999 (Vic).

[4]   Privacy Act 1988 (Cth).

[5]   Where the AI model produces false, misleading or non-existent information.
