Using AI to write clinical notes and reports

The ethics of using artificial or augmented intelligence (AI) to write clinical notes and reports has been raised in conversations with the Ethics Officer and members of the Ethics Review Committees. Although no complaints have arisen from the use of AI to date, this appears to be an emerging topic of interest to members.

You are responsible for the actions of any AI you use

The ethical perspective on the use of AI to write clinical notes and reports is clear:

As the healthcare professional, you remain responsible for clinical notes and reports developed using AI, which means that you will also be held responsible for the actions of any AI you use.

The American Medical Association (AMA) has published a range of resources on AI, including the publication ‘ChatGPT and generative AI: What physicians should consider’ (1). This includes a summary of the currently known limitations of Large Language Model (LLM) natural language processing tools like ChatGPT, namely:

  • Risk of incorrect or falsified responses.
  • Training dataset limitations.
  • Lack of knowledge-based reasoning.
  • LLMs are not currently regulated.
  • Patient privacy and cybersecurity concerns.
  • Risk of bias, discrimination, and promoting stereotypes.
  • Liability may arise from use.

The Australian Alliance for Artificial Intelligence in Healthcare’s 2023 National Policy Roadmap for Artificial Intelligence in Healthcare (2) notes particular issues with AI used for clinical note-taking and report writing:

“While clinical AI is subject to TGA software as a medical device (SaMD) safety regulation, non-medical generative AI like ChatGPT falls into a grey zone, where it is being used for clinical purposes but evades scrutiny because they are general purpose technologies not explicitly intended for healthcare. Uploading sensitive patient data into a non-medical AI like ChatGPT hosted on United States servers is also problematic from a privacy and consent perspective.” (2)

Any one of these limitations could lead to multiple potential breaches of the Code of Conduct for audiologists and audiometrists, including, but not limited to, those relating to:

Standard 1 – Members must provide hearing services in a safe and ethical manner
Standard 2 – Members must provide hearing services in a respectful manner and not discriminate against anyone they interact with in a professional capacity
Standard 4 – Members must promote the client’s right to participate in decisions that affect their hearing health
Standard 16 – Members must comply with all relevant laws and regulations
Standard 17 – Members must adhere to appropriate documentation standards
Standard 18 – Members must be covered by appropriate indemnity insurance 

Before you use AI, you need to understand how it works and how it will affect your clinical note-taking and/or report writing. This means that before you use AI in your clinical practice you must:

  • acknowledge and accept the limited evidence on AI, and
  • put in place processes and systems to ensure that any potential risks are addressed.

Furthermore, you need to be able to explain to your clients how AI is used in your clinical practice and how this may affect your clinical decision-making processes (e.g. how it may affect the content of your clinical notes and/or reports and how this, in turn, may affect the advice given). This relates to a client’s right to participate in decisions that affect their hearing health, as required under Standard 4 above.

There is little evidence on the accuracy or potential risks of AI

A quick internet search turns up numerous software products that claim to provide allied health professionals with AI-supported clinical note-taking and report-writing tools.

The New South Wales Government Agency for Clinical Innovation has a living table available on its website titled ‘AI: automating indirect clinical tasks and administration: living evidence’ (3). This living table is updated with relevant results from weekly PubMed searches. Nonetheless, it includes only 29 publications across all indirect clinical tasks and administration, with only three of these relating to clinical note-taking (4-6).

Two of these three publications are by health software providers and report anecdotal support for AI from healthcare clinicians (4, 6), and the third (5) likewise contains no references to patient perspectives, clinical trials or other reports on the accuracy of transcription and AI summary functions. None of these publications are in peer-reviewed academic journals.

If you do not fully understand how AI works, how it stores and uses your data, and how its use may affect your compliance with the Code of Conduct for audiologists and audiometrists, you should not use it in your clinical practice.

At this stage, it is likely that few, if any, audiologists or audiometrists practising in Australia currently have the skills, expertise or resources to address the significant risks explored above, given the lack of evidence on AI and its impacts.

“To prepare the sector for the increased use of AI, we will need to support the creation of national consensus on foundational clinical competencies, scopes of professional practice, and codes of professional conduct to use AI, and provide a basis for patient safety, service quality and practitioner credentialling.” (2)

References

(1) American Medical Association, 2023. ChatGPT and generative AI: What physicians should consider. Available from: https://www.ama-assn.org/system/files/chatgpt-what-physicians-should-consider.pdf

(2) Australian Alliance for Artificial Intelligence in Healthcare, 2023. A National Policy Roadmap for Artificial Intelligence in Healthcare. Available from: https://aihealthalliance.org/wp-content/uploads/2023/11/AAAiH_NationalPolicyRoadmap_FINAL.pdf

(3) Critical Intelligence Unit, Agency for Clinical Innovation, 2024. AI: automating indirect clinical tasks and administration: living evidence. Available from: https://aci.health.nsw.gov.au/statewide-programs/critical-intelligence-unit/artificial/automating-indirect-clinical-tasks

(4) Amazon.com Inc, 2023. AWS announces AWS HealthScribe, a new generative AI-powered service that automatically creates clinical documentation. New York: Amazon.com Inc. Available from: https://press.aboutamazon.com/2023/7/aws-announces-aws-healthscribe-a-new-generative-ai-powered-service-that-automatically-creates-clinical-documentation

(5) Pifer R, 2023. Amazon launches generative AI-based clinical documentation service. Washington, DC: Healthcare Dive. Available from: https://www.healthcaredive.com/news/amazon-generative-ai-clinical-documentation-healthscribe/688996/

(6) MIT Technology Review Insights, 2019. The AI Effect: How artificial intelligence is making health care more human. Cambridge, MA: MIT. Available from: https://www.gehealthcare.co.uk/-/jssmedia/61b7b6b1adc740e58d4b86eef1bb6604.pdf

© Audiology Australia, the Australian College of Audiology and the Hearing Aid Audiology Society of Australia, 2023