Why you can expect to see more Aussie doctors using AI

Next time you visit a doctor, a psychiatrist or even a vet, artificial intelligence could be listening to your concerns and taking notes.

AI scribes, as the tools are known, are becoming more popular in Australian medical clinics, promising to save doctors time by writing patient notes, referral letters and even hospital discharge statements.

Two major medical authorities have issued guidance for their use in Australia in recent weeks, boosting confidence in the technology and noting that AI has the potential to “improve health outcomes” for patients.

But the guidelines also warn that AI tools should not be allowed to make recommendations about patients’ medical treatment or to summarise consultations without oversight.

Even medical AI proponents say the technology may never be able to work without close scrutiny and is not designed to replace doctors.

Concerns about the use of AI scribes were raised at a recent Senate inquiry into adopting artificial intelligence, where experts revealed the tools were not subject to Therapeutic Goods Administration approval because they are not classified as medical devices.

Despite the absence of rules, Australian Alliance for Artificial Intelligence in Healthcare director Enrico Coiera said the technology was in “routine use” within the healthcare system.

“Digital scribes are used daily in general practice to listen in on patient conversations and summarise records for them automatically,” he said.

The Australian Health Practitioner Regulation Agency issued guidelines for the use of AI scribes last week, focusing on accountability and transparency for doctors and informed consent for patients.

The rules join guidance from the Royal Australian College of General Practitioners issued in July, and a position statement from the Australian Medical Association.

Darren Ross, chief executive of the software firm PatientNotes, says the rules will not only help practitioners use the technology but also signify industry acceptance.

“We’ve been pushing for this for quite some time,” he said.

“We’re glad some of the governing bodies are coming out to say it is safe to use but you need to still be conscious that as a practitioner or as a doctor, you are still responsible for your notes.”

Mr Ross, who also works as a physiotherapist, says the AI-powered transcription technology is used by a range of medical professionals, including GPs, surgeons, psychologists, psychiatrists and chiropractors, as well as vets and animal therapists.

It works by recording and transcribing patient consultations, and preparing summaries and notes according to customised templates.

From those notes, practitioners can ask the technology to create referral letters or patient summaries, potentially saving minutes or hours of their time.

“It might be a 10-minute doctor appointment or it might be a three-hour session where a therapist is working with a child with different levels of neural processing and they’re doing play or animal therapy,” he said.

“It takes away the need to do minute-by-minute documentation.”

But the notes are only intended to be a draft, Mr Ross says, and all documentation must still be reviewed and edited as necessary by trained medical professionals.

“It’s a tool and a resource – it’s not a clone of you that means you can be out on the golf course and not do your notes,” he told AAP.

The AHPRA guidelines mirror this warning, telling medical professionals who use AI scribes they must understand the software’s data policies, tell patients about its use, seek consent and take responsibility for its outcomes.

“Practitioners must apply human judgement to any output of AI,” the rules state.

“If using an AI scribing tool, the practitioner is responsible for checking the accuracy and relevance of records created using generative AI.”

Australian Medical Association NSW president Michael Bonning, who has worked on AI technology in the US, says AI scribe guidelines stress that medical professionals must bear responsibility for the software they deploy.

While transcription tools present a low risk in terms of generative AI use, doctors need to be aware of how patient data is stored, used and shared by the software, in addition to checking the accuracy of its output.

“We need to be thoughtful and careful about what we record and how we maintain privacy and confidentiality for patients,” Dr Bonning said.

“The responsibility always lives with the practitioner to be mindful and thoughtful in what they do.”

Data privacy is one of the reasons Austin Health chose Microsoft’s closed system to test generative AI, chief technology officer Alan Pritchard says.

The Victorian provider is also testing whether AI can be used to identify and categorise medical documents, such as notes and referrals, and has previously used the technology to identify returning hospital patients and refer them to specialists.

Mr Pritchard says AI tools have the potential to make treating patients easier for clinicians, but questions remain over whether their output can be relied upon and how well it will be scrutinised.

“The issue we need to be aware of … is to what extent people just rely on the output the artificial intelligence gives them and to what extent do they engage with it and critically review the content,” he said.

“Studies already have shown that if an artificially intelligent tool is good, people start to not really question too much the content coming out.”
