The AI Will See You Now: Thinking About the Implications of AI for Practice

Apr 22, 2024
By: Craig Roxborough, Registrar and CEO

Recently I had the pleasure of attending the Ontario Physiotherapy Association (OPA) InterACTION conference. The keynote was focused on Artificial Intelligence (AI), providing some background on the development of this technological innovation and discussing its role and future in clinical practice.

It was clear in the presentation and discussion that there is potential here. AI will inevitably evolve to a point where it augments or complements clinical practice, making it more efficient and potentially more effective. It was also clear that there are a lot of unknowns and risks, particularly given the ‘black box’ nature of these tools and our inability to ‘see’ how they work.

So, I wanted to take a moment to explore AI a bit further and outline some key considerations for physiotherapists looking to integrate these tools in their practice. This is not the first time an innovation has landed on our doorstep, so reviewing existing guidance is a good start as we navigate this evolving world together.

Supporting Practice

While there is no consensus yet on how effective or helpful AI tools will be, early indications suggest there are real benefits to using AI in practice.

We know that many healthcare professionals are facing burnout, particularly from the more administrative tasks of managing a practice. Tools are already being developed to help, either by supporting record-keeping practices or by producing treatment recommendations after a clinical assessment. At InterACTION, we heard about using AI to build out routine exercise protocols.

Off-loading these tasks to AI is attractive because it reduces administrative burden and enables physiotherapists to spend more time on patient care. But while these benefits may be real, there are also risks that need to be addressed, and human intervention and oversight remain essential.

Mitigating Risks and Addressing Challenges

Tools that facilitate charting, diagnosis, or treatment recommendations are not new – dictation and transcription services already exist, and clinical practice guidelines provide an algorithmic approach to assessing and treating a patient. What these tools have in common is the need for a clinician to ensure the output is sound and their legal and professional obligations are met. The same is true for AI.

Privacy

One of the biggest questions around the use of AI in healthcare relates to privacy, particularly how patient health information is managed, stored, and used within these systems.

In Ontario, patient health information is protected by the Personal Health Information Protection Act (PHIPA). Any software tool with access to personal health information needs to be used in compliance with PHIPA. This is true for electronic medical records and virtual care tools, as well as any AI tool being integrated into practice.

When adopting AI tools, you may find it very difficult to ensure that you abide by your legislative requirements. Concerns include where the data is stored (i.e., outside of Canada), whether the data is being scrubbed and used for other purposes, and what security protocols exist to protect information in the system.

PHIPA is currently being updated to set requirements for all consumer electronic service providers managing patient health information. This may be helpful going forward, but ultimately, it’s your responsibility to ensure you protect patients’ health information in compliance with PHIPA.

Consent

Given the uncertainty and unique nature of AI, the default recommendation emerging at this point is for clinicians to obtain consent from patients before using AI. This includes explaining to patients what the tool is, how it will be used, what the risks are, and ensuring they are comfortable with its use in their care.

Bias

Current AI tools develop their abilities by training on extensive data sets. These data sets are likely to reflect or contain bias, leading some critics to warn that AI tools risk marginalizing or discriminating against equity-seeking groups. Clinicians must use their judgement to evaluate the outputs and ensure that the specific circumstances of the patient are always considered and reflected in the outcome.

Accuracy

The accuracy of AI tools in charting or in making diagnoses and treatment recommendations has not yet been fully validated. Ultimately, physiotherapists are responsible for their charting and for the care they provide, regardless of the process. You need to evaluate any outputs from AI tools to ensure they accurately and comprehensively capture the care that has been provided, support a treatment plan that is clinically sound, and reflect the patient’s unique needs.

Looking Forward

We’re only just scratching the surface. Regulators are actively examining the implications of AI and developing a regulatory response that ensures we move forward ethically and responsibly. In fact, this blog has been informed by existing guidance from the College of Physicians and Surgeons of Alberta.

For now, it’s helpful to revisit our core obligations to ensure these tools are being integrated responsibly. Keep these points in mind as you consider integrating AI tools into your practice:

  • Protect patient health information at all times.
  • Engage your patients in the process and obtain their consent.
  • Critically evaluate any output generated by these tools to ensure they reflect the unique circumstances of patients.
  • Only make diagnoses and treatment recommendations that are clinically sound and align with your professional judgment.
  • Proceed with optimism and caution.

Comments
  1. Navya | Aug 26, 2024
    Awesome post.
  2. Navya | Aug 13, 2024
    Awesome details. Thanks!
  3. Ben | May 16, 2024

    I agree with Dev... but take it further. It isn't only business owners that are capitalizing on us. Hospital management, insurers, other regulated healthcare professionals, and even our own patients do that every day. Hospitals require a plethora of statistics and expect PTs to justify every minute of their day, all of which decreases patient care. Insurers (including WSIB) expect premium services at a minimal price, so we either have to choose not to take certain third-party payor patients (making it harder for those patients to get quality care) or make up the financial losses elsewhere (it costs other patients more to get care, or third-party payor patients get less time and attention than other patient types). Other healthcare professionals occasionally pressure us for extra work or time ("I just have this one patient I need you to see"), or pass off complex patients whose needs are not within our scope of practice (i.e., depression, anxiety, etc.). Finally, our own patients sometimes do it: trying to persuade us to take their husband, wife, kid, etc., ahead of the people on the waiting list; asking us to fill out forms so they don't have to pay their physicians' much higher fees (while we're bound by the College to a specific recommended price that hasn't been updated for inflation); or no-showing for appointments and then not paying cancellation fees, or just never attending, which loses time in which another patient could be treated.

    So there are a lot of people capitalizing on PTs. The College itself even does.

  4. Darcy | Apr 23, 2024

    I work in the hospital setting, IP and OP. One of the pediatricians I work with closely has been using AI software for about 3 months now for his consults in the office. The AI software is about 95% accurate and reports the whole visit in SOAP format while filtering out non-essential conversation. He raves about it and says that it has saved him 1-1.5 hours a day, as he doesn't need to dictate now. He simply reviews the note and saves.

    I think this would be great software for OP PTs to use in the private practice setting, where SOAP notes are used and access to a computer is right there.

    Of course, this is only one example of how AI can help with productivity, but I believe it is quite useful in certain situations.

  5. Kishore ML | Apr 23, 2024
    It is better to have AI as supportive guidance for practice where the care involved is complex in terms of ethical dilemmas, unrealistic expectations, and conflicts of interest, as seen in various situations while physiotherapy practice is still evolving. Any thoughts?
  6. Anonymous | Apr 23, 2024

    So basically, no difference from what we normally have to do. Just because AI is utilized, that doesn't change our responsibilities. I love how people who have no idea how AI works want to lecture on being careful with it. I've worked with AI models for a decade, assisting in neural network and large language model development and their integration into healthcare. This isn't new. It's only new to the general public and non-tech folk.

    Nothing changes on our end...we are ultimately responsible for what goes into a chart, on a patient care sheet, a prescription, etc.  So any output from an AI should be considered a draft, to be reviewed and approved or corrected.

    If you use AI to do your work for you, you will ultimately fail and be held to task for that.  Don't be lazy.  Do your own work.  If you use it to bounce ideas off of, summarize research, summarize a patient encounter or write a letter or note, then READ it over with a critical eye. 

    Ultimately, if we realized that most of the paperwork we do for insurers, WSIB and others goes unread or ignored, just thrust into a file somewhere, we could eliminate the need for AI by simply having charting requirements that make sense and are practical. Instead, we have exhaustive requirements that decrease patient care, focus on paper trails, and make most of us roll our eyes at having to do these tasks. Therefore, costs to patients and systems will go up, patient care will go down, and eventually we'll realize that seeing 7 or 8 patients per day and spending 3-4 hours a day charting on them and writing reports is a ridiculous way of thinking we are "healthcare professionals." Maybe then we'll change. Until then, use AI all day long and use it a lot. Just fact-check it and proofread it.

  7. Dev | Apr 23, 2024
    I am open to new ideas. From experience, business owners take this to benefit revenue or profit more. What always starts with patient care gets lost along the way. Therapists end up seeing more patients at the end of the day and being more responsible for these processes. It always feels like there is not much to protect the therapists; business owners keep capitalising on us.
