The healthcare landscape is changing fast thanks to the introduction of artificial intelligence. These technologies have shifted decision-making power away from nurses and on to the machines. Michael Kennedy, who works as a neuro-intensive care nurse in San Diego, believes AI could destroy nurses’ intuition, skills, and training, leaving patients watched by more machines and fewer pairs of eyes. Here is Michael’s story, as told to Coda’s Isobel Cockerell. This conversation has been edited and condensed for clarity.
Every morning at about 6:30am I catch the trolley from my home in downtown San Diego up to the hospital where I work, in a place called La Jolla. Southern California isn’t known for its public transportation, but I’m the weirdo who takes it, and I love it. It’s fast, it’s easy, I don’t have to pay for parking, it’s wonderful. A standard shift is 12 hours, and it ends up being 13 by the time you give your report and get all your charting done, so you’re there for a very long time.
Most of the time, I don’t go to work expecting catastrophe. Of course it happens once in a while, but usually I’m just going into a normal job, where you do routine stuff.
I work in the neuro-intensive care unit. The majority of our patients have just had neurosurgery for tumors or strokes. It’s not a happy place most of the time. I see a lot of people with long recoveries ahead of them who need to relearn basic skills: how to hold a pencil, how to walk. After a brain injury, you lose those abilities, and it’s a long process to get them back. It’s not like we do a procedure, fix them, and they go home the next day. We see patients at their worst, but we don’t get to see the follow-up. If we’re lucky, we might hear months later that they’ve made a full recovery. It’s an environment where there’s not much instant gratification.
As a nurse, you end up relying on intuition a lot. It’s in the way a patient says something, or just a feeling you get from how they look. It’s not something I think machines can do, and yet, in recent years, we’ve seen more and more artificial intelligence creep into our hospitals.
I get to work at 7am. The hospital I work at looks futuristic from the outside: it’s this high-rise building, all glass and curved lines. It’s won a bunch of architectural awards. The building was financed by Irwin Jacobs, the billionaire founder of Qualcomm, a huge San Diego tech company. I think the hospital being bankrolled by a tech billionaire really has a huge amount to do with the way they see technology and the way they dive headfirst into it.
They always want to be on the cutting edge of everything. And so when something new comes out, they’re going to jump right on it. I think that’s part of why they dive headfirst into this AI thing.
We didn’t call it AI at first. The first thing that happened was these new innovations just crept into our electronic medical record system. They were tools that monitored whether certain steps in patient treatment were being followed. If something was missed or hadn’t been done, the AI would send an alert. It was very primitive, and it was there to stop patients falling through the cracks.
Then in 2018, the hospital bought a new program from Epic, the electronic medical record company. It predicted something called “patient acuity”: basically, the workload each patient demands from their nursing care. It’s a really important measurement we have in nursing, to determine how sick a person is and how many resources they will need. At its most basic level, we just categorize patients as low, medium or high need. Before the AI came in, we basically filled in this questionnaire, which would ask things like how many meds a patient needed. Are they IV meds? Are they crushed? Do you have a central line versus a peripheral? That sort of thing.
This determined whether a patient was low, medium or high-need, and we’d figure out staffing based on that. If you had lots of high-need patients, you needed more staffing. If you had mostly low-need patients, you could get away with fewer.
We used to answer the questions ourselves, and we felt like we had control over it. We felt like we had agency. But one day, it was taken away from us. They bought this AI-powered program without telling the unions, nurses, or representatives. They just started using it and sent out an email saying, ‘Hey, we’re using this now.’
The new program used AI to pull from a patient’s notes and charts, and then gave them a special score. It was suddenly just running in the background at the hospital.
The problem was, we had no idea where these numbers were coming from. It felt like magic, but not in a good way. It would spit out a score, like 240, but we didn’t know what that meant. There was no clear cutoff for low, medium, or high need, making it functionally worthless.
The upshot was, it took away our ability to advocate for patients. We couldn’t point to a score and say, ‘This patient is too sick, I need to focus on them alone,’ because the numbers didn’t help us make that case anymore. They didn’t tell us if a patient was low, medium, or high need. They just gave patients a seemingly random score that nobody understood, on a scale of one to infinity.
We felt the system was designed to take decision-making power away from nurses at the bedside, and to deny us the power to have a say in how much staffing we need.
That was the first thing.
Then, earlier this year, the hospital got a huge donation from the Jacobs family, and they hired a chief AI officer. When we heard that, alarm bells went off. “They’re going all in on AI,” we said to each other. We found out about this scribe technology that they were rolling out, called Ambient Documentation. They announced they were going to pilot the program with the physicians at our hospital.
It basically records your encounter with your patient. Then, like ChatGPT or another large language model, it takes everything and just auto-populates a note, or your “documentation.”
There were obvious concerns with this, and the number one thing that people said was, “Oh my god, it’s like mass surveillance. They’re gonna listen to everything our patients say, everything we do. They’re gonna track us.”
This isn’t the first time they’ve tried to track nurses. My hospital hasn’t done this, but there are hospitals around the US that use tracking tags to monitor how many times you go into a room, to make sure you’re meeting these metrics. It’s as if they don’t trust us to actually care for our patients.
We leafletted our colleagues to try to educate them on what “Ambient Documentation” actually means. We demanded to meet with the chief AI officer. He downplayed a lot of it, saying, ‘No, no, no, we hear you. We’re right there with you. We’re just beginning; it’s just a pilot.’ A lot of us rolled our eyes.
He said they were adopting the program because of physician burnout. It’s real: documentation is one of the most mundane aspects of a physician’s job, and they hate doing it.
The rationale for bringing in AI tools to monitor patients is always that they will make life easier for us, but in my experience, technology in healthcare rarely makes things better. It usually just speeds up the factory floor, squeezing more out of us so they can ultimately hire fewer of us.
“Efficiency” is a buzzword in Silicon Valley, but get it out of your mind when it comes to healthcare. When you’re optimizing for efficiency, you’re getting rid of redundancies. But when patients’ lives are at stake, you actually want redundancy. You want extra slack in the system. You want multiple sets of eyes on a patient in a hospital.
When you try to reduce everything down to a machine that one person relies on to carry out decisions, there’s only one set of eyes on that patient. That may be efficient, but by creating efficiency you’re also creating a lot of potential points of failure. So efficiency isn’t as efficient as the tech bros think it is.
In an ideal world, they believe, technology would take away mundane tasks, allowing us to focus on patient encounters instead of spending our time typing behind a computer.
But who thinks recording everything a patient says and storing it on a third-party server is a good idea? That’s crazy. I’d need assurance that the system is 100 percent secure, though nothing ever is. We’d all love to be freed from documentation requirements and be more present with our patients.
There’s a proper way to do this. AI isn’t inevitable, but it’s come at us fast. One day ChatGPT was a novelty, and now everything is AI. We’re being bombarded with it.
The other thing that’s burst into our hospitals in recent years is an AI-powered alert system: alerts that ping us to make sure we’ve done certain things, like checking for sepsis, for example. They’re usually not that helpful, or not timed very well. The goal is to stop patients falling through the cracks, which is obviously a nightmare scenario in healthcare. But I don’t think the system is working as intended.
I don’t think the goal is really to provide a safety net for everyone. I think it’s actually to speed us up, so we can see more patients and cut visits down from 15 minutes to 12 minutes to 10. Efficiency, again.
I believe the goal is for these alerts to eventually take over healthcare: to tell us how to do our jobs, rather than have hospitals spend money training nurses and letting them develop critical thinking skills, experience, and intuition. So we basically just become operators of the machines.
As a seasoned nurse, I’ve learned to recognize patterns and anticipate potential outcomes based on what I see. New nurses don’t have that intuition or foresight yet; developing critical thinking is part of their training. When they experience different situations, they start to understand them instinctively.
In the future, with AI and alerts pinging them all day, reminding them how to do their job, new cohorts of nurses might not develop that same intuition. Critical thinking is being shifted elsewhere, to the machine. I believe the tech leaders envision a world where they can crack the code of human illness and automate everything based on algorithms. They just see us as machines that can be figured out.
The artwork for this piece was developed during a Rhode Island School of Design course taught by Marisa Mazria Katz, in collaboration with the Center for Artistic Inquiry and Reporting.