Medical schools are missing the mark on artificial intelligence

Ready or not, health care is undergoing a massive transformation driven by artificial intelligence. But medical schools have barely started to teach about AI and machine learning — creating knowledge gaps that could compound the damage caused by flawed algorithms and biased decision-support systems.

“We’re going to be at a point where we’re not going to be able to catch up and be able to call out the technology defects or flaws,” said Erkin Ötleş, a machine learning researcher working toward his medical degree and Ph.D. at the University of Michigan. “Without being armed with that set of foundational knowledge into how these things work, we’re going to be at a disadvantage.”

In a recent commentary published in Cell Reports Medicine, Ötleş and a group of physicians and educators from the University of Michigan called for medical educators to make AI less of an afterthought and more of a core concept in undergraduate medical training. They emphasize the idea of a spiral curriculum, in which students learn key points about AI in medicine at the start, then turn back to it again and again as they learn more specialized skills.


But that won’t be easy to execute, said co-author and former Michigan medical school dean Jim Woolliscroft. Bureaucratic inertia keeps medical school curriculums from evolving quickly, and faculty themselves may not yet have the expertise to teach a new generation of doctors. In an interview with STAT, the student and the educator elaborated on how medical educators can kick-start the process of revamping AI training.

What’s the current state of medical education in artificial intelligence?


Jim Woolliscroft: Medicine’s educational programs are essentially not evolving at all. There are little tweaks, but not the sort of seismic changes that need to occur.

Erkin Ötleş: It’s not really specialized at all in terms of artificial intelligence and machine learning. What you will often see, for those interested, is they will either take the time out to do either a master’s, or what I’m doing, a combined MD/Ph.D. Otherwise, people might be exposed through the research that they’re able to do as electives. As a student, you still need to go and direct your own studies and you need to go inform yourself.

Is that enough? Or should AI be incorporated into the general medical curriculum?


Ötleş: Artificial intelligence and machine learning is going to be so pervasive in our everyday practice that everyone will need to have some base level of understanding in order to at least evaluate the tools that they’re using. They don’t need to be experts and they don’t need to develop this stuff, but they need to be able to say, “I don’t think this works well,” and then be able to call up the developer and say, “I think we have a problem.” We need to start teaching people quickly, because we’re going to be behind the eight ball.

Woolliscroft: Medical students don’t know about this stuff, and they need to see it as being as basic as pharmacology and physiology. Already, machine learning algorithms and AI more generally are essentially ubiquitous.

One of the real problems is our faculty aren’t even aware of it. Pre-Covid, I gave a lecture on machine learning, and people were going, “Why is this important?” They didn’t even know that at the university hospital, there were eight programs at that time running continuously in the background, monitoring physiologic variables on their patients. Faculty don’t have the expertise to teach it.

When AI is being taught at medical schools, what kind of strategies are being used?

Ötleş: There’s a lot of focus on specific technologies or tools, and it shouldn’t be that. There’s been discussion of: All these techniques use a lot of Python programming, so we should teach Python programming to medical students. And, you know, I love Python programming, but I don’t think all my colleagues coming out of medical school should know how to program in Python.

Woolliscroft: Several of the examples that seem to have worked, like in radiology, are really contextually grounded. That’s good. But what we need is to have students at a much earlier level understand some basic questions to ask. What was the database? What attention was paid to ensure that the data used to build the algorithm was clean? What were the gold standards that were utilized? All of these are questions that are widely applicable.

So what would a more successful framework for AI medical education look like?

Ötleş: We need to prioritize teaching basic concepts of AI and machine learning because we have limited time. We need to figure out what the most important things are and use that as a foundation. And once you have that foundation, you can continually refer back to it over time, and then grow it if you need to or connect it to other concepts.


Woolliscroft: One of the things that hasn’t been done, to my knowledge, is this concept of the spiral curriculum: You come back to it again and again as students move into clinical areas. So when they’re on radiology, they can ask: So this mammogram interpretation, what was it based on? Did it include women from, say, Egypt that have a lot more inflammatory breast cancer? It didn’t. Oh, ok. Well, here in Michigan, we have a lot of people from the Middle East. So is this going to be applicable to this population or not? As they get into all of these different things, they’ll have a foundation that they can plug into these specific examples to fill out the flesh of those bones that have been laid.

What are the biggest hurdles to implementing that kind of change in a curriculum?

Woolliscroft: Essentially, most medical schools haven’t changed their department structure to reflect the changes in the sciences underlying the practice of medicine. We have this structure, this legacy that leads to huge inertia because there’s all sorts of things that are tied into that, primarily budgetary and personnel.

The other real problem is making decisions about curriculum, because you can’t have this as an appendage. The students will not see it as being valuable. It has to be integrated and that requires faculty to really change a lot of fundamental things that they’ve been doing.

What are some of the first steps to take on those barriers, and who needs to be leading the charge?

Ötleş: We need to have it be led by physicians. It’s probably going to be academic medical centers — places that have colleagues in colleges of engineering, departments of computer science, schools of information, or learning health systems departments. You’re going to have resources that can coalesce together quickly — and that’s what we need; we need that speed right now.

And what about medical schools that don’t have those kinds of resources?

Ötleş: We’re trying to push this as a conversation. We think that having a scaffolding — focusing on this foundational knowledge and then coming back to it continually — is an important way to lay these things out. People may think that we’re totally off base, but hopefully they agree that it’s important that we move, and that we move quickly, in this area. As a part of that, hopefully we all think about how we can share resources. When we build a curriculum, we share it; when we build a tool, we share it, so that we don’t waste time recreating and we can just get to teaching and learning.

Woolliscroft: It’ll happen. I’m just anxious that it happens more quickly. This is no different than other technological innovations through the decades, even centuries: As new technology is introduced, often from outside of biology, it is applied to biologic problems, a discipline emanates, departments are created. I just think it’s important enough that this be put on a fast track rather than allowed to just grow organically. We can’t have that happen, because patients will die.

This story is part of a series examining the use of artificial intelligence in health care and practices for exchanging and analyzing patient data. It is supported with funding from the Gordon and Betty Moore Foundation.
