The "fail fast and iterate" doctrine falters in an environment where patient lives are at stake. The approach rests on two preconditions. First, it requires a space where failure is not catastrophic. Second, it requires an audience willing to tolerate imperfect prototypes en route to a finished product. Both are currently lacking in medicine. Despite this, the authors conclude that generative artificial intelligence (genAI) will likely be adopted faster than prior technologies in healthcare.1 We support this conclusion and offer two additional points. First, medical education is a low-risk environment in which to build genAI tools. Second, teaching physicians how to use genAI can cultivate a future generation of physician builders.
We are already seeing attempts to automate physician tasks with genAI in areas such as informed consent, operative note dictation, and patient education.2-4 However, the road to deploying these innovations runs through significant red tape and a small market of electronic health record (EHR) vendors, as the authors note.1 Medical education, by contrast, is a core component of academic medicine yet one step removed from patient care. The regulatory barriers to deployment are fewer, and the available platforms are plentiful. For example, most trainees use one EHR but multiple educational technologies.
Before genAI can be deployed within healthcare, physicians who are already burnt out and risk averse will need to iteratively evaluate these imperfect tools without losing enthusiasm. Perhaps this necessary culture change can be accomplished by transitioning physicians from passive consumers of genAI to builders. Historically, EHRs were largely built by engineers who do not practice medicine. Even after go-live, authority over institution-specific modifications is limited to small physician committees. In current AI research, access to cutting-edge technology is restricted to technical experts and physicians with large data sets. In contrast, state-of-the-art genAI tools are available to the public. As the authors allude, everyone is a potential builder. To capitalize on this, there is a pressing need to equip physicians with genAI competency, especially in areas such as prompt engineering.
In a previous JAMA article, Berwick proposed rules for disseminating innovation. Among them were "find and support innovators," "invest in early adopters," and "create slack for change."5 Education in genAI can expand the current pool of physician innovators and early adopters. Low-risk environments can provide the necessary slack for change. Together, these developments can catalyze the adoption of genAI in healthcare.
2. Decker H, Trang K, Ramirez J, et al. Large Language Model-Based Chatbot vs Surgeon-Generated Informed Consent Documentation for Common Procedures. JAMA Netw Open. 2023;6(10):e2336997. doi:10.1001/jamanetworkopen.2023.36997
4. Emile SH, Horesh N, Freund M, et al. How Appropriate Are Answers of Online Chat-Based Artificial Intelligence (ChatGPT) to Common Questions on Colon Cancer? Surgery. 2023;174(5):1273-1275. doi:10.1016/j.surg.2023.06.005