At its core, this is not simply a technology debate. It is a patient experience conversation. Patients do not measure care in algorithms, efficiency dashboards, or predictive scores. They measure it in eye contact. In pauses. Whether they feel heard. Whether the nurse sits instead of standing. Whether the clinician turns toward them or toward a screen.
Nursing has always been the discipline that integrates data with human presence. We assess, we interpret, we document — but we also notice the tremor in a voice, the hesitation before answering, the spouse who has not spoken. If artificial intelligence disrupts that balance, nursing leadership must be the force that restores it. Patient experience is not preserved by efficiency alone. It is preserved by intentional presence.
Recently, I listened to a podcast discussing artificial intelligence and its growing impact on healthcare. The discussion centered largely on medical practitioners. Yet as I listened, I found myself thinking more broadly. What does AI mean not just for medicine, but for healthcare as a whole — and for those of us responsible for shaping patient experience across the continuum of care?
As a registered nurse who began practice in the 1970s, I have witnessed healthcare evolve from relationship-centered bedside care to technology-mediated delivery models. Artificial intelligence represents the newest chapter in that evolution. While AI promises efficiency and clinical support, it also raises a critical question: are we strengthening professional judgment and human connection, or quietly eroding the very skills that define healthcare as a relational profession?
Artificial intelligence now assists with diagnosis, imaging interpretation, documentation, risk prediction, and real-time decision support. Its promise is improved efficiency, greater accuracy, and reduced administrative burden. But alongside that promise is a quieter concern — the potential for deskilling.
Deskilling refers to the gradual erosion of professional judgment and perceptual expertise when clinicians rely heavily on automated systems rather than actively exercising independent reasoning. It is rarely dramatic. It is incremental. And over time, it can subtly reshape professional identity (Natali et al., 2025).
When I was a young nurse, we had no electronic health records, no predictive analytics, and certainly no AI tools. What we had were hands, presence, and time. Care was mediated through relationships, not a screen. Over the decades, documentation demands increased. Productivity metrics intensified. Screens entered the room. Now AI is entering alongside them.
Research examining AI exposure in procedural medicine suggests that prolonged reliance on AI support may influence unassisted performance when that support is removed (Budzyń et al., 2025). The concern is not that clinicians become incapable, but that opportunities to continuously exercise independent judgment may diminish over time.
Documentation burden has also reshaped clinical presence. Studies of electronic health record use demonstrate that clinicians spend substantial time interacting with digital systems, often dividing attention between the patient and the screen (Arndt et al., 2017; Tai-Seale et al., 2019). The unintended consequence can be fragmented communication and reduced relational presence.
I remember carrying folded scraps of paper in my pocket as a young nurse, jotting notes so I could chart later. It was inefficient — but it never required me to turn my back on a patient. Years later, as a hospital CEO walking into a newly built facility, I saw patient rooms designed for documentation efficiency — computers positioned in ways that required clinicians to face the wall rather than the patient. The architecture reflected the system’s priorities.
The question is not whether technology belongs in healthcare. It does. The question is whether we design and deploy it in ways that protect clinical reasoning and preserve human connection.
If AI replaces thinking, we risk cognitive deskilling. If AI replaces presence, we risk relational erosion. But if AI is implemented thoughtfully — reducing unnecessary clerical burden while preserving accountability and independent judgment — it can enhance rather than diminish the patient experience.
For nursing leaders and patient experience executives, AI adoption must be treated as a cultural decision, not merely a technological one. The responsibility is not to resist innovation, but to shape it intentionally — ensuring that human dignity, professional judgment, and relational presence remain at the center of care.
The bigger risk is not that clinicians will forget how to diagnose. It is that we may forget how to connect. That would not simply be the deskilling of healthcare — it would be the quiet erosion of humanity within it.
References
Arndt, B. G., Beasley, J. W., Watkinson, M. D., et al. (2017). Tethered to the EHR: Primary care physician workload assessment using EHR event log data and time-motion observations. Annals of Family Medicine, 15(5), 419–426. https://doi.org/10.1370/afm.2121
Budzyń, K., Romańczyk, M., & Mori, Y. (2025). Endoscopist deskilling risk after exposure to artificial intelligence in colonoscopy: A multicentre, observational study. The Lancet Gastroenterology & Hepatology. https://doi.org/10.1016/S2468-1253(25)00133-5
Natali, C., et al. (2025). AI-induced deskilling in medicine: A mixed-method review. Artificial Intelligence Review. https://doi.org/10.1007/s10462-025-11352-1
Tai-Seale, M., Olson, C. W., Li, J., et al. (2019). Electronic health record logs indicate that physicians split time evenly between seeing patients and desktop medicine. Health Affairs, 38(4), 655–662. https://doi.org/10.1377/hlthaff.2018.05125