The technological divide: Navigating the depersonalizing impact of technology on the patient-physician connection

May 16, 2025
By Dr. Mark F. Sullivan

The digital revolution has radically transformed healthcare, bringing sophisticated tools like electronic health records (EHRs), artificial intelligence (AI), and machine learning into the hands of physicians. These innovations allow doctors to access information with unprecedented speed and accuracy, improving diagnoses, streamlining workflows, and enhancing treatment outcomes. However, as these technologies become increasingly woven into the fabric of clinical care, they also risk distancing physicians from the very human element of medicine—the personal connection between doctor and patient.

The “iPatient,” a term popularized by Dr. Abraham Verghese, is the virtual version of the patient: the charts, labs, scans, and templated notes that live inside the EHR. In modern clinical practice, it is often this version that receives most of the physician's attention. While these tools are meant to support care, they have introduced an overwhelming volume of digital tasks that compete with face-to-face interaction.

Instead of prioritizing personal interaction, many physicians find themselves tethered to screens, inputting data and navigating complex systems that demand ever more attention. Auto-populated clinical notes, multiple email accounts, and prior authorization portals contribute to an overwhelming flood of digital distractions. Patient portals have blurred the boundaries of care, creating a 24/7 stream of messages that physicians are expected to answer, often outside clinic hours. Task inboxes fill with alerts and results, and navigating fragmented billing and prior authorization software can take longer than the clinical decisions themselves. The human experience, the story of the patient, risks being buried under layers of bureaucracy.

The effect of this shift is profound. Non-verbal cues—important indicators of a patient’s emotional or physical state—are missed. The opportunity for meaningful dialogue is lost as physicians multitask between data entry and diagnosis. What was once a dialogue between two people can become an exchange between one person and a machine, a situation that increasingly threatens to depersonalize the healing process.

A prominent cardiologist recently described to me the future of medical decision-making as something akin to watching a football game through an ESPN app. In this vision, as machine learning and big data continue to evolve, clinical care will be guided much like an algorithm tracking a game’s progress. “You won’t need to think about the decision anymore,” the cardiologist explained. “Just like the app tells you a team’s chance of winning based on their position on the field and time on the clock, your clinical dashboard will tell you what treatment the patient should receive—no questions asked.”

In some ways, this future is already taking shape. Risk calculators, AI-based decision aids, and predictive algorithms are increasingly embedded in clinical workflows. While these tools can improve consistency, reduce bias, and support evidence-based care, they also raise a critical question: where does human judgment fit into this equation? And more importantly, where does the patient’s voice fit in?

If care becomes overly algorithmic—driven by probabilities, patterns, and clinical pathways—there’s a danger that individuality will be lost. A patient becomes not a person with a unique history and set of values, but a data point on a probability curve. The risk is not just clinical oversimplification, but emotional detachment.

To combat this, healthcare systems and medical educators must emphasize the importance of presence, listening, and empathy, even in the most tech-saturated environments. Digital tools must be designed with human connection in mind: streamlined interfaces, integrated systems, and shared-screen models that keep patients involved. Medical training must evolve to teach not just how to use technology, but how to maintain compassion despite it. A major emphasis of this education should be reminding students that healthcare professionals serve a human being, not a computer, who comes to us with personal needs and, often, in distress. This personal and professional formation involves building the character traits, dispositions, and ethical framework needed to enter into a relationship with a human patient, something a machine, as a non-living entity, cannot possess.

Interestingly, a good friend of mine, an air traffic controller at Miami International Airport, shared an analogy that sheds light on the broader implications of technology in decision-making. He works in an environment where technology-assisted systems direct much of the flow of air traffic, especially under standard conditions. These systems use algorithms that calculate optimal routes, track aircraft, and minimize delays. Most of the time, the technology is highly efficient, managing thousands of flights in a way that would overwhelm any human working alone.

However, these systems struggle when complex, unpredictable events occur—like a sudden, severe thunderstorm. When weather patterns change unexpectedly or a plane experiences an in-flight emergency, the situation requires more than just predictive algorithms. It demands human judgment, experience, and a nuanced understanding of both the systems in play and the lives at stake.

“AI is only as good as its programmer,” he told me. “It can handle routine situations well, but it’s not prepared for the gray areas. When things get complex, the human touch is irreplaceable.”

His argument holds deep relevance for medicine, especially as AI and machine learning begin to take on larger roles in clinical decision-making. Like the air traffic control system, many medical algorithms excel at processing routine cases based on existing data—suggesting treatment plans, identifying risks, and predicting outcomes. However, when dealing with unique, complex patients or unexpected developments, human judgment becomes critical. And just as importantly, so does human empathy.

What makes my friend invaluable in those high-stakes moments at the control tower is not just his ability to read data, but his connection to the more than 200 passengers aboard a plane in distress. He feels a deep sense of responsibility, not just for the logistics of landing a plane, but for the lives of those on board. This personal responsibility, this human element of care, is what makes his judgment essential. In the same way, a physician’s role is not simply to make a diagnosis based on algorithms but to understand the patient as a whole person: their joys, hopes, fears, anxieties, and unique circumstances. No technological system, no matter how advanced, can replace the value of that human connection.

In healthcare, as in air traffic control, technology has a place in improving efficiency and reducing human error. However, when the stakes are high and the complexities of a patient’s case exceed the predictable or algorithmic, the experience, empathy, and judgment of a human being remain irreplaceable. A physician’s ability to connect with their patient—to truly see and understand them—cannot be replicated by a machine. It is this personal responsibility, this intimate knowledge of the patient’s story and context, that empowers doctors to make decisions that go beyond the cold logic of data and into the realm of compassionate, patient-centered care.

Ultimately, as technology continues to evolve, the challenge will be to ensure that it complements, rather than replaces, the human connection that lies at the heart of healing. The future of medicine may include predictive algorithms and AI-driven decision-making, but it should always retain space for the physician’s humanity—an essential ingredient that no machine can replicate. Technology should serve the doctor, not overshadow them; most importantly, it should never replace the personal relationship between a physician and their patient. In the end, that human touch remains what makes medicine truly transformative, and human lives depend on it.

About the author: Mark F. Sullivan, MD is an Internal Medicine physician at Northern Virginia Family Practice.