We live in a time of rapidly developing artificial intelligence (AI), when technology is increasingly becoming the first point of contact between patients and healthcare. Chatbots, applications and clinical decision support systems offer speed, accessibility and efficiency. Yet amid this technological rush, a question arises: are we not losing something fundamental? Is it not empathy, intimacy and human presence that give healthcare its real value?
The healthcare system faces a key challenge: integrating technology with the human dimension of treatment. While AI offers opportunities to increase medical efficiency and precision, patient-physician relationships based on empathy and intimacy remain the foundation of quality healthcare. Intimacy, both physical and emotional, is inherent in these relationships, especially in crisis situations such as a diagnosis of terminal illness [1].
This deep presence, mindfulness, trust and shared experience of health, regardless of channel (offline or online), is called Human Intimacy [3][5]. Research shows that when a doctor or other healthcare provider shows this kind of empathy, patients feel safer and are more willing to cooperate [2], report lower levels of anxiety and are more likely to follow therapeutic recommendations [4]. It builds trust and allows care to be better tailored to the individual needs of the patient [1].
In addition, relationships based on trust and mutual listening improve the quality of diagnosis. In such an environment, patients are more likely to share symptoms they might have withheld in a purely "technical" conversation [6]. In practice, this simply means that a doctor who is present, in an emotional and communicative sense, selects treatment more accurately, experiences less professional burnout and builds long-term therapeutic relationships that have real clinical effects.
Interestingly, intimacy in a medical relationship also has a biological basis. Trusting contact triggers the release of oxytocin, a hormone of bonding and security that lowers cortisol levels, reduces stress and strengthens the immune system [7]. This is more than biology: it is evidence that the relationship itself heals, much like drug therapy.
It should also be borne in mind that the relationship does not end the moment the office door closes. For many patients, it is only after the diagnosis that the most difficult questions, emotions and doubts begin. Here technology can prove to be an ally again: not as a replacement, but as a way of prolonging the doctor's presence, for example through a medical communicator.
AI-powered technologies, such as Google DeepMind in radiological image analysis and IBM Watson Health in oncology, are already widely used in healthcare systems. They process data faster than humans, offering benefits such as scalability, 24/7 availability, cost reduction and support for overburdened health systems.
Despite these indisputable advantages, a key ethical question arises: is it responsible to entrust AI with clinical decisions that involve not only educating the patient or referring them to the right specialist, but also guiding them along the intricate path of treatment?
This issue concerns not only technical aspects but also human presence in healthcare. While AI is a powerful tool, it still requires a human to responsibly assume the role of guide and decision-maker on difficult health issues.
This article is not about competition, but about partnership. AI and the doctor can coexist in a model where technology frees up the doctor's time, eliminates repetitive administrative tasks and supports faster and more precise analysis of patient data, so that the doctor can be more present, here and now, for the patient.
EXAMPLE
In Japan, care robots are being developed to help seniors by reminding them about medication or helping them contact family, but no one is trying to use them to replace the emotional role of a caregiver. The machine takes care of the rhythm of the day; the human takes care of dignity and emotional comfort [11].
Poland is not lagging behind. Both the public and private sectors are experimenting with the use of AI in triage, diagnostics and patient communication—and they are doing so with a growing emphasis on understandability and empathy.
Interestingly, there are already tools designed to support building a relationship with the patient based on trust, empathy and care, tools that allow the doctor to continue that relationship even outside the office. In Poland one such tool is the medical communicator ScholarlyR.one, which enables asynchronous contact. It is not based on artificial intelligence (yet), but it is a technology through which the patient can ask a question after the appointment and dispel doubts, and the doctor can answer when they have the space to do so. Such solutions support not only the organization of work but also the patient's sense of security: someone here really listens and remembers.
It is this model, the human in the foreground and technology in the background, that seems the most promising. AI can support the therapeutic relationship, but it cannot lead it.
According to a Capgemini report, empathy and interpersonal skills will be key competencies for healthcare leaders in the coming decades [12]. AI can therefore take over repetitive administrative tasks such as searching medical records, coding ICD procedures or analyzing statistical data. Solutions such as Sully AI (automatic creation of visit notes), Nabla Co-pilot (support in documenting conversations with patients) or Abridge (summaries of clinical conversations) are already relieving doctors of this burden, allowing them to focus on what really matters: contact with the patient. However, it is the doctor, with their gaze, touch, attentive listening and ability to remain empathically silent at the right moment, who remains the pillar of the therapeutic relationship.
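To make this division of labor concrete, here is a minimal, purely illustrative Python sketch of an "AI scribe" workflow. It is not the API of Sully AI, Nabla Co-pilot or Abridge; the keyword-based `draft_visit_note` function is a hypothetical stand-in for a real clinical summarization model. The point it illustrates is the last step: the automated draft is only a proposal until the doctor reviews and signs it.

```python
from dataclasses import dataclass, field

# Toy stand-in for a real AI scribe model: it merely groups transcript lines
# by simple keywords. A production system would use a clinical language model.
SECTIONS = {
    "Symptoms": ("pain", "weight loss", "nausea"),
    "Plan": ("colonoscopy", "referral", "follow-up"),
}

@dataclass
class VisitNote:
    sections: dict = field(default_factory=dict)
    signed_by: str | None = None  # stays None until a doctor reviews the draft

def draft_visit_note(transcript: list[str]) -> VisitNote:
    """Produce a draft note from a consultation transcript (AI in the background)."""
    note = VisitNote()
    for line in transcript:
        for section, keywords in SECTIONS.items():
            if any(keyword in line.lower() for keyword in keywords):
                note.sections.setdefault(section, []).append(line)
    return note

def sign_note(note: VisitNote, doctor: str, approved: bool) -> VisitNote:
    """The doctor stays in the foreground: nothing enters the record unreviewed."""
    if approved:
        note.signed_by = doctor
    return note

if __name__ == "__main__":
    transcript = [
        "Patient reports abdominal pain and weight loss for several weeks.",
        "We agreed to schedule a colonoscopy and a follow-up call.",
    ]
    draft = draft_visit_note(transcript)                      # AI drafts
    final = sign_note(draft, "Dr. Kowalska", approved=True)   # human decides
    print(final)
```

The design choice worth noting is that the automated draft never becomes part of the record on its own; the clinician's review remains the gate.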
Imagine two almost identical situations: the same patient, the same diagnosis, but a completely different course of the visit.
The patient, Mr. Andrzej, 58, had been complaining of abdominal pain and weight loss for several weeks. He took advantage of an online consultation at a modern clinic, where the first line of contact was an automated AI system.
After filling out the symptom form in the app, within a few minutes he received a message:
Based on the information provided, we suggest doing a colonoscopy to rule out colon cancer. Please schedule an examination through our system. Recommendations have been sent to your email.
No contact with the doctor. No explanation. No question about his fears, emotional state or family situation.
Mr. Andrzej, reading the short message on his phone screen, felt the ground slip from under his feet. Alone, without support and without the opportunity to ask questions right away, he was left with information about a potentially fatal disease.
He did not call right away to schedule the colonoscopy. Instead, he stopped sleeping, avoided the topic and was afraid, not knowing where to start. No one asked him whether he understood what was happening or whether he needed support.
In this scenario, AI took over the main role in communication, and that is why the patient was left without help and without a sense of security.
Not because the doctor said the wrong thing, but because the doctor... was not there at all.
Same situation. Same patient, same abdominal pain, same AI system.
This time, however, the doctor uses the technology's recommendations, but before saying anything, turns away from the screen, looks the patient in the eye and says:
Mr. Andrzej, the symptoms you have described concern me. We need a colonoscopy to see exactly what is going on. I am here to go through this with you.
Briefly, clearly, with care. Mr. Andrzej leaves the office with anxiety, but also with the feeling that he is not alone.
And then the real care begins... On the way home, he gets the first message from the coordinator:
Mr. Andrzej, this is Katarzyna — I am available to you at every stage. Here are the instructions before the colonoscopy. If anything is unclear, please write.
In the evening, another:
Please note that we will contact you promptly after the examination. We are with you on this path.
After the examination, even before Mr. Andrzej has time to ask, he receives information on what to do next, always in the language of care and understanding. Technology works constantly in the background, analyzing data, reminding about deadlines and keeping an eye on the process. But it is the people, the doctor and the coordinator, who build the bridge of trust and presence. This is what intimacy in action looks like, something we should not replace with any interface, even the most advanced one.
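To show how this "technology in the background, people in the foreground" pattern can be wired, here is a minimal, hypothetical Python sketch of a care-pathway message queue. The message texts, timings and the human sign-off flag are illustrative assumptions, not a description of any particular clinic's system; the automation only prepares and schedules messages, while a named coordinator remains the visible sender.

```python
from datetime import datetime, timedelta

# Illustrative care-pathway templates; real content would come from the clinic
# and always be reviewed by the care coordinator before sending.
PATHWAY = [
    (timedelta(hours=1), "Here are the instructions before the colonoscopy."),
    (timedelta(hours=8), "We will contact you promptly after the examination."),
    (timedelta(days=1),  "Your doctor will discuss the results with you."),
]

def schedule_messages(visit_time: datetime, coordinator: str) -> list[dict]:
    """Prepare follow-up messages; the coordinator stays the visible sender."""
    return [
        {
            "send_at": visit_time + offset,
            "from": coordinator,   # a person, not "the system"
            "text": text,
            "approved": False,     # nothing goes out without human sign-off
        }
        for offset, text in PATHWAY
    ]

if __name__ == "__main__":
    queue = schedule_messages(datetime(2025, 5, 12, 10, 0), coordinator="Katarzyna")
    for message in queue:
        print(message["send_at"], message["from"], message["text"])
```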
Intimacy in medicine is not just a sentimental value; it has specific clinical and communicative value. The modern doctor does not face a choice between AI and HI; their role is to connect the two worlds. AI can search documentation, analyze data and recognize patterns faster than a human, but it is the human who gives those capabilities meaning in the relationship with the patient.
And what next?
Over the next decade, artificial intelligence will be able to diagnose many diseases on its own: faster, more accurately and often more effectively than humans. But it is not AI that will decide how and when the patient learns the diagnosis. It is the doctor, equipped with both data and empathy, who will remain responsible for the way it is communicated, for the care and for the relationship.
What should an ideal model look like, one in which technology supports but does not displace the human?
AI runs in the background, analyzing, prompting, speeding things up. On the front line stands a human being who looks, listens and understands: a doctor who gains more time and space to really be with the patient, because technology has taken over the bureaucracy and repetitive tasks.
It is not a choice between AI and humans. It is an opportunity: AI for the human.