Because it misses details of the patient's condition and gives wrong diagnoses and conflicting prescriptions

Doctors warn as patients turn to ChatGPT for diagnosis and treatment

  • Artificial intelligence cannot grasp the psychological dimension of the patient, which is one of the foundations of treatment.

Rapid progress in artificial intelligence has prompted users to seek radical solutions from it to many of life's problems, including disease and epidemics. Some have turned to ChatGPT to diagnose and treat illnesses instead of consulting doctors, while specialists warn against this step, because artificial intelligence lacks many details of the patient's condition and may therefore give wrong diagnoses and conflicting prescriptions that could threaten the patient's life.

Users of ChatGPT told «Emirates Today» that they use it to diagnose illnesses they develop by asking about the symptoms they feel and then about suggested medications, stressing that they live in fear and confusion over taking medicines the program may have recommended wrongly or at random.

Doctors stressed the need to rely on a specialist physician to diagnose and treat disease, especially emergency, dangerous, and chronic conditions, where the patient undergoes a clinical examination, the diagnosis is tied to precise medical tests and analyses, and medicines are then prescribed to suit the patient's condition without conflicting with other medications.

In detail, ChatGPT users Bassem Abdulaziz, Ihab Mohamed, Ahmed Abdel Moaty, and Mohamed Gaber said they had turned to the technology in the course of treatment by asking about their symptoms and then about the best drugs for them, and were surprised by the huge amount of information it provided. Relying on it, however, remains a source of concern and distrust, and they asked doctors and specialists to clarify whether it can be adopted for diagnosing and treating diseases.

Professor Humaid bin Harmal Al Shamsi, consultant oncologist at the University of Sharjah, said that despite the usefulness of ChatGPT, and of artificial intelligence techniques in general, in providing information and data very quickly, they cannot be relied on to diagnose and treat disease: they cannot connect the clinical findings a doctor uncovers himself to the diagnosis and treatment of the disease, and they are unable to grasp the psychological dimension of the patient, which is one of the foundations of treatment.

Al Shamsi stressed that many diseases are intertwined, and detecting and diagnosing them requires many precise scientific tests, results that cannot be obtained simply by putting a question to an artificial intelligence.

He continued: «One of the most prominent risks of relying on artificial intelligence in diagnosis and treatment is that it may give a wrong diagnosis, and then a treatment that threatens the patient's health and worsens his condition instead of helping to cure it».

He pointed out that, at the same time, the benefits of artificial intelligence in advancing diagnosis and treatment cannot be denied: it has greatly helped doctors develop treatment programs and has enhanced the ability of medical devices and tools to deliver far more accurate and faster results.

In turn, consultant in family medicine and occupational health Dr. Mansour Anwar Habib said that ChatGPT may help by providing a huge amount of information about various diseases, and may easily suggest medications, but the patient must still see a specialist doctor to be reassured about the course of treatment, noting that artificial intelligence lacks the patient's medical history, so its diagnosis and treatment lack accuracy.

He added: «A patient using (ChatGPT) who suffers from other diseases needs special prescriptions tailored to his condition, something this technology cannot be relied on for, since it offers solutions without knowing whether they suit the patient's case».

He stated that ChatGPT can be relied on for minor ailments, such as colds, fever, and other non-dangerous conditions, but not for the treatment of chronic and dangerous diseases.

For his part, Dr. Ahmed Al-Masaeed, a consultant interventional cardiologist in Dubai, warned against adopting ChatGPT in the treatment process in general, noting that these technologies are still under development and that their information and capabilities remain very limited, since getting an appropriate answer requires posing the question accurately and correctly, a safeguard that artificial intelligence techniques lack.

He added: «Medicine depends to a large extent on the patient's own account of the details of his condition, his medical history, and the clinical examination, in order to prescribe the appropriate medicine after the disease has been confirmed and accurately diagnosed».

He pointed out that the technology can be adopted as a reference source for doctors and specialists, while the treatment process itself remains the doctor's alone.

He stressed that no medication should be taken without a prescription from the treating doctor, as artificial intelligence may suggest drugs that conflict with the patient's condition, threatening his life and worsening his health instead of treating him.

Cybersecurity expert Abdel Nour Sami stated that before using ChatGPT to diagnose and treat disease, it is necessary to understand how the technology works. It is based on machine learning and deep learning, and its strength, as a language-processing tool, is communication that comes closest to human thinking and speech. All of its responses rest on two things: first, the amount of data it was trained on, and second, the model in effect. Each model lends itself to different uses, some better for human interaction, others for proofreading, so its use cannot be confined to a single category.

He continued: «Suppose that (ChatGPT) was trained to take the position of a doctor or a diagnostician and to interact on that basis, through training on a huge store of medical data. Here comes the importance of the second thing: the language of discourse, the way we communicate with the tool, the amount of information we give it, and the method of explanation. The programmed language of discourse is devoid of any understanding of human emotion; the responses are built on the logic of what the tool knows and what the person reveals while talking to it».
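To illustrate the two factors the expert describes, the sketch below shows how such a question typically reaches the model: a chosen model plus role-tagged messages, with the answer generated from training data alone. This is a hypothetical example, assuming the publicly documented OpenAI Python SDK; the model name and prompts are illustrative only.

    from openai import OpenAI

    # The client reads the OPENAI_API_KEY environment variable by default.
    client = OpenAI()

    # Two things shape the answer, as the expert notes: which model is
    # selected, and how the question is worded and framed.
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system",
             "content": "Answer as a general medical information assistant."},
            {"role": "user",
             "content": "I have had a fever and a sore throat for two days. "
                        "What could this be?"},
        ],
    )

    # The reply draws only on the model's training data: no clinical
    # examination, medical history, or laboratory results are involved.
    print(response.choices[0].message.content)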

He stated that the problems surrounding the medical use of ChatGPT so far are led by privacy violations: all the data it receives is collected, and the confidentiality of that data may be exposed and linked to the person who asked the question, whereas the medical profession requires maintaining privacy and confidentiality to the highest standards of information security and professional integrity. Medical transactions also pass through administrative procedures through which the patient comes to understand what he has and what he must do, while communicating with ChatGPT is stripped of all of these things. On the surface it is no different from searching on Google in general; the difference is that ChatGPT gives specific information based on the course of the conversation, whereas when searching on Google a person searches, investigates, and exercises his own judgment, and sometimes his heart.

Abdel Nour pointed out that ChatGPT is updated only from time to time, so it cannot help diagnose things that happen in real time, seasonal matters, or new developments, and that valuable medical sources may be held by major institutions beyond ChatGPT's reach. The doctor, by contrast, performs many tests that help him understand the case precisely, whether history-taking, physical examination, radiology, blood tests, and so on, all of which play a role in accurate diagnosis. Some people underestimate these tests, yet the consequences of a misdiagnosis intimidate any doctor, whereas ChatGPT bears no responsibility at all: it relays answers based on its search algorithm, which may not be correct, so individuals should not use the technology on their own.

He stressed that ChatGPT passed the medical exam with a high score, but that success was limited to questions that were clear and theoretical in their phrasing. In practice, the responsibility for diagnosis, analysis, and discovery lies with the doctor and his auxiliary tools, which enable him to reach the diagnosis; the data does not arrive ready-made, as it did in the ChatGPT experiment. In the future, ChatGPT could be regulated to serve medical staff once it is integrated with diagnostic devices. For it to succeed, he believes it must be linked to the patient's history, his family record, measurements of his surrounding environment and exposure factors, and other useful information that would bring the result closer to accuracy; relying on it without a link to live data will not yield a very accurate result, one that may be right at times and wrong at others.

• Doctors: Artificial intelligence cannot link clinical information with disease diagnosis and treatment.

• Users asked specialists to clarify whether ChatGPT can be adopted for diagnosis and treatment.

Accurate reading of radiology examinations

Cybersecurity expert Abdel Nour Sami pointed out that the technology is useful in the hands of specialists: ChatGPT can help with the accurate reading of X-rays, CT scans, and MRI images, drawing attention to fine details the naked eye may miss, whether through poor visibility, the sheer fineness of the detail, or simple oversight, and here it is useful in processing and reading them.