
New technologies and pain management

We sat down with Dr Amaury Salavert, an authority in pediatric pain management, for a fascinating conversation on pain, ranging from assessment to treatments and the contributions complementary and alternative medicine can make.

This is part 6 of 6, where we discuss new technologies to support pain management. See part 1 for an overview of the various types of pain and how to address them. See part 2 for an overview of the various treatments available. See part 3 for an overview of benefits and risks. See part 4 for an overview of integrative treatments and new trends.


What specific decision support tools do you currently use in your pediatric practice?

In my pediatric practice, I rely on two main decision support tools: my gut and UpToDate. My gut is informed by the experience of seeing over 800 children in pain consultations. UpToDate compiles the latest recommendations from medical and scientific societies, which helps me stay current on best practices for patient care and treatment.

Have you ever tried to Google symptoms?

Sometimes patients tell me about a disease I'm not familiar with, and when I look it up, I find that it's completely made up. Recently there has been a trend of people claiming a new disease exists that makes babies cry and lose sleep. In my experience, every baby cries and most sleep poorly. That is not a disease.

In which area would you like to see these tools evolve?

In my daily practice, I would benefit from tools that help me to monitor patient adherence to treatments, as well as to conduct larger scale studies on non-pharmaceutical treatments. This could involve follow-up questionnaires to assess whether patients are taking their treatments as prescribed. There is a lack of information and support when it comes to non-pharmaceutical treatments.

Currently, there is a lot of discussion in the world of startups about the use of artificial intelligence (AI). Do you see AI playing an important role in your practice?

In medical practice, AI is already used in some specific areas, such as radiography, to help detect burns and other injuries. However, it is not yet reliable enough for pediatric populations. AI is also being used to suggest pharmaceutical treatments, but it is completely absent for other treatments. I believe AI will be used more and more in the future, but I really think of it as an autopilot, with providers still in charge of setting the course.

Let's say you had an AI-powered tool that suggested prescriptions. Would you follow them?

When considering the use of artificial intelligence to suggest prescriptions, the question is really how much I trust the technology. In the next few years, its accuracy will likely improve, and at that point I may be more inclined to trust it. For now, though, my experience in radiology keeps me mindful of the particularities of pediatric populations: half of the time, the AI is still wrong in its predictions.

Do you see any ethical issues in using the technologies we've been mentioning?

Of course, I see quite a few ethical issues with technologies such as decision support or artificial intelligence. For example, when it comes to prescribing medication, is the tool being influenced by pharmaceutical companies to promote their own products? Alternatively, when a treatment is effective but costly, should it be prioritized over other needs, such as providing food and shelter for the family? In both cases, a human must make the final decision. Privacy is an important ethical issue to consider too.

How about responsibility?

Yes, that's another interesting question when it comes to artificial intelligence. As a medical professional, I am responsible for the treatments I prescribe, even when I warn patients of potential risks. That is a responsibility I must take on as a doctor, a human, and part of the human process. Artificial intelligence can't take on this kind of responsibility. As the number of physicians and nurses decreases and the rate of burnout increases, artificial intelligence may be seen as a quick fix. But who is to be held accountable if something goes wrong? It will be interesting to see how this issue is addressed in the future.

What are the steps that you take to ensure safety whenever there's artificial intelligence in the equation?

I always double-check the information the AI provides; I remain very critical of it. When you have access to an abundance of information, it's always good to bring an abundance of skepticism.

A fitting ending to our discussion. Thank you, Dr Salavert.

Thank you for having me.
