Thursday, October 5, 2017

Why Smart Oncologists Do Dumb Things



SAN DIEGO — There’s no shortage of ways in which oncologists can unwittingly make bad decisions with patients, but there are recognized strategies that may help minimize risks, according to presenters here at the American Society for Radiation Oncology (ASTRO) 2017 Annual Meeting.

All physicians, including radiation oncologists, are prone to some degree of medical error, suggested Ajay Kapur, PhD, a physicist at Northwell Health System, Long Island, New York, who introduced a meeting session on decision-making, bias, and medical error.

For example, the prevalence of diagnostic error in medicine overall is estimated to be 15%, a number that comes out of autopsy studies, patient and provider surveys, and second reviews, he said.

Given the potency of radiation, medical errors made in the planning and treatment of cancers can be very costly.

Dr Kapur reviewed the well-documented story of Lisa Norris, a 15-year-old British girl given an overdose of radiation for her medulloblastoma, which required treatment to both the brain and spine, at a Glasgow, Scotland, cancer center. She died in 2006, months after completing treatment.

The spinal radiation was felt to be the most complex part of her treatment. Once the team found an error in the risk-laden spinal part of the treatment plan, they were not expecting another error — but there was one.

When calculating the radiation dose to her brain, the treatment planner used the wrong machine output factor. The error went undiscovered and resulted in the delivery of 58% more radiation to her brain than intended.

In the end, the tragic incident was carefully reviewed, and the practitioners were found to have made several cognitive errors. These include confirmation bias, in which you find what you expect to find, and search satisficing, in which you (subconsciously) stop looking once you find one thing, pointed out Suzanne Evans, MD, a radiation oncologist at Yale Cancer Center in New Haven, Connecticut, in comments to Medscape Medical News. She also spoke during the meeting session.

There are dozens of technical terms coined by academics for various problematic cognitive biases. However, Daylian Cain, PhD, from the Yale School of Management, used the word “overconfidence” as an umbrella term in his talk at the ASTRO session.

The main point from Dr Cain, who has studied decision-making for the past 10 years, was simple: Don’t be overconfident.

But putting this into practice can be tricky, he added.

“We have a lay view of overconfidence that’s just wrong,” he told an audience of about 300 people at the meeting session. It generally is considered to be akin to bluster and the result of active effort, he said.

“Overconfidence is not like that. It’s not deliberate, it’s effortless. It doesn’t take a lot of time, it’s instant. It’s not even motivated most of the time,” said Dr Cain.

He made his point by asking the audience to guess the number of rooms in the world’s biggest house. However, don’t pick a single number, he said. He wanted audience members to pick a range, that is, a low and a high number between which the actual figure falls, so as to increase the chance of being correct. (Medscape Medical News’s reporter guessed 200 to 300.)

The correct answer is 1789. But fewer than 10% of the ASTRO audience gave a range containing that number. Why? Because “overprecision” is highly common among formally educated people and persists at significant levels (up to 25% of the time) even when an issue is considered by experts in a field, said Dr Cain.

This applies to making prognoses for patients, he added. “Why are confidence intervals always too small?” asked Dr Cain. In other words, why do we tend to be overconfident? (Answer: It is difficult to pinpoint why.)

The Yale professor went on to explain that there are various strategies to avoid overconfidence —  ranging from folk wisdom (eg, “look before you leap”) to more involved strategies, such as getting educated on a topic, having incentives to be accurate, and relying on experience.

While there are multiple ways to avoid overconfidence and make a good decision, “only one of them works and the rest basically fail,” he proclaimed.

The best strategy: Consider the opposite. “Ask yourself: How could I be wrong?” he continued.

Ask yourself: How could I be wrong?
Dr Daylian Cain

 

That is, “Be the devil’s advocate,” said Dr Cain. The devil’s advocate is often misinterpreted as a naysayer but is, in fact, a careful reviewer. Dr Cain explained that the term comes from the Catholic Church and its review process before canonizing a saint: one reviewer would be designated to bring up every negative or doubtful thing about a nominee, so that those items could be crossed off the list of disqualifiers.

“Look for imperfections in your own work, that’s how you get closer to perfect,” summarized Dr Cain.

How to Reduce Bias, Errors

Medical error and poor decision-making are not only the result of problematic overconfidence (ie, biases), suggested Dr Evans.

For example, a study of radiation oncologists found that the higher a physician’s workload, the higher the likelihood of errors in radiation therapy planning, she said.

Workload is one of many “system factors” in medicine that can invite cognitive error; these also include workplace rudeness, task complexity, time pressure, and transfers from another facility, she said. There are also certain “person factors” that invite cognitive error, such as cognitive overload, fatigue, and affective bias (having feelings toward patients).

But sometimes overconfidence of some sort does seem to be at play. To illustrate, Dr Evans reviewed a range of biases that lead physicians to make ill-advised decisions.

Some of the biases are well known, such as intervention/commission bias, which inclines doctors to intervene with something (eg, drugs, diagnostic tests, procedures, or surgery) even when doing nothing is a “reasonable alternative.”

Being right feels the same as being wrong.
Dr Suzanne Evans

 

But there is a tendency to be blind to mistakes. “It’s hard to recognize when we are making a bad decision,” Dr Evans added. “Being right feels the same as being wrong.”

Cognitive “debiasing strategies” have been developed to help avoid blind spots, she said.

However, there is little proof in medicine that techniques to limit bias in decision-making actually work, admitted Dr Evans. “The data are just not there,” she said, including about Dr Cain’s “consider the opposite” strategy.

But physicians can fortify themselves against bias with sensible tips, such as eating well, sleeping enough, and avoiding cognitive overload.

Dr Evans said that using external resources is another bias-busting strategy. Review practice guidelines from ASTRO and the National Comprehensive Cancer Network, and access decision-support materials from UpToDate, Medscape, and other sources, she suggested.

To avoid being hasty or biased, Dr Evans recommended these strategies: always fully consider at least three diagnostic or treatment options, rule out the worst-case scenario, consider the opposite (Dr Cain’s method), and limit exposure to others’ opinions before you initially review the facts.

Dr Evans also believes in teamwork and the wisdom of group decision-making to avoid bias and errors: “Talk to other people about the case and really go over it.”

Dr Evans, Dr Cain, and Dr Kapur have disclosed no relevant financial relationships.

American Society for Radiation Oncology (ASTRO) 2017 Annual Meeting. Panel 2. Presented September 24, 2017.

Follow Medscape senior journalist Nick Mulcahy on Twitter: @MulcahyNick

For more from Medscape Oncology, follow us on Twitter: @MedscapeOnc



