# Radicalization, expertise, and skepticism among doctors & engineers: the value of philosophy in education

This past Friday was a busy day for a lot of the folks in Integrated Mathematical Oncology here at the Moffitt Cancer Center. Everybody was rushing around to put the final touches on a multi-million dollar research center grant application to submit to the National Cancer Institute. Although it was not a busy time for me, I still stopped by Jacob Scott's office towards the end of the day to celebrate. Let me set the scene for you: it is a corner office down the hall from me; its many windows are scribbled over with graphs, equations, and biological interaction networks; two giant screens crowd a standing desk, and another screen is hidden in the corner; the only non-glass wall has scribbles in pencil for the carpenters: paint blackboard here. There are too many chairs — Jake is a connector, so his office is always open to guests.

A different celebration in Jake's office. The view is from his desk towards the wall that needs to be replaced by a blackboard.

In addition to the scientific and administrative stress of grant-writing, Jake was also covering for his friend as the doc-of-the-day for radiation oncology. So as I rambled on: "If we consider nodes of degree three or higher in this model, we would break up contiguous blocks of mutants and the domain of our probability distribution would go from $n^2$ to $2^n$", scribbling more math on his wall, we kept getting interrupted by phone calls. His resident called to tell him that the neurosurgeons had scheduled a consultation for an acute myeloid leukemia patient who was recovering from surgery earlier that day.

“Only on a Friday afternoon do you get this kind of consult!” Jake fires off, “He’s still in surgery! We can’t do anything for at least a few days – schedule him for Monday.”

The call was on speakerphone, but I could not keep up with the conversation. After years of training and experience, this was an effortless context-shift for Jake. He went from the heavy skepticism of a scientist staring at a blackboard to the certainty of a doctor who needed to get shit done, and back, in moments. I couldn't imagine having this sort of confidence in my judgements, mostly because I have no training in medicine, but also because I am not expected to be certain. That is why I lean towards using abductive models over insilications for clinical research; I have more confidence in machine learning than in my own physical and biological intuitions about cancer. Even if that approach might produce less understanding.

In recent weeks, I've noticed a theme in some of the (news and blog) articles I've been reading. In this post, I want to provide an annotated collection of some of these links, along with my reflections on what they say about the tension between expertise and skepticism, how that tension can radicalize us — both in mundane ways and in drastic ones — and what role philosophy can play in helping us cope. I will end up touching on recent events and politics as source context, but hopefully we can keep the overall conversation more or less detached from current events.

Ben Carson: A case study on why intelligent people are often not skeptics
by David Gorski at Respectful Insolence

David Gorski is a surgeon specializing in breast cancer, but on the internet he is more famous for his skepticism of complementary and alternative medicine, and his advocacy for relying only on evidence-based medicine. Although he has great respect for doctors, he is concerned about how their skepticism and expertise often don't extend outside of medicine. In fact, doctors might be more susceptible to bunk outside their field of expertise than others. Joerg Fliege summarized the main point of this article well:

Dunning-Kruger might be particularly prevalent in people that are told, over and over again, that they are the best of the best and the smartest of the smartest. Which is precisely what some universities do to their students.

I agree that being told that you are the best and the smartest can reinforce the Dunning-Kruger effect, but I think there is a secondary effect in that you are told that you should value your own judgement highly. Of course, these two basically go hand in hand. However, I think this latter effect is stronger in professions where we expect the practitioner to be confident in their judgements than in fields where we don't need the practitioners to put on these airs. To go back to this post's opening, Jake can be skeptical at no cost during discussions with me, but when he is talking to a patient or advising a surgeon, he needs to be confident in his decisions.

In other words, I would expect DK to be stronger in business folks and doctors than in engineers, and I would expect it to be higher in engineers than in physical scientists, and higher in physical scientists than in the social sciences or humanities. Even if we control for the levels of exclusiveness and 'cream-of-the-crop'ness in their education, and acknowledge that there is a difference between, say, an expert and a novice reader of literature.

I had a second concern, or pet-peeve, about the passing suggestion in that article that it is always better to actively seek disconfirming evidence for our beliefs. I don't agree with this universal claim. We can have good reasons for not actively seeking out disconfirming evidence for some of our beliefs in some contexts — for example, when a belief's primary 'goal' is not veracity but other features like security or consistency. One domain where such a reason can apply is the interpersonal one: with love for/from one's children or partner, actively seeking disconfirming evidence undermines the purpose of the belief. But that conversation is better saved for reddit.

The main point from Gorski’s article is how a technical education in general is no insulation against ignorance and uncritical acceptance of nonsense, and in some cases might even promote it.

This is the group that’s surprisingly prone to violent extremism
by Henry Farrell at Monkey Cage

A lack of skepticism is not limited to doctors. The surprising group in the title is engineers, and the post is based on a sociology study and book by Diego Gambetta and Steffen Hertog. The unexpected affinity between engineers and woo is well noted on the internet. As Jordan Peacock commented on my G+ share, engineers are also "prone to complex conspiracy theories … and creationism … and climate change denialism. … wait I see a pattern." The comments are, of course, tongue-in-cheek, but the data point provided by Gambetta & Hertog is serious:

among violent Islamists … individuals with an engineering education are three to four times more frequent than we would expect given the share of engineers among university students in Islamic countries.

and this is not explained by founder effects (or network effects, as the post calls them) or the need for engineering skill in terrorism. The authors attribute it to the certainty of the engineer's mindset and the lack of skilled employment for them in much of the Middle East. They include an apt quote from von Hayek:

[Engineers] react violently against the deficiencies of their education and develop a passion for imposing on society the order which they are unable to detect by the means with which they are familiar.

Of course, this doesn't mean that your engineering friend is a violent extremist. And that is where the real meat of the article is. If we don't generalize from a few radical engineers to all engineers being bad, or engineering as an idea being bad, then why do we do it when engineers are replaced by a more marginalized social group?

Teaching philosophy to children? It’s a great idea
by Michelle Sowey at The Guardian: Philosophy Opinion

So if both doctors and engineers are susceptible, or maybe even more susceptible, to radicalization on minor bunk and major issues, then does that mean there is no place for education in making us better people? Well, education in those specialties is aimed at making us a professional doctor or a professional engineer, not at helping us, in the words of Alex Pozdnyakov, "become a professional human being." For the latter, we have to turn to philosophy. And it is best to start early. The benefits can be immense:

Studying philosophy cultivates doubt without helplessness, and confidence without hubris. … By setting children on a path of philosophical enquiry early in life, we could offer them irreplaceable gifts: an awareness of life's moral, aesthetic and political dimensions; the capacity to articulate thoughts clearly and evaluate them honestly; and the confidence to exercise independent judgement and self-correction.

Philosophy saved me from poverty and drugs: that’s why I teach it to kids
by Andy West at The Guardian: Philosophy Opinion

It is easy to scoff at such proposals for teaching philosophy as bourgeois and only relevant in private schools for the children of the well-to-do. But Andy West doesn't think so. He doesn't teach philosophy to kids born with a silver spoon but to "children receiving free school meals". In this article, he shares his own story of overcoming a rough and non-academic upbringing and having his life turned around by philosophic inquiry. He closes with:

[Philosophy] allows young people to challenge authority and express themselves in a way that creates rather than destroys their life opportunities. Philosophical questions such as “Who should have power?” and “Can you be a good person if you do bad things?” are universally evocative; if we have the means to make them universally accessible, then we must do so.

If dogma and radicalism can destroy lives, can skepticism and inquiry save them?

Pigeons can identify cancerous tissue on x-rays, study finds
by Ellen Brait at The Guardian: Cancer

These anecdotes do not mean that we all have to be philosophy majors; rather, there is space in the medical and engineering (and science) curricula for philosophy. These disciplines are not just about the technical aspects of interaction with machinery and technology, science and prediction. If it were really that simple, then we might be able to replace radiation oncologists like Jake with pigeons:

It looks like the common pigeon, after a period of conditioning, can classify new x-rays and histologies with an accuracy of around 85%, which, on some of the test sets, is apparently comparable to the performance of trained radiologists and pathologists. If you arrange the fowl in groups and exploit the wisdom of crowds, then the birds can achieve a 99% success rate.
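The wisdom-of-crowds gain here is just majority voting: if each bird is right 85% of the time and the flock's errors are independent (the independence is my simplifying assumption; the 85% and 99% figures are from the study), the probability that a majority of the flock is right follows from the binomial distribution. A minimal sketch:

```python
from math import comb

def flock_accuracy(n_birds, p_correct):
    """Exact probability that a majority of n_birds independent classifiers,
    each correct with probability p_correct, votes for the right answer.
    Use an odd n_birds to avoid ties."""
    need = n_birds // 2 + 1  # votes needed for a strict majority
    return sum(
        comb(n_birds, k) * p_correct**k * (1 - p_correct)**(n_birds - k)
        for k in range(need, n_birds + 1)
    )

# A lone pigeon sits at 85%; a flock of 9 climbs past 99%.
print(round(flock_accuracy(1, 0.85), 3))  # 0.85
print(round(flock_accuracy(9, 0.85), 3))  # 0.994
```

So under the independence assumption, a modest flock is already enough to reach the reported 99% — the same trick behind ensemble methods in machine learning.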

And the pigeon is probably less likely to publicly deny evolution, provide testimonials for questionable pharmaceuticals, or deliver a commencement address at Andrews University telling the graduates that the Egyptian pyramids were built by biblical Joseph to store grain. Simply because very few would listen to a pigeon.

Gambetta, D., & Hertog, S. (2009). Why are there so many engineers among Islamic radicals? European Journal of Sociology, 50(2). DOI: 10.1017/S0003975609990129

Levenson, R.M., Krupinski, E.A., Navarro, V.M., & Wasserman, E.A. (2015). Pigeons (Columba livia) as trainable observers of pathology and radiology breast cancer images. PLoS ONE, 10(11): e0141357. DOI: 10.1371/journal.pone.0141357