Exensor - Infrared Scout Camera - UGS 2 - Army Technology
Source: https://www.army-technology.com/contractors/surveillance/exensor-technology/attachment/exensor-infrared-scout-camera-ugs-2/

Imagine you are reading a theoretical social science article that is dedicated to making an argument (let’s call it Argument X). You get to a section of the article called “Alternative Explanations,” which discusses Challenges A, B, and C to Argument X. At the end of this section, the author writes: “Challenge B is the superior explanation, and Argument X is therefore disconfirmed.”

Or imagine you are reading the “Robustness Checks” section of a quantitative piece and the author concludes by writing, “The majority of these robustness checks failed and therefore the main hypothesis of this article is wrong.”

These are the kinds of things you might expect to read if academia had a “Scout Mindset,” a term taken from Julia Galef’s new book [Full disclosure: I haven’t finished the book yet, so these are initial observations]. But you don’t read these kinds of passages very often. And that’s a problem.

Scouts and Soldiers

In a recent interview, Galef describes a ‘scout mindset’ as:

“…my term for the motivation to see things as they are and not as you wish they were, being or trying to be intellectually honest, objective, or fair minded, and curious about what’s actually true.”

This can be contrasted with ‘soldier mindset,’ which is what most people have:

“…a lot of the time we humans are in what I call ‘soldier mindset,’ in which our motivation is to defend our beliefs against any evidence or arguments that might threaten them. Rationalization, motivated reasoning, wishful thinking: these are all facets of what I’m calling a soldier mindset.

I adopted this term because the way that we talk about reasoning in the English language is through militaristic metaphor. We try to ‘shore up’ our beliefs, ‘support them’ and ‘buttress them’ as if they’re fortresses. We try to ‘shoot down’ opposing arguments and we try to ‘poke holes’ in the other side.”

Sound familiar? Like many things academia purports to be (e.g., a meritocracy), it is worth asking whether it really uncovers the best explanations or simply defends privileged ones. The prevalence of soldier mindset seems especially relevant for academia because a career can be made on the basis of having the “right” argument. 

Why and What to Do

Why does academia not have a scout mindset? I’d suggest a few reasons:

  1. We train academics to have soldier mindset — How many of us were really trained in graduate school to think like scouts? Don’t we tell graduate students to develop strong theories and hypotheses, then defend them against any and all alternatives? Don’t we talk about how we got “destroyed” at a conference or (god forbid) a job talk? 
  2. It’s hard to publish messy findings — Simply put, scholars don’t undermine their own arguments or flunk their own robustness checks because doing so makes publication unlikely. How many articles even discuss the limitations of the argument being pursued? It’s just better to seem confident. 
  3. We really think we’re right — And we could be! But you can’t be sure unless you seriously entertain the possibility that you aren’t. And it’s not clear how often we’re doing that.  

Assuming that you do want better reasoning, what can be done? Again, a few suggestions: 

  1. We could incentivize people to publish messy findings — Let’s all read the International Journal of Negative & Null Effects. It is arguably the most honest journal out there. Let’s recognize that messy findings or null effects are probably more realistic than “parsimonious” explanations. 
  2. We could be less critical of each other — This doesn’t mean we need to just let anything fly. Argument, discussion, and debate are all good, and I think most of us aim to get closer to the “truth.” But we could reduce the stakes of being wrong. We could try to stop destroying each other at talks. That only makes a scholar hunker down and add 50 more robustness checks to their next article. And this is especially true for graduate students, junior scholars, and those for whom being wrong is a bigger problem. 
  3. We could valorize people who admit their mistakes — The real people to emulate in this business are those who can admit they were wrong, not the ones who can’t. I once joked that “the way to succeed in academia is to construct a shockingly incorrect theory and defend it to the death for three decades” — I meant this as a joke, but now I realize it might not be one. 

Having a scout mindset is beneficial because it could promote more reasonable dialogue between academics. Imagine meeting someone working in the same area as you and — instead of feeling instantly threatened — being able to discuss your research and its limitations with open minds. This might make us all happier — and better scholars. 

Lastly, I’d be remiss if I didn’t consider the possibility that my own argument here (that academia doesn’t have a scout mindset) could be wrong. What do you think?  
