Judgment and Decision Making, Vol. 15, No. 3, May 2020, pp. 401-412

Individual differences in receptivity to scientific bullshit

Anthony Evans*   Willem Sleegers#   Žan Mlakar$

Pseudo-profound bullshit receptivity is the tendency to perceive meaning in important-sounding, nonsense statements. To understand how bullshit receptivity differs across domains, we develop a scale to measure scientific bullshit receptivity — the tendency to perceive truthfulness in nonsensical scientific statements. Across three studies (total N = 1,948), scientific bullshit receptivity was positively correlated with pseudo-profound bullshit receptivity. Both types of bullshit receptivity were positively correlated with belief in science, conservative political beliefs, and faith in intuition. However, compared to pseudo-profound bullshit receptivity, scientific bullshit receptivity was more strongly correlated with belief in science, and less strongly correlated with conservative political beliefs and faith in intuition. Finally, scientific literacy moderated the relationship between the two types of bullshit receptivity; the correlation between the two types of receptivity was weaker for individuals scoring high in scientific literacy.


Keywords: bullshit receptivity, belief in science, motivated reasoning, false beliefs

1  Introduction

Disinformation can negatively affect individual health and well-being (Gigerenzer, Gaissmaier, Kurz-Milcke, Schwartz & Woloshin, 2007), as well as the functioning of society at large (Sunstein, 2018). To understand how and why people are influenced by false information, psychologists have begun to investigate the processes underlying bullshit receptivity, the tendency to believe in important-sounding nonsense statements that are ultimately indifferent to the truth (Frankfurt, 2005). In particular, recent studies have focused on identifying correlates of the willingness to accept pseudo-profound bullshit (Pennycook, Cheyne, Barr, Koehler & Fugelsang, 2015).

We extend prior research by introducing a measure of receptivity to scientific bullshit. People generally trust in scientists and the institution of science (Pew, 2019). Given the strong public trust in science, scientific language can be used as a tool to promote consumer products (Fowler, Carlson & Chaudhuri, 2019), health behaviors (Larsen et al., 2019), and public policies (Sánchez & Parrott, 2017). Although well-intentioned agents may use scientific language to provide citizens with valid information and advice, scientific jargon can also be used as a tool to obfuscate or mislead the public (Goldacre, 2010). In the present research, we are interested in individual differences in susceptibility to nonsensical scientific claims.

We developed a scale to measure individual differences in the tendency to perceive truthfulness in nonsensical scientific statements (Example: “There are no transverse waves when the total magnetic sublimation through a stiff photon is equal to its scattered matrix”). First, we investigate whether receptivity to scientific bullshit is correlated with receptivity to pseudo-profound bullshit. Then, we ask whether the correlates of bullshit receptivity are consistent (or different) across content domains. More specifically, we compare how the two types of bullshit are correlated with individual differences in belief in science, political ideology, and cognitive style. We also ask to what extent these correlations are influenced by acquiescence bias (the tendency to agree with statements, regardless of content), and compare the correlates of receptivity versus sensitivity to scientific bullshit (i.e., the ability to distinguish between factual and bullshit scientific statements). Finally, we examine how relevant domain-expertise (i.e., scientific literacy) moderates the correlation between the two types of bullshit receptivity.

1.1  Pseudo-profound bullshit receptivity

Bullshit is communication that is meant to seem meaningful and impressive (Frankfurt, 2005), but, at the same time, is “unclarifiably unclarifiable” (Cohen, 2002). In other words, a bullshit statement is inherently unclear, and bullshit statements cannot be made clear to those who do not find initial meaning in them (Cohen, 2002). Following this definition, Pennycook et al. (2015) introduced a scale to measure individual differences in receptivity to pseudo-profound bullshit. In this scale, participants rate the profundity of syntactically coherent statements that were constructed without concern for the truth. The statements were constructed using phrase generators that randomly combined sets of new age, profound-sounding terms (Example: “We are in the midst of a self-aware blossoming of being that will align us with the nexus itself”).

What type of person is receptive to pseudo-profound bullshit? Pennycook et al. (2015) found that individuals receptive to bullshit are also more likely to rely on intuitive (vs. reflective) cognitive processes, have weaker cognitive abilities (i.e., lower scores in verbal intelligence, fluid intelligence, and numeracy), and are more likely to believe in religion and the supernatural. Generally, those who believe in pseudo-profound bullshit are either relatively unwilling or unable to engage in rational thought. Furthermore, pseudo-profound bullshit receptivity may have important behavioral consequences, as those who believe in pseudo-profound bullshit are more susceptible to fake news (Pennycook & Rand, 2017). Pseudo-profound bullshit can also be strategically used to increase the attractiveness of consumer goods, such as works of modern art. Paintings with pseudo-profound titles are seen as higher quality than paintings with mundane titles (Turpin et al., 2019).

Existing research has focused primarily on pseudo-profound bullshit, statements that are superficially related to the fundamental nature of the universe or existence. However, bullshit can also be employed in mundane contexts. For example, bullshit arguments can appear in modern workplaces (Beckwith, 2006), political discussions (Hopkin & Rosamond, 2018), and even in (seemingly) evidence-based scientific reports (Bauer, 2008). It is important to consider how bullshit receptivity functions across domains: Are some people generally accepting of bullshit statements, regardless of content? Or are the correlates of bullshit sensitivity different across domains? For example, individuals with spiritual proclivities may be more likely to accept pseudo-profound (but not political) bullshit.

A recent study by Čavojová, Brezina and Jurkovič (2020) points to the idea that receptivity to bullshit may be consistent across content domains. Čavojová et al. (2020) developed a scale to measure general receptivity to bullshit consisting of statements related to general concepts in areas such as health, relationships, economics, and politics. There was a strong positive correlation (r ∼ .60) between pseudo-profound bullshit receptivity and general bullshit receptivity, and both types of bullshit receptivity were positively correlated with conspiracy beliefs, pseudoscientific beliefs, paranormal beliefs, and individual differences in cognitive style (e.g., greater reliance on intuitive thinking and greater ontological confusion). In the present research, we add to the literature by examining bullshit receptivity in the context of science.

1.2  Scientific bullshit receptivity

We define scientific bullshit as a form of communication that relies on obtuse scientific jargon to convey a false sense of importance or significance. As with the case of pseudo-profound bullshit, scientific bullshit is syntactically coherent, but impossible to verify as either true or false. Scientific bullshit, however, is constructed using scientific (rather than New Age) terminology; and the aim of scientific bullshit is to sound true, not profound.

There is good reason to anticipate that people are susceptible to scientific bullshit: Although people are reluctant to trust in scientific findings that directly conflict with their personal interests and beliefs (Kahan et al., 2012), the public maintains a generally positive view of scientists and the broader institution of science (Pew, 2019). For example, U.S. Americans consider scientists to be more trustworthy than business leaders, religious leaders, the news media, and elected officials (Funk, Hefferon, Kennedy & Johnson, 2019). The public’s general trust in science (and scientists) may also mean that people are vulnerable to malevolent agents who make use of the superficial trappings of science (e.g., those who use irrelevant or made-up scientific jargon) to manipulate the public.

Recent studies support the idea that people trust scientific terminology even when it is irrelevant or unnecessary: Laypeople are more likely to accept scientific explanations that are accompanied with irrelevant neuroscientific jargon (Weisberg, Keil, Goodstein, Rawson & Gray, 2008; Weisberg, Taylor & Hopkins, 2015); in particular, scientific jargon makes non-experts more willing to accept low-quality scientific arguments. Similarly, experienced readers will judge scientific abstracts as higher quality when they include irrelevant mathematical equations (Eriksson, 2012). These studies illustrate that scientific language can be used to bullshit readers. Here, we introduce a measure of the individual tendency to accept nonsense scientific statements (scientific bullshit), and ask how receptivity to scientific bullshit is related to pseudo-profound bullshit receptivity.

1.2.1  Scientific bullshit and pseudo-profound bullshit

Our first question is whether receptivity to scientific bullshit is correlated (either positively or negatively) with receptivity to pseudo-profound bullshit. The answer to this question depends on the extent to which bullshit receptivity is a domain-general (vs. domain-specific) individual difference. Previous work has found that individual differences in judgment and decision making vary substantially across content domains (Blais & Weber, 2006; Duckworth & Kern, 2011). For example, financial risk-taking is only weakly correlated with risk-taking decisions related to health and social relationships. If this is the case for bullshit receptivity, then there should be a weak (or perhaps even negative) correlation between receptivity to scientific bullshit and receptivity to pseudo-profound bullshit. On the other hand, if bullshit receptivity is consistent across domains (Čavojová et al., 2020) and related to general dispositions, such as the general preference to rely on intuitive processes, then there may be strong positive correlations between different forms of bullshit receptivity.1

1.2.2  The correlates of scientific bullshit receptivity

Our second goal is to examine correlates of scientific bullshit receptivity, and to ask whether these variables are differently correlated with scientific (vs. pseudo-profound) bullshit receptivity. We focus on three areas relevant to scientific bullshit: belief in science (Farias, Newheiser, Kahane & de Toledo, 2013), political ideology (Sterling, Jost & Pennycook, 2016), and cognitive style (Pennycook et al., 2015).

Belief in science.

Individuals differ in the extent to which they believe in the value of science and its superiority as a source of knowledge (Farias et al., 2013). Those who believe in the institution of science may also be more likely to believe in scientific bullshit. In other words, those who have extremely high levels of trust in the discipline of science may be more likely to take any scientific statement at face value, without reading it carefully to assess its veracity, as individuals tend to process information less carefully when they are in trusting (vs. skeptical) mindsets (Mayo, 2015). At the same time, those with strong beliefs in science may also be less likely to trust in pseudo-profound bullshit. Belief in science is negatively correlated with religiosity and spiritual beliefs (Farias et al., 2013), and people may implicitly believe that science and religion exist in opposition to one another (Preston & Epley, 2009).

Political ideology.

Recently, Sterling et al. (2016) found that conservative political beliefs are positively correlated with receptivity to pseudo-profound bullshit. Similarly, Pfattheicher and Schindler (2016) found that candidate preferences in the 2016 U.S. American presidential election were correlated with bullshit receptivity: supporters of Donald Trump, Ted Cruz, and Marco Rubio (the top three Republican candidates when the study was conducted) were more receptive to pseudo-profound bullshit than supporters of Hillary Clinton and Martin O’Malley (the top two Democratic candidates at the time). Replicating these results in a Swedish sample, Nilsson, Erlandsson and Västfjäll (2019) found that bullshit receptivity was consistently correlated with social (vs. economic) conservatism, even when statistically controlling for demographics and individual differences in cognitive style (e.g., numeracy and cognitive reflection).

We ask whether conservative political beliefs are also correlated with receptivity to scientific bullshit. If the tendency to accept bullshit statements is independent of context, then it is likely that conservatives will also endorse bullshit scientific statements. At the same time, other work has found that conservatives are generally less trusting of science than liberals (Gauchat, 2012), though both groups are generally distrustful of dissonant or personally threatening scientific findings (Kahan et al., 2012). If conservatives are indeed less trusting of science, then they may also be more likely to apply a skeptical eye towards scientific bullshit (Mayo, 2019). If this is the case, then conservatism should be negatively correlated with scientific bullshit receptivity, or at least less strongly correlated with scientific bullshit receptivity (compared to the correlation between conservatism and pseudo-profound bullshit receptivity).

Cognitive style.

Third, we examine the relationship between scientific bullshit receptivity and individual differences in cognitive style, the extent to which individuals prefer to rely on intuitive versus reflective thinking (Cacioppo & Petty, 1982; Epstein, Pacini, Denes-Raj & Heier, 1996). Individuals who prefer an intuitive cognitive style are also more likely to accept pseudo-profound bullshit (Pennycook et al., 2015; Pennycook & Rand, 2017), yet it remains unclear if intuitive thinking will also be associated with receptivity to scientific bullshit. Arguably, science is associated with rational, deliberate reasoning (Pinker, 2018). Hence, those who prefer to rely on intuition may be less inclined to accept scientific bullshit statements.

1.2.3  Scientific literacy and scientific bullshit

Our third area of investigation deals with how knowledge of science influences the relationship between scientific bullshit receptivity and pseudo-profound bullshit receptivity. Prior work has highlighted that laypersons and those who lack knowledge of science may be particularly susceptible to the influence of irrelevant scientific jargon (Weisberg et al., 2008). Non-experts are particularly susceptible to bad scientific arguments. Hence, individuals who have explicit knowledge of science may be better able to recognize scientific bullshit when they see it, whereas those with little scientific background may realize they lack the expertise to judge the legitimacy of scientific statements. In other words, scientific literacy may moderate the relationship between the two types of bullshit receptivity, with highly literate individuals being better able to differentiate between scientific bullshit and pseudo-profound bullshit.

1.3  Overview of studies

We investigate three overarching questions related to individual differences in scientific bullshit receptivity: First, we examine the correlation between receptivity to pseudo-profound bullshit and receptivity to scientific bullshit. Second, we investigate how the two types of bullshit receptivity relate to belief in science, political ideology, and cognitive style: we ask whether scientific bullshit receptivity is correlated with these variables, and whether the two types of bullshit receptivity are differently correlated with them. For example, we will ask if the correlation between belief in science and scientific bullshit receptivity is stronger than the correlation between belief in science and pseudo-profound bullshit receptivity.

While investigating these first two questions, we will examine how controlling for acquiescence bias – the tendency to agree with scale items, regardless of the specific item content – influences the above correlations (Rammstedt, Kemper & Borg, 2013). Additionally, we will conduct exploratory analyses to compare the correlates of receptivity to scientific bullshit and sensitivity to scientific bullshit: the ability to differentiate between factual versus bullshit scientific statements. Previous studies on pseudo-profound bullshit receptivity found that receptivity to pseudo-profound bullshit is correlated with intuitive thinking and reliance on heuristics and biases, whereas sensitivity to pseudo-profound bullshit is correlated with analytical thinking and rational decision making (Pennycook et al., 2015).

Finally, we examine whether scientific literacy (e.g., general scientific knowledge) moderates the relationship between the two types of bullshit receptivity. We ask whether scientific literacy weakens the correlation between pseudo-profound bullshit receptivity and scientific bullshit receptivity. To answer these questions, we conduct analyses using aggregated data from three studies.


Table 1: Participant demographics.

                                              Study 1        Study 2        Study 3        Total
Sample size                                   365            543            1,040          1,948
Gender
  Men                                         229            284            559            1,072
  Women                                       133            255            472            860
  NA or other                                 3              4              9              16
Age, M (SD)                                   35.3 (11.1)    35.0 (9.75)    34.3 (11.2)    34.7 (10.8)
Education
  High-school diploma or less                 53             83             118            253
  Part of college or a full college degree    270            40             699            1,372
  Some or all of a graduate degree            40             57             223            320
  N/A                                         2              1              0              3

2  Methods

Each of our three studies was pre-registered at AsPredicted: Study 1: http://aspredicted.org/blind.php?x=44si22; Study 2: http://aspredicted.org/blind.php?x=73ev8s; Study 3: https://aspredicted.org/blind.php?x=dd4up6.

2.1  Participants

Across three studies, we recruited 2,039 U.S. American participants from Amazon Mechanical Turk (Studies 1 and 2) and Prolific Academic (Study 3).2 Participants were paid at a rate of approximately $0.15 per minute ($1.20 in Study 1, $2.00 in Study 2, and £1.60 in Study 3). Study 1 was conducted in May 2017; Study 2 was conducted in July 2017; and Study 3 was conducted in November 2018. We excluded 91 participants (4.46%) from the analysis because they failed to complete one or more of the measures, which left us with a final N = 1,948. Demographics of the included participants are reported in Table 1.

The sample sizes for each study were based on power analyses conducted using G*Power (Faul, Erdfelder, Buchner & Lang, 2009). In Study 1, the planned sample size was based on a power analysis with α = .05, β = .20 (80% statistical power), and r = .15: minimum N = 343. In Study 2, the planned sample size was based on α = .05, β = .20, and r = .12: minimum N = 540. In Study 3, the sample size was based on α = .05, β = .10 (90% statistical power), and r = .10: minimum N = 1,043.
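
These targets can also be approximated outside of G*Power; the sketch below uses the R package pwr, which relies on a normal approximation and may therefore return minimum sample sizes that differ by a few participants from G*Power's exact calculation.

    library(pwr)  # power analysis for correlation tests

    pwr.r.test(r = .15, sig.level = .05, power = .80)  # Study 1 target
    pwr.r.test(r = .12, sig.level = .05, power = .80)  # Study 2 target
    pwr.r.test(r = .10, sig.level = .05, power = .90)  # Study 3 target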

2.2  Materials and procedure

2.2.1  Measures

Participants in all three studies completed the Pseudo-Profound Bullshit Receptivity Scale (Pennycook et al., 2015); the Scientific Bullshit Receptivity Scale (which we describe in the following section); the Belief in Science Scale (Farias et al., 2013) and the Social and Economic Conservatism Scale (Everett, 2013).

There were also unique measures included in each study: In Study 1, participants completed an additional measure of ideology, the Free Market Beliefs Scale (Heath & Gifford, 2006). In Study 2, participants completed two measures of cognitive style, the Faith in Intuition Scale (Epstein et al., 1996) and the Need for Cognition Scale (Cacioppo, Petty & Feng Kao, 1984), as well as a measure of acquiescence bias (Rammstedt et al., 2013). In Study 3, participants completed the Science Literacy Scale (Kahan et al., 2012).

Pseudo-profound bullshit receptivity.

Receptivity to pseudo-profound bullshit was measured using the scale introduced in Pennycook et al. (2015). This scale consists of 20 statements constructed with generators (http://wisdomofchopra.com and http://sebpearce.com/bullshit) that randomly combine pre-entered words into syntactically correct sentences that tend to sound profound, but inherently carry no meaning (e.g., “Hidden meaning transforms unparalleled abstract beauty”). Participants rated these statements on a 5-point profundity scale (1 = not at all profound; 5 = very profound), Cronbach’s α = .94.

Scientific bullshit receptivity.

We developed a new scale to measure receptivity to bullshit scientific statements. Ten items (bullshit statements) were created by taking existing physical laws and replacing their central words with randomly selected words from a physics glossary. These items sounded elaborate and complex, but had no actual scientific truth (Example: “There are no transverse waves when the total magnetic sublimation through a stiff photon is equal to its scattered matrix”). The ten remaining items (real scientific statements) were based on actual physical laws (Example: “A cyclic transformation whose only final result is to transform heat extracted from a source which is at the same temperature throughout into work is impossible”). Note that these ten factual statements were excluded from the third study. The complete set of items can be found in the Appendix.

Participants indicated the extent to which they believed each item to reflect the truth. They indicated their answers on a 5-point Likert scale (1 = not at all truthful; 5 = very truthful). Our measure of bullshit receptivity was the average response to the ten bullshit statements, α = .83.
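
For concreteness, a minimal R sketch of this scoring step is shown below; it assumes a data frame d with hypothetical item columns sci_bs_1 through sci_bs_10 holding the truthfulness ratings, and uses the psych package for the reliability estimate.

    library(psych)  # for Cronbach's alpha

    bs_items <- paste0("sci_bs_", 1:10)               # hypothetical column names
    d$sci_bs_receptivity <- rowMeans(d[, bs_items])   # mean rating of the ten bullshit items
    alpha(d[, bs_items])                              # internal consistency of the scale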

Content validation.

We collected data to validate the content of our scientific bullshit statements. Participants (N = 200 U.S. Americans recruited from Prolific Academic) were presented with ten pseudo-profound bullshit statements and ten scientific bullshit statements. The pseudo-profound statements consisted of the first ten items of the bullshit receptivity measure introduced by Pennycook et al. (2015). Participants were asked to rate how scientific-sounding (sounding as if it is related to science) and how profound-sounding (sounding as if it is related to the meaning or nature of existence) each statement was.3 The twenty statements were rated on a scale from 1 (Not at all scientific- / profound-sounding) to 5 (Very scientific- / profound-sounding), and were presented in a randomized order. We compared ratings using multilevel models with random intercepts estimated for each participant and each statement. Scientific statements were seen as more scientific-sounding (M = 4.59, SD = 0.73) than pseudo-profound statements (M = 2.15, SD = 1.18), b = 2.43, SE = 0.17, p < .001; and scientific statements were seen as less profound-sounding (M = 1.93, SD = 1.09) than the pseudo-profound statements (M = 3.68, SD = 1.22), b = −1.74, SE = 0.09, p < .001.
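
A model of this form can be fit in R with lme4 (here via lmerTest for p-values); this is a sketch rather than the authors' script, and it assumes a long-format data frame ratings with hypothetical columns scientific_sounding, profound_sounding, statement_type, participant, and statement.

    library(lmerTest)  # lmer() with Satterthwaite-based p-values

    # Random intercepts for participants and statements; the fixed effect of
    # statement_type (scientific vs. pseudo-profound) corresponds to the reported b.
    m_scientific <- lmer(scientific_sounding ~ statement_type +
                           (1 | participant) + (1 | statement), data = ratings)
    m_profound   <- lmer(profound_sounding ~ statement_type +
                           (1 | participant) + (1 | statement), data = ratings)
    summary(m_scientific)
    summary(m_profound)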

Confirmatory factor analyses.

Using data from our three studies, we also conducted confirmatory factor analyses using the lavaan package (Oberski, 2014) to assess whether the two bullshit receptivity scales should be treated as two separate measures. For these analyses, we included the ten scientific bullshit and the first ten pseudo-profound bullshit statements. We compared two models: In the first model, all items loaded onto one latent variable; in the second model, items loaded onto two correlated latent variables corresponding to pseudo-profound bullshit receptivity and scientific bullshit receptivity. The fit of the two-variable model (RMSEA = .043, CFI = .95) was significantly better than the fit of the one-variable model (RMSEA = .085; CFI = .81): χ2(1) = 830.88, p < .001. The present analyses suggest that the two measures of bullshit receptivity can indeed be treated as separate measures.
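
The model comparison can be reproduced along the following lines with lavaan; the item names (pp1 to pp10 for the pseudo-profound items, sci1 to sci10 for the scientific items) are placeholders, and the estimator settings are left at lavaan's defaults rather than taken from the authors' script.

    library(lavaan)

    # Model 1: all twenty items load on a single latent factor
    one_factor <- '
      bs =~ pp1 + pp2 + pp3 + pp4 + pp5 + pp6 + pp7 + pp8 + pp9 + pp10 +
            sci1 + sci2 + sci3 + sci4 + sci5 + sci6 + sci7 + sci8 + sci9 + sci10
    '
    # Model 2: two correlated factors (pseudo-profound and scientific bullshit)
    two_factor <- '
      pp  =~ pp1 + pp2 + pp3 + pp4 + pp5 + pp6 + pp7 + pp8 + pp9 + pp10
      sci =~ sci1 + sci2 + sci3 + sci4 + sci5 + sci6 + sci7 + sci8 + sci9 + sci10
    '

    fit_one <- cfa(one_factor, data = d)
    fit_two <- cfa(two_factor, data = d)

    fitMeasures(fit_one, c("rmsea", "cfi"))
    fitMeasures(fit_two, c("rmsea", "cfi"))
    anova(fit_one, fit_two)   # chi-square difference test between the nested models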

Scientific bullshit sensitivity.

We also tested whether participants would be able to differentiate between the real and bullshit scientific statements, with the expectation that participants would see real statements as more truthful. Indeed, participants accurately differentiated between the two types of statements, believing that the real scientific statements were more truthful (M = 3.01, SD = 0.68) than the bullshit statements (M = 2.75, SD = 0.78), t(904) = 13.61, p < .001. We also estimated a bullshit sensitivity score for participants in Studies 1 and 2 (we did not include real scientific statements in Study 3), where we estimated the difference in truthfulness ratings between real scientific statements and bullshit scientific statements. Here, higher scores indicate that participants assigned greater truth to real (vs. bullshit) scientific statements: M = 0.28, SD = 0.62.
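
In code, the sensitivity score is simply a within-participant difference between the two mean ratings, and the group comparison is a paired t-test; the sketch below again assumes hypothetical column names (real_1 to real_10 for the factual items).

    # Mean truthfulness ratings of the real and bullshit statements per participant
    d$real_mean <- rowMeans(d[, paste0("real_", 1:10)])
    d$bs_mean   <- rowMeans(d[, paste0("sci_bs_", 1:10)])

    # Sensitivity: higher values = more truth assigned to real than to bullshit statements
    d$sensitivity <- d$real_mean - d$bs_mean

    t.test(d$real_mean, d$bs_mean, paired = TRUE)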

Belief in science.

Participants completed the 10-item Belief in Science Scale (Farias et al., 2013). This scale measures the extent to which people believe in the superiority of science and the extent to which people believe science is an important and valuable social institution (Example: “Science provides us with a better understanding of the universe than does religion”). Participants rated the extent to which they agreed or disagreed with statements on a 6-point Likert scale (1 = strongly disagree; 6 = strongly agree), α = .93.

Social and economic conservatism.

We used the Social and Economic Conservatism Scale (Everett, 2013) to measure social (7 items; α = .88) and fiscal (5 items; α = .75) conservatism. The items reflect a variety of political issues (e.g., abortion, limited government, and welfare benefits). For each item, participants were asked to indicate the extent to which they supported the issue on a scale from 0 (Strongly against) to 100 (Strongly in favor).

Free market beliefs (Study 1).

The Free Market Beliefs Scale (Heath & Gifford, 2006) was used to measure support for the unregulated free-market system (Example: “The preservation of the free market system is more important than localized environmental concerns”). The scale consists of six statements, which are rated by participants on a 5-point Likert scale, based on how much they agree with them (1 = strongly disagree; 5 = strongly agree), α = .83.

Cognitive style (Study 2).

Participants completed two scales to measure cognitive style, the Need for Cognition Scale (Cacioppo et al., 1984) and the Faith in Intuition Scale (Epstein et al., 1996): The Need for Cognition Scale measures the extent to which people enjoy engaging in conscious thinking (Example: “I prefer complex to simple problems”). The scale consists of 19 statements that are rated on a 5-point scale based on whether the statement is characteristic of the participant (1 = Extremely uncharacteristic of me; 5 = Extremely characteristic of me), α = .96. The Faith in Intuition Scale measures confidence in intuition and the tendency to use it (Example: “I believe in trusting my hunches”). The scale consists of twelve statements that are rated on a 5-point scale (1 = completely false; 5 = completely true), α = .88.

Acquiescence bias (Study 2).

We followed the method of Rammstedt et al. (2013) to measure individual differences in acquiescence: Participants completed the Big Five Inventory-10 (Rammstedt & John, 2007), which consists of two-item measures for each of the Big Five traits. Participants were asked to rate to what extent they agreed that each statement accurately described them (Example: “I see myself as someone who is reserved”). Ratings were made on a scale from 1 (Strongly disagree) to 5 (Strongly agree).

Note that for each two-item trait measure, there was one item where agreement corresponded with a high trait score and one item where agreement indicated a low score. To calculate the degree of acquiescence, we summed the ten items without reverse scoring the negative items. A high score on this acquiescence measure indicates that a participant tends to agree with items, regardless of the specific item content.
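
As a sketch (with placeholder column names bfi_1 to bfi_10 for the ten BFI-10 ratings), the scoring reduces to a single row sum:

    # Sum the ten ratings without reverse-scoring the negatively keyed items;
    # higher scores indicate a stronger tendency to agree regardless of content.
    d$acquiescence <- rowSums(d[, paste0("bfi_", 1:10)])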

Science literacy (Study 3).

To measure science literacy, participants responded to seven true-false statements, such as “The center of the Earth is very hot” (Kahan et al., 2012). The final score for this measure is the number of correct answers (out of seven).

2.2.2  Procedure

The survey was administered using Qualtrics. The procedure for randomizing the different measures varied across studies: In Studies 1 and 2, all participants were presented with the two bullshit receptivity scales at the beginning of the surveys, with these two scales presented in a randomized order. In Study 1, the remaining scales were presented in a fixed order (belief in science followed by political orientation and free market beliefs). In Study 2, the remaining scales were presented in a randomized order. Finally, in Study 3, we manipulated whether the two bullshit receptivity scales were presented at the beginning (or the end) of the survey; order had no effect on our results. The remaining scales in Study 3 were also presented in a randomized order (either before or after the two bullshit scales). At the end of each study, participants provided basic demographic information.

2.3  Analysis plan

We follow the analysis plan outlined in the pre-registration for our third study: we test the correlates of the two types of bullshit receptivity, and test whether these correlations differ in strength for the two types of receptivity.4 For example, we ask if the correlation between scientific bullshit receptivity and belief in science is stronger than the correlation between pseudo-profound bullshit receptivity and belief in science. In addition to our pre-registered analyses, we also added exploratory analyses looking at the correlates of scientific bullshit sensitivity. Note that we use all available data for each set of analyses, and that we indicate when our analyses include variables that were measured in some, but not all, of our studies.


Figure 1: The correlates of receptivity to pseudo-profound versus scientific bullshit. Error bars denote 95% confidence intervals.


Table 2: Correlations between bullshit receptivity, belief in science, political ideology, and cognitive style.

Variable                          M      SD     1                     2                     3                     4                     5                    7
1. Scientific bullshit            2.63   0.89
2. Pseudo-profound bullshit       2.75   0.79   .60*** [.57, .63]
3. Belief in science              4.16   1.60   .12*** [.07, .16]     .07** [.03, .12]
4. Social conservatism            52.86  24.81  .18*** [.14, .23]     .29*** [.25, .33]     −.44*** [−.47, −.40]
5. Economic conservatism          55.29  19.20  .10** [.05, .14]      .14*** [.10, .18]     −.23*** [−.26, −.18]  .64*** [.61, .66]
6. Free market beliefs (Study 1)  2.70   0.89   .04 [−.06, .14]       .20*** [.09, .30]     −.30*** [−.39, −.20]  .56*** [.49, .63]     .61** [.54, .67]
7. Faith in intuition (Study 2)   3.96   0.68   .25*** [.17, .33]     .39*** [.31, .45]     −.01 [−.10, .07]      .24*** [.16, .32]     .14** [.05, .22]
8. Need for cognition (Study 2)   3.41   0.98   −.05 [−.13, .03]      −.04 [−.12, .04]      .15*** [.07, .23]     −.14*** [−.22, −.06]  −.10* [−.18, −.01]   .03 [−.05, .11]

Note: * indicates p < .05; ** indicates p < .01; *** indicates p < .001. 95% confidence intervals are reported in brackets. Column 6 is excluded because free market beliefs and the two cognitive style measures were measured in different studies.

3  Results

3.1  Correlates of scientific bullshit receptivity

To begin, we examined the correlations between the two types of bullshit receptivity (pseudo-profound and scientific), belief in science, the three measures of political ideology (social conservatism, fiscal conservatism, and free market beliefs), and the two measures of cognitive style (faith in intuition and need for cognition). Then, we asked whether these measures were differently correlated with the two measures of bullshit receptivity. For example, we asked whether the correlation between belief in science and receptivity to scientific bullshit was different from the correlation between belief in science and receptivity to pseudo-profound bullshit. These correlation comparisons were conducted with the cocor package (Diedenhofen & Musch, 2015), using the Hittner, May and Silver (2003) method for comparing two dependent correlations.
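
As one concrete example of such a comparison, the test of whether belief in science correlates more strongly with scientific than with pseudo-profound bullshit receptivity can be run in cocor roughly as follows, plugging in the correlations from Table 2; the call is our sketch of cocor's dependent, overlapping-correlation interface, not the authors' exact script.

    library(cocor)

    # r.jk: belief in science with scientific bullshit receptivity
    # r.jh: belief in science with pseudo-profound bullshit receptivity
    # r.kh: correlation between the two bullshit receptivity measures
    cocor.dep.groups.overlap(r.jk = .12, r.jh = .07, r.kh = .60,
                             n = 1948, test = "hittner2003")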

Analyses including the free market beliefs scale included data from only Study 1 (N = 365), analyses including cognitive style measures included data from only Study 2 (N = 543), and the remaining analyses include data from all three studies (N = 1,948). Table 2 includes a full correlation table with descriptive statistics, and Figure 1 illustrates the correlates of the two types of bullshit receptivity.

3.1.1  Pseudo-profound and scientific bullshit receptivity

To begin, we estimated the correlation between our two measures of bullshit receptivity: there was a significant positive correlation, r(1,948) = .60, p < .001.

3.1.2  Belief in science and bullshit receptivity

Next, we looked at the relationships between belief in science and the two types of bullshit receptivity: belief in science was positively correlated with both pseudo-profound bullshit receptivity (r = .07, p = .001) and scientific bullshit receptivity (r = .12, p < .001). We then used a Z-test to ask if these two correlations were significantly different in magnitude. Indeed, the relationship between belief in science and scientific bullshit receptivity was significantly stronger than the relationship between belief in science and pseudo-profound bullshit receptivity, Z = −2.24, p = .025.

3.1.3  Political ideology and bullshit receptivity

Next, we examined the relationship between conservative political beliefs and bullshit receptivity. Replicating the results of Sterling et al. (2016), we found that pseudo-profound bullshit receptivity was again positively associated with social conservatism (r = .29, p < .001), economic conservatism (r = .14, p < .001), and free market beliefs (r = .20, p < .001). At the same time, we found that scientific bullshit receptivity was also positively correlated with social conservatism (r = .18, p < .001) and economic conservatism (r = .10, p < .001), though it was not significantly correlated with free market beliefs (r = .04, p = .47).

Comparing these sets of correlations, we found that conservatism was more strongly associated with pseudo-profound (vs. scientific) bullshit receptivity: social conservatism, Z = 5.43, p < .001; economic conservatism, Z = 2.09, p = .036; free market beliefs, Z = 3.51, p < .001.

3.1.4  Bullshit receptivity and cognitive style

Following the approach we used in the preceding analyses, we examined whether the two types of bullshit receptivity correlated (differently) with individual differences in cognitive style.

Faith in intuition was positively correlated with receptivity to both types of bullshit (pseudo-profound bullshit: r = .38, p < .001; scientific bullshit: r = .25, p < .001). Moreover, the correlation between pseudo-profound bullshit and faith in intuition was significantly stronger than the correlation between scientific bullshit and faith in intuition, Z = 3.78, p < .001. In contrast, individual differences in need for cognition were not correlated with either type of bullshit receptivity (pseudo-profound bullshit: r = −.04, p = .34; scientific bullshit: r = −.05, p = .20), and there was no significant difference between these two correlations, Z = 0.36, p = .71.


Figure 2: Controlling for acquiescence did not substantially change the correlates of bullshit receptivity.

3.2  Bullshit receptivity and acquiescence bias (Study 2)

Next, we asked to what extent the above results were related to individual differences in acquiescence bias – the tendency to agree with survey items regardless of content. These analyses included participants from our second study, df = 541. Acquiescence was positively correlated with both receptivity to pseudo-profound bullshit, r = .18, p < .001, and receptivity to scientific bullshit, r = .16, p < .001. We further examined whether acquiescence was correlated with belief in science and political ideology. There was a positive correlation between acquiescence and social conservatism, r = .17, p < .001. However, acquiescence was not correlated with either belief in science (r = .03, p = .42) or economic conservatism (r = .07, p = .07). Regarding the correlations between acquiescence and individual differences in cognitive style, we found that acquiescence was positively correlated with both faith in intuition (r = .26, p < .001) and need for cognition (r = .30, p < .001).

Then, we used semi-partial correlations to test whether our key results (i.e., the correlations between bullshit receptivity, belief in science, and political ideology) were robust when controlling for acquiescence bias (see Figure 2 and our Appendix). Controlling for acquiescence did not change the pattern of results.

3.3  Sensitivity to scientific bullshit (Studies 1 and 2)

We conducted a series of exploratory analyses comparing the correlates of receptivity to scientific bullshit, receptivity to scientific facts, and sensitivity to scientific bullshit (i.e., the ability to differentiate between scientific facts and scientific bullshit). Receptivity to scientific bullshit was positively correlated with receptivity to scientific facts, r = .62, p < .001, and negatively correlated with sensitivity to scientific bullshit, r = −.49, p < .001. The correlates of the three different measures are reported in Table 3.


Table 3: Correlates of receptivity to scientific bullshit, receptivity to scientific facts, and scientific bullshit sensitivity.

Variable                         Scientific bullshit    Scientific facts       Bullshit sensitivity (facts − bullshit)
Pseudo-profound bullshit         .59*** [.57, .63]      .34*** [.28, .40]      −.29*** [−.35, −.23]
Belief in science                .12*** [.07, .16]      .15*** [.08, .20]      .07* [.004, .13]
Social conservatism              .18*** [.14, .23]      .04 [−.02, .04]        −.13** [−.19, −.13]
Economic conservatism            .10*** [.05, .14]      .01 [−.05, .01]        −.05 [−.12, .009]
Free market beliefs (Study 1)    .04 [−.06, .14]        −.11* [−.21, −.01]     −.17*** [−.27, −.07]
Faith in intuition (Study 2)     .25*** [.17, .33]      .16*** [.08, .24]      −.12** [−.20, −.04]
Need for cognition (Study 2)     −.05 [−.13, .03]       .05 [−.03, .14]        .12** [.04, .21]

Note: * indicates p < .05; ** indicates p < .01; *** indicates p < .001. 95% confidence intervals are reported in brackets.


Figure 3: The correlation between pseudo-profound bullshit receptivity and scientific bullshit receptivity is moderated by scientific literacy. All variables were standardized.

There were notable differences between the correlates of receptivity to scientific bullshit and sensitivity to scientific bullshit. Scientific bullshit sensitivity was negatively correlated with receptivity to pseudo-profound bullshit, conservative political beliefs (social conservatism and free market beliefs), and reliance on intuitive cognitive processes. At the same time, sensitivity to bullshit was positively correlated with belief in science and reliance on analytical thinking (i.e., need for cognition).

3.4  Bullshit receptivity and science literacy (Study 3)

To conclude, we examined whether scientific literacy moderated the relationship between the two types of bullshit receptivity. We hypothesized that the correlation between the two types of bullshit would be weaker for scientifically literate individuals (i.e., those with better abilities to discern scientific facts from scientific bullshit). To test this hypothesis, we estimated a multiple regression with scientific bullshit receptivity as the dependent variable. The following variables were entered as (standardized) predictors: pseudo-profound bullshit receptivity, scientific literacy, and a pseudo-profound bullshit-by-scientific literacy interaction term. Scientific bullshit receptivity was positively associated with pseudo-profound bullshit receptivity (β = .60, p < .001), there was no significant main effect of scientific literacy (β = −.04, p = .12), and, critically, there was a significant interaction (β = −.06, p = .008). Figure 3 illustrates the pattern of the interaction. The relationship between the two types of bullshit receptivity was stronger (weaker) for individuals low (high) in scientific literacy.
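
A minimal version of this regression in R, assuming a data frame d with placeholder column names for the three scores, is:

    # Standardize all three scores and test the interaction between
    # pseudo-profound bullshit receptivity and science literacy.
    m <- lm(scale(sci_bs_receptivity) ~ scale(pp_bs_receptivity) * scale(sci_literacy),
            data = d)
    summary(m)   # the interaction coefficient tests the moderation reported above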

4  Discussion

We introduced a scale to measure individual differences in receptivity to scientific bullshit (nonsensical, scientific-sounding statements). First, we examined the relationship between receptivity to scientific bullshit and receptivity to pseudo-profound bullshit. Going into this project, we originally expected that these two measures would be uncorrelated, or negatively correlated. Contrary to our expectations, we observed a strong positive correlation between the two types of bullshit receptivity. This finding is important, as it supports the idea that individual differences in bullshit receptivity may generalize across content domains (Čavojová et al., 2020).

Second, we examined the correlates of receptivity to scientific bullshit. We found that receptivity to scientific bullshit was positively correlated with belief in science, conservative political beliefs (especially social conservatism), and reliance on intuitive cognitive processes. The correlates of pseudo-profound bullshit receptivity followed a similar pattern (in terms of direction), but they also differed in terms of strength. Receptivity to scientific (vs. pseudo-profound) bullshit was more strongly correlated with belief in science, and less strongly correlated with conservative political beliefs and faith in intuition. These results suggest that although there is a general proclivity to accept bullshit statements (e.g., individuals with conservative beliefs are more likely to accept bullshit statements, regardless of context), the correlates of bullshit receptivity differ significantly across domains. For example, political conservatives may be particularly receptive to pseudo-profound bullshit, whereas those who believe in science may be particularly receptive to scientific bullshit.

To test the robustness of the above results, we also conducted analyses controlling for acquiescence bias, the overall tendency to agree with survey items, regardless of content. Acquiescence was positively correlated with both types of bullshit receptivity; however, controlling for acquiescence did not substantially change the extent to which either of our bullshit measures was correlated with belief in science, ideology, or cognitive style. We also compared receptivity and sensitivity to scientific bullshit statements. We observed that sensitivity to scientific bullshit, the ability to differentiate between factual and bullshit scientific statements, was positively correlated with belief in science and analytical thinking, and negatively correlated with conservative political beliefs and intuitive thinking. These results are in line with previous studies of sensitivity to pseudo-profound bullshit, which found that sensitivity was negatively correlated with reliance on heuristics and biases, and positively correlated with analytical thinking (Pennycook et al., 2015).

Finally, we examined how content knowledge (science literacy) influences sensitivity to different types of bullshit. We found that the correlation between the two types of bullshit receptivity was moderated by scientific literacy: the correlation between pseudo-profound bullshit receptivity and scientific bullshit receptivity was stronger (weaker) for individuals scoring low (high) in science literacy.

4.1  Political ideology and bullshit receptivity

Previous studies found that individuals with conservative political beliefs are more receptive to pseudo-profound bullshit (Pfattheicher & Schindler, 2016; Sterling et al., 2016). Our studies successfully replicated and extended this finding. Political conservatives (especially social conservatives) were consistently more receptive to pseudo-profound bullshit. Surprisingly, conservatives were also more receptive to scientific bullshit. This positive correlation is particularly striking, given the consistent negative correlations between conservative ideology and belief in science (r’s ≤ −.23). In other words, political conservatives are less trusting of science, yet they are still more receptive to scientific bullshit. Critically though, the correlation between ideology and scientific bullshit receptivity was significantly smaller than the correlation between ideology and pseudo-profound bullshit receptivity. This difference in correlations suggests that conservatives’ general mistrust of science only partially inoculates them against the allure of scientific bullshit.

4.2  Limitations

It is important to note that scientific bullshit receptivity is weakly correlated with some of our measures of interest, with many correlations falling between .10 and .20. These effect sizes are generally consistent with the results of other studies examining the correlates of bullshit receptivity (Čavojová et al., 2020; Nilsson et al., 2019). It remains to be seen whether scientific bullshit receptivity meaningfully predicts behavioral outcomes, such as individual differences in the effectiveness of science-based advertisements, or the willingness to follow the advice of scientists.

There are also potential issues with the items we used in our studies: In Studies 1 and 2, participants were presented with real and fake scientific statements; however, participants may have lacked the expertise to clearly differentiate between the two types of statements. In other words, the real scientific statements may have been too difficult for participants to properly evaluate. However, note that the average truthfulness rating of our real scientific statements (M = 3.01, SD = 0.68) was quite similar to the average profundity ratings of the motivational (i.e., non-bullshit) statements used in Pennycook et al. (2015): Study 3, M = 3.05, SD = 0.69; Study 4, M = 3.13, SD = 0.67. Moreover, the analyses we conducted using scientific bullshit sensitivity also produced results similar to prior analyses of the correlates of pseudo-profound bullshit sensitivity. Specifically, we found that sensitivity was positively correlated with analytical thinking (need for cognition) and negatively correlated with faith in intuition. Taken together, these pieces of evidence suggest that our measure of sensitivity worked as intended, despite the difficulty of our true scientific statements.

Finally, our measure of scientific bullshit receptivity focused on physical science, rather than specific scientific issues. Consider the correlation between conservative political ideology and acceptance of scientific bullshit. This correlation is likely to change depending on the specific context in which scientific bullshit appears. While there are some general effects of ideology on trust in science (Gauchat, 2012), individual attitudes towards science and scientists are also heavily shaped by the extent to which science hinders or supports personal goals (Farias et al., 2013; Kahan et al., 2012). In other words, general attitudes towards science do not necessarily correspond to attitudes towards specific scientific issues (e.g., global warming, or genetically modified organisms). Both liberals and conservatives are reluctant to accept dissonant science. Similarly, we expect that liberals and conservatives would be less willing to accept dissonant scientific bullshit.

4.3  Conclusion

Scientific knowledge can help individuals and organizations make better, evidence-based decisions. At the same time, irrelevant scientific jargon can be used to mislead or harm the public. We introduced a measure of scientific bullshit receptivity, the tendency to believe in nonsensical scientific statements. Scientific bullshit receptivity was strongly correlated with pseudo-profound bullshit receptivity, suggesting the existence of a domain-general tendency to accept bullshit. At the same time, the two types of bullshit differed in terms of how strongly they were associated with other individual difference measures, and we found that scientific literacy moderated the relationship between the two types of bullshit receptivity. Our results are an important step towards broadening the concept of bullshit receptivity.

References

Bauer, M. W. (2008). Paradigm change for science communication: commercial science needs a critical public. In D. Cheng et al. (Eds.), Communicating science in social contexts (pp. 7–25). London: Springer.

Beckwith, L. (2006). The dictionary of corporate bullshit: An A to Z lexicon of empty, enraging, and just plain stupid office talk. New York: Crown Archetype.

Cacioppo, J. T., & Petty, R. E. (1982). The need for cognition. Journal of Personality and Social Psychology, 42(1), 116–131.

Cacioppo, J. T., Petty, R. E., & Feng Kao, C. (1984). The efficient assessment of need for cognition. Journal of Personality Assessment, 48(3), 306–307.

Čavojová, V., Brezina, I., & Jurkovič, M. (2020). Expanding the bullshit research out of pseudo-transcendental domain. Current Psychology. https://doi.org/10.1007/s12144-020-00617-3

Cohen, G. A. (2002). Deeper into bullshit. In S. Buss and L. Overton (Eds.), Contours of agency: Essays on themes from Harry Frankfurt (pp. 321–339). Cambridge, MA: MIT Press.

Diedenhofen, B., & Musch, J. (2015). cocor: A comprehensive solution for the statistical comparison of correlations. PloS ONE, 10(3), e0121945.

Epstein, S., Pacini, R., Denes-Raj, V., & Heier, H. (1996). Individual differences in intuitive–experiential and analytical–rational thinking styles. Journal of Personality and Social Psychology, 71(2), 390–405.

Eriksson, K. (2012). The nonsense math effect. Judgment and Decision Making, 7(6), 746–749.

Everett, J. A. (2013). The 12 item social and economic conservatism scale (SECS). PloS ONE, 8(12), e82131.

Farias, M., Newheiser, A.-K., Kahane, G., & de Toledo, Z. (2013). Scientific faith: Belief in science increases in the face of stress and existential anxiety. Journal of Experimental Social Psychology, 49(6), 1210–1213.

Faul, F., Erdfelder, E., Buchner, A., & Lang, A.-G. (2009). Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behavior Research Methods, 41(4), 1149–1160.

Fowler, J. G., Carlson, L., & Chaudhuri, H. R. (2019). Assessing scientific claims in print ads that promote cosmetics: How consumers perceive cosmeceutical claims. Journal of Advertising Research, 59(4), 466–482.

Frankfurt, H. G. (2005). On bullshit. Princeton: Princeton University Press.

Gauchat, G. (2012). Politicization of science in the public sphere: A study of public trust in the United States, 1974 to 2010. American Sociological Review, 77(2), 167–187.

Gigerenzer, G., Gaissmaier, W., Kurz-Milcke, E., Schwartz, L. M., & Woloshin, S. (2007). Helping doctors and patients make sense of health statistics. Psychological Science in the Public Interest, 8(2), 53–96.

Goldacre, B. (2010). Bad science: Quacks, hacks, and big pharma flacks. Toronto: McClelland & Stewart.

Heath, Y., & Gifford, R. (2006). Free-market ideology and environmental degradation: The case of belief in global climate change. Environment and Behavior, 38(1), 48–71.

Hittner, J. B., May, K., & Silver, N. C. (2003). A Monte Carlo evaluation of tests for comparing dependent correlations. The Journal of General Psychology, 130(2), 149–168.

Hopkin, J., & Rosamond, B. (2018). Post-truth politics, bullshit and bad ideas: ‘Deficit fetishism’ in the UK. New Political Economy, 23(6), 641–655.

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732–735.

Larsen, M. E., Huckvale, K., Nicholas, J., Torous, J., Birrell, L., Li, E., & Reda, B. (2019). Using science to sell apps: evaluation of mental health app store quality claims. NPJ Digital Medicine, 2(1), 1–6.

Mayo, R. (2015). Cognition is a matter of trust: Distrust tunes cognitive processes. European Review of Social Psychology, 26(1), 283–327.

Nilsson, A., Erlandsson, A., & Västfjäll, D. (2019). The complex relation between receptivity to pseudo-profound bullshit and political ideology. Personality and Social Psychology Bulletin, 45(10), 1440–1454.

Oberski, D. (2014). lavaan.survey: An R package for complex survey analysis of structural equation models. Journal of Statistical Software, 57(1), 1–27.

Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10(6), 549–563.

Pennycook, G., & Rand, D. G. (2017). Who falls for fake news? The roles of analytic thinking, motivated reasoning, political ideology, and bullshit receptivity. SSRN Electronic Journal, 1–63.

Pfattheicher, S., & Schindler, S. (2016). Misperceiving bullshit as profound is associated with favorable views of Cruz, Rubio, Trump and conservatism. PloS ONE, 11(4), e0153419.

Pinker, S. (2018). Enlightenment now: The case for reason, science, humanism, and progress. New York: Penguin.


Table A1: Comparing zero-order and semi-partial correlates of bullshit receptivity (Study 2).

                         Pseudo-profound bullshit                     Scientific bullshit
                         Zero-order   Semi-partial   Z      p         Zero-order   Semi-partial   Z      p
Belief in science        −.06         −.06           0.92   .36       .03          .02            0.74   .45
Social conservatism      .30          .28            3.40   < .001    .21          .19            3.56   < .001
Economic conservatism    .09          .08            1.6    .10       .12          .10            1.6    .11
Faith in intuition       .39          .35            5.63   < .001    .25          .21            5.72   < .001
Need for cognition       −.04         −.09           7.1    < .001    −.06         −.10           7.10   < .001

Preston, J., & Epley, N. (2009). Science and God: An automatic opposition between ultimate explanations. Journal of Experimental Social Psychology, 45(1), 238–241.

Rammstedt, B., & John, O. P. (2007). Measuring personality in one minute or less: A 10-item short version of the Big Five Inventory in English and German. Journal of Research in Personality, 41(1), 203–212.

Rammstedt, B., Kemper, C. J., & Borg, I. (2013). Correcting Big Five personality measurements for acquiescence: An 18-country cross-cultural study. European Journal of Personality, 27(1), 71–81.

Sánchez, M. A., & Parrott, W. A. (2017). Characterization of scientific studies usually cited as evidence of adverse effects of GM food/feed. Plant Biotechnology Journal, 15(10), 1227–1234.

Sterling, J., Jost, J. T., & Pennycook, G. (2016). Are neoliberals more susceptible to bullshit? Judgment and Decision Making, 11(4), 352–360.

Sunstein, C. R. (2018). #Republic: Divided democracy in the age of social media. Princeton: Princeton University Press.

Turpin, M. H., Walker, A. C., Kara-Yakoubian, M., Gabert, N. N., Fugelsang, J. A., & Stolz, J. A. (2019). Bullshit makes the art grow profounder. Judgment and Decision Making, 14(6), 658–670.

Weisberg, D. S., Keil, F. C., Goodstein, J., Rawson, E., & Gray, J. R. (2008). The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience, 20(3), 470–477.

Weisberg, D. S., Taylor, J. C., & Hopkins, E. J. (2015). Deconstructing the seductive allure of neuroscience explanations. Judgment and Decision Making, 10(5), 429–441.

Appendix

Bullshit receptivity and acquiescence bias (Study 2)

In our second study, we examined how controlling for acquiescence bias changes the correlations between the two types of bullshit receptivity and the following variables: belief in science, political ideology (social conservatism and economic conservatism), and cognitive style (faith in intuition and need for cognition).

We used the following procedure to estimate semi-partial correlations controlling for acquiescence: First, we regressed the two types of bullshit receptivity on acquiescence and saved the model residuals. Next, we tested the correlations between these model residuals and the variables of interest (e.g., belief in science, etc.). Finally, we used the Hittner et al. (2003) method to compare the zero-order and semi-partial versions of each correlation. The results of this procedure are reported in Table A1. Controlling for acquiescence significantly reduced the extent to which the two types of bullshit receptivity were correlated with social conservatism and faith in intuition, and significantly increased the extent to which both types of bullshit receptivity were correlated with need for cognition (i.e., the correlations became more negative). Importantly, although controlling for acquiescence altered the strengths of these correlations, this did not substantially change our pattern of results.
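
A minimal sketch of this procedure (placeholder variable names, complete cases assumed; not the authors' exact script) looks as follows:

    library(cocor)

    # Step 1: residualize each bullshit receptivity score on acquiescence
    pp_resid  <- resid(lm(pp_bs_receptivity  ~ acquiescence, data = d))
    sci_resid <- resid(lm(sci_bs_receptivity ~ acquiescence, data = d))

    # Step 2: semi-partial correlation with a variable of interest (e.g., social conservatism)
    cor.test(pp_resid, d$social_conservatism)

    # Step 3: compare the zero-order and semi-partial correlations (Hittner et al., 2003),
    # which overlap in the social conservatism variable
    cocor.dep.groups.overlap(r.jk = cor(d$pp_bs_receptivity, d$social_conservatism),
                             r.jh = cor(pp_resid, d$social_conservatism),
                             r.kh = cor(d$pp_bs_receptivity, pp_resid),
                             n = nrow(d), test = "hittner2003")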


*
Department of Social Psychology, Tilburg University, P.O. Box 90153, 5000 LE Tilburg, The Netherlands. Email: tony.m.evans@gmail.com. ORCID 0000-0003-3345-5282.
#
Tilburg University, 0000-0001-9058-3817.
$
University of Groningen, 0000-0003-3399-9220.

Copyright: © 2020. The authors license this article under the terms of the Creative Commons Attribution 3.0 License.

1
Before running our first study, we hypothesized a priori that there would be a null or negative correlation between the two types of bullshit receptivity. To preview our results: there was instead a strong positive relationship.
2
We restricted each study to only U.S. American participants and prohibited repeat participants in the two MTurk studies.
3
Note that we asked participants to rate how the statements sounded, not whether the statements were actually scientific or profound.
4
Our original analysis plan, as shown in our pre-registrations for the first two studies, was to measure the correlations between the two types of bullshit receptivity and potential correlates.
