Most of y'all have figured out that I work in specific areas within Pharma, since I get really detailed about some subjects and am a lot more careful about pontificating on others. One of my areas of specialization is neuroscience, specifically cognition, movement disorders, and pain perception. As far as I'm concerned, there's far too much woo and there are far too many irreproducible results (or results that no one has bothered to replicate, which amounts to the same thing) in mainstream neuroscience and psychiatry. HOWEVER. Those disciplines also contain the tools to clean up their own mess, and the fields are beginning to do so. Rejecting those fields because they have sometimes fallen prey to sloppy technique and "eminence-based medicine" is a mistake - especially if the alternative you turn to isn't based on the scientific method at all. Which is to say that, as a strict materialist, I think we should be looking to cognitive science if we want to reach and expand human potential, not to un- and anti-scientific speculation.

One of the recurring themes in my own personal quest to make sense of the world around me is: how can I tell shit from Shinola? (For those of you not from the US, that saying came about because Shinola was a popular brand of black shoe polish. The saying has outlived the company.) I've fallen for my own share of horse shit from sources I trusted, and while you can't protect yourself from *every* con and still function in the world, it behooves every thinking person to get better at this.

It has long been known that the human brain will find patterns where there are none, because of our innate desire to make sense of the world. Actually, even birds engage in this behavior:

"Even pigeons seem subject to illusory pattern perception. In a classic study by Skinner (1948), hungry pigeons received food at regular time intervals, and as a result, the pigeons increasingly started doing whatever they were doing the last time that they received food."
As noted by Skinner, "The experiment might be said to demonstrate a sort of superstition. The bird behaves as if there were a causal relation between its behavior and the presentation of food, although such a relation is lacking."

Some people deal with randomness and uncertainty better than others, but all humans tend to make more false attributions when under stress:

"The desire to make sense of the world is of particular importance to people when they lack control (Park, 2010) or when they are uncertain (Van den Bos, 2009). Consistently, empirical findings reveal that people are particularly likely to believe conspiracy theories when they lack control or are uncertain (Newheiser, Farias, & Tausch, 2011; Marchlewska, Cichocka, & Kossowska, in press; Sullivan, Landau, & Rothschild, 2010; Van Prooijen, 2016; Van Prooijen & Acker, 2015; Van Prooijen & Jostmann, 2013; for a review, see Kossowska & Bukowski, 2015). Likewise, lacking control or experiencing feelings of uncertainty have been found to increase supernatural beliefs, in the form of superstition (Whitson & Galinsky, 2008), belief in horoscopes (Wang, Whitson, & Menon, 2012), and increased religiosity (Hogg, Adelman, & Blagg, 2010; Kay, Gaucher, McGregor, & Nash, 2010). These findings are consistent with the idea that irrational beliefs are rooted in pattern perception, as establishing relevant patterns makes an unpredictable, uncertain, and potentially threatening environment more predictable. Indeed, control threats have been found to increase the extent to which people misperceive patterns in randomness, and these findings closely mirrored the effects of control threats on irrational beliefs in consecutive experiments (Van Harreveld, Rutjens, Schneider, Nohlen, & Keskinis, 2014; Whitson & Galinsky, 2008)."

The scientific article I linked to is pretty dense, but there is also a pretty decent lay overview, here.
The question that jumps out at me, though, is: what can I do about this tendency in my own thinking? One finding from this paper stands out in that regard:

"Following a manipulation of belief in one conspiracy theory, people saw events in the world as more strongly causally connected, which in turn predicted unrelated irrational beliefs."

In other words, after the researchers had participants read about a conspiracy theory, those participants started seeing patterns in coin tosses and other experimental stimuli that they had previously (correctly) attributed to random chance. The practical upshot: if a few hours of reading had that kind of impact, imagine the impact of marinating yourself in this kind of stuff for years. Hell, old-timers on the board don't need to imagine; they can think back to a now-banned, formerly prolific poster who, while he had done some significant things to help defeat the cult, not only had a tinfoil hat, he had the tinfoil body armor, leggings, and codpiece to go with it.

From what I've gathered, being in the Co$, especially at the higher levels, is a lot like marinating your brain in conspiracy theories. Once out, getting away from the echo chamber, spending some time on unrelated activities, and/or spending time in the company (virtual or otherwise) of people whose beliefs conflict with yours may help restore normal, healthy cognitive function.