Can an image, sound, video, or string of words influence the human mind so strongly that the mind is actually harmed or controlled? Cosmoso takes a look at technology and the theoretical future of psychological warfare with Part Three of an ongoing series.
A lot of the responses I got to the first two installments talked about religion being weaponized memes. People do fight and kill on behalf of their religions and memes play a large part in disseminating the message and information religions have to offer.
The curved bullet meme is a great one. Most of the comments I see associated with this image have to do with how dumb someone would have to be to believe it would work. Some people have an intuitive understanding of spatial relations. Others have a level of education in physics or basic gun safety and feel alarm bells going off well before they’d try something this dumb. It’s a pretty dangerous idea to put out there, though, because a percentage of the people the image reaches could try something stupid. Is it a viable memetic weapon? Possibly! I present to you, the curved bullet meme.
The dangers here should be obvious. The move starts with “begin trigger-pull with pistol pointed at chest (near heart)”, and anyone taking it seriously beyond that point is Darwin Award material.
Whoever created this image had no intention of anyone actually trying it. So, in order for someone to fall for this fairly obvious trick, they’d have to be pretty dumb. There is another way people fall for tricks, though.
There is more than one way to end up the victim of a mindfuck, and while ignorance is part of a lot of them, ignorance can actually be induced. In the case of religion, several giant pieces of information or ways of thinking must be gotten wrong before someone can believe that the earth was coming to an end in 2012, or that the creator of the universe wants you to burn in hell for eternity for not following the rules. By trash-talking religion in general, I’ve just made a percentage of readers angry, and that’s the point. Even if you take all the other criticisms of religion out of the mix, we can agree that religion puts its believers in the position of becoming upset or outraged by very simple graphics or text. As a non-believer, a lot of the things religious people say sound as silly to me as the curved bullet graphic seems to a well-trained marksman.
To oversimplify it further: religions are elaborate, bad advice. You can inoculate yourself against that kind of meme but the vast majority of people out there cling desperately, violently to some kind of doctrine that claims to answer one or more of the most unanswerable parts of life. When people feel relief wash over them, they are more easily duped into doing what it takes to keep their access to that feeling.
There are tons of non-religious little memes out there that simply mess with anyone who follows bad advice. It can be a prank but the pranks can get pretty destructive. Check out this image from the movie Fight Club:
Think no one fell for this one? For one thing, it’s from a movie, and in the movie it was a mean-spirited prank that maybe some people fell for. Go ahead and google “fertilize used motor oil”, though, and see how many people are out there asking questions about it. It may blow your mind…
Intuition: the ability to understand something immediately, without the need for conscious reasoning.
People tend to trust their own intuition. Has there been much formal study about the veracity of intuition?
Brain science is a young field, and its terminology has yet to mature into a solid academic lexicon. To further increase your chances of being confused, modern life is rife with distractions, misinformation, and addictive escapisms, leaving the vast majority of society with no real idea what the hell is happening.
To illustrate my point, I’m going to do something kind of recursive. I am going to document my mind being changed about a deeply held belief as I explore my own cognitive bias. I am not here to tell you what’s REALLY going on or change your mind about your deeply held beliefs. This is just about methods of problem solving and how cognitive bias can become a positive aspect of critical thought.
Image: “Soft Bike” sculpture by Mashanda Lazarus http://www.ilovemashanda.com/
I’m advocating what I think is the best set of decision-making skills: Critical Thought. The National Council for Excellence in Critical Thinking defines critical thinking as “the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.” (I’m torn between the terms Critical Thinking and Critical Thought, although my complaint is purely aesthetic.)
Ever since taking an introduction to Logic course at Fitchburg State College, I have been convinced that logic is a much more reliable, proven way to make decisions. Putting logic into practice when making decisions is difficult, though. Just as a math problem can be done incorrectly, logic can be misapplied, and some of it is downright counter-intuitive. My favorite example of intuition failing where logic succeeds is chess. Even as I write this I can’t convince myself otherwise: I have regretted every intuitive chess move. It’s statistically improbable that all my intuitive moves have been bad moves, yet logic works so much better in the game that my mind has overcompensated in favor of logic. In the microcosm of chess rules, logic really is the better decision-making tool. Often the kernel of a good move jumps out at me as intuition, but it must still be thoroughly vetted with logic before I can confidently say it’s a good move.
In high school, I was an underachiever. I could pass computer science and physics classes without cracking a book, but my attempt to coast through math classes the same way left me struggling, because I could not intuitively grasp the increasingly abstract concepts. The part of my mind that handles logic was healthy and functioning, but my distrust of my own intuition was a handicap. I would be taking make-up mathematics courses in the summer while collecting debate team trophies during the school year.
I’m not just reminiscing; everyone’s decision-making process is a constantly updating algorithm of intuitive and logical reasoning. No one’s process is exactly the same, but we all want to make the best decisions possible. For me, it’s easy to rely on logic and ignore even a nagging sense of intuition. Some people trust intuition strongly yet struggle to find the most logical decision; everyone is most comfortable using a specially tailored mix of intuition and logic. People argue on behalf of their particular decisions and the methodology behind them because a different method is useful in each paradigm.
In chess, intuition is necessary but should be used sparingly and tempered with logic. It’s my favorite example because the game can be played without any intuition at all: even simple chess programs can beat the average human, and the strongest can beat chess masters. So I’m biased toward logic. Chess is just a game, though, and people are always telling me I should have more faith in intuitive thinking.
“But,” you should be asking, “isn’t there an example where reliance on intuition is the best way to decide how to proceed?”
At least that’s what I have to ask myself. The best example I have found of valuable intuition is the ability to ride a bike. It is almost impossible to learn to ride a bike in one session; it takes several tries over a week or longer to create the neural pathways needed to operate this bio-mechanical device. Samurai trained to feel that their weapon was a part of themselves, an extension of their very arm. In the same way, the mechanical motion of the human body as it drives a bicycle becomes ingrained, literally, in the physical brain. The casual, ubiquitous expression “It’s like riding a bike” idiomatically describes anything that can be mastered at an intermediate level, forgotten for years, and then recalled at near-perfect fidelity when encountered once again.
The Backwards Brain Bicycle – Smarter Every Day episode 133
Destin at Smarter Every Day put together a video that shows the duality of intuitive thinking. It is entirely possible to train the human mind with complicated decision-making algorithms that embrace diversification and even contradictory modes of thinking.
After watching this video, I embraced a moment of doubt and realized that there are very positive and useful aspects to intuition that I often don’t acknowledge. In this case of reversed bicycle steering, a skill that seems to work only after it has been made intuitive can be “lost” and regained only with a somewhat cumbersome level of concentration.
The video demonstrates the undeniable usefulness of what essentially amounts to anecdotal proof that neural pathways can be hacked and that contradictory new skills can be learned. It also shows that a paradigm of behavior can gain a tenacious hold on the mind via intuitive skill. It casts doubt on intuition in one respect, but without at least some reliance on this intuitive paradigm of behavior, it seems we wouldn’t be able to ride a bike at all.
This video forced me to acknowledge the usefulness of ingrained, intuitive behaviors while also reminding me how strong a hold intuition can have over the mind. Paradigms can be temporarily, or perhaps permanently, lost. In the video, Destin has trouble switching back and forth between the two seemingly all-consuming thought systems, but the transition itself can become part of a more complicated thought algorithm, allowing the mind to master and embrace contradictory paradigms by trusting the integrity of the overall algorithm.
Including Confirmation Bias in a greater algorithm.
These paradigms can be turned on and off, just as a worker might get used to driving an automatic transmission car to work, operating a stick-shift truck at the job site, and driving home in the automatic again after the shift.
This ability to turn intuitive paradigms on and off as a controlled feature of a greater logical algorithm requires the mind to acknowledge confirmation bias. I get a feeling of smug satisfaction any time I see evidence supporting my belief that logic comprises the greater framework of a possible decision-making process. There are just as many people out there who would view intuition as the framework of a complex decision-making process, with logical thought as merely a contributing part of a superior whole. If my personal bias toward logic over intuition is erroneous in some situations, can I trust the mode of thinking I am in? Using myself as an example, the relief I feel when data confirms what I have already accepted as true is powerful.
That feeling of relief must always be noted and kept in check before it overshadows the ability to acknowledge data that opposes the belief. Understanding confirmation bias is the key to adding that next level to the algorithm. In the video example from Smarter Every Day, steering a normal bike is so ingrained in the neural pathways that, when the backwards steering fails to confirm expectations, the mind fills in the blank and sends an incorrect set of mechanical instructions to the body. Understanding the dynamics of confirmation bias would enable the mind to embrace a greater thought system that can move back and forth between those conflicting behavioral paradigms. I’m positing that it should be possible to master both a regular bike and the “backwards bike” and to switch between them in quick succession; the neural pathways behind both behavior paradigms can be trained and made stronger than the video shows.
I believe that with practice, someone could alternate between the two steering mechanisms quickly and without as much awkwardness as we see in the video. In the same way, my initial confirmation bias, now identified, doesn’t have to dictate my decisions, and I might be more open-minded to an intuitive interpretation leading to the best decision in certain situations.
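The “greater algorithm” here is a metaphor, but it can be sketched in code. Below is a toy Python model of my own invention, not anything from the video or from brain science: it blends an intuitive hunch with a logical evaluation and deliberately discounts evidence that merely confirms a prior belief. Every name, weight, and score in it is made up purely for illustration.

```python
# Toy sketch: a decision process that blends intuition and logic,
# down-weighting evidence that agrees with an existing belief so
# confirmation bias can't quietly pile up. Purely illustrative.

def decide(intuition_score, logic_score, prior_belief, evidence,
           logic_weight=0.7):
    """Return a blended decision score.

    All scores are in [-1, 1]; positive means "act", negative means
    "hold off". `prior_belief` is the conclusion already held, and
    each item in `evidence` is a new observation's score.
    """
    adjusted = []
    for e in evidence:
        agrees_with_prior = (e > 0) == (prior_belief > 0)
        # Confirming evidence is discounted by half; disconfirming
        # evidence is kept at full strength.
        adjusted.append(e * 0.5 if agrees_with_prior else e)

    evidence_term = sum(adjusted) / len(adjusted) if adjusted else 0.0
    blended = logic_weight * logic_score + (1 - logic_weight) * intuition_score
    return blended + evidence_term

# A strong hunch, lukewarm logic, and evidence that mostly just
# confirms what was already believed.
score = decide(intuition_score=0.8, logic_score=0.1,
               prior_belief=1.0, evidence=[0.9, 0.8, -0.6])
print("act" if score > 0 else "hold off")
```

The interesting knob is the discount on confirming evidence: set it to 1.0 and the model happily feeds its own bias; set it too low and it ignores genuine support. Tuning that knob consciously is, roughly, what this whole section is arguing a critical thinker should do.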
An inability to acknowledge that one’s own mind might be susceptible to confirmation bias paradoxically makes one more susceptible. Critical thinking is a method of building immunity to this common trap of confidence, and identifying the experience of one’s own confirmation bias is a great way to understand and control this intuitive tendency. No matter what your thoughts are regarding logic and intuition, examining your confirmation biases and accounting for them should lead to better decision-making skills.