Information Theory as a Shield Against Psychological Warfare
We live in an age of psychological manipulation. Information theory gives us a powerful perspective for protecting ourselves from the propaganda, gaslighting, and disinformation that surround us.
In the last chapter of his letter to the Ephesians, the Apostle Paul counsels readers to put on the full armor of God and thereby protect their hearts and minds against the wiles of the enemy. Nowadays we face many wily enemies with political clout who think they know what’s best for us, but who really don’t have our best interests at heart and, if we resist their influence, mean us harm.
One of those pieces of armor that Paul describes in his letter to the Ephesians is a shield (“the shield of faith”). I want in this essay to describe another shield. It is a shield inspired by information theory. It is a shield that can do us much good in the current political situation, fraught as it is with psychological manipulation. Note to non-technical readers: Calm your hearts. What use this essay makes of information theory is minimal and non-technical.
To understand information theory’s use as a shield against psychological operations, let’s begin with a simple example that contains all the crucial elements. Back in the Middle Ages, the kings of Sweden and Norway disputed who owned the island of Hising. Instead of going to war over the island, they decided to throw a pair of dice to determine its ownership, the winner being the one who threw the highest number, ties being broken by throwing again.
First the king of Sweden threw the dice. They came up double sixes, giving a total of twelve. He therefore thought he had the island in the bag. But when the king of Norway threw the dice, one landed six and the other broke in two with a six on one side and a one on the other. This gave a total of thirteen. The island thus went to the king of Norway (who happened to be Saint Olaf, famous for miracles).
All the key elements from information theory that we need to protect ourselves against psychological warfare are in this story. First off, information here, as everywhere, takes the form of narrowing down possibilities. The more possibilities are narrowed down, the more information is conveyed. You learn progressively more information about me when you learn I was born in the US, in Illinois, in Chicago, in Bucktown, in a few-block radius near Armitage and Western. In thus narrowing down where I was born, I give you more and more information.
In the Saint Olaf example, the possibilities to be narrowed down were the rolls of two dice. Any roll realized a possibility, and thus produced information. Information theory quantifies the amount of information in these possibilities in terms of probabilities (rolling double sixes, for instance, has a probability of 1/36), but we don’t need that aspect of information theory for our purposes. What’s crucial here is to be clear about what the possibilities are and which are realized.
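For readers who like to see the arithmetic, here is a minimal sketch in Python of the quantification just mentioned. It uses the standard Shannon measure of self-information; the dice probabilities come straight from the story above, and the function name surprisal_bits is simply my label for it. Nothing later in the essay depends on the calculation.

```python
import math

def surprisal_bits(probability: float) -> float:
    """Shannon self-information in bits: the more an outcome narrows down
    the possibilities, the lower its probability and the more bits it carries."""
    if probability == 0:
        return math.inf  # the frame treated this outcome as impossible
    return -math.log2(probability)

# Within the standard frame (two intact dice, 36 equally likely outcomes):
print(surprisal_bits(1 / 6))    # one die shows a six:  ~2.58 bits
print(surprisal_bits(1 / 36))   # double sixes:         ~5.17 bits

# The king of Sweden's frame assigned probability 0 to any total above twelve.
# Saint Olaf's total of thirteen therefore registers as infinitely surprising
# in that frame -- a sign that the frame, not the outcome, was at fault.
print(surprisal_bits(0))        # inf
```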
But this is where things can get tricky, namely, how do we know what all the possibilities are? To the king of Sweden, who thought that his double sixes virtually assured him of taking possession of the island of Hising, any roll of the dice would show exactly two faces and the sum of those faces could not exceed twelve. Yet when the king of Norway threw the dice and one broke in two, their faces summing to seven, and the other landed six, the grand total coming to thirteen, his roll beat the king of Sweden’s roll.
The point to note here is that in situations where possibilities are realized and information is thus produced, we may be wrong about the range of live possibilities actually in play. With information, it’s not just a matter of which possibilities happen but also of the frame within which they happen. Spoiler alert: Many psyops depend on misrepresenting the frame within which the possibilities happen.
To take a simple and widely used example of a psyop, consider the common device whereby parents get their children to do something that they know the kids don’t want to do, such as eat vegetables. The parents therefore put a carefully crafted choice to the kids: do you want to eat green beans or Brussels sprouts? Capitalizing on the inherent human drive for autonomy, the parents seem to be empowering their children to choose what they like. But of course, they’ve set up the frame to artificially limit the choices.
A precocious child might answer, “It’s clever the way you arranged this choice architecture for me, but the fact is that I prefer neither green beans nor Brussels sprouts. I see that there’s chocolate chip ice cream in the freezer, and if it’s just the same to you, I’ll go with that. The information-theoretic frame you gave me is too limited. In the interest of my palate, I’m therefore expanding the frame to include options that I actually enjoy eating.”
Well, that would be quite the precocious child. When my kids were young, a couple that I knew whose kids were grown counseled me to use this false-choice device to get my kids to do what I wanted them to do (rationalized in terms of advancing their best interests—where have we heard that?). I remember something in me rebelling against this advice. It struck me as dishonest and manipulative. So, when I had the chance, I explained to my kids that if people give them a set of choices, they should always ask themselves what choices might have been omitted.
For the record, if a precocious child had tried to convince me that chocolate chip ice cream should be on the table for discussion along with green beans and Brussels sprouts, I would lovingly have pointed out that for their own good they were limited to the choices I was giving them. By contrast, the problem is that when politicians present us with false choices, it’s typically for their benefit, not ours. Moreover, they often obfuscate the true choices, thereby insulting our intelligence, treating us as children who need to be manipulated.
People who study logic and rhetoric have a name for what I’m describing here: the fallacy of the false choice, also called the false dilemma or false dichotomy. Described that way, the fallacy consists in forcing a choice between two options while ignoring a viable third option. Yet from the vantage of information theory, there’s nothing essential in this fallacy about having two options and ignoring a third. There could be more options. There could be fewer. It’s a question of what the frame is and which possibilities lie inside it. Only if these are accurately identified can we avoid psychological manipulation.
It could be, for instance, that the powers that be present only one option to us in the frame. Historically, this has been called Hobson’s choice, after Thomas Hobson, a stable owner in Cambridge, England, who lived a few hundred years ago. He offered his customers a choice of taking the horse nearest the stable door or taking none at all. Hobson’s choice describes situations where only one real option is available.
Hobson’s choice afflicts much of our current political discussion. On hot-button topics like climate change or vaccine safety, the political and media mainstream sees the science as “settled” and brooks no dissent. The (information-theoretic) frame in these cases allows exactly one possibility, and to reject that possibility is to go beyond the pale—it is to be blacklisted, cancelled, demonized, to be a horrible person, to be a rube, to be harming others through dis-mis-mal-information.
Consider, for instance, vaccine safety. Vaccines, we are told to no end, are safe and effective. To question vaccine safety and effectiveness is to receive a swift smackdown from big pharma and mainstream medicine. To insist that vaccines pose no dangers, or dangers so small that their good vastly outweighs any negatives, is to impose a Hobson’s choice. It misframes the debate about vaccines as though there is no debate. Emmy Award-winning investigative journalist Sharyl Attkisson makes clear how this misframing serves not the truth but special interests. As she writes in her recently released (September 2024) and ironically titled Follow the $cience:
Has your doctor warned you about potential side effects each time you or a loved one got a vaccine? Under established ethics guidelines, you should be informed about everything from the risk of paralysis to brain damage and death, depending on the vaccine. If you weren’t told of these risks, then did you truly give your informed consent to be vaccinated?
It was a long time ago, but I recall being surprised when I first discovered public health officials using official channels to perpetrate vaccine misinformation. I was startled by their manipulation of news and information. A large part of their propaganda campaign revolves around mislabeling accurate facts as “disinformation,” smearing certain scientists, and falsely attacking people as “anti-vaccine.”
Let me pause to state something that should be obvious. It’s not “anti-vaccine” to ask questions, research, or report about vaccine safety. In all my years of investigating topics, this one stands alone in terms of the magnitude of orchestrated pushback it draws. When I broke international news about deadly rollovers of Ford Explorers outfitted with Firestone tires, nobody suggested I was “anti-car” or “anti-tire.” That would be absurd. When I uncovered fraud at the Red Cross involving 9/11 donations, nobody suggested I was “anti–Red Cross,” “anti-charity,” or “anti–9/11 donations.”
When I investigated other drug safety issues, nobody considered me to be “anti-medicine.” If I were to talk about studies showing that some people are allergic to penicillin, it wouldn’t make me “anti-penicillin.” Ask yourself why the game changes when it comes to vaccines. The fact that people feel compelled to say, “I’m not anti-vaccine . . .” before making perfectly grounded statements or asking rational questions speaks to the success of one of the most influential propaganda movements of our time.
One of the cruelest things our government does is smear the poor parents of vaccine-injured children. For daring to speak publicly about what happened to their loved ones, these parents are attacked by public health officials and held up to ridicule by the media. Parents have told me the government has proven vindictive and they fear if they don’t keep their mouths shut, the government will take back payments awarded by courts to care for their vaccine-injured children. They’re literally bullied into silence. I think part of the reason why vaccine interests are so intent on neutralizing parents is that parents are the most important and credible spokesmen when it comes to vaccine safety. By definition, they weren’t “anti-vaxxers.” They vaccinated their children.
All medicine has side effects. But in today’s manipulated information landscape, efforts to learn the most about side effects of vaccines, products given multiple times to virtually every American, are actively discouraged. We’re made to think that questions are not even to be raised. This is the antithesis of good science and public health. Without risks being addressed, our national vaccine program is neither as safe nor as effective as it could be. The fact that even medical professionals who should know better treat vaccine safety as a third rail not to be touched makes no logical sense and serves as an important giveaway that a commanding narrative is in play.
By way of truth in advertising, this question of vaccine safety and effectiveness hits close to home. I have a severely autistic son who is now twenty-three, who was developmentally on track until he received a round of vaccines when he was two, after which he regressed. To this day he is non-verbal and requires a lot of care. His is hardly the only case I know. A friend of mine has a cousin who about sixty years ago got a jab from a bad batch of vaccines, which gave him hypoxia, leading to brain damage and causing him to operate at the level of a four-month-old. Last I heard his parents were still taking care of him in their home. And unless you’re an ostrich with your head in the sand, you’ll know people who suffered real physical harm from the Covid vaccines.
True enough, this is anecdotal evidence, and anecdotal evidence is not smoking-gun evidence. But it’s a start. The trouble is that those who present us with a Hobson’s choice on this topic are not willing to follow any evidence that might question the hallowed status of vaccines. Instead, they suggest that only lunatic conspiracy theorists would question the safety and effectiveness of vaccines and that “vaccine hesitancy,” as they call it, must be extirpated in the strongest way possible. If you want better than anecdotal evidence on this topic, look at Neil Z. Miller’s Critical Vaccine Studies: 400 Important Scientific Papers Summarized for Parents and Researchers (2016) and Robert F. Kennedy Jr.’s Vax-Unvax: Let the Science Speak (2023). The latter contrasts the health of populations that did and did not receive vaccinations, consistently finding better health with the unvaccinated.
The classic fallacy of the false choice gives just two options and fails to mention a viable third option. The Hobson’s choice fallacy gives just one option, dismissing all other options as so terrible and unthinkable that they must not be considered. A good rule of thumb when people regard something as unthinkable is to ask whether the underlying problem is that they’ve lost the capacity to think—whether their biases and preferences have led them astray by dulling their ability to think critically.
From an information-theoretic perspective, a false dichotomy (two possibilities) and a Hobson’s choice (only one possibility) are not the only ways to be misled by information. In sales, it’s common to foist three options on a potential client by using one as a decoy to drive sales to one of two other options. This use of decoys happens everywhere, from the shelves of a supermarket to the homes that a real estate agent is trying to sell.
Suppose a real estate agent really wants to sell a particular house, call it house A. And let’s say that a client has indicated interest in looking at house B. To help in selling house A, the real estate agent can then have the client also look at house C, the decoy. The point about the decoy is that it has to be similar to house A, but house A must be clearly better (newer, bigger, cheaper). The decoy then helps drive the client to house A because A is so much better a deal than house C, making it seem that the real choice is between those two. The decoy, house C, at once drives interest toward house A and away from the competing house B, which is significantly different from both A and C (perhaps A and C are ranch houses while B is a two-story house).
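For the technically inclined, here is a toy sketch in Python of the decoy mechanism just described. The houses, the attribute numbers, and the simple pairwise-dominance heuristic are all invented for illustration; they make no claim about how real buyers or agents actually reason.

```python
# Toy sketch of the decoy (asymmetric-dominance) effect described above.
# Attributes where higher is better: (square footage, negative price in $1000s).
houses = {
    "A": (1800, -300),  # the ranch house the agent wants to sell
    "B": (2000, -320),  # the two-story house the client asked about
    "C": (1700, -310),  # the decoy: similar to A but worse on both counts
}

def dominance_wins(options):
    """For each option, count how many others it beats on every attribute,
    mimicking a chooser who leans on easy side-by-side comparisons."""
    wins = {name: 0 for name in options}
    for name, attrs in options.items():
        for other, other_attrs in options.items():
            if name != other and all(a > b for a, b in zip(attrs, other_attrs)):
                wins[name] += 1
    return wins

# Without the decoy, neither A nor B clearly dominates the other:
print(dominance_wins({k: houses[k] for k in ("A", "B")}))  # {'A': 0, 'B': 0}

# Add the decoy and A suddenly looks like the obvious winner, even though
# nothing about A or B has changed:
print(dominance_wins(houses))  # {'A': 1, 'B': 0, 'C': 0}
```

The sketch only dramatizes the mechanism: house C adds no real value to the frame; it merely changes which comparisons feel easy to make.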
Decoys are psychologically very effective, and unless we are self-aware, we often will not know that we are being manipulated in this way. Perhaps the best way to deal with decoys, even if we are not conscious that we are dealing with them, is to ask ourselves, whenever we are confronted with three options, whether one might be a decoy, and also to insist on seeing additional options if possible. Arguably Joe Biden, after his miserable June 2024 debate with Donald Trump, served as a decoy for Kamala Harris, driving enthusiasm for her candidacy for president not because of her inherent merit, but because Biden, with his cognitive decline, seemed so much worse than she and because the two of them together were so diametrically opposite to Trump.
Information theory helps us to defend against psychological manipulation by questioning the frame in which possibilities are presented to us and by getting clear on what those possibilities actually are. The size of the frame and the number of possibilities in it don’t matter. What matters is getting clear on the frame, whether it needs to be expanded or can be contracted, and whether we’ve clearly identified and enumerated all the possibilities that are actually in play.
In using information theory as a shield against psychological warfare, it is always helpful to enlist a devil’s advocate. The idea of a devil’s advocate originated in the Roman Catholic Church during the 16th century to keep the canonization process for potential saints honest. The devil’s advocate was tasked with critically examining the life, virtues, and alleged miracles of candidates for sainthood, presenting arguments against their canonization. He would rigorously challenge the evidence and raise doubts to ensure that only the most worthy individuals were elevated to sainthood. Although Pope John Paul II abolished the role of devil’s advocate in 1983, the term has since entered common usage to describe someone who argues against a position or idea, not necessarily out of personal conviction, but to test its strength and probe its weaknesses.
The role of the devil’s advocate in Roman Catholicism parallels the role of dissent in the philosophy of John Stuart Mill. Mill argued in his classic work On Liberty that dissent is crucial for healthy and productive discourse because it helps to challenge prevailing opinions, test their validity, and stimulate critical thinking. Mill believed that even when a prevailing view is correct, it risks becoming dogmatic if not regularly confronted by opposing perspectives. He held that a society where opposing views are freely expressed is more likely to uncover truth and understand it more deeply, as it is constantly being refined through debate. From an information-theoretic perspective, dissent is all about adjusting frames and making sure we’re dealing with a fully articulated set of possibilities, none of which are excluded out of hand but only in the furnace of debate and controversy.
Mill justified the need for dissent on the basis of human fallibility. Those who inflict psychological warfare on us never admit that they might be wrong. In fact, they project an air of infallibility, putting themselves in a master class and the rest of us in a subject class that must, because of their infallibility, bow to their psychological manipulation. But Mill was right about human fallibility being universal. No individual or group can be certain that their beliefs are entirely true, and so suppressing dissenting voices assumes an infallibility that no one possesses. For Mill, allowing open debate and the expression of opposing views is the only way to correct errors in thinking and to ensure that truths remain alive and dynamic, rather than becoming stagnant dogmas.
By disallowing dissent, society risks missing out on valuable corrections to false beliefs or misunderstandings. Mill contended that even widely accepted truths benefit from being challenged because such challenges force people to reaffirm and understand the reasons behind those truths, rather than accepting them passively. Therefore, he saw dissent not just as a right but as a necessity for progress, safeguarding against the human tendency to hold errors with undue confidence. This view of dissent emphasizes the importance of continuously questioning prevailing ideas to maintain the vitality and accuracy of knowledge.
In saying that information theory, in its distinction between a frame and the possibilities within the frame, acts as a shield to protect us from psychological warfare, I’m not advocating a wooden-headed one-size-fits-all approach to dealing with psychological manipulation. Rather, I’m counseling an attitude of mind in which we can look ourselves squarely in the mirror and say, “I may be wrong, I may be very wrong, I may be hopelessly and irretrievably wrong,” and mean it. But then we don’t stop there. That attitude must extend to others as well, even the expert class, even those claiming to know so much more than we do and who thus feel themselves justified in curtailing our freedoms and enforcing our compliance: “But you too may be wrong, you may be very wrong, you may be hopelessly and irretrievably wrong.” The mishandling of the housing crash in 2008 and the SARS-CoV-2 pandemic by the expert class should by themselves be enough to justify this attitude.
The use of information theory as a shield against psychological warfare stems from a healthy human impulse, namely the desire to overcome artificial, tyrannical, or self-imposed limitations and thereby to open oneself and others to new possibilities — in a word, freedom. This desire was beautifully expressed in Bernard Malamud’s novel The Fixer (1966). Yakov Bok, a handyman in pre-revolutionary Russia, leaves his small town and heads off to the big city (Kiev). Misfortunes await him there. Why does he go? He senses the risks. But he asks himself, “What choice has a man who doesn’t know what his choices are?” The desire to open himself to new possibilities (to expand the “frame”) impels him to go to the big city. Later in the novel, when he has been imprisoned and humiliated, so that choice after choice has been removed and his one remaining choice is to maintain his integrity, he is reminded that “the purpose of freedom is to create it for others.”
Those who truly value freedom want to free society from arbitrary constraints that stifle inquiry, undermine education, turn experts into a secular priesthood, and in the end prevent truth from receiving a fair hearing. Whenever there is evidence for a view, there could be evidence for the opposite view. That’s not to say that there is, but that there could be, and that very possibility needs to be taken seriously. Evidence is always a two-edged sword: Claims capable of being refuted by evidence are also capable of being supported by evidence. Whether a given view is ultimately rejected or accepted must be the conclusion of sound argument and open debate in a culture of rational discourse where no one is shunned for not lining up with whatever the received wisdom of the day happens to be.
What choice does a society have if it doesn’t know what its choices are? It can choose to stop arbitrarily limiting its choices. It can choose to invite a true diversity of voices, including dissenting voices. It can encourage freedom of thought and expression. It can expand the range of acceptable discourse. It can get clear on how its choices are misrepresented. In doing all these things, it can effectively shield itself from psychological warfare.
FOR MORE ON PSYWAR/PSYOPS:
- Robert and Jill Malone, PsyWar: Enforcing the New World Order (new book released October 8, 2024).
- Mike Smith’s documentary Into the Light, released in the fall of 2023, with some insightful interviews, including one with Lara Logan.
- Mike Benz on Tucker:
Here are some quotes from major medical journal editors about the corruption in medical science. It's pretty bad.
"Politicisation of science was enthusiastically deployed by some of history’s worst autocrats and dictators, and it is now regrettably commonplace in democracies.20 The medical-political complex tends towards suppression of science to aggrandise and enrich those in power. And, AS THE POWERFUL BECOME MORE SUCCESSFUL, RICHER AND FURTHER INTOXICATED WITH POWER, the inconvenient truths of science are suppressed. When good science is suppressed, people die."
Kamran Abbasi, executive editor, BMJ, 2020.
https://www.bmj.com/content/371/bmj.m4425
“It is simply no longer possible to believe much of the clinical research that is published, or to rely on the judgment of trusted physicians or AUTHORITATIVE MEDICAL GUIDELINES. I take no pleasure in this conclusion, which I reached slowly and reluctantly over my two decades as editor of The New England Journal of Medicine” (my emphasis).
Marcia Angell 2004
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4572812/#!po=87.5000
From a 2016 CBC article:
"It's unusual to watch one of the world's most powerful editors in scientific publishing play with a marionette puppet.
But Dr. Fiona Godlee, editor of the BMJ, specializes in the unexpected.
The puppet she's holding is dressed as a doctor, complete with a stethoscope around its neck. Its strings represent the hidden hand of the pharmaceutical industry.
'I think we have to call it what it is. It is a corruption of the scientific process.' -Dr. Fiona Godlee, editor, BMJ "
Another quote from the article:
"It's led me and others to increasingly question the idea that the manufacturer of the drug could ever be considered the right people to evaluate its effectiveness and safety," Godlee says.
"That seems to me to be very mad idea which has grown up historically, and we have to start questioning it and we have to come up with alternatives, which would mean independent studies done by independent bodies."
And it matters, Godlee says, because bad science can be dangerous.
"Patients do get hurt."
https://www.cbc.ca/news/health/bmj-fiona-godlee-science-1.3541769
"Time to assume that health research is fraudulent until proven otherwise?" ( Richard Smith) Former editor of the BMJ).
https://blogs.bmj.com/bmj/2021/07/05/time-to-assume-that-health-research-is-fraudulent-until-proved-otherwise/
I have a few comments:
Few ask where the frame comes from. Who is pulling the strings? It does not come from a specific group of so-called experts. What the puppeteers have learned in recent years, especially since COVID, is how to pull the strings that move the experts. But who are these puppeteers? Are the so-called experts really just marionettes themselves who are also being manipulated? Are we just railing against the wrong people? What are the puppeteers' objectives? Are the experts the ultimate decoy?
Second, information theory should begin with A and ~A, which between them are by definition exhaustive. Then Hobson's choices would disappear. What hit me in the green vegetable example was that the choice was not between healthy foods and unhealthy foods. If that had been the choice, the chocolate chip ice cream would have been eliminated immediately. The whole discussion would be different: it would be about which healthy foods, and how much.
Third, everyone should be aware of the availability cascade. This phenomenon, more than anything, determines beliefs in the world. It always has.
https://effectiviology.com/availability-cascade/