In many other blogs, I've talked about the intellectual war going on in this country between right-wing and moneyed interests (corporations, Chambers of Commerce dedicated to advancing corporate goals, "deregulators" who think nothing of accumulating money at the cost of equality, with poverty and failed social policy as the result, and so on) on the one hand, and "everyone else" who isn't part of that effort, on the other.
Marx and Engels, more than a hundred years ago, thought it would be easy enough to get people to change any system of government and economy simply by pointing out to them, logically and reasonably, that it was against their class interest to allow certain policies and ways of business and politics to continue.
They thought that once they made their arguments clear, large masses of people would accept the truth and we'd be able to achieve a state of enlightened socialism in which everybody was relatively equal.
We all know that idea failed, but very few people know, unless they've studied it, why it failed. And by the way, I'm not suggesting that America ought to become a socialist paradise, because over the years, I've become convinced that mankind just isn't that noble. It's not possible. The only way human beings can live is if they push and pull against one another, with safeguards and controls imposed by the social compact (represented by government) in which they live; it's the best we imperfect and self-interested animals can manage, but it's a damn sight better than the alternative: chaos.
I do think it's interesting, then, to talk about why they failed to convert the masses who, then as now, just don't listen to what's in their best interest. They listen to the opposite far more often than not. I've talked about this often; I call it the "God and flag" argument. Since college and law school, and over the years, I've come to hold certain beliefs as undeniably true when it comes to the old P.T. Barnum observation (or was it W.C. Fields?) that there is a "sucker born every minute".
In forming those ideas, I've been rather unkind. I think that it's easier to fool someone who doesn't have a formal education than it is to fool someone who does, though not always. I think it's easier to fool someone who's poor, and doesn't have access to information and resources, than it is to fool someone who has those benefits. I think it's easier to fool somebody with deep religious or political conviction (although I think they're also one and the same more often than most people are willing to admit, even to themselves) than it is to fool somebody who prefers to reason their way through situations before "feeling" their way through those same situations.
I've read neurology about it, I've read psychology about it, and if we're all honest with ourselves, we know these things are true in a "macro" sense. Are there exceptionally wise, difficult-to-fool people who never graduated high school? You bet. My father was one of them. Are there millions of people who have attended college and graduate school who view their lives through certain lenses and templates that make it effectively impossible for them to be reasonable or dispassionate even though they claim to be? In other words, easily fooled? Also, you bet.
At the end of the day, however, it's always nice to have some validation for these beliefs, especially at the end of a particularly contentious political season. They are true. It is easier to get people to buy misinformation, even when it's against their own interests, than it is to get them to buy fact, because our brains simply aren't built to accept "fact" readily.
A recent study published in Psychological Science in the Public Interest demonstrates why human beings generally find it much easier to believe "false" information, even when they know or suspect that it's false, than to believe true information.
By extension, it's easier to lie to people and get them to buy lies than it is to get them to buy the truth (when the truth doesn't jibe with what they want to believe). This has sometimes been called "belief bias"; there are other terms for it. The greatest and canniest politicians in history, whether we consider them subjectively "good" or "evil," also knew this to be true, and manipulated passions (which is why I use the phrase "God and flag") before attempting to manipulate facts. Facts come later. Passion comes first. "Tell a big enough lie often enough and it becomes the truth" was something not lost on the Nazis, and it's not been lost on any propagandist in the modern era. Just listen to pretty much anything the U.S. Chamber of Commerce or the GOP says anymore; it's about feelings and stirring the pot before it's about fact: wedge the feeling in first, so the facts don't end up mattering, because the feeling is already in place.
According to the team of psychological scientists working on the study, the main reason people are more likely to believe false information is that it takes less brain power to simply accept a statement than to evaluate whether it's true. Finding truth, on the other hand, takes time and effort that people often don't care to spend on issues that aren't of immediate concern to them.
This is why it's easier for people who want to believe that climate change is a hoax to accept that argument as fact even when it's not true: it's a lot easier to do that than to investigate the science that might prove their belief wrong. That takes time, and besides, it won't "feel" good, because it works against what they've been led, emotionally, to want to believe.
The main reason that misinformation is "sticky," according to the study, is that rejecting information actually requires cognitive effort. Weighing the plausibility and the source of a message is cognitively more difficult than simply accepting that the message is true; it takes additional motivational and cognitive resources.
If the topic isn't very important to you (and let's face it, for most people in America and in the world today, the "big" issues are rather too big and abstract), or if you have other things on your mind, misinformation is far more likely to take hold.
Even when we do take the time to thoroughly evaluate incoming information, we tend to pay attention to only a few features. For example, the first thing we ask ourselves, whether we realize we're doing it or not, is whether the information fits with things we already believe. We ask ourselves whether it makes a coherent story compared with what we already know. We ask whether it comes from a "credible" source, and of course, how we decide whether a source is "credible" is itself subject to the same biases and prejudices. We ask whether others believe it, and again, how do we decide what weight to give which "others"? And just as importantly: who are those "others"? Do we usually agree with them, or usually not?
Misinformation is especially "sticky" when it conforms to preexisting political, religious or social points of view. Well, duh. Because of this, ideology and personal world views can be especially difficult obstacles to overcome when taking in new factual information that runs counter to those ideologies.
This, by the way, is why it's so much easier to get a super-religious conservative to accept anything that is preceded by a discussion of God in which "enemies" of biblical doctrine oppose what the super-religious person is about to be told. Once you've got that commonality established, you can sell them almost any social, economic, political or other policy, because they become afraid that any opposing view is contrary to God himself; no other truth is possible for them, no matter how factual.
The same is true of people who are "super patriots". It's much easier to accept a lie than to cognitively process opposing facts when you know that the "lie" fits with a "truth" that you hold dear, and is "opposite" to what your political enemies (at least as you perceive them or you are told to perceive them) hold dear.
Even worse than this, efforts to retract misinformation often backfire, paradoxically amplifying the effect of the erroneous information and the beliefs which arise from it.
Obviously, we've seen major misinformation campaigns over the years perpetrated by media and politicians and the moneyed interests behind them. Who hasn't heard that climate change is a hoax when we know it isn't? Who hasn't heard that Saddam Hussein was somehow involved in the attacks of 9/11 when we know he wasn't? Who hasn't heard that President Obama wasn't born in America when we know he was? Refuting any of these claims by "leaders" takes time and research that individuals often neglect. As the report explains, this is how these misinformation campaigns become successful.
Misinformation campaigns are unfortunately very effective. Acceptance of climate change has fluctuated wildly over the last few years: 71% of the population accepted it in November 2008, only 52% did in 2010, and the figure climbed back to 66% in 2012. As for the "Saddam Hussein" issue and 9/11, polls showed 70% of people believed that claim in 2003, and even last year, 38% of Americans still held that view. As for President Obama's nationality, 30% of registered Republicans still believe the President was not born in America, and 20% of the general population holds that belief as well.
All of these beliefs have a very clear negative effect on societies, as pointed out by the study.
The study notes that the processes by which people form their opinions and beliefs are of obvious public interest, particularly if major streams of belief persist in opposition to established facts. If a majority believes in something that's factually incorrect, the misinformation forms the basis for political and social decisions that run counter to a society's best interest. Put plainly, if individuals are misinformed, they may likewise make decisions for themselves and their families that are not in their best interests and that can have serious consequences.
Marx and Engels didn't account for this, but shrewd politicians have. "Bread and circuses" was the Roman emperors' solution to the fact that the Roman Empire was disintegrating and that massive political, social and economic unrest was causing great suffering. Had Marx and Engels lived in those times, they would have attempted to get Rome's massive plebeian (common) population to rise up. Yet they would have found their effort unsuccessful, because that same population was much happier listening to the Emperor's lies about the troubled times when those lies were accompanied by gladiatorial games of blood and free bread.
Reliance on misinformation differs from ignorance; we must all understand that. We define ignorance as the absence of relevant information. Ignorance, too, can of course have obvious detrimental effects on decision making, but perhaps surprisingly, these effects may be less severe than those arising from reliance on misinformation. Ignorance may be a "lesser evil" because in the self-acknowledged absence of knowledge, people tend to fall back on simple heuristics (or the organic version of them) when making decisions; they seek out more information and filter it logically and factually.
Of course, not all misinformation is deliberate. Rumors and fiction sometimes create misinformation, as do governments and politicians who inadvertently rely on misinformed sources themselves. Vested interests, such as corporations, have a long and well-documented history of seeking to influence public debate by promulgating incorrect information. It's fair to state that at times, some misinformation campaigns have been directed against corporate interests by non-governmental interest groups, but that battle is very one-sided, both numerically (it's rare) and in terms of resources (corporations have orders of magnitude more money to suborn the information process than activists do), and, of course, in terms of long-term success.
So the big question the study asks is how we combat the mental apathy that helps reinforce the acceptance of misinformation.
In other words, what can you do?
We ought to communicate with one another by providing a narrative that fills the gap left by false information. We ought to focus on facts and reason rather than myths, stereotypes, and "God and flag." We ought to make sure that the information we want people to have is simple and brief, and we should make sure that our audience is receptive to the facts even when they hold beliefs that may run counter to them.
It's easy to lie to you, and it's not your fault. You're human. We all are. We're busy, and yet, let's admit that outside of our "busy" stuff, we're lazy about everything else. We all have concerns and worries that are personal, that affect our village, our state, our society, our country, our planet. We simply can't be responsible for everything that concerns and worries us, directly and indirectly, and so we delegate to others a great deal of our responsibility to gather the information on which to base our decisions. This accompanies, or as often results in, passion and prejudice filling gaps that really ought to be occupied by facts.
Simply be on your guard. Be a cynic. Ask questions. Ask for proof. In the absence of proof, don't form a belief, and certainly don't spread one around as if it were fact. Be ready to accept messages that don't necessarily jibe with your political, religious or social beliefs. Be brave and mature enough to change views when the facts suggest you should. You don't have to abandon all beliefs, but you shouldn't use them as a "shield" against truth when it comes your way.
If we all don't do this, and if we don't all demand this, in our communications, then we're not going to get very far on the journey to justice.