Archive for January, 2016

You’re not as rational as you think you are

I found this article the other day and thought I would share it: http://fivethirtyeight.com/features/your-brain-is-primed-to-reach-false-conclusions/

A short summary of the article: humans often mistake correlation for causation. We assume that if two things happen in quick succession or according to a sort of pattern that we expected, they must be connected to each other. We also tend to be biased in favor of information we’ve already seen and accepted, and biased against data that disagrees with that information we’ve accepted, regardless of which information is correct.

It’s interesting to see clear examples of how irrational humans can be without realizing it. It’s rather concerning to see this illustration of how we double down on our bad logic. The implications are pretty huge, and I’m not really sure of any way around it. I don’t imagine any of us are exempt from this sort of behavior, at least not completely. I imagine some people have a greater propensity for this sort of bias than others, but, in general, this is how human brains work. Even experts, like the doctors discussed in the article, are prone to this “illusion of causality,” and these are the sorts of people we count on to help us make decisions we’re not equipped to make on our own. I’m not suggesting that doctors aren’t reliable, but it’s worth keeping in mind that even experts are human and can harbor biases, even simple misunderstandings based on correlation rather than causation.

One thing we can take from this information is that effective messaging should keep it in mind. The article ends with this: “If you want someone to accept information that contradicts what they already know, you have to find a story they can buy into. That requires bridging the narrative they’ve already constructed to a new one that is both true and allows them to remain the kind of person they believe themselves to be.” It isn’t enough to present someone with new information. It isn’t enough to explain to them that how they came to their conclusion is wrong. It’s also not enough to explain your opinion or facts or arguments. I imagine this is a large part of why internet arguments never get anywhere: people are biased toward the information they already have and have accepted. Effective messaging isn’t about simply arguing your point better. In fact, it might have nothing to do with arguing your point, or even with what your point is in the first place. We’re far less rational than we like to think, which is why so much of political messaging is about grabbing people’s emotions or appealing to their identity (as an American, a worker, a taxpayer, a victim, etc.).


On Experts

A friend shared an article with me on how contemporary culture is leading to a devaluation of experts (http://thefederalist.com/2014/01/17/the-death-of-expertise/). It’s a good article, and I think it makes some good points, but I had a few thoughts to share on it.

It’s true that experts should know more about their subject than the average person. Depending on the subject, though, that knowledge might be highly subjective or controversial. Even experts are still human and bring their own biases into a discussion. While more technical subjects tend to be more cut-and-dried, there are still opposing camps when it comes to philosophy and approach. This leads to a pretty basic issue with relying on experts.

The problem is that experts can disagree on things. In such a case, they can’t both be correct, so appealing to the knowledge and wisdom of experts doesn’t necessarily get you closer to a solution. Thomas Sowell and Paul Krugman, for example, disagree on economics. They are both experts. They are both more likely to be right than I am on such matters, but they can’t both be right when they disagree. In fact, they might both be wrong, but other viewpoints might be excluded from the debate for any number of reasons.

If A and B are opposing points supported by experts, and someone comes along with C, they might be excluded for not being an expert, even if they’re right. Even if they are an expert, they might lose status for suggesting C when A and B are the widely accepted viewpoints, regardless of which of the three is correct. C might not be a very big camp of opinion, or it might be based on new information that hasn’t figured in previous discussions. By the time C emerges as a position, people involved in the A-versus-B debate may already be committed through investments of money and effort. As humans, their egos may be committed to remaining in camp A or B, and entire careers may depend on their stance on the issue. And this is all aside from what is actually true, assuming of course that C is the correct position. There might not be enough information to say whether A, B, or C is correct, and it could even be that there isn’t a correct answer. Maybe A, B, and C are all just perspectives or opinions, and there’s no factual “correct” answer to the situation.

I guess what I’m saying is that science is not a democracy. Facts are facts regardless of layman or expert opinion. Experts are likely to have a better understanding of the situation and may be quite a bit more likely to come to the correct position based on fact, but as humans we’re all fallible, and we might all be wrong until something proves that what we thought was true isn’t. Science is a great system, and it can lead us to amazing discoveries and move humanity forward. At times, though, it seems things would go much more smoothly without humans involved.

The Spread of Ignorance

A friend shared this article with me today (http://www.bbc.com/future/story/20160105-the-man-who-studies-the-spread-of-ignorance), and I figured I had a few thoughts to share on the topic. The article is a good read, and I think it makes a few really good points. It does seem slanted slightly leftward, but the discerning reader can recognize that and move past it. There are examples of this sort of thing going both ways.

One point worth making is that another factor appears to contribute to the spread of ignorance, though it isn’t ignorance so much as a general mistrust of disseminated information: organizations have agendas, even organizations you’re inclined to agree with. As an example, I recall a conversation in college about global warming where I told the other party that I was willing to accept information about global warming if I could be convinced it was true, but that it was too hard to trust any information I got because both sides manipulate the public. I’m willing to be convinced, but I don’t want to be manipulated. Manipulation of information is such a fruitful endeavor that it’s not enough to have truth on your side of an argument; even people with correct information have to spin it to get traction.

I’m still a skeptic on several parts of the global warming debate, and it’s largely because I’m a libertarian. I have my own views, and it seems that everyone who is wound up about saving the earth from global warming only proposes big-government solutions that add further regulatory, legal, and tax burdens on people and give more power to the government. That seems like a conflict of interest to me. That wouldn’t be enough on its own, but when it’s added to a lack of good answers to some questions I have, along with general political manipulation, I just can’t buy into the whole thing.

It reminds me of a blog post by Scott Adams about feminism and fair pay (http://blog.dilbert.com/post/114055529676/my-verdict-on-gender-bias-in-the-workplace). He determined in his post that women are treated more or less equally in the situations he discusses, but that this is largely because feminists have stretched the truth: by exaggerating the actual inequality, activists got people’s attention and got things changed. He concluded that, although women are fairly treated in most of the categories he examined, it was acceptable for activists to lie about the matter because that led to an acceptable outcome that otherwise probably wouldn’t have been reached for a long time. It’s a very complicated problem to deal with.

The article gives the great example of Obama’s birth certificate, but it misses another complication. Many people speculated that Obama withheld his birth certificate for so long precisely because withholding it encouraged some of his detractors to spread the rumor that he was not born in the US. Nobody switched sides to oppose Obama over the birth certificate issue, so he really lost nothing by not releasing it. When he finally got around to releasing it, everybody who had latched onto the issue and made a bunch of noise about it for years looked stupid and was called a racist conspiracy theorist. This flip side of the spread of bad information complicates things as well: bad information can actually benefit both parties in a conflict.

The spread of some ignorance is intentional and strategic, but at other times it’s just a general mistrust of institutions or people we’ve already been given reason not to trust, or whose values and interests don’t align with our own. There isn’t a clear-cut solution to this, and it’s difficult to tell what is false information and what is legitimate skepticism or conflicting information. We can’t simply extend trust to any one party on an issue, because then we’re just favoring our pre-established bias and assuming the false narrative is on the other side. This is all rather vexing.