
“4 out of 5 doctors recommend” that you watch this video on the Authority Bias

Written by John Kuder


In the realm of cognitive biases, the authority bias holds a significant place. This bias occurs when we give undue weight or belief to the opinions or actions of individuals in positions of authority, such as doctors, politicians, celebrities, or teachers. It is a natural tendency ingrained in us through evolution, stemming from a time when deferring to authority figures could mean access to more resources and increased chances of survival.

Unpacking the Authority Bias

The authority bias is a cognitive shortcut or heuristic that often operates without our conscious awareness. While it can be beneficial to trust credible experts in their respective fields, problems arise when we assume that authorities know everything about every subject. This blind trust can lead us to make decisions that are not in our best interests or to follow instructions that may be inappropriate.

Examples and Studies

One notable study that shed light on the authority bias is the Milgram obedience experiment, conducted in 1961 by Yale psychologist Stanley Milgram. Participants were instructed by an authority figure to administer what they believed were electric shocks to another person, and a majority complied, showing how people can be swayed to act against their own conscience in deference to authority.

Further studies have shown that even in medical settings, students may over-prescribe antibiotics if their mentors model the same behavior, highlighting the impact of authority figures on everyday decision-making.

Marketing Strategies and Celebrity Influence

In marketing, the authority bias is often manipulated to influence consumer behavior. Terms like “four out of five doctors recommend” or the depiction of individuals in uniforms or white coats aim to trigger this bias and sway our choices.

Moreover, celebrities leveraging their authority to endorse products or share opinions outside their expertise can also contribute to perpetuating the authority bias among the masses.

Mitigating the Authority Bias

To combat the authority bias, individuals must critically evaluate the expertise of those in positions of authority. It is essential to ask questions, seek diverse perspectives, and avoid blindly following recommendations solely based on an individual’s status.

If you find yourself in a position of authority, it is crucial to present balanced views, consider opposing perspectives, and encourage others to think critically and do their own research.

In a society heavily influenced by media and marketing tactics, recognizing and addressing the authority bias is vital for making informed decisions and safeguarding against potentially harmful influences.

Conclusion

The authority bias, though deeply ingrained in human psychology, can be mitigated through awareness, critical thinking, and a willingness to question assumptions. By understanding the power dynamics at play and actively engaging in informed decision-making processes, we can navigate a world where authority figures and influencers hold significant sway over our choices.

Remember, while authorities can offer valuable insights, it is essential to maintain a healthy level of skepticism and independent thinking to safeguard against the pitfalls of blind trust. By empowering ourselves with knowledge and critical awareness, we can navigate the complexities of the authority bias and make decisions that align with our best interests.

Thank you for accompanying us on this exploration of the authority bias. Stay informed, stay curious, and challenge the status quo.

Note: The preceding blog post was generated by AI and may have minimal editing. The transcription is AI generated but has been edited by a human for accuracy. The original video content is entirely human and imperfect.

Transcript:

Hi, we’re back with another cognitive bias. Oh, right. See, I’ve been missing these. This one is from the grouping, we’ve got the different conundrums or problems that we have, and this one is from “not enough time.” And the strategy that it uses is number 13. Lucky number 13.

All other things being equal, we tend to stay safe. We tend to choose the safe choice. Okay. Yeah. We’ve kind of done this a little bit with other ones, right? Yeah. Maybe another one. They’re kind of interwoven.

They are. And there are several connected to this one too. We’re gonna talk about one. Good. Okay. I’ve got quite a page of notes here. Number one. So the authority bias is a cognitive bias. Mm-hmm. It happens when we give undue weight or credence to the opinions or the actions of somebody in a position of authority.

Oh, like doctors, politicians, celebrities, right. Teachers, of course; we’re trained to do that. We are literally trained to. I mean, it’s a survival mechanism, probably from evolution. You know, from a tribal standpoint, there were always tribal elders, and the elders or the shaman or somebody in charge had access to more resources, right?

So staying on their good side meant you got more food or more of something. Okay. So, I mean, it’s totally natural. And we are also raised to defer to authority. I mean, otherwise the police couldn’t do their job, right, if we weren’t somehow taught to respect and comply. Yeah, I understand.

Okay. We’ve gotta have some order to have a society. Yeah. Some of us don’t like authority. Cool. So where this becomes a cognitive bias, or a heuristic, in other words something that we don’t really think about, we just do it automatically.

Where it doesn’t work in our favor, okay, is when we do it too much, when we overdo it. Like we defer to authority without any thought and might do something that’s not in our best interest, or they might ask us to do something that’s not appropriate.

Now, that’s not common, but it happens. So it’s a shortcut. We’ve talked about cognitive biases as shortcuts, right? So it’s reasonable to put our trust in credible experts, right? Oh, okay.

We do have, there are people that are very knowledgeable. Like, we go to a doctor because he’s been to 12 years of medical school or something, and he knows stuff that we don’t know. But when we assume that somebody in authority, somebody that knows a lot in a particular area, knows a lot about everything, or we assume that they know something about an area that’s outside their training...

That’s when the trouble starts. That’s when the bias gets us. Okay. So basically, we assume they have more knowledge or skills than they actually do. Mm-hmm. You know, I mentioned doctors. We’ve known a lot of ’em. And they’re people just like us, and they have their own biases and their own prejudices and their own weaknesses and their own gaps.

That’s why it’s called a practice. And back in the seventies, when the cholesterol controversy was raging and we were in the egg business and I started studying it, one of the things that came out was that most doctors at that time didn’t take a single nutrition course during their training.

So, surprise, they knew about the same amount about nutrition as their receptionist, unless she had been on a diet, and then she knew more. That was what was told to me, anyway. So that... Great. Hold on, hold that. I’m just picking on doctors because they’re very common authorities, and we tend to... you know, my parents, I observed my parents very much.

If a doctor told ’em to do something, they did it, no question. Today we tend to get second and third opinions about things, and so we balance that bias a little bit. And I demanded that they did too. Okay. So one of the things, we’ll come back to this, but when somebody is in authority, we need to evaluate whether or not they have expertise in the specific area that they’re asserting, or that we’re assuming they know, rather than just assuming.

Okay. And a lot of people don’t like that either, don’t like being questioned. That’s true. That is actually another one that came up in my research. I think it was called expertise blindness, maybe. And it was that an authority is used to being deferred to. Uh-huh.

And so they tend to assume that they know more than they do. Right. There’s another cognitive bias there as well. So they don’t wanna be questioned, because they’re very much used to not being questioned. Oh, yeah. So, okay. This was first identified in the Milgram obedience experiment, which was also called the Milgram shock experiment.

I have heard of this before, and you may have too. So what happened? This was the first and most infamous study on the authority bias. It was conducted in 1961 by a guy named Stanley Milgram. He was a professor of psychology at Yale University. And part of his motivation... well, let me say this first.

What he wanted to see was how obedient people would be when they were instructed by an authority to harm another person. And this was inspired by the common explanation at the time of what had happened in Nazi Germany, where people did horrible things and then claimed they were just following orders.

Okay. So he kinda wanted to test that. So what they did is they brought in a bunch of people. They recruited study participants for a learning experiment. And they told them, you know, here’s a group of people; some of you are gonna be learners and some of you are gonna be teachers.

Just randomly assigned, okay. Right. And the teachers are going to ask questions of the learners, and the learners are gonna attempt to respond to those questions. Okay. So when a learner got a question wrong, the teacher was then going to press a button and give them an electric shock.

They were wired up to electrodes. And each time they got a wrong answer, the shock got bigger. They actually had to keep turning it up. It was even more than that; it wasn’t automatic. The teacher had to keep turning the dial up. Oh, that’s right. That was the instruction.

Yeah. Okay. Now, what they didn’t know was that the ordinary people recruited for this experiment were all the teachers, and all the learners were on the inside. They were part of the experiment; they knew they were part of creating the experiment.

Okay. Oh, okay. Because nobody was actually getting shocked. Oh, okay. But they had to react like they were. Exactly! So they pretended; they were crying and they were screaming and, you know, reacting as the shocks got higher, ’cause the shocks went up to about 450 volts. It was no small amount they were talking about. Uh-huh.

Okay, so what happened? Well, nobody was actually getting shocked; they were faking their pain. Sixty-five percent of the people that had been recruited for this continued to increase the shocks. And they were pushed to do this a little bit. Okay. The people conducting the experiment said, “You have to do this.”

“It’s really important that you do this,” or something. There were four levels of prodding, they called it, to keep them going. Some of them didn’t want to. I mean, some of ’em were sweating, and, you know, I think two of ’em actually had... from the stress. Ooh, okay. This was the sixties.

I’m not gonna excuse it; it’s the past. But, you know, it put them under a great deal of stress, and still 65% took it all the way. And that’s actually kind of shocking. Yeah. So other, less contrived experiments have been done since. One that I saw, and I’m gonna put all this in the notes, I’ll put in the sources, so you can read more about this, right.

But there was a study of people that were hurt in avalanche accidents. Mm-hmm. And a lot of those people got into the situation, into this danger, into the avalanche, because it was a group and they were following a leader. Well, they chose the leader not necessarily because they were a member of the ski patrol.

It didn’t get into real detail, but because they were older or because they seemed confident. Ah! These are other things that we tend to use... Right. If I’m, that’s true, kind of an introverted person, and I’m not real self-confident, and I’m around somebody that’s very confident in themselves, yeah.

I will tend to follow them, because they seem stronger or somehow more authoritative, and you assume they know what the hell they’re doing. You assume that they know what they’re doing when they don’t, or might not. Okay. So yeah, these people had no training. They were just either older or confident.

Another study looked at... Yeah, lead you into an avalanche. ...medical students. Ah, there we go. It looked at medical students’ willingness to over-prescribe antibiotics. So, you know, it was known to them that this happened. But if their mentors or their role models were in the habit of over-prescribing the antibiotics, then... oh, so they followed suit.

They were following the role model, even though they knew that it was likely not to be effective. The medical students knew that the antibiotics probably wouldn’t work. So they were educated, they weren’t ignorant, but they still followed the behavior. Okay. So now we’re gonna turn to marketing.

Mm-hmm. Okay. How about things like “four out of five doctors recommend”... oh, something. And now again, it’s a common authority figure. Mm-hmm. But it was used back in the forties and fifties to sell. Oh, we all know how that works. And I know we’ve all seen commercials where they’ve... Oh yeah.

You know, some expert. Another thing is that in a commercial, they will position somebody in a white coat, mm-hmm, with a stethoscope around their neck, or with a white coat and a clipboard, and they’re the talking heads. They’ve got uniforms also. It doesn’t have to be a white coat; it could also be a military uniform or a police uniform.

There are other uniforms that convey that sense of authority, and so they use this. They do. They use that intentionally to trigger this bias in us, because they know that it influences us. Celebrities. How about Hollywood celebrities? Yeah. Now, I want to credit the celebrities that use that authority, that celebrity, to bring awareness.

To things, to do good, basically, to bring awareness to things. But when they then begin to exceed that and speak about things that they really aren’t trained on, and act like experts in something they’re not expert in, then, you know, this is another example of using that. Okay.

And maybe not intentionally, but probably. So how do we avoid the bias? Yeah, how do we do that? Well, the first thing is, we have to ask ourselves, what do we assume about authority? Right? Mm-hmm. We gotta look in the mirror a little bit and, you know, ask, do we assume that they’re always fair and objective?

That was actually something that was studied. And there was a tendency to assume that an authority, because they were an authority, was automatically objective; that they had some training in that or something. No, there’s no... Is there such a thing as objectivity training?

Yeah. Well, in the sciences. So “scientists say,” that’s another very popular marketing term. Oh yeah. “Scientists say” so, and we believe it just because somebody wrote that scientists say it, because they’re... Okay. So the next thing is to ask myself if I’m assuming that they know something that’s outside their really specific field.

Mm-hmm. Okay. So, like a neuroscientist talking about gynecology, or something even farther afield, geology. Oh, that’s true. Okay. I mean, it’s a different science. It’s still a science, but it’s nowhere near their specialty, right? And they’re unlikely to have any training. True.

Okay. Unless it’s a hobby. In a more social setting... Oh yeah, like a social setting. We need to be aware of the tendency to go along with the group, especially if it’s our group, a group of friends or a group of people we know. This is the “if all your friends were jumping off the building, would you?” Whose parents said that to them?

Oh, yeah. If everybody else was doing it. Yes, yes. That’s a very related bias called the bandwagon effect. And it’s very much like peer pressure. Peer pressure, though, as I understand it, is more about, you know, peers urging you to do something, to go along.

Whereas the bandwagon effect is more self-directed. You’re going along because they’re doing it; they don’t have to ask you to do it. Okay. So there’s both, you know, this societal authority thing, and there’s also the part of just being part of a group. Okay.

Cool. So, and the last thing is, if you’re the authority... Okay. If you’re in the position of being the authority, how do you avoid the bias, both in yourself and in others? And that is to make sure that you look at both sides of things. You know, look at points of view that are opposite from your point of view, right?

You know, what if I was wrong? And we have talked about this one before. Mm-hmm. If I was wrong, you know, what would I believe, or what would the situation be? Mm-hmm. And then make sure to present that to the people that are listening to you, and present both sides, you know?

Right. So they get a balance. And then make sure you suggest to them that they do their own research. Oh, okay. And yeah, just encourage that. So, cool. So that’s the authority bias, and I think we’re all familiar with it, but wow, I found layers to it that I had not even thought about, or didn’t know.

Right. Especially in the marketing, you know. It’s so insidious in our culture. Yeah, the TV. Yes. And that is one of the reasons we’re doing this series, because marketing and social media are such a strong influence in our lives that we have to protect ourselves, ’cause nobody else is gonna do it.

Nobody else cares. As long as we just give ’em money, that’s all they want. It’s money. Seems that way. Yeah. Okay. Thanks so much for spending a few minutes with us, and we will see you in another video. Bye. Bye. Bye.
