You probably have strong opinions about all kinds of things:
like whether Coke is better than Pepsi, which football team deserves to win the Super Bowl,
or which Chris is the dreamiest movie star—Pratt, Pine, Hemsworth, or Evans.
Why are there so many Chrises?
But are all those opinions really yours?
Humans are social creatures.
And when we talk about anything from TV shows to politics, lots of psychological phenomena come into play.
Sometimes, this can lead to bad judgments and biased opinions.
But by knowing how your thoughts can be swayed, you can recognize it when it's happening—and maybe stop it.
One kind of bias can come from the company you keep.
It's normal to be friends with people who have similar opinions and values.
But many studies have shown that when you talk with people who feel similarly about things,
you can end up with even more extreme opinions.
In other words, you get polarized.
For example, some experiments have asked people to decide on a risky business proposition together,
and found that groups of risk-takers get even more risky, while risk-avoiders get less risky.
But it's hard to escape polarization:
it can also happen when you have strong opinions that are challenged by others.
One study in 2011 had people with diverse views on a social issue respectfully discuss their opinions.
Those who already had more extreme beliefs, both for and against the issue,
showed even more polarization afterwards.
This is called a boomerang effect,
where a counter-argument makes someone believe even more strongly in their original judgment.
Researchers think this is partly due to your social identity:
the fact that your beliefs and the groups you belong to are part of who you are, so you defend them.
So if you and your friends are die-hard peanut M&M fans,
hearing a case for crispy M&Ms could just make you extra defensive of your peanut-loving identity.
I know it does that to me.
Another way your opinions can be swayed in a debate has to do with what you think of first—
because that can act as an anchor, basically a starting point, for the rest of your thoughts.
One study in 2000 involved taking an old car to 60 car experts, including mechanics and car dealers.
The pretend-customer told the expert what they thought the car was worth,
either higher or lower than it actually was, then asked for the expert's judgment.
And the initial suggestion affected the experts' estimates, making them similarly higher or lower.
Psychologists think this is partially due to selective accessibility,
where an anchor makes some information more available in your mind, which affects your opinions.
For example, a small study in 2010 even found that when it was warmer outside,
or people were simply asked to think about hot things,
their responses to survey questions showed that they believed more strongly in global warming.
So if you stumble upon a flame war online, for example,
the first thing you read in the comments could cause selective accessibility and shape your thoughts—
although there hasn't really been research into that kind of anchoring.
Your opinions can also be influenced when you're trying to make a decision with a group because of something called groupthink,
which can make you blind to bad reasoning.
Let's say you're a Doctor Who fan
and enter a heated debate after someone influential claims that, hands-down, Matt Smith's Eleventh Doctor is the best one.
A discussion begins with that anchor, filled with a bunch of pro-Matt Smith arguments.
Maybe you're more of a Tom Baker fan and you think that bow ties just aren't that cool,
but you keep your mouth shut in self-censorship, figuring that other people won't want to hear your opinion.
You might notice that any arguments that other Doctors might be better are rationalized by the group,
meaning that they are dismissed as bad arguments.
Or people might stereotype David Tennant's fans, saying they only liked him because of his looks, and ignoring their opinions.
After lots of keysmashing back and forth, it seems like everyone agrees that Eleven is the best, but that's not necessarily true—
it's what psychologists call an illusion of unanimity.
And when people think everyone agrees, they're more likely to adjust their opinion.
When deciding on anything, from government policy to a group project at school,
all of these and other characteristics of groupthink can influence decisions and shut down critical debate.
So ... it might seem like your opinions aren't ever really yours.
But there are ways to fight against the influence of polarization, anchoring, and groupthink.
Essentially, it all comes down to critical thinking,
and considering why your opinion might be wrong or too extreme, not just why it might be right.
Some studies have found that having someone play the Devil's Advocate can help,
genuinely arguing against the preferred decision and asking thoughtful questions.
But an extreme counter-argument can also backfire and cause the boomerang effect.
Other research has shown
it can help to talk with others outside of your group, and listen to diverse opinions.
You might discover that what you thought was normal actually was an extreme stance,
or that the issue is more complex than you thought.
Also, you can learn about something or start a group discussion before forming a strong opinion—
like, reading a bunch of news articles for yourself before reading the comments or tweetstorms about them.
We're all naturally influenced by the people around us—
it's unavoidable, and it isn't necessarily a bad thing.
But by being aware of bias and potentially bad choices,
you can take back some control and know that it's okay to speak up, disagree, and change your mind.
After all, we're all learning here. But peanut M&Ms are the best.
Thanks for watching this episode of SciShow Psych, brought to you by our patrons on Patreon!
If you would like to support us, you can go to patreon.com/scishow.
And if you just want to keep learning about brain things,
you can go to youtube.com/scishowpsych and subscribe.