我们为什么会对机器人有感情
日期:2018-11-24 17:07

There was a day, about 10 years ago, when I asked a friend to hold a baby dinosaur robot upside down.
大概10年前的一天,我让一个朋友把一个小恐龙机器人倒过来拿着。
It was this toy called a Pleo that I had ordered, and I was really excited about it because I've always loved robots.
这个机器人是我订购的一款叫做Pleo的玩具,我对此非常兴奋,因为我一直都很喜欢机器人。
And this one has really cool technical features.
这款机器人有一些非常酷的技术特性。
It had motors and touch sensors and it had an infrared camera.
它有马达和触觉传感器,还有一个红外摄像头。
And one of the things it had was a tilt sensor, so it knew what direction it was facing.
它还有一个部件是倾斜传感器,所以它就会知道自己面对的是什么方向。
And when you held it upside down, it would start to cry.
当你把它倒过来,它会开始哭泣。
And I thought this was super cool, so I was showing it off to my friend,
我觉得这点非常酷,所以我展示给我朋友看,
and I said, "Oh, hold it up by the tail. See what it does."
我说:“来,抓着它的尾巴把它拎起来,看看它会怎样。”
So we're watching the theatrics of this robot struggle and cry out.
于是我们看着这个机器人夸张地挣扎、大声哭喊。
And after a few seconds, it starts to bother me a little,
几秒钟后,我开始感到有点不安,
and I said, "OK, that's enough now. Let's put him back down."
于是我说,“好了,差不多了,我们把它放回去吧。”
And then I pet the robot to make it stop crying.
然后我抚摸着机器人,让它停止哭泣。
And that was kind of a weird experience for me.
这对我来说是一种奇怪的经历。
For one thing, I wasn't the most maternal person at the time.
首先,我那时还不是个很有母性的人。
Although since then I've become a mother, nine months ago,
不过从那以后我已经成为了一名母亲,就在9个月前,
and I've learned that babies also squirm when you hold them upside down.
而且我了解到,当你把婴儿倒过来抱时,他们也会扭来扭去。
But my response to this robot was also interesting because I knew exactly how this machine worked,
但我对这个机器人的反应也非常有趣,因为我确切地知道这个机器工作的原理,
and yet I still felt compelled to be kind to it.
然而我仍然感到有必要对它仁慈些。
And that observation sparked a curiosity that I've spent the past decade pursuing. Why did I comfort this robot?
这个观察激起了我的好奇心,让我在过去十年里不断探寻:为什么我会去安慰这个机器人?
And one of the things I discovered was that my treatment of this machine was more than just an awkward moment in my living room,
我发现我对待这个机器人的方式不仅是我起居室里一个尴尬时刻,
that in a world where we're increasingly integrating robots into our lives,
在这个世界里,我们正越来越多地将机器人融入到我们生活中,
an instinct like that might actually have consequences, because the first thing that I discovered is that it's not just me.
像这样的本能可能会产生一些后果,因为我发现的第一件事情是,这并非只是发生在我身上的个例。
In 2007, the Washington Post reported that the United States military was testing this robot that defused land mines.
2007年,《华盛顿邮报》报道称,美国军方正在测试拆除地雷的机器人。
And the way it worked was it was shaped like a stick insect and it would walk around a minefield on its legs,
它的形状就像一只竹节虫,用腿在雷区上行走,
and every time it stepped on a mine, one of the legs would blow up,
每次踩到地雷时它的一条腿就会被炸掉,
and it would continue on the other legs to blow up more mines.
然后继续用其他腿去引爆更多的地雷。
And the colonel who was in charge of this testing exercise ends up calling it off,
负责这次测试的上校后来取消了这个测试,
because, he says, it's too inhumane to watch this damaged robot drag itself along the minefield.
因为他说,看着这个机器人拖着残破的身躯在雷区挣扎行走,实在太不人道了。
Now, what would cause a hardened military officer and someone like myself to have this response to robots?
那么,是什么让一位久经沙场的军官和像我这样的人对机器人产生这样的反应呢?
Well, of course, we're primed by science fiction and pop culture to really want to personify these things,
不可否认,我们都被科幻小说及流行文化所影响,想要将这些东西拟人化,
but it goes a little bit deeper than that.
但真实情况还有着更深层的含义。
It turns out that we're biologically hardwired to project intent and life onto any movement in our physical space that seems autonomous to us.
事实证明,我们在生理上天生就倾向于把意图和生命投射到物理空间中任何在我们看来能自主运动的物体上。
So people will treat all sorts of robots like they're alive. These bomb-disposal units get names.
所以人们像对待活物一样对待各种各样的机器人。这些拆弹机器人有自己的名字。
They get medals of honor. They've had funerals for them with gun salutes.
它们能获得荣誉勋章。人们为它们举行了葬礼,并用礼炮向它们致敬。
And research shows that we do this even with very simple household robots, like the Roomba vacuum cleaner.
研究还发现,我们即便对非常简单的家居机器人也会这样,比如Roomba吸尘器。
It's just a disc that roams around your floor to clean it,
它只是一个在你家地板上四处游走、进行清扫的圆盘,
but just the fact it's moving around on its own will cause people to name the Roomba
但仅仅因为它能够自己移动,就会导致人们想要给Roomba取名,
and feel bad for the Roomba when it gets stuck under the couch.
当它卡在沙发下时,还会替它感到难过。
And we can design robots specifically to evoke this response,
我们可以专门设计机器人来唤起这种反应,
using eyes and faces or movements that people automatically, subconsciously associate with states of mind.
利用眼睛、面孔或动作等人们会自动地、下意识地与心理状态联系起来的特征。
And there's an entire body of research called human-robot interaction that really shows how well this works.
有一个叫做“人机交互”的完整研究领域,充分展示了这种方法的效果有多好。
So for example, researchers at Stanford University found out
比如,在斯坦福大学的研究者发现,
that it makes people really uncomfortable when you ask them to touch a robot's private parts.
当你叫人们触摸机器人的私处时,他们会感到很不舒服。
So from this, but from many other studies, we know,
从这个以及更多其他研究中,
we know that people respond to the cues given to them by these lifelike machines, even if they know that they're not real.
我们知道人们会对这些栩栩如生的机器给他们的线索做出反应,即使他们知道它们只是机器。
Now, we're headed towards a world where robots are everywhere.
我们正迈入一个机器人无处不在的社会。
Robotic technology is moving out from behind factory walls. It's entering workplaces, households.
机器人科技正在走出工厂的围墙。它们正在进入工作场所,家居环境。
And as these machines that can sense and make autonomous decisions and learn enter into these shared spaces,
随着这些能够感知并自己做决定和学习的机器进入这些共享空间,
I think that maybe the best analogy we have for this is our relationship with animals.
我认为一个最好的类比就是我们和动物的关系。
Thousands of years ago, we started to domesticate animals, and we trained them for work and weaponry and companionship.
几千年前,我们开始驯养动物,训练它们用于劳作、作战和陪伴。
And throughout history, we've treated some animals like tools or like products,
在这个历史进程中,我们把有些动物当作工具或产品使用,
and other animals, we've treated with kindness and we've given a place in society as our companions.
对其它一些动物,我们则对它们很好,在社会中给予它们同伴的位置。
I think it's plausible we might start to integrate robots in similar ways.
我认为,我们很有可能会开始以类似的方式接纳机器人。
And sure, animals are alive. Robots are not.
当然,动物有生命。机器人没有。
And I can tell you, from working with roboticists, that we're pretty far away from developing robots that can feel anything.
而从与机器人专家共事的经历中,我可以告诉各位,我们距离开发出能有任何感受的机器人还非常遥远。
But we feel for them, and that matters, because if we're trying to integrate robots into these shared spaces,
但我们同情它们,这点很重要,因为如果我们尝试把机器人整合进这些共享空间,
we need to understand that people will treat them differently than other devices,
就需要懂得人们会把它们与其他设备区别对待,
and that in some cases, for example, the case of a soldier who becomes emotionally attached to the robot that they work with,
而且在有些场景下,比如一名士兵对与自己共事的机器人产生情感依恋的情况,
that can be anything from inefficient to dangerous.
这种依恋带来的影响,轻则降低效率,重则造成危险。

But in other cases, it can actually be useful to foster this emotional connection to robots.
但在其他场景下,培养与机器人的情感联系可能非常有用。
We're already seeing some great use cases,
我们已经看到了一些很好的使用场景,
for example, robots working with autistic children to engage them in ways that we haven't seen previously,
比如,机器人与自闭症儿童互动,能以我们此前从未见过的方式吸引他们参与,
or robots working with teachers to engage kids in learning with new results. And it's not just for kids.
或者机器人与老师配合,在吸引孩子投入学习方面取得新的成果。而且这不只适用于儿童。
Early studies show that robots can help doctors and patients in health care settings.
早期的研究发现机器人可以在医疗保健领域帮助医生和病人。
This is the PARO baby seal robot. It's used in nursing homes and with dementia patients.
这是帕罗婴儿海豹机器人。它被用于疗养院来陪伴老年痴呆症患者。
It's been around for a while. And I remember, years ago, being at a party and telling someone about this robot,
它已经面世有一阵子了。我记得若干年前,在一次聚会上跟别人讲起这个机器人时,
and her response was, "Oh my gosh. That's horrible. I can't believe we're giving people robots instead of human care."
她的反应是:“哦,天哪。太可怕了。我无法相信我们给人们的是机器人护理,而不是人类护理。”
And this is a really common response, and I think it's absolutely correct, because that would be terrible.
这是一种非常普遍的反应,而且我认为它完全正确,因为那确实会很糟糕。
But in this case, it's not what this robot replaces.
但在这个例子中,这台机器人替代的并不是人类护理。
What this robot replaces is animal therapy in contexts where we can't use real animals but we can use robots,
机器人替代的是动物疗法,这可以用在无法使用真正动物,但可以使用机器人的场合中,
because people will consistently treat them more like an animal than a device.
因为人们会把它们当成动物而不是设备看待。
Acknowledging this emotional connection to robots can also help us anticipate challenges
承认这种与机器人的情感联系也能帮助我们预见到挑战,
as these devices move into more intimate areas of people's lives.
因为这些设备正在走进人们生活中越来越私密的领域。
For example, is it OK if your child's teddy bear robot records private conversations?
比如,如果你孩子的玩具熊机器人会录下私人对话,这样可以吗?
Is it OK if your sex robot has compelling in-app purchases?
如果你的性爱机器人内置了让人难以抗拒的应用内购买,这样可以吗?
Because robots plus capitalism equals questions around consumer protection and privacy.
因为机器人加上资本主义,就等于一系列关于消费者保护和隐私的问题。
And those aren't the only reasons that our behavior around these machines could matter.
这些还不是我们对待这些机器人的行为之所以重要的唯一原因。
A few years after that first initial experience I had with this baby dinosaur robot,
在我第一次见到这只小恐龙机器人的几年后,
I did a workshop with my friend Hannes Gassert.
我和朋友汉内斯·加瑟特开展了一次研讨会。
And we took five of these baby dinosaur robots and we gave them to five teams of people.
我们拿了5个小恐龙机器人,把它们分给5队人。
And we had them name them and play with them and interact with them for about an hour.
我们让他们为它们取名,陪伴它们一起互动大约一个小时。
And then we unveiled a hammer and a hatchet and we told them to torture and kill the robots.
然后我们亮出一把锤子和一把短柄斧,让他们去折磨并杀死这些机器人。
And this turned out to be a little more dramatic than we expected it to be,
这个结果比我们想的要更有戏剧性,
because none of the participants would even so much as strike these baby dinosaur robots,
因为没有一个参与者愿意哪怕打一下这些小恐龙机器人,
so we had to improvise a little, and at some point, we said,
所以我们只好临场应变,于是在某个时刻,我们说,
"OK, you can save your team's robot if you destroy another team's robot."
“好吧,你可以保住你们队的机器人,但前提是把其它队的机器人毁掉。”
And even that didn't work. They couldn't do it.
即便这样也没用,他们不愿意去做。
So finally, we said, "We're going to destroy all of the robots unless someone takes a hatchet to one of them."
所以最后,我们说:“除非有人拿短柄斧砍向其中一个,否则我们就要毁掉所有的机器人。”
And this guy stood up, and he took the hatchet,
有个人站了起来,他拿起斧头,
and the whole room winced as he brought the hatchet down on the robot's neck,
当他把斧头砍向机器人的脖子时,整个房间的人都不由得畏缩了一下,
and there was this half-joking, half-serious moment of silence in the room for this fallen robot.
房间中出现了一个为这个倒下的机器人半玩笑半严肃的沉默时刻。
So that was a really interesting experience. Now, it wasn't a controlled study, obviously,
那真是一个有趣的体验。它不是一个对照实验,显然不是,
but it did lead to some later research that I did at MIT with Palash Nandy and Cynthia Breazeal,
但这引发了我后来在麻省理工跟帕拉什·南迪和辛西娅·布雷西亚尔做的研究,
where we had people come into the lab and smash these HEXBUGs that move around in a really lifelike way, like insects.
我们让来到实验室的人们砸碎HEXBUG电子甲虫,它们会像昆虫一样以非常逼真的方式四处移动。
So instead of choosing something cute that people are drawn to, we chose something more basic,
所以这一次我们没有选择那种惹人喜爱的可爱东西,而是选择了更基础的东西,
and what we found was that high-empathy people would hesitate more to hit the HEXBUGS.
我们发现富有同情心的人们在击碎这些机器昆虫时要更加犹豫。
Now this is just a little study, but it's part of a larger body of research
这只是一个小小的研究,但它是一个更大范围研究的一部分,
that is starting to indicate that there may be a connection between people's tendencies for empathy and their behavior around robots.
这开始表明人们的同情心与他们对待机器人的行为可能存在某种联系。
But my question for the coming era of human-robot interaction is not: "Do we empathize with robots?"
但我对即将到来的人机交互时代的问题并不是:“我们对机器人会产生同情心吗?”
It's: "Can robots change people's empathy?"
而是:“机器人会改变人类的同情心吗?”
Is there reason to, for example, prevent your child from kicking a robotic dog,
比如说,我们是否有理由阻止你的孩子去踢一只机器狗,
not just out of respect for property, but because the child might be more likely to kick a real dog?
不只是出于爱惜物品,而是因为这个孩子之后可能更容易去踢一只真正的狗?
And again, it's not just kids. This is the violent video games question,
同样的,这不只适用于儿童。这是一个关于暴力游戏的问题,
but it's on a completely new level because of this visceral physicality that we respond more intensely to than to images on a screen.
但它上升到了一个全新的层面,因为面对这种真切可感的实体,我们的反应要比面对屏幕上的图像强烈得多。
When we behave violently towards robots, specifically robots that are designed to mimic life,
当我们对机器人,对专门设计来模拟生命的机器人表现出暴力行径时,
is that a healthy outlet for violent behavior or is that training our cruelty muscles?
这是对暴力行为的一种健康宣泄,还是在锻炼我们施行残忍的“肌肉”?
We don't know ... But the answer to this question has the potential to impact human behavior,
我们还不知道...但这个问题的答案有可能影响人类行为,
it has the potential to impact social norms,
它有可能影响社会规范,
it has the potential to inspire rules around what we can and can't do with certain robots, similar to our animal cruelty laws.
可能会启发我们制定对特定机器人能做什么和不能做什么的规则,就类似于我们的动物虐待法。
Because even if robots can't feel, our behavior towards them might matter for us.
因为即便机器人不能感知,我们对待它们的行为也可能对我们有着重要意义。
And regardless of whether we end up changing our rules,
不管我们是否最终会改变我们的规则,
robots might be able to help us come to a new understanding of ourselves.
机器人也许能帮助我们对自己有一个全新的认识。
Most of what I've learned over the past 10 years has not been about technology at all.
我在过去10年中学到的经验大部分跟技术无关。
It's been about human psychology and empathy and how we relate to others.
而是关于人类心理学、同情心以及我们如何与他人相处。
Because when a child is kind to a Roomba, when a soldier tries to save a robot on the battlefield,
因为当一个儿童友好地对待Roomba时,当一个士兵试图拯救战场上的机器人时,
or when a group of people refuses to harm a robotic baby dinosaur, those robots aren't just motors and gears and algorithms.
或者当一群人拒绝伤害一只小恐龙机器人时,这些机器人就不只是马达、齿轮和算法。
They're reflections of our own humanity. Thank you.
它们映射出了我们的人性。谢谢。
