如何让人工智能远离人类的偏见
日期:2019-06-19 13:52



How many decisions have been made about you today, or this week or this year, by artificial intelligence?
今天、这周或者今年,有多少关于你的决定是由人工智能(AI)做出的?
I build AI for a living so, full disclosure, I'm kind of a nerd.
我靠创建AI为生,所以坦白说,我是个技术狂。
And because I'm kind of a nerd, whenever some new news story comes out about artificial intelligence stealing all our jobs,
因为我算是个技术狂,每当有关于人工智能要抢走我们的工作这样的新闻报道出来,
or robots getting citizenship of an actual country,
或者机器人获得了一个国家的公民身份时,
I'm the person my friends and followers message freaking out about the future.
我就成了对未来感到担忧的朋友和关注者发消息的对象。
We see this everywhere. This media panic that our robot overlords are taking over.
这种事情随处可见。媒体一片恐慌,说机器人霸主就要接管世界了。
We could blame Hollywood for that. But in reality, that's not the problem we should be focusing on.
我们可以为此谴责好莱坞。但现实中,这不是我们应该关注的问题。
There is a more pressing danger, a bigger risk with AI, that we need to fix first.
人工智能还有一个更紧迫的危机,一个更大的风险,需要我们首先应对。
So we are back to this question: How many decisions have been made about you today by AI?
所以我们再回到这个问题:今天有多少关于你的决定是由人工智能做出的?
And how many of these were based on your gender, your race or your background?
其中有多少决定是基于你的性别,种族或者背景?
Algorithms are being used all the time to make decisions about who we are and what we want.
算法一直在被用来判断我们是谁,我们想要什么。
Some of the women in this room will know what I'm talking about
在座的人里有些女性知道我在说什么,
if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times.
如果你曾被迫把YouTube上那些验孕广告看了上千遍的话。
Or you've scrolled past adverts of fertility clinics on your Facebook feed. Or in my case, Indian marriage bureaus.
或者你在Facebook的信息流里刷到过生育诊所的广告。或者像我遇到的情况,是印度婚姻介绍所的广告。
But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next.
但人工智能不仅被用来决定我们想要买什么产品,或者我们接下来想刷哪部剧。
I wonder how you'd feel about someone who thought things like this:
我想知道你会怎么看这样想的人:
"A black or Latino person is less likely than a white person to pay off their loan on time."
“黑人或拉丁美洲人比白人更不可能按时还贷。”
"A person called John makes a better programmer than a person called Mary."
“名叫约翰的人编程能力要比叫玛丽的人好。”
"A black man is more likely to be a repeat offender than a white man."
“黑人比白人更有可能成为惯犯。”
You're probably thinking, "Wow, that sounds like a pretty sexist, racist person," right?
你可能在想,“哇,这听起来像是一个有严重性别歧视和种族歧视的人。”对吧?
These are some real decisions that AI has made very recently, based on the biases it has learned from us, from the humans.
这些都是人工智能近期做出的真实决定,基于它从我们人类身上学习到的偏见。
AI is being used to help decide whether or not you get that job interview;
人工智能被用来帮助决定你是否能够得到面试机会;
how much you pay for your car insurance; how good your credit score is;
你应该为车险支付多少费用;你的信用分数有多好;
and even what rating you get in your annual performance review.
甚至你在年度绩效评估中应该得到怎样的评分。
But these decisions are all being filtered through its assumptions about our identity, our race, our gender, our age.
但这些决定都是通过它对我们的身份、种族、性别和年龄的假设过滤出来的。
How is that happening?
这是怎么发生的呢?
Now, imagine an AI is helping a hiring manager find the next tech leader in the company.
想象一下人工智能正在帮助一个人事主管寻找公司下一位科技领袖。
So far, the manager has been hiring mostly men.
目前为止,主管雇佣的大部分是男性。
So the AI learns men are more likely to be programmers than women.
于是人工智能学到:男性比女性更有可能成为程序员。
And it's a very short leap from there to: men make better programmers than women.
也就更容易做出这样的判断:男人比女人更擅长编程。
We have reinforced our own bias into the AI. And now, it's screening out female candidates.
我们把自己的偏见灌输进了人工智能并加以强化。现在,它正在筛除女性候选人。
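To make the mechanism in this hiring example concrete, here is a minimal sketch, not from the talk, of how a model trained on skewed hiring records can pick up gender as a predictive signal. It assumes scikit-learn and NumPy, and the data and feature names (years_of_experience, is_male) are entirely made up for illustration.

# A minimal sketch (not from the talk): a classifier trained on hypothetical,
# skewed hiring records learns to treat gender as a predictive signal.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Features: [years_of_experience, is_male]; label: 1 = hired in the past.
# The history is biased: equally experienced women were mostly not hired.
X = np.array([[5, 1], [6, 1], [4, 1], [7, 1], [3, 1],
              [5, 0], [6, 0], [4, 0], [7, 0], [3, 0]])
y = np.array([1, 1, 1, 1, 0,
              0, 0, 1, 0, 0])

model = LogisticRegression().fit(X, y)

# Two candidates with identical experience, differing only in gender:
print(model.predict_proba([[5, 1]])[0][1])  # man: higher "hire" probability
print(model.predict_proba([[5, 0]])[0][1])  # woman: lower, purely because of biased history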
Hang on, if a human hiring manager did that, we'd be outraged, we wouldn't allow it.
等等,如果人类招聘主管这样做,我们会很愤怒,不允许这样的事情发生。
This kind of gender discrimination is not OK.
这种性别歧视是不可接受的。
And yet somehow, AI has become above the law, because a machine made the decision. That's not it.
然而不知为何,人工智能却凌驾于法律之上,因为决定是机器做出的。这还没完。
We are also reinforcing our bias in how we interact with AI.
我们在与人工智能互动的方式中,也在强化自己的偏见。
How often do you use a voice assistant like Siri, Alexa or even Cortana?
你们使用Siri,Alexa或者Cortana这样的语音助手有多频繁?
They all have two things in common: one, they can never get my name right, and second, they are all female.
它们有两点是相同的:第一点,它们总是搞错我的名字,第二点,它们都有女性特征。
They are designed to be our obedient servants, turning your lights on and off, ordering your shopping.
它们都被设计成顺从我们的仆人,开灯关灯,下单购买商品。
You get male AIs too, but they tend to be more high-powered,
也有男性化的人工智能,但它们往往承担着更重要的工作,
like IBM Watson, making business decisions, Salesforce Einstein or ROSS, the robot lawyer.
比如做商业决策的IBM Watson、Salesforce的Einstein,或者机器人律师ROSS。
So poor robots, even they suffer from sexism in the workplace.
可怜的机器人,连它们自己都逃不过职场上的性别歧视。
Think about how these two things combine and affect a kid growing up in today's world around AI.
想想这两者如何结合在一起,又会影响一个在当今人工智能世界中长大的孩子。
So they're doing some research for a school project and they Google images of CEO.
比如他们正在为学校的一个项目做一些研究,他们在谷歌上搜索了CEO的照片。
The algorithm shows them results of mostly men.
算法向他们展示的大部分是男性。
And now, they Google personal assistant.
他们又搜索了个人助手。
As you can guess, it shows them mostly females.
你可以猜到,它显示的大部分是女性。
And then they want to put on some music, and maybe order some food,
然后他们想放点音乐,也许想点些吃的,
and now, they are barking orders at an obedient female voice assistant.
而现在,他们正对着一位顺从的女声助手发号施令。
Some of our brightest minds are creating this technology today.
当今,我们当中一些最聪明的人正在创造这项技术。
Technology that they could have created in any way they wanted.
这项技术本可以按照他们想要的任何方式来打造。
And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. Yay!
然而,他们却选择把它打造成上世纪50年代《广告狂人》里秘书的样子。好棒哦!
But OK, don't worry, this is not going to end with me telling you
不过别担心,这场演讲的结尾不会是我告诉你们,
that we are all heading towards sexist, racist machines running the world.
我们都在走向一个由性别歧视、种族歧视的机器统治的世界。
The good news about AI is that it is entirely within our control.
关于人工智能的好消息是,它完全在我们的掌控之中。
We get to teach the right values, the right ethics to AI. So there are three things we can do.
我们有机会把正确的价值观和道德观教给人工智能。所以有三件事我们可以做。
One, we can be aware of our own biases and the bias in machines around us.
第一,我们能够意识到自己的偏见和我们身边机器的偏见。
Two, we can make sure that diverse teams are building this technology.
第二,我们可以确保打造这个技术的是背景多样的团队。
And three, we have to give it diverse experiences to learn from.
第三,我们必须给它提供多样化的经验去学习。
I can talk about the first two from personal experience.
我可以从我个人的经验来说明前两点。
When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk,
当你在科技行业工作,而长得又不像马克·扎克伯格或埃隆·马斯克时,
your life is a little bit difficult, your ability gets questioned.
你的日子就会有点难过,你的能力会受到质疑。
Here's just one example. Like most developers, I often join online tech forums and share my knowledge to help others.
这只是一个例子。跟大部分开发者一样,我经常参加在线科技论坛,分享我的知识帮助别人。
And I've found, when I log on as myself, with my own photo, my own name,
我发现,当我用自己的照片、自己的名字登录时,
I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?"
我往往会收到这样的问题或评论:“你为什么觉得自己有资格谈论人工智能?”
"What makes you think you know about machine learning?"
“你为什么觉得你了解机器学习?”


So, as you do, I made a new profile, and this time, instead of my own picture, I chose a cat with a jet pack on it.
所以,我创建了新的资料页,这次,我没有选择自己的照片,而是选择了一只带着喷气背包的猫。
And I chose a name that did not reveal my gender.
我选择了一个无法体现我性别的名字。
You can probably guess where this is going, right?
你能够大概猜到会怎么样,对吧?
So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done.
于是这一次,我没有再收到任何针对我能力的居高临下的评论,终于能真正把一些工作做完了。
And it sucks, guys. I've been building robots since I was 15, I have a few degrees in computer science,
这感觉太糟糕了,伙计们。我从15岁起就在构建机器人,我有计算机科学领域的几个学位,
and yet, I had to hide my gender in order for my work to be taken seriously.
然而,我不得不隐藏我的性别以让我的工作被严肃对待。
So, what's going on here? Are men just better at technology than women?
这是怎么回事呢?男性在科技领域就是强于女性吗?
Another study found that when women coders on one platform hid their gender,
另一个研究发现,当女性程序员在平台上隐藏性别时,
like myself, their code was accepted four percent more than men. So this is not about the talent.
像我这样,她们的代码被接受的比例比男性高4%。所以这跟能力无关。
This is about an elitism in AI that says a programmer needs to look like a certain person.
这是人工智能领域的一种精英主义,认为程序员就该长成某种特定的样子。
What we really need to do to make AI better is bring people from all kinds of backgrounds.
要让人工智能变得更好,我们真正需要做的,是把来自各种背景的人聚集到一起。
We need people who can write and tell stories to help us create personalities of AI.
我们需要会写作、会讲故事的人,来帮助我们塑造人工智能的个性。
We need people who can solve problems.
我们需要能够解决问题的人。
We need people who face different challenges and we need people who can tell us what are the real issues that need fixing
我们需要能应对不同挑战的人,我们需要有人告诉我们什么是真正需要解决的问题,
and help us find ways that technology can actually fix it.
帮助我们找到用技术解决问题的方法。
Because, when people from diverse backgrounds come together,
因为,当不同背景的人走到一起时,
when we build things in the right way, the possibilities are limitless.
当我们以正确的方式做事情时,就有无限的可能。
And that's what I want to end by talking to you about.
这就是我最后想和你们讨论的。
Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve.
少一些关于种族歧视机器人的讨论,少一些关于机器要抢走我们工作的讨论,多关注技术究竟能实现什么。
So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream.
是的,人工智能领域和科技领域的一部分精力,确实会花在你信息流里看到的广告上。
But a lot of it is going towards making the world so much better.
但更多是朝着让世界更美好的方向前进。
Think about a pregnant woman in the Democratic Republic of Congo,
想想刚果民主共和国的一位孕妇,
who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup.
需要走17小时才能到最近的农村产前诊所进行产检。
What if she could get diagnosis on her phone, instead?
如果她在手机上就能得到诊断会怎样呢?
Or think about what AI could do for those one in three women in South Africa who face domestic violence.
或者想象一下人工智能能为1/3面临家庭暴力的南非女性做什么。
If it wasn't safe to talk out loud, they could get an AI service to raise alarm, get financial and legal advice.
如果大声说出来不安全的话,她们可以通过一个人工智能服务来报警,获得财务和法律咨询。
These are all real examples of projects that people, including myself, are working on right now, using AI.
这些都是包括我在内的许多人目前正在用人工智能推进的真实项目。
So, I'm sure in the next couple of days there will be yet another news story about the existential risk,
我敢肯定,在未来几天里,又会出现一则关于生存性风险的新闻,
robots taking over and coming for your jobs.
说机器人要接管一切,抢走你们的工作。
And when something like that happens, I know I'll get the same messages worrying about the future.
当这样的事情发生时,我知道我会收到同样对未来表示担忧的信息。
But I feel incredibly positive about this technology.
但我对这个技术极为乐观。
This is our chance to remake the world into a much more equal place.
这是我们重新让世界变得更平等的机会。
But to do that, we need to build it the right way from the get go.
但要做到这一点,我们需要在一开始就以正确的方式构建它。
We need people of different genders, races, sexualities and backgrounds.
我们需要不同性别,种族,性取向和背景的人。
We need women to be the makers and not just the machines who do the makers' bidding.
我们需要女性成为创造者,而不仅仅是听从创造者命令的机器。
We need to think very carefully what we teach machines,
我们需要仔细思考我们教给机器的东西,
what data we give them, so they don't just repeat our own past mistakes.
我们给它们什么数据,这样它们就不会只是重复我们过去的错误。
So I hope I leave you thinking about two things.
所以我希望我留给你们两个思考。
First, I hope you leave thinking about bias today.
首先,我希望你们离开时能思考当今存在的偏见。
And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites,
下次当你刷到一条认定你对生育诊所或者网络博彩感兴趣的广告时,
that you think and remember that the same technology is assuming that a black man will reoffend.
希望你能想到并记住:同样的技术也在假定一名黑人男性会再次犯罪。
Or that a woman is more likely to be a personal assistant than a CEO.
或者女性更可能成为个人助理而非CEO。
And I hope that reminds you that we need to do something about it.
我希望那会提醒你,我们需要对此有所行动。
And second, I hope you think about the fact that you don't need to look a certain way
第二,我希望你们记住这样一个事实:你不需要长成某种特定的样子,
or have a certain background in engineering or technology to create AI,
也不需要有一定的工程或技术背景去创建人工智能,
which is going to be a phenomenal force for our future.
人工智能将成为我们未来的一股非凡力量。
You don't need to look like a Mark Zuckerberg, you can look like me.
你不需要看起来像马克·扎克伯格,你可以看起来像我。
And it is up to all of us in this room to convince the governments and the corporations
我们这个房间里的所有人都有责任去说服政府和公司,
to build AI technology for everyone, including the edge cases.
为每个人创建人工智能技术,包括边缘的情况。
And for us all to get education about this phenomenal technology in the future.
让我们所有人都能在未来接受有关这项非凡技术的教育。
Because if we do that, then we've only just scratched the surface of what we can achieve with AI. Thank you.
因为如果我们做到了这一点,那么我们对人工智能所能实现的一切,也才刚刚触及皮毛而已。谢谢。

重点单词
  • obedient adj. 服从的,顺从的
  • convince vt. 使确信,使信服,说服
  • stream n. (人,车,气)流,水流,组 v. 流动,流出,飘动
  • violence n. 暴力,猛烈,强暴,暴行
  • pregnancy n. 怀孕
  • heading n. 标题,题目,航向 动词head的现在分词
  • intelligence n. 理解力,智力 n. 情报,情报工作,情报机关
  • reveal vt. 显示,透露 n. (外墙与门或窗之间的)窗侧,门侧
  • control n. 克制,控制,管制,操作装置 vt. 控制,掌管,支配
  • achieve v. 完成,达到,实现