
Kriti Sharma (2018): How to keep human bias out of AI 如何让人工智能远离人类的偏见

How many decisions have been made about you today, or this week, or this year, by artificial intelligence? 今天、这周或者今年, 有多少关于你的决定 是由人工智能(AI)做出的?
I build AI for a living so, full disclosure , I'm kind of a nerd . 我靠创建AI为生, 所以,坦白说,我是个技术狂。
And because I'm kind of a nerd, whenever some new news story comes out about artificial intelligence stealing all our jobs, or robots getting citizenship of an actual country, 因为我算是个技术狂, 每当出现关于人工智能 要抢走我们的工作, 或者机器人获得了一个国家的 公民身份这样的新闻报道时,
I'm the person my friends and followers message freaking out about the future. 我就成了对未来感到 担忧的朋友和关注者 发消息的对象。
We see this everywhere. 这种事情随处可见。
This media panic that our robot overlords are taking over. 媒体担心机器人 正在接管人类的统治。
We could blame Hollywood for that. 我们可以为此谴责好莱坞。
But in reality, that's not the problem we should be focusing on. 但现实中,这不是 我们应该关注的问题。
There is a more pressing danger, a bigger risk with AI, that we need to fix first. 人工智能还有一个更紧迫的 危机, 一个更大的风险, 需要我们首先应对。
So we are back to this question: 所以我们再回到这个问题:
How many decisions have been made about you today by AI? 今天有多少关于你的决定 是由人工智能做出的?
And how many of these were based on your gender , your race or your background? 其中有多少决定 是基于你的性别,种族或者背景?
Algorithms are being used all the time to make decisions about who we are and what we want. 算法一直在被用来 判断我们是谁,我们想要什么。
Some of the women in this room will know what I'm talking about if you've been made to sit through those pregnancy test adverts on YouTube like 1,000 times. 在座的一些女性会知道 我在说什么, 如果你曾被迫在YouTube上 看完那些怀孕测试广告 差不多一千次,
Or you've scrolled past adverts of fertility clinics on your Facebook feed. 或者你在Facebook的动态消息中 刷到过生育诊所的广告。
Or in my case, Indian marriage bureaus. 或者像我遇到的情况, 印度婚姻介绍所。
(Laughter) (笑声)
But AI isn't just being used to make decisions about what products we want to buy or which show we want to binge watch next. 但人工智能不仅被用来决定 我们想要买什么产品, 或者我们接下来想刷哪部剧。
I wonder how you'd feel about someone who thought things like this: "A black or Latino person is less likely than a white person to pay off their loan on time." 我想知道你会怎么看这样想的人: “黑人或拉丁美洲人 比白人更不可能按时还贷。”
"A person called John makes a better programmer than a person called Mary." “名叫约翰的人编程能力 要比叫玛丽的人好。”
"A black man is more likely to be a repeat offender than a white man." “黑人比白人更有可能成为惯犯。”
You're probably thinking, "Wow, that sounds like a pretty sexist , racist person," right? 你可能在想, “哇,这听起来像是一个有严重 性别歧视和种族歧视的人。” 对吧?
These are some real decisions that AI has made very recently , based on the biases it has learned from us, from the humans. 这些都是人工智能 近期做出的真实决定, 基于它从我们人类身上 学习到的偏见。
AI is being used to help decide whether or not you get that job interview; how much you pay for your car insurance; how good your credit score is; and even what rating you get in your annual performance review. 人工智能被用来帮助决定 你是否能够得到面试机会; 你应该为车险支付多少费用; 你的信用分数有多好; 甚至你在年度绩效评估中 应该得到怎样的评分。
But these decisions are all being filtered through its assumptions about our identity , our race, our gender, our age. 但这些决定都是 通过它对我们的身份、 种族、性别和年龄的 假设过滤出来的。
How is that happening? 为什么会这样?
Now, imagine an AI is helping a hiring manager find the next tech leader in the company. 想象一下人工智能 正在帮助一个人事主管 寻找公司下一位科技领袖。
So far, the manager has been hiring mostly men. 目前为止,主管雇佣的大部分是男性。
So the AI learns men are more likely to be programmers than women. 所以人工智能学到: 男性比女性更有可能成为程序员,
And it's a very short leap from there to: men make better programmers than women. 也就更容易做出这样的判断: 男人比女人更擅长编程。
We have reinforced our own bias into the AI. 我们通过人工智能强化了自己的偏见。
And now, it's screening out female candidates. 现在,它正在筛选掉女性候选人。
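The feedback loop described above can be sketched in a few lines. This is a purely hypothetical illustration (the data, hire rates, and threshold are invented, not from any real hiring system) of how a model trained on a biased hiring history turns that bias into a screening rule:

```python
# Hypothetical sketch: a naive "AI" that learns hiring bias from history.
from collections import Counter

# Invented historical records: the manager mostly hired men.
history = ([("male", "hired")] * 80 + [("male", "rejected")] * 20
           + [("female", "hired")] * 5 + [("female", "rejected")] * 15)

def hire_rate(records, gender):
    """Estimate P(hired | gender) from the historical records."""
    outcomes = [label for g, label in records if g == gender]
    return Counter(outcomes)["hired"] / len(outcomes)

def screen(candidate_gender, threshold=0.5):
    """Screen a candidate by the historical hire rate for their gender."""
    rate = hire_rate(history, candidate_gender)
    return "interview" if rate >= threshold else "reject"

print(screen("male"))    # -> interview (historical rate 0.80)
print(screen("female"))  # -> reject (historical rate 0.25): the bias in
                         #    the data has become the rule
```

The model never sees ability, only the gender-outcome correlation in its training data, which is exactly the short leap from "men were hired more" to "screen out women."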
Hang on, if a human hiring manager did that, we'd be outraged , we wouldn't allow it. 等等,如果人类招聘主管这样做, 我们会很愤怒, 不允许这样的事情发生。
This kind of gender discrimination is not OK. 这种性别偏见让人难以接受。
And yet somehow, AI has become above the law, because a machine made the decision. 然而不知为何, 人工智能却已凌驾于法律之上, 因为是机器做出的决定。
That's not it. 这还没完。
We are also reinforcing our bias in how we interact with AI. 我们也在通过与人工智能 互动的方式强化自己的偏见。
How often do you use a voice assistant like Siri , Alexa or even Cortana? 你们使用Siri,Alexa或者Cortana 这样的语音助手有多频繁?
They all have two things in common: one, they can never get my name right, and second, they are all female. 它们有两点是相同的: 第一点,它们总是搞错我的名字, 第二点,它们都有女性特征。
They are designed to be our obedient servants , turning your lights on and off, ordering your shopping. 它们都被设计成顺从我们的仆人, 开灯关灯,下单购买商品。
You get male AIs too, but they tend to be more high-powered , like IBM Watson, making business decisions, 也有男性的人工智能, 但他们倾向于拥有更高的权力, 比如IBM的Watson可以做出商业决定,
Salesforce Einstein or ROSS, the robot lawyer. 还有Salesforce的Einstein 或者ROSS, 是机器人律师。
So poor robots, even they suffer from sexism in the workplace . 所以即便是机器人也没能 逃脱工作中的性别歧视。
(Laughter) (笑声)
Think about how these two things combine and affect a kid growing up in today's world around AI. 想想这两者如何结合在一起, 又会影响一个在当今人工智能 世界中长大的孩子。
So they're doing some research for a school project and they Google images of CEO. 比如他们正在为学校的 一个项目做一些研究, 他们在谷歌上搜索了CEO的照片。
The algorithm shows them results of mostly men. 算法向他们展示的大部分是男性。
And now, they Google personal assistant . 他们又搜索了个人助手。
As you can guess, it shows them mostly females. 你可以猜到,它显示的大部分是女性。
And then they want to put on some music, and maybe order some food, and now, they are barking orders at an obedient female voice assistant. 然后他们想放点音乐, 也许想点些吃的, 而现在,他们正对着一位 顺从的女声助手发号施令。
Some of our brightest minds are creating this technology today. 我们中一些最聪明的人 创建了今天的这个技术。
Technology that they could have created in any way they wanted. 他们可以用任何他们 想要的方式创造技术。
And yet, they have chosen to create it in the style of a 1950s "Mad Men" secretary. 然而,他们却选择了上世纪50年代 《广告狂人》式的秘书风格。
Yay! 好极了!
But OK, don't worry, this is not going to end with me telling you that we are all heading towards sexist, racist machines running the world. 但好在,不用担心, 这场演讲不会以我告诉大家 我们都在走向由性别歧视、 种族歧视的机器统治的 世界而结束。
The good news about AI is that it is entirely within our control. 人工智能的好处是, 一切都在我们的控制中。
We get to teach the right values, the right ethics to AI. 我们得告诉人工智能 正确的价值观,道德观。
So there are three things we can do. 所以有三件事我们可以做。
One, we can be aware of our own biases and the bias in machines around us. 第一,我们能够意识到自己的偏见 和我们身边机器的偏见。
Two, we can make sure that diverse teams are building this technology. 第二,我们可以确保打造 这个技术的是背景多样的团队。
And three, we have to give it diverse experiences to learn from. 第三,我们必须让它 从丰富的经验中学习。
I can talk about the first two from personal experience. 我可以从我个人的经验来说明前两点。
When you work in technology and you don't look like a Mark Zuckerberg or Elon Musk, your life is a little bit difficult, your ability gets questioned. 当你在科技行业工作, 而长得不像马克·扎克伯格 或埃隆·马斯克时, 你的生活会有点困难, 你的能力会受到质疑。
Here's just one example. 这只是一个例子。
Like most developers , I often join online tech forums and share my knowledge to help others. 跟大部分开发者一样, 我经常参加在线科技论坛, 分享我的知识帮助别人。
And I've found, when I log on as myself, with my own photo, my own name, 我发现, 当我用自己的照片、 自己的名字登录时,
I tend to get questions or comments like this: "What makes you think you're qualified to talk about AI?" 我倾向于得到这样的问题或评论: “你为什么觉得自己 有资格谈论人工智能?”
"What makes you think you know about machine learning?" “你为什么觉得你了解机器学习?”
So, as you do, I made a new profile , and this time, instead of my own picture, I chose a cat with a jet pack on it. 所以,我创建了新的资料页, 这次,我没有选择自己的照片, 而是选择了一只带着喷气背包的猫。
And I chose a name that did not reveal my gender. 并选择了一个无法体现我性别的名字。
You can probably guess where this is going, right? 你能够大概猜到会怎么样,对吧?
So, this time, I didn't get any of those patronizing comments about my ability and I was able to actually get some work done. 于是这次,我不再收到 任何居高临下的评论, 我能够专心把工作做完。
And it sucks , guys. 这感觉太糟糕了,伙计们。
I've been building robots since I was 15, 我从15岁起就在构建机器人,
I have a few degrees in computer science , and yet, I had to hide my gender in order for my work to be taken seriously. 我有计算机科学领域的几个学位, 然而,我不得不隐藏我的性别 以让我的工作被严肃对待。
So, what's going on here? 这是怎么回事呢?
Are men just better at technology than women? 男性在科技领域就是强于女性吗?
Another study found that when women coders on one platform hid their gender, like myself, their code was accepted four percent more than men. 另一个研究发现, 当女性程序员在平台上 隐藏性别时,像我这样, 她们的代码被接受的 比例比男性高4%。
So this is not about the talent. 所以这跟能力无关。
This is about an elitism in AI that says a programmer needs to look like a certain person. 这是人工智能领域的精英主义, 即程序员看起来得像 具备某个特征的人。
What we really need to do to make AI better is bring people from all kinds of backgrounds. 让人工智能变得更好, 我们需要切实的 把来自不同背景的人集合到一起。
We need people who can write and tell stories to help us create personalities of AI. 我们需要能够书写和讲故事的人 来帮助我们创建人工智能更好的个性。
We need people who can solve problems. 我们需要能够解决问题的人。
We need people who face different challenges and we need people who can tell us what are the real issues that need fixing and help us find ways that technology can actually fix it. 我们需要能应对不同挑战的人, 我们需要有人告诉我们什么是 真正需要解决的问题, 帮助我们找到用技术 解决问题的方法。
Because, when people from diverse backgrounds come together, when we build things in the right way, the possibilities are limitless . 因为,当不同背景的人走到一起时, 当我们以正确的方式做事情时, 就有无限的可能。
And that's what I want to end by talking to you about. 这就是我最后想和你们讨论的。
Less racist robots, less machines that are going to take our jobs -- and more about what technology can actually achieve. 减少种族歧视的机器人, 减少夺走我们工作的机器—— 更多专注于技术究竟能实现什么。
So, yes, some of the energy in the world of AI, in the world of technology is going to be about what ads you see on your stream. 是的,人工智能世界中, 科技世界中的一些能量 是关于你在流媒体中看到的广告。
But a lot of it is going towards making the world so much better. 但更多是朝着让世界更美好的方向前进。
Think about a pregnant woman in the Democratic Republic of Congo, who has to walk 17 hours to her nearest rural prenatal clinic to get a checkup . 想想刚果民主共和国的一位孕妇, 需要走17小时才能 到最近的农村产前诊所 进行产检。
What if she could get diagnosis on her phone, instead? 如果她在手机上 就能得到诊断会怎样呢?
Or think about what AI could do for those one in three women in South Africa who face domestic violence . 或者想象一下人工智能 能为1/3面临家庭暴力的 南非女性做什么。
If it wasn't safe to talk out loud, they could get an AI service to raise alarm, get financial and legal advice. 如果大声说出来不安全的话, 她们可以通过一个 人工智能服务来报警, 获得财务和法律咨询。
These are all real examples of projects that people, including myself, are working on right now, using AI. 这些都是包括我在内, 正在使用人工智能的人 所做的项目中的真实案例。
So, I'm sure in the next couple of days there will be yet another news story about the existential risk, robots taking over and coming for your jobs. 我确信在未来几天里, 又会出现另一条新闻, 讲述存在性风险, 说机器人将接管一切, 抢走你们的工作。
(Laughter) (笑声)
And when something like that happens, 当这样的事情发生时,
I know I'll get the same messages worrying about the future. 我知道我会收到 同样对未来表示担忧的信息。
But I feel incredibly positive about this technology. 但我对这个技术极为乐观。
This is our chance to remake the world into a much more equal place. 这是我们重新让世界 变得更平等的机会。
But to do that, we need to build it the right way from the get-go. 但要做到这一点,我们需要 从一开始就以正确的方式构建它。
We need people of different genders , races, sexualities and backgrounds. 我们需要不同性别,种族, 性取向和背景的人。
We need women to be the makers and not just the machines who do the makers' bidding. 我们需要女性成为创造者, 而不仅仅是听从创造者命令的机器。
We need to think very carefully what we teach machines, what data we give them, so they don't just repeat our own past mistakes. 我们需要仔细思考 我们教给机器的东西, 我们给它们什么数据, 这样它们就不会 只是重复我们过去的错误。
So I hope I leave you thinking about two things. 所以我希望我留给你们两个思考。
First, I hope you leave thinking about bias today. 首先,我希望你们离开时 思考当今社会中的偏见。
And that the next time you scroll past an advert that assumes you are interested in fertility clinics or online betting websites, that you think and remember that the same technology is assuming that a black man will reoffend. 下次当你刷到一条认定你 对生育诊所或网络博彩网站 感兴趣的广告时, 请想一想并记住: 同样的技术也在认定 黑人会再次犯罪。
Or that a woman is more likely to be a personal assistant than a CEO. 或者女性更可能成为 个人助理而非CEO。
And I hope that reminds you that we need to do something about it. 我希望那会提醒你, 我们需要对此有所行动。
And second, 第二,
I hope you think about the fact that you don't need to look a certain way or have a certain background in engineering or technology to create AI, which is going to be a phenomenal force for our future. 我希望你们思考这样一个事实: 你不需要长成特定的样子, 也不需要拥有特定的 工程或技术背景, 就能去创造人工智能, 而人工智能将成为 我们未来的一股非凡力量。
You don't need to look like a Mark Zuckerberg, you can look like me. 你不需要看起来像马克·扎克伯格, 你可以看起来像我。
And it is up to all of us in this room to convince the governments and the corporations to build AI technology for everyone, including the edge cases. 我们这个房间里的所有人都有责任 去说服政府和公司 为每个人创建人工智能技术, 包括边缘的情况。
And for us all to get education about this phenomenal technology in the future. 让我们所有人都能 在未来接受有关这项非凡技术的教育。
Because if we do that, then we've only just scratched the surface of what we can achieve with AI. 因为如果我们那样做了, 才刚刚打开了人工智能世界的大门。
Thank you. 谢谢。
(Applause) (鼓掌)