
Zeynep Tufekci (2016): Machine intelligence makes human morals more important 机器智能时代,坚守人类道德更重要

So, I started my first job as a computer programmer in my very first year of college — basically, as a teenager. 我的第一份工作是程序员, 那是在我刚上大学的时候, 不到二十岁。
Soon after I started working, writing software in a company, a manager who worked at the company came down to where I was, and he whispered to me, "Can he tell if I'm lying?" 我刚开始工作不久, 正当在公司写程序, 公司的一位经理来到我旁边, 他悄悄的对我说, “他能看出来我在撒谎吗?”
There was nobody else in the room. 当时屋子里没有别人。
'"Can who tell if you're lying? And why are we whispering?" “你是指谁能看出你在撒谎?还有,我们干嘛要悄悄地说话?”
The manager pointed at the computer in the room. 那个经理指着屋子里的电脑,说:
'"Can he tell if I'm lying?" “他能看出我在撒谎吗?”
Well, that manager was having an affair with the receptionist. 其实,那个经理和前台有一腿。
(Laughter) (笑声)
And I was still a teenager. 当时我只有十来岁,
So I whisper-shouted back to him, "Yes, the computer can tell if you're lying." 我低声地回答他, “是的,电脑什么都知道。”
(Laughter) (笑声)
Well, I laughed, but actually, the laugh's on me. 我笑了,但其实我是在笑自己,
Nowadays, there are computational systems that can suss out emotional states and even lying from processing human faces. 现在,计算机系统已经可以 通过分析人脸来辨别人的情绪, 甚至包括是否在撒谎。
Advertisers and even governments are very interested. 广告商,甚至政府都对此很感兴趣。
I had become a computer programmer because I was one of those kids crazy about math and science. 我选择成为电脑程序员, 因为我是那种痴迷于数学和科学的孩子。
But somewhere along the line I'd learned about nuclear weapons, and I'd gotten really concerned with the ethics of science. 但其间我了解到了核武器, 并开始非常关注科学伦理。
I was troubled. 我为此感到很不安。
However, because of family circumstances , 但是,因为家庭原因,
I also needed to start working as soon as possible. 我需要尽快参加工作。
So I thought to myself, hey, let me pick a technical field where I can get a job easily and where I don't have to deal with any troublesome questions of ethics. 我对自己说,嘿,选一个容易找工作 的科技领域吧, 并且找个不需要操心伦理问题的。
So I picked computers. 所以我选了计算机。
(Laughter) (笑声)
Well, ha, ha, ha! All the laughs are on me. 哈哈哈,我多可笑。
Nowadays, computer scientists are building platforms that control what a billion people see every day. 如今,计算机科学家构建的平台, 控制着十亿人每天看到的信息,
They're developing cars that could decide who to run over. 他们在开发能够决定撞向谁的汽车,
They're even building machines, weapons, that might kill human beings in war. 他们甚至在制造可能在战争中杀人的机器和武器。
It's ethics all the way down. 说到底,都是伦理问题。
Machine intelligence is here. 机器智能来了。
We're now using computation to make all sorts of decisions, but also new kinds of decisions. 我们用计算机来做各种决策, 包括人们面临的新决策。
We're asking questions to computation that have no single right answers, that are subjective and open-ended and value-laden. 我们向计算机询问的问题没有唯一正确的答案, 它们是主观的、 开放性的、充满价值判断的。
We're asking questions like, "Who should the company hire?" 我们会问, “我们公司应该聘请谁?”
'"Which update from which friend should you be shown?" “你该关注哪个朋友的哪条状态?”
'"Which convict is more likely to reoffend ?" “哪种犯罪更容易再犯?”
'"Which news item or movie should be recommended to people?" “应该给人们推荐哪条新闻或是电影?”
Look, yes, we've been using computers for a while, but this is different. 看,是的,我们使用计算机已经有一段时间了, 但现在不一样了。
This is a historical twist, because we cannot anchor computation for such subjective decisions the way we can anchor computation for flying airplanes, building bridges, going to the moon. 这是历史性的转折, 因为我们在这些主观决策上无法主导计算机, 不像我们在管理飞机、建造桥梁、登月等问题上,可以主导它们。
Are airplanes safer? Did the bridge sway and fall? 飞机会更安全吗?桥梁会摇晃或倒塌吗?
There, we have agreed-upon, fairly clear benchmarks, and we have laws of nature to guide us. 在这些问题上,我们有统一而清晰的判断标准, 我们有自然定律来指导。
We have no such anchors and benchmarks for decisions in messy human affairs. 但是在复杂的人类事务上, 我们没有这样的客观标准。
To make things more complicated, our software is getting more powerful, but it's also getting less transparent and more complex. 让问题变得更复杂的,是我们的软件正越来越强大, 同时也变得更加不透明,更加复杂。
Recently, in the past decade, complex algorithms have made great strides. 最近十年间, 复杂算法已取得了长足发展,
They can recognize human faces. 它们可以识别人脸,
They can decipher handwriting. 它们可以破解笔迹,
They can detect credit card fraud and block spam and they can translate between languages. 它们可以识别信用卡欺诈, 可以屏蔽垃圾信息, 它们可以翻译语言,
They can detect tumors in medical imaging. 它们可以通过医学图像识别肿瘤,
They can beat humans in chess and Go. 它们可以在国际象棋和围棋上击败人类。
Much of this progress comes from a method called "machine learning." 类似的很多发展,都来自一种叫“机器学习”的方法。
Machine learning is different than traditional programming, where you give the computer detailed, exact, painstaking instructions. 机器学习不像传统程序一样, 需要给计算机详细、准确的逐条指令。
It's more like you take the system and you feed it lots of data, including unstructured data, like the kind we generate in our digital lives. 它更像是你给系统喂了很多数据, 包括非结构化数据, 比如我们在数字生活中产生的数据。
And the system learns by churning through this data. 系统扎进这些数据中学习,
And also, crucially, these systems don't operate under a single-answer logic. 重要的是, 这些系统不再局限单一答案。
They don't produce a simple answer; it's more probabilistic: "This one is probably more like what you're looking for." 他们得出的不是一个简单的答案,而是概率性的: “这个更像是你在寻找的。”
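To make the contrast concrete, here is a minimal sketch in Python (the data, feature names and messages are invented, and scikit-learn's LogisticRegression stands in for whatever model a real system would use): a hand-written rule versus a model that is fed examples and returns a probability rather than a hard yes/no.

    # Minimal sketch: explicit instructions vs. a learned, probabilistic model.
    # All data and feature choices below are made up for illustration.
    from sklearn.linear_model import LogisticRegression

    # Traditional programming: a detailed, exact, hand-written rule.
    def rule_based_spam(message: str) -> bool:
        return "free money" in message.lower()

    print(rule_based_spam("Claim your FREE MONEY now!"))  # True

    # Machine learning: feed the system examples and let it generalize.
    X = [[3, 1], [0, 0], [5, 2], [1, 0]]   # e.g. [exclamation marks, suspicious links]
    y = [1, 0, 1, 0]                        # 1 = spam, 0 = not spam
    model = LogisticRegression().fit(X, y)

    # The output is probabilistic: "this one is probably what you're looking for."
    print(model.predict_proba([[4, 1]])[0][1])   # a probability, not a yes/no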
Now, the upside is: this method is really powerful. 它的优势是:它真的非常强大。
The head of Google's AI systems called it, "the unreasonable effectiveness of data." Google 人工智能系统的负责人称它为: “数据的不合理有效性”。
The downside is, we don't really understand what the system learned. 缺点在于, 我们无法清楚的了解系统学到了什么,
In fact, that's its power. 事实上,这也正是它的强大之处。
This is less like giving instructions to a computer; it's more like training a puppy-machine-creature we don't really understand or control. 不像是给计算机下达指令, 更像是在训练一个机器狗, 我们无法精确的了解和控制它。
So this is our problem. 这就是我们遇到的问题。
It's a problem when this artificial intelligence system gets things wrong. 人工智能会出错,这是一个问题。
It's also a problem when it gets things right, because we don't even know which is which when it's a subjective problem. 但当它得出正确答案时,同样是个问题, 因为面对主观问题,我们甚至分不清何对何错。
We don't know what this thing is thinking. 我们不知道这些机器在想什么。
So, consider a hiring algorithm — a system used to hire people, using machine-learning systems. 所以,考虑一下招聘算法- 通过机器学习构建的招聘系统。
Such a system would have been trained on previous employees' data and instructed to find and hire people like the existing high performers in the company. 这样的系统会用以往员工的数据来训练, 并参照公司现有的优秀员工, 来寻找和招聘类似的新人。
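As a rough sketch of what such a system amounts to (hypothetical features, data and model choice; any real vendor's pipeline would be far more elaborate), one might train a classifier on past employees labeled by whether they became high performers, then score applicants by how much they resemble them.

    # Hypothetical sketch of a hiring model trained on previous employees' data.
    from sklearn.ensemble import RandomForestClassifier

    # Each row describes a past employee; the columns are made-up features:
    # [years_experience, num_job_changes, assessment_score]
    past_employees = [[5, 1, 88], [2, 4, 70], [7, 2, 92], [1, 3, 65]]
    was_high_performer = [1, 0, 1, 0]       # label taken from performance reviews

    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(past_employees, was_high_performer)

    # New applicants are ranked by predicted resemblance to past high performers.
    applicants = [[4, 2, 85], [3, 5, 78]]
    print(model.predict_proba(applicants)[:, 1])

Nothing in this sketch says what the model actually keys on to produce those scores, which is exactly the worry raised below.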
Sounds good. 听起来很好。
I once attended a conference that brought together human resources managers and executives, high-level people, using such systems in hiring. 有次我参加了一个会议, 会上聚集了很多人力资源部的经理和总监, 都是高管, 他们都在用这样的系统招聘。
They were super excited. 他们都非常兴奋,
They thought that this would make hiring more objective, less biased, and give women and minorities a better shot against biased human managers. 认为这可以让招聘变得更加客观,减少偏见, 让女性和少数族裔有更好的机会, 不再受有偏见的人类经理的影响。
And look — human hiring is biased. 你知道的,招聘是存在偏见的,
I know. 我也很清楚。
I mean, in one of my early jobs as a programmer, my immediate manager would sometimes come down to where I was really early in the morning or really late in the afternoon, and she'd say, "Zeynep, let's go to lunch!" 在我刚开始做程序员的时候, 我的直接主管会来找我, 在早晨很早或下午很晚的时候, 说,“ 图费, 我们去吃午饭!”
I'd be puzzled by the weird timing. 我就被这奇怪的时间给搞糊涂了,
It's 4pm. Lunch? 现在是下午4点,吃午饭?
I was broke, so free lunch. I always went. 我当时很穷,所以不会放过免费的午餐。
I later realized what was happening. 后来我才想明白原因,
My immediate managers had not confessed to their higher-ups that the programmer they hired for a serious job was a teen girl who wore jeans and sneakers to work. 我的主管们没有向他们的上级坦白, 他们雇了一个十多岁的小女孩来做重要的编程工作, 一个穿着牛仔裤,运动鞋工作的女孩。
I was doing a good job, I just looked wrong and was the wrong age and gender. 我的工作做得很好,我只是看起来不合适, 年龄和性别也不合适。
So hiring in a gender- and race-blind way certainly sounds good to me. 所以,忽略性别和种族的招聘, 听起来很适合我。
But with these systems, it is more complicated, and here's why: 但是这样的系统会带来更多问题,
Currently, computational systems can infer all sorts of things about you from your digital crumbs, even if you have not disclosed those things. 当前,计算机系统能根据零散的数据, 推断出关于你的一切, 甚至你没有公开的事。
They can infer your sexual orientation, your personality traits, your political leanings. 它们可以推断你的性取向, 你的性格特点, 你的政治倾向。
They have predictive power with high levels of accuracy. 它们有高准确度的预测能力,
Remember — for things you haven't even disclosed. 记住,是你没有公开的事情,
This is inference. 这就是推断。
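A hedged illustration of what that kind of inference can look like in code (the "digital crumbs" below are entirely invented; real systems use far richer behavioral data): a model is trained on people who did disclose a trait, then applied to people who never disclosed anything.

    # Illustrative only: inferring an undisclosed trait from "digital crumbs."
    from sklearn.linear_model import LogisticRegression

    # Rows: users who DID disclose the trait. Columns: invented behavioral
    # signals, e.g. counts of certain page likes or late-night posts.
    crumbs = [[12, 0, 3], [1, 7, 0], [10, 1, 2], [0, 9, 1]]
    disclosed_trait = [1, 0, 1, 0]

    model = LogisticRegression().fit(crumbs, disclosed_trait)

    # Applied to someone who never said anything, it still produces a guess.
    print(model.predict_proba([[11, 1, 2]])[0][1])   # inference, not disclosure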
I have a friend who developed such computational systems to predict the likelihood of clinical or postpartum depression from social media data. 我有个朋友就是开发这种系统, 从社交媒体的数据中, 推断患临床或产后抑郁症的可能性。
The results are impressive. 结果令人印象深刻,
Her system can predict the likelihood of depression months before the onset of any symptoms — months before. 她的系统可以在症状出现前几个月 成功预测到患抑郁的可能性, 提前几个月。
No symptoms, there's prediction. 在有症状之前,就可以预测到,
She hopes it will be used for early intervention. Great! 她希望这可以用于临床早期干预,这很棒!
But now put this in the context of hiring. 现在我们把这项技术放到招聘中来看。
So at this human resources managers conference, 在那次人力资源管理会议中,
I approached a high-level manager in a very large company, and I said to her, "Look, what if, unbeknownst to you, your system is weeding out people with high future likelihood of depression? 我接近了一位大公司的高管, 我对她说,“看,如果这个系统在不通知你的情况下, 就剔除了未来有可能抑郁的人,怎么办?
They're not depressed now, just maybe in the future, more likely. 他们现在不抑郁,只是未来有可能。
What if it's weeding out women more likely to be pregnant in the next year or two but aren't pregnant now? 如果它剔除了有可能怀孕的女性,怎么办? 她们现在没怀孕,但未来一两年有可能。
What if it's hiring aggressive people because that's your workplace culture?" 如果因为你的公司文化,它只雇佣激进的候选人怎么办?”
You can't tell this by looking at gender breakdowns. 只看性别比例,你发现不了这些问题,
Those may be balanced. 性别比例可能是均衡的。
And since this is machine learning, not traditional coding, there is no variable there labeled "higher risk of depression," 并且因为这是机器学习,不是传统的代码, 不会有一个变量来标识“高抑郁风险”、
'"higher risk of pregnancy ," “高怀孕风险”、
'"aggressive guy scale ." “人员的激进程度”。
Not only do you not know what your system is selecting on, you don't even know where to begin to look. 你不仅无法了解系统在选什么样的人, 你甚至不知道从哪里入手了解。
It's a black box. 它是个暗箱。
It has predictive power, but you don't understand it. 它有预测的能力,但你不了解它。
'"What safeguards ," I asked, "do you have to make sure that your black box isn't doing something shady ?" 我问,“你有什么措施可以保证, 你的暗箱没有在做些见不得人的事?”
She looked at me as if I had just stepped on 10 puppy tails. 她看着我,就好像我刚踩了10只小狗的尾巴。
(Laughter) (笑声)
She stared at me and she said, "I don't want to hear another word about this." 她瞪着我说: “我不想再听你多说一个字。”
And she turned around and walked away. 然后她转身走开了。
Mind you — she wasn't rude. 其实,她不是无礼,
It was clearly: what I don't know isn't my problem, go away, death stare. 她想表达的其实是:我不知道,这不是我的错,走开,不然我瞪死你。
(Laughter) (笑声)
Look, such a system may even be less biased than human managers in some ways. 看,这样的系统可能在某些方面 比人类高管怀有更少偏见,
And it could make monetary sense. 而且可以创造经济价值。
But it could also lead to a steady but stealthy shutting out of the job market of people with higher risk of depression. 但它也可能 用一种顽固且隐秘的方式, 把高抑郁风险的人清出职场。
Is this the kind of society we want to build, without even knowing we've done this, because we turned decision-making to machines we don't totally understand? 这是我们想要的未来吗? 把决策权给予我们并不完全了解的机器, 在我们不知情的状况下构建一种新的社会?
Another problem is this: these systems are often trained on data generated by our actions, human imprints. 另一个问题是, 这些系统通常使用我们真实的行为数据来训练, 也就是人类活动留下的印记。
Well, they could just be reflecting our biases, and these systems could be picking up on our biases and amplifying them and showing them back to us, while we're telling ourselves, "We're just doing objective, neutral computation." 它们可能只是在反映我们的偏见, 这些系统会学到我们的偏见, 并把它们放大, 然后反馈给我们, 而我们却告诉自己, “我们只是在做客观、中立的计算。”
Researchers found that on Google, women are less likely than men to be shown job ads for high-paying jobs. 研究者发现,在 Google 上, 高收入工作的广告更多的被展示给男性用户。
And searching for African-American names is more likely to bring up ads suggesting criminal history, even when there is none. 搜索非裔美国人的名字, 更可能出现暗示有犯罪记录的广告, 即使此人根本没有犯罪记录。
Such hidden biases and black-box algorithms that researchers uncover sometimes but sometimes we don't know, can have life-altering consequences. 这些潜在的偏见以及暗箱中的算法, 有些会被研究者揭露,有些根本不会被发现, 它的后果可能是改变一个人的人生。
In Wisconsin , a defendant was sentenced to six years in prison for evading the police. 在威斯康星,一个被告 因逃避警察被判刑六年。
You may not know this, but algorithms are increasingly used in parole and sentencing decisions. 你可能不知道, 但计算机算法正越来越多的被应用在假释及量刑裁定上。
He wanted to know: How is this score calculated? 他想要弄清楚,这个得分是怎么算出来的?
It's a commercial black box. 这是个商业暗箱,
The company refused to have its algorithm be challenged in open court. 这家公司拒绝在公开法庭上讨论他们的算法。
But ProPublica, an investigative nonprofit, audited that very algorithm with what public data they could find, and found that its outcomes were biased and its predictive power was dismal, barely better than chance, 但是一家叫 ProPublica 的非盈利机构, 根据公开数据,对这个算法进行了评估, 他们发现这个算法的结论是有偏见的, 它的预测能力很差,比碰运气强不了多少,
and it was wrongly labeling black defendants as future criminals at twice the rate of white defendants. 并且它错误的把黑人被告未来犯罪的可能性 标记为白人的两倍。
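The core of that kind of audit fits in a few lines. Here is a minimal sketch in Python with invented records (ProPublica worked from real court data): for each group, count how often people who did not reoffend were nevertheless labeled high risk.

    # Toy audit sketch: false positive rate ("wrongly labeled a future criminal")
    # per group. The records below are invented; the method is the point.
    records = [
        # (group, labeled_high_risk, actually_reoffended)
        ("black", True,  False), ("black", True,  False), ("black", False, False),
        ("black", True,  True),  ("white", True,  False), ("white", False, False),
        ("white", False, False), ("white", True,  True),
    ]

    def false_positive_rate(group):
        did_not_reoffend = [r for r in records if r[0] == group and not r[2]]
        wrongly_flagged = [r for r in did_not_reoffend if r[1]]
        return len(wrongly_flagged) / len(did_not_reoffend)

    for group in ("black", "white"):
        print(group, round(false_positive_rate(group), 2))
    # With these toy numbers: 0.67 vs 0.33, a roughly two-to-one disparity of
    # the shape ProPublica reported.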
So, consider this case: 看下这个案例:
This woman was late picking up her godsister from a school in Broward County, Florida, running down the street with a friend of hers. 这个女人去佛罗里达州布劳沃德县的一所学校 接她的干妹妹,但是迟到了, 于是和朋友一起在街上狂奔。
They spotted an unlocked kid's bike and a scooter on a porch and foolishly jumped on it. 她们看到门廊上有一辆没上锁的儿童自行车和一辆滑板车, 于是就愚蠢地骑了上去。
As they were speeding off, a woman came out and said, "Hey! That's my kid's bike!" 正在她们要骑走的时候,另一个女人出来,喊道: “嘿!那是我孩子的自行车!”
They dropped it, they walked away, but they were arrested. 她们扔掉车走开,但还是被抓住了。
She was wrong, she was foolish, but she was also just 18. 她做错了,她很愚蠢,但她也才刚满18岁,
She had a couple of juvenile misdemeanors. 她之前有过几次青少年轻罪的记录。
Meanwhile, that man had been arrested for shoplifting in Home Depot — 85 dollars' worth of stuff, a similar petty crime. 与此同时,有个男人因在家得宝商店偷窃被捕, 偷了价值85美金的东西,也是类似的轻微犯罪,
But he had two prior armed robbery convictions. 但他有两次持枪抢劫的案底。
But the algorithm scored her as high risk, and not him. 这个程序将这位女性判定为高风险,而这位男性则不是。
Two years later, ProPublica found that she had not reoffended. 两年后,ProPublica发现她没有再次犯罪,
It was just hard to get a job for her with her record. 但这个记录使她很难找到工作。
He, on the other hand, did reoffend and is now serving an eight-year prison term for a later crime. 而这位男性,却再次犯罪, 并因此被判八年监禁。
Clearly, we need to audit our black boxes and not have them have this kind of unchecked power. 显然,我们需要审查这些暗箱, 确保它们不再有这样不加限制的权限。
(Applause) (掌声)
Audits are great and important, but they don't solve all our problems. 审查是很重要的,但不能解决所有的问题。
Take Facebook's powerful news feed algorithm — you know, the one that ranks everything and decides what to show you from all the friends and pages you follow. 拿 Facebook 强大的新闻流算法来说, 就是根据你的所有好友和你关注的主页, 对内容进行排序,并决定给你展示什么的算法。
Should you be shown another baby picture? 它会决定要不要再推一张婴儿照片给你,
(Laughter) (笑声)
A sullen note from an acquaintance? 要不要推一条熟人的沮丧状态?
An important but difficult news item? 要不要推一条重要但艰涩的新闻?
There's no right answer. 这个问题没有正解。
Facebook optimizes for engagement on the site: likes, shares, comments. Facebook 会根据网站的参与度来优化: 喜欢、分享、评论。
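A minimal sketch of what "optimizing for engagement" can amount to (the scoring weights and predictions are invented; Facebook's actual ranking system is far more complex and not public): items predicted to draw few likes, shares or comments simply sink to the bottom.

    # Hypothetical engagement-ranking sketch; weights and predictions are made up.
    posts = [
        {"title": "baby photo",           "p_like": 0.30, "p_share": 0.05, "p_comment": 0.10},
        {"title": "ice bucket challenge", "p_like": 0.40, "p_share": 0.20, "p_comment": 0.15},
        {"title": "Ferguson protests",    "p_like": 0.05, "p_share": 0.04, "p_comment": 0.08},
    ]

    def engagement_score(post):
        # A weighted sum of predicted reactions is the only objective here.
        return 1.0 * post["p_like"] + 2.0 * post["p_share"] + 1.5 * post["p_comment"]

    for post in sorted(posts, key=engagement_score, reverse=True):
        print(round(engagement_score(post), 2), post["title"])
    # Hard-to-"like" news lands at the bottom and may never be shown at all.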
In August of 2014, protests broke out in Ferguson, Missouri, after the killing of an African-American teenager by a white police officer, under murky circumstances. 在2014年8月, 密苏里州弗格森市爆发了游行, 一个白人警察在不明状况下 杀害了一位非裔少年。
The news of the protests was all over my algorithmically unfiltered Twitter feed, but nowhere on my Facebook. 关于游行的新闻 在我的未经算法过滤的Twitter 上大量出现, 但 Facebook 上却没有。
Was it my Facebook friends? 是因为我的 Facebook 好友不关注这事吗?
I disabled Facebook's algorithm, which is hard because Facebook keeps wanting to make you come under the algorithm's control, and saw that my friends were talking about it. 我禁用了 Facebook 的算法, 这是件很麻烦的事,因为 Facebook 总想让你 一直处在它的算法控制之下, 禁用之后我发现,我的朋友们其实一直在谈论这件事,
It's just that the algorithm wasn't showing it to me. 只是算法没有把这些信息展示给我。
I researched this and found this was a widespread problem. 我研究了这个现象,发现这是个普遍的问题。
The story of Ferguson wasn't algorithm-friendly. 弗格森事件对算法是不适用的,
It's not " likable ." 它不是值得“赞”的新闻,
Who's going to click on "like?" 谁会在这样的文章下点“赞”呢?
It's not even easy to comment on. 甚至这新闻都不好被评论。
Without likes and comments, the algorithm was likely showing it to even fewer people, so we didn't get to see this. 因为没有“赞”和评论, 算法会减少这些新闻的曝光, 所以我们无法看到。
Instead, that week, 相反的,在同一周,
Facebook's algorithm highlighted this, which is the ALS Ice Bucket Challenge. Facebook 的算法热推了 ALS 冰桶挑战的信息。
Worthy cause; dump ice water, donate to charity, fine. 这很有意义,倒冰水,为慈善捐款,很好。
But it was super algorithm-friendly. 这个事件对算法是很适用的,
The machine made this decision for us. 机器帮我们做了这个决定。
A very important but difficult conversation might have been smothered, had Facebook been the only channel. 如果 Facebook 是唯一的信息渠道, 一场非常重要但艰涩的对话 可能就这样被压制了。
Now, finally, these systems can also be wrong in ways that don't resemble human systems. 最后,这些系统犯错的方式, 也可能和人类系统不一样。
Do you guys remember Watson, IBM's machine-intelligence system that wiped the floor with human contestants on Jeopardy? 你们记得 Watson 吧,那个在智力竞赛《危险边缘》中 横扫人类选手的 IBM 机器智能系统,
It was a great player. 它是个很厉害的选手。
But then, for Final Jeopardy, Watson was asked this question: "Its largest airport is named for a World War II hero, its second-largest for a World War II battle." 但是,在最后一轮比赛中,Watson 被问道: “它最大的机场是以二战英雄命名的, 它第二大机场是以二战战场命名的。”
(Hums Final Jeopardy music) (哼唱《危险边缘》插曲)
Chicago. 芝加哥。
The two humans got it right. 两位人类选手答对了,
Watson, on the other hand, answered "Toronto" — for a US city category! 但 Watson 答的是,“多伦多”, 这是个猜美国城市的环节!
The impressive system also made an error that a human would never make, a second-grader wouldn't make. 这个厉害的系统也会犯 人类都不会犯的,二年级小孩都不会犯的错误。
Our machine intelligence can fail in ways that don't fit error patterns of humans, in ways we won't expect and be prepared for. 我们的机器智能系统, 会在一些不符合人类出错模式的问题上出错, 这些问题都是我们无法预料和准备的。
It'd be lousy not to get a job one is qualified for, but it would triple suck if it was because of stack overflow in some subroutine. 丢掉一份完全有能力胜任的工作,已经很糟了, 但如果是因为某个子程序的栈溢出, 那就简直糟透了。
(Laughter) (笑声)
In May of 2010, a flash crash on Wall Street fueled by a feedback loop in Wall Street's "sell" algorithm wiped a trillion dollars of value in 36 minutes. 在2010年五月, 华尔街出现一次闪电崩盘, 原因是“卖出”算法中的反馈回路, 在36分钟内抹去了上万亿美元的市值。
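A feedback loop of that kind is easy to caricature in a few lines (a toy simulation with invented numbers, not a model of any real trading system): each wave of algorithmic selling pushes the price down, and the lower price triggers more selling.

    # Toy feedback-loop sketch: selling lowers the price, and the lower price
    # triggers more selling. All constants are invented for illustration.
    price = 100.0
    for minute in range(1, 37):
        drop_so_far = (100.0 - price) / 100.0
        sell_volume = 1000 * (1 + 10 * drop_so_far)   # more fear, more selling
        price -= 0.0005 * sell_volume                  # selling pushes price down
        print(f"minute {minute:2d}: price {price:6.2f}")
    # The loop feeds on itself: value evaporates in minutes, not hours.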
I don't even want to think what "error" means in the context of lethal autonomous weapons. 我甚至不敢想,致命的自动化武器 发生“错误”会是什么后果。
So yes, humans have always made biases. 是的,人类总是会有偏见,
Decision makers and gatekeepers, in courts, in news, in war ... 法庭上、新闻机构、战争中的, 决策者、看门人…
they make mistakes; but that's exactly my point. 他们都会犯错,但这恰恰是我要说的。
We cannot escape these difficult questions. 我们无法抛开这些困难的问题,
We cannot outsource our responsibilities to machines. 我们不能把我们自身该承担的责任推给机器。
(Applause) (掌声)
Artificial intelligence does not give us a "Get out of ethics free" card. 人工智能不会给我们一张“伦理免责卡”。
Data scientist Fred Benenson calls this math-washing. 数据科学家 Fred Benenson称之为“数学粉饰”。
We need the opposite. 我们需要的是相反的东西。
We need to cultivate algorithm suspicion, scrutiny and investigation. 我们需要培养对算法的怀疑、审查和调查的能力。
We need to make sure we have algorithmic accountability, auditing and meaningful transparency. 我们需要确保有人为算法负责, 对算法进行审查,并做到切实有意义的公开透明。
We need to accept that bringing math and computation to messy, value-laden human affairs does not bring objectivity; rather, the complexity of human affairs invades the algorithms. 我们必须认识到,把数学和计算引入 复杂的、充满价值判断的人类事务中, 并不能带来客观性, 相反,人类事务的复杂性会侵入算法。
Yes, we can and we should use computation to help us make better decisions. 是的,我们可以并且需要使用计算机 来帮助我们做更好的决策,
But we have to own up to our moral responsibility to judgment, and use algorithms within that framework, not as a means to abdicate and outsource our responsibilities to one another as human to human. 但我们必须承担起自己在判断上的道德责任, 并在这个框架下使用算法, 而不是以此为手段,放弃并外包 我们人与人之间相互承担的责任。
Machine intelligence is here. 人工智能到来了,
That means we must hold on ever tighter to human values and human ethics. 这意味着我们要格外坚守 人类的价值观和伦理。
Thank you. 谢谢。
(Applause) (掌声)