
Cathy O'Neil (2017): The era of blind faith in big data must end

Algorithms are everywhere.
They sort and separate the winners from the losers.
The winners get the job or a good credit card offer.
The losers don't even get an interview or they pay more for insurance.
We're being scored with secret formulas that we don't understand that often don't have systems of appeal.
That begs the question:
What if the algorithms are wrong?
To build an algorithm you need two things: you need data, what happened in the past, and a definition of success, the thing you're looking for and often hoping for.
You train an algorithm by looking, figuring out.
The algorithm figures out what is associated with success.
What situation leads to success?
Actually, everyone uses algorithms.
They just don't formalize them in written code.
Let me give you an example.
I use an algorithm every day to make a meal for my family.
The data I use is the ingredients in my kitchen, the time I have, the ambition I have, and I curate that data.
I don't count those little packages of ramen noodles as food.
(Laughter)
My definition of success is: a meal is successful if my kids eat vegetables.
It's very different from if my youngest son were in charge.
He'd say success is if he gets to eat lots of Nutella.
But I get to choose success.
I am in charge. My opinion matters.
That's the first rule of algorithms.
Algorithms are opinions embedded in code.
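To make "opinions embedded in code" concrete, here is a minimal, hypothetical sketch of the dinner algorithm described above. The pantry contents, the curation rule about ramen, and the two competing success definitions are all invented for illustration; the point is that whoever writes the success function decides what the algorithm calls a success.

```python
# Hypothetical sketch: an "algorithm" is just data plus a definition of success.
pantry = ["broccoli", "pasta", "ramen noodles", "Nutella", "chicken"]

# Curation step: my opinion that instant ramen doesn't count as food.
data = [item for item in pantry if item != "ramen noodles"]

def my_success(meal):
    """A meal is successful if the kids eat vegetables."""
    return "broccoli" in meal

def sons_success(meal):
    """My youngest son's definition: lots of Nutella."""
    return "Nutella" in meal

meal = ["chicken", "broccoli", "pasta"]  # tonight's plan, built from the curated data
print(my_success(meal))    # True  -- a success under my definition
print(sons_success(meal))  # False -- a failure under his
```

Swap in `sons_success` as the objective and the "best" meal changes, even though the data never did.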
It's really different from what you think most people think of algorithms.
They think algorithms are objective and true and scientific.
That's a marketing trick.
It's also a marketing trick to intimidate you with algorithms, to make you trust and fear algorithms because you trust and fear mathematics.
A lot can go wrong when we put blind faith in big data.
This is Kiri Soares. She's a high school principal in Brooklyn.
In 2011, she told me her teachers were being scored with a complex, secret algorithm called the "value-added model."
I told her, "Well, figure out what the formula is, show it to me. I'm going to explain it to you."
She said, "Well, I tried to get the formula, but my Department of Education contact told me it was math and I wouldn't understand it."
It gets worse.
The New York Post filed a Freedom of Information Act request, got all the teachers' names and all their scores and they published them as an act of teacher-shaming.
When I tried to get the formulas, the source code, through the same means, I was told I couldn't.
I was denied.
I later found out that nobody in New York City had access to that formula.
No one understood it.
Then someone really smart got involved, Gary Rubenstein.
He found 665 teachers from that New York Post data that actually had two scores.
That could happen if they were teaching seventh grade math and eighth grade math.
He decided to plot them.
Each dot represents a teacher.
(Laughter)
What is that?
(Laughter)
That should never have been used for individual assessment.
It's almost a random number generator.
(Applause)
But it was.
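For anyone who wants to see what that plot amounts to, here is a hypothetical sketch of the same consistency check: pair each teacher's two value-added scores and measure how well they agree. The toy scores below are random stand-ins; Rubenstein worked from the actual numbers the New York Post published.

```python
# Hypothetical sketch of the consistency check: if the value-added model
# measured something real about a teacher, the same teacher's score for
# 7th-grade math and 8th-grade math should roughly agree.
import random
import statistics

random.seed(0)

# Invented stand-in data: one (score_7th, score_8th) pair per teacher.
# The real plot used 665 pairs from the New York Post release.
teachers = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(665)]

xs = [a for a, _ in teachers]
ys = [b for _, b in teachers]
r = statistics.correlation(xs, ys)  # requires Python 3.10+

print(f"correlation between a teacher's two scores: {r:.2f}")
# A correlation near 0 -- which is roughly what the real plot showed --
# means the score behaves like a random number generator for individuals.
```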
This is Sarah Wysocki.
She got fired, along with 205 other teachers, from the Washington, DC school district, even though she had great recommendations from her principal and the parents of her kids.
I know what a lot of you guys are thinking, especially the data scientists, the AI experts here.
You're thinking, "Well, I would never make an algorithm that inconsistent."
But algorithms can go wrong, even have deeply destructive effects with good intentions.
And whereas an airplane that's designed badly crashes to the earth and everyone sees it, an algorithm designed badly can go on for a long time, silently wreaking havoc.
This is Roger Ailes.
(Laughter)
He founded Fox News in 1996.
More than 20 women complained about sexual harassment.
They said they weren't allowed to succeed at Fox News.
He was ousted last year, but we've seen recently that the problems have persisted.
That begs the question:
What should Fox News do to turn over another leaf?
Well, what if they replaced their hiring process with a machine-learning algorithm?
That sounds good, right?
Think about it.
The data, what would the data be?
A reasonable choice would be the last 21 years of applications to Fox News.
Reasonable.
What about the definition of success?
A reasonable choice would be, well, who is successful at Fox News?
I guess someone who, say, stayed there for four years and was promoted at least once.
Sounds reasonable.
And then the algorithm would be trained.
It would be trained to look for people, to learn what led to success, what kind of applications historically led to success by that definition.
Now think about what would happen if we applied that to a current pool of applicants.
It would filter out women, because they do not look like people who were successful in the past.
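Here is a hypothetical sketch of that thought experiment: "train" on invented historical applications in which success (stayed four years, promoted at least once) was mostly available to men, then score a new, equally qualified pool. The features, the synthetic data, and the scoring rule are all assumptions for illustration, not a description of any real hiring system.

```python
# Hypothetical sketch: a "hiring algorithm" trained on biased history.
import random

random.seed(1)

# Invented historical applications: (skill, gender, was_successful).
# In this made-up history, "success" was mostly available to men --
# the bias lives in the labels, not in anyone's skill.
history = []
for _ in range(1000):
    gender = random.choice(["F", "M"])
    skill = random.uniform(0, 1)
    promoted = skill > 0.5 and (gender == "M" or random.random() < 0.2)
    history.append((skill, gender, promoted))

# "Training": estimate the success rate the model associates with each gender.
def success_rate(group):
    outcomes = [s for _, g, s in history if g == group]
    return sum(outcomes) / len(outcomes)

rates = {g: success_rate(g) for g in ("F", "M")}
print(rates)  # roughly {'F': 0.10, 'M': 0.50} on this synthetic history

# "Scoring" a new pool of equally qualified applicants: the learned pattern
# simply repeats the past, so the woman is ranked lower despite equal skill.
applicants = [("F", 0.9), ("M", 0.9)]
for gender, skill in applicants:
    print(gender, round(rates[gender] * skill, 2))
```

Nothing in the code mentions an intent to discriminate; the bias arrives entirely through the historical labels.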
Algorithms don't make things fair if you just blithely, blindly apply algorithms.
They don't make things fair.
They repeat our past practices, our patterns.
They automate the status quo.
That would be great if we had a perfect world, but we don't.
And I'll add that most companies don't have embarrassing lawsuits, but the data scientists in those companies are told to follow the data, to focus on accuracy.
Think about what that means.
Because we all have bias, it means they could be codifying sexism or any other kind of bigotry.
Thought experiment, because I like them: an entirely segregated society -- racially segregated, all towns, all neighborhoods -- where we send the police only to the minority neighborhoods to look for crime.
The arrest data would be very biased.
What if, on top of that, we found the data scientists and paid the data scientists to predict where the next crime would occur?
Minority neighborhood.
Or to predict who the next criminal would be?
A minority.
The data scientists would brag about how great and how accurate their model would be, and they'd be right.
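A hypothetical simulation of that thought experiment is sketched below: two neighborhoods with identical true crime rates, patrols sent mostly to one of them, and a "model" that predicts next week's hotspot from arrest counts. Every number is invented; the point is the feedback loop, in which the prediction directs more patrols, which generate more arrests, which confirm the prediction.

```python
# Hypothetical feedback-loop simulation: equal true crime rates,
# unequal policing, and a "model" that predicts from arrest counts.
import random

random.seed(2)

TRUE_CRIME_RATE = 0.05                 # identical in both neighborhoods
patrol_share = {"A": 0.9, "B": 0.1}    # where the police are sent at first
arrests = {"A": 0, "B": 0}

for week in range(52):
    for hood in ("A", "B"):
        crimes = sum(random.random() < TRUE_CRIME_RATE for _ in range(1000))
        # You can only make arrests where you patrol.
        arrests[hood] += int(crimes * patrol_share[hood])
    # "Model": predict next week's hotspot from the arrest data so far,
    # then send even more patrols there -- the loop closes.
    hotspot = max(arrests, key=arrests.get)
    patrol_share = {h: (0.95 if h == hotspot else 0.05) for h in ("A", "B")}

print(arrests)  # lopsided counts despite identical true crime rates
print("predicted hotspot:", max(arrests, key=arrests.get))
```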
Now, reality isn't that drastic, but we do have severe segregation in many cities and towns, and we have plenty of evidence of biased policing and justice system data.
And we actually do predict hotspots, places where crimes will occur.
And we do predict, in fact, the individual criminality, the criminality of individuals.
The news organization ProPublica recently looked into one of those "recidivism risk" algorithms, as they're called, being used in Florida during sentencing by judges.
Bernard, on the left, the black man, was scored a 10 out of 10.
Dylan, on the right, 3 out of 10.
10 out of 10, high risk. 3 out of 10, low risk.
They were both brought in for drug possession.
They both had records, but Dylan had a felony and Bernard didn't.
This matters, because the higher your score, the more likely you are to be given a longer sentence.
What's going on?
Data laundering.
It's a process by which technologists hide ugly truths inside black box algorithms and call them objective; call them meritocratic.
When they're secret, important and destructive, I've coined a term for these algorithms: "weapons of math destruction."
(Laughter)
(Applause)
They're everywhere, and it's not a mistake.
These are private companies building private algorithms for private ends.
Even the ones I talked about for teachers and the public police, those were built by private companies and sold to government institutions.
They call it their "secret sauce" -- that's why they can't tell us about it.
It's also private power.
They are profiting from wielding the authority of the inscrutable.
Now you might think, since all this stuff is private and there's competition, maybe the free market will solve this problem.
It won't.
There's a lot of money to be made in unfairness.
Also, we're not economically rational agents.
We all are biased.
We're all racist and bigoted in ways that we wish we weren't, in ways that we don't even know.
We know this, though, in aggregate, because sociologists have consistently demonstrated this with experiments they build, where they send out a bunch of applications to jobs, equally qualified but some have white-sounding names and some have black-sounding names, and it's always disappointing, the results -- always.
So we are the ones that are biased, and we are injecting those biases into the algorithms by choosing what data to collect, like I chose not to think about ramen noodles -- I decided it was irrelevant.
But by trusting the data that's actually picking up on past practices and by choosing the definition of success, how can we expect the algorithms to emerge unscathed?
We can't. We have to check them.
We have to check them for fairness.
The good news is, we can check them for fairness.
Algorithms can be interrogated, and they will tell us the truth every time.
And we can fix them. We can make them better.
I call this an algorithmic audit, and I'll walk you through it.
First, data integrity check.
For the recidivism risk algorithm I talked about, a data integrity check would mean we'd have to come to terms with the fact that in the US, whites and blacks smoke pot at the same rate but blacks are far more likely to be arrested -- four or five times more likely, depending on the area.
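As a concrete, hypothetical version of that first audit step, the sketch below compares arrest counts against an assumed equal usage rate across groups; the counts are placeholder numbers chosen to reproduce the roughly four-to-five-fold disparity mentioned above.

```python
# Hypothetical data integrity check: does the arrest data reflect behavior,
# or policing? Usage rates are assumed equal per the talk; the arrest counts
# below are invented placeholders.
population = {"white": 100_000, "black": 100_000}
usage_rate = {"white": 0.10, "black": 0.10}   # same rate of pot smoking
arrests = {"white": 120, "black": 540}        # made-up arrest counts

for group in population:
    users = population[group] * usage_rate[group]
    arrest_rate_per_user = arrests[group] / users
    print(group, f"arrests per 1,000 users: {1000 * arrest_rate_per_user:.1f}")

ratio = arrests["black"] / arrests["white"]
print(f"disparity: black users are arrested {ratio:.1f}x as often")
# A disparity like this means "number of arrests" is not a clean proxy for
# "amount of crime" -- the training data itself needs correcting.
```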
What does that bias look like in other crime categories, and how do we account for it?
Second, we should think about the definition of success, audit that.
Remember -- with the hiring algorithm? We talked about it.
Someone who stays for four years and is promoted once?
Well, that is a successful employee, but it's also an employee that is supported by their culture.
That said, it can also be quite biased.
We need to separate those two things.
We should look to the blind orchestra audition as an example.
That's where the people auditioning are behind a sheet.
What I want to think about there is that the people who are listening have decided what's important and they've decided what's not important, and they're not getting distracted by that.
When the blind orchestra auditions started, the number of women in orchestras went up by a factor of five.
Next, we have to consider accuracy.
This is where the value-added model for teachers would fail immediately.
No algorithm is perfect, of course, so we have to consider the errors of every algorithm.
How often are there errors, and for whom does this model fail?
What is the cost of that failure?
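One hypothetical way to ask "for whom does this model fail?" is to break error rates out by group, as in the sketch below. The predictions and outcomes are invented; false positives and false negatives are counted separately because their costs fall on different people, which is the asymmetry ProPublica highlighted in its recidivism-score investigation.

```python
# Hypothetical accuracy audit: error rates broken out by group.
# records: (group, model_said_high_risk, actually_reoffended) -- invented data.
records = [
    ("black", True,  False), ("black", True,  True),  ("black", False, False),
    ("black", True,  False), ("white", False, False), ("white", False, True),
    ("white", True,  True),  ("white", False, True),  ("black", True,  True),
    ("white", False, False),
]

def error_rates(group):
    rows = [(p, y) for g, p, y in records if g == group]
    fp = sum(p and not y for p, y in rows)   # flagged, but did not reoffend
    fn = sum(y and not p for p, y in rows)   # not flagged, but did reoffend
    negatives = sum(not y for _, y in rows) or 1
    positives = sum(y for _, y in rows) or 1
    return fp / negatives, fn / positives

for group in ("black", "white"):
    fpr, fnr = error_rates(group)
    print(f"{group}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
# Similar overall accuracy can hide very unequal false positive rates,
# and a false positive here means a longer sentence for someone.
```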
And finally, we have to consider the long-term effects of algorithms, the feedback loops that they engender.
That sounds abstract, but imagine if Facebook engineers had considered that before they decided to show us only things that our friends had posted.
I have two more messages, one for the data scientists out there.
Data scientists: we should not be the arbiters of truth.
We should be translators of ethical discussions that happen in larger society.
(Applause)
And the rest of you, the non-data scientists: this is not a math test.
This is a political fight.
We need to demand accountability from our algorithmic overlords.
(Applause)
The era of blind faith in big data must end.
Thank you very much.
(Applause)