|
|
JoyBuolamwini_2016X - How I fight against algorithmic bias
|
Hello, I'm Joy, a poet of code, on a mission to stop an unseen force that's rising, a force that I called "the coded gaze," |
大家好 我是乔伊 一位写代码的诗人 我正努力阻止一股 逐渐凸显的无形力量 一种我称为 代码的凝视 的力量 |
|
my term for algorithmic bias. |
这是我用来定义算法偏见的术语 |
|
Algorithmic bias, like human bias, results in unfairness. |
正如人类之间的偏见 算法偏见也会导致不公平 |
|
However, algorithms, like viruses, can spread bias on a massive scale at a rapid pace. |
然而算法就像病毒一样 会以飞快的速度大范围地 扩散偏见 |
|
Algorithmic bias can also lead to exclusionary experiences and discriminatory practices. |
算法也将会导致排他的经历和 歧视性的做法 |
|
Let me show you what I mean. |
给大家举个例子 |
(Video) Joy Buolamwini: Hi, camera. I've got a face. |
(录像)乔伊·博拉维尼: 嘿 摄像头 我来了 |
Can you see my face? |
你可以看到我的脸吗 |
No-glasses face? |
没有戴眼镜的脸呢 |
You can see her face. |
你可以看到她的脸 |
What about my face? |
那么我的脸呢 |
I've got a mask. Can you see my mask? |
我戴上了一个面罩 你可以看到我的面罩吗 |
Joy Buolamwini: So how did this happen? |
乔伊·博拉维尼: 这是怎么回事呢 |
Why am I sitting in front of a computer in a white mask, trying to be detected by a cheap webcam? |
为什么我坐在一台电脑前 戴着一个白色的面罩 尝试着被一个廉价的 网络摄像头检测到 |
|
Well, when I'm not fighting the coded gaze as a poet of code, |
当我的身份不是写代码的诗人 与 代码的凝视 较劲的时候 |
I'm a graduate student at the MIT Media Lab, and there I have the opportunity to work on all sorts of whimsical projects, including the Aspire Mirror, a project I did so I could project digital masks onto my reflection. |
我是MIT媒体实验室的 一位硕士生 在那里我有机会参与 各种不同的项目 包括激励镜子 一个可以将数字面罩 投射在我的映像上的项目 |
|
So in the morning, if I wanted to feel powerful, |
在早上的时候 如果我想充满力量 |
I could put on a lion. |
我可以放上一个狮子的图像 |
If I wanted to be uplifted, I might have a quote. |
如果我想要感到积极向上 我也许就会放上一句格言 |
|
So I used generic facial recognition software to build the system, but found it was really hard to test it unless I wore a white mask. |
我使用通用的人脸识别软件 来搭建系统 但是我发现除非我戴上白色的面罩 否则测试很难成功 |
|
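To make this concrete, here is a minimal sketch, not the Aspire Mirror's actual code, of how a generic off-the-shelf face detector is typically invoked; OpenCV's bundled Haar-cascade model and a webcam at device index 0 are assumptions about the setup.

```python
# Minimal sketch of calling a generic, off-the-shelf face detector
# (OpenCV's bundled Haar cascade); not the Aspire Mirror's actual code.
import cv2

# Stock frontal-face model that ships with opencv-python.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)            # a cheap webcam at device index 0 (assumption)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    # If the detector's training data under-represents faces like yours,
    # this list can come back empty even though a face fills the frame.
    print(f"faces detected: {len(faces)}")
```

A detector that generalizes poorly does not raise an error here; it simply returns nothing.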
Unfortunately, I've run into this issue before. |
遗憾的是 我以前 也曾遇到过这种问题 |
|
When I was an undergraduate at Georgia Tech studying computer science, |
当我在佐治亚理工学院 读计算机科学专业本科的时候 |
|
I used to work on social robots, and one of my tasks was to get a robot to play peek-a-boo, a simple turn-taking game where partners cover their face and then uncover it saying, "Peek-a-boo!" |
我曾经在一个 社交机器人上进行实验 我的任务之一是 让机器人玩躲猫猫 一个简单的轮换游戏 在游戏中玩伴盖住他们的脸 然后掀开说“躲猫猫!“ |
|
The problem is, peek-a-boo doesn't really work if I can't see you, and my robot couldn't see me. |
问题是躲猫猫在我不能 看见你的时候不起作用 而我的机器人看不见我 |
But I borrowed my roommate's face to get the project done, submitted the assignment, and figured, you know what, somebody else will solve this problem. |
我只好借了我室友的脸 去完成这个项目 递交了作业 寻思着总会有人 来解决这个问题的吧 |
|
Not too long after, |
不久之后 |
I was in Hong Kong for an entrepreneurship competition. |
我在香港参加一次创业比赛 |
|
The organizers decided to take participants on a tour of local start-ups. |
组织者决定将各位参与者 带到当地的初创企业参观 |
|
One of the start-ups had a social robot, and they decided to do a demo. |
其中一个创业公司 有一个社交机器人 他们决定进行一个项目演示 |
|
The demo worked on everybody until it got to me, and you can probably guess it. |
这个项目演示对除我之外的 每个人都有效果 你恐怕可以猜到 |
It couldn't detect my face. |
它不能检测到我的脸 |
I asked the developers what was going on, and it turned out we had used the same generic facial recognition software. |
我问开发师到底发生了什么 结果是我们使用了同一款 通用面部识别软件 |
|
Halfway around the world, |
在地球的另一边 |
I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet. |
我意识到算法偏见 传播得如此之快 只需要从互联网上 下载一些文件 |
So what's going on? Why isn't my face being detected? |
那么到底发生了什么 为什么我的脸没有被检测到 |
Well, we have to look at how we give machines sight. |
我们需要了解我们 如何教会机器识别 |
Computer vision uses machine learning techniques to do facial recognition. |
计算机视觉使用机器学习技术 来进行面部识别 |
|
So how this works is, you create a training set with examples of faces. |
所以你要用一系列脸的样本 创建一个训练体系 |
This is a face. This is a face. This is not a face. |
这是一张脸 这是一张脸 而这不是一张脸 |
And over time, you can teach a computer how to recognize other faces. |
慢慢地你可以教电脑 如何识别其它的脸 |
|
However, if the training sets aren't really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me. |
然而如果这个训练集 不是那么的多样化 那些与已建立的标准 偏差较多的脸 将会难以被检测到 而这正是我遭遇的问题 |
|
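A small, self-contained simulation of that failure mode, with synthetic arrays standing in for face images and invented group names: train a face / not-a-face classifier on data dominated by one group, then score each group separately.

```python
# Illustrative simulation only: synthetic "images" stand in for real faces.
# Train on a set dominated by group A, then measure accuracy per group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_examples(face_tone, n):
    """Faces have a group-specific average pixel value; non-faces are mid-gray."""
    faces = rng.normal(face_tone, 0.1, size=(n, 64))
    non_faces = rng.normal(0.5, 0.1, size=(n, 64))
    X = np.vstack([faces, non_faces])
    y = np.array([1] * n + [0] * n)          # 1 = face, 0 = not a face
    return X, y

# Training set: group A is heavily over-represented, group B barely appears.
Xa, ya = make_examples(face_tone=0.9, n=500)   # group A
Xb, yb = make_examples(face_tone=0.1, n=10)    # group B
model = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]),
                                              np.concatenate([ya, yb]))

# Fresh test examples, evaluated per group: the gap is the bias.
for name, tone in [("group A", 0.9), ("group B", 0.1)]:
    X_test, y_test = make_examples(tone, n=200)
    print(name, "accuracy:", round(model.score(X_test, y_test), 2))
```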
But don't worry -- there's some good news. |
不过别担心 我们还有好消息 |
Training sets don't just materialize out of nowhere. |
训练集并不是凭空产生的 |
|
We actually can create them. |
实际上我们可以创造它们 |
So there's an opportunity to create full-spectrum training sets that reflect a richer portrait of humanity. |
现在就有机会去创造 全波段光谱的训练集 可以反映更加饱满的人类面貌 |
|
Now you've seen in my examples how social robots were how I found out about exclusion with algorithmic bias. |
现在你看到了在我的例子中 社交机器人 使我发现了算法偏见的排他性 |
But algorithmic bias can also lead to discriminatory practices. |
不过算法偏见还会导致 各种歧视性的做法 |
Across the US, police departments are starting to use facial recognition software in their crime-fighting arsenal. |
美国境内的警察局 在打击犯罪的过程中 开始使用面部识别软件 |
|
Georgetown Law published a report showing that one in two adults in the US -- that's 117 million people -- have their faces in facial recognition networks. |
乔治敦大学法学院 发表了一个报告 表明在全美两个成年人中就有一个 也就是近1.2亿的人口 他们的面部信息 被储存在了面部识别网络中 |
|
Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. |
警察局如今可以访问 这些未被规范的 使用着未审核准确性的 算法的面部识别网络 |
|
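As an illustration of what auditing for accuracy could involve, here is a sketch that compares false match and false non-match rates across groups; the handful of match records below is invented, standing in for real system logs with verified ground truth.

```python
# Sketch of a per-group accuracy audit for a face matching system.
# Records are invented; a real audit would use logged comparisons.
import pandas as pd

records = pd.DataFrame({
    "group":       list("AAAAAABBBBBB"),
    "same_person": [1, 1, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0],   # ground truth
    "matched":     [1, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 0],   # system's decision
})

# A single overall accuracy number can hide large gaps between groups.
for group, df in records.groupby("group"):
    genuine  = df[df.same_person == 1]
    impostor = df[df.same_person == 0]
    print(group,
          "false non-match rate:", round((genuine.matched == 0).mean(), 2),
          "false match rate:",     round((impostor.matched == 1).mean(), 2))
```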
Yet we know facial recognition is not fail-proof, and labeling faces consistently remains a challenge. |
然而我们知道面部识别 并非万无一失 而持续地给面部标签 还是很有挑战性的 |
|
You might have seen this on Facebook. |
你也许在Facebook上见过这个 |
My friends and I laugh all the time when we see other people mislabeled in our photos. |
当我和我的朋友看到其他人 在我们的照片上被错误标注时 都会捧腹大笑 |
|
But misidentifying a suspected criminal is no laughing matter, nor is breaching civil liberties. |
但是误认一个犯罪嫌疑人 可不是闹着玩儿的 对公民自由的侵犯也不容忽视 |
|
Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. |
机器学习正被用于面部识别 但也延伸到了计算机视觉领域之外 |
|
In her book, "Weapons of Math Destruction," |
在数据科学家凯西·欧奈尔在她 《数学杀伤性武器》一书中 |
|
data scientist Cathy O'Neil talks about the rising new WMDs -- widespread, mysterious and destructive algorithms that are increasingly being used to make decisions that impact more aspects of our lives. |
叙述了逐渐严重的 新型大规模杀伤性武器 即 广泛应用而又神秘的 具有破坏性的算法 正在被越来越多地 运用于决策制定上 而这些决策影响着 我们生活的方方面面 |
|
So who gets hired or fired? |
谁被录用 又有谁被解雇 |
Do you get that loan? Do you get insurance? |
你得到了贷款吗 你买到了保险吗 |
|
Are you admitted into the college you wanted to get into? |
你被心目中的理想大学录取了吗 |
Do you and I pay the same price for the same product purchased on the same platform? |
在同一平台上的同一件产品 你和我是否支付同样的价格 |
|
Law enforcement is also starting to use machine learning for predictive policing. |
为了实现警情预测 执法机构也开始 使用起机器学习 |
|
Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. |
一些法官使用机器生成的 危险评分来决定 囚犯要在监狱里呆多久 |
|
So we really have to think about these decisions. |
我们真的应该 仔细思考这些决定 |
Are they fair? |
它们公平吗 |
And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes. |
我们已经清楚了 算法偏见 不一定总能带来公平的结果 |
|
So what can we do about it? |
那我们应该怎么做呢 |
Well, we can start thinking about how we create more inclusive code and employ inclusive coding practices. |
我们可以开始思考如何 创造更具有包容性的代码 并且运用有包容性的编程实践 |
|
It really starts with people. |
这真的要从人开始 |
So who codes matters. |
由谁来编程很重要 |
Are we creating full-spectrum teams with diverse individuals who can check each other's blind spots? |
我们组建的全光谱团队中 是否包括各种各样的个体 他们可以弥补彼此的盲区吗 |
|
On the technical side, how we code matters. |
在技术层面上 我们如何编程很重要 |
|
Are we factoring in fairness as we're developing systems? |
我们在研发系统的同时 有没有也考虑到公平的因素 |
|
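One way to factor fairness in during development, rather than after launch, is to compute a fairness metric next to accuracy and treat it as a gate. The sketch below checks the selection-rate gap (demographic parity) on hypothetical model decisions; the group labels and the 0.10 threshold are illustrative choices, not standards.

```python
# Sketch: checking a fairness metric during development, alongside accuracy.
# Decisions, group labels and the threshold are all illustrative.
import numpy as np

decisions = np.array([1, 1, 0, 1, 0, 1, 0, 0, 0, 1])   # 1 = approved / shortlisted
groups    = np.array(["A"] * 5 + ["B"] * 5)

rates = {g: decisions[groups == g].mean() for g in np.unique(groups)}
gap = max(rates.values()) - min(rates.values())
print("selection rate by group:", rates)
print("demographic parity gap:", round(gap, 2))

THRESHOLD = 0.10                                        # a project policy, not a standard
if gap > THRESHOLD:
    print("Fairness check failed -- investigate before shipping.")
```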
And finally, why we code matters. |
最后一点 我们为什么编程也很重要 |
|
We've used tools of computational creation to unlock immense wealth. |
我们用计算机创建的工具 创造了巨大的财富 |
|
We now have the opportunity to unlock even greater equality if we make social change a priority and not an afterthought. |
现在我们有机会去 创造进一步的平等 我们应该优先考虑社会变革 而不是想着事后优化 |
|
And so these are the three tenets that will make up the "incoding" movement. |
所以这三个宗旨 将构成“译码”运动 |
|
Who codes matters, how we code matters and why we code matters. |
由谁来编程很重要 我们如何编程很重要 以及我们为什么编程很重要 |
So to go towards incoding, we can start thinking about building platforms that can identify bias by collecting people's experiences like the ones I shared, but also auditing existing software. |
所以就译码来说 我们可以开始考虑 建立一个我们可以辨识偏见的平台 通过收集人们与我类似的经历 不过也要审查现有的软件 |
|
We can also start to create more inclusive training sets. |
我们也可以创造一些 更有包容性的训练集 |
Imagine a "Selfies for Inclusion " campaign where you and I can help developers test and create more inclusive training sets. |
想象一个为了包容性的自拍运动 在那里 你和我可以帮助 程序员测试以及创造 更具包容性的训练集 |
|
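Operationally, a more inclusive training set starts with measuring who is in the data at all. The sketch below uses invented group labels and counts; simple oversampling is only a stopgap next to collecting new, consented images like the ones such a campaign would gather.

```python
# Sketch: check the composition of a labeled training set and oversample
# under-represented groups. Labels and counts are invented; collecting new,
# consented examples beats duplicating old ones.
from collections import Counter

labels = ["group_A"] * 900 + ["group_B"] * 80 + ["group_C"] * 20
counts = Counter(labels)
print("before:", counts)

target = max(counts.values())
balanced = list(labels)
for group, count in counts.items():
    balanced.extend([group] * (target - count))   # duplicate scarce groups' examples

print("after: ", Counter(balanced))
```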
And we can also start thinking more conscientiously about the social impact of the technology that we're developing. |
我们还可以开始更认真地思考 关于正在发展的科技 造成的社会影响 |
|
To get the incoding movement started, |
为了开启译码运动 |
I've launched the Algorithmic Justice League, where anyone who cares about fairness can help fight the coded gaze. |
我发起了算法正义联盟 在那里任何关心公平的人 可以出力来对抗 代码的凝视 |
|
You can request audits, become a tester and join the ongoing conversation, |
请求审核 成为测试者 以及加入正在进行的谈话 |
|
#codedgaze. |
标签就是 代码的凝视 |
So I invite you to join me in creating a world where technology works for all of us, not just some of us, a world where we value inclusion and center social change. |
我在此邀请各位加入我 去创造一个让科技为我们 所有人服务的世界 而不是只服务于部分人 一个我们珍惜包容和 聚焦社会变革的世界 |
Thank you. |
谢谢 |
(Applause) |
(掌声) |
But I have one question: |
不过我还有一个问题 |
Will you join me in the fight? |
你会与我并肩战斗吗 |
(Laughter) |
(笑声) |
(Applause) |
(掌声) |