Zeynep Tufekci (2017): We're building a dystopia just to make people click on ads

So when people voice fears of artificial intelligence, very often, they invoke images of humanoid robots run amok.
You know? Terminator?
You know, that might be something to consider, but that's a distant threat.
Or, we fret about digital surveillance with metaphors from the past.
"1984," George Orwell's "1984," it's hitting the bestseller lists again.
It's a great book, but it's not the correct dystopia for the 21st century.
What we need to fear most is not what artificial intelligence will do to us on its own, but how the people in power will use artificial intelligence to control us and to manipulate us in novel, sometimes hidden, subtle and unexpected ways.
Much of the technology that threatens our freedom and our dignity in the near-term future is being developed by companies in the business of capturing and selling our data and our attention to advertisers and others:
Facebook, Google, Amazon,
Alibaba, Tencent.
Now, artificial intelligence has started bolstering their business as well.
And it may seem like artificial intelligence is just the next thing after online ads.
It's not.
It's a jump in category.
It's a whole different world, and it has great potential.
It could accelerate our understanding of many areas of study and research.
But to paraphrase a famous Hollywood philosopher, "With prodigious potential comes prodigious risk."
Now let's look at a basic fact of our digital lives, online ads.
Right? We kind of dismiss them.
They seem crude, ineffective.
We've all had the experience of being followed on the web by an ad based on something we searched or read.
You know, you look up a pair of boots and for a week, those boots are following you around everywhere you go.
Even after you succumb and buy them, they're still following you around.
We're kind of inured to that kind of basic, cheap manipulation.
We roll our eyes and we think, "You know what? These things don't work."
Except, online, the digital technologies are not just ads.
Now, to understand that, let's think of a physical world example.
You know how, at the checkout counters at supermarkets, near the cashier, there's candy and gum at the eye level of kids?
That's designed to make them whine at their parents just as the parents are about to sort of check out.
Now, that's a persuasion architecture.
It's not nice, but it kind of works.
That's why you see it in every supermarket.
Now, in the physical world, such persuasion architectures are kind of limited, because you can only put so many things by the cashier. Right?
And the candy and gum, it's the same for everyone, even though it mostly works only for people who have whiny little humans beside them.
In the physical world, we live with those limitations.
In the digital world, though, persuasion architectures can be built at the scale of billions and they can target, infer, understand and be deployed at individuals one by one by figuring out your weaknesses, and they can be sent to everyone's phone private screen, so it's not visible to us.
And that's different.
And that's just one of the basic things that artificial intelligence can do.
Now, let's take an example.
Let's say you want to sell plane tickets to Vegas. Right?
So in the old world, you could think of some demographics to target based on experience and what you can guess.
You might try to advertise to, oh, men between the ages of 25 and 35, or people who have a high limit on their credit card, or retired couples. Right?
That's what you would do in the past.
With big data and machine learning, that's not how it works anymore.
So to imagine that, think of all the data that Facebook has on you: every status update you ever typed, every Messenger conversation, every place you logged in from, all your photographs that you uploaded there.
If you start typing something and change your mind and delete it, Facebook keeps those and analyzes them, too.
Increasingly, it tries to match you with your offline data.
It also purchases a lot of data from data brokers.
It could be everything from your financial records to a good chunk of your browsing history.
Right? In the US, such data is routinely collected, collated and sold.
In Europe, they have tougher rules.
So what happens then is, by churning through all that data, these machine-learning algorithms -- that's why they're called learning algorithms -- they learn to understand the characteristics of people who purchased tickets to Vegas before.
When they learn this from existing data, they also learn how to apply this to new people.
So if they're presented with a new person, they can classify whether that person is likely to buy a ticket to Vegas or not.
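The learning step described above can be sketched in a few lines. This is only an illustration, not how Facebook's systems actually work: the features, the toy data, and the choice of logistic regression are all hypothetical stand-ins for "learn from people who bought before, then score a new person."

```python
import math

# Hypothetical, hand-made data: each person is a small feature vector
# (age in decades, credit-card limit in $10k, leisure trips per year),
# labeled 1 if they previously bought a Vegas ticket, else 0.
past_people = [
    ((2.9, 1.2, 3), 1),
    ((3.1, 1.5, 4), 1),
    ((2.7, 0.9, 2), 1),
    ((6.2, 0.5, 0), 0),
    ((4.5, 0.3, 1), 0),
    ((5.8, 0.4, 0), 0),
]

def sigmoid(z):
    # Clamp extreme logits so math.exp never overflows.
    if z < -30:
        return 0.0
    if z > 30:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.01, epochs=2000):
    """Fit logistic-regression weights by plain stochastic gradient descent."""
    w, b = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log-loss with respect to the logit
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

w, b = train(past_people)

def likely_buyer(person):
    """Score a new person the model has never seen before."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, person)) + b) > 0.5
```

The point of the sketch is the workflow, not the model: the system extracts regularities from past buyers and then applies them to people it has never seen — exactly the "learn, then classify new people" step the talk describes.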
Fine. You're thinking, an offer to buy tickets to Vegas.
I can ignore that.
But the problem isn't that.
The problem is, we no longer really understand how these complex algorithms work.
We don't understand how they're doing this categorization.
It's giant matrices, thousands of rows and columns, maybe millions of rows and columns, and not the programmers and not anybody who looks at it, even if you have all the data, understands anymore how exactly it's operating any more than you'd know what I was thinking right now if you were shown a cross section of my brain.
It's like we're not programming anymore, we're growing intelligence that we don't truly understand.
And these things only work if there's an enormous amount of data, so they also encourage deep surveillance on all of us so that the machine learning algorithms can work.
That's why Facebook wants to collect all the data it can about you.
The algorithms work better.
So let's push that Vegas example a bit.
What if the system that we do not understand was picking up that it's easier to sell Vegas tickets to people who are bipolar and about to enter the manic phase?
Such people tend to become overspenders, compulsive gamblers.
They could do this, and you'd have no clue that's what they were picking up on.
I gave this example to a bunch of computer scientists once and afterwards, one of them came up to me.
He was troubled and he said, "That's why I couldn't publish it."
I was like, "Couldn't publish what?"
He had tried to see whether you can indeed figure out the onset of mania from social media posts before clinical symptoms, and it had worked, and it had worked very well, and he had no idea how it worked or what it was picking up on.
Now, the problem isn't solved if he doesn't publish it, because there are already companies that are developing this kind of technology, and a lot of the stuff is just off the shelf.
This is not very difficult anymore.
Do you ever go on YouTube meaning to watch one video and an hour later you've watched 27?
You know how YouTube has this column on the right that says, "Up next" and it autoplays something?
It's an algorithm picking what it thinks that you might be interested in and maybe not find on your own.
It's not a human editor.
It's what algorithms do.
It picks up on what you have watched and what people like you have watched, and infers that that must be what you're interested in, what you want more of, and just shows you more.
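That "what people like you have watched" logic can be sketched as a toy collaborative filter. Everything here is hypothetical — the watch histories, the overlap-based notion of "similar user", and the scoring are illustrative placeholders, far simpler than any real recommender:

```python
from collections import Counter

# Hypothetical watch histories: user id -> set of watched video titles.
histories = {
    "u1": {"boots review", "vegas vlog", "poker tips"},
    "u2": {"vegas vlog", "poker tips", "casino tour"},
    "u3": {"boots review", "hiking gear"},
}

def recommend(user, histories, k=2):
    """Score videos the user hasn't seen by how many 'similar' users
    watched them; here 'similar' just means sharing at least one video."""
    seen = histories[user]
    scores = Counter()
    for other, their in histories.items():
        if other == user or not (seen & their):
            continue  # skip yourself and users with no overlap at all
        for video in their - seen:
            scores[video] += 1  # a real system would weight this score
    return [video for video, _ in scores.most_common(k)]
```

Note what's absent: nothing in the loop knows whether a video is a shoe ad, a cooking clip, or extremist content. It only knows co-viewing patterns, which is the point the talk goes on to make.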
It sounds like a benign and useful feature, except when it isn't.
So in 2016, I attended rallies of then-candidate Donald Trump to study as a scholar the movement supporting him.
I study social movements, so I was studying it, too.
And then I wanted to write something about one of his rallies, so I watched it a few times on YouTube.
YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism.
If I watched one, it served up one even more extreme and autoplayed that one, too.
If you watch Hillary Clinton or Bernie Sanders content, YouTube recommends and autoplays conspiracy left, and it goes downhill from there.
Well, you might be thinking, this is politics, but it's not.
This isn't about politics.
This is just the algorithm figuring out human behavior.
I once watched a video about vegetarianism on YouTube and YouTube recommended and autoplayed a video about being vegan.
It's like you're never hardcore enough for YouTube.
(Laughter)
So what's going on?
Now, YouTube's algorithm is proprietary, but here's what I think is going on.
The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they're more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.
Now, with nobody minding the ethics of the store, these sites can profile people who are Jew haters, who think that Jews are parasites and who have such explicit anti-Semitic content, and let you target them with ads.
They can also mobilize algorithms to find for you look-alike audiences, people who do not have such explicit anti-Semitic content on their profile but who the algorithm detects may be susceptible to such messages, and lets you target them with ads, too.
Now, this may sound like an implausible example, but this is real.
ProPublica investigated this and found that you can indeed do this on Facebook, and Facebook helpfully offered up suggestions on how to broaden that audience.
BuzzFeed tried it for Google, and very quickly they found, yep, you can do it on Google, too.
And it wasn't even expensive.
The ProPublica reporter spent about 30 dollars to target this category.
So last year, Donald Trump's social media manager disclosed that they were using Facebook dark posts to demobilize people, not to persuade them, but to convince them not to vote at all.
And to do that, they targeted specifically, for example, African-American men in key cities like Philadelphia, and I'm going to read exactly what he said.
I'm quoting.
They were using "nonpublic posts whose viewership the campaign controls so that only the people we want to see it see it.
We modeled this.
It will dramatically affect her ability to turn these people out."
What's in those dark posts?
We have no idea.
Facebook won't tell us.
So Facebook also algorithmically arranges the posts that your friends put on Facebook, or the pages you follow.
It doesn't show you everything chronologically.
It puts the order in the way that the algorithm thinks will entice you to stay on the site longer.
Now, so this has a lot of consequences.
You may be thinking somebody is snubbing you on Facebook.
The algorithm may never be showing your post to them.
The algorithm is prioritizing some of them and burying the others.
Experiments show that what the algorithm picks to show you can affect your emotions.
But that's not all.
It also affects political behavior.
So in 2010, in the midterm elections, Facebook did an experiment on 61 million people in the US that was disclosed after the fact.
So some people were shown, "Today is election day," the simpler one, and some people were shown the one with that tiny tweak with those little thumbnails of your friends who clicked on "I voted."
This simple tweak.
OK? So the pictures were the only change, and that post shown just once turned out an additional 340,000 voters in that election, according to this research as confirmed by the voter rolls.
A fluke? No.
Because in 2012, they repeated the same experiment.
And that time, that civic message shown just once turned out an additional 270,000 voters.
For reference, the 2016 US presidential election was decided by about 100,000 votes.
Now, Facebook can also very easily infer what your politics are, even if you've never disclosed them on the site.
Right? These algorithms can do that quite easily.
What if a platform with that kind of power decides to turn out supporters of one candidate over the other?
How would we even know about it?
Now, we started from someplace seemingly innocuous -- online ads following us around -- and we've landed someplace else.
As a public and as citizens, we no longer know if we're seeing the same information or what anybody else is seeing, and without a common basis of information, little by little, public debate is becoming impossible, and we're just at the beginning stages of this.
These algorithms can quite easily infer things like your people's ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and genders, just from Facebook likes.
These algorithms can identify protesters even if their faces are partially concealed.
These algorithms may be able to detect people's sexual orientation just from their dating profile pictures.
Now, these are probabilistic guesses, so they're not going to be 100 percent right, but I don't see the powerful resisting the temptation to use these technologies just because there are some false positives, which will of course create a whole other layer of problems.
Imagine what a state can do with the immense amount of data it has on its citizens.
China is already using face detection technology to identify and arrest people.
And here's the tragedy: we're building this infrastructure of surveillance authoritarianism merely to get people to click on ads.
And this won't be Orwell's authoritarianism.
This isn't "1984."
Now, if authoritarianism is using overt fear to terrorize us, we'll all be scared, but we'll know it, we'll hate it and we'll resist it.
But if the people in power are using these algorithms to quietly watch us, to judge us and to nudge us, to predict and identify the troublemakers and the rebels, to deploy persuasion architectures at scale and to manipulate individuals one by one using their personal, individual weaknesses and vulnerabilities, and if they're doing it at scale through our private screens so that we don't even know what our fellow citizens and neighbors are seeing, that authoritarianism will envelop us like a spider's web and we may not even know we're in it.
So Facebook's market capitalization is approaching half a trillion dollars.
It's because it works great as a persuasion architecture.
But the structure of that architecture is the same whether you're selling shoes or whether you're selling politics.
The algorithms do not know the difference.
The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows, and that's what's got to change.
Now, don't get me wrong, we use digital platforms because they provide us with great value.
I use Facebook to keep in touch with friends and family around the world.
I've written about how crucial social media is for social movements.
I have studied how these technologies can be used to circumvent censorship around the world.
But it's not that the people who run, you know, Facebook or Google are maliciously and deliberately trying to make the country or the world more polarized and encourage extremism.
I read the many well-intentioned statements that these people put out.
But it's not the intent or the statements people in technology make that matter, it's the structures and business models they're building.
And that's the core of the problem.
Either Facebook is a giant con of half a trillion dollars and ads don't work on the site, it doesn't work as a persuasion architecture, or its power of influence is of great concern.
It's either one or the other.
It's similar for Google, too.
So what can we do?
This needs to change.
Now, I can't offer a simple recipe, because we need to restructure the whole way our digital technology operates.
Everything from the way technology is developed to the way the incentives, economic and otherwise, are built into the system.
We have to face and try to deal with the lack of transparency created by the proprietary algorithms, the structural challenge of machine learning's opacity, all this indiscriminate data that's being collected about us.
We have a big task in front of us.
We have to mobilize our technology, our creativity and yes, our politics so that we can build artificial intelligence that supports us in our human goals but that is also constrained by our human values.
And I understand this won't be easy.
We might not even easily agree on what those terms mean.
But if we take seriously how these systems that we depend on for so much operate, I don't see how we can postpone this conversation anymore.
These structures are organizing how we function and they're controlling what we can and we cannot do.
And many of these ad-financed platforms, they boast that they're free.
In this context, it means that we are the product that's being sold.
We need a digital economy where our data and our attention is not for sale to the highest-bidding authoritarian or demagogue.
(Applause)
So to go back to that Hollywood paraphrase, we do want the prodigious potential of artificial intelligence and digital technology to blossom, but for that, we must face this prodigious menace, open-eyed and now.
Thank you.
(Applause)