
Veronica Barassi (2019): How much do tech companies know about your children?

Every day, every week, we agree to terms and conditions.
And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children.
Which makes us wonder: how much of our children's data are we giving away, and what are its implications?
I'm an anthropologist, and I'm also the mother of two little girls.
And I started to become interested in this question in 2015 when I suddenly realized that there were vast -- almost unimaginable amounts of data traces that are being produced and collected about children.
So I launched a research project, which is called Child Data Citizen, and I aimed at filling in the blank.
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point.
The problem is way bigger than so-called "sharenting."
This is about systems, not individuals.
You and your habits are not to blame.
For the very first time in history, we are tracking the individual data of children from long before they're born -- sometimes from the moment of conception, and then throughout their lives.
You see, when parents decide to conceive, they go online to look for "ways to get pregnant,"
or they download ovulation-tracking apps.
When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps or they consult Dr. Google for all sorts of things, like, you know -- for "miscarriage risk when flying"
or " abdominal cramps in early pregnancy." 或者“怀孕早期的腹痛”。
I know because I've done it -- and many times.
And then, when the baby is born, they track every nap, every feed, every life event on different technologies.
And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties.
And these third parties shared information with 216 other organizations.
Of these 216 other fourth parties, only three belonged to the health sector.
The other companies that had access to that data were big tech companies like Google, Facebook or Oracle, they were digital advertising companies and there was also a consumer credit reporting agency.
So you get it right: ad companies and credit agencies may already have data points on little babies.
But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives.
They're tracked by home technologies and virtual assistants in their homes.
They're tracked by educational platforms and educational technologies in their schools.
They're tracked by online records and online portals at their doctor's office.
They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.
So during my research, a lot of parents came up to me and they were like, "So what?
Why does it matter if my children are being tracked?
We've got nothing to hide."
Well, it matters.
It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces.
Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments.
And then they bring this data together to make data-driven decisions about the individual.
And these technologies are used everywhere.
Banks use them to decide loans.
Insurance companies use them to decide premiums.
Recruiters and employers use them to decide whether one is a good fit for a job or not.
Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children.
But these profiles can come to impact our rights in significant ways.
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services --
that are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship -- had been sold to educational data brokers.
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories.
And then they sell these profiles together with the name of the kid, their home address and the contact details to different companies, including trade and career institutions, student loans and student credit card companies.
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services.
The data broker agreed to provide them the list.
So imagine how intimate and how intrusive that is for our kids.
But educational data brokers are really just an example.
The truth is that our children are being profiled in ways that we cannot control but that can significantly impact their chances in life.
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children?
Can we?
My answer is no.
As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change.
But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives.
Because they can't profile humans.
Data traces are not the mirror of who we are.
Humans think one thing and say the opposite, feel one way and act differently.
Algorithmic predictions or our digital practices cannot account for the unpredictability and complexity of human experience.
But on top of that, these technologies are always -- always -- in one way or another, biased.
You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK?
But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values.
So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.
At the moment, we're seeing the first examples of algorithmic bias.
And some of these examples are frankly terrifying.
This year, the AI Now Institute in New York published a report that revealed that the AI technologies that are being used for predictive policing have been trained on "dirty" data.
This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices.
Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.
So I think we are faced with a fundamental problem in our society.
We are starting to trust technologies when it comes to profiling human beings.
We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate.
So what we need now is actually a political solution.
We need governments to recognize that our data rights are our human rights.
(Applause and cheers)
Until this happens, we cannot hope for a more just future.
I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error.
You see, the difference between me and my daughters is that there's no public record out there of my childhood.
There's certainly no database of all the stupid things that I've done and thought when I was a teenager.
(Laughter)
But for my daughters this may be different.
The data that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.
I think that it's time.
It's time that we all step up.
It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.
Thank you.
(Applause)