|
|
Iyad Rahwan 2016X: 无人车的道德评判标准是什么? (What moral decisions should driverless cars make?)
|
Today I'm going to talk about technology and society. |
今天我想谈谈技术和社会。 |
technology:n.技术;工艺;术语;
|
The Department of Transport estimated that last year 35,000 people died from traffic crashes in the US alone. |
据交通部的估算, 在美国,仅去年就有 3万5千人死于交通事故。 |
Transport:n.运输;输送;运送;运输机;v.输送;传播;使产生身临其境的感觉;(旧时)流放; estimated:adj.估计的;预计的;估算的;
|
Worldwide, 1.2 million people die every year in traffic accidents. |
而在全世界,每年则有 120万人死于交通事故。 |
Worldwide:adj.全世界的;adv.在世界各地;
|
If there was a way we could eliminate 90 percent of those accidents, would you support it? |
如果有一种方法能减少90%的交通事故, 你会支持它吗? |
eliminate:v.消除;排除;
|
Of course you would. |
答案绝对是肯定的。 |
This is what driverless car technology promises to achieve by eliminating the main source of accidents -- human error. |
这就是无人车技术所承诺实现的目标, 通过消除造成事故的主要原因—— 人为过错。 |
driverless:无人驾驶的; eliminating:v.排除;清除;消除;淘汰;(eliminate的现在分词) source:n.来源;水源;原始资料;
|
Now picture yourself in a driverless car in the year 2030, sitting back and watching this vintage TEDxCambridge video. |
现在想象一下你在2030年中的一天, 坐在一辆无人车里 悠闲地观看我这个过时的 TEDxCambridge视频。 |
vintage:n.葡萄收获期; adj.古老的; v.采葡萄; vi.采葡萄;
|
(Laughter) |
(笑声) |
All of a sudden, the car experiences mechanical failure and is unable to stop. |
突然间, 车子出现了机械故障,刹车失灵了。 |
All of a sudden:突然地,出乎意料地; mechanical:adj.机械的;力学的;呆板的;无意识的;手工操作的;
|
If the car continues, it will crash into a bunch of pedestrians crossing the street, but the car may swerve, hitting one bystander, killing them to save the pedestrians. |
如果车继续行驶, 就会冲入正在穿越人行道的人群中, 但是车还可能转向, 撞到路边一个不相干的人, 用他的生命来换那些行人的生命。 |
a bunch of:一群;一束;一堆; pedestrians:n.行人(pedestrian的复数); swerve:vi.转弯;突然转向;背离;vt.使转弯;使突然转向;使背离;n.转向;偏离的程度; bystander:n.旁观者;看热闹的人;
|
What should the car do, and who should decide? |
这辆车该怎么做, 又该是谁来做这个决定呢? |
What if instead the car could swerve into a wall, crashing and killing you, the passenger, in order to save those pedestrians? |
再如果,这辆车会转向并撞墙, 连你在内人车俱毁, 从而挽救其他人的生命, 这会是个更好的选择吗? |
What if:如果…怎么办? passenger:n.旅客;乘客;白吃饭的人;闲散人员;
|
This scenario is inspired by the trolley problem, which was invented by philosophers a few decades ago to think about ethics. |
这个场景假设是受到了 “电车问题”的启发, 这是几十年前由一群哲学家 发起的对道德的拷问。 |
scenario:n.方案;情节;剧本; inspired:adj.受到启发的; v.鼓舞; (inspire的过去分词和过去式) trolley:n.手推车;(美)有轨电车;vi.乘电车;vt.用手推车运; philosophers:n.哲学家(philosopher的复数); ethics:n.伦理学;伦理观;道德标准;
|
Now, the way we think about this problem matters. |
我们如何思考这个问题非常关键。 |
We may for example not think about it at all. |
比如,我们也许压根儿就 不会去想这个问题。 |
We may say this scenario is unrealistic, incredibly unlikely, or just silly. |
我们可以辩称这个场景假设不现实, 太不靠谱,简直无聊透顶。 |
unrealistic:adj.不切实际的;不实在的; incredibly:adv.难以置信地;非常地; unlikely:adj.不大可能发生的;非心目中的;非想象的;难以相信的;
|
But I think this criticism misses the point because it takes the scenario too literally. |
不过我觉得这种批判没有切中要害, 因为仅仅是停留在了问题表面。 |
criticism:n.批评;批判;评论;指责; literally:adv.按字面:字面上:确实地:
|
Of course no accident is going to look like this; no accident has two or three options where everybody dies somehow. |
当然没有任何事故会出现这种情况; 没有哪个事故会同时出现2-3种选择, 而每种选择中都会有人失去生命。 |
options:n.选择; v.得到或获准进行选择; (option的三单形式) somehow:adv.以某种方法;莫名其妙地;
|
Instead, the car is going to calculate something like the probability of hitting a certain group of people, if you swerve one direction versus another direction, |
相反,车辆会进行某种计算, 比如向这个方向 而不是那个方向转向时, 撞到某一群人的概率, |
probability:n.可能性;机率;[数]或然率; versus:prep.对;与...相对;对抗;
|
you might slightly increase the risk to passengers or other drivers versus pedestrians. |
相对于行人来说,你可能略微 增加了乘客,或者其他驾驶员 受伤的可能性。 |
slightly:adv.些微地,轻微地;纤细地;
|
It's going to be a more complex calculation, but it's still going to involve trade-offs, and trade-offs often require ethics. |
这将会是一个更加复杂的计算, 不过仍然会涉及到某种权衡, 而这种权衡经常需要做出道德考量。 |
complex:adj.复杂的;合成的;n.复合体;综合设施; involve:v.包含;需要;牵涉;牵连;影响;(使)参加; trade-offs:n.权衡(trade-off的复数);交易;物物交换;
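The kind of calculation described above can be sketched as a toy expected-harm comparison. Everything here is an invented illustration: the `expected_harm` helper, the probabilities, and the group sizes are made up for this sketch, not anything from the talk or a real vehicle.

```python
# Toy expected-harm comparison between two maneuvers. All probabilities
# and group sizes below are invented illustration values, not data.

def expected_harm(outcomes):
    """outcomes: list of (probability_of_impact, people_at_risk) pairs."""
    return sum(p * n for p, n in outcomes)

# Option A: continue straight -- likely to hit several pedestrians.
straight = [(0.5, 4)]
# Option B: swerve -- risk shifts to one bystander and the passenger.
swerve = [(0.25, 1), (0.25, 1)]

best = min([("straight", straight), ("swerve", swerve)],
           key=lambda option: expected_harm(option[1]))
print(best[0])  # the lower-expected-harm maneuver
```

Even this tiny sketch makes the talk's point: choosing which risks to weigh, and how, is already an ethical trade-off smuggled into a numeric calculation.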
|
We might say then, "Well, let's not worry about this. |
我们可能会说,“还是别杞人忧天了。 |
Let's wait until technology is fully ready and 100 percent safe." |
不如等到技术完全成熟,能达到 100%安全的时候再用吧。” |
Suppose that we can indeed eliminate 90 percent of those accidents, or even 99 percent in the next 10 years. |
假设我们的确可以在 未来的10年内消除90%, 甚至99%的事故。 |
Suppose:v.推断:假定:假设:设想:
|
What if eliminating the last one percent of accidents requires 50 more years of research? |
如果消除最后这1%的事故 却需要再研究50年才能实现呢? |
Should we not adopt the technology? |
我们是不是应该放弃这项技术了? |
adopt:v.采取;接受;收养;正式通过;
|
That's 60 million people dead in car accidents if we maintain the current rate. |
按照目前的死亡率计算, 那可还要牺牲 6千万人的生命啊。 |
maintain:v.维持;保持;维修;保养;坚持(意见);
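The 60 million figure above follows directly from the earlier worldwide number; a quick arithmetic check:

```python
# Quick check of the talk's arithmetic: 1.2 million deaths per year,
# times a hypothetical 50 extra years of waiting for 100% safety.
deaths_per_year = 1_200_000   # worldwide figure quoted earlier
years_of_waiting = 50         # the hypothetical delay for the last 1%

total = deaths_per_year * years_of_waiting
print(total)  # 60000000 -- the "60 million people" in the text
```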
|
So the point is, waiting for full safety is also a choice, and it also involves trade-offs. |
所以关键在于, 等待万无一失的技术也是一种选择, 这里也有权衡的考虑。 |
involves:v.包含;需要;牵涉;牵连;影响;(使)参加,加入(involve的第三人称单数)
|
People online on social media have been coming up with all sorts of ways to not think about this problem. |
社交媒体上的人们想尽了办法 去回避这个问题。 |
media:n.媒体;媒质(medium的复数);血管中层;浊塞音;中脉;
|
One person suggested the car should just swerve somehow in between the pedestrians -- |
有人建议无人车应该把握好角度, 刚好从人群和路边的 无辜者之间的缝隙—— |
(Laughter) |
(笑声) |
and the bystander. |
穿过去。 |
Of course if that's what the car can do, that's what the car should do. |
当然,如果车辆能做到这一点, 毫无疑问就应该这么做。 |
We're interested in scenarios in which this is not possible. |
我们讨论的是无法实现这一点的情况。 |
scenarios:n.情节;脚本;情景介绍(scenario的复数);
|
And my personal favorite was a suggestion by a blogger to have an eject button in the car that you press -- |
我个人比较赞同一个博主的点子, 在车里加装一个弹射按钮—— |
personal:adj.个人的;身体的;亲自的;n.人事消息栏;人称代名词; blogger:n.写博客的人;博客使用者; eject:vt.喷射;驱逐,逐出;
|
(Laughter) |
(笑声) |
just before the car self-destructs. |
在车辆自毁前按一下就行了。 |
self-destructs:vi.自毁;adj.自毁的;自杀的;
|
(Laughter) |
(笑声) |
So if we acknowledge that cars will have to make trade-offs on the road, how do we think about those trade-offs, and how do we decide? |
那么如果我们认同车辆 将不得不在行驶中做出权衡的话, 我们要如何考量这种权衡 并做出决策呢? |
Well, maybe we should run a survey to find out what society wants, because ultimately, regulations and the law are a reflection of societal values. |
也许我们应该做些调查问卷 看看大众是什么想法, 毕竟最终, 规则和法律应该反映社会价值。 |
survey:n.调查;测量;审视;纵览;vt.调查;勘测;俯瞰;vi.测量土地; ultimately:adv.最终;最后;归根结底;终究; regulations:n.章程;规则;法规;管理,控制;(regulation的复数) reflection:n.反映;沉思;映像;深思; societal:adj.社会的;
|
So this is what we did. |
所以我们做了这么件事儿。 |
With my collaborators , |
跟我的合作者 |
collaborators:n.[劳经]合作者;投敌者(collaborator的复数);
|
Jean-François Bonnefon and Azim Shariff, we ran a survey in which we presented people with these types of scenarios. |
让·弗朗索瓦·伯尼夫和 阿齐姆·谢里夫一起, 我们做了一项调查问卷, 为人们列举了这些假设的场景。 |
We gave them two options inspired by two philosophers: |
受哲学家杰里米·边沁(英国)和 伊曼努尔·康德(德国)的启发, |
Jeremy Bentham and Immanuel Kant. |
我们给出了两种选择。 |
Bentham:n.边沁(英国哲学家);
|
Bentham says the car should follow utilitarian ethics: it should take the action that will minimize total harm -- even if that action will kill a bystander and even if that action will kill the passenger. |
边沁认为车辆应该 遵循功利主义道德: 它应该采取最小伤害的行动—— 即使是以牺牲一个无辜者为代价, 即使会令乘客身亡。 |
utilitarian:adj.功利的;功利主义的;实利的;n.功利主义者; minimize:v.使减少到最低限度;降低;贬低;使显得不重要;
|
Immanuel Kant says the car should follow duty-bound principles, like "Thou shalt not kill." |
伊曼努尔·康德则认为 车辆应该遵循义不容辞的原则, 比如“不可杀人。” |
duty-bound:adj.义不容辞的; principles:n.原则;主义;本质;政策;(principle的复数) shalt:v.应该;将要;必须(shall的第二人称单数现在式);
|
So you should not take an action that explicitly harms a human being, and you should let the car take its course even if that's going to harm more people. |
因此你不应该有意 去伤害一个人, 应该让车顺其自然行驶, 即使这样会伤害到更多的人。 |
explicitly:adv.明确地;明白地;
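The two philosophies can be caricatured as two decision rules over the same set of actions. This is a loose sketch, not a serious formalization; the action tuples and both rule functions are invented for illustration.

```python
# Caricature of the two rules as decision procedures over the same actions.
# Each action: (name, total_deaths, actively_redirects_harm) -- all
# invented illustration values.
actions = [
    ("stay", 3, False),   # let the car take its course; hits the pedestrians
    ("swerve", 1, True),  # actively redirects harm onto one bystander
]

def bentham(actions):
    """Utilitarian: take whatever action minimizes total harm."""
    return min(actions, key=lambda a: a[1])[0]

def kant(actions):
    """Duty-bound: never act to explicitly harm someone, even if
    letting the car take its course harms more people."""
    passive = [a for a in actions if not a[2]]
    return min(passive, key=lambda a: a[1])[0]

print(bentham(actions), kant(actions))  # swerve stay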
|
What do you think? |
你会怎么选择? |
Bentham or Kant? |
支持边沁还是康德? |
Here's what we found. |
我们得到的结果是这样的。 |
Most people sided with Bentham. |
大部分人赞同边沁的观点。 |
So it seems that people want cars to be utilitarian, minimize total harm, and that's what we should all do. |
所以人们似乎希望 车辆是功利主义的, 将伤害降到最小, 我们都应该这么做。 |
Problem solved. |
问题解决了。 |
But there is a little catch. |
不过这里还有个小插曲。 |
When we asked people whether they would purchase such cars, they said, "Absolutely not." |
当我们问大家他们 会不会买这样一辆车时, 他们不约而同地回答,“绝对不会。” |
purchase:n.购买;采购;购买的东西;购买项目;v.购买;采购; Absolutely not:绝对不会;绝对不是;绝对不行;
|
(Laughter) |
(笑声) |
They would like to buy cars that protect them at all costs, but they want everybody else to buy cars that minimize harm. |
他们更希望买能够 不顾一切保障自己安全的车, 不过却指望其他人 都买能将伤害降到最低的车。 |
at all costs:无论如何,不惜任何代价;
|
(Laughter) |
(笑声) |
We've seen this problem before. |
这个问题以前就出现过。 |
It's called a social dilemma. |
叫做社会道德困境。 |
dilemma:n.困境;进退两难;两刀论法;
|
And to understand the social dilemma, we have to go a little bit back in history. |
为了理解这个概念, 我们要先简单回顾一下历史。 |
In the 1800s, |
在19世纪, |
English economist William Forster Lloyd published a pamphlet which describes the following scenario. |
英国经济学家威廉·福斯特·劳埃德 出版了一个宣传册, 里面描述了这样一个场景。 |
pamphlet:n.小册子; describes:v.描述;形容;把…称为;画出…图形;(describe的第三人称单数)
|
You have a group of farmers -- |
有一群农场主, |
English farmers -- who are sharing a common land for their sheep to graze. |
英国农场主, 共同在一片地里放羊。 |
graze:vt.放牧;擦伤;vi.吃草;擦伤;n.放牧;轻擦;
|
Now, if each farmer brings a certain number of sheep -- let's say three sheep -- the land will be rejuvenated, the farmers are happy, the sheep are happy, everything is good. |
如果每个农场主都 带了一定数量的羊, 比如每家三只, 这片土地上的植被还可以正常再生, 农场主们自然高兴, 羊群也自在逍遥, 一切都相安无事。 |
rejuvenated:adj.更生的; v.使恢复青春;
|
Now, if one farmer brings one extra sheep, that farmer will do slightly better, and no one else will be harmed. |
如果有一个农场主多放了一只羊, 他就会获益更多, 不过其他人也都没什么损失。 |
extra:adj.额外的:n.额外的事物:adv.额外:另外:
|
But if every farmer made that individually rational decision, the land will be overrun, and it will be depleted to the detriment of all the farmers, and of course, to the detriment of the sheep. |
但是如果每个农场主 都擅自增加羊的数量, 土地容量就会饱和,变得不堪重负, 所有农场主都会受损, 当然,羊群也会开始挨饿。 |
individually:adv.个别地,单独地; rational:n.理性;人类;合理的事物;[数]有理数;adj.合理的;理性的;明智的;理智的; overrun:n.泛滥成灾;超出限度;vt.泛滥;超过;蹂躏;vi.泛滥;蔓延; depleted:v.大量减少;耗尽;使枯竭;(deplete的过去分词和过去式) detriment:n.损害;伤害;损害物;
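Lloyd's pasture can be simulated with a toy payoff model: each extra sheep benefits its owner, but past the land's capacity, everyone's grass runs out. The capacity and degradation rate below are invented illustration values, not anything from the pamphlet.

```python
# Toy model of Lloyd's common pasture. The capacity and the degradation
# rate are invented illustration values.

N_FARMERS = 5
CAPACITY = N_FARMERS * 3   # the land sustains three sheep per farmer

def payoff_per_sheep(total_sheep):
    """Grass per sheep degrades once the land is overrun."""
    over = max(0, total_sheep - CAPACITY)
    return max(0.0, 1.0 - 0.1 * over)

def farmer_payoff(my_sheep, total_sheep):
    return my_sheep * payoff_per_sheep(total_sheep)

cooperate = farmer_payoff(3, 15)      # everyone keeps three sheep
defect_alone = farmer_payoff(4, 16)   # one farmer sneaks in a fourth
all_defect = farmer_payoff(4, 20)     # every farmer defects
print(cooperate, defect_alone, all_defect)
```

The lone defector does better than a cooperator, but when every farmer makes that same individually rational choice, each one ends up worse off than under cooperation: the structure of the social dilemma in three lines of arithmetic.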
|
We see this problem in many places: in the difficulty of managing overfishing, or in reducing carbon emissions to mitigate climate change. |
我们在很多场合都见到过这个问题: 比如过度捕捞的困境, 或者应对气候变化的碳减排。 |
overfishing:n.渔捞过度;vt.过度捕捞(overfish的现在分词); carbon:n.[化学]碳;碳棒;复写纸;adj.碳的;碳处理的; emissions:n.(光、热、气等的)发出,排放;排放物;散发物;(emission的复数) mitigate:vt.使缓和,使减轻;vi.减轻,缓和下来;
|
When it comes to the regulation of driverless cars, the common land now is basically public safety -- that's the common good -- and the farmers are the passengers or the car owners who are choosing to ride in those cars. |
而到了无人车的制度问题, 公共土地在这里指的就是公共安全, 也就是公共利益, 而农场主就是乘客, 或者车主,决定乘车出行的人。 |
basically:adv.主要地,基本上;
|
And by making the individually rational choice of prioritizing their own safety, they may collectively be diminishing the common good, which is minimizing total harm. |
通过自作主张把自己的安全凌驾于 其他人的利益之上, 他们可能共同损害了 能将总损失降到最低的 公共利益。 |
prioritizing:v.按重要性排列;划分优先顺序;优先处理;(prioritize的现在分词) collectively:adv.共同地,全体地; diminishing:v.减少; adj.逐渐缩小的; minimizing:v.使减少到最低限度;降低;贬低;使显得不重要;(minimize的现在分词)
|
It's called the tragedy of the commons, traditionally, but I think in the case of driverless cars, the problem may be a little bit more insidious because there is not necessarily an individual human being making those decisions. |
传统上把这称为 公地悲剧, 不过我认为对于无人车来说, 问题可能是更深层次的, 因为并没有一个人 去做决策。 |
tragedy:n.悲惨的事;不幸;灾难;悲剧作品; traditionally:adv.传统上;习惯上;传说上; insidious:adj.阴险的;隐伏的;暗中为害的;狡猾的; necessarily:adv.必要地;必定地,必然地;
|
So car manufacturers may simply program cars that will maximize safety for their clients , and those cars may learn automatically on their own that doing so requires slightly increasing risk for pedestrians. |
那么无人车制造商可能会 简单地把行车电脑程序 设定成最大程度保护车主的安全, 而那些车可能会自己学到, 这样做需要略微增加 行人面临的风险。 |
manufacturers:n.生产者;制造者;生产商;(manufacturer的复数) maximize:vt.取…最大值;对…极为重视;vi.尽可能广义地解释;达到最大值; clients:n.委托人;当事人;客户机;(client的复数) automatically:adv.自动地;机械地;无意识地;adj.不经思索的;
|
So to use the sheep metaphor, it's like we now have electric sheep that have a mind of their own. |
跟羊群的比喻类似, 这就好像换成了一批 可以自己思考的机器羊。 |
metaphor:n.暗喻,隐喻;比喻说法; electric:n.供电;adj.电的;用电的;电动的;发电的;
|
(Laughter) |
(笑声) |
And they may go and graze even if the farmer doesn't know it. |
它们可能会自己去吃草, 而农场主对此毫不知情。 |
So this is what we may call the tragedy of the algorithmic commons, and it offers new types of challenges. |
这就是我们所谓的“算法公地悲剧”, 它会带来新类型的挑战。 |
algorithmic:adj.[数]算法的;规则系统的;
|
Typically, traditionally, we solve these types of social dilemmas using regulation, so either governments or communities get together, and they decide collectively what kind of outcome they want |
通常在传统模式下, 我们可以通过制定规则 来解决这些社会道德困境, 政府或者社区共同商讨决定 他们能够接受什么样的后果, |
Typically:adv.代表性地;作为特色地; dilemmas:n.困境(dilemma的复数); communities:n.社区;社会;团体;共有(community的复数) get together:聚会 outcome:n.结果,结局;成果;
|
and what sort of constraints on individual behavior they need to implement. |
以及需要对个人行为施加 什么形式的限制。 |
constraints:n.[数]约束;限制;约束条件(constraint的复数形式); implement:v.实施;执行;贯彻;使生效;n.工具;
|
And then using monitoring and enforcement, they can make sure that the public good is preserved. |
通过监管和强制执行, 就可以确定公共利益得到了保障。 |
enforcement:n.执行,实施;强制; preserved:v.保护;维护;保留;保存;保养;(preserve的过去式和过去分词)
|
So why don't we just, as regulators, require that all cars minimize harm? |
那么,作为监管者, 我们为什么不直接要求 所有无人车把危害降到最低呢? |
regulators:n.调整者;调节阀(regulator的复数形式);
|
After all, this is what people say they want. |
毕竟这是所有人的共同意愿。 |
And more importantly, |
更重要的是, |
I can be sure that as an individual, if I buy a car that may sacrifice me in a very rare case, |
作为一个个体,我很确定 如果我买了一辆会在极端情况下 牺牲我的利益的车, |
sacrifice:n.牺牲;舍弃;祭献;祭祀;祭品;v.牺牲;献出;作祭献
|
I'm not the only sucker doing that while everybody else enjoys unconditional protection. |
我不会是唯一一个自残, 让其他所有人都受到无条件保护的人。 |
sucker:n.吸管;乳儿;易受骗的人;v.成为吸根;长出根出条; unconditional:adj.无条件的;绝对的;无限制的;
|
In our survey, we did ask people whether they would support regulation and here's what we found. |
在我们的调查问卷中确实 也问了人们,是否会支持立法, 调查结果如下。 |
First of all, people said no to regulation; and second, they said, "Well if you regulate cars to do this and to minimize total harm, |
首先,人们并不赞同立法, 其次,他们认为, “如果你们要制定规则保证 这些车造成的损失最小, |
First of all:adv.首先; regulate:v.调节;控制;
|
I will not buy those cars." |
那我肯定不会买。” |
So ironically, by regulating cars to minimize harm, we may actually end up with more harm because people may not opt into the safer technology even if it's much safer than human drivers. |
讽刺的是, 让无人车遵循最小损失原则, 我们得到的反而可能是更大的损失, 因为人们可能放弃 使用这种更安全的技术, 即便其安全性远超过人类驾驶员。 |
ironically:adv.讽刺地;说反话地; regulating:vt.调节;校正(regulate的现在分词);
|
I don't have the final answer to this riddle, but I think as a starting point, we need society to come together to decide what trade-offs we are comfortable with and to come up with ways in which we can enforce those trade-offs. |
对于这场争论 我并没有得到最终的答案, 不过我认为作为一个开始, 我们需要团结整个社会 来决定哪种折中方案 是大家都可以接受的, 更要商讨出可以有效推行 这种权衡决策的方法。 |
riddle:n.谜;谜语;神秘事件;无法解释的情况;v.使布满窟窿; starting point:n.出发点;基础; come up with:提出;想出;赶上;
|
As a starting point, my brilliant students built the Moral Machine website, which generates random scenarios like these for people to judge. |
以此为基础,我的两位出色的学生 搭建了“道德机器”(Moral Machine)网站, 可以随机生成这类场景 供人们做出判断。 |
And we vary the ages and even the species of the different victims. |
我们还改变了不同(潜在) 受害者的年龄,甚至物种。 |
vary:vi.变化;变异;违反;vt.改变;使多样化;变奏; species:n.[生物]物种;种类;
|
So far we've collected over five million decisions by over one million people worldwide from the website. |
目前我们已经搜集到了 超过5百万份决定, 来自于全世界超过1百万人 在网上给出的答案。 |
And this is helping us form an early picture of what trade-offs people are comfortable with and what matters to them -- even across cultures. |
这帮助我们 形成了一个概念雏形, 告诉了我们对人们来说 哪些折中方案最适用, 他们最在意的是什么—— 甚至跨越了文化障碍。 |
But more importantly, doing this exercise is helping people recognize the difficulty of making those choices and that the regulators are tasked with impossible choices. |
不过更重要的是, 这项练习能帮助人们认识到 做出这些选择有多难, 而立法者更是被要求做出 不现实的选择。 |
recognize:v.认识;认出;辨别出;承认;意识到;
|
And maybe this will help us as a society understand the kinds of trade-offs that will be implemented ultimately in regulation. |
这还可能帮助我们整个社会 去理解那些最终将会被 纳入法规的折中方案。 |
implemented:v.使生效;贯彻;执行;实施;(implement的过去式和过去分词)
|
And indeed, I was very happy to hear that the first set of regulations that came from the Department of Transport -- announced last week -- included a 15-point checklist for all carmakers to provide, and number 14 was ethical consideration -- how are you going to deal with that. |
诚然,我很高兴听到 第一套由交通部 批准的法规—— 上周刚刚公布—— 囊括了需要所有无人车厂商 提供的15点清单, 而其中第14点就是道德考量—— 要如何处理道德困境。 |
checklist:n.清单;检查表;备忘录;目录册; carmakers:n.汽车制造商(carmaker的复数);汽车制造厂; ethical:adj.伦理的;道德的;凭处方出售的;n.处方药; consideration:n.顾及;报酬;斟酌;仔细考虑;
|
We also have people reflect on their own decisions by giving them summaries of what they chose. |
我们还通过为大家 提供自己选择的概要, 让人们反思自己的决定。 |
reflect on:仔细考虑,思考;反省;回想,回顾;怀疑; summaries:n.总结(summary的复数);
|
I'll give you one example -- |
给大家举个例子—— |
I'm just going to warn you that this is not your typical example, your typical user. |
我要提醒大家 这不是一个典型的例子, 也不是典型的车主。 |
This is the most sacrificed and the most saved character for this person. |
这个人有着最容易牺牲(儿童), 也最容易被保护的特征(宠物)。 |
sacrificed:v.牺牲:献出:以(人或动物)作祭献:(sacrifice的过去分词和过去式)
|
(Laughter) |
(笑声) |
Some of you may agree with him, or her, we don't know. |
你们中有人可能会赞同他, 或者她,我们并不知道其性别。 |
But this person also seems to slightly prefer passengers over pedestrians in their choices and is very happy to punish jaywalking . |
不过这位调查对象 也似乎更愿意保护乘客, 而不是行人, 甚至相当支持严惩横穿马路的行人。 |
prefer:v.更喜欢;宁愿;提出;提升; jaywalking:n.走路不遵守交通规则;v.(不遵守交通规则)乱穿马路(jaywalk的ing形式);
|
(Laughter) |
(笑声) |
So let's wrap up. |
那么我们来总结一下。 |
wrap:v.缠绕;隐藏;掩护;包起来;缠绕;穿外衣;n.外套;围巾;
|
We started with the question -- let's call it the ethical dilemma -- of what the car should do in a specific scenario: swerve or stay? |
我们由一个问题开始—— 就叫它道德困境问题—— 关于在特定条件下 无人车应该如何抉择: 转向还是直行? |
specific:adj.特殊的,特定的;明确的;详细的;[药]具有特效的;n.特性;细节;特效药;
|
But then we realized that the problem was a different one. |
但之后我们意识到这并不是问题的核心。 |
It was the problem of how to get society to agree on and enforce the trade-offs they're comfortable with. |
关键的问题在于如何让 大众在他们能够接受的权衡方案中 达成一致并付诸实施。 |
It's a social dilemma. |
这是个社会道德困境。 |
In the 1940s, Isaac Asimov wrote his famous laws of robotics -- the three laws of robotics. |
在20世纪40年代, 艾萨克·阿西莫夫 (俄裔美国科幻作家)就写下了他那著名的 机器人三大法则。 |
robotics:n.机器人学;
|
A robot may not harm a human being, a robot may not disobey a human being, and a robot may not allow itself to come to harm -- in this order of importance. |
机器人不能伤害人类, 机器人不能违背人类的命令, 机器人不能擅自伤害自己—— 这是按重要性由高到低排序的。 |
disobey:v.不服从;不听话;不顺从;
|
But after 40 years or so and after so many stories pushing these laws to the limit, |
但是大约40年后, 太多事件不断挑战这些法则的底线, |
Asimov introduced the zeroth law which takes precedence above all, and it's that a robot may not harm humanity as a whole. |
阿西莫夫又引入了第零号法则, 凌驾于之前所有法则之上, 说的是机器人不能伤害人类这个整体。 |
zeroth:adj.[数]零的; precedence:n.优先;居先; humanity:n.人类;人道;仁慈;人文学科; as a whole:总的来说;
|
I don't know what this means in the context of driverless cars or any specific situation, and I don't know how we can implement it, but I think that by recognizing |
我不太明白在无人车和 其他特殊背景下 这句话是什么意思, 也不清楚我们要如何实践它, 但我认为通过认识到 |
context:n.环境;上下文;来龙去脉; recognizing:v.认识;认出;承认;接受,赞成(recognize的现在分词)
|
that the regulation of driverless cars is not only a technological problem but also a societal cooperation problem, |
针对无人车的立法不仅仅是个技术问题, 还是一个社会合作问题, |
technological:adj.技术[工程](上)的;因工艺技术高度发展而引起的; cooperation:n.合作;配合;
|
I hope that we can at least begin to ask the right questions. |
我希望我们至少可以 从提出正确的问题入手。 |
Thank you. |
谢谢大家。 |
(Applause) |
(掌声) |