演讲MP3+双语文稿:为了“干净”的社交媒体环境付出的代价

    听力课堂TED音频栏目主要包括TED演讲的音频MP3及中英双语文稿,供各位英语爱好者学习使用。本文主要内容为演讲MP3+双语文稿:为了“干净”的社交媒体环境付出的代价,希望你会喜欢!

【演讲人及介绍】Hans Block & Moritz Riesewieck

    Hans Block:电影制片人,戏剧导演,音乐家;

    Moritz Riesewieck:作者,编剧,戏剧和电影导演

    汉斯·布洛克(Hans Block)和莫里兹·里斯维克(Moritz Riesewieck)以“Laokoon”为名创作电影、戏剧作品、散文、演讲表演和广播剧,探讨数字时代中人类与社会的观念如何改变、又可能如何被改变。

    【演讲主题】为了“干净”的社交媒体环境而付出的代价

    【演讲文稿-中英文】

    翻译者 Wanting Zhong 校对 Jiasi Hao

    00:12

    [This talk contains mature content]

    【本演讲包含成人内容】

    00:16

    Moritz Riesewieck: On March 23, 2013, users worldwide discovered in their news feed a video of a young girl being raped by an older man. Before this video was removed from Facebook, it was already shared 16,000 times, and it was even liked 4,000 times. This video went viral and infected the net.

    莫里兹·里斯维克: 2013 年 3 月 23 日,世界各地的用户在他们的新闻推送里发现了一个年轻女孩被年长男性强奸的视频。在这个视频从Facebook上被移除前,它已经被转发了 1.6 万次,甚至被点赞了 4 千次。这个视频被疯转,像病毒一样侵染了网络。

    00:49

    Hans Block: And that was the moment we asked ourselves how could something like this get on Facebook? And at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?

    汉斯·布洛克:也正是在这一刻,我们问自己,这种东西是怎么得以出现在 Facebook 上的?同时,为什么我们没有更加频繁地看见这种内容?毕竟网络上有很多令人反胃的资料信息,但为什么我们很少在 Facebook 、推特,或谷歌上看到这样的垃圾?

    01:08

    MR: While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulties to distinguish pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns. It can't distinguish Romeo and Juliet dying onstage from a real knife attack. It can't distinguish satire from propaganda or irony from hatred, and so on and so forth. Therefore, humans are needed to decide which of the suspicious content should be deleted, and which should remain.

    莫:虽说图像识别软件可以在图片和视频中分辨性器官、血或者裸体,它很难从度假照片、阿多尼斯雕像,或乳腺癌检查的宣传活动中,区分出色情内容。它无法区分舞台上罗密欧与朱丽叶的死亡,和现实中的持刀袭击。它无法区分讽喻和煽动,反语和仇恨,如此种种。因此,需要人类来判断可疑内容中哪些应被删除,哪些可以保留。

    02:00

    Humans whom we know almost nothing about, because they work in secret. They sign nondisclosure agreements, which prohibit them from talking and sharing what they see on their screens and what this work does to them. They are forced to use code words in order to hide who they work for. They are monitored by private security firms in order to ensure that they don't talk to journalists. And they are threatened by fines in case they speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators.

    我们对这些人几乎一无所知,因为他们进行的是秘密工作。他们签了保密协议,禁止他们谈论与分享自己在屏幕上看到了什么,以及这份工作对他们造成的影响。他们被迫使用暗号以隐藏他们的雇主。他们被私人安保公司监控,以确保他们不会同记者交谈。而要是他们发声,便会被威胁处以罚款。这些听起来像是某个离奇的犯罪故事,但这是真实的。这些人是存在的,他们被称为“网络审查员”。

    02:42

    HB: We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet.

    汉:我们是专题纪录片《The Cleaners》(《网络清道夫》)的导演。请让我们将你们带往一个你们大多数人可能还未曾知晓的世界。

    02:48

    HB: The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world in order to keep the wages low. Tens of thousands of young people looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift -- ignore, delete, day and night. And much of this work is done in Manila, where the analog toxic waste from the Western world was transported for years by container ships, now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content moderators click their way through an endless toxic ocean of images and videos and all manner of intellectual garbage, so that we don't have to look at it.

    汉:这个被称作“网络审查员”的群体并不是直接从 Facebook、推特或谷歌拿工资,而是受雇于世界各地的外包公司,以压低工资成本。成千上万的年轻人看着我们不应当看到的一切。我们指的是斩首、残割、处决、尸奸、酷刑、儿童虐待。一次轮值要处理几千张图像——忽略,删除——不论昼夜。这项工作大部分是在马尼拉进行的:多年来,西方世界的实体有毒垃圾用集装箱船运往这里,如今数字垃圾则通过光纤电缆倾倒在同一个地方。而正如同所谓的拾荒者在城市边缘的巨大垃圾山里翻捡一样,网络审查员点击着鼠标,趟过一片由图像、视频和各种精神垃圾构成的无边无际的有毒汪洋,这样我们就无需亲自面对这些内容。

    03:49

    MR: But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them, and often never let them go again. If they are unlucky, they develop post-traumatic stress disorders, like soldiers after war missions.

    莫:但和拾荒者们身上的伤口不同,网络审查员的伤口是看不见的。这些图片和视频充斥着令人震惊与不安的内容,烙印在他们的记忆里,随时可能造成难以预计的影响:饮食失调、性欲丧失、焦虑症、酗酒、抑郁症,甚至可能造成自杀。那些图片和视频感染了他们,往往再也不会放过他们。如果不幸的话,他们会像从战场归来的士兵一样,患上创伤后应激障碍(PTSD)。

    04:29

    In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told. This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media. Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads so quickly, becomes viral and excites the minds of people all around the globe. Before it is deleted, it is often already too late. Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.

    在影片里,我们讲述了一个年轻人的故事:他的工作是监控自残以及自杀企图的直播,周而复始,然而最终,他也以自杀的方式结束了自己的生命。我们被告知的是,这样的事并非个例。这是我们所有人,为了我们所谓的干净、安全、且“健康”的社交媒体环境,付出的代价。在人类历史中,从未有哪个时代能像现在这样轻易地在数秒之内便触及全球各地的数百万人。在社交媒体上发布的内容传递得如此之快,迅速爆红疯转,刺激全球所有人的神经。在它被删除之前,往往已为时晚矣。数百万人已经被憎恨和愤怒感染,他们抑或在网上变得活跃,继续传播或放大憎恨,抑或走上街头,诉诸暴力。

    05:35

    HB: Therefore, an army of content moderators sit in front of a screen to avoid new collateral damage. And they are deciding, as soon as possible, whether the content stays on the platform -- ignore; or disappears -- delete. But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as the [clear] cases.

    汉:因此,一支由网络审查员组成的军队守在屏幕前,防止新的附带损害产生。他们必须尽快做出决断:是保留某条内容——忽略;还是让它消失——删除。但并不是每个决定都能像对儿童虐待视频的判断那样清晰明了。对于由民权活动人士、公民记者上传的有争议的、模棱两可的内容,该怎么处理呢?网络审查员判断这些案例的速度,往往和处理那些泾渭分明的案例时一样快。

    06:11

    People armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often do not have any better option to quickly make their recordings accessible to a large audience than by uploading them to social media. Wasn't this the empowering potential the World Wide Web should have? Weren't these the dreams people in its early stages had about the World Wide Web? Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?

    拥有手机的人们能曝光记者们通常难以接触到的事情。民权团体为了让他们的录像能迅速向广大观众公开,除了上传到社交媒体,常常没有更好的选择。这难道不是万维网本应拥有的赋权潜力吗?这难道不是万维网初具雏形时,人们对它抱有的梦想吗?这样的图片和视频难道无法劝说已对事实变得麻木的人们开始反思吗?

    06:49

    HB: But instead, everything that might be disturbing is deleted. And there's a general shift in society. Media, for example, more and more often use trigger warnings at the top of articles which some people may perceive as offensive or troubling. Or more and more students at universities in the United States demand the banishment of antique classics which depict sexual violence or assault from the curriculum. But how far should we go with that? Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity. But even if the potentially traumatic effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice? So what to do? Mark Zuckerberg recently stated that in the future, the users, we, or almost everybody, will decide individually what they would like to see on the platform, by personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...

    汉:然而,一切可能造成不安的内容都被删除了。社会上也出现了一种普遍的转变。比如说,媒体越来越频繁地在可能令人感到冒犯或不安的文章顶部加上“敏感内容警告”。美国的大学里有越来越多的学生要求把描写性暴力或性侵犯的古典名著从课程中剔除。但这些做法的尺度该如何把握?在世界各地的宪法中,人身的完整性是受到保障的一项人权。欧盟的《基本权利宪章》明文规定,这项权利同样适用于心理的完整性。但即使图像和视频带来的潜在创伤难以预测,我们是否想变得如此谨小慎微,以至于要冒险失去对不公的社会意识?那么该怎么做呢?马克·扎克伯格最近声明,在未来,用户们,也就是我们,或者几乎所有人,将会通过个人过滤设置,各自决定想在平台上看到什么内容。也就是说,任何人都能轻松地表示自己不愿被战争或其他暴力冲突的图像打扰,比如说——

    08:05

    MR: I'm the type of guy who doesn't mind seeing breasts and I'm very interested in global warming, but I don't like war so much.

    莫:我是那种不介意看到胸部的男人,我对全球变暖很感兴趣,但不怎么喜欢战争。

    08:16

    HB: Yeah, I'm more the opposite, I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.

    汉:嗯,我就比较相反,我对胸部或者裸体压根没有一点兴趣。但何不谈谈枪支?没错,我喜欢枪。

    08:26

    MR: Come on, if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge. One of the central questions is: "How, in the future, freedom of expression will be weighed against the people's need for protection." It's a matter of principle. Do we want to design an either open or closed society for the digital space? At the heart of the matter is "freedom versus security." Facebook has always wanted to be a "healthy" platform. Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.

    莫:拜托,如果我们不具备相似的社会意识,我们该如何讨论社会问题?该如何呼吁人们行动?只会有更多彼此隔绝的信息泡泡出现。核心问题之一是:“在未来,我们该如何权衡言论自由与人们对保护的需求。”这是个原则性的问题。我们是想为数字空间设计一个开放的社会,还是一个封闭的社会?问题的核心是“自由 vs. 安全”。Facebook 一直想成为一个“健康”的平台。重中之重的是,用户应当感到安全、有保障。在我们的很多采访中,菲律宾的网络审查员们也使用了同样的遣词。

    09:19

    MR: For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission. To counter the sins of the world which spread across the web. "Cleanliness is next to godliness," is a saying everybody in the Philippines knows.

    莫:对于这些身处严格信奉天主教的菲律宾的年轻网络审查员来说,这份工作和一项基督教的使命联系在一起:对抗在网络上蔓延的世间罪恶。“清洁近于圣洁”,这个说法在菲律宾人尽皆知。

    09:40

    HB: And others motivate themselves by comparing themselves with their president, Rodrigo Duterte. He has been ruling the Philippines since 2016, and he won the election with the promise: "I will clean up." And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed. And one moderator in our film says, "What Duterte does on the streets, I do for the internet." And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil. Tasks formerly reserved to state authorities have been taken over by college graduates in their early 20s, equipped with three- to five-day training -- this is the qualification -- who work on nothing less than the world's rescue.

    汉:其他人则将自己与他们的总统罗德里戈·杜特尔特相比较,以此激励自身。他自 2016 年起统治菲律宾,凭借“我会进行清扫”的承诺在当年的选举中胜出。而这个承诺的意思是,通过直接杀掉街上被视为罪犯的人(不管这算什么标准),来消除社会上的各种问题。自从他当选以后,估计有 2 万人被杀。我们影片中的一位审查员说:“杜特尔特在街头上怎么做,我在网络上也怎么做。”这就是他们,我们“自我标榜的超级英雄”,在数字世界里维持法律与秩序。他们进行清扫,把一切擦拭得干干净净,他们将我们从一切邪恶中解放出来。曾经专属于国家机关的任务,如今落到了二十岁出头的大学毕业生肩上;他们接受过三到五天的培训,这便是全部的资格,而他们从事的工作不亚于拯救世界。

    10:43

    MR: National sovereignties have been outsourced to private companies, and they pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing, which takes place. With social networks, we are dealing with a completely new infrastructure, with its own mechanisms, its own logic of action and therefore, also, its own new dangers, which had not yet existed in the predigitalized public sphere.

    莫:国家主权被外包给了私人公司,它们又将自己的责任转交给第三方。情况就是:外包,再外包,再外包。对于社交网络,我们面对的是一个全新的基础架构,它有着自己的运行机制、自己的行为逻辑,因而也有其自身的新危险。这些危险在数字化时代以前的公共领域中不曾存在过。

    11:13

    HB: When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of critics. And his reaction was always the same: "We will fix that, and I will follow up on that with my team." But such a debate shouldn't be held in back rooms of Facebook, Twitter or Google -- such a debate should be openly discussed in new, cosmopolitan parliaments, in new institutions that reflect the diversity of people contributing to a utopian project of a global network. And while it may seem impossible to consider the values of users worldwide, it's worth believing that there's more that connects us than separates us.

    汉:当马克·扎克伯格在美国国会或者欧洲议会时,他面对的是各式各样的批评。而他的反应总是千篇一律:“我们会解决这个问题,我会和我的团队跟进。”可是这样的辩论不应该在 Facebook、推特或谷歌的幕后进行——这样的辩论应当在崭新的、国际化的议会中公开展开,在能够反映出为全球网络这一乌托邦工程做出贡献的人们之多元性的新机构中进行。虽然顾及全世界用户的价值观看似不可能,但值得相信的是,将我们联系在一起的东西,比分隔我们的东西更多。

    11:58

    MR: Yeah, at a time when populism is gaining strength, it becomes popular to justify the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late. The question of freedom and democracy must not only have these two options.

    莫:没错,在这个民粹主义抬头的时代,为症状辩解、将它们消除、将它们隐形,这样的做法变得流行。这种观念正在全世界扩散,无论在现实里还是在网络上,而我们的义务是在为时已晚前阻止它。关于自由和民主的问题,绝不能只有这两个选项。

    12:29

    HB: Delete.

    汉:删除。

    12:30

    MR: Or ignore.

    莫:或者忽略。

    12:33

    HB: Thank you very much.

    汉:谢谢大家。

    12:35

    (Applause)

    (掌声)
