一个"洁净"网络的代价
日期:2020-02-12 17:30

On March 23, 2013, users worldwide discovered in their news feed a video of a young girl being raped by an older man.
2013年3月23日,世界各地的用户在他们的新闻推送里发现了一个年轻女孩被年长男性强奸的视频。
Before this video was removed from Facebook, it was already shared 16,000 times, and it was even liked 4,000 times.
在这个视频从Facebook上被移除前,它已经被转发了1.6万次,甚至被点赞了4千次。
This video went viral and infected the net.
这个视频被疯转,像病毒一样侵染了网络。
And that was the moment we asked ourselves how could something like this get on Facebook?
也正是在这一刻,我们问自己,这种东西是怎么得以出现在Facebook上的?
And at the same time, why don't we see such content more often?
同时,为什么我们没有更加频繁地看见这种内容?
After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google?
毕竟网络上有很多令人反胃的资料信息,但为什么我们很少在Facebook、推特或谷歌上看到这样的垃圾?
While image-recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos,
虽说图像识别软件可以在图片和视频中分辨性器官、血或者裸体,
it has immense difficulties to distinguish pornographic content from holiday pictures, Adonis statues or breast-cancer screening campaigns.
它很难从度假照片、阿多尼斯雕像或乳腺癌检查的宣传活动中,区分出色情内容。
It can't distinguish Romeo and Juliet dying onstage from a real knife attack.
它无法区分舞台上罗密欧与朱丽叶的死亡和现实中的持刀袭击。
It can't distinguish satire from propaganda or irony from hatred, and so on and so forth.
它无法区分讽喻和煽动,反语和仇恨,如此种种。
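To make that gap concrete, here is a minimal, purely illustrative sketch in Python -- the classifier, thresholds and queue are invented for this example and are not how Facebook, YouTube or any real platform implements moderation -- of the triage logic the film describes: an automated model can only assign a score, act on the near-certain cases, and push everything ambiguous into a queue for a person to decide.

```python
# Illustrative sketch only: a hypothetical classifier plus a human-review
# queue. The model, thresholds and names are invented for this example.
from dataclasses import dataclass
from queue import Queue


@dataclass
class ContentItem:
    item_id: str
    media_url: str


def violation_score(item: ContentItem) -> float:
    """Stand-in for an image-recognition model: returns a probability
    that the item violates policy. A real model can detect skin, blood
    or weapons, but it has no notion of context (art, news, satire)."""
    return 0.5  # placeholder value; a trained model would score the media


human_review_queue: Queue = Queue()


def triage(item: ContentItem,
           delete_at: float = 0.98,
           ignore_at: float = 0.02) -> str:
    """Act automatically only on near-certain cases; everything in the
    gray zone is escalated to a human moderator."""
    score = violation_score(item)
    if score >= delete_at:
        return "delete"              # machine is confident: remove it
    if score <= ignore_at:
        return "ignore"              # machine is confident: keep it
    human_review_queue.put(item)     # ambiguous: a person must decide
    return "escalate_to_human"
```

However the thresholds are tuned, the contextual cases listed above -- the statue, the stage death, the satire -- land in the gray zone between them.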
Therefore, humans are needed to decide which of the suspicious content should be deleted, and which should remain.
因此,需要人类来判断可疑内容中哪些应被删除,哪些可以保留。
Humans whom we know almost nothing about, because they work in secret.
我们对这些人几乎一无所知,因为他们进行的是秘密工作。
They sign nondisclosure agreements,
他们签了保密协议,
which prohibit them from talking and sharing what they see on their screens and what this work does to them.
禁止他们谈论与分享自己在屏幕上看到了什么,以及这份工作对他们造成的影响。
They are forced to use code words in order to hide who they work for.
他们被迫使用暗号以隐藏他们的雇主。
They are monitored by private security firms in order to ensure that they don't talk to journalists.
他们被私人安保公司监控,以确保他们不会同记者交谈。
And they are threatened by fines in case they speak.
而要是他们发声,便会被威胁处以罚款。
All of this sounds like a weird crime story, but it's true.
这些听起来像是某个离奇的犯罪故事,但这是真实的。
These people exist, and they are called content moderators.
这些人是存在的,他们被称为“网络审查员”。
We are the directors of the feature documentary film "The Cleaners,"
我们是专题纪录片《网络清道夫》的导演,
and we would like to take you to a world that many of you may not know yet. Here's a short clip of our film.
请让我们将你们带往一个你们大多数人可能还未曾知晓的世界。这是我们电影的一个片段。
I need to be anonymous, because we have a contract signed.
我必须匿名,因为我们签了合同。
We are not allowed to declare whom we are working with.
我们不被允许透露我们在和谁工作。
The reason why I speak to you is because the world should know that we are here.
我之所以和你对话的原因,是因为世界应当知道我们在这里。
There is somebody who is checking the social media.
有人在检查社交媒体。
We are doing our best to make this platform safe for all of them.
我们在尽自己所能为他们所有人维持一个安全的平台。
Delete. Ignore. Delete. Ignore. Delete. Ignore. Ignore. Delete.
删除。忽略。删除。忽略。删除。忽略。忽略。删除。
The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves,
这个被称作“网络审查员”的群体并不是直接从Facebook、推特,或谷歌拿工资,
but from outsourcing firms around the world in order to keep the wages low.
而是受雇于世界各地的外包公司,以压低工资成本。
Tens of thousands of young people looking at everything we are not supposed to see.
成千上万的年轻人看着我们不应当看到的一切。
And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse.
我们指的是斩首、残割、处决、尸奸、酷刑、儿童虐待。
Thousands of images in one shift -- ignore, delete, day and night.
一次轮值要处理几千张图像--忽略,删除,不论昼夜。
And much of this work is done in Manila,
这项工作大部分是在马尼拉进行的,
where the analog toxic waste from the Western world was transported for years by container ships,
多年来,西方世界的实体有毒垃圾都通过集装箱船被运往这里,
now the digital waste is dumped there via fiber-optic cable.
如今数字垃圾正通过光纤电缆倾倒在同一个地方。
And just as the so-called scavengers rummage through gigantic tips on the edge of the city,
而正如同所谓的拾荒者在城市边缘的巨大垃圾山里翻捡一样,
the content moderators click their way through an endless toxic ocean of images and videos and all manner of intellectual garbage,
网络审查员点击着鼠标,趟过一片由图像、视频和各种精神垃圾构成的无边无际的有毒汪洋,
so that we don't have to look at it.
这样我们就无需亲眼面对这些内容。
But unlike the wounds of the scavengers, those of the content moderators remain invisible.
但和拾荒者们身上的伤口不同,网络审查员的伤口是看不见的。
Full of shocking and disturbing content, these pictures and videos burrow into their memories where,
这些图片和视频充斥着令人震惊与不安的内容,烙印在他们的记忆里,
at any time, they can have unpredictable effects:
随时可能造成难以预计的影响:
eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide.
饮食失调、性欲丧失、焦虑症、酗酒、抑郁症,甚至可能造成自杀。
The pictures and videos infect them, and often never let them go again.
那些图片和视频感染了他们,往往再也不会放过他们。
If they are unlucky, they develop post-traumatic stress disorders, like soldiers after war missions.
如果不幸的话,他们会像从战场归来的士兵一样,患上创伤后应激障碍。
In our film, we tell the story of a young man who had to monitor livestreams of self-mutilations and suicide attempts,
在影片里,我们讲述了一个年轻人的故事:他的工作是监控自残以及自杀企图的直播,
again and again, and who eventually committed suicide himself.
周而复始,然而最终,他也以自杀的方式结束了自己的生命。
It's not an isolated case, as we've been told.
我们被告知的是,这样的事并非个例。
This is the price all of us pay for our so-called clean and safe and "healthy" environments on social media.
这是我们所有人,为了我们所谓的干净、安全且“健康”的社交媒体环境,付出的代价。
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds.
在人类历史中,从未有哪个时代能像现在这样轻易地在数秒之内便触及全球各地的数百万人。
What is posted on social media spreads so quickly, becomes viral and excites the minds of people all around the globe.
在社交媒体上发布的内容传递得如此之快,迅速爆红疯转,刺激全球所有人的神经。
Before it is deleted, it is often already too late.
在它被删除之前,往往已为时晚矣。
Millions of people have already been infected with hatred and anger,
数百万人已经被憎恨和愤怒感染,
and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms.
他们抑或在网上变得活跃,继续传播或放大憎恨,抑或走上街头,诉诸暴力。
Therefore, an army of content moderators sit in front of a screen to avoid new collateral damage.
因此,一支由网络审查员形成的军队,守在屏幕前,防止新的附带损害产生。
And they are deciding, as soon as possible, whether the content stays on the platform -- ignore; or disappears -- delete.
他们必须尽快做出决断,是否保留某条内容--忽略;还是让它消失--删除。
But not every decision is as clear as the decision about a child-abuse video.
但并不是每个决定都能像对儿童虐待的视频那样迅速做出清晰明了的判断。
What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists?
对于由民权活动人士、公民记者上传的有争议的、模棱两可的内容,该怎么处理呢?
The content moderators often decide on such cases at the same speed as the clear-cut cases.
网络审查员在判断这些案例时,通常和处理泾渭分明的案例时使用同样的速度。
We will show you a video now, and we would like to ask you to decide: Would you delete it, or would you not delete it?
下面我们会让大家观看一段视频,我们想让你们决定:你们会删除它吗?还是不删除它呢?
Yeah, we did some blurring for you.
没错,我们已为大家对视频进行了打码处理。
A child would potentially be dangerously disturbed and extremely frightened by such content.
一个孩子要是看到这样的内容,可能会感到严重不安以及极度恐惧。
So, you'd rather delete it? But what if this video could help investigate the war crimes in Syria?
那么,你们觉得删了更好?但要是说,这段视频能帮助调查在叙利亚发生的战争罪行呢?
What if nobody had heard about this air strike, because Facebook, YouTube and Twitter had decided to take it down?
要是因为Facebook、YouTube、推特都决定撤除这段视频,导致无人得知这场空袭呢?

Airwars, a nongovernmental organization based in London,
Airwars是一个位于伦敦的非政府组织,
tries to find those videos as quickly as possible whenever they are uploaded to social media, in order to archive them.
他们试图在这些视频上传到社交网络时,尽快找到这些视频,以便对它们进行归档记录。
Because they know, sooner or later, Facebook, YouTube, Twitter would take such content down.
因为他们知道,这些内容迟早会被Facebook、YouTube、推特删除。
People armed with their mobile phones can make visible what journalists often do not have access to.
拥有手机的人们能曝光记者们通常难以接触的事情。
Civil rights groups often do not have any better option
人权组织常常没有更好的选择,
to quickly make their recordings accessible to a large audience than by uploading them to social media.
为了让他们的录像能迅速向广大观众公开,除了上传到社交媒体之外。
Wasn't this the empowering potential the World Wide Web should have?
这难道不是万维网应当拥有的能够赋予力量的潜力吗?
Weren't these the dreams people in its early stages had about the World Wide Web?
这难道不是万维网初具雏形时,人们对它抱有的梦想吗?
Can't pictures and videos like these persuade people who have become insensitive to facts to rethink?
这样的图片和视频难道无法劝说已对事实变得麻木的人们开始反思吗?
But instead, everything that might be disturbing is deleted.
然而,一切可能造成不安的内容都被删除了。
And there's a general shift in society.
在社会中还有这样的一种变化趋势。
Media, for example, more and more often use trigger warnings at the top of articles which some people may perceive as offensive or troubling.
比如说,媒体更加频繁地在有人可能感到冒犯或者不安的文章顶部使用“敏感警告”。
Or more and more students at universities in the United States
美国的大学校园内有越来越多的学生
demand the banishment of antique classics which depict sexual violence or assault from the curriculum.
要求从课程中剔除描写性暴力或性侵犯的古典内容。
But how far should we go with that? Physical integrity is guaranteed as a human right in constitutions worldwide.
但这些行为的尺度该如何把握?在世界各地的宪法中,身体健全是被保障的基本人权。
In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity.
欧盟的《基本权利宪章》明文规定,这项权利同样适用于心理健全。
But even if the potentially traumatic effect of images and videos is hard to predict,
但即使图像和视频带来的潜在创伤难以预测,
do we want to become so cautious that we risk losing social awareness of injustice? So what to do?
我们是否想变得如此谨小慎微,以至于要冒险失去对不公的社会意识?那么该怎么做呢?
Mark Zuckerberg recently stated that in the future, the users, we, or almost everybody,
马克·扎克伯格最近声明,在未来,用户们,即我们,或者几乎是任何人,
will decide individually what they would like to see on the platform, by personal filter settings.
将会通过个人过滤设定,个人独立决定在平台上想看到的内容。
So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like ...
也就是说任何人能轻松地声称看到战争和暴力冲突的图像时能不为所动,比如说...
I'm the type of guy who doesn't mind seeing breasts and I'm very interested in global warming, but I don't like war so much.
我是那种不介意看到胸部的男人,我对全球变暖很感兴趣,但不怎么喜欢战争。
Yeah, I'm more the opposite, I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes.
嗯,我就比较相反,我对胸部或者裸体压根没有一点兴趣。但何不谈谈枪支?没错,我喜欢枪。
Come on, if we don't share a similar social consciousness, how shall we discuss social problems?
看嘛,如果我们没有共享相似的社会意识,我们该如何讨论社会问题?
How shall we call people to action? Even more isolated bubbles would emerge.
我们该如何呼吁人们行动?更多互相孤立的泡泡会浮现。
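As a purely hypothetical illustration of the "personal filter settings" idea paraphrased above -- the users, topics and posts are invented, not any platform's real feature or API -- a short Python sketch of what per-user filters do to a shared feed:

```python
# Hypothetical per-user content filters; data and categories are invented.
FEED = [
    {"id": 1, "topic": "war"},
    {"id": 2, "topic": "nudity"},
    {"id": 3, "topic": "climate"},
    {"id": 4, "topic": "guns"},
]

# Each user hides whatever they personally prefer not to see.
USER_FILTERS = {
    "user_a": {"war", "guns"},   # "I don't like war so much"
    "user_b": {"nudity"},        # "zero interest in naked bodies"
}


def personal_feed(user: str) -> list:
    """Return the shared feed minus the topics this user filters out."""
    hidden = USER_FILTERS[user]
    return [post for post in FEED if post["topic"] not in hidden]


if __name__ == "__main__":
    feed_a = personal_feed("user_a")          # posts 2 and 3
    feed_b = personal_feed("user_b")          # posts 1, 3 and 4
    shared = {p["id"] for p in feed_a} & {p["id"] for p in feed_b}
    print("shared basis for discussion:", sorted(shared))  # only post 3
```

The more finely such filters are set, the smaller the overlap between any two users' feeds becomes -- which is the fragmentation the next question is about.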
One of the central questions is: "How, in the future, will freedom of expression be weighed against the people's need for protection?"
核心问题之一是:“在未来,我们该如何平衡言论自由与人们对保护的需求。”
It's a matter of principle.
这是个原则性的问题。
Do we want to design either an open or a closed society for the digital space?
我们想为数字空间设计的,是一个开放的社会,还是一个封闭的社会?
At the heart of the matter is "freedom versus security." Facebook has always wanted to be a "healthy" platform.
问题的核心是“自由vs.安全感”。Facebook一直想成为一个“健康”的平台。
Above all, users should feel safe and secure.
重中之重的是,用户应当感到安全。
It's the same choice of words the content moderators in the Philippines used in a lot of our interviews.
在我们的很多采访中,菲律宾的网络审查员们也使用了同样的遣词。
The world that we are living in right now, I believe, is not really healthy.
我相信,我们正生活的世界,并不是真的健康。
In this world, there is really an evil who exists.
在这世界上,确实存在着邪恶。
We need to watch for it.
我们需要警惕它。
We need to control it -- good or bad.
我们需要控制它--无论好坏。
For the young content moderators in the strictly Catholic Philippines, this is linked to a Christian mission.
这些来自信奉天主教的菲律宾的年轻网络审查员们,对于他们来说,这份工作和基督教的使命有所联系。
To counter the sins of the world which spread across the web.
为了对抗在网络上传播的这个世界的罪恶。
"Cleanliness is next to godliness," is a saying everybody in the Philippines knows.
“清洁近于圣洁”,这个说法在菲律宾人尽皆知。
And others motivate themselves by comparing themselves with their president, Rodrigo Duterte.
其他人则将自己与他们的总统罗德里戈·杜特尔特相比较,以此激励自身。
He has been ruling the Philippines since 2016, and he won the election with the promise: "I will clean up."
他自2016年当选以来一直掌权菲律宾,他凭借“我会进行清扫”的承诺在当年的选举中胜出。
And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means.
而这个承诺的意思是通过杀掉街上被视为罪犯的人,不管这是什么意思,从而达到排除社会上各种问题的目的。
And since he was elected, an estimated 20,000 people have been killed.
自从他当选以后,估计有2万人被杀。
And one moderator in our film says, "What Duterte does on the streets, I do for the internet."
我们影片中的一位审查员说:“杜特尔特在街头上怎么做,我在网络上也怎么做。”
And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world.
这就是他们,我们的“自我标榜的超级英雄”,在数字世界里维持法制与秩序。
They clean up, they polish everything clean, they free us from everything evil.
他们进行扫除,把一切擦拭得干干净净,他们将我们从一切邪恶中解放出来。
Tasks formerly reserved to state authorities have been taken over by college graduates in their early 20s,
曾经为国家机关保留的任务如今落到了二十岁出头的大学毕业生肩上,
equipped with three- to five-day training -- this is the qualification -- who work on nothing less than the world's rescue.
他们接受完三天到五天的训练,这便是他们的资格证,他们的工作不亚于拯救世界。
National sovereignties have been outsourced to private companies, and they pass on their responsibilities to third parties.
国家权能被外包给私人公司,他们又将自己的责任托付给第三方。
It's an outsourcing of the outsourcing of the outsourcing, which takes place.
正在发生的,是外包的外包,再外包。
With social networks, we are dealing with a completely new infrastructure,
对于社交网络,我们要处理的是一个全新的架构,
with its own mechanisms, its own logic of action and therefore, also, its own new dangers,
它有着自己的运行机制,自己的行为逻辑,因而也有其特定的潜在新危险,
which had not yet existed in the predigitalized public sphere.
这些危险在电子化时代以前的公共领域中不曾存在过。
When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of critics.
当马克·扎克伯格在美国国会或者欧洲议会时,他面对的是各式各样的批评。
And his reaction was always the same: "We will fix that, and I will follow up on that with my team."
而他的反应总是千篇一律的:“这一点我们会改进,那一点我们团队会跟进。”
But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google --
可是这样的辩论不应该在Facebook、推特或谷歌的幕后进行,
such a debate should be openly discussed in new, cosmopolitan parliaments,
这样的辩论应当被公开探讨,在崭新的、国际化的议会中,
in new institutions that reflect the diversity of people contributing to a utopian project of a global network.
在新的机构中,它们应当能反映“为全球化网络理想工程做出贡献的人们的多元化”。
And while it may seem impossible to consider the values of users worldwide,
考虑到全球用户的价值观虽说看上去不可能,
it's worth believing that there's more that connects us than separates us.
但值得相信的是,我们之间的联系将比隔阂更强大。
Yeah, at a time when populism is gaining strength,
没错,在这个民粹主义抬头的时点,
it becomes popular to justify the symptoms, to eradicate them, to make them invisible.
为症状辩解、将它们消除、将它们隐形,这样的做法变得普及。
This ideology is spreading worldwide, analog as well as digital, and it's our duty to stop it before it's too late.
这种观念正在全世界扩散,无论在现实里还是在网络上,而我们的义务是在为时已晚前阻止它。
The question of freedom and democracy must not only have these two options.
自由和民主的问题并不能只有这两个选项。
Delete. Or ignore. Thank you very much.
删除。或者忽略。谢谢大家。
