PBS NewsHour Interview: Facebook's Psychological Study Puts User Trust to the Test
Date: 2014-07-02 11:46


JUDY WOODRUFF: Let's turn now to a social media story that's been generating lots of reaction, including anger, over the past few days.

It's in response to a study Facebook conducted with hundreds of thousands of its users. The study in question goes back to 2012, when Facebook manipulated the incoming content of pages belonging to almost 700,000 of its users for a week, without telling them. It was designed to see how people's attitudes were affected when they read either a stream of more positive posts or more negative ones in their so-called news feeds.

The results were published in a respected scientific journal in June. As that information has come to light, many are upset at what Facebook did and how they did it. It's also prompted concerns about the ethics of the research, the journal where it was published and much more.

To fill in the details, we're joined now by Reed Albergotti of The Wall Street Journal.

Welcome to the program.

Reed, first of all, where did the idea for this study come from? What did Facebook hope it was going to accomplish by doing this?

REED ALBERGOTTI, The Wall Street Journal: Well, around the time of this study, there was sort of a meme going around the Internet that when you go on Facebook and you see all these wonderful things that your friends and family are posting about their lives, you start to feel a little bad about your own life.

And there was some research, some academic research at the time that really kind of backed up that theory. And Facebook wanted to find out whether or not that was true. And that's why they embarked on this research project. And they say they have debunked that theory, and they weren't shy about it. They worked with Cornell to publish the study and tell the public what they'd found.

JUDY WOODRUFF: Did they raise any questions internally? Is it known whether anyone questioned the propriety or the ethics of doing this?

REED ALBERGOTTI: No.

And, in fact, Cornell issued a statement saying they looked at it and they decided they were not subject to federal guidelines, laws actually, that require informed consent of human research subjects, because the study was done by Facebook without the involvement of the Cornell researchers at the time.

So Cornell is sort of washing their hands of the ethical implications here.

JUDY WOODRUFF: But what about Facebook? Do we know if there was discussion about whether they should have let people know ahead of time?

REED ALBERGOTTI: Well, Facebook says that it has an internal review process, but it said that, at the time, the process wasn't as rigorous as it is now. And that's one thing we have been pressing Facebook to tell us more about: How did this internal review process evolve, and what procedures are actually in place now?

JUDY WOODRUFF: So, Reed Albergotti, what are the ethical — I mean, first of all, are there any legal considerations to this, that maybe they violated a law by doing this?

REED ALBERGOTTI: Well, I think, right now, it's really more of a question of ethics.

The laws really apply to government institutions — institutions that receive federal funding, like Cornell University, and not really to private companies. In fact, Facebook isn't the only social media company or tech company that's gathering reams of personal data and using it in these scientific experiments.

But Facebook is one that publishes it publicly more than other companies.

JUDY WOODRUFF: So, setting aside any legal question, what about the ethics of it? What are you, and what are others, saying about what ethical lines might have been crossed here?

REED ALBERGOTTI: Well, look, I have talked to a lot of academic researchers here about this study, and I think really there's a consensus sort of being formed that there needs to be a strong, hard look at the ethics of this.

It's a growing trend really in the scientific community, private companies, corporations using their data in conjunction with research institutions for scientific studies. And, right now, it's really an ethical gray area.

And I think researchers would like to see something like another level of informed consent that Facebook would put in front of its users when they enter them into these types of studies. But, right now, it's so early, I think we will have to look at how this backlash shakes out to see if that actually happens.

JUDY WOODRUFF: And just to clarify, give us an example of how the news — so-called news feed was manipulated. As we said earlier, they were in some cases making sure they were seeing more positive information, in other cases more negative. What's an example of how that worked?

REED ALBERGOTTI: Well, there was actually an algorithm, a computer algorithm, that had certain words that were associated with positive or negative news feed posts.

So the algorithm was run totally automatically without any hands-on involvement of the data scientists at Facebook. And that's because they wanted to keep these research subjects totally anonymous. So the algorithm decided which posts were positive and negative, and then automatically removed those from the news feeds of those users for about a week.

And then after that week was up, some of those posts might have been reintroduced to those news feeds and the users might have eventually seen them.
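For readers curious what a hands-off, word-list-based feed filter of this kind might look like mechanically, here is a minimal sketch in Python. It is an illustration under stated assumptions, not Facebook's actual implementation: the word lists, the `classify` and `filter_feed` names, and the omission probability are all invented for the example, and the real study used a far larger standardized lexicon.

```python
import random

# Illustrative word lists; the actual study relied on a much larger
# standardized lexicon. These tiny sets are assumptions for the sketch.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word counts."""
    words = post.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE_WORDS for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress="positive", omit_prob=0.5, seed=None):
    """Return (shown, omitted) feeds, dropping some posts of one category.

    Every decision comes from the classifier and a random draw, with no
    human reading any post, mirroring the fully automated, hands-off
    process described in the interview. Omitted posts are kept separately
    because, as noted above, they could reappear in later feed loads.
    """
    rng = random.Random(seed)
    shown, omitted = [], []
    for post in posts:
        if classify(post) == suppress and rng.random() < omit_prob:
            omitted.append(post)
        else:
            shown.append(post)
    return shown, omitted

if __name__ == "__main__":
    feed = [
        "Had a wonderful day at the beach, so happy!",
        "Traffic was terrible this morning.",
        "Meeting friends for lunch later.",
    ]
    shown, omitted = filter_feed(feed, suppress="positive",
                                 omit_prob=0.9, seed=42)
    print("shown:", shown)
    print("omitted:", omitted)
```

Running the sketch suppresses most positive posts from the sample feed while leaving negative and neutral ones visible; swapping `suppress="negative"` would model the other arm of the experiment.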

JUDY WOODRUFF: I just want to read a couple of comments. On our Web site, we asked some of our visitors what they thought about this. We got comments that were positive, or at least not so critical, and others that were not.

I'm just going to read two quickly, one from someone named Carrie. She said: “So, read the TOS, terms of service, and don't sign if you don't agree. That's the point. People don't read terms of service, and then they get upset when Facebook does something that the terms allow.”

And then from another visitor, Scott. He wrote: “The problem is that the terms of service is deliberately so vague that they can basically claim that they do whatever they want at any time. Would you buy a TV from Sony if the manual said that they could for any reason decide what programs you could watch on their TV?”

How typical would you say those reactions are?

REED ALBERGOTTI: Oh, I think they're very typical.

We saw similar reactions on our own website in the comments section. And I think what academic researchers are saying is, yes, Facebook has these terms of service that really indemnify them against any legal repercussions, although that may be debated in the future. But for this academic research to be ethical, according to accepted guidelines, there needs to be another level of terms of service.

Users need to be asked again, when they're being entered into a study, whether they really want to participate, and they need to be told about the risks. In this case, the risk could have been that, if someone was predisposed to depression, the study might have triggered some sort of emotional instability. So there are big questions that we need to answer here.

JUDY WOODRUFF: Certainly are. I think a lot of people didn't even realize how much there's just a regular adjustment of what people see on their Facebook pages. But that's a subject for a future conversation.

Reed Albergotti, we thank you.

REED ALBERGOTTI: Thanks for having me.

Key Vocabulary
  • reaction n. response; reactive force; chemical reaction
  • instability n. unsteadiness, instability
  • pressing adj. urgent, pressing (present participle of press)
  • statement n. declaration, statement
  • social adj. of society, social n. social gathering
  • evolve v. to develop, evolve, unfold
  • deliberately adv. carefully; intentionally
  • automatically adv. automatically, mechanically
  • claim n. demand, right to demand; assertion, declaration; thing claimed vt. to demand, claim
  • trend n. tendency, inclination, direction vi. to tend, turn