PBS Interview: Racial Bias in a Medical Algorithm
Date: 2019-11-28 09:42



Transcript


Hari Sreenivasan: A recent study published in Science magazine found significant racial bias in an algorithm used by hospitals across the nation to determine who needs follow-up care and who does not. Megan Thompson recently spoke with STAT's Shraddha Chakradhar, who explained what the researchers found.


Megan Thompson: Where exactly was this bias coming from?


Shraddha Chakradhar: There are two ways that we can identify how sick a person is. One, is how many dollars are spent on that person. You know, the assumption being the more health care they come in for, the more treatment that they get, the more dollars they spend and presumably the sicker they are if they're getting all that treatment. And the other way is that, you know, we can measure actual biophysical things, you know, from lab tests, what kind of conditions or diseases they might have. So it seems like this algorithm was relying on the cost prediction definition. In other words, the more dollars a patient was projected to spend on the part of an insurance company or a hospital, then that was a sign of how sick they were going to be. And that seems to be where the bias emerged.
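
The distinction drawn here is essentially a choice of training label: predicted spending versus directly measured health. A minimal, hypothetical Python sketch of the two label definitions (the patient record layout and field names are illustrative assumptions, not taken from the study or from the actual algorithm):

```python
# Two ways to answer "how sick is this patient?" for a risk model.
# The patient record layout here is a hypothetical illustration.

def cost_based_label(patient: dict) -> float:
    # Proxy described in the interview: dollars spent on the patient's care,
    # summed over their claims.
    return sum(claim["amount"] for claim in patient["claims"])

def health_based_label(patient: dict) -> int:
    # Direct measure: number of chronic conditions recorded from lab tests
    # and diagnoses.
    return len(patient["chronic_conditions"])
```

Under the cost-based definition, any group that historically has less spent on its care will look healthier than it really is, which is consistent with where the interview says the bias emerged.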


Megan Thompson: I understand that the researchers then sort of re-ran the algorithm using a different type of data. Can you just tell us a little bit more about that? What did they use?


Shraddha Chakradhar: Yeah. So instead of relying on just costs to predict which patients are going to need follow-up care, they actually used biometric data, physical, biophysical, physiological data, and they saw a dramatic difference. You know, in the previous model, the algorithm missed some 48,000 extra chronic conditions that African-American patients had. But when they rejiggered the algorithm to look more at actual biological data, they brought that down to about 7,700. So it was about an 84 percent reduction in bias.
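
The "about an 84 percent" figure follows from the two rounded counts just quoted; a quick arithmetic check:

```python
# Rounded figures quoted above: chronic conditions in African-American patients
# missed under the original cost-based model vs. the reworked model.
missed_before = 48_000
missed_after = 7_700

reduction = (missed_before - missed_after) / missed_before
print(f"{reduction:.0%}")  # 84%
```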


Megan Thompson: Do we know anything about how the use of this biased algorithm actually affected patient care?


Shraddha Chakradhar: We don't actually know that. But as I mentioned, the algorithm is used by hospitals to help them flag patients who might need extra care in the coming year, whether it's, you know, an at-home nurse or making sure that they come in for regularly scheduled doctor's appointments. So we can only presume that if black patients, sicker black patients, weren't being flagged accurately, that they also missed out on this follow-up care.


Megan Thompson: Are there any consequences for the company, Optum, that was behind this algorithm?


Shraddha Chakradhar: Yes. So the day after the study came out, actually, New York regulators, the Department of Financial Services and the Department of Health sent a letter to the company saying they were investigating this algorithm and that the company had to show that the way the algorithm worked wasn't in violation of anti-discrimination laws in New York. So that investigation is pending. One encouraging thing is that when the researchers did the study, they actually reached back to Optum and let them know about the discrepancy in the data. And the company was glad to be told about it. And I'm told that they're working on a fix. And the other encouraging thing is that the researchers have actually now launched an initiative to help other companies who may be behind similar algorithms to help them fix any biases in their programs. So they've launched a program based out of the University of Chicago's Booth School to do this work on a pro bono basis so that they can sort of catch these things in other algorithms that might be used across the country.


Megan Thompson: All right, Shraddha Chakradhar of STAT, thank you so much for being with us.


Shraddha Chakradhar: Thank you for having me.


Key Expressions


1. rely on: to depend on

He believes many 'psychics' are frauds who rely on perception and subtle deception.

2. projected to: expected or forecast to

Africa's mid-1993 population is projected to more than double by 2025.

3. sort of: somewhat; kind of

Is there a lot of effort and money going into this sort of research?

4. spend on: to spend (money) on

They will then have more money to spend on other things.

5. miss out: to miss (an opportunity)

Well, I'm glad you could make it. I didn't want you to miss out.


