npr | Listen Again: Warped Reality (2020)

Hey, it's Manoush here. It has been shocking to watch the Russian invasion of Ukraine happen live on TV and social media. It's also kind of shocking that Russians are seeing a completely different version of the war play out, full of fabricated images, videos, and supposed facts. Disinformation and propaganda have always been part of any geopolitical conflict, but now, of course, it's all over the internet, and any of us can fall victim to it. That's why we want to revisit our episode called Warped Reality this week. It's about how some high-tech deceptions get produced and why some people believe them. It's a show that's perfect to listen to, or listen to again, as once more we are glued to the headlines. I'll be back next week with a brand new episode; meanwhile, thanks so much for being here. Oh, and before we get started, a quick note: this episode makes a couple of references to sexual violence, which might be hard for some listeners to hear.

This is the TED Radio Hour. Each week, groundbreaking TED Talks, “Our job now is to dream big.” delivered at TED conferences, “To bring about the future we want to see.” around the world, “To understand who we are.” From those talks, we bring you speakers and ideas that will surprise you, “You just don't know what you're gonna find.” challenge you, “We truly have to ask ourselves, like, why is that noteworthy?” and even change you. “I literally feel like I'm a different person.” Yes. “Do you feel that way?” Ideas worth spreading. From TED and NPR.

I'm Manoush Zomorodi, and on the show today: technology and deception. The word deception has a particular meaning. Deception is an intentional falsehood, that is, something done to manipulate how people see the world. It's designed to change people's behavior and to mislead. This is Danielle Citron. She's a professor of law at the University of Virginia, where she teaches privacy and free speech and also researches tech and cyber harassment. And one of the best examples of this, she says, is the story of a woman named Rana Ayyub.

It was the 20th of April, 2018, you know.

Rana Ayyub is an investigative journalist in India who has exposed human rights abuses and government corruption. You know, I am somebody who sent one of the most important ministers in the Modi government behind bars in 2010, and that man now happens to be the second most powerful man in India.

And in April of 2018, she received an email from a source inside the Modi government, and the person said, heads up, a video is going around about you.

It was like a two-minute-twenty-second porn video, with my image on all of it.

It was a fake sex video. And, I mean, she's got big brown eyes; it looked like her. That was Rana, no question about it.

When I got that video, like, I was humiliated. I was shamed by the people who wanted to discredit me.

And it went viral. People shared the video all over my social media, on Instagram, in WhatsApp messages; it was all over India.

Within 48 hours, it was reported that it was on, like, half of the phones in India.

Before I knew it, it was on my father's phone, my brother's phone.

Within a day after that, her home address and her cell phone number were all over the internet. There were fake ads on adult-finder sites saying that she was available for sex, and this is where she lives. She was inundated with death and rape threats.

I think I was as good as dead for the next five days after I received the video.

And she pretty much didn't leave her house for like 6 months.

And I kept asking my friends, I said, what have I done to deserve this?

She became like a shell of herself.

And so how was that video made, if it wasn't her? What was it?

It was a deepfake; her face was inserted into a porn clip. So, you know, when I first worked on this, what we knew about it was that you could insert faces into videos and use sophisticated neural networks to do that. They're called generative adversarial networks: you sort of insert a video, and then the networks find mistakes and keep iterating so that it becomes pretty perfected. But even then, two years ago, you could sort of tell, you know, if you stared at it enough; it wasn't as good as Pixar, it wasn't as good as Lucasfilm. And over time, what we've seen is that now we can create, from whole digital cloth, video showing you doing and saying things that you never did or said, and it's really hard to tell with the human eye that it's manufactured. And so Rana was a perfect example of, and the first one I had heard of, a deepfake sex video being used to basically drive someone out of the marketplace of ideas.
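The adversarial loop described above, one network generating and another critiquing, can be sketched in miniature. The following is an illustrative toy in plain Python, not the actual deepfake software (real systems train deep convolutional networks on images); every number and variable name here is invented for the sketch. A one-dimensional "generator" learns to mimic samples from a target distribution by repeatedly trying to fool a simple logistic "discriminator", which is exactly the generate, find mistakes, iterate cycle Citron describes.

```python
import math
import random

random.seed(0)

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

# "Real" data the generator must learn to imitate: samples from N(4, 1).
# Generator: g(z) = a*z + b.  Discriminator: D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0   # generator parameters (starts producing N(0, 1))
w, c = 0.0, 0.0   # discriminator parameters
lr, batch = 0.05, 64

for step in range(3000):
    real = [random.gauss(4, 1) for _ in range(batch)]
    zs = [random.gauss(0, 1) for _ in range(batch)]
    fake = [a * z + b for z in zs]

    # Discriminator ascent step: push D(real) toward 1, D(fake) toward 0.
    dw = (sum((1 - sigmoid(w * x + c)) * x for x in real)
          - sum(sigmoid(w * x + c) * x for x in fake)) / batch
    dc = (sum(1 - sigmoid(w * x + c) for x in real)
          - sum(sigmoid(w * x + c) for x in fake)) / batch
    w += lr * dw
    c += lr * dc

    # Generator ascent step: adjust (a, b) so D mistakes fakes for real.
    da = sum((1 - sigmoid(w * (a * z + b) + c)) * w * z for z in zs) / batch
    db = sum((1 - sigmoid(w * (a * z + b) + c)) * w for z in zs) / batch
    a += lr * da
    b += lr * db

# After training, the generator's output mean should have drifted from 0
# toward the real mean of 4.
fake_mean = sum(a * z + b for z in (random.gauss(0, 1) for _ in range(1000))) / 1000
```

The same tug-of-war, scaled up to millions of parameters and image pixels, is why each generation of deepfakes is harder to distinguish by eye: every flaw the discriminator catches becomes the next thing the generator learns to fix.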

So if I go on one of these platforms right now, what's the likelihood that I will come across a deepfake? Or are we talking about a future that we're careening towards?

I'm not only imagining a terrible future; it's pretty bad here now. Let me explain. A group called Sensity found that a year ago there were 15,000 deepfake videos online, and of those 15,000, 96 percent were deepfake sex videos, and 99 percent of those were of women's faces inserted into porn. Wow. Yes. Fast forward just a year later: 50,000 deepfake videos, again the same lineup, right? Mostly, over 90 percent, deepfake sex videos, and again the same lineup: mostly all of women, whose faces are being inserted into porn without their permission. And it's not just U.S. women, you know; they found that it was women from all over the world. Like, I guess women's images have been altered and airbrushed for so long, and in some sense we're already surrounded by fake images everywhere, but this is clearly taking it to a whole other disturbing level.

Yep, yep. Lies are absolutely nothing new to the human condition, but what makes this phenomenon different is sort of two things coming together. The first is that we have this human frailty where audio and video have this power over us, especially what we see, so that if we see something, we're going to believe it. What's new is that we're in an online environment in which online platforms' business model, their incentive, is to accelerate sharing and ensure that we make things go viral, because then we're liking, clicking, and sharing, and they're making money off of online advertising. And so their business model is aligned with our worst instincts.

Information travels faster and farther than ever, and it does much more than just spark elation or outrage. It changes what we believe. Conspiracy theories, new kinds of audio and video, and algorithms working behind the scenes make knowing what's true or false harder and harder. Our sense of reality is warping. We can see the consequences: a deep distrust in each other and in our fundamental institutions, like democracy. So today on the show: technology, deception, and ideas about what we can do to bring ourselves back to reality. Because, as Danielle Citron says, it takes just a trick of the human eye to upend someone's deeply held beliefs.

Deepfakes appear authentic and realistic, but they're not; they're total falsehoods.

Danielle continues from the TED stage.

Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain. As human beings, we have a visceral reaction to audio and video. We believe they're true, on the notion that, of course, you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not. And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel, and researchers have found that online hoaxes spread 10 times faster than accurate stories. We're also drawn to information that aligns with our viewpoints. Psychologists call that tendency confirmation bias. And social media platforms supercharge that tendency by allowing us to instantly and widely share information that accords with our viewpoints.

Okay, so all that information leads us to believe things, whether they are indeed facts or lies. But what about the people who say that they have the right to produce deepfakes or spread other misinformation because of the First Amendment, free speech?

It's an odd misunderstanding of both First Amendment doctrine and free speech theory, right? Because not all ones and zeros and words are protected speech, as a matter of First Amendment law and as a matter of free speech values. Right? Why do we protect free speech? Because it helps us figure out how to govern ourselves, because the truth sorts itself out in the marketplace of ideas, because it helps us engage in self-expression, because it's a safety valve; there are so many reasons, and we can add more, you know, we've got a few more. But when it comes to defamatory falsehoods, let's take the deepfake video showing someone doing and saying something they never did. As a matter of First Amendment doctrine, we can chill that kind of speech: if you, with actual malice, spread a fake video of a public official doing and saying something they never did, and you know it's false, you can be sued for that, right? And I've been writing about this stuff for a long time; in my book Hate Crimes in Cyberspace, I explore how, you know, these online tools work. This understanding of the internet as if it's still 1996, right, like it's all the town square and we're all town criers, that's foolish. What we're doing online is, we're working, we're hustling for clients, we're spreading ideas, we're finding loved ones, right? We're exploring ideas. Everything that we do offline, we do it online, because phones are wherever we are. And so the idea that everything that happens online is protected free speech is wrong, and it's not good for free speech values. So the deepfake sex video of Rana: guess what, it ended up with her offline and silenced. You know, your nude photo appears in a search of your name, you go offline. You take down, and this is just my experience working with victims, you literally take down all of your presence online.

You're basically canceled for something that you didn't do

That's right. Your private persona becomes your public persona, in an unwilling way that destroys your public persona. And it's so easy for people to say it's free speech, and, you know, I will often get that pushback, often from people who are privileged, so, white men. They'll say to me, like, you know, Danielle, you're a prude; why make such a big deal about nude photos? We should all just put our nudes online. And I just take a beat. I'm calm, right? I don't get mad. I say, I'm so glad you're going to make that choice, but I'm not going to make that choice, right? Because it's going to cost me, and other women, women of color, transgender folks, gay men, trans men, you know, bi folks, queer folks; it's just going to cost them more.

In a minute, Danielle Citron on why deepfakes have the potential to undermine our democracy. On the show today: technology, deception, and our changing sense of reality. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. Stay with us.

This message comes from NPR sponsor TED Health, a podcast where TED speakers answer questions about health you never knew you had, like: Can we edit memories? How do we eat real meat without harming animals? You'll learn about a smart bra for better heart health, three ways to prepare for the next pandemic, and more. Find TED Health wherever you listen.

In stressful times, you want to spend your time checking out not just what's best, but what's best for you. We know you care about what you watch, what you read, and what you listen to. NPR's Pop Culture Happy Hour podcast is with you five days a week to make sure that time is well spent. Listen now to the Pop Culture Happy Hour podcast from NPR.

It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. We were just hearing lawyer and privacy expert Danielle Citron describe a recent internet phenomenon: videos called deepfakes.

Walk with me into the future, where we are in a place where we don't know whether to believe anything we see. What is that like?

So that's what we call the liar's dividend. In a world in which we can't tell the difference between what's fake and what's real, that's a real boon to the mischief makers and the liars, because they get to point to real evidence of their wrongdoing and say it's not true, and they get to walk away from responsibility and accountability for bad things that they've done. And we've seen illustrations of this.

I'm automatically attracted to beautiful ... I just start kissing them. It's like a magnet.

After the Access Hollywood tape came out, President Trump, you know, sort of half said that, I'm sorry.

This was locker room talk. I'm not proud of it. I apologize to my family. I apologize ...

A year later, he shared with a reporter that it wasn't me on the Access Hollywood tape. He was sort of trying out the liar's dividend, maybe, to see if it would work. Now, for the most part, that didn't really get great traction, and it's kind of part of his brand, all of that, so, you know, the liar's dividend, he tried it, it didn't work, and maybe it didn't hurt him, it didn't matter. But in an environment in which we are sort of post-truth, where we're going to believe falsehoods if they accord with what we believe, and we're going to disbelieve truths if they don't accord with what we think, you know, confirmation bias, then we're in this kind of post-truth environment. And I never thought I would say this about our own country, but political discourse feels fragile in a way that makes me feel like we're much more like a Myanmar than we are, you know, a Canada. We don't feel like we're on solid ground in terms of discourse, and so in this environment our democracy feels so fragile.

As we see what happens in terms of the platforms taking responsibility, or laws passed, or whatever sort of systemic change may or may not happen, I mean, how much responsibility do each of us, as individuals who go online a lot, have? Sometimes I say to people, like, you know, you are up against massive corporations; when you like and share and all that stuff, you're being manipulated. But maybe you see it differently; maybe you think that we each have to do a better job as well.

We do. You know, in the here and now, where there aren't laws, right, there are very few state laws around, you know, digital forgeries, we are the guardians at the gate. Platforms are not going to do it for us; we can't expect platforms whose incentive is to share, because that's where their money is. It's on us, each and every one of us. We need to protect ourselves and our democracy; it's ours to lose. So I do think we have a huge role. What I'm asking is so modest: think before you click and share. Ask yourself, is this likely? And if it's really crazy, don't you think that it's fake? You know, that's why it's there, right? It's there because it's negative and novel; it's there to feed our salacious curiosity. Don't do it.

That's Danielle Citron. She's a professor of law at the University of Virginia, where she teaches and writes about privacy and free speech. You can see her full talk at On the show today: ideas about technology, deception, and our changing sense of reality. Deepfakes make up one disturbing side of misinformation; another is conspiracy theories. Sure, they've been around a long time, classics like the earth is flat, or another one that just won't go away.

There's a secret cabal of shadowy world elites who are either explicitly named as Jews, or just kind of fit into tired old molds that sound like they're probably Jews.

In the past few years these conspiracy theories along with a whole set of new ones have moved from the extreme fringes into the American mainstream.

If ever there were questions whether a deep state exists, it is real, folks, and you got your answer this week ... George Soros is behind all of this ... With the global warming, that, a lot of it's a hoax. It's a hoax. I mean, it's a money-making industry ... They will be advancing their new conspiracy theory and their newest hoax ... One recent example, the online conspiracy theory ... It's unclear when it came online ... Its baseless conspiracy theories have been repeatedly debunked. Despite this, the far-right group continues to grow ... a network of conspiracy theories, all leading back to a mysterious anonymous leader named Q, allegedly a high-level government official who has access to top-secret information.

Well, I don't know much about the movement, other than I understand they like me very much, which I appreciate. But I don't know much about the movement.

Yes, so QAnon starts with the assumption that Donald Trump is secretly saving the world, and that, you know, he doesn't want credit for it, he doesn't want to boast about it, but he's actually the only person who is ferreting out this massive deep-state conspiracy that involves hundreds of people engaging in child sex trafficking and satanic cannibalistic rituals. Wait, I'm sorry, what? And, you know, that is just straight-up misinformation slash disinformation. So in a way, that's almost a less ambiguous case, because it's so completely bonkers, honestly. It's almost like, if I were writing the script, I would be like, nah, guys, that's too on the nose, that's too much, it's too bizarre, no one will believe it. But, you know, millions of people do.

This is journalist Andrew Marantz.

I'm a staff writer at The New Yorker magazine, and I wrote a book called Antisocial, subtitled Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.

Even before QAnon entered mainstream conversation, in 2017 Andrew noticed a huge rise in far-right extremism and conspiracy theories online.

Starting in 2014 or 2015, around there, I started seeing this informational crisis on the horizon. I certainly was not the only person, but I was kind of behind the curve at the time, because I wasn't, back then, thinking of this as a particularly political story; I was thinking of it as, you know, a business story, or a tech story. But then, in the summer of 2016, I really started saying, okay, this is going to have a massive impact on the presidential election; in a way, how could it not, right? Racist memes, misogynist propaganda, viral misinformation. So I wanted to know who was making this stuff. I wanted to understand how they were spreading it. Ultimately, I wanted to know what kind of impact it might be having on our society.

Here's Andrew on the TED stage.

So that's how I ended up in the living room of a social media propagandist in Southern California. He was a married white guy in his late thirties. He had a table in front of him with a mug of coffee, a laptop for tweeting, a phone for texting, and an iPad for livestreaming to Periscope and YouTube. And yet, with just those tools, he was able to propel his fringe, noxious talking points into the heart of the American conversation. For example, one of the days I was there, a bomb had just exploded in New York, and the guy accused of planting the bomb had a Muslim-sounding name. Now, to the propagandist in California, this seemed like an opportunity, because one of the things he wanted was for the U.S. to cut off almost all immigration, especially from Muslim-majority countries. So he started livestreaming, getting his followers worked up into a frenzy about how the open borders agenda was going to kill us all, and asking them to tweet about this, and use specific hashtags, trying to get those hashtags trending. And tweet they did, hundreds and hundreds of tweets.

It must be kind of weird for you, like, sitting there and watching how public manipulation works from a couch.

Yeah, what you see when you sort of sit at someone's elbow and watch them do this is, it's like getting good at poker or something. You kind of learn the basic mechanics of the thing, and then you play a lot of rounds, and if you're good enough, you know, you don't win every round, but you win a lot of them. So what that means in terms of social media is, you can get a sense of what the algorithm wants, and the really simplistic way of putting it is that they want whatever has the sharpest emotional impact on the viewer, and specific kinds of emotions, too: the emotions that make people do something, either click or share. But for the most part, he wasn't breaking the rules of Twitter or whatever platform he was using. He was just really good at getting his message out there.

So this guy in California and all the other folks you spent time with, like, what was their mission? Was it just to create chaos, to tear down democracy? Because that guy doesn't seem like he really believes the stuff, but there are people who do.

Yeah, it's a spectrum, right? Sometimes we talk about, you know, people who are doing this for profit, cynically, who don't believe what they're peddling, and then people, on the other hand, who are true believers. And I think that is a true and worthwhile distinction, and I guess all I would add is that there are many shades of gray in between, right? So it's not purely immediate monetary motivation in most cases, and then you'll get some cases where it is just people who have been radicalized, or, you know, redpilled, as they call it, and they just think the world will not be safe until we have a white ethnostate. And, you know, obviously those people are hard to deal with, because they're pretty far gone. I talked a lot with one young woman who grew up in New Jersey, and then after high school she moved to a new place, and suddenly she just felt alienated and cut off and started retreating into her phone. She found some of the spaces on the internet where people would post the most shocking, heinous things, and she found the stuff really off-putting, but also kind of engrossing. She started interacting with people in these online spaces, and they made her feel smart, made her feel validated. She started feeling a sense of community, started wondering if maybe some of these shocking memes might actually contain a kernel of truth. A few months later, she was in a car with some of her new internet friends, headed to Charlottesville, Virginia, to march with torches in the name of the white race. She'd gone, in a few months, from Obama supporter to fully radicalized white supremacist.

Okay, so for those of us who just can't wrap our heads around how someone's ideas about the world can change so rapidly, “Right.” how does that happen, especially with something like QAnon, which is really an entire mindset?

Yes, so it's sort of like any cult, you know: you start with the stuff that sounds less controversial, and then the more people get initiated, the more they're prepared to believe more and more outlandish things. So you kind of start with parts of it that are closer to the truth, like, there was this guy Jeffrey Epstein, who really was doing all kinds of outrageously terrible things, and he really was friends with Bill Clinton and Prince Andrew, and, “Right.” there really was a conspiracy to cover up those crimes that is still ongoing. And then you kind of go from there to, you know, I bet Hillary was involved, and I also bet Tom Hanks was involved, and Oprah's probably involved, and they all have a dungeon somewhere where they're locking up children and trying to harvest their adrenal glands, and ...

Okay, you lost me with the adrenal glands.

Well, I think when you start going down that rabbit hole, part of it is engaging in a kind of collective fan fiction, and part of it is, wait, but what if this is real? And it's flirting with that line, and, you know, I can get that to some extent; I get the thrill of being like, what if there really was an Illuminati? It's just that a certain kind of person, at a certain desperate moment in their lives, or who's just spent too much time on the internet, can't keep those blurry lines straight, and it becomes their entire reality.

I can understand how, in a time of financial problems, insecurity, this idea of being part of a movement, and having meaning in your life, and being, you know, part of, like, the revolution, actually, that's very exciting. Yes, the underground.

Part of the resistance, and having secret knowledge that no one else can see except for your compadres. Yeah, it's just bizarre how far it can go.

And some of these tropes are, you know, centuries old; in some ways they're not new.

Yeah, I did not expect the fact that I am Jewish to matter in any way. I didn't expect to be talking about things like the Protocols of the Elders of Zion and, like, Mein Kampf stuff. I thought, frankly, that they would be a little more original than that, but as it turns out, there are certain tropes that just refuse to go away. And it does a lot of work for people; it helps explain things that are otherwise unexplainable, you know: why does the economy so obviously seem to be rigged against me, why can't I find meaningful work, or why do I have to work but I still don't feel like my life has any purpose, like, you know, on and on with these kinds of sometimes unanswerable questions. And if the answer is, because there are 10 people in a room somewhere saying, I don't want people to have meaning in their lives, then in a way that's kind of a comforting explanation, because it means that there's at least a namable reason, or an identifiable enemy. And often that's Jews, often that's women, often it's just whoever is a visible other, in terms of being a person of color or what have you. But I think one of the notable things for me is not that these tropes still exist in the world, but the fact that they can be revived, in terms of popularity and in terms of salience to the national discourse. That I did find surprising.

I mean, it feels as though this idea of a single person switching on a story that changes people's perception of what reality even is has become so commonplace that we are in the midst of an era of very little trust.

Yes. It is commonplace now, and the companies have had a lot of time to try to figure this out. In some ways they have, you know; it is no longer okay on Facebook to buy an ad in an American election using rubles as the currency. That was a loophole that was open in 2016 that should not have been open, but it was. They did close that loophole, but the larger loophole, which is the entire incentive structure, the entire thing that social media is built to do, that hasn't changed. And so, as a result, we are living in a pretty confused and confusing time.

So what do we do in the meantime? Like, how do we fix this, at least a little bit?

Yes, so a lot of the bigger solutions to this are going to have to be systemic, and the companies are going to have to step up, and it might involve government regulation and all kinds of bigger things. But until they rebuild and dismantle their business model, there are things that individuals can do, and one of them I call being a smart skeptic. There are things that pass for skepticism online that I think are actually just knee-jerk contrarian trolling. You often see people saying, well, I'm just asking for more evidence, and, I'm just asking the question, but that is not real skepticism. Real skepticism is being open-minded, but not being so open-minded that your brain falls out; it's demanding evidence, but not demanding evidence past the time when a question has been settled. If you just sort of say, well, I don't know, everybody says racism is bad, but, like, I'm skeptical of that claim, I don't think skepticism is the best word for what you're doing there. Sometimes there just is consensus on something, and there's a certain cast of mind of a person who just doesn't want to hear that. It's a kind of addiction to feeling like you have secret access to knowledge that, you know, polite society doesn't want you to have.

If only we all had that secret access.

Yeah, I mean, it's a thrilling idea. It's just that sometimes the real answer is the answer that most people already believe, and I'm sorry if that's boring, but it just sometimes is the case.

That's Andrew Marantz. He's a journalist and staff writer for The New Yorker. You can see his full talk at On the show today: technology and deception. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR.

This message comes from NPR sponsor Work Life, a podcast from TED where organizational psychologist Adam Grant explores the science of making work not suck. In the new season, you'll learn about the not-so-great resignation and how to retain employees. You'll discover how to wrestle with perfectionism, how to master a pitch, and more. Find Work Life with Adam Grant wherever you listen.

It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today: technology, deception, and our changing sense of reality. So far we've been talking about deepfakes, conspiracy theories, and other kinds of misinformation, but data and algorithms can warp reality, too.

We can deceive ourselves into thinking that they're not doing harm. We can fool ourselves into thinking, because it's based on numbers, that it is somehow neutral. A.I. is creeping into our lives, and even though the promise is that it's going to be more efficient, it's going to be better, if what's happening is we're automating inequality through weapons of math destruction, and we have algorithms of oppression, this promise is not actually true, and certainly not true for everybody.

Weapons of math destruction, algorithms of oppression, which basically means bias and human error can be encoded into algorithms, leading to inequality. To keep them in check, the Algorithmic Justice League to the rescue.

My name is Joy Buolamwini. I'm the founder of the Algorithmic Justice League, where we use our research and art to create a world with more equitable and accountable A.I. You might have heard of the male gaze or the white gaze or the postcolonial gaze; to that lexicon I add the coded gaze. And we want to make sure people are even aware of it, because you can't fight a power you don't see, you don't know about.

Joy hunts down the flaws in the technology that's running every part of our lives, from deciding what we see on Instagram to how we might be sentenced for a crime.

What happens when somebody is harmed by a system you created? You know, what happens if you're harmed? Where do you go? We want that kind of place to be the Algorithmic Justice League, so you can seek redress for algorithmic harms.

You are a lot of things: you're a poet, you're a computer scientist, you are a superhero, like, hard to put into a box. Can you just explain why you created the Algorithmic Justice League?

Yes, so the Algorithmic Justice League is a bit of an accident. When I was in graduate school, I was working on an art project that used some computer vision technology to track my face. “Hi, camera. Can you see my face?” At least, that was the idea. “You can see her face.” And when I tried to get it to work on my face, I found that putting a white mask on my dark skin is what I needed in order to have the system pick me up. And so that led to questions about, are our machines neutral? Why do I need to change myself to be seen by a machine? And if this is using A.I. techniques that are being used in other areas of our lives, whether it's health or education, transportation, the criminal justice system, what does that mean if different kinds of mistakes are being made? And also, even if the systems do work well, let's say you are able to track a face perfectly, what does that mean for surveillance? What does it mean for democracy, First Amendment rights, you know?

Joy continues from the TED stage.

Across the U.S., police departments are starting to use facial recognition software in their crime-fighting arsenal. Georgetown Law published a report showing that nearly one in two adults in the U.S., that's 117 million people, have their faces in facial recognition networks. Police departments can currently look at these networks unregulated, using algorithms that have not been audited for accuracy. Machine learning is being used for facial recognition, but it's also extending beyond the realm of computer vision. So: who gets hired or fired? Do you get that loan? Do you get insurance? Are you admitted into the college that you wanted to get into? Do you and I pay the same price for the same product purchased on the same platform? Law enforcement is also starting to use machine learning for predictive policing. Some judges use machine-generated risk scores to determine how long an individual is going to spend in prison. So we really have to think about these decisions: are they fair? And we've seen that algorithmic bias doesn't necessarily always lead to fair outcomes.

When I think about algorithmic bias, and people ask me, "Well, what do you mean, machines are biased? It's just numbers, it's just the data," I talk about machine learning, and it's a question of: well, what is the machine learning from? Like, what's the information that it's taking in?

So, an example of this. What I found was that face detection, and the ways in which systems were being trained, involved collecting large data sets of images of human faces. And when you look at those data sets, I found that many of them were pale and male, right? You might have a data set that's 75 percent male faces, over 80 percent lighter-skinned faces. And so what it means is that the machine is learning a representation of the world that is skewed. And so what you might have thought should be a neutral process is actually reflecting the biases it has been trained on. And sometimes what you're seeing is a skewed representation, but other times what machines are picking up on are our own societal biases that are actually true to the data.
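The kind of skew Joy describes can be surfaced with a very simple audit of a dataset's demographic labels before any model is trained. Here is a minimal Python sketch; the group labels, the toy counts (roughly matching the proportions she cites), and the 15 percent threshold are all invented for illustration, not taken from any real dataset:

```python
# Toy dataset audit: report each group's share of the data and flag
# underrepresented groups before training a face-detection model.
from collections import Counter

def composition(labels):
    """Return each group's fraction of the dataset."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Invented labels mirroring the rough proportions from the interview:
# heavily male and lighter-skinned.
faces = (["male-lighter"] * 60 + ["male-darker"] * 15 +
         ["female-lighter"] * 20 + ["female-darker"] * 5)

shares = composition(faces)
# Hypothetical cutoff: any group below 15% of the data is flagged.
skewed = [group for group, share in shares.items() if share < 0.15]
```

A check like this does not fix bias by itself, but it makes the "pale and male" skew visible as a number before the model ever learns from it.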

For example, Amazon was building a hiring tool.

You need a job, somebody in your life needs a job, right? You want to get hired.

And to get hired, you upload your resume and your cover letter.

That's the goal. It starts off well.

But before a human looks at your resume, it gets vetted by algorithms written by software engineers.

So we start off with that, and then, for efficiency: we have many more applications than any human could go through, so let's create a system that can do it more efficiently than we can.

And how do you build that better system?

Well, we're going to gather data, which is resumes, and we're going to sort those resumes by the ones that represented candidates we hired or who did well. Your target is who you think will be a good long-term employee.

And now the system gets trained on the data.

And the system is learning from prior data, so I like to say the past dwells within our algorithms. You don't have to have the sexist hiring manager in front of you; now you have a black box that's serving as the gatekeeper. But what it's learning are the patterns of what success has looked like in the past. So we're defining success by what it looked like in the past, and the past has been one where men were given opportunities, white people were given opportunities, and you don't necessarily fit that profile. Even though you might think you're creating this objective system, it's just going through resumes, right? This is where we run into problems.

So here's what happened with Amazon's hiring tool.

What happened was, as the model was being built and it was being tested, what they found was a gender bias, where resumes that contained the word "women" or "women's", or even an all-women's college, right, so in the case of being a woman, were categorically being ranked lower than those that didn't. And try as they might, they were not able to remove that gender bias, so they ended up scrapping the system.
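A toy example makes the mechanism concrete: if past hiring decisions skewed against women, even a trivial word-based ranker learns to penalize words like "women's". This is a deliberately simplified Python sketch (the function names, the tiny word-count scoring scheme, and the four-resume history are all invented for illustration; the real Amazon system was far more complex):

```python
# Hypothetical sketch: a word-scoring "ranker" trained on past hiring
# decisions, showing how a gender proxy gets baked in.
from collections import Counter

def train_word_scores(history):
    """Score each word by how often it appeared on hired vs. rejected resumes."""
    hired, rejected = Counter(), Counter()
    for words, was_hired in history:
        (hired if was_hired else rejected).update(set(words))
    vocab = set(hired) | set(rejected)
    return {w: hired[w] - rejected[w] for w in vocab}

def rank(resume_words, scores):
    """Higher total score means the resume is ranked higher."""
    return sum(scores.get(w, 0) for w in resume_words)

# Invented history that skews male: "women's" (as in "women's chess club")
# appears only on resumes that were rejected in the past.
history = [
    (["python", "captain", "chess"], True),
    (["java", "lead", "chess"], True),
    (["python", "women's", "chess"], False),
    (["java", "women's", "lead"], False),
]
scores = train_word_scores(history)
# "women's" now carries a negative weight, even though it says nothing
# about job performance: the past dwells within the algorithm.
```

Removing the single word "women's" would not help much in practice, because correlated proxy words (clubs, colleges, sports) carry the same signal, which is why Amazon reportedly could not debias the system and scrapped it instead.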

They scrapped the system, and that's a big win. But it's one win compared to thousands of platforms that use skewed algorithms that could warp reality.

It has not been the case that we've had universal equality or absolute equality, in the words of Frederick Douglass. And I especially worry about this when we think about techno-benevolence in the space of health care. Right, we're looking at, let's say, a breakthrough that comes in, talking about skin cancer: "We now have an A.I. system that can classify skin cancer as well as the top dermatologists," the study might say, a headline might read. And then when you look at it, it's like, oh, well, actually, the data set it was trained on was for lighter-skinned individuals. You might argue, well, lighter-skinned people are more likely to get skin cancer, but when I was looking into this, it's actually darker-skinned people who get skin cancer who usually have it detected at stage four, because of all of these assumptions; they're not even going to get it in the first place. So these assumptions can have meaningful consequences.

You know, we were just talking before about the 2016 presidential election. Have you seen any examples of artificial intelligence being used in voting or, more broadly, politics?

Yes. Channel 4 News just did this massive investigation showing that the 2016 Trump campaign targeted 3.5 million African Americans in the United States and labeled them as "deterrence" in an attempt to actually keep people from showing up to the polls.

They used targeted ads.

Yes. And we know from Facebook's own research, right, that you can influence voter turnout based on the kinds of posts that are put on their platform. And they did this in battleground states. And so in this way we're seeing predictive modeling and ad targeting, right, being used as a tool of voter suppression, which has always been the case: to disenfranchise. Right, you might say black lives don't matter, but it's clear black votes matter, because of so much effort used to rob people of what blood was spilled for, right, for generations. So it should be the case, right, that any sorts of algorithmic tools that are intended to be used have to be verified for non-discrimination before they're even adopted.

So, as a black woman technologist, you know, there are not that many of you, frankly. Why not, you know, go work at Google or Amazon and make these changes to the algorithms directly? Why act as sort of a watchdog?

I think there are multiple ways to be involved in the ecosystem, but I do think this question you pose is really important, because it can be an assumption that by changing who's in the room, which is important and needs to happen, we're going to change the outcome and the output of the system. So I like to remind people that most software developers, engineers, computer scientists, you don't build everything from scratch, right? You've got reusable parts. And so if there's a bias within those reusable parts, or large-scale bias in the data sets that have become standard practice or the status quo, right, changing the people who are involved in the system without changing the system itself is still going to reproduce algorithmic bias and algorithmic harms.

So how do we build systems that are more fair? Like, if there's no data for the artificial intelligence to, sort of, you know, process, to start to pump out recommendations, then how do we even change that?

Yeah, well, it's a question of what tools do you use toward what objectives. So the first thing is seeing if this is the appropriate tool; not every decision needs to be run through A.I. And oftentimes you also need to make sure you're being intentional. And so the kinds of changes you would need to make systematically, for even who gets into the job pool in general, it means you do have to change society to change what A.I. is learning.

What do you say, Joy, to people who might be listening and thinking, like, you know, let's take a step back there and look at the bigger picture. In many ways, things are way better than they were, thanks to technology, because, you know, here we are in a pandemic and anyone can work from anywhere, because we have the internet and we have Zoom and all these platforms. Equality and access have on the whole improved. Let's not be Debbie Downers about it.

Yeah, I mean, I always ask who can afford to say that, because I can tell you, the kids who are sitting in a McDonald's parking lot so they can access the internet to be able to attend school remotely, that has never been their reality. And so oftentimes if you are able to say technology on the whole has done well, it probably means you're in a fairly privileged position. There's still a huge digital divide; there are billions of people who don't have access to the internet. I mean, I was born in Canada, moved to Ghana, and then grew up in the U.S. I had very Western assumptions, you know, about what tech could do. I was very much excited to use the tech skills I'd gained as an undergrad at Georgia Tech, you know, to use tech for good, for the benefit of humanity. And so when I critique tech, it's really coming from a place of having been enamored with it and wanting it to live up to its promises. I don't think it's being a Debbie Downer to show ways in which we can improve, so the promise of something we've created can actually be realized. I think that's even a more optimistic approach than to believe in wishful thinking that is not true.

You know, one thing that you said that I find so... I love this idea that you see there's a difference between potential and reality, and that we must separate those two ideas.

Yes. So it's so easy to fixate on our aspirations of what could be, and I think in some ways there's this hope that we can transcend our own humanity, right, our own failures. And so, yes, even if we haven't gotten society quite right, ideally we can build technology that's better than we are. But then we have to look at the fact that technology reflects who we are; it doesn't transcend who we are. And so I think it's important that when we think about technology, we ask what's the promise, what's the reality, and not only what's the gap, but who does it work for, who does it benefit, who does it harm, and why. And also, how do we then step up and stand up to those harms?

That's Joy Buolamwini, founder of the Algorithmic Justice League. You can watch her full talk at ted.com. Thank you so much for listening to our show this week about technology, deception, and our warped reality. To learn more about the people who were on it, go to npr.org, and to see hundreds more TED Talks, check out ted.com or the TED app. Our TED Radio Hour production staff at NPR includes Jeff Rogers, Sanaz Meshkinpour, Rachel Faulkner, Diba Mohtasham, James Delahoussaye, J.C. Howard, Katie Monteleone, Maria Paz Gutierrez, Christina Cala, and Matthew Cloutier, with help from Daniel Shukhin. Our intern is Farrah Safari. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Colin Helms, Anna Phelan, and Michelle Quint. I'm Manoush Zomorodi, and you've been listening to the TED Radio Hour from NPR.