Episode 11: Scientific Racism and the Myth of Raw Data

01:18:22

Hosts: Anna Reser, Leila McNeill, and Rebecca Ortenberg 

Guest: Dr. Safiya Umoja Noble 

Producer: Leila McNeill 

Music: Careful! and Cassie Lace by the Zombie Dandies 


Subscribe. Rate. Review. 


In this episode, the hosts talk about the history of the IQ test and how claimed disparities in intelligence have been used as a tool of oppression against people of color. Dr. Safiya Umoja Noble joins in to talk about her book Algorithms of Oppression: How Search Engines Reinforce Racism, which explores how Google and other search engines are engineered to marginalize women of color, particularly black women.

Show Notes

The birth of American intelligence testing by Ludy T. Benjamin Jr., PhD

Genetics Generation by Laura Rivard

Sick? Or Slow? On the origins of intelligence as a psychological object by Serge Nicolas, et al.

“Scientific racism” is on the rise on the right. But it’s been lurking there for years. by Nicole Hemmer

Sam Harris, Charles Murray, and the allure of race science by Ezra Klein

The Immortal Life of Henrietta Lacks by Rebecca Skloot

The Averaged American: Surveys, Citizens, and the Making of a Mass Public by Sarah E. Igo

Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble

Safiya Umoja Noble

“Insatiable” Trailer

‘Insatiable’ Is Lazy and Dull, But At Least It’s Insulting by Linda Holmes

Naked Labs’ 3-D Body Scanner Shows You The Naked Truth by Laura Goode


Transcript 

Edited for clarity 

Rebecca: Welcome to episode 11 of the Lady Science Podcast. This podcast is a monthly deep dive on topics centered on women and gender in the history and popular culture of science. With you every month are the editors of Lady Science Magazine.

Anna: I'm Anna Reser, co-founder, co-owner, and editor-in-chief of Lady Science. I'm a writer, editor, and PhD student studying 20th century American culture and the history of the American space program in the 1960s.

Leila: I'm Leila McNeill, the other founder and editor-in-chief of Lady Science. I'm a historian of science and freelance writer with words in various places on the internet. I'm currently a regular writer on women and the history of science at Smithsonianmag.com.

Rebecca: And I'm Rebecca Ortenberg, Lady Science's managing editor. When I'm [00:01:00] not working with the Lady Science team, I can be found writing about museums and public history around the internet and managing social media for the Science History Institute in Philadelphia.

Leila: Okay, so we've got some housekeeping before we start on the episode. First thing is that if you listened to last month's episode, you'll know that we just wrapped up our summer pledge drive that ran through June and July. At the beginning of the pledge drive, we had $228 in monthly pledges, and we just closed out the drive with $538, so that's a $310 increase. Though we didn't reach our third goal of $650, we're incredibly grateful for what you guys helped us to achieve these last two months. So we just want to say a big thank you to everyone who made new pledges, increased their pledges, and spread the word.

It means so much to us, and we can't do any of this -- the magazine, the [00:02:00] podcast, any of it -- without you guys. So, thank you.

Rebecca: Seriously, it's so awesome. Since we did reach our first two pledge goals, we're gonna be putting out some new content, so that's exciting. When we reached $400 in pledges, we promised that we would start an Instagram account, and so we went and did that. You'll find videos, stories, and more there, so follow us for more Lady Science goodness. Our handle is ladyxscience, which is the same as our Twitter handle.

When we hit our second goal, we promised to share bonus podcast content. So to get all the bonus and mini episodes, make sure to subscribe on Apple Podcasts, Google Play, or Stitcher.

And the last sort of housekeeping thing I wanted to mention is that we are getting ready to start another special series [00:03:00] that will be running in October, and we're accepting pitches for it right now. We're doing a series on the intersection of sports, gender, and science.

We especially want to hear from people who are not traditionally well-represented in sports writing -- so, you know, folks who are not white dudes. Even if you've never written about sports before, if you want to, you should pitch us. We're looking for 600 to 1,000 word pieces. They can be historical or contemporary. They can even have personal elements, but they should fit well with the Lady Science perspective on gender and science -- so, you know, things like gender and statistics in professional sports, performance-enhancing drugs, exercise culture, sports medicine, technology and sports. You can find more details about that on our Twitter and Facebook pages, but the basics [00:04:00] are: send your pitch, some clips, and a little bit about yourself to ladyscienceinfo at gmail dot com with the subject "Sports series." And we pay, so that's fun, too.

Leila: Cool, so a little bit later in this episode, we're going to be talking to Safiya Noble about her new book Algorithms of Oppression: How Search Engines Reinforce Racism. Her work attacks head-on the notion that the algorithms that control search engines are neutral, and she does this by exploring how those algorithms work and how Western societies' biases against people of color, and especially black women, get baked in every step of the way. Her book does an amazing job of dismantling the idea that algorithms somehow remove biases from decision-making processes, and shows that they do exactly the opposite, which is amplify and codify those biases. But before we get to talking about why there's [00:05:00] no such thing as an unbiased algorithm, let's talk a little bit about data.

Because algorithms are built on some kind of data, whether it's the browser history that gets fed into a search engine algorithm or the Lady Science business expenses that we keep track of and calculate using an Excel spreadsheet. And here's the most important thing you need to know about data: like algorithms, data isn't neutral.

Rebecca: It seems like especially when we're talking about science -- super sciencey science -- we've decided there is something called raw data that scientists collect in a lab. And the way this pops into my head, it's kind of like the science we learned in school, right?

So you have a scientist, and they're in a white lab coat, and they do something like count cells in a Petri dish, or observe the change in color in a solution, or track some small animal's heart rate, and that's supposedly [00:06:00] raw data. It's apolitical because they're cells in a Petri dish, and cells in a Petri dish don't care who the president is. So counting them, we assume, must be a neutral act. Right?

Anna: But we can ask all kinds of questions about how that could not be neutral, like: Why is the scientist counting those particular cells and not other cells? How did they decide to do this experiment? Who is supervising them? Who's paying the scientist to count them? Why? Who taught them how to count the cells in the first place? Is the scientist getting enough sleep at night? There are all kinds of things -- variables, if you will -- that we can inject into this idea of neutral data.

Leila: So this example about cells [00:07:00] reminds me of a really good example from history -- which I guess in the last 10 years or so has gotten a lot of attention because of Rebecca Skloot's book -- which is Henrietta Lacks and the HeLa cells. Just for some basic background: in 1951, a black woman named Henrietta Lacks went to Johns Hopkins for what she believed to be cervical cancer, and it did turn out to be cervical cancer. During her treatment, the doctor who was treating her extracted cells from her cervix, and extracted them without her consent or knowledge, which was, you know, [00:08:00] legal at the time. Not acceptable -- I wouldn't say that -- but legal. But they took them, and those cells are still being used today. They're the first immortalized cell line, and it's become one of the most important cell lines in medical research.

So some of the issues surrounding HeLa -- and the name comes from the first two letters of her first and last names -- the issues that come into play with HeLa, obviously, are that she didn't give consent, and these cells of her own body were taken without her knowledge. But also that these cells continue to be used long after she died, long after we did establish informed consent laws, and they have also turned a profit. She died less than a year later, in 1952, from the cervical [00:09:00] cancer, but her family was not even made aware of the cell line's existence until 1975.

So there's a whole lot of issues that come into play with that: commercial purposes, privacy and patients' rights, and stuff like that. So if we're talking about cells, I think that might be one of the most prominent examples of how that wasn't just raw data that sprung up in a Petri dish.

Rebecca: Another example that I thought of -- maybe because it was about counting -- is the census. It's not lab science in the same way, but on the surface the census seems like a very basic thing: you go and you count a bunch of people and find out where they live. But then how do you define what it means to live somewhere? And how do you count people who don't want to answer [00:10:00] their door to people they don't know, maybe even particularly government officials?

And so this thing that seems really basic -- counting people -- has just so many variables involved in it.

Leila: That's become a prominent topic during the Trump administration, because they're asking questions about immigration status, which, with what's been going on with deportation and ICE and all of that, naturally makes immigrants concerned about answering. And then also, I believe maybe a year ago or so, they were floating the idea that gay people weren't going to be counted on the census. And so, sure, you can exclude gay people from your data collection, but you can't then use that data and pretend that it's raw data, because someone decided who [00:11:00] got to count and who didn't.

Anna: Yeah, the census is -- oh my god, ugh. When you're talking about people not wanting to open their doors or not wanting to participate in the census, there's a good book, I believe it's called The Averaged American, about how Americans were first exposed to this kind of data collection for the social sciences, and how we were basically trained to participate in creating this kind of data -- like the history of the Gallup poll, things like that. It's a really interesting book.

So in terms of what we want to talk about today: even when we're talking about things that seem straightforward to us, like the census or counting cells, we've seen already that that gets really fuzzy really quickly. You know, what happens when we try to [00:12:00] collect data on something that's much more complicated? And I think most of our listeners would understand IQ, the intelligence quotient, as being really complicated. So there are various kinds of tests that are supposed to measure and quantify intelligence, and they've been shown to be biased against people of color, women, and non-Westerners. But they're meant, in this "all data is neutral" optimistic kind of way, to turn something squishy and difficult to understand -- intelligence -- into something quantifiable and comparable and neutral.

Leila: Just a little bit of history here about the IQ test. The first working intelligence test was developed by a French psychologist named Alfred Binet in 1905, and he had a pretty political reason for creating it in the first place. At this point in history, France had recently instituted universal public education, and Binet [00:13:00] wanted to ensure that so-called "abnormal students" were also served by the new system. He created intelligence tests as a way to advocate for specific education policy, and at his urging the French government created a quote "special" education system, where quote "feeble-minded" students attended classes separately from quote "normal" students, but in the same school. And I keep saying "quote" there because I want people to know that these are not terms that we use; these are terms that were common then, and that's why we're using them, because it goes to show how ableism was built into even the language they used to talk about the IQ test.

So shortly after Binet published his intelligence test, he got a visit from an American psychologist named Henry Goddard. Goddard also studied quote "feeble-mindedness," and he was fascinated by this idea of being able to [00:14:00] quantify intelligence, but he had a somewhat different interest than Binet. He was trying to figure out what made some people quote "feeble-minded" and some people quote "intelligent." And he was pretty sure it had something to do with heredity.

Rebecca: So back in America, Goddard translated and published Binet's test, and this is when it really takes off. Before that, it was really kind of this interesting French social science political thing. But it's kind of amazing how fast it takes off. By 1911, he had introduced the test to public schools in America. By 1913, it was being used to test immigrants at Ellis Island.

Leila: That sounds great.

Rebecca: And then the next year -- right, it gets even better. In 1914, Goddard became the first psychologist to introduce evidence from Binet's test into a court of law. So this is like five years after this test was published in English for the first time, [00:15:00] and it's already having a serious impact on the socio-political system of the United States. All the while, he's also a psychologist who is studying the idea that intelligence is an inherited trait. And that made him super duper popular with a group of other well-respected psychologists who were calling themselves eugenicists.

Anna: You know, when we started this magazine, I did not think we were going to spend quite as much time talking about eugenics as we do. It's like, a lot. But I think you can't talk about data without talking about eugenicists, because they were all about data. Eugenicists love data. They like measuring people's heads and their height and weight. They like tracking rates of mental illness. They really like tracking criminal behavior. And they were pretty excited by this test, because now they had something that was [00:16:00] quote-unquote objective by which they could measure intelligence, and, shocker, Goddard and his fellow eugenicists came to exactly the conclusion that they were looking for. So according to the data from the intelligence tests, women, poor people, and black people were all less intelligent than white people, and they produced inferior children. Which is exactly what the eugenicists wanted to hear, and data and objectivity never lie, right?

Leila: Yeah, and this wasn't that long ago. This was the early 1900s. So I think it's really important, when we're starting to bring this into the present day, to remember that these stories weren't that long ago. Even today we have people who use intelligence tests to conclude that women are inferior to men, or poor people are inferior to rich people, or black [00:17:00] people are inferior to white people.

Anna: I just want to say, when you say "even today" we have people who use intelligence tests -- stuff like this is being published in mainstream medical journals. So it's not just, like, Pepe Twitter yelling about this stuff. It's embedded in our academic system, and there are supposedly legitimate discussions happening in academia about stuff like this right now.

Rebecca: Yes, and this is the part where we have to talk about Charles Murray.

Leila: It is where we have to talk about Charles Murray. Even though he's gonna be kind of the person we talk about, it's not just Charles Murray. It's James Damore, who got fired from Google and is now deciding to, I guess, sue Google. It's Sam Harris, [00:18:00] who runs a podcast where he talks for two hours because he has enough confidence as a mediocre straight white guy that he thinks people want to listen to him for two hours.

Rebecca: And he considers himself a progressive, if I remember correctly.

Leila: He does. He does. He's just another racist misogynist ... go on. Sorry.

Rebecca: Yes. No. Yeah, so many terrible people. But, on the one hand, Charles Murray is -- I was gonna say considered a crackpot, but I can't even quite say that anymore. You gotta remember, this guy has a sociology degree and worked for the government and is considered a normal intellectual human out in the world.

Anna: Didn't he work for the CIA, that trustworthy and kind branch...?

Leila: Yeah, in Thailand.

Anna: Perfect

Leila: [00:19:00] You know, I'm sure he saw brown people in a totally like neutral way.

But I also wanted to say that people like Charles Murray and Sam Harris, and these two are in cahoots with each other...

Rebecca: Totally.

Leila: ...the New Atheist movement, capital A, sees men like Harris and Murray and Dawkins as their leaders, and so we need to interrogate the New Atheist movement in a really rigorous way, because they're touting the same demagoguery and racist ideology as any other white male movement.

Rebecca: Yeah, and the reason that they're kind of able to do that -- just to back up for people who don't know all the details. So Charles Murray really came to prominence in the 1980s. He was a big proponent of dismantling the Great [00:20:00] Society programs and, therefore, was super popular among the Ronald Reagan political set.

His book The Bell Curve, which is probably what he's most known for, came out in 1994, and it argued that IQ was heritable and unchangeable and that it was correlated to both race and negative social behavior. And the reason why people like Sam Harris, people like the New Atheists, get super excited about this is that The Bell Curve is notoriously stuffed with charts and graphs and numbers to make it super weird and sciencey in this way that appeals to a certain kind of mediocre white man who needs to justify his racism via number things. Even Charles Murray himself, which I think is hilarious, called it "social science pornography." So he knew exactly what he was doing. [00:21:00] And I learned that from an article by Nicole Hemmer in Vox last year called "'Scientific racism' is on the rise on the right. But it's been lurking there for years," which basically lays out the fact that all these eugenics ideas, which didn't start that long ago, really just never went away. They took different forms, but they just sort of hung out. So for more about that specific history and intellectual trajectory, I really recommend looking up Nicole Hemmer's stuff.

Anna: And Nicole Hemmer, correct me if I'm wrong, she is a historian, a political historian of American conservatism. Correct?

Rebecca: Yes. That is correct.

Leila: And, okay, I want to point out also that Sam Harris -- I'm sorry that I keep coming back to Sam Harris, but...

Anna: Oh my gosh

[laughter]

Leila: ...when I was making notes and researching more [00:22:00] for this piece, he just kept popping up more and more, because he has hitched his wagon to Charles Murray in the last year. He has been very much attacking Vox -- and Nicole Hemmer runs a column in Vox -- and Ezra Klein, who has publicly denounced what Charles Murray and Sam Harris are pushing. So I don't think it's a coincidence that these two, Nicole Hemmer and Ezra Klein, who have been very outspoken about this particular issue, are the ones Sam Harris keeps going after at Vox. Just saying.

Rebecca: Yeah. Oh, yeah, 100%. There's that absurd transcript that Ezra Klein published in Vox -- he published this full transcript of a conversation. I only got about a quarter of the way through it, and I was like, I can't do this.

Leila: Yeah. Yeah, it was terrible. I listened to it.

Rebecca: And it's just... oh god [00:23:00] I couldn't even listen. I was like, "nope, can't listen to this."

Leila: But I think it's important to note that these men who claim objectivity and non-bias, and who denounce identity politics because they're supposedly above that, are attacking media outlets and media figures because they're being criticized. So, yeah, that's just some cognitive dissonance there.

Anna: So I guess we could probably rant about Sam Harris for the rest of the episode if we really want to...

*laughter*

Leila: Yeah, so let's!

Anna: Do you remember when him and Dawkins and Dennett -- and who was the other one? Oh wait, him, Dawkins, Dennett, there was one other one...

Leila: Krauss?

Rebecca: Andrew Sullivan?

Anna: I don't remember.

Rebecca: There are so many of them!

Leila: I know, but there was like that core group...oh! Hitchens! And they were calling themselves the Four Horsemen. I just...

Rebecca: Oh right!

Anna: [00:24:00] Anyway, they're horrible. Okay, so the point of bringing up all of these awful men -- are you surprised, really? The thing here about Murray and his defenders and all of these yahoos is that they always want to talk about quote-unquote pure science, raw data, and they defend The Bell Curve and other studies like it by saying "well, we're just following the data," and accusing people who question their motives of being unscientific, which, ugh!

Leila: Nothing is worse than that!

Anna: But this pure science just doesn't exist, which I think should be our new logline for Lady Science. It doesn't. So even when we're talking about something much less controversial than IQ, scientists collect data -- about people, and other things, not just people -- that is influenced by their [00:25:00] own assumptions and biases and by the structural biases of our society writ large. That's been true historically, and it's still true today.

I think one of the things that I've struggled with in talking about this kind of thing to people who didn't go to grad school for history of science, and who don't spend all their free time thinking about this, is that there's this really hard-to-squash misconception that there is an important and comprehensible difference between the data that is collected and who does the collecting and how. It kind of gets into some probably too-obnoxious philosophy of science stuff to go into here, [00:26:00] but the idea is that there's some kind of difference between reality and the way that you observe that reality, and that even if you were prepared to admit that there was bias -- which I'm not sure we've even gotten to that point as a society -- there's some way you could just Bad Apple the whole situation and say, "well, a real scientist would never do that." But we're not always talking about the Charles Murrays of the world, who have a specific evil intent. Not all biases in science come from mustache-twiddling eugenicists. It's baked in all the way through, and so it goes from being super evil to being very banal. [00:27:00] I think it's just important to say that if you try to dismiss the problem as just being "Murray and his ilk are evil," that's not gonna solve the problem.

Leila: And I think this is a good time to take these things that we've talked about a step further, specifically with this idea of IQ, and look at how these ideas get passed into social policy. So it's not just about thinking white people are smarter than black people, which in and of itself, at face value, is a terrible thing to believe. It goes a lot further than that, because this affects social policy. If we think about something like affirmative action: if you actually believe that a black person is not as smart as a white person, then you think that affirmative action is inherently done out of bad faith, [00:28:00] that it is swindling well-deserving, genetically superior white people just so that we can please black people, right? The fact that we still keep having to have this argument about affirmative action shows that these ideas about intelligence and IQ are baked into that very conversation.

Rebecca: The thing that came into my head today when I was getting ready for us to record was the article that came out recently -- I think the study showed that women are more likely to survive a heart attack if their doctor is a woman. And that's not because every single male doctor is like, "well, I don't care if this woman dies," but because so much of how we think about how heart attacks work is [00:29:00] based on studies of men, and if you're not listening to a woman and how she is describing her symptoms and how her body works, then you aren't necessarily going to be able to save her life. And that comes from implicit biases that make a lot of doctors less likely to listen to women in pain, but also structural biases: the studies on women and heart conditions don't exist at the same level that studies of men and heart conditions do, and that all comes from decisions that people made about supposedly neutral things, like which studies to do.

Leila: Right, and that's how these things become systemic: it's not just an individual belief that Murray has or Harris has; these become systemic things that affect real people on the ground. And we talked about this a couple episodes ago -- it wasn't even until the '90s [00:30:00] that clinical trials were required to include women and people of color, because that wasn't happening. It was white men whose bodies were being used as the data for medical science. And so if we have a medical system that continually dismisses women's pain, where black women are continually dying in childbirth, those things were done by design, because they were systemic, and we're having to dismantle them even now.

Rebecca: And all of that starts from -- okay, this sounds silly -- but that all starts from someone deciding which cells in a Petri dish they feel like counting.

Leila: Yeah, absolutely.

Cool. So, we'll just go ahead and transition into our interview with Dr. Safiya Noble, which we actually recorded at a different time, [00:31:00] while she was at a Starbucks. So there's a lot of sounds of people moving chairs around on the concrete, and an ambulance, and stuff like that. Just a heads up: it's not going to sound too great, but you're definitely going to want to listen, because the conversation was really fantastic.

[musical break]

Rebecca: Today we've been dismantling the idea that data about human intelligence is or ever has been neutral, and now we're going to turn to something else that many people assume is neutral, and that's the Google search engine. We're excited to have Dr. Safiya Noble with us to talk about her book Algorithms of Oppression: How Search Engines Reinforce Racism. Dr. Noble is an assistant professor at the Annenberg School for Communication at the University of Southern [00:32:00] California. In her book, she demonstrates how Google creates biased search algorithms that privilege whiteness and discriminate against people of color, particularly black women. So welcome to the podcast, Dr. Noble.

Dr. Noble: Thank you. Thanks so much. So great to be here.

Anna: Okay. So let's start at the very beginning. You use the term "technological redlining" to describe the power of algorithms in reinforcing oppressive social relationships and racial profiling. So that we're all working from the same baseline understanding, could you first explain what redlining is in a non-digital space?

Dr. Noble: Sure. Sure. So redlining has been a historical practice of institutionally discriminating using certain kinds of metrics or indicators within systems that really skew [00:33:00] resources and power away from, for the most part, communities of color in the United States, but also women. And one of the ways we've seen this most profoundly, for example, is in the real estate, banking, and financial services industries, where people who live in a particular zip code might be more likely to pay a higher premium on their insurance -- car insurance and home insurance -- or maybe they pay a higher interest rate when they're trying to get a loan for a house or a small business, or a personal loan. So these are the kinds of ways that we've seen redlining happen over time. And, of course, that's been illegal since the passage of the Civil Rights Act. And so one of the things that I'm trying to do in the work is show how some of these key markers about group identities, community-based identities, get kind of reinscribed into data [00:34:00] profiles and ideas about who people are, and then get implemented through technical systems, like software or artificial intelligence, that do a certain kind of automated decision-making and that also incorporate all kinds of demographic features -- our race, our gender, and so forth -- but do it in ways that are very, very difficult to see and, of course, even more difficult to intervene upon.

Rebecca: So how is it that Google search works? How does it find and serve up search results?

Dr. Noble: Google search is actually a complex phenomenon, and most of us who study Google or study search engines have to draw a lot of sophisticated conclusions about how it works based on its output and what's publicly available.

So [00:35:00] in a very technical sense, the only people who know how Google search works are the engineers who work on search at Google. But more broadly, I think we can glean from what we know about how search engines work that, for example, the primary mechanism that influences what we find is the advertising algorithm, which is Google's AdWords program that allows people to optimize content and pay for and outbid others to link certain keywords with their content. And that's a huge feature of how search engines work, which is to say that their content is moving through an advertising platform, and advertisers with the greatest amount of capital, as well as breadth of reach with their content, whether through hyperlinking [00:36:00] or having their content embedded with other people's content, are typically the winners in a search engine. And we see this, of course, in some very obvious ways. If you're looking for news stories, for example, you're much more likely to be directed to major international and national news outlets before local news or local perspectives. And if you're looking for dresses -- of course, this was a big story in the New York Times a few years ago -- you might be more likely to find JC Penney or Macy's instead of a boutique in your neighborhood. So ultimately, I think search engines are a balance between optimized content and paid-for advertising.

Leila: So to kind of bring those first two questions together, combining the idea of [00:37:00] redlining with how Google works: how then does this technological redlining happen in a search?

Dr. Noble: Well, you know, when I first started my research many years ago, I was looking, for example, to see how women and girls were represented conceptually in a search engine -- and many people have heard me speak about this now, and I write about this in the book; it was the impetus for the book and the book cover -- which is what happens when you search for black women and girls, or what kinds of autosuggestions come about, or what kinds of images come about for these identities. And what you find is that, with women and girls, oftentimes it's the porn industry that has a tremendous amount of influence over the type of content that comes up in relationship to keyword searches that use the word "girls" in particular: girls, black girls, [00:38:00] Asian girls, Latina girls. So that's one dimension of it, which is to say industries that have a lot of money, like the porn industry, are able to dominate and control keywords, and, of course, those keywords are actually tied to real human beings, in real communities, real groups of people. So that's one dimension of how I argue that people lose control over their representation, and they're not able to purchase their way out of that.

And of course now an ambulance is going by, so...

*laughter*

Anna: So one of the really important threads in the book is about understanding the way that Google is first and foremost [00:39:00] concerned with advertising and profit, and the disconnect between that and what people assume Google is concerned with, which I suppose people think of as some kind of public service. And so you describe Google search as an advertising algorithm, not an information algorithm. I was wondering if you could talk about that difference, and why it's important to know and be able to identify that difference as a user of the internet?

Dr. Noble: Yes, so this is one of the most fundamental reasons why I wrote this book. We're living in a moment where people, particularly in the United States and in Europe, are extremely reliant upon the internet to provide their basic information needs. Governments are increasingly pushing people to the web to facilitate government business. The public is increasingly reliant upon the internet and [00:40:00] digital technologies and media platforms as a proxy for all kinds of things. We see, for example, in local communities where libraries are being threatened with closure, the general public sensibility is "what do we need the library for when we have Google?" And so we have these increasing tensions around our conceptions of what public information and reliable and credible information is and where it can be found. And one of the reasons I wrote this book is because I felt that there was an increasing conversation about the trustworthiness and the reliability of Google to really serve as something akin to a public library online.

There are many studies that bear this out, that show people believe that what they find in a Google search, for [00:41:00] example, is highly reliable, credible, trustworthy. And yet when you start to look at certain kinds of concepts, especially concepts around traditionally marginalized and oppressed communities, you find that there's a lot of misrepresentative information, images, and ideas that flourish. And this to me is significant when we think about how important it is to have trustworthy, reliable, evidence-based research and journalism and fact-based information in order to see democracy flourish. If there were ever a time when those kinds of ideas were threatened, it would certainly be now, as we look at how social media and platforms like Facebook and YouTube have played a significant role in the spread of [00:42:00] disinformation and misinformation, whether in the 2016 presidential election or around other kinds of ideas.

So this is really the thesis of the book: what's at stake when we outsource our information needs in a democracy to private corporations that are in essence advertising platforms, and whose values are about the bottom line and returning shareholder value, rather than some type of non-commercial set of values similar to what a library might have? How different are the kinds of things that we find when we're relying upon an advertising platform to inform the public?

Rebecca: Now, on a very practical level, I guess, the way this happens -- that it acts as an advertising platform, not as an [00:43:00] information platform -- is that it's so hard to tell the difference between an advertised search result and a non-advertised result. But are there ways that people can tell the difference, and can you give some tips on what people should look out for?

Dr. Noble: Yeah, you know, this is really complex, because there have been times where Google has highlighted, for example, yellow-box content to signal "this is an ad" versus what it calls its organic search results -- though I think we might even trouble the notion that an organic search result is not tied to keyword optimization in its advertising tools. This idea borrows from an old model of newspapers, which is that the advertising is somehow wholly separate from the editorial. But in a search engine, [00:44:00] these organic, so to speak, results are deeply tied to ads that want to be visible when people are looking for those keywords. So there's a real co-mingling, and a lot of the content that we see might itself exist because it's been paid for, in essence, to be optimized or to make it to the front page. So this makes it really difficult to say, "well, what is advertising and what's a quote-unquote truthy fact or organic result?" I think that paradigm, that idea, is a bit faulty. Google says over 200 different factors go into deciding what makes it to the first page, or how it finds what it finds and displays it to us. We don't know what those factors are, but we can be guaranteed that some level of [00:45:00] profit for Google is at play. That makes it very difficult to say what is an ad. In some ways, if you know that the content you're getting says "Ad" right in front of it, which is the way that Google is doing it now, that's one way.

Although I will tell you that there have been studies done by ConsumerWatchdog.org, for example, that show that when they ask people to look at search results on a page and tell them which ones are advertisements and which ones are these quote-unquote organic search results, most of the public can't tell the difference. So I think what we are more likely to find is that the public generally believes that if some content makes it to the first page of Google search results, it's credible, it's been vetted, and it's reliable. I think they might not think of it as an ad.

Rebecca: It seems like, even in a very practical kind of way, even if you know that the ones at the very top are [00:46:00] almost always ads, sometimes you're in a hurry and you're just trying to find the thing, and so it can be so easy to fall into exactly what you're saying -- anything on the first page just is what you want.

Dr. Noble: Yeah, and here's the thing. Listen, I use Google all the time. People always ask me what I use, and I use a lot of different ones, probably three different ones. But the deal is, if I'm looking for directions someplace, if I'm looking for a good coffee shop in a town I've never been to, Google can quickly get me this type of banal information, and that reinforces our trust, because if nine out of the ten things we look for are just "where is something," "where is something on sale," "how do I get to this place," these kinds of things really reinforce our trust. Then when we do the one thing like ask Google a complex question [00:47:00] and it gives us back some propaganda, well, that might be harder to discern. That's in fact what we're seeing, and of course, I wrote a chapter in the book about Dylann Roof and his searches, and I think that's one of the worst, most extreme, egregious examples of how that happens.

Leila: Right. There was one time when I was researching black women inventors in the 19th century, and one of the very first results that popped up was a discussion thread from the Daily Stormer debunking the idea that black people had invented anything. And that wasn't what I was looking for, even a little bit.

Dr. Noble: I know, and imagine you were 12 or 14, writing a paper for some sixth grade class. We see this happen, in fact. Many reporters [00:48:00] and scholars have talked about what happens when middle school and high school kids are writing papers on Dr. King, for example, and they come across MartinLutherKing.org, which is a Stormfront website. That's a propaganda, disinformation site about Dr. King, but it's been owned and optimized by Stormfront for so long that it's on that front page, and it's very hard -- even the King estate has not been successful in getting that URL taken down. And so what happens is it's legitimated by virtue of being on the first page, and young people in particular, but even older people, may or may not be able to tell. If they don't know that Stormfront is a white power organization -- the largest online white supremacist community -- there's no reason for them to recognize what's happening there.

Leila: One of the things that you talked about [00:49:00] multiple times throughout the book is the importance of understanding the historical context in which these oppressive algorithms pop up. So can you explain why history is such an important part of your argument?

Dr. Noble: Yes, well, one of the reasons is because... I give the example of what happens to black girls and black women in this gross, hyper-sexualized stereotyping that happened for many, many years. Google has since changed the algorithm and suppressed the pornography, and that's great; unfortunately, it still happens for Latina and Asian girls, so there's still work to do. But all of these communities of women -- who quite frankly are women: when you look at this pornographic content, it's allegedly women over the age of 18, so it's not girls, it's fundamentally women coded as girls. It's like a sexism [00:50:00] 101, elementary-level example of that phenomenon of infantilizing women, regarding women as girls.

Thinking about women of color as hyper-sexualized objects in our society has been used historically as a trope in service of disempowering women of color, keeping women of color from meaningful political and social participation, legitimating the second-class status of women of color. There's a long history that I go into about that, specifically around black women, and these tropes are really important when we start thinking about how they affect public consciousness about other people. So it's not just [00:51:00] that I was writing this thinking about black girls and women, which of course I was, but I was also thinking about what other people come to learn about black women and girls when they go to a search engine as some type of authority site, to learn more, to answer questions about things they don't know. This is where I think these kinds of narratives are able to circulate without any ability to think more thoroughly about how these are old tropes, old media stereotypes, and how they just keep circulating. Even in these allegedly new media forms, they're anything but democratic and fair representations, and those are the kinds of things I think we need to challenge in our society.

Anna: So in terms of your methodology, in the book you write that you are asking these questions about algorithms and searches from the specific standpoint of being a black woman. Can you explain [00:52:00] why that matters, and how that standpoint specifically shapes the questions you're asking and, on a larger scale, even the research projects that you decide to pursue?

Dr. Noble: Sure. So one thing is that there's a popular idea among researchers, maybe more in some fields than others, that there's some type of objective, neutral place from which research is done -- when, of course, everyone who does research has a point of view, has a standpoint, is a human being, brings all their social context to the research experience. I felt that, as I was reading over a lot of the scholarship about Google, it was very universalized: about the political economy of Google, or what Google's business practices are, or how Google's [00:53:00] founders are the "boy geniuses" of the 21st century, and so forth. Yet some of these more complex dynamics about misrepresentation of women were not being asked about that platform. And I, of course, have a commitment to asking questions and developing a research agenda that keeps in mind the people who are most vulnerable, whether that's women, people of color, or poor people. Those are typically the subjects that are not given any agency or attention by a lot of mainstream technology scholars.

Of course, my own subjective experience in the world as a black woman gives me a [00:54:00] lot of insights about what these things mean. And I can tell you that I have presented my work, years ago, at conferences where I was the only woman of color present in the entire conference, for sure the only black woman. And people would say, "Don't you dare try to mess with this algorithm," or they would say, "Well, maybe black women do more porn than anyone else." Just ludicrous ideas -- completely unsubstantiated, reaching kinds of ideas. And I felt like this is why people who belong to communities that are the subjects of inquiry often have insights, and ideas of places to look, and questions to raise, that people who don't have that lived experience never even think of. And so that's [00:55:00] certainly been my experience as a scholar, and it's been exciting.

Also, I can tell you it's been exciting to see more women of color, more women doing feminist research around technology, and I certainly am not even close to being the first, I mean. There were many women doing feminist inquiries of digital technologies who inspired me to think in an even more intersectional way than just, broadly, a white feminist perspective on technology -- like, how does this get specific in its various executions or realities for black women?

Rebecca: One thing that I found really interesting about your book, and I think we all did, was that you offer, maybe not solutions, but things that people can do, or things that we as a society can do, that might improve the situation. So can you talk about [00:56:00] how people who are marginalized by oppressive algorithms can in some ways fight back and claim space in the digital landscape?

Dr. Noble: Sure. So one of the larger concerns that I have is that the framework for rights, civil rights in particular, in the US and in other countries, has been kind of won in the legislative realm, right? And so it's been public policy that has been the primary provider, so to speak, of pathways to enfranchisement, whether it's voting, housing, full citizenship, and so forth. So, of course, there's no way I couldn't write about the public policy landscape right now around large digital media tech companies, where quite frankly, in the US, we probably have the least regulatory [00:57:00] environment around potential harms. I certainly think about some of my work in terms of consumer protections: what's the barrier, or the lowest common denominator, of consumer protections and protection from harm that people should have, and how do we build on that? And we certainly have the Federal Trade Commission, for example, that has played a very important role in regulating advertisers. So even getting a body like the FTC to start thinking about Google as an advertising platform is a mechanism for protecting consumers, especially vulnerable consumers. We have laws around predatory business practices, and we might be able to fold some of these conversations up around that. So I talked about that, of course.

I [00:58:00] felt like I had to write an epilogue to the book because it was going to press just as we were moving into a new administration, the Trump administration, and I was like, okay, wait a minute, this might not be the right moment for that kind of fair, civil-rights-oriented legislation, and I think we've in fact seen a rollback of those kinds of commitments in the last couple of years. So this might not be the moment. But certainly, I think we should remain diligent in calling for that.

I also talk about what I think will be increasing human rights abuses and concerns, for which we really don't have a legislative framework: things like how automated decision-making platforms, or algorithmic bias, or artificial intelligence are increasingly taking decision-making power away from human beings and [00:59:00] putting it in the domain of machines. Increasingly, I think, as we see more machine learning, it will be more and more difficult for human beings to intervene upon the kinds of decisions and outputs that come out of AI.

So, of course, regulation and public policy are just a fundamental no-brainer. But there are also some other things that I offer up around increased critical media literacy: how do we institutionalize that in schools and libraries? How do we educate teachers? How do we create alternatives that are in the public domain, right? A public-interest search engine is something that I argue for, and I'd love to see organizations like the Library of Congress and other major information institutions think about it and play with it and imagine it, because I think that's [01:00:00] important. Certainly technology design, interface design, how we think about providing more context for search results rather than less -- and I have a whole bit in the conclusion of the book about what it would mean to display content in better context. So again, if I do my search for black girls: knowing I'm in the red-light district of the internet, I know I'm gonna get the porn; but maybe if my interface could allow me to access products and services, I might get the good hair products. I don't know. I'm always on the search for that; that's crucial.

There are lots of different ideas that I try to surface in the book. And, of course, just at a fundamental, basic level, if the public could be more aware that these technologies are not neutral, that they're not apolitical, that they have a tremendous amount of cultural power, we could engage with them differently.

[01:01:00] Leila: You were talking a little bit about how there are business regulations and Google is a business, and one of the things that you propose is breaking up Google. Can you talk a little bit about how businesses get broken up and how it could work in this particular case?

Dr. Noble: Well, this isn't a new idea. When companies become so powerful that they are monopolies in a market, the tendency has been to see that lack of competition as damaging, as not giving consumers choice. In the case of giant tech firms -- not just Google; certainly Facebook has no real formidable competition -- the amount of capital across [01:02:00] Google, Facebook, Apple, Microsoft, and Amazon is so intense that it's very difficult for new entrants to pose some type of competition. And when that has happened in the past, for example with AT&T when it dominated the telecommunications market, the company was required to break up to allow for the possibility of new entrants into the telecommunications space. So this isn't really a radical idea. This is a very common practice, but who knows whether there's an inclination. Again, we haven't really seen a rigorous Federal Trade Commission that's thinking about these things since the new administration, if you can believe that or not. I know, it's hard to believe. *sarcasm*

Professor Molly [01:03:00] Niesen is an amazing media historian who has really written the history of media policy and the Federal Trade Commission and how it's acted in the past, and I think she's an important voice in helping us make sense of these histories. But these are things that I think have to be considered if we're going to talk about any viable counterweights, and right now my faith has been in public institutions. I still believe in the public library. I still believe in public universities and research universities, community colleges, K through 12 education. I think that we have a lot of resources and knowledge and capital in those spaces, and we should maybe turn to them to help us in this navigation of the information landscape.

Anna: So my last question -- I [01:04:00] think you already answered it. Was there anything that we didn't bring up that you especially wanted to talk about?

Dr. Noble: Well, listen, I love this whole idea of Lady Science. And I need a shirt that says Lady Science. One thing that I would just add is that some of the really interesting scholarship and research is happening around what the past has been in relationship to our imagination about what the internet could have been, and certainly what its current complicated, and in many ways exploitative, business practices are. A lot of the really good work is coming from women scientists, women social scientists, women researchers, women computer scientists and technologists. Whether it's [01:05:00] people like Joy Buolamwini at the MIT Media Lab, a black woman who's done amazing work around facial recognition -- for example, how cameras don't recognize black faces. Or Sarah Roberts at UCLA, who's writing about commercial content moderators and the outsourcing of ideas about what's acceptable content on the web to a global workforce that's making all these curatorial choices, making it more visible that the internet, in fact, is not a democratic place -- there are all kinds of moderating effects, not just algorithms but human beings. Or people like Marie Hicks and her great book Programmed Inequality, where she's talking about the history of women in British computing and how the UK basically tanked its computing industry because women dominated it, so they thought it wasn't [01:06:00] really a thing, and they're still playing catch-up to this day.

There are a lot of Lady Scientists doing brilliant work. I can't even begin to name them all, but there are dozens and dozens and dozens who are helping us better understand the internet and the technological landscape, and I think we should be listening to them. So I would say, if people are interested, certainly in the book I try to highlight as many of those voices as possible and send a signal out that people should be listening to the lady scientists.

Anna: We absolutely endorse that message.

Rebecca: Yes, stamp of approval. Awesome.

[musical break]

Anna: So at the end of every episode, [01:07:00] we, your exhausted hosts, will unburden ourselves about one annoying thing. And I think we mentioned this last time too: it's really hard to find something that's just annoying and not actively horrifying. So what we're gonna talk about today actually does kind of border on horrifying: the TV show Insatiable on Netflix, and a horrible 3D body scanner that was partially funded by Peter Thiel. This is the worst timeline. And just this, I think, resurgent fatphobia that we're seeing in the age of body positivity. So, as far as Insatiable -- I haven't watched it, because I've been warned away from it by very smart and articulate writers on [01:08:00] Twitter who have written some scorching reviews -- the show is basically a teen comedy about a high schooler who is in a fat suit, and then she gets punched in the face and has to have her jaw wired shut, and she loses a lot of weight and then comes back to school and is, like, hot or whatever, and takes revenge on all her bullies. But the twist that's supposed to be body positive is that she's still a horrible person even though she's not fat anymore. That was their sort of way of saying something empowering about being fat. I don't know. I've read a couple of reviews, and it's apparently just horrible. And I think the most important thing to mention is that the main character is played by a thin actress...

Leila: Who puts on a fat suit...

Anna: ...who puts on a fat [01:09:00] suit.

Leila: Anytime there is a fat suit on a skinny person, we should probably just go ahead and nope out of that situation. And I do want to say that right now, as we're recording, the show isn't out yet, just the trailer, and the critics who have pre-screened it all apparently agree that it's terrible. But by the time this episode comes out, it'll have premiered on the 10th.

One of the things that I immediately noticed is that the first 30 seconds of the trailer show her as fat, and it is all of the horrible stereotypes that we have about fat people. She just sits on her couch and eats ice cream and eats horrible food and watches TV, and she wears frumpy clothes. She looks a little dirty, her hair's kind of greasy. All of these stereotypes that we have about [01:10:00] fat people are just played out in the quick 30-second opening of this trailer.

Rebecca: I should say, among other things, there's Linda Holmes's wonderfully scathing review on NPR. If you follow her on Twitter, it turns out that all through the TV Critics Association tour she was at, she was definitely subtweeting Insatiable the entire time. But anyway, to go back to the fat suit thing and the stereotypes about fat people that are baked into it, Linda Holmes makes the point that because it's a fat person being played by someone in a fat suit, the character has no sense of her own physicality. The idea that a fat woman could know how her body is shaped and how to move in it in a way that makes her look like a [01:11:00] normal human being just doesn't even occur to the people making the show. So it looks like a skinny woman in a fat suit shambling around, which is a perfect representation of how so much of our culture thinks about fat women.

Leila: Right, and I was telling Anna about this when we were deciding to talk about it: the actress who plays the main character in the fat suit -- I can't remember what her name is, I think it's Debby Ryan -- defends the show and says something about how it's about how difficult and scary it can be -- and this is a direct quote -- to "move through the world in a body." There's just such a big disconnect there, because she is not a fat person. She is thin, and she [01:12:00] moves through the world, her everyday lived reality, with thin privilege. These types of comments from thin people are equating a world in which all women are made to feel bad about their bodies with the experience of someone who is fat and moves through the world facing biases that someone who is thin does not.

Anna: Yeah. I was thinking, I don't know if I would call it backlash per se, because I think it's kind of a different conceptual animal, but there is a reaction happening to a lot of this. We have a lot more visible fat people with platforms now to talk about what their lives are like, like Lindy West or Tess Holliday, people like that. It does seem to [01:13:00] me that there is a reaction to that; obviously women like that get unbelievable harassment on social media, which is why Lindy West quit Twitter. It's disgusting. But there's also stuff like Insatiable, where this sort of bad faith engagement with body positivity or body acceptance ideals is getting baked into our cultural products. To me, this doesn't seem like an honest misunderstanding of the body politics of fatness. It seems like bad faith, like it is taking a shot at fat people. And I think that there has been a resurgent fatphobia as we hear more from fat people [01:14:00] and from body politics advocates and activists. We're seeing stuff like this cropping up, and I think that's also what's behind this awful 3D body scanner thing that you sent me, Leila, which is terrifying.

Leila: Yeah, the body scanner takes a 3D scan of your body. And so, ladies, if you thought that looking in a full-length mirror was bad enough, just wait until you can see all of your cellulite and all of your little wrinkles and all your stretch marks in 3D.

Rebecca: So wait, so okay, they didn't share this with me.

Anna: Oops, sorry!

Rebecca: No, no, it's okay, but now you get to tell me all about it. So it takes a scan of you, and you're a little hologram of yourself?

Anna: Also while you're standing on a [01:15:00] scale, so it's the whole package.

Rebecca: Of course. Oh my god, why?!

Anna: And they're billing it like it's for people who are obsessed with fitness and they want to check out their gains or whatever.

Leila: Again, it's equating fitness with weight.

Anna: Yes.

Leila: Again.

Rebecca: Yeah.

Leila: Which is something that body politics writers and women writing about fatness have been tearing their hair out over, trying to tell people that that's not the case, that it's a false correlation.

And speaking of bills, the actual bill for the body scanner, if you so wish to purchase one, is $1,395.

Rebecca: Oh god! To torture yourself every day.

Leila: For the low low price of [01:16:00] $1,395, you can obsess over every imperfection on your body and never leave the house, I guess.

Anna: I will just add one thing about the 3D body scanner: it's just another piece of surveillance technology that you can volunteer to put in your own home.

Rebecca: Oh god, that too!

Leila: It was partially funded by the man who founded the company Palantir, so.

Anna: Yeah, that's something you should really think about. If you don't know what a Palantir is, you should read... well, I guess the easiest way to find out is to watch the movie. They talk about it in The Two Towers.

Leila: Lord of the Rings.

Anna: Yeah, it's for spying on people and hypnotizing Hobbits.

Leila: Yep. He purposely named his surveillance company that. I really need these men on the right to stop [01:17:00] appropriating my nerd culture.

Rebecca: No, for real though.

Anna: It reminds me of Soylent too where you're like "that's the name you're going to pick for your thing? And you want to try and convince us that it's not evil? What are you doing?"

Rebecca: Yeah. Yeah. Stop, just stop being Bond villains. We are not enjoying living in the worst timeline.

*exasperated sighs*

Leila: So we'll go ahead and end it there today. But remember, if you liked our episode today, please please please leave us a rating and review on Apple Podcasts. That's how new listeners can find us.

And if you have any questions about the segment today, tweet us at @ladyxscience or use the hashtag #LadySciPod. For show notes and episode transcripts, to sign up for our monthly newsletter, read monthly issues, pitch us an idea, and more, visit ladyscience.com.

And remember that we are an independent magazine, and we [01:18:00] depend on support from our readers and listeners. You can support us through a monthly donation on Patreon or through one-time donations. Just visit ladyscience.com.

Until next time you can find us on Facebook at @ladysciencemag and on Twitter at @ladyxscience.

[Music]

