John Blomster: Welcome to DISCOVERY. I'm John Blomster and today we're speaking with Mutale Nkonde who is a fellow at the Berkman Klein Center for Internet and Society at Harvard University. Mutale is an artificial intelligence policy analyst and researcher who works at the intersection of AI, race and policy. She's been a powerful advocate on Capitol Hill for many years. And she is also the founder of the nonprofit AI for the People, which seeks to educate black communities about AI and social justice through popular culture. She's speaking with us here today about how the Russian government is leveraging U.S. race relations to undermine America's global standing and how disinformation was and is being used in both the 2016 and 2020 election cycles. So obviously, a critically important topic as we ramp up to the election. We're thrilled you could take some time to join us here today. So thank you very much.
Mutale Nkonde: Thank you for inviting me. I'm excited.
Blomster: So first off, can you just tell us a little bit about your background and how you came to be doing the work that you are with the Berkman Klein Center?
Nkonde: Yeah, sure. So, I always tell people when they ask me about my origin story that I don't have a career per se, I just have a tapestry of weird and wonderful experiences that magically lead from one to the other.
At the beginning of my career, I was a documentary filmmaker. I worked for the BBC. I did lots of documentaries around science and society and how they, you know, how they impacted people, and really thought that I was going to spend my career in London doing that type of work. And I ended up being quite shocked, because as my life progressed, I met somebody who lived in Brooklyn, my former husband, but I met this boy from Brooklyn and decided I was going to run away to the United States and thought that I was going to continue this type of work. But unfortunately, when I got here, I worked with CNN and ABC, which were great experiences, but it wasn't really the level of analysis that I'd been used to in the U.K. And at that time, Barack Obama's 2008 campaign was ramping up. And I was so energized and interested in this candidate that I just figured, you know, his name is Barack Hussein Obama, this isn't really going anywhere, but it looks like a lot of fun. He seems very cool. His kids are nice. I like his wife's hair. Let me try this out. And I ended up in Pennsylvania, and was one of the youngest people on the campaign. And they looked at me and said, you know, do you know anything about Twitter? And of course, I knew nothing. And I was like, yes, I know, I know all the things. And not only do I know all the things, I'm going to help you win. Again, I totally was like, this is not really going anywhere, but it seems like fun. And in that, I was introduced to this idea of digital communications being used for influence, and political influence particularly. So, I was in charge of taking pictures, for example, when we went on church visits, or placing the candidate in settings where black populations would become comfortable with breaking away from the Clintons and going to support this candidate. And we did it, he won.
It was great. I was working in government for about 15 minutes in New York City and then got a call from somebody on the campaign. And they were saying Google's coming to town and they need somebody that can speak to black policymakers specifically, because in New York, we have community agreements, like when a big company comes in, you have to show how you're going to be a benefit to the public. And would you like to do that work? And so I was a member of another nonprofit that consulted with them to do that work. And that was really the first time that I was introduced to tech and the culture of tech. And having lived in a black community in New York City, but then going downtown, where there just were not very many black people, not very many black women, was very galling. And then I started to realize that some of the products that we were building at Google were showing bias. So, this was around the time that reports were out where if you typed "black youth" into Google search, mugshots came up, or monkeys. It was, it was kind of one or the other. That really made me start thinking, going back to my science communication days and thinking, I have to develop a hypothesis for this: what could possibly be happening? And Cathy O'Neil's book Weapons of Math Destruction came out around that time. And it was really the first time that a book that speaks about technology and speaks about the blind spots of technology was mass market. And I read it voraciously and started to speak to policymakers, who I'm still in relationship with, and I was like, oh my God, you know, there's something really, really bad here. We need to counter it. And everybody was like, you're crazy. You have free food. You have a scooter. You can boast to people about this job. What are you doing? And I ended up being moved out of that space and into more creative work that was being done at the time around how we message technology, particularly around black girls and black families.
The types of words that we use, the types of pictures we put on Twitter, to make this an enticing place to be. And ultimately I left. It just wasn't for me.
And then I went to Data & Society, which is a research institute in New York City, and was really excited to be part of that community because they were doing research on some of these more critical questions around technology. And my original pitch to them, where I was a fellow, was that I was just going to do a five-point checklist for folks in Congress, who at this point I'd been working with close to 10 years, unofficially, always unofficially, to help them think about algorithms. That was where I was, and I started on that project, and the Democrats won the House in November of 2018. So, I went from starting a project where we're talking about algorithms, and I did one briefing in D.C. around algorithmic bias, to getting a call at the end of that January, saying we love the briefing that you did, and Pelosi's office would like to make tech accountability a priority. Would you come in and meet with us so that we can start thinking about, where are these algorithms? What do they mean?
I went in and I wrote three policy briefs. One was around facial recognition. One was around deepfake technology. The other was around algorithmic accountability. And within nine months, all three had been introduced in the House. So, I go from being this really obscure black woman, who's working in tech but can't seem to keep a job and, you know, sketchy, to someone who my whole community at that point is looking at, because I'm the first person to have been able to do this work. And the Deepfakes Bill specifically, I'd been really interested in because I came from documentary film. So, I was interested in audiovisual manipulation. And during our press tour, we went to radio shows, we went to newspapers all around New York City, and there's one called The Breakfast Club, which is an urban radio show, a hip-hop station, very cool, very quirky. And one of the presenters asks Congresswoman Yvette Clarke, who I'd been working with on a lot of this, you know, how powerful is celebrity for moving policy? But you have to remember, at this point, I have been working with this person for 10 years. Because I'm not a registered lobbyist, I'd always worked as a constituent. It had always been voluntary, the stuff I'd been doing, but it was my time. And she turns around and says, "We love it when celebrities come to the Hill because it moves policy." And so I'm in the back of the studio, like, "What? How could it possibly move? What?" Like, I'm writing all these briefs. I'm reading all of this stuff. I'm speaking to, you know, technologists. I'm learning how to code. Like, I'm doing all this stuff. And all it takes is for, like, Kim Kardashian. No, this cannot be right. So, the interview wraps up. And she explains to me that we've got to a point where ordinary people need to really understand how these technologies are impacting them, because otherwise she doesn't have the mandate to go in and do this policy work.
It looks much more like a passion project than it does a real, urgent need of the people, and she has a largely black district.
So, there was also the complication around race, you know. We have to make sure that this is really vital for black people. And that's where the idea for AI for the People came from. And it was really birthed because I was creating content. I was a journalist by training, and through Obama, and just through the years, I had contacts with film companies and TV companies, and podcasters, and all of these people that are creating content. But would we be able, as a nonprofit, to create some of that content and put it out into the general public? So, since apparently celebrity is the only way to get a bill passed in this country, which makes sense, I mean, we have a reality star in the White House, so, you know, who knows? Can we add that voice to the conversation? And that's my very complicated and long way of kind of answering that question.
Blomster: How big a factor did race play in the disinformation strategies that were being enacted as part of the 2016 election? Because that's a conversation that maybe has not been top level, or it's been subverted by other conversations. So how big a role did that play?
Nkonde: I have very much dedicated my career to being at the intersection of technology and racial justice. Starting, like I said, with algorithms. But one of the things that was so interesting to me, as the Mueller Report was coming out, was a news report that appeared in The Washington Post in February 2017. The headline was, "IRA uses race-based disinformation to disenfranchise black voters." So already, I'm super interested, because this is landing very much in my wheelhouse. And in reading the Mueller Report, one of the things that became very clear to me was that taking advantage of this racial tension between black and white in America became a key strategy deployed by the Internet Research Agency, which worked in concert with Putin to subvert our elections. And what they did was really quite brilliant.
So, they developed about 30 social media profiles across Facebook and Instagram and gave them names like Blacktivist and Woke Blacks. And they used these pages to develop audiences around black people. So, this started actually in 2015. And it was around the time that Philando Castile had just been shot by the police and we were coming out of Ferguson and, you know, Black Lives Matter as a hashtag was really starting to bubble. And what they did was really genius. They used a three-part strategy to bring in black audiences. So, the first one was really amazing affirming messages, you know, "black girl magic" and "good morning black queen." And I was like, "Oh, that's great. Let me retweet that. Yeah. Let me like that. Yeah. You know, they're speaking directly to me. Oh, how did they know? Oh my god, I love Blacktivist." And I was really, like, in the community and active on these pages, and we were sharing content. And then as we moved closer to 2016, they start to really question Clinton as a candidate, and she wasn't the best campaigner. And she did have this real tension with Black Lives Matter, which for somebody like me, who's really interested in racial justice as a way of moving towards a more fair and equal society, I was conflicted. I was at that point a Bernie person, and he didn't win, but it was also very clear to me that it was just not going to be Trump. That was going to be my default position. But what these pages started to do was to post old content of her supporting the 1994 crime bill that has led to this overt mass incarceration of black men, which is something that I certainly feel very deeply in my community.
They played tape of her using the "superpredator" comment in reference to the Exonerated Five, who ended up being five black kids that were innocent and did not commit a crime and then spent up to 13 years incarcerated, really problematizing Clinton. So what I found on those pages was that we'd gone from "Good morning black queen" to "she's not that good." And, you know, everyone's commenting and they're like, yes, you know, she sucks, but not him. Like, he sucks more than she sucks, and we just gonna have—it's just gonna suck. But she seemed smart.
And then what happened at election time—the last three days of any election are called Get Out the Vote. And typically what happened on Obama's and then subsequent campaigns I was involved in, is the last three days of any campaign are when ordinary people start listening. And the only message that you want out there is go to the polls, go to the polls, go to the polls. However, what was happening on these social media pages is that they were saying things like, "Vote for Jill Stein." And I was like, Jill Stein? That's not a thing. Like, let's focus. This one seems to be very popular and having rallies. No. Or, don't go to the polls. And at that point, that's when I was like, there's something weird here, but I didn't think about it, right? And we got what we got, ultimately. But the thing that I really liked about the Mueller Report was that they laid that out bare. That really created a sense of urgency in terms of my work, to try and create a body of research that not only explains what happened in 2016, but looks forward to 2020 and tries to identify these threats before we go to the polls, and then develop some type of—I don't have a good name for it, but some type of maybe best practices or checklist or something that we can share with general audiences to help spot disinformation. Because in the midst of this whole campaign, there were pockets of black resistance. There was a group in Baltimore, Maryland, who were interacting with these bots and actually refused to believe them and called them out, prior to the election. There's a group of black women who, prior to Gamergate, which is taking us even further back, had identified a disinformation campaign on 8chan and called it out.
They developed a hashtag called "your slip is showing" and called it out. And then even since the election, I've been working with researchers here out of the Tech Policy Lab and the iSchool, and other places, where we were able to see how the #NotMyAriel hashtag blew up in July. For those who don't know, it was this reaction towards an African American actress being cast as Ariel, who is a pretend Disney mermaid. And I'm sitting at home in Brooklyn thinking, "Wow, white people have nothing to do if they care about this." I'm like, "What's going on?" Like, we have ICE, and white people care about mermaids? Oh my god.
Blomster: An actual mermaid.
Nkonde: Yeah, like, we're dying. And it turns out white people didn't care about mermaids. It was a disinformation campaign. And it had originated on Facebook. And originally it was, like, a yoga page that attracted all of these visitors in 2017, around 40,000 visitors, and they would do guided meditation. Then, in about 2018, it became a Muhammad Ali tribute page, which attracted probably up to 1,000 more people. So, now you're getting into, like, big numbers. And then it became Christians Against Ariel, which to me was even—I was like, it's not just white people, it's evangelicals. Like, they've really got nothing to do. They need to be praying to Jesus and minding their own business. And it then got nominal numbers, something like 250 new followers. But the way the press reported it was that there's an uproar that this African American actress is going to be a mermaid, and there really wasn't. And so that's when I start to think, "Hmm … this is going to happen again." And then in July, of course, Mueller did the open testimony where he said this is going to happen again. We were looking at some of the reparations hashtags, or disinformation also, where they're taking an issue that is sufficiently divisive, and then testing to see what people do with that message. And coming out of that work, we then had the Ukraine phone call from the president, where he didn't say "use disinformation" in the phone call, but this idea that you have some foreign power spreading unflattering information, in this case about Biden's son, makes me think that we're probably onto something in this project, and the strategic importance of black women for the Democrats. Ninety percent voted for Hillary, even though we didn't like her. We were just like, "Look, there's no one else around here. Jill Stein is not it." And so I do anticipate that black women will be attacked online again.
Blomster: With these kinds of race-based disinformation, would targeted campaigns have been as effective back in 2008 or 2012, or before? Or does the state of race relations in the U.S. right now maybe provide more opportunity? Is it something like the societal issues we're facing? Is the tech evolving? Is it a mix of the two? Why has this been so effective and now is a real threat to our democracy?
Nkonde: I think that they, I think they definitely would have been effective in 2008. Not in 2012. The reason I say 2008 was, that was the campaign in which Obama was depicted as a monkey online. It was also the campaign where Reverend Wright—that video was played, where Reverend Wright, Jeremiah Wright, was Obama's pastor for many years in Chicago. And he has a brand of theology, which in the black church is called liberation theology, where they speak about racial injustice and social injustice very centrally to the ministry. And there's a video where Reverend Wright is saying "This isn't the United States of America. This is the KKK of America" as part of the sermon. And so those attempts were definitely made. What turned it around in 2008 ended up being the crash of the housing market, which came right at the end of the campaign, and Sarah Palin was the vice-presidential pick. And I think America looked at Sarah Palin as being one heartbeat away from the presidency. And John McCain, the late John McCain, was so fair in many ways to Obama, in that he never stoked racial tension in his own campaign. I think we all knew that Sarah Palin probably would, and we voted for Obama. So, I think we need to contextualize that. And then what happened after Obama got in, in my opinion, has kind of brought us to these race relations, in many ways. There were two things: backlash from the right through the Tea Party, and the birther conversation, which was really a coded way of saying America should not have a black president. And it was a way for, I think, white disenchanted groups, who do not believe in social progress, to have a say and feel that they could make their America Great Again, as it were.
And one of the things that Obama did that wasn't helpful was promote this idea that we were in this post-racial America. And the truth of it is, if we look into the social science, and particularly the work of Eduardo Bonilla-Silva, out of Duke, he's a sociologist: when you subscribe to colorblind notions of society, you're actually committing not to do anything about it. And what that does for populations of color is that it allows racism to get worse. It actually attacks us. And one of the things I think Obama did in trying to appeal to the whole of America was, we're colorblind, we're colorblind, we're colorblind. So, even though we were having rising racial tension, nobody could signpost it, because the black man who was president was saying that we're colorblind. So, by the time we got to 2012, we were so kind of indoctrinated with this colorblind mythology that we weren't able to speak about race. At the same time, the technology is improving. And we're starting to see machine learning and deep learning being much more part of the product development processes. And with the introduction of these new and innovative ways of building technology, we can now surveil people better. So, you have the denial of race, you have new technological systems, and then you have new markets that are based on oppression, or based on surveillance. And it's around that time that you start to see your social media kind of acting up, where I always remember one time—I'm not on Facebook anymore, after Cambridge Analytica—but I always remember looking for Uggs and then going on to my Facebook page and all I could see were Uggs, like, on the sidebar. It was, like, every type of Ugg, every type of color. I don't know what I clicked on. I don't know what cookies, like, why the cookies did that. That's when I started to notice that we had got to this kind of tipping point where our behavior was being tracked.
And I think you are right in the sense that what happened in 2016 really laid that bare. All that underground, kind of seething racism and resentment was really allowed out, you know. We had complete Republican capture of the House and Senate, and they were blocking everything that Obama did. We had government shutdowns. We had all the big things around the ACA and healthcare. And for white people that do not want to see the progression of our society, 2016 was the perfect time to stoke those tensions. And then if you look at the history of disinformation, Russia has always used race as a tool of disinformation. They invited W. E. B. Du Bois, the famous sociologist, to Lenin's funeral, saying, "You're not welcomed in your own country, but come to Mother Russia, and we will welcome you." And so even within black studies, critical race studies, you do see that the Russians have historically reached out to black Americans. When you look into the literature, the reason that they use race is that they want to disprove, to Russians, that America is the city on the hill. They want to show that this is what Americans do to their black people, and that makes America less than its brand, without ever speaking about Russia's own racial history.
Blomster: As we head into 2020, has the landscape changed? I know that tech companies have not necessarily undergone those comprehensive reforms that many people are asking for, in terms of accountability and in terms of disinformation. We see the issues that Facebook's having right now with its own employees protesting. So, is the landscape that different? And what is your outlook in this space going into 2020? Are we just going through the cycle again? Because that's a pretty scary prospect.
Nkonde: I think the landscape has changed in so much as we have fallen out of love with tech companies. And I think that we were, as a nation, punch drunk. We were, you know, crazy in love, as Beyoncé would say, with tech companies over the last decade. And I think since Cambridge Analytica broke, and we had to face this real possibility that our election results may not be legitimate, because of that, there's this reckoning that, ultimately, tech companies are more powerful than government. They are richer than government. They have invested heavily in lobbying. If we look at our own approach to technology, the AI Futures Act, which was introduced in 2017, was meant to be a roadmap for how we're looking at advanced technologies on the federal level. And the first priority is to create an enabling business environment. So, you not only have these rich companies that wield this huge, excessive power and influence on the federal level, but you have a legislature, a Congress, that is completely committed to supporting them, which means that, fundamentally, power has not shifted to people, and power has not shifted to the public interest or the interest of democracy. And then you add on Zuckerberg's speech at Georgetown, where he said that he's standing by Section 230 of the Communications Decency Act, which basically says that tech companies are not liable for the content that they carry. So, anybody can say anything from a First Amendment perspective, and they're not going to do anything about it. And then we saw that in subsequent questioning.
So, my outlook in 2020 is that we have to save our own democracy for ourselves. That power is with the people, not with government, and the tech industry is not going to turn away lucrative political advertising just because we have an election. They didn't in 2016. We have evidence of that. They actually were alerted to the fact that Russian ads were being bought as early as 2015, and did nothing about it, because it increases the bottom line. That's very sad, because as an immigrant to this country, as someone who truly believes in the promise of America, I would like to think that our democracy would be worth more than quarterly results. But unfortunately, I'm not seeing that.
Blomster: Finally, for young law students who want to get involved in this kind of work, working in the space that you are, what's the biggest advice that you can give them as they're growing, coming into their own, and turning into the next legal leaders, if they want to get involved in the kind of work you're doing and become leaders in that space?
Nkonde: So, the first thing I would say is come one, come all. You know, I'm based at Harvard Law School this year through my fellowship, and one of the things I really like about UW is that there seems to be a concerted effort to think about the public good. That doesn't necessarily exist in elite educational spaces. And I would say, number one, develop really, really good relationships. Because if you're looking to influence policy, policymakers listen to people that they know. And they also listen to people that they perceive as being friends. It doesn't matter what law school you went to; 90% of the people I worked with in Congress came from Harvard or Yale, but they weren't necessarily public focused.
The second thing that I would say is, find a niche, find something that you're interested in within technology, and publish. Get your name into a student journal or into another journal, or hook up with a professor and publish with them, because you want to be discoverable. When you go and say that you want to do this work, you want to be in a position where somebody can Google you and a track record of that work will show up.
The third thing I would say is get on to Twitter and retweet and DM and like the people that you like. Get them to pay attention to you and develop relationships with them, because I'm what's called a public interest technologist, which is a new field in technology that's supposed to be an alternative to going to work for industry. It's being ramped up now. AI for the People is one of the organizations that would fall within that remit, and what's unique and different about us is, we work with film production companies. I'm about to do a meeting, when I go back to the East Coast, with hip-hop artists, like people that do ciphers, because I was watching a late-night show and Nicki Minaj did a freestyle with "edible arrangements" as one of the words. And I was like, "Well, if she can do it with edible arrangements, she can do it with an algorithm. Like, I mean, you know, 'algorithm, what, make a decision, hey.'" Like, whatever, you know. So how can I get to the people that are writing those songs to talk about "my house is watching me"? And I write a bunch. So, I write op-eds, and I'm going to start doing more academic writing. But it gets the name out there. And that's really the vision that we're looking for.
So, I'm just one organization. There are others that do traditional research, where you can go and be a postdoc, you know, you can have a career there. There are others that are looking at impact litigation. So, particularly for law students that want to litigate, you can go to EFF and other places. But start looking for these alternative places and personalities that are doing the work. And if they have an organization, reach out to them, because if they're anything like me, interested in actual people and not necessarily focused on the academy, they're going to be really interested in law students that are also public interest facing.
Blomster: Mutale Nkonde is a fellow at the Berkman Klein Center for Internet and Society at Harvard University. And she is the founder of the nonprofit AI for the People. She is here with us today as part of a special tech talk sponsored by the University of Washington Tech Policy Lab. You can learn more about her work at aiforthepeople.org. We'll put links and more in our show notes. So, check that out.
Mutale, thank you so much. This was a lot of fun.
Nkonde: Oh, no problem. Thanks for inviting me.