This week Audrey and I chat about Waymo v Uber, Algorithmic Violence, and how YouTube’s algorithm distorts truth. Enjoy!
- [00:57] Recompiler: Issue 9 (Hard Problems)
- [05:07] The Waymo v. Uber trial: greed, ambition, and robot cars – The Verge
- [07:41] Trade secret | Wex Legal Dictionary
- [17:39] How the Frightful Five Put Start-Ups in a Lose-Lose Situation – The New York Times
- [25:55] On-Algorithmic-Violence: Attempts at fleshing out the concept of algorithmic violence
- [32:35] ‘Fiction is outperforming reality’: how YouTube’s algorithm distorts truth | Technology | The Guardian
- [36:13] AlgoTransparency
- [53:48] Bringing the blog to a close | Geek Feminism Blog
Issue 9: Hard Problems, now shipping!
Our first issue of 2018 focuses on the hard problems we try to solve in our work and our careers. Get it in the shop now.
Now Broadcasting LIVE most Fridays
We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune-in.
We love hearing from you! Feedback, comments, questions…
We’d love to hear from you, so get in touch!
CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.
Episode 49. This week, Audrey and I chat about Waymo versus Uber, Algorithmic Violence, and how YouTube’s algorithm distorts truth. Enjoy!
We’re recording what will be Episode 49, seven squared.
CHRISTIE: And I’m Christie. Audrey is also here with me. Hi Audrey.
CHRISTIE: Any announcements for us?
AUDREY: Well, I know everyone’s heard this the last few weeks: we have Issue 9 in the shop. But it’s still true. And we’re working on some of the design for that and getting it ready to print.
CHRISTIE: Awesome. So shop.recompilermag.com is where you can go to ensure you get a copy of that.
AUDREY: And subscribe if you want to read it all year. This one is on Hard Problems, but we’ll be following with issues on science; love and romance; and machines and things, which is machine learning and the internet of things. So there are some really cool, interesting topics coming up.
CHRISTIE: Really good stuff. And you can buy a gift subscription for somebody else. So if you’ve already got one and you think those topics might be of interest to somebody else, it could make a good Valentine’s gift with that love and romance issue coming up.
AUDREY: For sure. We’re going to talk about matching and the ways that algorithms and technology affect those areas of our lives.
CHRISTIE: Awesome. Not an announcement per se, but we kind of blasted through an anniversary. We were working so hard, we didn’t even notice.
AUDREY: I think you just pointed it out to me the other day.
CHRISTIE: I think I was just looking at the list of episodes, but we’ve been doing the podcast now for two years. I’m a little shocked at how much time has gone by. Hopefully, it shows a little bit in the quality of the show. I didn’t go back and listen to check that myself. And also, this is going to be Episode 49, which means we’re almost at 50, which is kind of a milestone for me too. So, yeah. I didn’t really prepare a retrospective or anything but I did kind of want to just note it.
AUDREY: Yeah. And I mean, we’ve covered a lot of different topics, and I feel like doing this podcast every two weeks, then every week, has really helped me see a narrative about what’s going on around us, the kinds of topics that we keep bringing up: that there are just a lot of ways technology pushes us that we don’t like, and that understanding the details helps us take action and be better advocates. And these things are just so pervasive. I was realizing that for as much as we talk about security problems, we tend to talk about low-level security problems and sort of high-level structural impact. And I thought that was really interesting.
CHRISTIE: Yeah, I was noting that every week when I sort of say what episode it is and what we’re going to talk about, it tends to be these sort of heavy, dystopian topics. Then I go, “Enjoy!” And it feels kind of funny, I don’t know. Hopefully, there’s a little bit of fun in our podcast.
AUDREY: Personally, I get very excited about understanding something even when looking at it is uncomfortable.
CHRISTIE: Right. And I know we make each other laugh. So hopefully, some other people find us a little funny too, not in a derisive way.
AUDREY: At least there’s the titles.
CHRISTIE: Right. Well, here’s to another two years. If we keep going at our weekly pace, we’ll hit 100 faster than we hit two more years. But here’s to another couple of years and another 50 episodes.
CHRISTIE: All right. I don’t know, did you see my message this morning or any of Sarah Jeong’s tweets yet?
AUDREY: I saw your message. I didn’t have a chance to really…I just briefly looked.
CHRISTIE: So, the first thing we have been planning to talk about was the Waymo versus Uber trial. And the first thing I see this morning when I’m having my breakfast is that they’ve settled. On the fifth day of the trial, they settled.
AUDREY: After a whole week of us learning about just how ridiculous Uber is. I mean, we knew this, but actually seeing the emails and seeing the lack of insight that they have about what they’re doing and how they’re doing it is interesting.
CHRISTIE: Yeah. So I don’t know. What was I getting out of this? I think I was experiencing a lot of [inaudible] about it. Like it was different than Google v. Oracle, where I thought there were interesting technology questions being thought through. But the Waymo-Uber thing was just sort of examining the weird culture of Silicon Valley and the way that people shuffle between companies, and the way that, and I’m putting this in big air quotes, the sort of “Biz Dev” people approach things and move through their careers, such that we get things like: can they play a clip from Wall Street, the Gordon Gekko “Greed is Good” speech, right?
AUDREY: Right. Is that relevant? And I guess watching Sarah Jeong tweet about it all week, one of the things that I kind of got a sense of was that you could start with the embarrassing stuff. Like there was so much to embarrass Uber over, like looking at their conversations and these emails and texts and everything that came into it. They could start with kind of the embarrassing material and make them really uncomfortable and then get into the ‘so here’s what we know’, like the details of whether it could be traced, that materials were brought over.
CHRISTIE: Right. And Sarah Jeong and the other people covering this…I think she had posted yesterday, “I’m not sure that Waymo can win this.” So they had started out with…so the main issue was about trade secrets. We haven’t really talked about that too much on the podcast. I think we tend to talk more about patents that come up as [inaudible] property, but trade secrets are a totally different thing. And it comes out of tort law, civil law, and it involves, basically, the secret sauce. The secret has to not be known to the public and you have to take reasonable steps to keep it secret.
AUDREY: You can’t open source your trade secrets and still ask for the kinds of protections.
CHRISTIE: Correct. Or if you can buy something off the shelf and reasonably reverse engineer to get the trade secret, that also is not protected.
AUDREY: So in the case of a self-driving car, we can’t buy one and they haven’t posted or open sourced these trade secrets. So, assuming that they have OK internal security practices, then that would make them continue to be trade secrets, right?
CHRISTIE: Yeah. So the first thing that came up was that Waymo initially alleged hundreds of trade secrets, a much bigger volume. And when the trial started on Monday, they were down to eight. The other part that’s kind of interesting about covering this trial is that because it’s about trade secrets, whenever they’re talking about them in court, it’s under seal and the public has to leave the courtroom. So there’s this sort of black box around it, and they would refer to trade secret number 96 or whatever if they were referring to one in open court.
AUDREY: At the same time, the jury needs to understand the trade secrets well enough to know when a particular material references them.
CHRISTIE: The jury has access to it.
AUDREY: Yeah. What I mean is it’s relevant to their ability to be jurors to know what the trade secrets are, right?
CHRISTIE: Yes. And there’ll be a component of explaining what the trade secret is and explaining that it is a trade secret. Actually, part of what Waymo was admonished for is basically not going into enough detail about that. Like they only reserved 15 minutes, I think, of their opening remarks. And then later in the public portion, the judge also admonished them for including basically promo videos as part of that. He was basically like, “You’re trying to make it seem to the jury that you invented LIDAR, which you did not.” I guess where I’m going with all this is that it seems like as the trial started to unfold, like you were saying, Waymo trotted out the really embarrassing stuff. But then maybe as it got closer to, “Oh, we actually have to prove our case,” it started to weaken.
AUDREY: They just didn’t have enough evidence that anything was definitely stolen. Everybody could see that there might be opportunity.
CHRISTIE: And it was pretty clear that Levandowski downloaded information, but it’s not clear that that information got to…he downloaded that information and then deleted it, acknowledged that Uber shouldn’t have it. There seems to be very little evidence that it got to Uber.
The other kind of layer to this is that in the process of Uber purchasing Ottomotto, did you know it was actually Ottomotto?
AUDREY: I didn’t until I read the first day’s account. Yeah.
CHRISTIE: I don’t know why. I’m like, “Ottomotto?” Anyway.
AUDREY: When you first started talking about it, I didn’t realize that it’s o-t-t-o, Otto, which is what they named their car company. So yeah.
CHRISTIE: Yeah, because we talked about Otto because remember they made that announcement about the center in Pittsburgh? We talked about them before they were bought. Yeah. And I was like, “That’s a weird name.”
AUDREY: Yeah. And the CMU partnership did actually come up in the course of the trial, I think, as something…now I’m trying to remember…something that maybe Travis said that Google was jealous of or something goofy like that.
CHRISTIE: I saw something in the context of recruiting engineers and where the experienced people were, things like that. It might have been that they had trouble staffing that center because ultimately everyone wanted to move to Silicon Valley, which shocks me.
AUDREY: Maybe they just didn’t want to be in Pittsburgh.
CHRISTIE: That’s true. I have never lived in Pittsburgh.
AUDREY: I hear it’s really nice.
CHRISTIE: So, when Uber was going through the process of purchasing Ottomotto, there was this due diligence stuff and they indemnified Levandowski against any litigation or whatever. So I think that’s why Waymo’s not going after Levandowski directly.
AUDREY: Aside from that affecting their ability to bring him in, didn’t he take the fifth?
CHRISTIE: They thought he was going to. He never…
AUDREY: Oh, it never actually happened.
AUDREY: So that would have complicated things a lot, too.
CHRISTIE: Right. I think another thing that stands out for me is that it’s good for anyone to have an awareness of just how much forensics your employer can do on the equipment they give to you. And that stupid banter may end up in a court case sometime.
CHRISTIE: And that if you check out your whole company’s or project’s [inaudible] repository right before you leave, someone might pay attention to that. In fact, for a few of the employees that left, Google actually didn’t put the equipment back into circulation. They flagged it and set it aside in case it needed examination later.
AUDREY: They were concerned about the possibility.
AUDREY: The resolution is really interesting, right? That Google has now taken a significant stake in the company.
CHRISTIE: I guess it depends on how you define significant. I think it was 0.3%.
AUDREY: No, like in cash dollars?
CHRISTIE: It’s a couple hundred million. But it is a stake and Uber is pre-IPO, so that could end up being much more, I guess. But I think more importantly, it seems like penance that makes sense. You know what I mean? To me, like, “OK, you kind of poached our employees and maybe Levandowski took some stuff and he’s going to build on that, but we can’t really prove it.”
AUDREY: Look where he’s at now? They fired him.
CHRISTIE: They fired him, yeah. There is testimony now of Travis Kalanick saying, “We hired him because we thought he was a great technologist and he had a lot of charisma, but maybe it turned out not so much.”
AUDREY: There did seem to be kind of an implicit thing here about him being a pain in the ass. I mean, maybe just because they got sued over his actions, but still.
CHRISTIE: And also maybe he didn’t. I don’t know. I think a lot of people can get by on that, having more charisma than actual skills.
AUDREY: Oh, sure. Yeah.
CHRISTIE: I don’t know. I’m not making an assertion. I don’t know. I don’t know the guy, but that was a little bit of what was implied.
AUDREY: Yeah. I guess the main point that I saw people making all week was that we have a collective stake in whether people can move from company to company. It impacts all of us if working on a really high-value technology means that it’s difficult for you to go to another company. So, if they had won on really shaky evidence, that would’ve had a chilling effect.
CHRISTIE: Right. I think this is a better outcome in terms of precedent. Yeah.
AUDREY: Yeah. I know that it’s not like a controlling interest or whatever, but I think Sarah Jeong tweeted after they posted their wrap up that consolidation is possible and really likely. It’s very reasonable for Google to want other companies to try the same problem as long as it doesn’t affect them business-wise, if they can just buy out the solution later. If they can let somebody else develop a good technology for them.
CHRISTIE: Right. Let other companies take the risk and we’ll just buy the winners.
AUDREY: And that is a lot of what happens in Silicon Valley consolidation in general. And I saw something, I wish I could remember where, a while back that was talking about how innovation often doesn’t happen within Google. It happens at startups that maybe even get funding from them, and then they get brought into the company. So there’s kind of a pattern there. I think it was probably in the context of exits, right? Funding and exits.
CHRISTIE: I know what you’re talking about. I saw the same thing. Basically, the percentage of companies that have non-acquisition exits has gone radically down; there are fewer startups, and fewer of them exit without being bought by a bigger company.
AUDREY: Yeah. And so the impact is like a company like Google. And I mean, any of the big five or six players could conceivably do this. But that they can fund startups, watch them, buy the best ones, let the rest fail because VC actually has a lot of failure built into it and reap the benefits.
CHRISTIE: Yeah. And it’s again, the power of monopoly, especially at the scale.
AUDREY: Yeah. And I mean, how could you compete with that? If Google can decide that the least embarrassing thing is to spend millions and millions on a stake of Uber, then yeah.
CHRISTIE: I think there’s a slightly…I mean, there’s two things. One, if we’re talking about mobility for engineers and other technologists, I think there are some clear things to learn from here, like: don’t plot to this extent while you’re still at your current employer; end the relationship with your current employer before you start a relationship with a new one. There’s always a little bit of overlap there, right? Like most people interview, but I think this kind of took it a little too far.
AUDREY: Maybe you should be, like, meeting with the executives to sell them on how awesome you are before you’ve really made your way out of the previous company.
CHRISTIE: Yeah. And then the other thing about startup exits, part of me wants to say, be OK with having a lifestyle business, right? Like don’t take on the kind of venture capital where you need to have an exit. And then I’m wondering or how possible, how successful can you be if Google and the other big companies are around and they’re just going to come to you and swallow you up, or make competition impossible for you. Like I think there is a ceiling there if you really do threaten the space they want to be in.
AUDREY: Yes, definitely. I think the stuff that we’re remembering reading got into that too. And there are things that you just can’t do without a lot of funding. I’ve definitely learned directly the limits of what I can do with the amount of funding or resources that I have. There are kinds of projects that I can’t take on without a 10x increase in funding. And so, given that, it’s sort of hard to strategize, you know what I mean? To have a good strategy around this for yourself as somebody who wants to build a thing. You can either scale your ambitions appropriately, to not compete with Google and to not need the kinds of resources that require VC, and you can do something really interesting within that. But if what you want to build in the world cuts into one of those two areas, then you have to engage with a system that might only give you one or two good options.
CHRISTIE: Right. And if you want to reach an audience of a certain size, you’re going to want to be compatible with Facebook or with these other big systems and then there’s that. You’re tied to them, right? So, that makes it even harder.
AUDREY: The more that I understand about Facebook’s impact on traffic and on advertising, I’m kind of amazed that I was able to take The Recompiler as far as I have with no Facebook presence.
CHRISTIE: Yeah, part of me still wants to believe you don’t have to be on Facebook if you want to run a business. But you make a compelling case that that’s not true.
AUDREY: Yeah. Well, I don’t know. I think I’m still learning whether that’s the case or not.
CHRISTIE: I just ran into another case where a product we use at work has its main customer interaction forum on Facebook. And I didn’t want to join the group, but I wanted some insight and some interaction with the other customers. So again, these companies have really leveraged their platform network effects and you just kind of have to go along with it.
AUDREY: Yeah. There’s one last thing that I wondered about this trial, and that was seeing the settlement today, whether Google had kind of gotten into it with this outcome in mind.
CHRISTIE: If they had. Yeah.
AUDREY: I mean, it’s quite a negotiation then, right? Like, we’re going to make you look bad. We’re going to make everyone feel really uncomfortable and not trustworthy. And then, we’re just going to take a chunk of you.
CHRISTIE: And keep in mind, Travis was still CEO when they launched the lawsuit. So Uber has, I would argue…I mean, I was about to say Uber has changed materially, and I’m going to walk that back because that’s not necessarily the same thing. But having a different CEO is a big deal, especially since the new CEO put out a statement with a bit of contrition. It was sort of like: this has been distracting, we need to get past this.
AUDREY: That makes sense.
CHRISTIE: There was a lot of just, like, pettiness. One interaction particularly stood out to me. Did you read the one about the cheat codes?
AUDREY: Whether cheat codes were something permissible or not permissible? Are you taking advantage of something that you weren’t intended to, or…
CHRISTIE: Right. And I believe the Waymo attorney kind of ended it with something like: they allow you to not do the work, basically. It was just very kind of…there seemed to be a…I don’t know.
AUDREY: It’s kind of an interesting business school mindset, right? We’re going to find the shortcut, the loophole.
CHRISTIE: Well, and the fact that Google is kind of positing that that’s lazy. Where is that line? Anyway, put a bow on that. It’s done.
AUDREY: Until one of these players does something else egregious.
CHRISTIE: Till the next time. What’s the next trial we’re going to preside over?
AUDREY: Yeah. I don’t know. I don’t think I’ve heard about anything else coming up unless there’s some other aspect of Google and Oracle that will not die.
CHRISTIE: Oh, that one surprised me. All right. So, I shared with you this essay; it’s on GitHub. The person says they put it on GitHub because it’s a work in progress. It’s titled On Algorithmic Violence. I really appreciate this because I think it gives a term for a lot of what we talk about here on the podcast. Let me highlight some stuff from it.
So they started by saying that in 1969, Johan Galtung coined the phrase “structural violence” to refer to the ways social structures and institutions harm people by preventing them from meeting their fundamental needs. The forces that work together to inflict structural violence (things like racism, caste, colonialism, apartheid, transphobia, etc.) are often systemic and invisible and intersectional. But crucially, they become embodied as individual experiences.
So they introduced, or they remind us about, this term structural violence. And they say we’re overdue for a term that allows us to easily (if imperfectly) articulate some realities of the moment we find ourselves in today. They said, “We need a new phrase that addresses newer, often digital and data-driven forms of inequity.” So, they posit the phrase ‘algorithmic violence’ as a sort of first step. And they say, “Algorithmic violence refers to the violence that an algorithm or automated decision making system inflicts by preventing people from meeting their basic needs. It results from and is amplified by exploitive social, political, and economic systems, but can be intimately connected to spatially and physically borne effects.” Then the rest of the essay kind of points to different categories of things.
The author is Mimi Onuoha. I probably did not pronounce that correctly; I should have looked it up ahead of time, but I wanted to credit the author. We’ll link to it as well in the show notes. So, I’m curious to see how this gets fleshed out, and I think it might be useful for us to use that term or think about it.
AUDREY: We’re talking about the impact of algorithmic control over content, over policing, over access to all sorts of essentials and services.
CHRISTIE: Right. Algorithms have become part of the fabric of pretty much every aspect of everyday life, and they have real material impact. And it’s not just things like what you see on YouTube, which I would argue, and we’re going to talk about it in a minute, has a very real material impact. It’s also used in sentencing, criminal sentencing, and other things like that.
AUDREY: To determine risk, to determine applicability, yeah. We talk about so much how we have so little insight into where these sorts of things are being used and how they’re impacting us. And that’s a kind of violence too, to not have any view into or authority over the decisions being made around you.
CHRISTIE: Yeah. They say, “Like structural violence, they are procedural in nature and therefore difficult to see and trace. But more chillingly, they are abstracted from the humans and needs that created them (and there are always humans and needs behind algorithms that we encounter every day). They occupy their own sort of authority, one that seems rooted in rationality, facts, and data, even as they obscure all of these things.”
That really stood out to me because we’re starting to talk about the problem of algorithms and the lack of transparency around them more in the public discourse. And the general response from the companies that run the algorithms and have the data is sort of hands-off. They’ll deny it works that way, and then they’ll be like, “It’s the algorithm.”
AUDREY: The thing that I keep coming back to around machine learning is that we can create systems that no one can understand. It’s not just about a programmed algorithm, like how you make a batch of cookies. We talked about that before. But imagine that you put in ingredients and get a cookie at the end without actually knowing how. You told it that you wanted a cookie, but you don’t know what happened in the middle to make that occur. So, it makes control and oversight so much harder.
CHRISTIE: I just realized the repository has some readings in here. So yeah, we’ll link to that. Violence, I feel like it’s one of those stop words.
CHRISTIE: And then it can really create a reaction in people. I don’t want to use this term to be inflammatory, but I actually think it’s really what we need. I think we need a term that is very clear about the impact.
AUDREY: And acknowledges harm to people that aren’t us. I mean, I think part of why calling it violence is a really good move is that it helps us, not just center ourselves and our experiences, but the impact that the system may have on a lot of other people.
CHRISTIE: And building upon a prior idea of structural violence, right? The essay concludes with, “Not because it is exceptional, but because it is ubiquitous. Not because it creates new inequities, but because it has the power to cloak and amplify existing ones. Not because it is on the horizon, but because it’s already here.” So, it’s like sort of a new iteration of a thing that we already knew existed.
With that idea of algorithmic violence in mind…
AUDREY: We’re going to talk about YouTube.
CHRISTIE: Oh, yeah. We’re going to talk about YouTube. The Guardian…what is the name of the article?
AUDREY: Fiction is outperforming reality: how YouTube’s algorithm distorts truth.
CHRISTIE: Yes, by Paul Lewis. This is a pretty lengthy article. I will encourage everyone to read it.
AUDREY: It’s a really in-depth exploration.
CHRISTIE: It is, and it builds on…so we had previously…did we talk about YouTube and the weird kids’ videos?
AUDREY: Yeah, we did.
CHRISTIE: OK. So, it builds on…not builds from that, but other people have been interested in YouTube’s recommendation engine and how it works and how it’s affecting people’s perceptions of things. So, the subtitle is ‘An ex-YouTube insider reveals how its recommendation algorithm promotes divisive clips and conspiracy videos. Did they harm Hillary Clinton’s bid for the presidency?’ Paul Lewis starts off talking about the Logan Paul video from Japan and how watching that took him down a rabbit hole of increasingly disturbing videos. So that was kind of the intro.
Did you know there are 1.5 billion YouTube users in the world, which is more than the number of households that own televisions?
AUDREY: I was not aware of that.
CHRISTIE: So, YouTube has this recommendation algorithm. It’s the single most important engine of YouTube’s growth. We don’t really know how it works. But more and more, we’ve been noticing the weird stuff that happens, like conspiracy theories about mass shootings, the weird children’s stuff. The one…I don’t even remember the name and I’m not going to say it even if I could. But the dad with his two daughters and those disturbing videos that…I think he had to take down his channel or they took it down. He defends it to the reporter: that’s what got them out there and popular. He says, “We learned to fuel it and do whatever it took to please the algorithm.” That is extraordinarily chilling to me, coming from a parent.
CHRISTIE: And YouTube/Google basically responds with a kind of whack-a-mole: once things are raised by journalists, they take them down. They say they have more human moderators, but it’s really just barely scratching the surface.
AUDREY: One of the things I think is really interesting here is that, I mean, YouTube has so many videos. And maybe I couldn’t just automatically figure out how to search for a conspiracy theory or something that really has this kind of hate speech. But even with that, it’s fairly easy for somebody to examine this at work: just pick a topic, let the auto-recommendation thing work, and let it play for a while.
CHRISTIE: Yes. And that’s what this ex-YouTube engineer did. After he left, he built this tool, and it looks like he used it around different [actions] maybe. But he built a tool that basically runs searches from a seed search, something like climate change or Hillary Clinton or whatever, and then just records what comes up next, in the…what’s next, I can’t remember what it’s called exactly.
AUDREY: Up Next.
CHRISTIE: The Up Next, yeah. What the recommendation engine plays and it’s auto-play. A lot of times if you’re not super quick on the draw, they just go and then you’re sucked in.
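The crawl Christie describes boils down to: seed a query, follow the Up Next chain the way auto-play would, and tally everything the engine offers along the way. As a rough sketch of that logic (not Chaslot's actual code; `MOCK_UP_NEXT` here is a made-up stand-in for the real recommendation engine, which his tool queries by scraping YouTube):

```python
from collections import Counter

# Made-up stand-in for YouTube's recommendation engine:
# maps a video (or seed search) to its "Up Next" list.
MOCK_UP_NEXT = {
    "seed:climate change": ["vid_a", "vid_b"],
    "vid_a": ["vid_c", "vid_b"],
    "vid_b": ["vid_c"],
    "vid_c": ["vid_a"],
}

def crawl_up_next(seed, depth=5):
    """Follow the top Up Next recommendation `depth` times,
    counting every video that was recommended along the way."""
    counts = Counter()
    current = seed
    for _ in range(depth):
        recs = MOCK_UP_NEXT.get(current, [])
        if not recs:
            break
        counts.update(recs)   # record everything the engine offered
        current = recs[0]     # auto-play: follow the top pick
    return counts

counts = crawl_up_next("seed:climate change")
```

Aggregating these tallies over many seeds is what lets you ask whether the engine systematically skews toward one kind of content, regardless of where you started.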
AUDREY: Yeah, I try to remember to uncheck that box as soon as I open YouTube.
CHRISTIE: “The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy.” He said when he was there, the algorithm was not static. They were constantly tinkering with it, and they preferred changes that increased ad revenue by extending the amount of time people watch videos. Watch time was the priority. Of course, Google now denies that this is true or that this is still the case.
AUDREY: It’s not like they deny the overall phenomenon because that’s maybe harder for them to refute, but just things like, “Oh, we aren’t prioritizing watch time anymore. We have way more complex ways of deciding what people should see.”
CHRISTIE: Right. And we’ve made changes to suppress fake news and improve the quality and diversity of what people see. So, this engineer wrote a tool that collects information about the recommendation engine, and there’s a database, and that’s part of what this reporter looked at. And part of what they found is that, basically, the recommendation engine appears to be biased towards fake things, like sensationally fake things. And that no matter what seed word you started with, you got more pro-Trump videos than pro-Clinton videos.
AUDREY: The extremism. It’s not just that it takes you to an extreme view, but it will take you to extreme views in a particular direction.
CHRISTIE: Yeah. So algotransparency.org is this engineer’s…Chaslot, I don’t know how to say his last name…site. The reporter says, “Each study finds something different, but the research suggests YouTube systematically amplifies videos that are divisive, sensational, and conspiratorial. Wherever you started, whether it was from a Trump search or a Clinton search, the recommendation algorithm was much more likely to push you in a pro-Trump direction.”
AUDREY: That’s really interesting. And I think the more favorable to YouTube argument is that this is all just reflecting our own biases. This is reflecting what people click on, that they would personally make the choice. It trains the algorithm. You look at it and you go for the more divisive things, but that’s not necessarily the case, right?
AUDREY: It’s entirely a human generated phenomenon.
CHRISTIE: Right. And how do you answer the objection that people are influenced by the recommendations? “YouTube presumably never programmed its algorithm to benefit one candidate over the other. But based on this evidence, at least, that is what happened.”
“The spokesperson says: our search and recommendation systems reflect what people search for, the number of videos available, and the videos people choose to watch on YouTube. That’s not a bias towards any particular candidate; that is a reflection of viewer interest.”
AUDREY: And creator interest, right? If they are arguing that they’re just that many more pro-Trump videos than pro-Clinton ones or whatever…
CHRISTIE: Right, the number of videos available.
AUDREY: But we have no way to count that. I mean, they can make any assertion about that that they want.
CHRISTIE: Yeah. “How does YouTube interpret ‘viewer interest’ and aren’t ‘the videos people choose to watch’ influenced by what the company shows them?”
Let’s see, what else? “But why would a bias toward even more weird or divisive videos benefit one candidate over another? That depends on the candidates. Trump’s campaign was nothing if not weird and divisive.”
I think people on different sides will argue, but my point of view is that the pro-Trump people put out way more of this stuff than the pro-Clinton people. And so, they benefit from an algorithm that is skewed toward weird and divisive content.
AUDREY: Yeah. And I’m thinking about how chaos benefits some political views over others and conflict and uncertainty benefits some political views over others. Like if we’re talking about climate change, then you don’t necessarily have to convince people that climate change isn’t happening. You just have to convince them some combination of we don’t know, we can’t control, it doesn’t matter anyhow, and nobody’s really sure. And that’s a much easier thing to accomplish than to convince people that it’s real, it’s serious, it’s happening, it affects everybody, and we have to take action right now.
CHRISTIE: Right. Of course, it’s not surprising that they talked to Zeynep Tufekci for this. It says: Tufekci points to studies showing that the ‘field of misinformation’ largely tilted anti-Clinton before the election. “Fake news providers,” she says, “found that fake anti-Clinton material played much better with the pro-Trump base than did fake anti-Trump material with the pro-Clinton base.” That ties back to your point about which points of view chaos supports. She adds, “The question before us is the ethics of leading people down hateful rabbit holes full of misinformation and lies at scale just because it works to increase the time people spend on the site – and it does work.”
So basically, saying ‘it’s just the algorithm’ is not a defense. It’s not an ethical defense.
AUDREY: Right. It’s still your algorithm, it’s still your website, it’s still content you promote.
CHRISTIE: You’re still responsible for the impact you have in the world.
AUDREY: Yeah. But algorithms are this distancing, money-laundering kind of thing.
CHRISTIE: And then, the article starts to tie it back to Russian influence on the election, and how the discussion has tended to focus on Twitter and Facebook. But it’s starting to become clear, or at least there’s good evidence, that YouTube played a huge role in that, and that there were compounding factors. Like, Russian bot networks on Twitter promote YouTube videos, and then the recommendation engine promotes them further.
AUDREY: Yeah. These aren’t isolated pieces.
CHRISTIE: There is this really disturbing bit in here about this one video: “This Video Will Get Donald Trump Elected”, a viral sensation that was watched more than 10 million times before it vanished. They found a copy of it, and I don’t know what prompted them to do this, but when they played the video in slow motion, they saw that it contained weird flashes of Miley Cyrus licking a mirror. I thought I had already been as disturbed as I could get about YouTube. That added a layer…
AUDREY: That is fairly strange.
CHRISTIE: The reporter talked to some creators and got them to send screenshots or downloaded reports of their traffic, and it basically confirmed what makes these videos take off…like, the referrals come hugely from the recommendation engine, whereas direct traffic will be like 3%.
AUDREY: I’ve been watching more stuff on YouTube lately. It’s really interesting to see, for very non-controversial things like travel videos, it’s [inaudible] whatever. For fairly non-controversial things, I’ll search for something, look at the first six results, and see that some of them have way more traffic than the others. And some of it’s about the channel. Like, the reason they push you to subscribe is that it’s like getting people to like your Facebook page: then they’ll actually see the content when they log in. But there are also these really odd peaks and dips that don’t seem to be correlated with quality at all, just something to do with how the content links together on the site.
CHRISTIE: Peaks and dips, meaning a video not of very high quality will have a lot of hits?
AUDREY: Yeah, or moderate quality. But I’m thinking, like, I found five videos reviewing the same object and they’re all kind of similar: the lighting’s OK, the sound’s OK, a different person in front of the camera in each one. But yeah, some of them will have like twice or three times as much traffic as another one. And their channels are all kind of similar too. So you can tell that there’s some other element there.
CHRISTIE: A long time ago…every culture had some sort of religious structure and some sort of theology, and people would have ceremonies and rituals and sacrifices to their gods, different gods to ask for different things. And I feel like our algorithms are the new gods and we have to…
AUDREY: We have to use them.
CHRISTIE: Yeah, we have these rituals and things that we do. And we don’t really know how they work, but we all feel drawn to figure it out so that we can run our businesses or whatnot. I don’t know. It’s odd. It’s a little overwhelming.
AUDREY: It is extremely, extremely odd. And yeah, as someone who sits here and tries to figure out what those elements are, it’s confusing. There are some very predictable ways that marketing can work, marketing in the sense of getting people to come to your site. But at the same time, there’s a lot of that uncertainty too: finding the right topic that’s just sticky enough, getting the right headline in there, getting it posted to the right site at the right time. Because if I post on Twitter at 3:00 AM and the only people who are interested in that thing are in the United States, I’m just not going to get anywhere.
AUDREY: And it’s just all of those kinds of things. Geez, I just saw something talking about Instagram and the Instagram algorithm, the way that it shuffles the photos in your feed now. It said, if you don’t get likes in your first 10 minutes, forget it.
CHRISTIE: Oh, my God.
AUDREY: Because it’ll bury your post. So, if you don’t get likes in those first 10 minutes, there’s a good chance that people following you won’t even see it.
CHRISTIE: We had a precursor or a lead up to this which was Google’s search engine. Ten, 15 years ago, people started getting really interested in how that worked. And Google would roll out a change and suddenly your referrals will drop off.
AUDREY: And there were blogs that were dedicated to watching those changes and warning you. [Crosstalk] are going to see a really sharp drop.
CHRISTIE: Right. And I think there still are. The problem is that there are so many other players now. You have to worry about Twitter, and Facebook, and YouTube, and Instagram, and God knows what else. I can’t keep up.
AUDREY: Yeah. You can’t just look at your search engine optimization anymore.
AUDREY: Social media really does affect that.
CHRISTIE: To me, it’s becoming more and more obvious that we need oversight. There has to be. And I don’t think open source alone does it, because a lot of people are like, “Oh, it should be open source. It should be open source.” I’m like, “That’s not enough. We need a dedicated oversight body that can look at the algorithms, look at the data, and provide some kind of regulatory feedback, correction, adjustment, accountability mechanism.” I’m just going to make noise on podcasts because I’m, like, banging my fist on my desk as I say this. So, I guess I’m a little worked up.
AUDREY: Well, yeah. It’s so serious, right? It’s easy to trivialize this, but it is really serious when it affects our views, our understanding of the world, our ability to make a living, our ability to share our voice and our perspective.
CHRISTIE: Or that we’re making a thing we can’t control, too.
CHRISTIE: I mean, I don’t think the people at Google fully understand how this works.
AUDREY: There’s this kind of odd trend of former engineers and VPs and whatever starting to be really uncomfortable with what they’ve done. This is another thing that I see on The Guardian every couple of weeks: there’s some new organization that some of them are starting, to help us understand how our kids’ brains are affected by so much internet, or something along those lines. And their recommendations aren’t like, “Oh, let’s regulate it. Let’s have a government body that’s responsible for this.” It’s things like, “Turn your phone to grayscale so that you’ll find icons a little bit harder to read and it’ll slow you down, keep you from just click, click, click.”
CHRISTIE: Right. So, on happier news. There’s some things we like on the internet this week.
CHRISTIE: Mine, and I’ll have to find some specific things to link to, is that I didn’t have to watch the Super Bowl. With Twitter, I was able to just…
AUDREY: You never have to watch the Super Bowl.
CHRISTIE: I know. But I really wanted to know if the Patriots lost. Well, there were many reasons why I didn’t want to watch. One of them was I didn’t want to watch the Patriots win. So, I just checked in with the Twitter Moments or whatever, here and there. And I feel like I got to…I heard about the crappy Dodge Ram commercial that used MLK. I know about the Justin Timberlake Prince thing. And I know about that amazing play where Foles, the quarterback for the Eagles, caught a touchdown pass, I think making him the first NFL quarterback to throw a touchdown pass and catch one in a Super Bowl. And I got to shout with joy shortly after the Eagles did win. I was in the other room and Sherri was like, “Are you OK?” I’m like, “Yes! The Patriots lost.” She’s like, “I was hoping that’s what that was.” Yeah. And there’s weird stuff that comes up with the Twitter Moments stuff, but increasingly I’ve been liking it. I know if there’s a really big thing going on, I can just go to the search thing and usually get a quick rundown of what it is.
AUDREY: I started using it to try to figure out what people are sub-tweeting at any given time.
CHRISTIE: Oh, yeah. Does that work for that?
AUDREY: Sometimes, it depends. If it’s an article, yeah.
CHRISTIE: What’s your thing that you liked on the internet this week?
AUDREY: Well, it’s a little bittersweet. The Geek Feminism blog is officially shutting down. It’s been kind of winding down for a few years, but I thought this was just a nice opportunity to acknowledge what an impact it’s had, not just on the Recompiler, but on a lot of other expressions of feminism in our work. And so yeah, I’m really glad that it’s been here.
CHRISTIE: Yeah. So we’ll link to their farewell blog post. It’s not a hard farewell. The site will still be around.
AUDREY: And there’s a Wiki that’s collected a lot of information over the years and that’ll stay online.
CHRISTIE: Yeah. So we’ll link to that. I think it’s been an important community for a lot of people. I’ve been grateful for it over the years. And, you said bittersweet, but I also think when things come to an end, it’s a really awesome opportunity to, like you said, recognize the impact. And also, there will be new things.
AUDREY: It definitely acknowledges the new directions we’ve gone in, the ways that we’ve built on that original work.
CHRISTIE: All right, I think that’s our show. Thanks everyone for listening. Thanks Audrey for hosting again with me this week and talk to you all again soon.
CHRISTIE: And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to email@example.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.
The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.