Download: Episode 54
This week Audrey and I chat about YouTube’s announcement that it will link to Wikipedia, the New Yorker’s profile of Reddit, Spotify and copyright law, and more. Enjoy!
Show Notes
- [03:53] The Responsible Communication Style Guide is headed back to the printers!
- [06:36] Hire Christie!
- [07:06] YouTube didn’t tell Wikipedia about its plans for Wikipedia – The Verge
- [12:50] Phoebe Ayers on Twitter: “…It’s not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor…”
- [14:04] YouTube, the Great Radicalizer – The New York Times
- [15:51] kate conger on Twitter: “case in point: in YouTube’s statement, they don’t take a position on whether the moon landing happened.”
- [19:56] The Grim Conclusions of the Largest-Ever Study of Fake News
- [29:49] Reddit and the Struggle to Detoxify the Internet | The New Yorker
- [43:04] Rochelle on Twitter: “Analyzing videos for YouTube this morning. The rules are always changing…”
- [44:44] A $1.6 billion Spotify lawsuit is based on a law made for player pianos – The Verge
- [49:20] Dear Music Fans… by StartUp from Gimlet Media
- [51:16] Spooky action at a distance, how an AWS outage ate our load balancer
- [55:26] Janelle Shane on Twitter: “Does anyone have a picture of sheep in a really unusual place? It’s for pranking a neural net.”
- [57:31] Let’s Encrypt on Twitter: “Let’s Encrypt wildcard certificates and ACMEv2 are available today! More information can be found here: https://t.co/0SdH98Oabn”
Community Announcements
The Responsible Communication Style Guide is headed back to the printers!
When we sold out of print copies of The Responsible Communication Style Guide last fall, we promised to do another print run in early 2018. We’re happy to announce that we’re ready.
If you’ve been waiting to pick up a printed book (or enough for the rest of the office so they stop filching your copy), this is your chance. Order now!
Now Broadcasting LIVE most Fridays
We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune in.
We love hearing from you! Feedback, comments, questions…
We’d love to hear from you, so get in touch!
You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to podcast@recompilermag.com.
Transcript
CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.
Episode 54. This week Audrey and I chat about YouTube’s announcement that it will link to Wikipedia, the New Yorker’s profile of Reddit, Spotify and copyright law, and more. Enjoy!
We have started Daylight Saving. Stupid thing that it is.
AUDREY: Also known as the week everyone has jetlag.
CHRISTIE: Right. And you don’t even get the fun of going anywhere. That’s quite silly.
AUDREY: The only year that I actually managed to make both actual jetlag and daylight savings jetlag work for me correctly involved going to Colorado and then Austin within the course of a week. I managed to cross time zones on just the right days, so I didn’t feel the difference.
CHRISTIE: Is Colorado Mountain Time?
AUDREY: Yeah.
CHRISTIE: And Austin is Central. Time zones are weird. I understand why we have them. What’s really weird is, is it China that’s one time zone?
AUDREY: Yeah and there’s a couple of places actually. I just submitted a talk on time zones. So I can tell you that in general, time zones and geography are not the same thing. And there are a couple of countries that have done something extremely weird with their time zones such that a whole part of the country, whatever noon is, it has zero to do with what the sun is doing.
CHRISTIE: Yeah. I know things like Hawaii and Arizona…
AUDREY: Don’t do daylight savings.
CHRISTIE: Don’t do daylight saving. That moves Hawaii toward the top of my list of states to move to. Daylight saving would make even less sense there because they’re so close to the equator that the length of the day doesn’t change that much, which is one of the things I like about it. Spain aligned its time zone to match Germany’s during the war, I think. So they’re much later than they should be compared to the rest of Europe.
AUDREY: I just read a thing about that and it gets into this whole labor and gender kind of thing around how late the workday is and who is able to work into the evening like that. I thought it was really intriguing.
CHRISTIE: I did not like it. I was there for over a week on a work trip. And you know how much I do not like working in the evening, just because my energy is so low. And so this whole, “Yeah, we’re going to take a break at 4:00 and then come back and work until 7:00 and then have dinner at 10:00,” thing…this particular Christie was like, “No.”
AUDREY: I can totally understand that. It’s actually perfect for me. But yeah, I don’t think that that’s a very effective thing for most people. And again the gendered impact that I’ve read about is that it does take women out of the workforce.
CHRISTIE: Because of childcare.
AUDREY: Yeah, because of childcare.
CHRISTIE: Right. So welcome everyone, who’s tuned into the live stream. I’m Christie Koehler, this is Audrey Eschright.
AUDREY: Hello, hello.
CHRISTIE: Yeah. You got any announcements for us, Audrey?
AUDREY: I do. We are reprinting our book, The Responsible Communication Style Guide. We have sold nearly every copy of the first printing which is super-duper exciting. When we started planning out the book, we were like, “If we sell 500 copies, that’ll just be totally amazing.” And I think actually between [inaudible], we’ve reached that mark already. So we’re sending it back to the printer for another 300 paper copies. But in order to do that, we need preorders to help us cover the cost of the batch because perfect-bound printing…this is getting a little technical but it’s basically more expensive to do a perfect-bound book. Perfect-bound being like where the seam is flat. And so in order to do that on our little tiny company budget, we just need preorders to support the batch. So that’s what we’re looking for right now.
CHRISTIE: Because there’s too much content in there for it to be saddle-stitched, like stapled.
AUDREY: Yeah, for sure. It would bulge weirdly.
CHRISTIE: There’s probably a minimum for perfect binding too, isn’t there?
AUDREY: Kind of, yeah. That’s the thing that we hit that they charge a lot per copy if you are below a certain number. And 300 was a pretty good per copy price.
CHRISTIE: So there’s a lot of great resources in there. I was actually thinking about it this week with the passing of Stephen Hawking. I even saw an article today that was talking about how media outlets were struggling with how to talk about his disability and his achievements and his passing and all of that. So it reminded me of the need for things like the responsible…what’s the name of it again? I just forgot.
AUDREY: Responsible Communication Style Guide or as all of my notes say, the RCSG.
CHRISTIE: That’s what tripped me up. As I was looking at RCS, my brain was like, “Recompiler.” And I’m like, “I know that’s not what it’s called.”
AUDREY: I mean, the distinction between wheelchair-bound and wheelchair-using, where using a wheelchair is like using glasses or a toothbrush.
CHRISTIE: Yes. I appreciate the glasses analogy; I use it a lot for other things, too. So, shop.recompilermag.com is the easy-to-remember URL, and we’ll link to both the announcement with the details and the shop link in the show notes.
AUDREY: And if you want to know more about the book in general, you can go to RCStyleGuide.com.
CHRISTIE: Any other announcements?
AUDREY: That’s our big one.
CHRISTIE: I got a quick one. I am looking for my next full-time gig. I’m primarily looking for either software engineer or developer advocate. And you can find out more about me on my website, ChristieKoehler.com and we’ll have a link in the show notes. So, if you’re a tech company or have a tech department and you hire for those things, check that out.
Our first topic: YouTube and Wikipedia. When did this…? This just happened like on Wednesday or something, right?
AUDREY: Yeah, I think it was at South by Southwest.
CHRISTIE: Or maybe a little earlier.
AUDREY: YouTube gave a talk about what’s up for them. South by Southwest is actually that time zone trick I was talking about. I guess it probably always lines up with daylight savings time.
CHRISTIE: Yeah probably, at least since it was changed. But we established last episode that it has been changed for quite some time now.
AUDREY: It’s 2003 or something. It’s been 15 years.
CHRISTIE: And actually, you know this is big news because I have every version of the South by Southwest hashtag I can think of muted. Unfortunately, that doesn’t apply to promoted tweets. It’s just that I don’t really want to think about Austin right now. Everyone makes all these announcements there, and there’s always like…anyway, this is a big deal.
AUDREY: Yeah, there’s definitely a lot going on. And I think that YouTube’s updates might have slipped right in the middle of that without anybody noticing except that they did something kind of audacious with the statement about how they intended to use Wikipedia.
CHRISTIE: Yes. I don’t know…do I have the quote here from Susan Wojcicki? Anyway, the CEO of YouTube said onstage that they will start adding information from Wikipedia to conspiracy-related videos. The quote published in The Verge says, “We will show a companion unit of information,” it’s weird phrasing, “from Wikipedia showing that here is information about the event.” The company is using a list of well-known internet conspiracies from Wikipedia to pull from.
AUDREY: There’s a lot about this that seems flawed.
CHRISTIE: Yeah. The first thing is that they didn’t tell Wikipedia they were going to do this.
AUDREY: Right.
CHRISTIE: And there’s some issues with that.
AUDREY: Like I said, it’s pretty audacious to go to another organization and be like, “So, we’re just going to borrow everything that you did and stick it in a different context and not pay you for it.” And taking it out of the context where the work is supported and it’s free anyhow, isn’t it?
CHRISTIE: Right. So normally…I don’t know if normally, but quite often the way these things work is that you make some kind of partnership with the other side or the other service or you give them…maybe if there’s not an exchange of resources, there’s always an exchange of knowledge and a heads up.
AUDREY: Yeah.
CHRISTIE: And that evidently didn’t happen in this case. The Wikimedia Foundation confirmed that it didn’t happen. And there’s another compounding factor, which is that Wikipedia is pretty much run by unpaid volunteers. And also, anyone can edit Wikipedia. So I think there’s a very real chance that linking from YouTube to Wikipedia is going to increase traffic and potentially increase, if not outright vandalism, at least the number of edits that have to be monitored or discussed.
AUDREY: Right. It might increase the amount of, I don’t know what they call it, where the contents of the page flip back and forth, that kind of…
CHRISTIE: Reverts.
AUDREY: Yeah. It’s going to be more work for the editors. Actually, that even assumes that there’s a proper link to the content and not just pop up that completely isolates it from the page where it was taken.
CHRISTIE: And we see that a lot too. In a lot of search results, you get the little blurb box from Wikipedia. And I think a lot of people never click beyond that.
AUDREY: Probably not. I mean, if it answers the question that you had, then it’s right there.
CHRISTIE: I know this came up because I think the same thing was happening with MDN, Mozilla Developer Network references where just a bit of it would show up in search results. And what it did is sort of suck up content but not actually drive traffic.
AUDREY: And because Wikipedia has all of this volunteer writing and editing activity, not driving traffic means that you’re not driving contributions. You’re also circumventing their annual request for funding, the pop up that they do every year.
CHRISTIE: And they’re starting to do other messages…there was just one up about, I think, the censorship in Turkey. It’s not just at year’s end. Yeah, I favorited some really good tweet threads from different Wikipedia contributors. I thought this was really…”It’s not polite to treat Wikipedia like an endlessly renewable resource with infinite free labor; what’s the impact?”
AUDREY: I was just going to say that beyond that, it doesn’t actually help with the conspiracy theory problem.
CHRISTIE: Right.
AUDREY: It just supplies the metadata to it. Even if your goal is to get people looking at conspiracy theories to, I don’t know, know better, this won’t work. And we have even more information about that now.
CHRISTIE: Right. Basically, people are not as easily swayed by facts as we would like to believe.
AUDREY: And the chain of thinking that pulls people into conspiracies isn’t fact driven.
CHRISTIE: Right. Also this week, there are a couple of things where I feel like we just keep getting versions of the same message, like a flashing red light. There was the op-ed in The New York Times this weekend called ‘YouTube, the Great Radicalizer’, which seemed interesting in that it kind of echoed the rest of the news we’ve heard and watched. I feel like if you’ve been listening to podcasts and thinking about this stuff, it reads as a short summary: basically that YouTube, no matter where you start, pushes you in the direction of more radical, extremist, inflammatory content. And that’s the obvious result of Google’s business model and wanting to keep your eyes on the page so it can generate ad revenue.
AUDREY: When we talk about machine learning and the effect of these algorithmic approaches, unintended consequences happen all the time, because you get what you design for, what you control for. And if the only thing you’re looking at is how long it keeps people on the page, then you’re going to discover unintended behaviors like this.
CHRISTIE: Definitely. There is a quote that I favorited from a journalist who…I don’t know if they were actually at South by Southwest or watching a feed or whatever…I’m looking to see if I can find it…but it was something like YouTube wouldn’t…Susan Wojcicki wouldn’t even take a position on something like the moon landing.
AUDREY: I saw that, yeah. Like this isn’t helping.
CHRISTIE: It’s from Kate Conger: “Case in point: in YouTube’s statement, they don’t take a position on whether the moon landing happened. This is a quote: ‘At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing.'” That just strikes me as weird.
AUDREY: It’s super weird. I can’t imagine that the people that watched the moon landing conspiracy videos are even paying attention to the SXSW announcement. Even if it’s sort of pandering, it just doesn’t seem like a very useful context for it.
CHRISTIE: Right. To me, what it signals is that there’s no monetary incentive for them to remove conspiracy content. There’s really no incentive for them to take a position. Google’s mission is something like organizing the world’s information and making it available.
AUDREY: They’ve more recently stepped in the direction of calling themselves an AI company.
CHRISTIE: Right. And I think part of that is the realization that disinformation is just as lucrative as information. And how much can you really be fulfilling your mission to make the world’s information available and useful if you don’t give any signposts or clues about the quality of that information?
AUDREY: It does come off to me a little bit like she’s given up caring about those distinctions. You’re saying there’s no financial reason, but there are some enormous social, cultural, and political ones to care about this. I don’t know, I feel like those are incentives in the sense that whatever they’re building is probably not what they meant to build, probably not a very healthy environment, or just, I don’t know, not the world that Google wants to exist within.
CHRISTIE: But you mentioned pandering to the moon landing conspiracy theorists. It reminds me that when you do take a stand, there are consequences. So I think in addition to the monetary incentives for doing things the way they are, there’s also this other incentive to not draw negative attention. And I’ve seen it so many times, companies taking weirdly passive stances. I mean, we see Twitter do it all the time.
AUDREY: Yeah, because they don’t want to take responsibility, and they’ve convinced themselves that inaction is a neutral stance, and it’s not.
CHRISTIE: I guess we’re just back to this avoidance and shuffling of accountability.
AUDREY: Yeah. I don’t know. The other thing that I think we both read, about the impact of disinformation and propaganda and fake news, what it really gets into is that what makes all of those things stick and get shared so much is that they’re juicy. These are juicy kinds of information that we are attracted to even if we look at them and go, “Uh, no.” We’re still attracted to them, and they have so much power and so much reach because of that, reach that goes well beyond any reaction or response we have to news that we know is true.
CHRISTIE: Right. Are you referring to this thing in The Atlantic, The Grim Conclusions of the Largest-Ever Study of Fake News?
AUDREY: Yeah.
CHRISTIE: So this piece in The Atlantic that came out last week, by Robinson Meyer, is a summary of a paper published in Science magazine. I didn’t go through and try to find the paper; half the time I hit a paywall, which I think is partly why these types of summary articles exist. But basically, these social media researchers at MIT took a huge data set they got from Twitter and analyzed what kinds of tweets end up having the biggest, fastest reach, in terms of both breadth and depth. And it’s fake news.
AUDREY: Like how many people versus how many layers of sharing.
CHRISTIE: Yes. Thank you. I was trying to figure out how to say it. That’s a good way to say it. And falsehood almost always beat out truthfulness, with inaccurate information penetrating further, faster, and deeper into the social network. And some of it, they postulated, is because it’s novel. False information tends to be novel, and that’s something that humans seek out.
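(To make the breadth-versus-depth distinction concrete, here’s a toy sketch: each retweet cascade forms a tree rooted at the original tweet. The cascade data below is invented for illustration, not taken from the MIT paper.)

```python
# A toy illustration, not the study's code: depth is the longest chain of
# reshares; breadth is the largest number of nodes at any single level.
from collections import defaultdict, deque

# Hypothetical cascade: child -> parent (who reshared from whom)
parents = {
    "b": "a", "c": "a", "d": "a",  # three people retweet the original "a"
    "e": "b", "f": "b",            # two people retweet "b"
    "g": "e",                      # one more hop: the chain a -> b -> e -> g
}

def cascade_metrics(parents, root):
    """Return (depth, breadth) of a cascade given child->parent edges."""
    children = defaultdict(list)
    for child, parent in parents.items():
        children[parent].append(child)

    level_counts = defaultdict(int)  # level -> number of nodes at that level
    queue = deque([(root, 0)])
    while queue:
        node, level = queue.popleft()
        level_counts[level] += 1
        for c in children[node]:
            queue.append((c, level + 1))

    depth = max(level_counts)             # deepest level reached
    breadth = max(level_counts.values())  # widest single level
    return depth, breadth

print(cascade_metrics(parents, "a"))  # (3, 3): 3 hops deep, 3 wide at level 1
```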
AUDREY: Yeah. And they looked a little bit at…they did sentiment analysis on the tweets, in terms of what emotions and reactions they were evoking. And they found that news and this other category of stuff, it’s not just fake news, provoke very different reactions and very different emotions in us. And the actual news is often not nearly as engaging. It’s not as, I don’t know, emotionally arousing. I don’t mean in a sex way, but like your blood pressure goes up.
CHRISTIE: Yeah, it’s stimulating. There’s a dopamine release or adrenaline or whatever.
AUDREY: Yeah.
CHRISTIE: I think we’ve all experienced that. We’re bored or there’s some downtime, so we look at Twitter and something that annoys us is going to be as much of a diversion as something that pleases us.
AUDREY: Or something that confirms our biases, that’s exciting in that way.
CHRISTIE: It’s the same reason people rubberneck along the highway, or why whenever there’s some kind of big accident, those are the news stories. If it bleeds, it leads. I think some of this is not necessarily new.
AUDREY: We’ve managed to amplify it in a completely novel way, in a way that’s never been seen or experienced before. And I don’t think that we’ve even started to see how bad this could get. I keep reading more and more analysis of this, more of these kinds of dissections. And right now, it feels like a bit of a minor cultural phenomenon, like it’s worrisome. But there are just so many ways that our ability to know what’s true becomes altered by this and by some of the other things that are going on in media.
CHRISTIE: What are some of the other things?
AUDREY: I read a thing this morning about recreations of celebrities.
CHRISTIE: How do you mean?
AUDREY: Like video and audio recreations of celebrities and how…
CHRISTIE: Like making them appear they’ve done or said something they haven’t like taking existing…
AUDREY: Yeah, creating fake recordings.
CHRISTIE: You know what’s a good way to provide a lot of audio for people to be able to do that is to host a podcast.
AUDREY: Yeah, we talked about this the other week, that we’ve personally provided enough for somebody to train a model on. But no, I was just reading an op-ed this morning about how…and this has been true for photography for a while: you can spot really obvious photoshopping, but there are some fairly sophisticated things you can do to create an image where it’s hard to detect the falsity of it. And there are so many ways that images we pass around on the internet are detached from their context anyhow, so they can be used to say a lot of things that aren’t true. I don’t know, I think we’ve walked into a situation where how we understand what’s real and what’s true has become just undeniably altered, and the stuff that we saw around the election is maybe just a little piece of it.
CHRISTIE: That’s disconcerting.
AUDREY: It’s really disconcerting. Yeah, that was what I woke up with this morning. I’ve just been trying to wrap my head around what actually happens now: if things keep going the way they’re going, then what? And I keep trying to think of what active and positive steps we can take, when it’s not just that Google and YouTube can do these things at massive volume. There are smaller examples. And if that’s true, then what’s our next step?
CHRISTIE: Do you know?
AUDREY: I don’t know. This is what I think about when I’m trying to do other things. This is one of the big questions in my head right now, and I don’t think anybody knows, is the thing. To the extent that I’ve read about this, people say, “Well, we should regulate it.” Yeah, I think we probably should. But the techniques for creating falsehoods are out there, the social effort of people constructing all these things is possible, and it doesn’t take a very big online community to create a lot of this stuff. Because of all those things, I don’t think that you can…shutting down YouTube would definitely have an impact, but I don’t think it erases the problem.
CHRISTIE: No, because it’d probably be another service or network people would go to.
AUDREY: Or just a smaller set of things that eventually led up to the same impact. I think we’re in a situation where what technology enables is, I don’t know, it’s harmful in this really big way that we don’t have any control over.
CHRISTIE: I think when I’m thinking about the stuff, I often come back to just the importance of personal connections and building community at a local level. I mean, not to rule out…not to say that it has to be physically local but just continuing to build up those personal relationships. You know what I mean?
AUDREY: I do, yeah.
CHRISTIE: When I’m confused about something online, I end up talking to somebody that I know and trust in real life as a way of sorting that out because I guess that’s going back down to like the core-est level of trust or whatever.
AUDREY: Yeah. And that is something that I’ve been thinking about too: if we can’t trust anything that we see, the information that’s provided to us, then how do we build our understanding of what’s actually happening? And I think you’re right that it is those personal connections and those community conversations. The reason I think we haven’t seen how weird this can get is that not every community is going to do the same thing there. The two of us might have a certain conception of objective reality and what happens in the world and want to continue to explore that, but that doesn’t mean that every community is going to take the same direction. And that means we’re going to have sets of people coming into conflict with each other.
CHRISTIE: Right. Are you sort of meaning like tribes?
AUDREY: Yeah.
CHRISTIE: [Crosstalk] tribalism which I hate.
AUDREY: Yeah, it’s just that it can create communities that do not have an ability to communicate with each other about important things.
CHRISTIE: Okay. I don’t know if our notes are totally in order. Do we want to talk about Reddit or the Spotify thing?
AUDREY: I’d like to talk about Reddit because it’s kind of a nice counterexample.
CHRISTIE: Which is an odd thing to say.
AUDREY: Yeah. I mean, it’s a demonstration of how all of these things can play out in a different kind of way, at least within a single service.
CHRISTIE: So there’s a long piece in The New Yorker. I guess it’s in the…I don’t know which issue it’s in. Anyway, it’s at the bottom. I guess it doesn’t matter. We’ll link to it. It has a different title in the print version for whatever reason.
AUDREY: To make sure people click on it, the title’s different.
CHRISTIE: Well anyway, the March 19th issue, which just came in the mail for me, and it’s very, very bright. It says ‘Reddit and the Struggle to Detoxify the Internet. How do we fix life online without limiting free speech?’ I like this because, for one thing, it focused entirely on Reddit, and one of the reasons for that, the author, Andrew Marantz, said, was that Reddit was the only social network that would really talk to him.
AUDREY: Yeah, there’s a nice bit at the beginning about how everybody stopped answering emails at a certain point in the conversation about this.
CHRISTIE: Which honestly, that actually gets a lot of points with me.
AUDREY: The reporter?
CHRISTIE: That Reddit would actually be somewhat transparent about their work. It’s just so unusual that companies do that.
AUDREY: Yeah, that they screwed up in a really public way and they decided to talk about it publicly, even though the first part of it doesn’t make them look very good.
CHRISTIE: Right. And so, if you don’t know a lot about Reddit, I think this is a really good quick overview of Reddit’s history, starting with Huffman and Ohanian coming out of college and making this thing. At that time, they were very young and really into, “Free speech! Free speech!” And that kind of set the tone for Reddit, and then being purchased by Condé Nast, and then all these sort of escalating kinds of content on there, including Pizzagate, which just baffles the imagination. I guess maybe not so much anymore.
AUDREY: I mean, how does a community of people create a belief that has nothing to do with reality?
CHRISTIE: Yeah, I guess that’s not really new for humans necessarily. But again, like you said, the speed and the amplification of it. That’s something that did jump out at me: in only 15 days of existence, the Pizzagate subreddit had attracted 20,000 subscribers. And I don’t know how long it took for that guy to go into the actual pizza joint with a rifle.
AUDREY: It was at least a few weeks later, I think the article says.
CHRISTIE: Okay. I mean, if you think about cultish-type movements, which is the only thing I really know how to compare this to right now, they take much longer. Usually it’s a years-long process of recruiting people, getting them to believe a certain thing, and then getting them to do certain things.
AUDREY: Yeah, this can happen really super quickly when it sort of taps into the right kinds of feelings and ideas and news stories.
CHRISTIE: Remember when we were first looking at that really creepy dad on YouTube who eventually got banned? And I remember I was googling stuff and got the most information about him from, I think, these Pizzagate people.
AUDREY: Yeah.
CHRISTIE: And that really…I don’t know if we talked about it at the time…but that really weirded me out, that something [inaudible] legit I thought I was doing ended me up with the Pizzagate people really quickly.
AUDREY: Yeah, I don’t know if we talked about it on the podcast, but I do remember we both, I think, independently discovered that. And it was super interesting that a bunch of people who were trying very hard to find child predators on the internet might succeed even though a lot of their groundwork and a lot of their assumptions were wrong.
CHRISTIE: Right. I think we’ve all had that experience where we end up realizing we’re in alignment with someone we consider way over there. And I still don’t really know what to do with that. Okay, so the New Yorker piece keeps going through the history of Reddit, and they end up in a place where Huffman has come back as CEO, and how they basically started removing these really bad subreddits, and what their approach was to actually starting to moderate some of the content on there. And I sort of like that they were really transparent about the fact that you have to start somewhere and it’s not perfect. Early on in the site, they were like, “Well, we would ban people. We just did it quietly and it was pretty much at our whim, like if we didn’t like something that was in a username.” And there was also that incident where I think it was Huffman who edited a bunch of people’s posts, which caused a big uproar.
AUDREY: Yeah. He used an automated thing to do that. So when they were trying to insult him, they randomly insulted each other instead.
CHRISTIE: Yeah. Which is a little funny but also like I do agree kind of…
AUDREY: Extremely inappropriate.
CHRISTIE: Yeah.
AUDREY: The capriciousness of their previous attempts…when I train people on Code of Conduct enforcement, this is actually one of the big things we emphasize. People need to see your process. People need to see, if somebody gets in trouble, why they got in trouble. And if you’re afraid of doing that, the effect is that nobody trusts you.
CHRISTIE: Right. And that needs to be done…like, you can’t be totally transparent about this stuff, but you can make things visible to the right people. And yet there has to be a certain kind of consistency about it.
AUDREY: Yeah. And in general, people should know what rules they’re breaking.
CHRISTIE: Right. It’s not that rules cannot change. But if they are to change, it goes through a process, right?
AUDREY: Yes. And when they decided to improve their content guidelines, I really like the process that they went through. There was a lot of preparatory work: they worked out what the new rule was going to be, and worked out the details to try to hit that sweet spot between specific enough and open enough that you could still catch things you didn’t think of at first, and they started looking for things that they thought were going to be in violation of the new rule. And then they sat down in the course of a day and just worked through it. This wasn’t like, “Oh, we’ll take five of them this week.” To me, it just seems like it didn’t give people a chance to try to work around it, because they took down all of the [inaudible] at once, and they took down any gross category. They took care of all of that at once so that things couldn’t just bounce around the site.
CHRISTIE: Yeah, they took a stand.
AUDREY: Yeah. And it just strikes me as one of the most effective ways they could have done this.
CHRISTIE: And there was a report that came out of a study called “You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech.” And they concluded that the ban had worked. Users participating in the banned subreddits either left the site or (for those who remained) dramatically reduced their hate speech usage.
AUDREY: Yeah, and that they focused on banning areas and spaces of interaction rather than people.
CHRISTIE: Not people.
AUDREY: Yeah, that seems like that was a very effective choice.
CHRISTIE: Yeah. And I wonder if one of the challenges Twitter is hitting is that they don’t have that same sense of place. I mean, the hashtag is sort of the closest thing, and it’s not really that close.
AUDREY: That is part of what makes Twitter a dysfunctional social environment, even for people who are trying to use it in a good way: the lack of sub-conversations and places. There’s no threading, no personal space you control, no idea that we all walk over here and have a chat. And that creates a lot of bad behavior that would otherwise be preventable.
CHRISTIE: One thing I learned in this New Yorker article that I hadn’t known about is there was a social experiment with r/Place. Did you know about that when it was going on?
AUDREY: No, I don’t think so.
CHRISTIE: I thought it was kind of a cool experiment. I didn’t watch the video yet. But I think people should check that out if they don’t know about it already. Basically, they set up a grid. How big was the grid?
AUDREY: I think they said like a thousand by a thousand pixels.
CHRISTIE: Yeah. And users could change a single pixel to one of 16 colors, like a really basic web palette. But they could only change…how did it work? Any pixel, or that single pixel, every five minutes? So there’s a sort of throttling involved. And it’s just sort of interesting to see how people organized around what they wanted to create together. It’s kind of a cool story.
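(For the curious, here’s a minimal sketch of the mechanics described above: a 1000×1000 grid, a 16-color palette, and a per-user cooldown. This is an illustration of the idea, not Reddit’s actual implementation.)

```python
# A minimal r/Place-style sketch: illustrative only, not Reddit's code.
import time
from typing import Optional

SIZE = 1000
PALETTE_SIZE = 16        # colors stored as palette indices 0..15
COOLDOWN_SEC = 5 * 60    # one placement per user per five minutes

grid = [[0] * SIZE for _ in range(SIZE)]  # every pixel starts at color 0
last_placed = {}                          # user -> timestamp of last placement

def place_pixel(user: str, x: int, y: int, color: int,
                now: Optional[float] = None) -> bool:
    """Set one pixel if the user's cooldown has elapsed; return success."""
    if not (0 <= x < SIZE and 0 <= y < SIZE and 0 <= color < PALETTE_SIZE):
        raise ValueError("out-of-range coordinate or color")
    now = time.time() if now is None else now
    if now - last_placed.get(user, float("-inf")) < COOLDOWN_SEC:
        return False  # still throttled
    grid[y][x] = color
    last_placed[user] = now
    return True

# The cooldown is why collaboration was necessary: one user filling even a
# 10x10 block alone needs 100 placements at 5 minutes each, over 8 hours.
assert place_pixel("alice", 0, 0, 3, now=0.0)
assert not place_pixel("alice", 1, 0, 3, now=10.0)   # throttled
assert place_pixel("alice", 1, 0, 3, now=300.0)      # cooldown elapsed
```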
AUDREY: I like that there was a group that was just trying to turn the entire thing black.
CHRISTIE: From the description of it, it seemed like actually a pretty good microcosm for like how society is.
AUDREY: And how people interact on Reddit. Yeah, that people kind of came up with ideas together, implemented them together, had other people try to do something else, and had to negotiate that in a fairly friendly, harmless way. There was obviously some inappropriate stuff that showed up on there but people erased it.
CHRISTIE: To me, it’s just another reminder of the extent to which humans will go to express themselves, and collectively express themselves. Changing a pixel to one of 16 colors is pretty much one of the simplest structures for expression you could set up, and look at the stuff that came out of it, all the meta-conversations and the organizing…I don’t know.
AUDREY: That people could be really creative with it. And you know, reading this reminded me that I have encountered some amazingly positive subreddits. I think I have a login but I don’t really spend any time on there. But every so often, somebody points one out that’s just like a really caring environment or a really friendly fun kind of environment. And they definitely do have a framework that allows that.
CHRISTIE: Now, you’ve got me thinking about this – the difference between Twitter and Reddit, this idea of channels or subreddits or like places. I’m really intrigued by that notion.
AUDREY: Well, even just to come back to YouTube. There are YouTube channels, and YouTube channels do sometimes get banned, like the really disgusting family videos that we talked about. I just don’t think they get banned often enough, and it’s a little bit more like what Reddit started out doing.
CHRISTIE: Piecemeal.
AUDREY: Yeah, instead of having an overarching content guideline where anything that runs afoul of it gets suspended.
CHRISTIE: Right.
AUDREY: And because there’s so much content on YouTube, they’ve tried to find these big automated approaches to it. And the downside of the opposite approach is that humans have to look at some really awful stuff to spot it.
CHRISTIE: Right. That came up in The New Yorker article about Reddit and I’ve seen it come up with basically anyone involved in doing content moderation.
AUDREY: It exposes you to fairly traumatic things.
CHRISTIE: There is this tweet, and I think this might have been the same person who tweeted about the Expensify Mechanical Turk stuff, Rochelle. It said, “Analyzing videos for YouTube this morning. The rules are always changing. Here’s what I’m looking for today.” And then it’s coarse/crude language, then sexual profanity. And then what’s okay: adult dialogue, and this really detailed stuff. Anyway, we’ll link to this in a more readable form. But: sexual content, violence or gore, drug use.
AUDREY: Well, somebody asked about one of them. One of the items said alcohol was fine, people getting drunk was fine, soft drugs were fine. And somebody responded, “What is a soft drug?” Even there, there’s a really poor level of specificity.
CHRISTIE: Oh, I see it. And Rochelle replied, “Your guess is as good as mine.” Like they don’t define that.
AUDREY: And there aren’t enough examples there for somebody to go, “Oh, okay. I get the idea.”
CHRISTIE: Incidental or comic use of soft drugs. It was just a joke.
AUDREY: I mean, are they just thinking like Cheech and Chong is fine but no jokes about heroin?
CHRISTIE: I mean, yeah.
AUDREY: Yeah, there’s so many different ways you could interpret this which makes it a bad rule.
CHRISTIE: So, slightly different topic. Let’s go from one complicated topic to another.
AUDREY: Sure.
CHRISTIE: And that’s royalties. Royalties for people that make music. So, a $1.6 billion Spotify lawsuit is based on a law made for player pianos.
AUDREY: Yup.
CHRISTIE: The hidden costs of streaming music.
AUDREY: And Spotify’s trying to go public right now. Isn’t that why this lawsuit’s kind of coming to the forefront?
CHRISTIE: Yup, because they had to do their SEC filing and then they got to talk about their finances and I think that’s how this got uncovered.
AUDREY: Their financial liabilities.
CHRISTIE: Right, because if this judgment goes against them, that’s $1.6 billion. A little bit of a liability there.
AUDREY: Yeah. This company is claiming that Spotify is in violation of music royalty rights with respect to a really specific part of, like you said, the player piano rules: these mechanical licenses, which are about the reproduction of the composition.
CHRISTIE: So originally it was the roll of paper with the holes in it that made the player piano go. That was the first thing, and then it extends to phonograph records, and then vinyl records, and then tapes and CDs, and all the way to interactive streaming.
AUDREY: And depending on the kind of online streaming and the source of it, the nature of it, different kinds of music licensing comes into play. The article gets into a lot of the details of it that FM radio broadcasts over the internet are different from Pandora and different from iTunes and there’s just a lot of different components to it.
CHRISTIE: And there’s different organizations that mediate this exchange of royalties.
AUDREY: But because the industry is so consolidated and the organizations that are involved all tie back to each other, it’s an internal dysfunction. And the lawsuit is really just trying to finally get in on that. I mean, they’re effectively saying that Spotify didn’t file the right paperwork through the right organization, which may or may not be the case. Not even that they didn’t pay the royalties, but that they didn’t file the notification paperwork that goes along with it.
CHRISTIE: Right. And because of that, no one’s really sure if they paid the right royalties, right?
AUDREY: Yeah. I used to work in a cubicle next to somebody who did rights clearance for musical recordings. When you make a compilation, you have to get the rights to reproduce the thing, and there are a lot of phone calls involved. That was mostly what I took away from it: it involves spending all day on the phone, knowing enough legal information to ask the right questions and get people to sign something.
CHRISTIE: And you might have to do that for the composer as well as the recording artist, right?
AUDREY: Yeah, there could be multiple levels of it.
CHRISTIE: This article is really detailed and has kind of amazing graphs. I’m going to have to read it five more times to understand it.
AUDREY: But it gets the point across that even taking for granted that we have this really screwy copyright and licensing system, the way it’s being implemented is about the least effective way possible, because there’s nothing about this that couldn’t be automated and sorted out and just work. Just work across this series of databases…
CHRISTIE: Do you think they could use blockchain?
AUDREY: I’m sure that there are multiple companies that are proposing to do exactly that.
CHRISTIE: That’s what Kodak’s going to do with the photos, right?
AUDREY: Yeah, right. But I mean, this is something that computers are good at: take a complicated set of rules and a complicated series of pieces of data, and make sure everything goes in the right place. They could do that. And then all anyone would have to say is, “Hey, we’re playing this music,” and some set of systems would take their money and send it to the right people.
CHRISTIE: This reminded me of, I think it was Grooveshark, or some of these early services where…I think they started out where you could upload your music library and then stream it from anywhere. And then they started allowing you to listen to anybody’s music that had been uploaded. And I think Reply All did a…either Reply All or StartUp, one of them did a show about this. I’ll go find it. Anyway, they ended up running into licensing issues and just shutting down. And after reading this article from Sarah Jeong about the licensing around it, I can understand why.
AUDREY: It’s a very difficult thing for a startup to actually get involved in.
CHRISTIE: Right, because you need massive resources just to figure it out and do the work you’re supposed to do.
AUDREY: There’s a good side tangent about Ready Player One in here, but we don’t have to go there.
CHRISTIE: Yeah, I don’t think that piece of fiction requires any…
AUDREY: It’s just that in order to make a movie that is so derivative of 80’s science fiction and culture and all of these kinds of things, it means that you have to clear the rights for every single thing that you’re using.
CHRISTIE: Right.
AUDREY: And that is an enormous undertaking.
CHRISTIE: Yeah. I saw someone comment that only Spielberg could have directed that because only a director of his prominence would have…
AUDREY: I think that was my tweet.
CHRISTIE: Oh, that was yours? Okay.
AUDREY: I saw a trailer for it and started thinking about this, like, “Why is Spielberg doing this?” But I think only Spielberg could.
CHRISTIE: Right. So, our last quick topic here: Spooky action at a distance, how an AWS outage ate our load balancer. I like these sorts of post-mortem or retrospective articles where there’s some kind of outage or tech implosion and they lead you through the path of how they figured it out. This one is from Hosted Graphite, and basically they don’t host their services on AWS, with the exception of Route 53 health checks that check their own load balancers. Some Friday night recently, they started noticing that traffic into their service, like the metrics being ingested from others, went way, way down. And they were experiencing latency too, and I think unavailability of their API. So they were trying to figure out what was going on. They knew about the AWS outage, and after a couple of hours it got resolved and everything was back to normal. So they were like, “Okay, it’s definitely the AWS thing, but why?” There’s also a little side tangent in here: a lot of the tools they use to run their infrastructure were having issues too, like the Slack bot they use was impacted because Slack’s IRC gateway, which is going away, was impacted by the AWS outage. But what finally led them to figure out what was going on was a particular visualization they put together, a sort of heatmap where each row was a different ISP and the latency to connect, I think, to their load balancers. And what they determined is that the servers on AWS that were having issues and trying to connect to their load balancers were taking way too long to complete their requests, and that was hogging all the resources on their load balancers. So what they’re going to do to rectify it is shorten the timeout and increase the available connections and so on. It’s interesting to me. It’s another example of how complex internet infrastructure has gotten and how disturbances in one area can cascade. There’s a huge amount of interdependency, even if you don’t think there is.
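(The failure mode here comes down to queueing arithmetic: by Little’s Law, average concurrent connections equal arrival rate times average connection duration, so a small share of stalled clients held open until a long timeout can exhaust a load balancer’s slots. Here’s a back-of-the-envelope sketch; all the numbers are hypothetical, not Hosted Graphite’s.)

```python
# Back-of-the-envelope sketch of the failure mode, with hypothetical numbers.

def concurrent_connections(requests_per_sec: float, avg_duration_sec: float) -> float:
    """Little's Law: L = lambda * W."""
    return requests_per_sec * avg_duration_sec

MAX_SLOTS = 1000       # hypothetical load balancer connection limit
RATE = 500             # requests per second
FAST = 0.2             # healthy requests finish in 200ms
STALLED_SHARE = 0.05   # 5% of clients (the AWS-hosted ones) stall

normal = concurrent_connections(RATE, FAST)
degraded = (concurrent_connections(RATE * (1 - STALLED_SHARE), FAST)
            + concurrent_connections(RATE * STALLED_SHARE, 120))  # 120s timeout
mitigated = (concurrent_connections(RATE * (1 - STALLED_SHARE), FAST)
             + concurrent_connections(RATE * STALLED_SHARE, 10))  # 10s timeout

print(f"normal:    {normal:7.0f} / {MAX_SLOTS} slots")     # 100: plenty of room
print(f"degraded:  {degraded:7.0f} / {MAX_SLOTS} slots")   # 3095: healthy traffic starved
print(f"mitigated: {mitigated:7.0f} / {MAX_SLOTS} slots")  # 345: shorter timeout frees slots
```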
AUDREY: Yeah and they had an interesting opportunity to look at that, to see exactly what things went down when the AWS went down and how it limited what they could do. I thought the chat bot/chat ops thing was really interesting because it would be non-obvious to me that that would be affected.
CHRISTIE: Yeah, and it’s hard to identify those interdependencies until there’s a truly broken link.
AUDREY: When you sign up for, I don’t know, Heroku or something, nobody gives you this list of dependencies, like it’s the opposite of that SEC filing. You don’t get this list of like, “Well you know, we depend on these major pieces of infrastructure.” Aside from just like, I don’t know, even asking around or waiting for an outage, you don’t necessarily know.
CHRISTIE: Yeah.
AUDREY: But I liked this breakdown a lot. I liked the questions that they were asking.
CHRISTIE: Yeah I thought this was a good story, so I appreciate it. Okay, what’s your favorite thing on the internet this week?
AUDREY: It’s more machine learning hijinks, but this one is about sheep.
CHRISTIE: All right, let me look at it.
AUDREY: There’s kind of two Twitter threads that I put in there.
CHRISTIE: Okay. You’re going to have to explain it.
AUDREY: I did dig up the appropriate first tweet for this. A couple of people on Twitter were playing around with image recognition services, the sort of thing Apple’s Photos app does now, where you ask for a picture of a sheep and, if you happen to have one, it tries to give you something that matches.
CHRISTIE: Is this related? I saw someone looking recently for like a picture of a sheep in a weird place.
AUDREY: Yeah.
CHRISTIE: Okay.
AUDREY: Yeah. So there were a couple of people on Twitter who were trying to find edge cases. Okay, so here’s a picture of a sheep, and every image service agrees that this is a sheep. Well okay, so let’s find things that are sheep that it doesn’t think are sheep. And so: sheep in wheelbarrows, sheep in trees, lots of sheep, cotton fields, all those “sheep”. And an orange sheep came in there. I learned about orange sheep.
CHRISTIE: There are really orange sheep?
AUDREY: There’s a farmer who paints the sheep orange to keep them from being stolen.
CHRISTIE: That’s clever.
AUDREY: Yeah, it’s really clever, and they’re kind of fun to look at. There’s just this interesting series of things where they’re trying to find out what it says the sheep actually is. But then there’s a twist. One of the services they were looking at, if it has low confidence about the image, actually has people do it. So they started trying to find the boundary between the automated results and the people results.
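(That boundary is a standard human-in-the-loop pattern: predictions below a confidence threshold get routed to a human review queue. A minimal sketch, with a made-up stand-in for the classifier; none of this is the actual service’s API.)

```python
# Human-in-the-loop routing sketch: low-confidence predictions go to humans.
from collections import deque

human_review_queue = deque()

def fake_model_predict(image_name: str) -> tuple:
    """Made-up classifier: ordinary sheep score high, oddballs score low."""
    if "wheelbarrow" in image_name or "orange" in image_name:
        return "sheep", 0.42
    return "sheep", 0.97

def label_image(image_name: str, threshold: float = 0.8) -> str:
    label, confidence = fake_model_predict(image_name)
    if confidence >= threshold:
        return f"{label} (automated, confidence {confidence:.2f})"
    human_review_queue.append(image_name)  # below threshold: a person decides
    return "queued for human review"

print(label_image("sheep_in_field.jpg"))        # answered by the model
print(label_image("sheep_in_wheelbarrow.jpg"))  # routed to a human
```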
CHRISTIE: Cool. All right. We will link to some good Twitter threads to get you started on understanding about deep learning and sheep. Cool.
AUDREY: Yeah.
CHRISTIE: My thing is mostly functional, and that’s Let’s Encrypt, which provides free SSL/TLS certificates so you can secure your website. They also provide Certbot, so the process of setting those up is super easy. Now it has wildcard certs, so you can get a cert for all subdomains on a domain. That’s what wildcard is, right Audrey?
AUDREY: I think so, yeah.
CHRISTIE: And I think that’s pretty cool. I wanted to highlight the work that Let’s Encrypt is doing and encourage folks, if they’re still just serving plain HTTP, to change it. So, I’ll link to that announcement.
AUDREY: Yeah, that’s really great.
CHRISTIE: It is really great because even before, like all my websites…anything I set up now is HTTPS by default. And I don’t have to think about budget when I’m doing that, because even a $10 monthly or annual cert adds up across domains.
AUDREY: Right, yeah.
CHRISTIE: All right. I think that’s a wrap. You got anything else, Audrey?
AUDREY: That’s it for me.
CHRISTIE: All right. Thanks everyone for listening. Thanks Audrey for recording another episode with me and we’ll talk to everyone again soon.
AUDREY: Thanks.
And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to podcast@recompilermag.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.
The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.