Download: Episode 55
This week Audrey and I chat about Cambridge Analytica, anti-sex trafficking bill SESTA/FOSTA, Google News Initiative and more. Enjoy!
Show Notes
- [02:28] The Responsible Communication Style Guide is headed back to the printers!
- [04:18] ‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower
- [13:56] Robert Mercer: the big data billionaire waging war on mainstream media | Politics | The Guardian
- [15:13] The Reclusive Hedge-Fund Tycoon Behind the Trump Presidency | The New Yorker
- [15:37] Paul Ford: Facebook Is Why We Need a Digital Protection Agency – Bloomberg
- [24:23] How Controversial Anti-Sex Trafficking Bill Will Screw Over Sex Workers – Rolling Stone
- [25:38] Communications Decency Act – Wikipedia
- [26:24] Section 230 of the Communications Decency Act – Wikipedia
- [36:46] Post-SESTA/FOSTA Self-Censoring for Twitter, Reddit, and other Social Media
- [38:11] Melissa Gira Grant (@melissagira)
- [38:48] Floor Remarks: CDA 230 and SESTA – Ron Wyden – Medium
- [41:19] Slack picked a weird time to make it easier for bosses to download you
- [48:35] yan on Twitter: “fun way to monitor someone’s IP address:…”
- [50:24] The Google News Initiative: Building a stronger future for news
- [59:21] 12 Things Everyone Should Understand About Tech – Humane Tech – Medium
- [59:34] Why is your email in my car? | daniel.haxx.se
- [1:01:10] Dialects of English: Take The Dialects of American English Survey
Community Announcements
The Responsible Communication Style Guide is headed back to the printers!
When we sold out of print copies of The Responsible Communication Style Guide last fall, we promised to do another print run in early 2018. We’re happy to announce that we’re ready.
If you’ve been waiting to pick up a printed book (or enough for the rest of the office so they stop filching your copy), this is your chance. Order now!
Now Broadcasting LIVE most Fridays
We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune-in.
We love hearing from you! Feedback, comments, questions…
We’d love to hear from you, so get in touch!
You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to podcast@recompilermag.com.
Transcript
CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.
Episode 55. This week Audrey and I chat about Cambridge Analytica, anti-sex trafficking bill SESTA/FOSTA, Google News initiative and more. Enjoy!
I guess we better get to it as long as our voices hold out, which may not be the whole hour.
AUDREY: Right. Should I tweet something?
CHRISTIE: I tweeted and then there was a scheduled tweet.
AUDREY: Yes.
CHRISTIE: You can tweet again if you want. We might have a whole different audience, it being Saturday.
AUDREY: I was kind of hoping so.
CHRISTIE: Today it’s the March For Our Lives.
AUDREY: Yeah. So we’re retweeting news.
CHRISTIE: But I feel like once Spring comes around, there’s conflicts every weekend.
AUDREY: Obviously, this is a big deal. I obviously would not be able to march in my current condition, but I know a lot of people down there.
CHRISTIE: Yes. And I’ve been seeing stuff all over Twitter about it. So good luck doesn’t sound right, but like we’re thinking about all the people marching, I guess. Trying to be with them in spirit. Stay bundled up. The weather kind of got miserable here in Oregon.
AUDREY: I keep hearing we had snow, but it melts very quickly.
CHRISTIE: Yeah. I did not see snow where we are at all. But also it’s possible, I just wasn’t paying attention.
AUDREY: Yeah.
CHRISTIE: But it’s definitely been cold. I bought some flowers to put into the ground and I haven’t done so yet because I got a cold and then it was miserable weather. It’s like, ahhhh. So, enough about the weather. Any announcements, Audrey?
AUDREY: We are still taking orders for the reprint for The Responsible Communication Style Guide.
CHRISTIE: Awesome.
AUDREY: And I just talked to somebody who was learning how to make use of it. It’s a really, really valuable tool to just help us be more considerate and respectful of each other. So, if you haven’t ordered your copy, I encourage you to do it now. The faster we get the pre-orders, the sooner we can send it to the printer, which means the sooner we can ship them to you.
CHRISTIE: So, if you had to say there was a deadline…
AUDREY: I believe the website says April 15th, I don’t have it open.
CHRISTIE: But sooner the better.
AUDREY: Yes.
CHRISTIE: And what are some of the topics that The Responsible Communication Style Guide covers?
AUDREY: We have five main sections. They are race, gender, sexuality, religion, and health and wellness. Just the super relevant themes right now. I’m trying not to cough. Hold on, I’m going to mute myself for a second.
CHRISTIE: I’ll talk for Audrey since she’s muted and I don’t think she said this, but this printing will also include some minor corrections and sort of spiff up some rough edges that were perhaps in the first edition. So definitely worth your while to go check that out. And I forget the URL for the style guide itself.
AUDREY: RCStyleGuide.com.
CHRISTIE: RCStyleGuide.com. Perfect timing, Audrey. And you can also get a link to it from shop.recompilermag.com too. Any other announcements?
AUDREY: That’s the big one.
CHRISTIE: All right. Our first topic, Cambridge Analytica. I’ve been babbling about them for a little while now off and on. We may have talked about it in passing on previous episodes, but the first one I remember really talking about was when that source code showed up on GitHub. And I felt like that was in the fall sometime.
AUDREY: Yeah, that sounds about right to me. I think that you had brought them up a couple of times last year, and the lawsuit that somebody filed using the UK privacy laws. I think that was the other big thing that we had talked about.
CHRISTIE: Right. The podcast Note to Self has been periodically doing stories about this and we’ll include links to some of those in the show notes, because I think they give a really good overview of things in [inaudible] podcast format, which, if you’re listening to this, you might find enjoyable. So about a year ago, The Guardian had their first really big piece about this, and they had stuff earlier too, but all of a sudden this exploded in the last week. And why is that, Audrey? What changed?
AUDREY: It’s kind of a planned release. The same reporters, and I keep trying to figure out the relationship of The Observer to The Guardian since I saw this on the same website. But there’s a reporter who has been following up on this since, I think, about this time last year at least. And she and her team have a whistleblower who was able to bring a lot of documentation of what they’d been talking about, and a lot of details. And interestingly, somebody pointed out to me that Christopher Wylie, this is the whistleblower, had been called out in a previous article. One of his former co-workers basically said outright, “You need to talk to him because he’s got all of the smoking gun stuff here.”
CHRISTIE: Oh.
AUDREY: Yeah, a friend had pointed that out to me. So I think that it’s all in front of us right now. I think it’s the planned release from the journalists to show the information that they have, and just show the strength of the information that they have now.
CHRISTIE: And to get people up to speed who are just coming on board at this, Cambridge Analytica is a company that was basically formed as a subsidiary of a UK organization, SCL, something like that, Group or Limited, I forget. And their prior work had been basically information ops for the military and covert stuff.
AUDREY: For SCL, you mean?
CHRISTIE: For SCL, yeah. And this Christopher Wylie guy had been studying. He’s a Canadian. He’d been studying data analytics and started following the research of people doing, what’s it called? Psychometrics?
AUDREY: I think so, yeah. It’s like a kind of behavioral forecasting.
CHRISTIE: Right. So those quizzes that you encounter online, what Game of Thrones character are you, stuff like that, personality tests, and also just correlating things that you like and other behavior with how you might vote or the things you might purchase.
AUDREY: There was a fairly major quiz, created by some researchers they started to work with, that went beyond what Game of Thrones character are you. It was a fairly detailed quiz that people were taking for money. So they were collecting not just the incidental data, but a lot of very detailed, deliberate stuff.
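For anyone curious what this quiz-plus-likes modeling looks like in practice, here’s a minimal sketch. Everything in it, the pages, the trait label, the data, is invented for illustration; this is the general psychometrics technique being described, not Cambridge Analytica’s actual pipeline.

```python
# Invented example of the psychometrics technique discussed above:
# fit a model on people who took a personality quiz, then apply it
# to anyone whose page likes you hold. Not Cambridge Analytica's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows are users, columns are pages; 1 means the user liked that page.
likes = np.array([
    [1, 0, 1, 1, 0],
    [0, 1, 0, 0, 1],
    [1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1],
])
# Quiz-derived labels, e.g. 1 = scored high on extraversion.
quiz_scores = np.array([1, 0, 1, 0])

model = LogisticRegression().fit(likes, quiz_scores)

# The scale trick: only quiz takers are labeled, but the trained model
# can score anyone whose likes you have, quiz or no quiz.
new_user = np.array([[1, 0, 1, 0, 0]])
print(model.predict_proba(new_user))
```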
CHRISTIE: Right. And just to close that one thread, where I was going with that is that the Guardian article about Christopher Wylie, this was a new revelation for me, or a new piece that had been connected. So Cambridge Analytica came about because the Mercers, who are a very, very rich family here in the States, were interested in influencing American politics. And they got connected to Christopher Wylie, who was, I think, at SCL Group at the time, through someone with government ties, like military intelligence ties. Anyway, I just thought that was kind of a little bit of a light bulb for me.
AUDREY: Oh, I see. Yeah. It was Wylie who had access to the research, and the Mercers getting introduced to him. And then I think the next step is that they took it to Bannon and wanted to impress Bannon with it, which, I got the impression, is how Cambridge Analytica got kicked off as a separate organization. [Inaudible] Bannon on the idea.
CHRISTIE: Right. It was kind of this intersection of Steve Bannon’s ideology and wanting to basically completely disrupt government and change culture, the Mercer family basically wanting to never pay taxes again and having all the money they made to put toward this effort, combined with the work that people around the SCL group were already doing, including work that started in academia.
AUDREY: Right. And that’s another point of controversy here.
CHRISTIE: So basically what Cambridge Analytica did was create this huge profile database of millions of Americans. The seed for that was Facebook profile data, and they initially obtained that through a researcher at Cambridge University who was working on the psychometric stuff and developed that personality quiz. They ended up getting data not just from the people who consented to taking the quiz, but because of the way Facebook worked at the time, they were able to get profile information on all those people’s friends. So it went from 50 million people to, what is it, 230 million people?
AUDREY: Something like that, yeah. And it’s important to say this is a very deliberate overreach. I mean, I was once a Facebook app developer and I didn’t go around routinely scraping everyone’s info at that depth. I was always very cautious about what we accessed and how we handled the permissions, because I assumed that we could get audited, which I realize now was weirdly naive.
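For readers who never built against the old Facebook platform, here’s roughly what that friends-data access looked like. This is a hedged sketch of the pre-2015 Graph API v1.0 behavior; the exact endpoint version, field names, and permissions here are illustrative, and Facebook removed friends’ data access with API v2.0.

```python
# Sketch of the kind of request the old Graph API (v1.0) allowed:
# one consenting quiz taker's access token could pull profile fields
# for all of their friends. Endpoint and fields are illustrative of
# the deprecated behavior; this no longer works against Facebook.
import requests

ACCESS_TOKEN = "token-granted-by-ONE-consenting-quiz-taker"  # hypothetical

resp = requests.get(
    "https://graph.facebook.com/v1.0/me/friends",
    params={
        "fields": "id,name,location,likes",  # the friends' data, not the user's
        "access_token": ACCESS_TOKEN,
    },
)
for friend in resp.json().get("data", []):
    # None of these people took the quiz or consented to this.
    print(friend["id"], friend.get("name"))
```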
CHRISTIE: And maybe you could. Basically, Facebook’s whole mode of operation is to allow liberal access and have policies in place that tell people how they should use the data.
AUDREY: And then punish them if they don’t do that.
CHRISTIE: Right. And it turns out that the punishment came very late and seemed to be pretty much only cursory like, “Hey, you collected this data inappropriately. Delete it. Sign this thing and send it back to us saying you deleted it.”
AUDREY: And that’s not really what I mean about an audit. Audits are when a third party goes through everything that you have and enforces restrictions.
CHRISTIE: And evaluates.
AUDREY: Compliance.
CHRISTIE: Yeah. And evaluates how many copies of that data were made, whether it was encrypted, and things like that. And I’ve still seen a lot of argument about whether this is a big deal or not, which I find interesting. And I think something to keep in mind is that for Cambridge Analytica and its parent, it’s not just the 2016 US election that they’ve been involved in. It was the Brexit vote and elections and other political activity in different countries. So this is what they do.
AUDREY: They were looking to sell it and have been looking to sell it anywhere that they can.
CHRISTIE: Right. It’s their product. So that’s a concern. And now UK officials are involved; I think there was a search warrant executed on the Cambridge Analytica offices.
AUDREY: On Friday.
CHRISTIE: Yeah, and I don’t quite know what’s going on here in the States with it. Facebook is getting a lot of crap, so people are exploring once again like how much data does Facebook have of us? And I think there’s talks of like how do we leave Facebook? What settings can we change to make it safer?
AUDREY: Right.
CHRISTIE: The Mercers, that [inaudible] just how much families with money can influence things. To me, that’s another issue. It’s kind of out of scope, I guess, for our podcast.
AUDREY: Well, I thought it was super interesting. This other link that you gave me about, like the background on the Mercers. I’m just looking at the photo of them and what they’re doing. If you’d told me that they were big oil, I would have bought that immediately, but it’s much more interesting that their money actually comes from hedge funds and that quantitative, what’s the other word? The quant stuff in hedge fund management.
CHRISTIE: Mercer basically, he’s a geek. He’s a totally introverted quant and he figured out how to computerize and automate hedge fund transactions and made oodles of dollars and now doesn’t want to pay taxes on them.
AUDREY: Yeah. I mean, that he’s starting from that and then funding this massive thing that takes advantage of data collection. It probably just seemed like an obvious, great direction for him to go in. It fits very well with his previous work.
CHRISTIE: And I don’t mean to harp on the taxes, but actually I do. He’s actually in arrears like millions of dollars. And so, he has an active incentive to make it impossible for the IRS to collect that money. Anyway, I’ll find that link for the show notes, Audrey. And you probably haven’t read it, but The New Yorker did a big piece on the Mercer family. It’s interesting and it’s also pretty infuriating, but I’ll dig that up.
AUDREY: Yeah, I haven’t read anything about them beyond what you’ve shared and what’s in the other articles.
CHRISTIE: We’ve got this piece from Paul Ford about why we need a digital protection agency, which reminded me a lot of other stuff we’ve brought up. Was it Zeynep Tufekci? I think it was one of her op-eds about basically needing regulation, like not putting this at the consumer level, having regulation. I think there’s also momentum gaining around that.
AUDREY: Yeah. And I think it’s fair to think that regulation is the only level at which we can solve this. And Paul Ford has a really good point comparing it to environmental regulation: there are plenty of things that chemical companies do that we’re perfectly fine with, but when they leak it all over the place and contaminate the groundwater, that’s when we have a problem. So regulating data in that sense, like there are perfectly good uses for the data, but there are also ways of dealing with it that you should be penalized for. It makes a lot of sense to me.
CHRISTIE: Like not securing it?
AUDREY: Yeah.
CHRISTIE: Yeah. I think that metaphor of a toxic spill and a data leak, like the thing about Equifax leaking all of that consumer information and the chain of consequences that can come from that, and the damage, it is very similar to ecology and environmental damage, I guess.
AUDREY: Yeah. And the leaking, this is such an interesting thing because one of the first things Facebook said about this is that this isn’t a leak.
CHRISTIE: Oh God! It’s not a breach.
AUDREY: Yeah. It’s not a breach because everything was authorized. Every step of the way, this was authorized. And then when it wasn’t, we told them to stop it. And that was just fascinating. And I think that that’s something that often gets overlooked when we talk about this consumer data. I kind of hate that word. But this personal data that gets collated, a lot of what’s happening is perfectly legal. It’s disgusting, but it’s perfectly legal. So it’s not like, “Oh well, they should stop it.” We’ve already decided that we actually do have to pass laws to do something about this if we want to see it to change.
CHRISTIE: Right. Because there’s zero consequences right now. There might be some bad press, but it never seems to manifest as anything more consequential.
AUDREY: And it’s very hard for individuals to even sue over this stuff, like finding the point of responsibility is difficult.
CHRISTIE: Most people sign away their rights to sue with the arbitration clause and the EULA.
AUDREY: If you were an actual direct user of the thing that collected the data, yeah, because Facebook profiles people that aren’t actually even Facebook users. I think that makes it a little bit more complicated.
CHRISTIE: It does, yeah. And I mean, I appreciate Paul Ford’s analogy. It also was really depressing when you think about how the current administration is just dismantling any effective structures that we have. All of these federal oversight structures, they’re broken.
AUDREY: I mean, this kind of brings us back to that UK oversight. Some of the angles where people are able to do anything involve the [inaudible] aspects of it, or even the ways that…probably some of the stuff Cambridge Analytica did actually break elections laws, because there were non-US employees working on campaign advertising.
CHRISTIE: Is that one of the provisions? I know you can’t take foreign money for elections, but I don’t know what the other rules are.
AUDREY: Only in advertising. Nobody’s actually tracking it.
CHRISTIE: Right.
AUDREY: Yeah, there’s something else to do with like where the workers were from and based that I think also violates elections laws. So there are a couple of pieces like that. And I also kind of wonder a little bit about more local legislation for example, because Facebook has headquarters in California. What if California was to try to take this on?
CHRISTIE: So, do we have any practical things that people can do?
AUDREY: Delete your account or don’t delete your account. I mean, do what’s comfortable.
CHRISTIE: You can check your settings and you can turn off platform access. I think that hobbles a lot of things you might be using.
AUDREY: Somebody told me that Facebook login doesn’t work if you do that. Like the login with Facebook on another service.
CHRISTIE: So if you’re thinking about that, you could start by making a list of all the websites you log into with Facebook and seeing how to change that.
AUDREY: I think that I do have a couple of links that I can put in here about all the steps if you want to disentangle yourself as much as possible from Facebook. I think I’ve seen a couple of things that outline the steps that you’ll have to take. It’s doable. It’s not like an infinite thing. That won’t get your data back out of all of the places it’s been collected, but at least over time, your profile will become less accurate. They’ll stop being able to add to it.
CHRISTIE: Anything more about the Cambridge Analytica stuff?
AUDREY: Just one last thing which is that our whistleblowers are kind of shady.
CHRISTIE: You mean in this instance?
AUDREY: Yeah, like the more that I read about Christopher Wylie, the more I’m like, “I don’t really know why you’re coming forward right now,” because I think maybe you fear you’re going to get in trouble. And there’s another woman from Cambridge Analytica who, I forget, a couple of days ago there was a big thing from her, and the response was all like, “Yeah, she never actually objected to what we were doing.” It’s just that she got pushed out of the company over a disagreement. And so of course, she’s coming clean now. She’s just trying to burn them. I don’t know, there’s something about…I’m trying to figure out how to describe this, like the way that everyone can be awful and awful to each other, and sometimes that just blows out.
CHRISTIE: I think it’s safe to assume that people always have ulterior motives. It just may differ on the proportion of that versus altruism or whatever.
AUDREY: And I think it’s fine that the whistleblowers have mixed intentions here. It’s still really good for us to be finding these things out.
CHRISTIE: I do think it’s a valid thing to explore though because it can inform what they’re saying and how you take that information and just the overall picture, too.
AUDREY: Yeah. And The Guardian reporters are obviously doing a lot of fact checking, a lot of verification. They’re following up with a lot of people off the record who are able to corroborate a lot of this. So they’re not taking it at one person’s word, and that’s really important.
CHRISTIE: And there’s something in the story that made me believe that they had been working with Christopher for a long time, and that there might have been a certain, I don’t know if cultivation is the right word, but a process where he had to get to the point where he could actually be more public and give them more direct information.
AUDREY: Yeah, I would believe that, [inaudible] to those moody photographs.
CHRISTIE: Right.
AUDREY: I saw something refer to it as like a fashion shoot, like it’s whistleblower fashion.
CHRISTIE: So, SESTA/FOSTA. SESTA is the Senate version and FOSTA is the House version of this legislation, I think.
AUDREY: Yeah.
CHRISTIE: And I’ve already forgotten what the acronyms are.
AUDREY: It’s sex trafficking something, yeah.
CHRISTIE: It’s anti-sex trafficking legislation that…I guess it’s passed both houses of Congress, and is going to be signed imminently into law?
AUDREY: Yeah.
CHRISTIE: The one thing I hadn’t been able to track down is when it takes effect.
AUDREY: My impression was that it was immediate, but I have to say that having the flu and trying to read legislation at the same time is not really a good idea.
CHRISTIE: Oh, come on Audrey. All that nap [inaudible].
AUDREY: I slept for like 30 hours. So I looked at it and I had absolutely no idea what it was saying. So I wish I could tell you, but I tried.
CHRISTIE: Well, that’s okay. We can follow up with that. I think we understand the main part of it which is in ’96, I think ’96, we had the Communications Decency Act. It did a number of things regarding online communications like the internet. And keep in mind this is ’96, so this is just as the Internet was becoming a thing in the mainstream.
AUDREY: And this was such a big deal because there were all of these people in politics and otherwise who really wanted internet communications to be restricted to things that we could all agree on, which is a very conservative take on the internet content. And so what happened in this act was really important for having what we think of as the free speech on the internet.
CHRISTIE: Right. So there is a provision, Section 230, in this act that basically said…Wikipedia has a really good summary. It says, “Provides immunity from liability for providers and users of an interactive computer service who publish information provided by others.” And here’s the quote. “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” So basically this exempted platforms and service providers from prosecution or liability for things that their users publish. Is that a good summary?
AUDREY: Yeah. And anything other than like deliberate criminal acts would be separate.
CHRISTIE: Like if it was clear that the provider participated in the communication, that’s not exempted.
AUDREY: Yeah. And this has actually been at the root of a lot of discussions about moderation of content on the internet, because a lot of platforms have been concerned that if they moderate content beyond a certain point, they would be seen as overstepping this. They would stop being an open platform and start being somebody who had responsibility for the content. I don’t know how reasonable that liability is, but that’s something that I’ve heard over and over again.
CHRISTIE: Yeah. I think the thing is there are three parts to the test to get the immunity benefit. And the third is that the information must be provided by another information content provider; that is, the defendant, the platform, must not be the information content provider. And so the question is, at what point in the continuum of moderation do you become a co-provider of the information? Like, does simply deleting a post count, or does editing it count? If you edit it for language or whatever, are you then a co-creator of it? And I think there’s been a body of case law around this.
AUDREY: There must be by now. We’re talking about 20 years.
CHRISTIE: This was a big deal that they got the Section 230 in the Communications Decency Act way back in ’96. I’m pretty sure Wyden was a big part of getting that in there.
AUDREY: He’s a co-author.
CHRISTIE: He’s the co-author. Okay.
AUDREY: Yeah. There was one other person, and he’s a co-author.
CHRISTIE: And I think it’s fair to say that Section 230 really helped user-generated content, or platforms where people can contribute content. I don’t know what I’m trying to say.
AUDREY: Everything you think of as social media exists because of this.
CHRISTIE: Yes. Thank you, Audrey.
AUDREY: Everything you think of as social media, everything you think of as open hosting. We could go back to LiveJournal, MySpace, YouTube obviously. Every single one of these sites that we take for granted is dependent on this act.
CHRISTIE: Including your own blog if you have comments open. I think that can be argued.
AUDREY: Yeah. And I started to wonder about WordPress.com because they provide hosting. I mean, they don’t evaluate my blog before I spin up another one. They provide that hosted service and I think a lot of kinds of hosting providers would have to start thinking about this in a different way.
CHRISTIE: Oh, so not just comments but…
AUDREY: Well, it’s a platform.
CHRISTIE: Right.
AUDREY: I think it has the potential to go fairly deep.
CHRISTIE: Because what SESTA/FOSTA does is basically rewrite that Section 230 to exclude anything that looks like sex trafficking.
AUDREY: Right, and with a fairly broad idea of what that involves. Really, the most immediate effect is anything involving sex work.
CHRISTIE: Including consensual sex work.
AUDREY: Correct. And this is a deliberate overreach, the conflation of consensual sex work with coerced sex work. That’s a very deliberate thing that’s been put together. This thing that they want to do, it’s like a cut-out right in the middle of the law that we have. And by saying that we’re going to exempt from the previous protections anything that might contribute to, participate in, or allow for sex trafficking, it automatically squashes every conversation around that.
CHRISTIE: It includes consensual sex workers of legal age just talking with each other about their trade and how to stay safe.
AUDREY: Including things people do for safety, protection, monitoring. Advocates who actually work with people escaping these coerced situations say that having it out on public sites is vital to their ability to help. And law enforcement says this is vital, because the further and deeper off track this stuff goes, the harder it is to find, the harder it is to monitor, the harder it is to help anybody, to be able to track victims. So if you take down every site that people can use to advertise, what you’re doing is taking down our ability to help.
CHRISTIE: Yeah. And the ability for people to engage in that work in a safer way both in terms of procuring clients and also in terms of connecting with their community about it. Groups that work with sex workers have been against this legislation. Many internet privacy and freedom groups like EFF and Wikimedia Foundation and sort of the players you might expect have spoken against the legislation. Oddly enough, the bigger tech firms reversed their position at some point or they supported it all along.
AUDREY: There was a fairly minor change in the legislation that I think basically had the effect of saying if you employ automated moderation against these things, then you’ll be okay. So if you automatically block everything that says prostitution, or some bucket of terms, then you would also be okay, like considered to be moderating effectively. And that was enough that a bunch of companies flipped their position.
CHRISTIE: Probably the bigger ones.
AUDREY: Yeah. And I mean the really obvious criticism is that they could withstand the lawsuits. They can actually spend money proving that they’re in the right, whereas lots and lots and lots of other people can’t.
CHRISTIE: I think something that’s important to keep in mind is that this effort to pass this kind of legislation was widely supported. Only two senators voted against it, which is concerning.
AUDREY: And again, Wyden was the co-author of the original provision. He probably knows the most out of anybody in Congress about the effect of this because he is also on the Intelligence Committee. He’s seen a lot of this other stuff about how data gets used. He probably understands this the most out of anybody.
CHRISTIE: And it’s also part of a larger effort to further criminalize and marginalize those who engage in consensual sex work. As I was researching this for the podcast, I found things like anti-sex work groups that fund law enforcement activities, and to get the funding, they have to take a certain position on sex work.
AUDREY: Yeah. And conflating sex work with sex trafficking, and only focusing on sexual labor in trafficking, are a couple of tactics that anti-prostitution/anti-sex work groups use over and over and over again. I don’t have it handy, but there’s a lot of work that’s gone into documenting this, and there’s stuff that you can find if you’re looking.
CHRISTIE: I’ll try to find some overview articles that we’ll include in the show notes if you’re kind of new to this topic. Already, we’re seeing the effects of this. Craigslist shut down its personals, which I think is a pretty big deal because that wasn’t just about sex work.
AUDREY: They had already previously closed the sections that were most likely to be used by sex workers. This is going a lot further.
CHRISTIE: Reddit shut down a bunch of sub-reddits like things where sex workers talk to each other. And it’s not even signed yet. We’ve got a post from the Tits and Sass site.
AUDREY: They’re great. I mean, they’re a self-advocacy group and they do a lot. I mean, they’re speaking by and for other sex workers.
CHRISTIE: And so they have “Post-SESTA/FOSTA Self-Censoring for Twitter, Reddit, and other Social Media”, which gives some practical tips on how you might want to alter your Twitter presence so that you don’t get automatically banned under platform changes that are intended to comply with this legislation.
AUDREY: Yeah, making the assumption that there’s going to be a lot of stuff that’s automated and the things that you’re going to encounter first are these big sweeps. Trying to make sure that you don’t have to lose your entire internet presence.
CHRISTIE: And if you want to hear from sex workers themselves about how this is going to affect them, there’s a hashtag, which I just forgot. I’ll dig it up. Do you know what it is, Audrey?
AUDREY: Not offhand.
CHRISTIE: Okay, we’ll find it and you can check that out. That was really a [inaudible] to me too.
AUDREY: The person that I follow who reports on this the most is Melissa Gira Grant. It’s funny, I’ve been following her work since way back when she was on Valleywag, but she does just a ton of reporting and a lot of putting the pieces together and always focuses on sex workers’ perspectives in the stuff. So she’s a really good starting point too.
CHRISTIE: Yes, and I definitely know that. I think that’s actually where I saw the hashtag.
AUDREY: One other thing about Wyden’s commentary. He tweeted out the same things that he said in his statement before the vote on the bill. And he had a point about how what they offered in Section 230 was an opportunity for a lot of companies, a lot of organizations on the internet, to say what they needed to say. And the lack of responsibility shown by some of the biggest players is what’s killing this. Their refusal, like the latest statement from Zuckerberg about Facebook and Cambridge Analytica was still coming back to this thing: “Well, I don’t want to be the guy that has to decide what’s appropriate.” And Wyden called that kind of thing out really specifically, to say that’s the reason you’re seeing them come after this part of the law. I mean, sex work is the first thing they go after. There will be other things, other topics that will come up, to just create another hole here. And the refusal of responsibility is really core to this.
CHRISTIE: Yeah. Well, I think that Twitter thread from Wyden is busted. So we should find his whole statement, if there is one.
AUDREY: Maybe he’s got like a press release transcript or something.
CHRISTIE: Yeah. “Big tech companies have become bloated and uninterested in the larger good. They’ve made it easy for the creators of vile content to move from one platform to the other.”
AUDREY: He has some pretty harsh words.
CHRISTIE: I think they’re very much deserved.
AUDREY: Yeah.
CHRISTIE: Especially when you put it side by side. I was hearing the Zuckerberg comments on NPR or something. Sometimes when I hear stuff like that in people’s own voices, it like goes deeper into my brain for some reason. And I was sitting there just getting so mad. I was like, “Have a backbone! Make a stand.”
AUDREY: I mean, it’s just some kind of intense denial to say that. He controls the company, legally and financially and everything. They can do whatever they want.
CHRISTIE: It’s your job.
AUDREY: Yeah, it is absolutely his job. And if he doesn’t feel qualified, he should let somebody who’s qualified do it. That is utterly possible.
CHRISTIE: Okay. Slack had some privacy changes, privacy policy changes or whatever, product changes. I don’t know what they’re calling it.
AUDREY: Yeah, the privacy policy. And it’s an interesting [inaudible].
CHRISTIE: Because of…?
AUDREY: We’re talking about all your data.
CHRISTIE: All right. I’m getting to the point where I’m like, is this really going to be different than the others? It’s just so much all the time. I feel like we’ve been saying that since November 7th, 2016, or whatever the day after the election was.
AUDREY: Yeah. It’s sort of funny this week though, because when I woke up Friday morning, I was like, “Wow, that was a lot of sleep.” Somebody said, “Well hey, good on you for sleeping through all of the hour by hour stuff going on.”
CHRISTIE: Right. So Slack previously had this thing called the compliance export, which is only available at certain top-tier paid plans. You have to enable it ahead of time, and workspace members would be notified. But now they’re sunsetting that, and instead they’re just going to have a general export tool that workspace owners can request access to, a self-service export tool to download all data from their workspace. This includes content from public and private channels and direct messages.
AUDREY: And this is important to something that we talked about a couple of weeks ago about the use of Slack and workplace organizing.
CHRISTIE: Yeah. So basically you have to consider that your private messages on Slack are not private.
AUDREY: And retroactively. Two weeks ago we could have said, yeah, unless your workspace has this particular thing set, they won’t be able to get at it. Now, Slack has changed that.
CHRISTIE: And there is a process for people who don’t pay, too, to request it. They have to either provide the company with a valid legal process, consent of members, or a requirement or right under applicable laws. It seems like there’s very little you can do about it. Something I hadn’t realized, and it’s not in this Fast Company article, is that you can set an expiration on your direct messages. Did you know that?
AUDREY: Yeah. I vaguely remember that. I haven’t been using Slack very heavily for a while.
CHRISTIE: Assuming that’s true, you should do that if you care about this. And also if you’re going to be discussing anything that you wouldn’t want upper management at your company or anyone else for that matter to see, take it outside of Slack.
AUDREY: Yeah. I saw a little bit of eye rolling around this, like, “Oh, why are you using Slack for this?” But I’ve encountered this problem, and I know somebody else who’s told me about it too: when you’re just starting to organize around something, the barrier to use is really important. There are a lot of people who, for example with salary conversations, won’t jump in if they have to go to another website even, another conversation point. But you might be able to get them into a side channel in an existing one.
CHRISTIE: Where are you going to go? Everybody’s using Slack.
AUDREY: No, I mean I’ve also seen that happen. Set up another Slack.
CHRISTIE: It’s an ouroboros of shitty privacy.
AUDREY: The ideal is to take the conversation to Signal, [inaudible] Signal group, getting people to jump in. It’s another step, it’s another barrier, and maybe that’s not where you can start your workplace conversations.
CHRISTIE: I don’t know that the open source alternatives to Slack could solve this problem either. I mean, they’re open source so it’s possible that someone could work on a solution where things…I guess that you could encrypt your messages that you put into Slack.
AUDREY: Sorry, I had to take a coughing break.
CHRISTIE: No, it’s okay. I guess you can encrypt the text that you put into Slack.
AUDREY: Yeah, but without a plugin that makes it easy to decrypt them.
CHRISTIE: There’d be a lot of copying and pasting, yeah.
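For the curious, here’s a minimal sketch of that encrypt-and-paste idea, using the Fernet recipe from Python’s cryptography package. The message and key handling are ours for illustration; you’d still have to share the key outside Slack and move ciphertext around by hand, which is exactly the friction being described.

```python
# Minimal sketch of encrypting a message before pasting it into Slack,
# using the Fernet recipe from the "cryptography" package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # share this once, out-of-band (not in Slack)
f = Fernet(key)

ciphertext = f.encrypt(b"salary conversation: let's compare notes at lunch")
print(ciphertext.decode())  # this opaque blob is what you'd paste into Slack

# The recipient copies the blob back out and decrypts with the same key.
print(f.decrypt(ciphertext).decode())
```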
AUDREY: Which you might as well just use Signal. There’s a desktop app and it’s nice.
CHRISTIE: It is, yes. Although the desktop app doesn’t rotate pictures that are taken on phones, which I don’t understand.
AUDREY: Oh!
CHRISTIE: So if someone sends me a photo from a phone, I just tilt my head.
AUDREY: From my web and social media development days, I know one possible technical reason for that. Rotation is a separate piece of data. It’s part of the image metadata.
CHRISTIE: And so, is that metadata not getting to Signal, or are they just not reading it?
AUDREY: It used to be that ImageMagick just wouldn’t read it, but I’m thinking of years ago. So I would assume that most libraries would look at it now.
CHRISTIE: Right. Signal is the only thing I’ve come across recently that doesn’t rotate them.
AUDREY: I think you probably would’ve seen it a lot more like five or eight years ago. It was a recurring bug on the website that I worked on. I was like, “But it’s not giving me that data.”
CHRISTIE: Yeah. There’s not really any way to infer it from the image itself.
AUDREY: No. I mean, we really just needed like an editor and, “All right, change it to [inaudible]. Save.”
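A quick sketch of the orientation issue being described, using Pillow. The file names are hypothetical; the point is that a phone often stores the pixels unrotated and records the intended rotation in EXIF tag 274, which a viewer can honor or ignore.

```python
# The pixels in a phone photo are often stored unrotated; EXIF tag 274
# ("Orientation") records how to turn them. Apps that skip the tag
# show the photo sideways.
from PIL import Image, ImageOps

img = Image.open("phone_photo.jpg")  # hypothetical file
print("EXIF orientation:", img.getexif().get(274))

# Pillow can apply the tag and hand back an upright image.
upright = ImageOps.exif_transpose(img)
upright.save("phone_photo_upright.jpg")
```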
CHRISTIE: Okay. Anything else on Slack?
AUDREY: I’m just thinking there’s sort of a group-use tip that I can offer, even if you’re going to delete stuff. When you add people to groups, it’s better to start a new group unless you’re absolutely certain the entire history is safe to share. And similarly, when people leave groups, it’s a good idea to delete the group and start over anytime you’re having sensitive conversations. And that’s true for both Slack and Signal.
CHRISTIE: Right. And you might argue…I mean, that applies to other things too, like a Google group or any type of group communication, actually, I think.
AUDREY: Yeah, those are the two that I think are really easy to overlook.
CHRISTIE: One thing, this tweet popped out at me from @bcrypt. It says: a fun way to monitor someone’s IP address: one, create a paid slack workspace; two, get them to join your slack; three, now you can see their IP address and device type in Slack’s access logs as long as they’re logged in and have the Slack webpage/app open. So I don’t think that means that anyone who is an admin, which I’m not sure Slack always makes obvious, can see your IP address at any given time. I think that’s a little unusual for a hosted service, is it not?
AUDREY: Yeah. I can’t think of a good example either way.
CHRISTIE: I just wanted to flag it because I mean, one of the things that I hope that we can do to this podcast is just raise awareness, situational awareness about what data you’re sharing, maybe whether you realize it or not and what data can be collected about you. And that’s something I hadn’t thought about. I probably am a member of a dozen slacks at this point. I’m not logged into them all the time, but they’re generally community ones. And I don’t really know who runs them. And there’s a lot of information you can get about someone from their IP address and any admin of the slacks would have access to that. So just something to keep in mind. I don’t know.
AUDREY: Yeah. For example, if you need to hide your location, hiding your IP address is important to that.
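For reference, the logs the tweet describes are exposed on paid plans through Slack’s team.accessLogs Web API method. Here’s a rough sketch of what a workspace admin could pull; the token is hypothetical, and the exact response fields are worth checking against Slack’s current documentation.

```python
# Rough sketch: on paid plans, Slack's team.accessLogs method returns
# recent logins for workspace members, including IP and user agent.
import requests

ADMIN_TOKEN = "xoxp-hypothetical-admin-token"  # needs an admin-level token

resp = requests.get(
    "https://slack.com/api/team.accessLogs",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
for login in resp.json().get("logins", []):
    print(login.get("username"), login.get("ip"), login.get("user_agent"))
```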
CHRISTIE: So our last topic. We only got 10 minutes. Ten minutes is not enough time to digest this massive announcement from Google.
AUDREY: It got almost no attention because, oh my God, Facebook.
CHRISTIE: So the Google News Initiative: Building a stronger future for news. This is like, I almost want to say omnibus. I’m not sure that’s the right word. But there are announcements, I feel like, for a dozen different things in here. And Audrey, I’m not sure which one you wanted to talk about, if it was specifically the Subscribe with Google or if it was just all of it. What stuck out to you about this in particular?
AUDREY: Okay, hold on. Let me open it again. Sorry.
CHRISTIE: This is the last article I read when I was preparing and I did that like, “Oh crap, I needed an hour to go through this, not 20 minutes or whatever.”
AUDREY: Fortunately for you, I stared at it several times over so that I can find the most amazing sections. Right off the bat in this Google News Initiative, there’s kind of this great, I don’t know what to call it. They’re describing why news and journalism are so important to Google’s interests. And so, Philipp Schindler, the Chief Business Officer whose name is on this thing, says, “That’s why it’s so important to us that we help you drive sustainable revenue and businesses. Last year, we paid $12.6 billion to partners and we drove 10 billion clicks a month to publishers’ websites for free.” And that just kind of killed me, because I didn’t know that I was supposed to pay for access to Google. Am I grateful? I don’t know, I have to think about this.
CHRISTIE: Everything is worded like that. You can tell this is what you hire PR and communications people for, because they make everything that Google, or Alphabet, or who cares, is providing seem like God’s gift to humanity. I guess it’s obvious how we feel about it already.
AUDREY: There are a lot of individual pieces here that we’ve critiqued before, because Google gets to drive the narrative about it, because they are big and powerful and we’re just two podcasters sitting here with the flu. They do these things and they announce them and they’re like, “This is grand,” and then their partners are like, “Yep, it’s grand,” because they need the support. So they’re able to come back to these things like “we worked with the industry to launch the open source Accelerated Mobile Pages project to improve the mobile web”, where we know that they defined that, they drove that. There is very little public evidence of a real partnership there. So the subscription thing, it sounds not that weird. Like if you have, I don’t even know what they actually call this, a Google account for buying things.
CHRISTIE: That’s Google Play account, yeah.
AUDREY: I guess if you’re on Android, then that’s probably something that you’re using. But if you have that then you will be able to manage your subscriptions to various news publications through that. That’s the thing that I think they’re highlighting the most. And fine, I can do that on the iPhone too, something very similar to that.
CHRISTIE: Through the App Store.
AUDREY: Yeah. And through, I don’t know, iTunes, iBook, whatever. So other platforms have integration like that. But the thing that actually got my attention, got me to come back to this was this next section where it says evolve business models to drive sustainable growth. And they talked about how they are…let me find the right piece here. They talked about how they’re using data to try to get more people to subscribe. And it says, “In October, at our Partner Leadership Summit…”
CHRISTIE: I knew this was going to be this, yes.
AUDREY: “We told publishers about how we’re experimenting with ways to grow their subscriptions using Google data, machine learning, and DoubleClick infrastructure. We’re now in the early stages of testing a ‘Propensity to Subscribe’ signal based on machine learning models in DoubleClick to make it easier for publishers to recognize potential subscribers and to present them the right offer at the right time.”
CHRISTIE: I’ll save my barf sounds for when you’re done with the quote.
AUDREY: Here’s another paragraph too that gets into the detail a little bit. This was why I wanted to highlight it, because we’re talking about data and how it’s used for and against us. And there’s a story about news publications. There are a couple of stories we could tell about what’s wrong with news publications right now, and declining subscriber numbers is one of those. So it’s a point of intervention for Google. But the bigger thing that I’m hearing from journalists a lot now is about the finances of their owners. The thing that’s actually killing news publications right now is the companies that own them and their desire to just scrap them for profit. And this is nothing about that. This just seems like another way to, what’s the metaphor about the rock and squeezing it?
CHRISTIE: You mean blood from a stone? That one?
AUDREY: Yeah, I think that’s where I’m going.
CHRISTIE: Well, it’s more surveillance capitalism. And we talked about that with AMP email. There was another part of this about how once you subscribe, whatever publications you’re subscribed to with your Google account, it’ll start putting those in your search results. So there’s just a lot more vendor lock-in to this, too.
AUDREY: Yeah, and I mean that can both be really good. Like of course, I would love it if publications that I rely on show up in my results. But at the same time, that Google is the organization that collates [inaudible] that does get really problematic and that does give us less and less control over what we’re seeing and why we’re seeing it.
CHRISTIE: And just knowing that Google knows everything that you’re looking at continues to be problematic.
AUDREY: Yeah. And then the last part of their announcement is that they’re going to put in a bunch of money to help people understand fake news.
CHRISTIE: The $300 million?
AUDREY: I think that’s the entire commitment.
CHRISTIE: That is nothing. That is so paltry. Anyway…
AUDREY: You know, I was trying to figure that out this week too. Like, I need a scale of measurement for Google initiatives. Not just Google, but Facebook and whatever else. In conference fundraising, I understand the difference between a $500, $1,000, and $10,000 kind of pledge. For these big initiatives, I’d love to have a little index to understand how it affects their budget and their activities.
CHRISTIE: Right. Well, it’s $100 million a year and considering these are like a dozen different initiatives. I don’t know, it sounds like very little to me.
AUDREY: It probably isn’t.
CHRISTIE: But I’m guessing that each of these things also have their own funding streams too.
AUDREY: I mean, it sort of emphasizes that the partner publications are sort of paying for this. They’re getting these things for free, which as we know means that the data is worth something. I’m trying to think, there was one other little bit here. Well, I guess just the way that they’re spinning the fake news stuff as a media literacy problem, when we know that it’s a lot more than that.
CHRISTIE: Okay. Are we doing things we love on the internet this week?
AUDREY: I believe so.
CHRISTIE: All right. What have you got, Audrey?
AUDREY: I actually kind of cheated and put two things in here. But one of them is super relevant to everything we’ve been talking about today. It’s a piece that Anil Dash wrote about 12 things everyone should understand about tech and they are all very valuable ways to think about what’s going on right now.
CHRISTIE: Cool. What’s the second?
AUDREY: It’s a bit from the developer of curl, the utility, about an email that he got from someone who wanted to know why his email address was showing up on their GPS display.
CHRISTIE: Interesting.
AUDREY: There isn’t a great explanation for it, because the person provided no info other than their car’s model. But it reminded me of something that a former co-worker once told me about how you should never put your email address in the release notes or the comments or anything else in a library that you release, because, as the co-worker commented, you can be getting emails from people 10 and 15 and 20 years later that are like, “It doesn’t work. I found your email address in this thing. So please tell me why it doesn’t work.” And it will give you a lot of contact with people whose problems you can’t debug, much like with the GPS system, because who knows how curl is actually being used in that context.
CHRISTIE: Right. So, okay.
AUDREY: So it’s kind of funny but awkward.
CHRISTIE: So, this person’s email could also be like in your toaster too, right?
AUDREY: Yeah. I mean, curl’s really useful.
CHRISTIE: It is.
AUDREY: Who knows what appliances are making use of it.
CHRISTIE: And git doesn’t make this easy, because every git commit has your email address in it. All right. My thing is there’s this site, dialectsofenglish.com, which has a survey. It’s free and you have to sign up, but they do ask you to provide the zip code where you grew up, your gender (the gender options are great), and age, and then they ask you different questions about basically how you pronounce things. And then each one has a heat map at the bottom that shows you how similar or different your response is compared to those regions. And I just think it’s kind of fun because there’s such a huge variation in American English, both in terms of how we pronounce things and also the names for things, like what do you call it when it’s raining and the sun is shining. I didn’t have a word for that growing up in California. Do you, Audrey?
AUDREY: Sunshowers.
CHRISTIE: Okay. So anyway, it’s just kind of fun thing to do.
AUDREY: Yeah, I kind of love the regional terms for just really mundane things.
CHRISTIE: There are questions like what do you call water that is carbonated but not flavored. Yeah, so it’s kind of fun. Check it out. And I saw that they’re in the process of updating it. Some of the things they don’t have heat maps for, so doing the survey will also give them data for their project.
AUDREY: And good useful data, not the other kind.
CHRISTIE: All right. I think that’s our show. It’s kind of fun recording on a Saturday and I think we both get a little congratulations for being sick and making it through a whole hour of recording.
AUDREY: I just really couldn’t let go of all the news that we could cover this week. I felt very determined that we’re going to talk about it.
CHRISTIE: Great. And of course I’m behind editing last week’s episode because I was sick, but we’ll get caught up and we’ll get this one edited. So you might kind of get two episodes in a row. That’s not bad. And yeah, thanks everyone for listening. We’ll talk with you all again soon.
And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to podcast@recompilermag.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.
The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.