Episode 58: A whole lot more EULAs

Download: Episode 58.

This week Audrey and I chat about GDPR (Europe’s new privacy law about to go into effect) and Facebook, the PenAir hack, Telegram, and a new tool law enforcement has for cracking iPhone passcodes. Enjoy!

Show Notes

Community Announcements

The Responsible Communication Style Guide is headed back to the printers!

When we sold out of print copies of The Responsible Communication Style Guide last fall, we promised to do another print run in early 2018. We’re happy to announce that we’re ready.

If you’ve been waiting to pick up a printed book (or enough for the rest of the office so they stop filching your copy), this is your chance. Order now!

Now Broadcasting LIVE most Fridays

We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune in.

We love hearing from you! Feedback, comments, questions…

We’d love to hear from you, so get in touch!

You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to podcast@recompilermag.com.

Transcript

CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.

Episode 58. This week Audrey and I chat about GDPR (Europe’s new privacy law about to go into effect) and Facebook, the PenAir hack, Telegram, and a new tool law enforcement has for cracking iPhone passcodes. Enjoy!

And hey, we’re getting closer and closer to 100, Audrey.

AUDREY: This is pretty cool.

CHRISTIE: Got any announcements for us?

AUDREY: I do. We are extending the deadline to order copies of The Responsible Communication Style Guide, the print edition. We’re going to stretch it out another month just to give people really a chance to put their orders in and for us to make sure that we’ve hit our pre-order minimum. And I just want to encourage people if they don’t have a copy yet to think about it. This is just a really great workplace reference. I’ve talked to people outside of tech who are making use of it because there just aren’t that many comprehensive resources like this to help you understand better ways to write about the people who use your software, the people you work with, the people in your community.

CHRISTIE: And it’s not just for specific writing professions, right? I mean, if you’re writing marketing material or newsletters or blog posts or user documentation, sort of any of that, this would be an appropriate resource, right?

AUDREY: Yeah, and the book includes some guides to how to think about using material in this context.

CHRISTIE: All right. So we’ll have a link in the show notes and you can go to RCStyleGuide.com. Any other announcements, Audrey?

AUDREY: That’s it for this week.

CHRISTIE: First thing we want to talk about is the GDPR, the General Data Protection Regulation.

AUDREY: And I’m really glad we’re covering this because I have had a hard time understanding what’s actually changing. And so I hope that maybe we can clarify that for other people too.

CHRISTIE: I keep getting it confused with the GDR, the German Democratic Republic.

AUDREY: Also relevant to data collection, but yeah not the same thing.

CHRISTIE: No. The acronym is just so close. And I think because…what’s the full acronym for North Korea, like the Democratic People’s Republic…

AUDREY: DPRK.

CHRISTIE: Yeah. I think somehow those two things get mooshed in my brain and then I get it confused. Anyway, so this is a European Union Regulation that’s about to go into effect next month.

AUDREY: Yeah. I think possibly in just a month, like May 20th.

CHRISTIE: What can you tell us about this, Audrey?

AUDREY: Well for one thing, it replaces a law from 1995. So it’s quite an update that’s looking at the breadth of how data is used now in online services. It’s also different from a lot of previous EU technology privacy laws, which have only affected companies in the EU providing services to EU citizens or EU residents; this one holds any company that interacts with people in the EU responsible as well.

CHRISTIE: Yes. Anyone processing data of EU residents.

AUDREY: Right. And its interpretation of processing is pretty broad. I had a hard time getting a sense of how that will be clarified, whether this is going to look like those cookie pop-ups that some sites have, which are kind of a weird interpretation of privacy and anti-tracking law. They’re not a very useful implementation of it. And so it was hard for me to understand whether that’s what this is going to look like or if it will just be a whole lot more EULAs that everybody has to look at before they can do certain things.

CHRISTIE: It introduces this idea of consent which I thought was interesting.

AUDREY: Yeah. And not just incidental consent like, “Well, you opened the box. There you go.” But that users need to actively consent to their data being used in the way that it is.

CHRISTIE: Wikipedia says data may not be processed unless there’s at least one lawful basis to do so. And it has a bulleted list. The first of which is that the data subject has given consent to the processing of personal data for one or more specific purposes.

AUDREY: And so, because it seems just so broad the way that I’ve heard it described, I’ve been trying to understand, especially because there are penalties associated with this. I’ve been trying to understand, like, could you potentially be held liable if you have logging? It says that recording IP addresses could be a factor, that this could be a type of processing data. Is just your logging and analytics likely to be a problem? Is it if you reuse information for marketing? And yeah, I don’t know. Like customer email addresses that aren’t direct mailing list sign-ups, will it be a problem for those, or for other kinds of customer records, other kinds of just basic web logging and tracking, or is it going to be a little bit more explicitly about the kind of thing that Facebook does?

CHRISTIE: What have you figured out so far? Now you’ve got me wondering the same thing.

AUDREY: Oh yeah. No, I don’t know. I felt like everything that I’ve read on it doesn’t address that question directly. And so I’m still kind of looking for a legal explainer. It’s possible the answer is we don’t know. We’re going to wait and see how the courts want to enforce it. But I’d like to find a legal explainer that digs into that a little bit more because right now it seems like a big question mark for small businesses, what you’re really going to be expected to do. And obviously, I want to make a reasonable effort to comply with this sort of thing. I care about people’s privacy. I’m trying to not reuse things in shady ways unless it’s for a very specific and notified artistic purpose. And so yeah, I want to understand how I can effectively follow this.

CHRISTIE: There’s also a component that gives people a right to access their personal data.

AUDREY: And I think that’s been a focus of the EU in general, that people should be able to see everything that’s being recorded about them and to ask for that not to be recorded, to ask for things to be deleted. One of the things I looked at said that this could affect call recording, like customer service call recording, and that just the spiel at the beginning of the call might not be enough. People need to be able to halt the call or halt the recording and not just hang up.

CHRISTIE: Yeah, and I do wonder at what point. Is this enforceable on an individual level, or do you have to be an organization of a certain size?

AUDREY: Well, the penalties are with respect to your revenue. So in that respect, if you have no revenue, I’m not sure what else they would do. Or if your revenue is pretty minimal, it was something like 4% of…the phrasing wasn’t something I was used to, like 4% of worldwide turnover, something like that. Does that sound familiar? Yeah, I took it as sort of like net revenue but I’m not exactly sure. But still for some organizations, the 4% is not really a big number.
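For reference, the fine structure being described here is spelled out in the regulation itself: the top tier of administrative fines is capped at the greater of 20 million euros or 4% of total worldwide annual turnover (roughly, gross revenue) for the preceding financial year. A quick back-of-the-envelope sketch in Python, with invented turnover figures:

    # GDPR Article 83(5): the maximum fine is the greater of EUR 20 million
    # or 4% of total worldwide annual turnover for the preceding year.
    # The turnover figures below are made up for illustration.
    def max_gdpr_fine(turnover_eur: float) -> float:
        return max(20_000_000, 0.04 * turnover_eur)

    for turnover in (2_000_000, 50_000_000, 5_000_000_000):
        print(f"turnover {turnover:>13,} EUR -> fine cap {max_gdpr_fine(turnover):>13,.0f} EUR")

Which is part of why the 4% number isn’t the scary part for a small organization; the flat 20 million euro figure is.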

CHRISTIE: So it sounds like a personal blog wouldn’t be subject to this if the blog isn’t monetized in a business capacity.

AUDREY: Yeah, that was how I was treating it. Again, I would kind of want to look back and just try to find somebody that’s offering guidance on this for individuals and for small businesses.

CHRISTIE: Yeah, because for example like I run my own websites and I have Apache logs that record IP address.

AUDREY: Most people have something to that effect. Not logging comes up a lot in terms of security and privacy, especially for people dissenting against their government, and not logging is the active choice that you have to make. It’s not the default choice.

CHRISTIE: Well, it can be useful sometimes to have logs if you’re being harassed or whatever.

AUDREY: Yeah. There are legitimate reasons that you want those. It isn’t just incidental shadiness.
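One concrete data-minimization step that often comes up in this kind of discussion (a sketch of our own, not something prescribed by the regulation or the articles mentioned here) is to truncate IP addresses before they’re stored, so logs stay useful for debugging and abuse response while holding less personal data:

    import ipaddress

    # Minimal sketch: zero the host bits of an address before logging it.
    # The prefix lengths (/24 for IPv4, /48 for IPv6) are illustrative
    # choices, not a compliance recommendation.
    def truncate_ip(raw: str) -> str:
        addr = ipaddress.ip_address(raw)
        prefix = 24 if addr.version == 4 else 48
        network = ipaddress.ip_network(f"{raw}/{prefix}", strict=False)
        return str(network.network_address)

    print(truncate_ip("203.0.113.57"))         # 203.0.113.0
    print(truncate_ip("2001:db8::abcd:1234"))  # 2001:db8::

Whether something like that is enough for any particular situation is exactly the kind of question a legal explainer would need to answer.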

CHRISTIE: The MailChimp article that you linked to had some good stuff in it. I mean, the Wikipedia article seems to be fairly comprehensive which sometimes works against understandability.

AUDREY: Right.

CHRISTIE: But the MailChimp one, I thought, had more concise details and also included the stuff they were going to do.

AUDREY: MailChimp was the only service I’m using that I felt like was making a good effort to ensure that users of their service could comply by explaining kind of what we could do on our side.

CHRISTIE: That also makes me wonder who’s liable: is MailChimp liable for compliance, or are you liable for compliance as the MailChimp customer?

AUDREY: Right. Or both.

CHRISTIE: Right?

AUDREY: Yeah, that also isn’t clear to me right now.

CHRISTIE: TechCrunch had a review of the…I guess it sounds like Facebook did a press event and gave a preview of the changes they’re making. Is this just changes you’re going to see if you’re in the EU?

AUDREY: Yeah, that’s a good question because that was one of the things that came up with the congressional hearings and the stuff people are digging into. Was that a week ago? Two weeks ago?

CHRISTIE: Yeah, because the episode we recorded a week ago that I’m going to post today talks about it and I found that picture of Zuckerberg’s notes and it’s got some bits in it.

AUDREY: Okay, yeah. So originally, there was some back and forth about whether Facebook was going to do this for everyone or just for people in the EU. And I’m not sure offhand which one they decided on.

CHRISTIE: And I read this whole article and I don’t remember what it said either way. The main thing that I got from this article was that Facebook is basically…this article runs step by step through the new features. They’re basically complying with the letter of the law but not necessarily the spirit, like they’ve still made choices with their product design to basically…It says, “But with a design that encourages rapidly hitting the ‘Agree’ button, a lack of granular controls, a laughably cheatable parental consent request for teens and an aesthetic overhaul of Download Your Information that doesn’t make it any easier to switch social networks, Facebook shows it’s still hungry for your data.”

AUDREY: Yeah, they really seem to be gaming the privacy controls to try to encourage a particular outcome. And it’s not informed consent then. They’re really trying to step around that.

CHRISTIE: And there’s a handful of instances where it pointed out that your options with regard to personal data were to…it said there was a lack of granular control, like you either removed the data, which would remove it from advertising, but then you couldn’t separately control things like ‘use this data for advertising’ versus ‘use it to personalize my news feed,’ things like that. It was sort of all or nothing.

AUDREY: Right. Like either hide my sexual orientation entirely or use it for advertising.

CHRISTIE: Use it for everything including advertising.

AUDREY: Which doesn’t seem like a very useful set of options. I was really fascinated by the parental consent thing because I worked a little bit on a site some number of years ago that was for figure skaters, I think. But it had users under the age of 13. And so there was a whole thing that we had done around how the users could get parental consent to use the service. It was something like you could post your figure skating videos from your competitions or whatever, so like a little mini social network. And I remember that there was this whole big thing that we went through about how somebody needed to assert that that was an appropriate adult to provide like sign up consent for the kid. And that we talked a lot about how to make sure that the kids didn’t just circumvent it.

CHRISTIE: Yeah. So in Facebook’s case, they just have to enter an email of somebody and there’s no verification beyond that.

AUDREY: Right.

CHRISTIE: So they can enter another email they control or an email of their friend.

AUDREY: Yeah or just pick another friend on Facebook. I’m sure that there will be a round robin of kids all signing up for each other.

CHRISTIE: So, GDPR is coming. We want to know more about what individuals and small business owners should be doing.

AUDREY: And definitely if you’re at a bigger company or one that has a lot of interaction with people in the EU, you should probably be getting legal advice on what you need to be doing here.

CHRISTIE: Because you can have some exposure, some liability.

AUDREY: And if it’s not clear to the two of us to what extent you have that liability, then that sounds like time for a lawyer.

CHRISTIE: And that’s also another good reminder to frequently ask what are we collecting, why are we collecting it, do we need to collect that information?

AUDREY: For sure, yeah. And it’s just a basic respectful privacy. What am I going for?

CHRISTIE: Practice?

AUDREY: Yeah, practice. It’s something that you can do to show consideration for your users.

CHRISTIE: I don’t have anything more to say about GDPR at this point.

AUDREY: No. I’m going to be looking for reports as we get closer about what other companies are doing and keep an eye out for what the first enforcement actions look like.

CHRISTIE: So let’s say you get fired from your job. But for some reason, you still work at your job for a little bit knowing you’ve been fired. So you still have access to things.

AUDREY: Which is not a recommended practice for exactly this reason.

CHRISTIE: Yeah. And then let’s just say, I don’t know how much of this was premeditated, but you create an administrator account, a backup one or whatever and then you leave. And then you log in later and cause problems for your old employer. I have to admit I think there are serious aspects of this and I probably shouldn’t have enjoyed reading about it as much as I did. I think we’ve all been there with an employer.

AUDREY: Where there’s an employer that you would happily just put a wrench in their business, yeah.

CHRISTIE: Yeah, for whatever reason. And I could see being a woman in your more mature years getting fired and really wanting to give them the finger or stick a finger and think…

AUDREY: Yeah, just tell them to knock off.

CHRISTIE: So what we’re talking about, because I’m not just making up this story, and it’s not me. I should clarify. Not at all talking about me here. It was this woman, and it sounds like she lived in California but worked for a regional airline in Alaska called PenAir. You found this, Audrey. Do you want to give any more of the details?

AUDREY: Yeah. So she uses this administrator account to delete things that affected the run of business, so things like seating charts for upcoming flights. And so nobody could actually be checked in without that piece of data in the system. And I don’t remember if it said like how long this went on but because there was a company VPN that she needed to use to access this, they were eventually able to kind of connect when she was logged in with a VPN with when these things happened and narrow it down to this particular person.

CHRISTIE: Yeah, and then they went and searched her home and grabbed laptops. And on the laptops, they found activity logs from the VPN. So this article from HackRead points out that in this case…so it’s not uncommon for VPN providers to give logs to law enforcement. But in this case, it wasn’t the server side logs. It was the internal VPN service and then the actual client logs. I did find it kind of amusing that this article ended with, “Also, don’t forget to delete the activity logs.”

AUDREY: Yeah. Cover your tracks. And I thought that was sort of interesting that needing to use a corporate VPN to log in is kind of a difficulty of this sort of thing. It’s supposed to protect the security of the communication between you and a system that maybe normally only has on-site access. And it makes it a little bit easier to screw around or a little bit harder to screw around.

CHRISTIE: Yeah. So I guess if she wanted…what’s her name? Suzette Kugler. She has a Boxer, a pretty looking Boxer companion. What is she holding in this picture?

AUDREY: I don’t know.

CHRISTIE: Oh, it’s ribbons. An award winning Boxer companion.

AUDREY: Nice.

CHRISTIE: I get totally distracted by the dog picture. I don’t know what I was saying. Oh, covering your tracks. Yeah, I guess deleting client logs and then…I don’t know what additional…I guess launching the VPN from a different mount point or something.

AUDREY: Yeah, and maybe only doing it during active business times. So it would be a little bit harder to show that you were the only one logged in right then. But she pled guilty.

CHRISTIE: Right and she’s doing 50 hours of community service and probation for five years.

AUDREY: It sounds like they decided that because she had gone this far in her life without committing a major crime that maybe they’d be forgiving.

CHRISTIE: Yeah. And it sounds like it was sort of mischievous. I mean, the fact it was an airline though I would think would send some people’s hackles up.

AUDREY: But yeah, she affected the run of business, not the safety of the airline.

CHRISTIE: Okay. So if you’re going to fire people, just remove their access prior to firing them.

AUDREY: Yeah. Don’t tell them until…

CHRISTIE: I mean, it’s good for the employee too that they’re not…I think that the complement to that is if there are records that you need from your employer, make sure you have them on an ongoing basis.

In less entertaining topics, what’s going on with Russia and Telegram?

AUDREY: So a court ruled…let me open up the article.

CHRISTIE: What’s Telegram, first off? I’ll answer my own question. Telegram is like WhatsApp or Signal. And I know there are security differences between the three that I don’t remember all the details of. But I know that Telegram is very popular in certain locales or communities or whatever.

AUDREY: And it’s based in Russia.

CHRISTIE: Oh, okay. I don’t think I realized that. It was getting popular at Mozilla when I left, among the Mozilla staff and the volunteers.

AUDREY: It’s based in Russia, and one of the things that I was reading about here with this issue that came up with Telegram is that people get a lot of news this way. It’s not just being used for one-on-one communications but for shared news channels that go outside of the sort of big government-controlled space. So it’s how people find out what’s actually going on around them.

CHRISTIE: Although there were bits about…because it looks like they also interviewed a handful of Russians. I don’t know how they picked these folks.

AUDREY: It sounded like it was just friends of the writer.

CHRISTIE: Okay. But one of the things that caught my eye was that some of what they’re sharing is just gossip and that a lot of younger Russians feel really disenfranchised or that there isn’t a meaningful way to participate in the political process. I thought that was kind of interesting.

AUDREY: Yeah. Just how people respond to a really restricted environment. So yes, Telegram is used in some of the same ways as Signal but also a little bit like Twitter, it sounds like, for people to communicate. And because it’s based in Russia and because it is not controlled by the government, and because anytime you have secure communications there is the possibility of people using it for terrorism or for other kinds of criminal activities, there was a court in Russia that ruled that Telegram should be banned because the people who make it wouldn’t hand over the encryption keys to the government. So they wouldn’t let them decrypt communications.

CHRISTIE: Well, it’s developed by a London-based company.

AUDREY: Why did I think Russia?

CHRISTIE: It was founded by a Russian entrepreneur, Pavel Durov.

AUDREY: Oh, well there we go.

CHRISTIE: So, I give you full credit.

AUDREY: A Russian in London, okay.

CHRISTIE: So Telegram, this is what happens when I get woken up at 4:00 in the morning. So a court said, “Hey, you’re not giving us encryption keys. You’re helping to facilitate communication we don’t like. Go away. And terrorists use you.”

AUDREY: “So, let us crack this thing open and see everything everyone’s doing.” And Telegram said no. So, the court ruled that they should be banned. And banning such a thing is a little bit complicated. And in the attempt, they seem to have broken a lot of other internet activities, a lot of other internet access.

CHRISTIE: And part of this is because…so they blocked a handful of subnets that ended up being a lot of IP addresses, and they’re IP addresses of Amazon’s AWS and Google’s Cloud platforms, which, I hadn’t heard this, but a lot of Russian businesses use. And so what ended up happening is it’s not just Telegram that was blocked but a whole bunch of ‘legitimate services,’ like a bank.
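To give a sense of scale for why blocking a handful of cloud subnets sweeps up so much, here’s a small sketch that counts the addresses in a few cloud-provider-sized CIDR blocks. The ranges below are illustrative, not the ones that were actually blocked:

    import ipaddress

    # Illustrative cloud-provider-sized blocks; not the actual ranges
    # involved in the Telegram ban.
    blocks = ["52.0.0.0/11", "34.192.0.0/12", "35.184.0.0/13"]

    total = 0
    for cidr in blocks:
        net = ipaddress.ip_network(cidr)
        print(f"{cidr:>16}: {net.num_addresses:>9,} addresses")
        total += net.num_addresses
    print(f"{'total':>16}: {total:>9,} addresses")

Three entries on a block list and you’re already into the millions of addresses, most of which belong to whoever happens to be renting those cloud machines that day.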

AUDREY: I mean, that doesn’t surprise me. Think about if AWS got blocked in the State of Oregon or something, you would find that most of the things you use don’t work in some way.

CHRISTIE: I think a lot of things wouldn’t work. No one would be able to watch Netflix.

AUDREY: Yeah, every time there’s an outage, we see that even if it’s just that all of the photos on the website that you’re looking at don’t load because that was a CDN.

CHRISTIE: So, is this still ongoing?

AUDREY: I think so, yeah. This article is from two days ago. I hadn’t looked for an update on this but the people interviewed seem to think that it would be an ongoing issue.

CHRISTIE: And they’ve done more than just that. They’ve asked the Google and Apple app stores to stop allowing the Telegram app, and a few others like APK Mirror, which I don’t think anyone’s complied with.

AUDREY: It didn’t sound like it. And that’s something that we see come up a lot with government suppression and multinational tech companies that they kind of weigh their ability to do business and the need for compliance, how much is this going to affect their ability to sell services in Russia. And if the answer is not much at all then it’s not really much pressure.

CHRISTIE: The other thing that popped out for me from these interviews is that a lot of people are finding work-arounds anyway. Though a lot of this is, “I’m using a proxy server that a friend set up.”

AUDREY: Or popular ones that are just in the EU. There were a couple of quotes that I had highlighted that I thought were just kind of interesting and revealing. One of the interviewees said, “Young people are yearning for non-algorithmic shit.” And this comes up with social media so much. I thought it was really interesting that they called that out as a thing that Telegram was offering is just a stream of information that wasn’t being manipulated by the platform or the government.

CHRISTIE: It closes with this quote, and this has an ableist word which I’m going to say because it’s part of the quote. It says, “My mom’s take: Thank God repressions are handled by such incompetent idiots. My family remembers full well how it was when they were carried out by professionals.” I don’t know. That’s one person’s take. I don’t know how accurate that is. I don’t want to minimize the damage or the repression that is happening.

AUDREY: But it does sound like at least everybody that the author had talked to or was aware of, they were seeing this more as a nuisance and just kind of a sign that the government still doesn’t understand the internet which has good and bad effects.

CHRISTIE: And it’s really hard to…in the same way that it’s hard to ensure privacy on the internet, it’s also hard to block people from information.

AUDREY: You have to block quite a bit to get that effect. And when governments want some businesses to work and they want some activities to work, it ends up being like, “Well, you can shut down the internet entirely,” or you can sort of haphazardly go at it. It’s kind of hard to find the fine-grained control that maybe they’re thinking about, in part because of cloud services and because of CDNs and because of all the ways that things are distributed.

CHRISTIE: I think when I last used Telegram, I didn’t have this. They’re talking about these channels where there are, like, thousands of people in the channels, which I didn’t know was a thing Telegram could do.

AUDREY: Yeah.

CHRISTIE: It’s interesting.

AUDREY: Yeah, I haven’t used it.

CHRISTIE: When I was still at Mozilla and some people were using it, I feel like there were groups and they were limited to a fairly small number, like less than 100 or something. So it’s kind of interesting to see where it’s gone.

Last episode, I think we talked about possible tools the FBI had to break into iPhones. And this is more information on this. There’s a company called GrayShift. It’s not clear to me where this fits in with the timeline of what the FBI has been doing. I just sort of felt I had to read between the lines a little bit. But this company called GrayShift has a thing called GrayKey, not GrayBox. I don’t know why I started calling it GrayBox.

AUDREY: That they’re currently offering to law enforcement, selling to law enforcement.

CHRISTIE: Yeah. Maybe I started calling it GrayBox because it’s a literal box. It’s a box with two Lightning dongles coming out of it that you plug phones into, and it basically brute forces the passcode and somehow circumvents the setting to delete the phone if you get too many incorrect guesses.

AUDREY: Right, yeah.

CHRISTIE: Does that work on a timeout? So it’s not ten tries ever; it’s ten and then you’re locked out or it’ll delete.

AUDREY: Right, depending on what you have set up. But by default, every time you get it wrong, it increases that time out before you’re allowed to try again specifically to prevent this kind of thing. Somebody just sitting there and trying random passwords until they get in.

CHRISTIE: This didn’t go into detail about how exactly that’s bypassed. I think they’re keeping it a secret. But merely that it’s available, they’re selling it to law enforcement. It’s really spendy. I mean for me, it’s really spendy. I don’t know how this compares to…

AUDREY: A law enforcement budget for cracking things.

CHRISTIE: I’m looking for exactly how much it is.

AUDREY: Hundreds? Thousands? Tens of thousands?

CHRISTIE: Well, there were two models. There was a model that was cheaper where you got so many tries and then you had to pay, like, some kind of subscription. And then there was a pricing model where you got to buy it outright, and that was much more.

AUDREY: That’s funny that that’s the pricing that they’re doing.

CHRISTIE: I don’t know why I can’t find it now. Maybe it’s in a different thing. Anyway, the reason we are talking about this is because…there’s a timing on here, okay. Clearly, I’m not prepared enough. I think Matthew Green had some timings on this.

AUDREY: How long you have to wait before you can put in the password again?

CHRISTIE: Yes. And then therefore the more complex the password, the more guesses that are required, so the longer it would potentially take. So for example, a passcode made of 10 random digits would take as much as 25 years to crack, 12 years on average. And so this article starts off by saying, “Stop using 6-digit iPhone passcodes.”
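As a rough check on those numbers (our arithmetic, not taken from the article): if each guess costs about 80 milliseconds, a figure commonly cited for the iPhone’s hardware-entangled passcode check, then the worst case is just the size of the keyspace times 80 ms, and the average is half that. Published estimates differ, including the three-day figure mentioned next, because they assume different guess rates:

    # Back-of-the-envelope brute-force times for numeric passcodes,
    # assuming ~80 ms per guess (a commonly cited figure for the
    # iPhone's hardware-backed passcode check) and no escalating
    # lockout delays.
    SECONDS_PER_GUESS = 0.08

    for digits in (4, 6, 10):
        worst_s = (10 ** digits) * SECONDS_PER_GUESS
        print(f"{digits:>2} digits: worst {worst_s / 3600:>11,.1f} hours"
              f" (~{worst_s / (365.25 * 86_400):.1f} years), average half that")

At that rate, 10 random digits works out to roughly 25 years in the worst case, which matches the figure quoted above.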

AUDREY: Which I’ve seen quoted a couple of times as taking about three days to crack, a 6-digit passcode.

CHRISTIE: Yeah. I’ll find the original tweet to put in the show notes that has the calculations, because it’s sort of like a 4-digit takes much less time than a 6-digit, and then a 10-digit, and then a long alphanumeric takes even longer. And the thing that…I saw this and I thought, “Okay well, I’m using a 6-digit code, should I change it?” And I thought, “Well, alphanumeric sounds nice but then you get the keyboard pop-up.” I keep my phone locked all the time. I can’t imagine having to type a strong password on that keyboard every single time I want to do something with my phone. And then someone reminded me, “Oh well, if you just use all digits, you get the digit entry pad.” So I tried that for about an hour. And it wasn’t so much the extra digits, it was that…so if you have a 4 or 6 digit code, it knows how many digits you’re going to need to enter and it just unlocks when you reach them. But when you have longer than that, you have to hit a little OK button and it’s at the top of the keypad. And I don’t even have the biggest iPhone, and it was the combination of that extra step and having it be so much further up at the top of the phone.

AUDREY: So you can’t do it just with your thumb with like one hand very easily?

CHRISTIE: You can but it actually hurts a little bit. And I know that it’s pretty easy to get RSI. I know I could tell it was overextending what I should be doing with the thumb. And it sounds really trivial but it adds up.

AUDREY: Yeah. No, it didn’t sound trivial to me at all. I also keep my phone locked all of the time. I definitely notice how much thumb use I’m getting, especially since I use that thumb trackball too. So yeah, the idea of putting in more than a 6-number code just sounded like the kind of thing I would never do. I would try it and then I would realize it was terrible. And then I would find an easier way to unlock my phone that was probably more vulnerable, or an easier way to use things without unlocking it all the time.

CHRISTIE: Right. Like I also considered, “Well, maybe I’ll go alphanumeric and then I’ll use Touch ID,” and that just seemed like a bad…because apparently you can disable Touch ID if you…they added the thing where if you hit the power button five times in a row or something. And I just thought, “Well, the chance of me being under duress and remembering and having the opportunity to do that versus…” I don’t know. I just decided that wasn’t a good tradeoff.

AUDREY: Yeah, I thought this was really terrible security advice. I mean, it’s good that they’re pointing out the details of it, like how long it takes to crack something, that the government does have a way right now to get into your phone faster with the default password. But I don’t really see this as useful advice to anybody other than a very small and specific group of people who are likely to have their phones searched by law enforcement in a way where…I mean, not just like on the street but in a way where their phone’s taken away from them. There’s a very small group of people that actually have to worry about that.

CHRISTIE: I found the thing about GrayKey pricing. It’s in a different article. So the unit costs $500, and a GrayKey annual license, online with 300 uses, is $14,500.

AUDREY: That’s spendy. And I also think that whatever they’re doing, Apple has to be looking to block. It’s not in their interest to allow this to continue.

CHRISTIE: Oh, the fully offline version costs $30,000. I wonder what they’re doing with the extra data from that online version, if they’re doing anything with it. One of these articles mentioned that this company has ex-Apple security engineers working for them. So I think one of the things that comes up a lot in these conversations is not just people not thinking about the unintended consequences, but I think sometimes there’s too much assumption of good action. And I think we always have to know that there are going to be people who are willing to work for these companies and use their knowledge to circumvent these things. So it’s always going to be a game of cat and mouse.

AUDREY: Yeah. I’m still thinking about the threat model aspect of this. I mean, it would be a little bit different if we start to hear that the TSA is just routinely taking people’s phones and doing this; then that would be an argument in favor of longer passcodes.

CHRISTIE: Yeah or maybe changing to the longer passcode.

AUDREY: When you have to get on a flight.

CHRISTIE: Yeah.

AUDREY: There’s a couple of reasons that we don’t just by default recommend Touch ID from a security perspective. One of them is that in the US, legally they’re treated as different kinds of search consent. So, the passcode is something that you don’t have to give up whereas putting your thumb on the device might be something that you can be forced to do. And I’d rather not have a body part associated with like getting access to my phone. Not that I think anybody’s going around like chopping thumbs off, but I just don’t want that ever to be a possibility.

CHRISTIE: Yeah, it crosses a line for you.

AUDREY: Yes.

CHRISTIE: To be fair, the article does say as usual it’s a matter of threat modeling but it doesn’t really…

AUDREY: They didn’t start there.

CHRISTIE: No.

AUDREY: And yeah, there’s so much about security that increases people’s anxiety. I don’t think that we need to load that on.

CHRISTIE: I’d be happy if just most people had a code at all.

AUDREY: Sure.

CHRISTIE: I think that’s still not the default. Although Apple is making it much more so, I think.

AUDREY: It’s harder to skip it, yeah. And the nice thing about that is that if you go ahead and you follow their default recommendation, your phone is encrypted at that point. The two things go together.

CHRISTIE: When my phone did that weird thing where it wouldn’t authenticate anywhere with Apple’s services, that did to me point out a pretty…like that also meant that I couldn’t do any of the remote things that you might want to do if your phone is not in your hands anymore. So I’ve been thinking about that too. I think we need to understand at what point those sort of remote features break down and know that we may not be able to rely on them 100%. And I don’t know how common that bug is, probably not very common if the Apple people mostly hadn’t seen it.

AUDREY: Yeah, you’re the only person I know who has mentioned it.

CHRISTIE: But it does require that the phone gets on the network again. I think we’re on to things we love on the internet this week.

AUDREY: Cool. Well, mine’s security related.

CHRISTIE: Perfect.

AUDREY: Yeah. So there was this really neat conference that I get the impression was thrown together kind of last minute. But it happened just the other day, Tuesday or Wednesday, I forget, and it’s called…I don’t know if they said the individual acronym or not, but OURSA. Well anyhow, it’s Our Security Advocates, and they threw a one-day conference with no white men, which is just pretty great. They basically did…I loved the format. They livestreamed the whole thing. They did this really great format where they had four topics that they covered, and each one was a series of short talks and then a panel discussion. Let me see if I can quickly find the topics they covered. Somebody I know was live tweeting the whole thing, so I didn’t manage to tune in to the livestream, but I saw a lot of highlights from it. And the conference recording is online. You can watch the whole thing. I don’t know if they’re going to split it out into the individual sections or not, but we’ll have a link for that. Sorry, I’m just letting the agenda [inaudible].

CHRISTIE: I got it up if you want me to read the topics.

AUDREY: Sure.

CHRISTIE: Advocating for High-Risk Groups, Applied Security Engineering, Practical Privacy Protection, Security Policy and Ethics for Emerging Tech.

AUDREY: Interestingly, their closing keynote was somebody from the Department of Homeland Security which I thought was interesting but a woman. And just that if they threw this together in five days and CloudFlare hosted and this is the speaker lineup they got, it’s just one of those things again where with conferences, there’s no excuse for not having a diverse panel.

CHRISTIE: Right.

AUDREY: There’s just no reason because the expertise is there.

CHRISTIE: Yeah.

AUDREY: And people said just great things about the content, the conversations that they were having. So I’m looking forward to putting the stream on later and watching this.

CHRISTIE: Cool. They got a lot of sponsors for putting it together so quickly.

AUDREY: Yeah. There must have been some kind of impetus but I didn’t see anybody mention like what the motivation was to throw this together so fast.

CHRISTIE: I was just looking through the longer agenda. Cool. I have two things this week. The first is about tumbleweeds. And I showed this to you yesterday, Audrey. But there’s a high desert town in Southern California, Victorville, and they’re being consumed by tumbleweeds. Remember “The Trouble with Tribbles”? It’s sort of like that.

AUDREY: But the plant version?

CHRISTIE: But the plant version and of course tumbleweeds are bigger. And it’s been really windy and the tumbleweeds are just blowing into people’s yards and like covering up their front doors all the way up to the second story.

AUDREY: Yeah, that’s a serious tumbleweed problem.

CHRISTIE: So I don’t know too much about the cause. We’ll link to a tweet that has a video. Tumbleweeds are pretty interesting plants too. And the Wikipedia article has some photos of them blooming which I’ve never seen. So it kind of inspired me to go take like a springtime high desert trip to see a different kind of desert, blooming plants.

AUDREY: Yeah, I think this is the time of year to do it. I took a tumbleweed…I don’t think I told you this yesterday. I took one to show and tell in kindergarten.

CHRISTIE: Yeah.

AUDREY: Because my grandparents were living in Eastern Washington. We moved out there not long after, but they brought one into Portland for me to take to show and tell, and I was thrilled.

CHRISTIE: How far do you have to go to get one, do you think? Is that a day trip?

AUDREY: Would we want to? Oh yeah, we could definitely do it in a day.

CHRISTIE: Hmmm.

AUDREY: Should we go collect some tumbleweeds?

CHRISTIE: It sounds…gosh, got to get my oil changed first.

AUDREY: Yeah.

CHRISTIE: But that sounds really fun.

AUDREY: I think that they’re really super, super cool. I think they’re also kind of invasive, like this is showing.

CHRISTIE: And then the second thing…I want to do two things this week, is this Slate podcast called Lexicon Valley that is hosted by John McWhorter. I think he’s a linguist. And it’s just all about language. Some of the recent topics, the one I just listened to is all about the letter P and the sound of P and the really interesting things about that. And then also why spelling in English is such a mess. Also things like, did the founding fathers have a British accent? It’s opening my eyes to a lot of things I had taken for granted or just assumed or just had no idea about language. And so I’ve been really enjoying that. And of course when you’re talking about language, you talk about history and culture and stuff. Anyway, I’m finding it interesting. I thought some of Recompiler folks might as well.

AUDREY: Yeah, it sounds cool.

CHRISTIE: All right. I think that’s our show this week. Thanks everyone for tuning in. We will talk to you again soon.

AUDREY: Thanks.

CHRISTIE: And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to podcast@recompilermag.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.

The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.