Episode 52: The yak’s given up. We’re both taking a break.

Download: Episode 52.

This week Audrey and I chat about Palantir’s predictive policing activity in New Orleans, Trustico’s mishandling of private keys, the recent DDoS attack on GitHub, ProPublica’s guide to authenticating email, and more. Enjoy!

Show Notes

Community Announcements

Call for Contributors for Issue 11: Love and romance

For our third issue of 2018, we’ll be talking about love and romance! We’re looking at the technology that brings us together with our fellow humans. Our guest editor for this issue will be Thursday Bram.

Here are a few ideas to get you started:

  • How to understand the information a dating site collects about you
  • Almost anything about matching algorithms
  • Technical concerns of ending a relationship (when do you kick folks off your Netflix account?)
  • Date ideas randomizers
  • Wedding planning tools with APIs or SDKs
  • Analysis of fan fiction

We look for ideas that will be effective at an advanced beginner to intermediate level of technical knowledge, and that are grounded in the author’s personal experiences. We’re especially interested in work from people who are part of under-represented groups in technology. Contributors are paid.

Find the details and submit your ideas at https://recompilermag.com/participate/. Submissions are open through March 8.

Note from our guest editor: Love is for everybody. I want pitches about technology for different people with different experiences. There is one constraint I want to talk about, though: Try to keep your pitches on the PG side of things. The Recompiler’s readership includes high school students, so I want to be mindful about how we cover any particularly sexy topics.

Now Broadcasting LIVE most Fridays

We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune in.

We love hearing from you! Feedback, comments, questions…

We’d love to hear from you, so get in touch!

You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to podcast@recompilermag.com.


CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.

Episode 52. This week, Audrey and I chat about Palantir’s predictive policing activity in New Orleans, Trustico’s mishandling of private keys, the recent DDoS attack on GitHub, ProPublica’s guide to authenticating email. Enjoy!

So this is March 2nd. It’s March already, Audrey.

AUDREY: Ahhhhh….

CHRISTIE: March 2nd and what episode is this going to be? Fifty two.

AUDREY: Fifty two.

CHRISTIE: The number of cards in a standard playing card deck.


CHRISTIE: And you got any announcements for us?

AUDREY: Well, I believe there is one week left on our Call for Contributors for the Love and Romance issue, which is if you’re listening live, you’ve got a week. And if you’re listening on the recording, I don’t even know what we call the distributed feeder, whatever, then you’ll probably have a day or less. But we could always extend it a couple days if we need to. And I am sure that we have a great pool of submissions and we’d love some more.

CHRISTIE: Awesome. The short or the easy to remember URL for that is recompilermag.com/participate.

AUDREY: Yeah. And there’s a link to the Call for Contributors from there.

CHRISTIE: Yeah. And of course, we’ll put the link in the show notes. Any other announcements?

AUDREY: I would love it if people who have been thinking about reading issues 9 and 10 would consider a subscription. Subscriptions really help us plan for the year a lot more than individual issue sales. And I just think that we have a great year planned here and I would hate for anybody to miss out.

CHRISTIE: And how do people get a subscription?

AUDREY: They go to shop.recompilermag.com and click on the link.

CHRISTIE: We have print subscription, as well as electronic subscriptions, right?

AUDREY: Correct. Yeah. And everybody gets the digital copies, but the print subscription also gets you a shiny print copy for yourself.

CHRISTIE: Nice. It’s something you can hold in your hand. All right. So our first topic, I knew this was going to upset me. It’s about Palantir. Does that mean anything? Is that a word that means something?

AUDREY: I don’t know. I keep meaning to look it up.

CHRISTIE: There’s a word that’s like…oh, Palantir with an accent, I don’t know if it’s over the I. It’s a fictional magical artifact from J. R. R. Tolkien’s legendarium. It’s described as a crystal ball, used both for communication and as a means of seeing events in other parts of the world or in the distant past.

AUDREY: I’m going to guess that’s the reference.

CHRISTIE: Yeah. Now we have that cleared up. No, you do not need this video. I already told you not to play.

AUDREY: Are you on a news site?

CHRISTIE: Yeah, I had The Verge article open. Okay, so what’s the deal? What has Palantir done now, Audrey?

AUDREY: So they have deployed a system in New Orleans for predictive policing. And not only did they offer New Orleans this thing, but they did it through a route that prevented oversight. They took advantage of a non-profit attached to the mayor’s office to offer this as a donation so that they could deploy it without the city council having a say in it or any of the other oversight bodies that might exist. And so instead, it went directly through the non-profit (no budget process, because it’s a donation) to the mayor’s office, and then to the police.

CHRISTIE: Good times. I think this is another case study in my ‘non-profits are not magic’ book.

AUDREY: Sure. I did think it was clever that they took advantage of…if there’s no purchase, then again, there’s no budgetary oversight. And that’s something that we had talked about a little bit with New York in their attempts at algorithm transparency in public services and policing: the budget is one place where companies providing this stuff get examined a little bit more, and the thing that’s being purchased gets examined more. By donating it, they circumvented that.

CHRISTIE: Because in New Orleans, there’s a city council, but their main oversight mechanism is through budget approvals. The article says, “New Orleans is predicated on a strong mayor model where the council does not have approval authority over contracts or policies for the city police department.”

AUDREY: It’s the opposite of how Portland operates. We have a weak mayor model.

CHRISTIE: So a contract like this would have to go through some sort of city council process or the council member for that area, right?

AUDREY: For that bureau.

CHRISTIE: Do we still have a separate one, or did the mayor take over responsibility for the police department here?

AUDREY: I believe that our mayor in Portland is currently the head of the police bureau. Yeah. The one way that…and I don’t know enough about how this works out in different cities, but I know that one of the things that’s interesting about what we do is that the mayor gets to control the bureau assignments, and so that is used in a lot of different political ways.


AUDREY: But in New Orleans, they’re able to sidestep that, right? The mayor has a lot of control. And again, by circumventing all of the things that let the city council even see what’s going on, they were able to put this into practice in the police department and basically run an experiment on people in New Orleans.

CHRISTIE: Right. And the city council didn’t…it’s not that they weren’t granted oversight. They didn’t even know about it. They didn’t know about it until The Verge went to ask them about it.

AUDREY: Yeah. There’s a lot of people that seem to have gotten looped in only because The Verge was reporting on it.

CHRISTIE: How pissed would you be? I’d be so mad.

AUDREY: Yeah. I would be furious.

CHRISTIE: So it says, “Palantir’s prediction model in New Orleans used an intelligence technique called social network analysis (or SNA) to draw connections between people, places, cars, weapons, addresses, social media posts, and other indicia in previously siloed databases.” So not unlike…I can’t remember the name of the company, but we talked about it in a recent episode, the company ICE is contracting with for the license plate data. So sort of think about that sort of stuff.

AUDREY: Right, to pull together all of these different sources of data. So yeah, it’s that they’re taking all this different data collection, right? Like the license plates, like I don’t know, people’s locations when they are stopped by the police because one of the inputs that we know that the New Orleans system had was basically these stop and frisk type records. And then they’re looking at social media and other pieces of information that tell them who associates with who else and using that to build a model of who they…they always [inaudible] who could be a victim or a perpetrator of a crime, but really they’re looking at people that are already under a lot of scrutiny and what the system can perceive as a risk of committing further crimes.
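To make “social network analysis” concrete, here’s a toy sketch in Python. Everything in it is hypothetical (the record types, the people, the field names); real systems like Palantir’s merge far more data sources, but the core move is the same co-occurrence linking:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical "siloed" records: each lists people tied to one event.
# Real systems pull in license plates, addresses, social media, and more.
records = [
    {"event": "traffic stop", "people": ["A", "B"]},
    {"event": "field interview", "people": ["B", "C"]},
    {"event": "shared address", "people": ["C", "D"]},
]

# Build an undirected co-occurrence graph: an edge means two people
# appeared together in at least one record.
graph = defaultdict(set)
for rec in records:
    for p, q in combinations(rec["people"], 2):
        graph[p].add(q)
        graph[q].add(p)

print(dict(graph))  # B links to A and C; C bridges onward to D
```

Notice that the graph can only ever contain people who already appear in police records, so whoever is policed most heavily automatically looks the most “connected”.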

CHRISTIE: So, Palantir gave NOPD access to the system. But Palantir also got access to New Orleans police criminal databases. It says, “For ballistics, gangs, probation and parole information, jailhouse phone calls, calls for service, the central case management system (i.e., every case NOPD had on record), and the department’s repository of field interview cards. The latter database represents every documented encounter NOPD has with citizens, even those that don’t result in arrests.” It’s a huge amount of data.

AUDREY: Yeah. And if you were trying to train a system, that’s everything.

CHRISTIE: And they linked this system to the city’s Cease Fire program. Cease Fire is a form of the decades-old carrot-and-stick strategy developed by David Kennedy, a professor at John Jay College. In the program, law enforcement informs potential offenders with criminal records that they know of their past actions and will prosecute them to the fullest extent if they re-offend. If the subjects choose to cooperate, they are called in to a required meeting as part of their conditions of probation and parole and are offered job training, education, potential job placement, and health services. In New Orleans, this program is run under NOLA For Life, which is that philanthropic pet project of the mayor’s.

And the article goes on to say that there was this temporary drop in violent crime. But looking at it, it’s not really coordinated or correlated with the…I’m using the wrong words. But it’s not clear that it’s because of this.

AUDREY: There’s no real geographic correspondence.

CHRISTIE: And some of the people involved in the Cease Fire program basically said that over time, the emphasis was much more on the stick aspect of the carrot-and-stick and it really became about the police in the city dictating how the program should be run versus the be more of a grassroots effort.

AUDREY: The way that it was described I thought this just sounds like kind of a mobster thing. Like, “Oh, we’ve got our eye on you. You better…” I don’t know. I shouldn’t laugh because it is really a serious thing of police over scrutiny of people, especially people of color. And just the impact of that, like getting called in constantly, getting stopped constantly.

CHRISTIE: There’s this really good quote in here that says…who is this from? This is from, I think it’s from William Isaac, this Michigan State researcher who analyzed predictive policing systems. He says, “If you’re trying to predict anything, you need to have some representation across the universe that you’re trying to predict. If you’re trying to predict crime, you need to have a positive and negative examples for every possible offense. Police departments tend to have good data about communities where they are present, but little data about communities where they do not patrol as vigorously – which tend to be affluent in white.”

And the reason this is concerning is not just for the people of New Orleans, but Palantir is using this effort in New Orleans as like a case study to sell its system to other cities, including other governments outside of the United States. Did you catch this bit about how they sold it to the Danish government and the Danish government had to pass an exemption to the European data protection laws?

AUDREY: Yeah, I saw that. I also thought that would make me pretty angry if I lived in Denmark. To cut a hole in it like that for something that is just so shoddy. A thing that is mentioned throughout the article is that everybody who’s done research that contributed to this kind of modeling has pretty much distanced themselves from the idea of using any predictive policing. They’ve said that they’re trying to understand history and to model the past of what’s happened, but they don’t think in any way it’s appropriate to use it in a forward-going manner.


AUDREY: And I think a lot of the abuse of data we see is like that, that something that’s very limited in scope is taken in a much broader capacity because it benefits somebody. It’s very profitable for Palantir in New Orleans, and it perpetuates existing injustices. It feeds back into all these systems.


AUDREY: Yeah, it is. We talked about, including this today, I think that was…my only point of hesitancy is like, this is really gross, but it’s something for us to be aware of and really just continually talking about because the thing that does kill it is scrutiny, making it visible, making everybody explain what they’re doing, why they’re doing it.

CHRISTIE: Right. And at that local government level.

AUDREY: Yeah. They really did a good job here of looking at where else Palantir is trying to sell this, to get more information about that.

CHRISTIE: Right. And Palantir is 50% owned by one person. Anything else about that?

AUDREY: The other thing that came to mind was that just the faultiness of this kind of modeling. There are sociological tools that tell us more about who commits violence that aren’t about this kind of just intense racial bias and this marginalizing of communities geographically and socially. It comes back to what we’ve talked about a lot with gun control, that domestic violence is just a huge red flag and that’s something that you never see them talk about modeling. I don’t know, that just really gets me.

CHRISTIE: And also that even though they were sort of being opaque about it, it’s very likely that part of what Palantir is doing was targeting individuals and that these problems are not problems with individuals. They’re problems of communities and they’re problems of communities being under resourced and under developed. This Robert Goodman, the person who is a responder for the Cease Fire Program says, “As long as they’re not putting resources into the hoods, nothing will change. You’re just putting on Band-Aids.”

That makes me think of…Minority Report was the movie. I don’t remember if that was the same title, the Philip K. Dick book. He wrote about that. I don’t know how similar the book and the movie are but there are some of the things there that he sort of predicted.

AUDREY: The surveillance and the use of surveillance.

CHRISTIE: And the predictive, the pre-crime.

AUDREY: Yeah, it is kind of a path you go down that if they collect all that information on us, then of course somebody would look at it and go, “Well, why can’t we predict what will happen next?” And this kind of approach doesn’t separate cause and effect. Is the reason that people are exposed to more violence the policing, or is it a separate factor? That kind of thing never gets separated.

CHRISTIE: And there’s this idea because it’s an algorithm, it can’t be tampered with or it’s somehow on that level of like proof and it’s not.

AUDREY: Yeah, we can see that every stage they have faulty data, they have faulty models, they have faulty usage. We can see that it’s failing at every step.

CHRISTIE: You got to love how they want to use these kinds of models for this stuff and yet we have this whole group of people who are constantly denying our climate models.


CHRISTIE: So enough about that trash fire. This other trash fire…

AUDREY: The second one is genuinely funny because it’s just so inept.


AUDREY: Fortunately, I don’t think anybody’s security was actually harmed.

CHRISTIE: Yeah. So Trustico. Trustico? Wait a minute, why am I suddenly doubting myself? Is that what the company is called?

AUDREY: Trustico.

CHRISTIE: So they’re a reseller of SSL certs, TLS certificates, to use the right term. And what did they do, Audrey?

AUDREY: Well, they’re in the process of switching what company they buy from, and they decided that the best way to do that was to try to negotiate with their current provider to drop all the certificates and let them move over to this other company that they wanted to use. And their provider said, “Well, no. We don’t do that.”

CHRISTIE: Revoke them.

AUDREY: Yeah, revoke them. So, the other company said, “We don’t do that. We would only revoke a certificate in the case that was compromised.” And so Trustico said, “Well, how do we demonstrate that?” “Well, okay. Provide the private key.” So then they sent over via email the private keys for all of the certificates.

CHRISTIE: Twenty three thousand of them.

AUDREY: That’s compromising them.


AUDREY: At which point, they were revoked, or there was kind of a 24-hour window, I think. But yeah.

CHRISTIE: Right. So the reason this came to light is because DigiCert, which was the company that Trustico was asking to revoke these certificates got emailed the keys and then they had to revoke them. And so, they started alerting Mozilla and Chrome, I think. They started alerting the browser people that you notify when you have to do such a mass revocation like that.

AUDREY: Right. And the individual customers. There are many things that are remarkable about this story. One of them is that they had the private keys available apparently when they’re generated. Trustico had been storing a copy in a way that they felt was internally secure. But still, they’re holding on to a copy.

CHRISTIE: Well the first thing is that they should never have really been generating the private keys in the first place. That’s something you should only do on the machine where the private key is going to be used. So, they were doing that as a customer service feature, a customer service thing. And then clearly, they were hanging on to them.

AUDREY: Right. And the rationale for doing this, I don’t know if the blog post is still there and still says this, but their rationale for doing this was that if it was compromised, then they could help revoke it. But again, that’s not how you’re supposed to do this.

CHRISTIE: No. I think the takeaway here is if you’re getting a TLS certificate or anything that requires public/private key encryption, only you should know about the private key.


CHRISTIE: Because once it’s not, then you have to trust whomever has access to that private key. And they can do things like this with it.

AUDREY: And you may not have control of how it propagates. And like you said, it should sit on the machine where it’s going to be used.

CHRISTIE: What I used to do before Let’s Encrypt is…it was annoying, you had to do stuff on the command line. You had to get the thing from the issuer and then run a thing and then copy the thing over. And so, I get why they enabled that. But it just makes me really glad that we have things like Let’s Encrypt now because for most people, that’s a perfectly fine level of TLS certificate and you don’t have to do any of that back and forth manually.

AUDREY: Yeah. For my personal website, I have shared hosting, and the hosting provider automatically regenerates my cert through Let’s Encrypt every, I don’t know, three or six months. I forget.

CHRISTIE: I feel like it’s three months, but I haven’t looked at it recently. And the way that works is that the private key is only ever on that server.

There’s kind of a funny Twitter thread. If you want to catch up on this story and you’re short on time, just read this Twitter thread from Geoffrey Thomas that we’ll link to because it’s really a short overview of it.

AUDREY: And it’s sort of the ‘but wait, there’s more’.

CHRISTIE: Right. So, they’re explaining the point at which DigiCert reached out to the Mozilla security policy list for help managing a massive revocation and also emailed all Trustico customers as a heads up. And then Trustico responds angrily to the list, objecting to this being called a compromise, and calls DigiCert’s email “absolutely defamatory”. And then their website gets taken down shortly after this is all revealed. Turns out there’s a really obvious security vulnerability, a shell injection in their order form. Anyone entering crafted input had full access to their live server. Speculation is that someone took pity on their customers and injected a ‘shutdown -h now’.
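For anyone who hasn’t seen shell injection up close, here’s a minimal illustration in Python. It uses a harmless `echo` in place of whatever Trustico’s form actually ran (which we don’t know), and the input string is made up:

```python
import subprocess

user_input = "example.com; echo INJECTED"  # attacker-crafted form value

# Vulnerable pattern: the input is pasted into a shell command string,
# so the semicolon ends the first command and starts a second one.
unsafe = subprocess.run(f"echo {user_input}", shell=True,
                        capture_output=True, text=True)
print(unsafe.stdout)  # prints "example.com", then "INJECTED" on its own line

# Safer pattern: pass arguments as a list so no shell ever parses the
# input; the whole string, semicolon included, is one literal argument.
safe = subprocess.run(["echo", user_input], capture_output=True, text=True)
print(safe.stdout)  # prints the input verbatim; the injection is inert
```

Mature web frameworks and parameterized APIs exist precisely so that user input never reaches a shell (or a SQL parser) as code.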

AUDREY: That’s a very kind use of that vulnerability.

CHRISTIE: But seriously, if you don’t have enough of your stuff together, that you have a shell injection vulnerability on your order form, you got no business storing private keys. And it’s not a coincidence, in my opinion. These two things go together because that’s inexperienced people running their technology.

AUDREY: Right. And nobody doing a proper security audit to look at all of the aspects of it. So in a way, I feel like this was a really lucky day for Trustico’s customers.


AUDREY: They found out quite suddenly and dramatically that they did not have a very secure backing and had an opportunity to fix it.

CHRISTIE: Yeah. That’s what I wanted to ask: what’s in a name? Trustico…maybe not so trusty of a company.

AUDREY: No. Like I said, I think it gets to be funny because the impact is mostly annoying.

CHRISTIE: Key takeaway here is if you’re generating a private key on something you don’t control, walk away. Stop, back away, get help.

AUDREY: Yeah, and really, really make sure that your web forms are not susceptible to that kind of injection.


AUDREY: Please. This is such a known problem.

CHRISTIE: It is. And most of the mature web frameworks have that kind of security built in or at least easily accessible. That also tells me they had something kludged together, probably. And the whole reason this was an issue in the first place was that DigiCert is the company that took over Symantec’s cert business because Symantec got in trouble for not following the rules, the so-called baseline requirements that the browser makers say certificate authorities need to follow to be trusted. So, it’s like this line of shit shows.

AUDREY: Yeah. The last couple of years have not been great for certificate authorities in terms of finding out what they’re doing and how secure they actually are.

CHRISTIE: I actually really appreciate that that’s coincided with the availability and ubiquity of Let’s Encrypt, because Let’s Encrypt is pretty solid. And so, I think it’s not only increased how many websites are running HTTPS/TLS, but it’s also shown that you don’t necessarily need to spend $99 a year on a cert to do that. And if you do pay for it, you may not be getting your money’s worth.

AUDREY: It’s just that the downside for users of websites is that you don’t really know. I mean, are you getting something through Let’s Encrypt? Are you getting something through Trustico?

CHRISTIE: Yeah. And that’s why the browser makers are so like…

AUDREY: Really have to be involved.

CHRISTIE: The individual user shouldn’t be expected and we talk about this a lot. The individual user shouldn’t be expected to like validate something as trustworthy on a technical level.

AUDREY: Right.

CHRISTIE: Okay, good times. GitHub had an interesting day yesterday.


CHRISTIE: One point three terabytes per second. That sounds like a lot, Audrey. Is that a lot?

AUDREY: That is a giant DDoS attack, yeah.

CHRISTIE: Wired says, “At 12:15 Eastern Standard Time,” ours is Eastern Time, “1.3 terabytes per second of traffic hit the developer platform GitHub all at once. It was the most powerful distributed denial of service attack recorded to date,” and it didn’t use a botnet.

AUDREY: Nope. It used an amplification technique.

CHRISTIE: I think this made sense to me, but it is also new to me. So, explain this one.

AUDREY: You want me to explain it?

CHRISTIE: Unless you don’t want to.

AUDREY: I think you should take this one.

CHRISTIE: Okay. We’ve talked before about DDoS attacks that use botnets like webcams and other devices that have been compromised and then they’re sending traffic to websites. This one used memcached servers that were publicly available on the web, on the internet. So memcache is a caching service. I think I’m using it for my websites where you can just…it’s a pretty simple data storage. Like just take some data, hang on to it until I asked for it back.

AUDREY: And it has a very simple key value system.

CHRISTIE: Right, I think JSON or something like that. And I don’t know why people are putting these on the public internet. So if you send a request to a memcache server with a little bit of information, because you’re asking it to return values in cache, you’re going to get a lot more back. So what the amplification attack does is the attackers fake the IP address that they’re coming from. They fake it as the target that they really want to hit. And then they ping the memcache servers, and the memcache servers return a much greater volume of data to the target, to the victim.
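The arithmetic behind “a little bit in, a lot back” is what makes the attack economical. The numbers below are illustrative only, not measurements from the GitHub attack:

```python
# A memcached "get" request over UDP is tiny; a cached value can be huge.
request_bytes = 15            # e.g., "get <key>\r\n" plus a little framing
response_bytes = 750_000      # a large stored value echoed back

amplification = response_bytes / request_bytes
print(f"amplification factor: {amplification:,.0f}x")

# With a spoofed source IP, the attacker spends request_bytes of their own
# bandwidth per server, and the victim receives response_bytes from each.
servers = 5_000               # hypothetical pool of exposed memcached hosts
victim_gigabits = servers * response_bytes * 8 / 1e9  # one round of responses
print(f"one round across {servers} servers: {victim_gigabits:.0f} gigabits")
```

That ratio is why reflection attacks favor protocols where a short query can trigger a very large answer, and why spoofable UDP services should never face the open internet.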

AUDREY: So they pull that thing back out of the cache.

CHRISTIE: And I’m guessing because it’s a pretty simple value store that it’s really easy to get information out of it rather than just get a ‘hey no, I didn’t find that’.

AUDREY: I thought it was interesting, the part of the mitigation. One of the things that it relied on was being able to tell that it was the same attack based on the size of the data coming in.

CHRISTIE: Yeah. So, there are a couple of unique things about this. One thing was it was huge and also they stopped it very quickly. It says within 10 minutes, GitHub had called automatically for help from its DDoS mitigation service. They’re using a service from Akamai called Prolexic. And they immediately took over, routing all the traffic coming in and out of GitHub. And then after eight minutes, attackers relented and the assault dropped off. And there’s this amazing graphic here of the huge spike in traffic. It reminds me of Contact when they first get the signal in the movie. I know that’s what this reminds me of.

AUDREY: Can we go from like ping, ping, ping to like [inaudible].

CHRISTIE: I don’t know who was doing the sound engineering for that movie, but there was a very specific sound. I wish I could play it. I can like hear it when I’m looking at this graph. Anyway, that’s just my weird brain. And there’s another…is that a quote from some company that monitors web traffic? Network intelligence firm ThousandEyes, that’s not creepy. Alex Henthorne-Iwane said, “If you look at the stats, you’ll probably find that globally speaking, DDoS attack detection alone generally takes about an hour plus, which means there’s a human involved in looking at the traffic.” And he said, “When it happens all within 20 minutes, you know that this is driven primarily by software.” So I think that’s kind of interesting that they were able to identify that an attack was happening so quickly, engage Akamai, and then detour enough of the traffic that the attackers are like, “Oh, this isn’t worth it.”

AUDREY: Right. It’s good news all around that they spotted it, that they were able to mitigate it, that they can kind of share some of the information about what the attack was and how they mitigated it so that other people might take advantage of that.

CHRISTIE: Yeah. And I’m guessing in addition to like just what the packets look like, that they probably come across on specific ports for memcache. I hadn’t heard of the memcache thing before, had you?

AUDREY: It sounds sort of familiar, but yeah, I would have to go search around a little.

CHRISTIE: I’m curious what other services can be used as an amplification attack like that.

AUDREY: I mean, this is relying on a specific piece of software and specific kind of access.

CHRISTIE: I think the lesson here is that if you’re running memcache, it better be behind a firewall. And if you’ve exposed it because there’s a different service that needs to hit it, there are ways to address that. Put them both behind a firewall.

AUDREY: Right. I mean, in the situation where I’ve used it, instead of making a database call, the application calls out to the memcache server and then passes it back through to the user. And so, the user or the browser never has any interaction with it directly.
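That architecture, where only the application ever touches the cache, is commonly called cache-aside. A minimal sketch, with a plain dict standing in for memcached and entirely made-up data:

```python
cache = {}

def get_profile(user_id, database):
    """Serve from cache when possible; the browser never reaches the cache."""
    if user_id in cache:
        return cache[user_id]          # fast path: cache hit
    value = database[user_id]          # slow path: real lookup
    cache[user_id] = value             # remember it for next time
    return value

db = {"alice": {"plan": "print+digital"}}
print(get_profile("alice", db))  # miss: hits the database, fills the cache
print(get_profile("alice", db))  # hit: served straight from the cache
```

Because every request goes through the application, the cache itself can live on a private network with no route from the outside world.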

CHRISTIE: And I think the other lesson is that if you’re a part of running a high profile service, you have to assume at some point you’re going to be the target of DDoS attack. Because some of this is just like prestige, right?

AUDREY: Yeah. They said that they didn’t think it was necessarily coming after GitHub for any political or strategic reason, just to show whether they could. And they think that in part because it did let up pretty fast.

CHRISTIE: Oh, I missed this the first time: “We’ve seen about 300 individual scanners that are searching for memcache boxes.” So there are at least 300 bad guys looking for exposed servers. It’s a quote from someone at CenturyLink.

AUDREY: Interesting. Even just the start of the article, I thought it was really interesting that it says, “We modeled our capacity based on five times the biggest attack that the internet has ever seen.” That’s sort of an arbitrary thing, but also they’re saying, we made a guess about how big an attack could get. Fortunately, this was under that. But you know, even though the volume of the attack was so high, it wasn’t like a single hit. It was the collective aspect of it.

CHRISTIE: Well now, I think they probably have to readjust their window. Is that an [Overton] window?

AUDREY: No, I don’t think so. But yeah, they have to readjust that because now they have an idea about how many servers, if it’s memcache servers, they can look at, “Okay, so how many did it take to get there?” And you can make some estimates about what’s possible. And like you said, there might be other kinds of services that can be attacked in a similar way.

CHRISTIE: All right.

AUDREY: But given what an impact this has on the internet, it’s nice to see some good news here.

CHRISTIE: Yeah, it’s nice. And I didn’t even notice; this was early enough that I hadn’t quite done anything with GitHub yet, and it was lunchtime for the East Coasters.


CHRISTIE: Okay, we are blasting through these today. Do you want to talk about this authenticating email?

AUDREY: Sure. ProPublica put up a guide to authenticating email aimed at journalists who sometimes receive forwarded information. With email, we’ve talked about it the other week with AMP and what Google’s doing, and how the static, verifiable nature of email is actually really important. And this was a good example of that.

CHRISTIE: Oh my God, I didn’t even think about that.

AUDREY: Yeah, in terms of evidence and documenting things. It’s really important that these aspects of email are preserved or else, how do you know that a whistle blower is actually a whistle blower? How do you know that the incriminating information that you receive or in the case of somebody documenting harassment or other things like that, that they’ve received, the verifiability really matters. And so there are some specific pieces of data in the email that can be used to do this and ProPublica walks through how that works.

CHRISTIE: Yeah. And the thing that stood out to me is, I’m reasonably technical and I have, like, set up DomainKeys and SPF for my domains’ email, but I had not heard of Authenticated Received Chain. But mostly what stood out to me is there are four separate…Sender Policy Framework, DKIM, ARC…why did I think there were four? The fact that there are even four different things to keep in mind shows you just the…oh, the fourth thing is PGP. They basically kept saying, “But this doesn’t confirm this. You have to use this other thing to confirm that.” And even then, you don’t necessarily know that that was the person sending the email. It could’ve been a person who had access to the email account and to the PGP key.

AUDREY: Right. But at least you can get a reasonable amount of information about who sent it, who received it. And if things were forwarded, then you can look at that too.

CHRISTIE: So DKIM you can basically use to confirm that you have the same email that was sent, ARC confirms that you have the same email that was received by the receiving server, and then Sender Policy Framework validates whether the sending server is really allowed to send email on behalf of that domain.
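In practice, a receiving mail server usually records its SPF and DKIM verdicts in an Authentication-Results header on the message. As a minimal sketch of what checking a saved raw email looks like (the sample message and domains below are made up for illustration):

```python
# A minimal sketch of reading the SPF/DKIM verdicts that a receiving
# mail server recorded in a saved email, using only the standard
# library. The message below is a fabricated example.
from email import message_from_string

raw_message = """\
Authentication-Results: mx.example.com;
 spf=pass smtp.mailfrom=sender.example.org;
 dkim=pass header.d=sender.example.org
From: alice@sender.example.org
To: reporter@example.com
Subject: Forwarded documents

Body text here.
"""

msg = message_from_string(raw_message)
results = msg["Authentication-Results"]

# Each verdict appears as mechanism=result. A real checker would
# parse this header properly, but a quick scan already tells you
# whether the receiving server thought SPF and DKIM checked out.
spf_passed = "spf=pass" in results
dkim_passed = "dkim=pass" in results
print(f"SPF passed: {spf_passed}, DKIM passed: {dkim_passed}")
```

Note that this only tells you what the receiving server concluded; a forwarded copy can have had that header stripped or forged, which is part of why ARC exists.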

AUDREY: That helps you detect spoofing.

CHRISTIE: Right. So not the user, but the domain. The user part comes in with PGP.

AUDREY: PGP can be used to sign things to demonstrate that they came from that sender.

CHRISTIE: Right. And you can use PGP to sign, which doesn’t encrypt an email but just says, “Hey, it’s from me.” It doesn’t encrypt the contents; they’re still plain text, just not encrypted.

AUDREY: It provides sort of a signature that can be checked against the sender’s key.

CHRISTIE: I was thinking it might be fun. I guess I have a weird notion of fun but to try this out between us.


CHRISTIE: I don’t know. That may not make for a very compelling podcast. Maybe it’s like a video or something we can do.

AUDREY: Right, a screencast.


AUDREY: Like I said, I think one of the things that we keep coming back to about email is how many really important, relevant things come out of email as a piece of documentation. Like, we were just talking about Palantir and learning what happened in New Orleans, and a big part of what The Verge could say is, “We got access to emails about this and this is what they told us.” And if you can’t verify those, then your news becomes a lot less trustworthy.

CHRISTIE: Right. And that was the whole point; that’s why ProPublica put this article together. They were actually working from a specific case of some emails they had been forwarded, and they wanted to know if the emails were legit. So these are the tools they used.

AUDREY: Yeah. And as we have seen, there are a lot of people that benefit from manipulating this information.

CHRISTIE: I really dug this. I want to see more of this from ProPublica.

AUDREY: Yeah. I really enjoy this kind of behind-the-scenes stuff too. It’s informative and, you know, it does improve our trust in the information that we’re getting.

CHRISTIE: So thank you, Jeremy Merrill, who’s a news apps developer at ProPublica, for putting this together. It might be time for things we like on the internet this week.

AUDREY: Cool. What’s yours?

CHRISTIE: I don’t know. Oh, I think I remember. This is just an animated GIF, I guess. And it’s a…

AUDREY: Should I click?

CHRISTIE: Yeah, you can click on it. I don’t know if it’s a GIF or a video.

AUDREY: I guess we’ll find out.

CHRISTIE: I’m just now realizing this is a Lego train set, which I somehow didn’t pick up on before. And so, there’s parallel train tracks. My God, this is video. Oh, sorry, sorry. I forgot this was video. Oh, they’re just really loud in my ears.

AUDREY: Oh, I’m sorry.

CHRISTIE: No, no, I played it. It’s not you.

AUDREY: I hit play too and it came over my computer speakers. Anyhow, I see that there is…oh, I could almost like scrub through it without listening to audio. There is some very clever train maneuvering on this thing.

CHRISTIE: Yeah, I like how you put it. We won’t give it away. I just appreciate it because I’ve been having kind of a rough week, and it felt like a reminder from the internet to me that things can go off the rails but then recover. So I’ll just leave it at that.

AUDREY: That’s nice.

CHRISTIE: So, that’s what I liked. What have you got?

AUDREY: Well, I think computational art is really interesting and I love, for example, the machine learning pies and similar kinds of things that we’ve talked about before. And there’s an artist and a software engineer who does these scarves. The company is KnitYak and you can get your very own unique computational scarf.

CHRISTIE: Are these handmade?

AUDREY: It’s machine made but like programmed.


AUDREY: And so some of these scarves, they’re computationally generated and you get the computation and there’s a whole series of them that are unique, like everybody gets a different one.

CHRISTIE: That’s cool. This is one of my favorite logos ever. It’s like a magenta yak kind of draped over a ball of yarn.

AUDREY: Yeah, very cozy.

CHRISTIE: Yeah. It’s sort of like the yak’s given up or taking a break.

AUDREY: The set that just got posted are Elementary Cellular Automata and they’re just really pretty designs. Very cool to look at.

CHRISTIE: Yeah. It’s so cool. It ships with source code in Processing for the exact pattern knit into the scarf. That’s so cool. What a nice gift idea.
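The elementary cellular automata behind the scarves are simple to generate: each new row of cells is computed from the row above, using the rule number’s bits as a lookup table. Here’s a small sketch in Python (the actual scarves ship with their own Processing source; Rule 30 is just a well-known example):

```python
# A tiny elementary cellular automaton generator, as a sketch of the
# kind of pattern knit into the scarves. Rule 30 is used here as an
# illustrative example.

def eca_rows(rule, width, steps):
    """Yield successive rows of an elementary CA, starting from a
    single live cell in the middle, with wrap-around edges."""
    # Bit i of the rule number gives the next state for the
    # neighborhood whose (left, center, right) bits encode i.
    table = [(rule >> i) & 1 for i in range(8)]
    row = [0] * width
    row[width // 2] = 1
    for _ in range(steps):
        yield row
        row = [
            table[(row[(i - 1) % width] << 2)
                  | (row[i] << 1)
                  | row[(i + 1) % width]]
            for i in range(width)
        ]

for row in eca_rows(rule=30, width=31, steps=8):
    print("".join("█" if cell else " " for cell in row))
```

Swapping the rule number (0 through 255) changes the whole texture, which is presumably how each scarf in a series ends up unique.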


CHRISTIE: All right. We shall link to that and all of our show notes. And I think that’s our show for the week, Audrey.

AUDREY: Awesome.

CHRISTIE: Thanks everyone for listening. We’re going to sign off.

And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to podcast@recompilermag.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.

The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.