Episode 69: We’ll just make a pickle grid

Download: Episode 69.

This week we’re talking about Reddit’s security breach, retail spear phishing indictments, ghost characters, and surveillance capitalism.

Show Notes

Now Broadcasting LIVE most Fridays

We broadcast our episode recordings LIVE on most Fridays at 12pm PT. Mark your calendars and visit recompilermag.live to tune in.

We love hearing from you! Feedback, comments, questions…

We’d love to hear from you, so get in touch!

You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to podcast@recompilermag.com.

Transcript

CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.

Hello…hello.

AUDREY: Hello.

CHRISTIE: We are clickety clacking on our keyboards and coming to you live on Friday, August 3rd. What the F! It’s August already. It’s a little after noon, Pacific Time. It is gloriously clouded and not a million degrees.

AUDREY: Yes.

CHRISTIE: And that is awesome. It’s so overcast. The dogs are like, “Nah, we don’t need to be on the deck.”

AUDREY: The cats have started snuggling again. That’s how we know.

CHRISTIE: We’re recording Episode 69 of The Recompiler podcast. This week, we’re going to talk about Reddit’s security breach, some retail spear phishing indictments, ghost characters, and a recent discussion about things we can do about surveillance capitalism. But first, I bet you Audrey has some announcements for us.

AUDREY: I do. I usually have announcements and this week is no exception.

CHRISTIE: Alright. What have we got?

AUDREY: We are a community sponsor for DevOpsDays Portland. It’s coming up in just over a month, September 11th through 13th. We have a discount code, RECOMPILERFRIENDS will get you 20% off. I just got an email from them saying that they expect to sell out. So if you’ve been thinking about registering, you should. It’s a great event that’s a little bit interdisciplinary talking about software development and IT infrastructure and just a lot of different ways to get it done.

CHRISTIE: And we’ll both be there, right?

AUDREY: Yes, we’ll both be there.

CHRISTIE: Alright. I’ll be in part repping day job and I will have stickers for day job and also stickers for not day job. So get your tickets, come to the event and say hi to us. What’s next?

AUDREY: We’re also taking pre-orders for our Community Event Planning book. I’m going to leave the pre-orders open just a little bit longer so that if folks want to preview what we’re working on, get a look at the book before it comes out next year, they can do that and make sure that they get a copy. We also have a few more sticker packs that we can send out. So if you didn’t do the Kickstarter, heard about it later, or just still want one, then go ahead; we’ll have a link in the show notes for that pre-order shop. We are also doing a survey for event organizers as part of the research for our book. We’re looking to reach out to a lot of different communities and a lot of different kinds of organizers, to understand the nature of the events that you do, the size and scope, how you organize them, and some of the things that you would tell yourself if you were starting over or that you would tell another organizer. So we want folks to go ahead and have a look at that and fill it out.

CHRISTIE: Yes, please. Thank you.

AUDREY: And then our third announcement is that we have a call for contributors for Issue 12.

CHRISTIE: Yey!

AUDREY: Machine and Things – that will be our last issue this year for 2018. We have a guest editor, Stephanie Morillo who is super awesome and she’s joining us on this. We’re going to talk about the internet of things and machines, hardware, cyborg-y stuff even, and just a lot of different aspects of that combination of stuff. So our call for contributors is open for a couple more weeks. I think actually all this month. And there is a chance to put in a pitch and tell us what you’d like to talk about.

CHRISTIE: Awesome.

AUDREY: And we pay. We pay contributors.

CHRISTIE: So check out the show notes for all those links. First topic. So Reddit announced a security incident.

AUDREY: I got an email from them.

CHRISTIE: Oh, did you?

AUDREY: And it’s funny because I couldn’t remember if I had a Reddit account. I don’t really use it very much. But yeah, I got an email saying, “You might have been affected by our data breach.”

CHRISTIE: That crow might have been affected, too.

AUDREY: Should I go close the window?

CHRISTIE: Oh no, it’s fine. It’s fine.

AUDREY: Okay.

CHRISTIE: Keep us company. We can pretend like they’re the sidekick or something.

AUDREY: Right. The crows always have commentary.

CHRISTIE: Yeah. So a hacker or hackers broke into an employee account. The employee was using two factor authentication, but it was SMS-based, and while they don’t specify exactly what happened, they do say it was an SMS intercept. And there’s a handful of different ways that can occur. So, let’s see. What were some of the things that were compromised? All Reddit data from 2007 and before, including account credentials and email addresses. So there was an old database snapshot lying around: salted hashed passwords, all content, mostly public but also private messages. Interesting that they are keeping data from well over a decade ago. I’m not sure what that was all about.

AUDREY: I sort of wondered if this was a backup that had been forgotten about in some way. The way that we’re doing development sometimes you have old backups that you’re using for testing. I wondered if they’d held onto it in that kind of a context.
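A quick aside on the “salted hashed passwords” in that old snapshot: salting means each stored hash is derived from the password plus a random per-user value, so two users with the same password don’t get the same hash, and precomputed lookup tables don’t work. Here is a minimal sketch in Python using the standard library’s PBKDF2; this is illustrative only, not Reddit’s actual scheme, which the announcement doesn’t detail:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    """Derive a slow, salted hash. A fresh random salt is generated
    per password so identical passwords yield different hashes."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)
```

Even salted, a stolen hash still allows offline guessing against weak or reused passwords, which is why breach notices tell you to change them.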

CHRISTIE: You know, it was interesting and then there was also some very recent data. It says: e-mail digests sent by Reddit in June of 2018. So, logs containing the e-mail digests that we sent from June 3rd to June 17th.

AUDREY: And it’s not especially private information, it sounds like. But the e-mails themselves are important.

CHRISTIE: And it says the attacker had read access to our storage systems and other data was accessed such as Reddit source code, internal logs, config files, and other employee workspace files. So they say: what are we doing about it? We reported the issue to law enforcement and are cooperating with their investigation. They’re letting people know if their credentials were potentially compromised and says they’re taking measures to guarantee that additional points of privileged access are more secure (e.g. enhanced logging, more encryption and requiring token-based two factor authentication to gain entry since we suspect weaknesses inherent to SMS-based two factor auth to be the root cause of this incident).

AUDREY: That’s something that we’ve heard a lot with accounts that have been compromised that had two factor authentication that SMS is the weakest option that you could pick.

CHRISTIE: And then people say, well, it’s better than nothing. I’m not sure how many options are available to you if you don’t have a smartphone. I guess you’d have to go with a hardware token.

AUDREY: And definitely not everything accepts the hardware token.

CHRISTIE: Well, by hardware token, I don’t just mean the thing you stick in your computer. I mean the thing that generates a one-time password too, a little display.

AUDREY: Like the old secure ID tokens. That kind of thing?

CHRISTIE: Yeah. I don’t know if that’s still a thing you can get.

AUDREY: I haven’t seen them very much in quite a while.

CHRISTIE: I think apps on smartphones probably replaced them largely. And of course, since it’s Reddit, they did the announcement as a regular-looking post on Reddit. “In other news, we hired our very first Head of Security and he started 2.5 months ago.” So, they waited 12 or 13 years into their existence to get a Head of Security.
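The token-based two factor authentication Reddit says it now requires, whether a dedicated fob or a phone app, typically implements TOTP (RFC 6238), which is HOTP (RFC 4226) driven by a time counter instead of a per-login counter. A minimal sketch in Python, checked against the RFC test vector key; real deployments would also handle clock skew and rate limiting:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over an 8-byte big-endian counter,
    then dynamic truncation down to a short decimal code."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, for_time=None, step: int = 30) -> str:
    """RFC 6238: HOTP where the counter is the current 30-second
    window. Nothing travels over SMS, so there is nothing for a
    SIM hijacker to intercept."""
    if for_time is None:
        for_time = int(time.time())
    return hotp(key, int(for_time) // step)

# RFC 6238 test vector: key "12345678901234567890" at Unix time 59
print(totp(b"12345678901234567890", for_time=59))  # 287082
```

The secret key never leaves the device after enrollment, which is the property SMS-based codes lack.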

AUDREY: To have a Head of Security.

CHRISTIE: I just thought that was something maybe to note.

AUDREY: That’s interesting.

CHRISTIE: And I didn’t have time for it this episode, but some day we could definitely dive in more on the different ways that SIM cards or SMS-based two factor authentication are vulnerable, because there’s a couple of different ways to intercept. I think there are ways at the network level, you can talk a phone company into porting the number, different things like that.

AUDREY: Right. I saw an article this morning that I didn’t have a chance to read from Motherboard about people trying to recruit telecom employees to help them with SIM card hijacking or they’d interviewed some people that had been targeted in this way. So, I thought that was pretty interesting.

CHRISTIE: We’ll link to that and then maybe dive into it more in another episode.

AUDREY: Right. When I saw the Reddit news, aside from it being sort of odd to get an e-mail about a Reddit account I don’t really remember creating (it must have been before 2007), and aside from the kinds of things they list about what was revealed and the weakness of two factor authentication when there are SIM card attacks and other SMS attacks, I started thinking about auditing. That’s something that the Head of Security would definitely be designing a plan for. It’s one thing when it’s your externally facing users: you have to give them certain options for how they log in, and you can control to some extent what kind of passwords people set, but a lot of that is external. So you need good security in the system so that nobody can get more access than they’re supposed to at that level. But for your developers, I don’t know. I’m sure organizations do perform some kind of audit on this, not just set requirements but audit them. But I think a lot of development systems don’t really have a cohesive way of handling this. You have lots of different kinds of logins and different ways that things are accessed. And so it just seems like a lot of these come from gaps in that, if that makes sense: there are just these fairly complex systems, and organizations aren’t always very good at looking at what individual developers have access to, what data they hold on to, whether they’re taking shortcuts that are security problems.

CHRISTIE: And I think it’s something that there’s a lot of value in doing it from the beginning because it just becomes that much harder to impose after the fact. And there are definitely tools out there that help with this. For my day job, there’s certain remote policy things for our laptops that set certain security parameters that you cannot change. Things like password required to turn off the screen saver and screensaver comes on after so many minutes, things like that. And then of course, making two factor auth mandatory. But then again, you’re limited by what your vendors will support in your security. And then, the other thing we’ve seen time and time again is the complexity of deploying to the cloud means that we have a new version of Chmod 777 often in the form of wide open S3 buckets or whatever.

AUDREY: By what you just said, you mean the data permissions? The access permissions?

CHRISTIE: Right. Once upon a time, when we were deploying to dedicated servers, if you were having trouble getting an app working, it might be due to permissions. And so you would just make the whole directory readable and writable by everybody. Or another version of this is turning off the firewall entirely, or opening all the ports, because you’re having trouble getting just the exact right set of what you need to make things work.
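The “chmod 777” shortcut Christie describes, and its cloud cousin the wide-open S3 bucket, are both auditable after the fact. A tiny sketch of the local version in Python; illustrative only, since a real audit would walk entire deploy trees and check cloud ACLs too:

```python
import os
import stat
import tempfile

def world_writable(path: str) -> bool:
    """Flag the classic misconfiguration: the 'other' write bit is set,
    so any user on the machine can modify this path."""
    return bool(os.stat(path).st_mode & stat.S_IWOTH)

# The "just make it work" anti-pattern: everyone can write everything.
path = tempfile.mkdtemp()
os.chmod(path, 0o777)
print(world_writable(path))  # True: a finding

# Locking it back down: owner full access, group read/execute, others nothing.
os.chmod(path, 0o750)
print(world_writable(path))  # False
```

The point Audrey makes holds either way: the fix only happens if someone actually goes back and checks.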

AUDREY: So you think, “Oh, okay. Then I’ll go back and lock it down.” But maybe you don’t.

CHRISTIE: Right.

AUDREY: Maybe you don’t actually thoroughly manage that.

CHRISTIE: Yes. The newer version of that is, I think, there’s even more layers and so more potential things that could be improperly secured. I was just thinking about development environment. There’s just so many ways. And also taking a database snapshot and then where do you put it, and things like that.

AUDREY: Another thing that occurred to me as we’re talking about this is that to some extent, I think this is related to the way that startups and small companies just sort of punt on IT questions. I’ve worked on a lot of things where the person who set up the servers is just whoever’s most motivated. And so, it’s not especially coordinated, and so they’re individuals making decisions about this stuff that affect the security of the system. But because it’s not happening in a coordinated way, you would really have to audit to understand that and you’d have to get everybody to tell you what they’ve been doing in order to assess the security threat profile.

CHRISTIE: And everyone has to be on board with the norms too because I think depending upon what kind of developer culture or professional culture you’re coming from, being locked out of things can be seen as a lowering of status or something. So, I think there’s a social element too.

AUDREY: Or just even obnoxious, like a barrier that’s being put in the way of you getting work done.

CHRISTIE: Right.

AUDREY: You’re absolutely right that it’s a lot easier to do this from the start than to go back. The larger the organization and the more time, the more details and the more cultural stuff, too.

CHRISTIE: And I think companies have to hold each other accountable too. If you are a customer of someone or you have customers, you’re all part of a chain of security and everybody has to do their part.

AUDREY: Well, that’s a really good tie-in to the other security incident report that I saw this week that I thought was worth talking about. The NPM folks had a follow-up about the ESLint security incident where they talked about what they’ve done to figure out what happened, whether there was anything that still needed to be addressed, and what they’re doing to try to at least identify this in the future. And then they have a section where they ask, “Do you have any advice on how to prevent this aside from two factor authentication?” Package maintainers have a lot of responsibility to the community they serve, and they go on to talk about steps that maintainers might take. But NPM itself can’t force package maintainers to do these things. They’re relying on a sense of community obligation here, and education and awareness.

CHRISTIE: Right. And we get back to that, that there’s always that tradeoff or balance between convenience and security. So, part of why the package maintainers share a burden of the responsibility is because NPM has designed it so they have a certain amount of freedom to access a system to publish and maintain their packages.

AUDREY: I mean, it would take a much larger organization to not need to share the responsibility in that way.

CHRISTIE: Right. It’s a way of distributing the work. And I think package maintainers want that ability to be able to post an update to their package relatively quickly and not have to wait. What’s the Apple, the App Store, right? That’s sort of like the alternative model. I think any update that gets pushed has to go through some kind of review process by Apple.

AUDREY: Yeah, and there’s been a lot of questions about the thoroughness of that, whether they’re able to catch everything or not.

CHRISTIE: It may not even provide the efficacy that we’re wanting, but I’ve certainly seen people complain about it. Also, Apple is in the news this week for having passed a trillion dollar valuation. So, you can’t really compare the way that Apple is doing something with the way that NPM is doing something.

AUDREY: Yeah, right. A huge scale…

CHRISTIE: Orders of magnitude.

AUDREY: For sure.

CHRISTIE: There wasn’t a whole lot in this follow-up, basically. The biggest thing for me was that they scanned code looking for code similar to the malicious stuff that was uploaded. And they found two other packages, but those didn’t have nearly the same scope because they were not tied to a popular package. And they took those packages out.

AUDREY: That makes sense.

CHRISTIE: The DOJ had an announcement. Or was there anything else on the NPM stuff?

AUDREY: No. I’m always glad to see these retrospectives. So I was happy to have a follow up.

CHRISTIE: DOJ announced a press conference for, I guess, Wednesday or so. And the thing that they were announcing was indictments for three Ukrainian nationals connected with a lengthy hacking campaign against more than 100 businesses. And they really targeted a lot of retail businesses and restaurants: Chipotle, Arby’s, other places. And basically what this group did over a handful of years was send e-mails with payloads to different employees of these businesses, and they would supplement those with other e-mail communications and phone calls to really make it look like they were legitimate customers. So, customers trying to place orders, customers asking for help with their placed orders. And they were pretty successful, it sounds like, at getting employees to open these e-mail attachments. And then there was malware, and it basically infected the point of sale systems and started uploading customer payment data: credit card information. Then they sold that in the markets that you sell that in.

AUDREY: Right.

CHRISTIE: DOJ worked with law enforcement agencies in the different countries where these people were. Some of them have been extradited to the United States, some of them are pending extradition. The thing that spoke to me about this was just like, “Okay, this is stuff that’s happening all the time.” It’s happening at the mundane places where we go to get our midweek dinners or whatever. And I feel like we just have to start assuming that at some point our credit card information is going to be out there, and that we need to keep an eye out for that and for fraudulent activity.

AUDREY: But there’s so many ways that this stuff can be attacked or be accessed. I thought the article that you shared with me about this was really interesting because one of the first things it says is that hackers preyed on our urge to help. The way that they got people to open files with malware was by making it look like a reasonable customer service request.

CHRISTIE: The other thing they did was some of the attacks, they posed as agencies that help with notifying of food borne illness outbreaks.

AUDREY: Oh, I see.

CHRISTIE: Which is like, “Of course, you need to pay attention to that.”

AUDREY: You’re not going to be like, “Oh, I don’t know,” and blow them off.

CHRISTIE: Right. And having worked with an agency about this, there is a certain amount of back and forth you do to coordinate information about such things. And so it wouldn’t necessarily be out of the ordinary to pass files back and forth. I didn’t put the link to the actual DOJ announcement in the show notes, but I’ll do that. At the bottom of it, it has a bunch of attachments, including a fact sheet about how they did it. FIN7, or…what was the other name this group has? Carbanak Group. I guess they’re kind of well-known. It talks about how they did it, and it includes some of the example messages and whatnot. Does it actually include information on the malware too? Oh, wow! There’s an infographic. Step one: identifying a target, complete with like…not that clip art. Step two is grooming. Spear phishing e-mails target victim company’s employees: typically public-facing contacts, like employees handling catering requests and reservations, and/or in a managerial position. They accompany the e-mails with telephone calls to persuade the employee to open and activate the e-mail’s attachment.

AUDREY: It seems really normal.

CHRISTIE: Step 4 is selling stolen credit cards. The graphic is a hand and arm coming out of the computer monitor holding a credit card, and they’ve dressed the arm in black and white stripes. I don’t know if that’s to indicate criminal. I don’t even know if inmates still wear stripes.

AUDREY: Nice.

CHRISTIE: This is amazing. It makes me wonder what mitigation can be done on the point of sales systems. Is there any way to harden those systems?

AUDREY: You mean, are they networked in a way that they need to be in order to actually do what they’re doing?

CHRISTIE: Yeah. Is there any way to make them less vulnerable to this kind of malware? As I’m saying that, [inaudible] that I still see point of sale systems running Windows XP.

AUDREY: I think that’s a big part of it that they don’t get updated. I mean, if they’re not getting basic security updates, then I don’t know, take them off the network.

CHRISTIE: Spear phishing is still an interesting term for that.

AUDREY: That’s targeted.

CHRISTIE: So, we got some ghost characters.

AUDREY: This was another interesting thing that I saw this week.

CHRISTIE: Although the article is titled A Spectre is Haunting Unicode. I thought this was another…I was like, “How is a spectre vulnerability manifested in unicode?” That’s not what it’s about.

AUDREY: Although that would be intriguing.

CHRISTIE: So what’s this one about, Audrey?

AUDREY: It’s about a series of Unicode characters that were a mystery. They existed, they didn’t mean anything, they didn’t say anything, but they came through kind of an interesting set of pathways, in that the Unicode character set has adopted a lot of things from other text encoding sets. In this particular case, they had been mistranscriptions of Japanese characters that weren’t fixed. Nobody really paid attention to them, and then they became part of Unicode. And the characters still don’t mean anything, but they’re there and probably there for good.

CHRISTIE: In one case, it was sort of a pickup from…so, back in the late 70s, when the character set was standardized. I’m not even sure if it was for a digital purpose or not.

AUDREY: It was a specific Japanese character set, I think.

CHRISTIE: But was it…[crosstalk]

AUDREY: …typesetting.

CHRISTIE: Yeah. And in one case, one of the ghost characters actually came from a manual artifact. It was, I guess, a combined character, so they had printed up each component character and pasted them together. And it was just the way they had done the paste-up, and then subsequent copying of it, that created a shadow or a glitch. It made it look like there was more to the character than there was, and that’s what got recorded in the standard.
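Those ghost characters landed in Unicode as perfectly ordinary CJK ideographs, so you can poke at one directly. A minimal Python sketch using 彁 (U+5F41), commonly cited as the best-known JIS X 0208 ghost character; that example is my own choice, not one named in the episode:

```python
import unicodedata

ghost = "\u5F41"  # 彁: a "ghost character" with no known meaning
# CJK ideographs get algorithmic names in the Unicode database, so
# nothing here marks it as a transcription mistake; it is a fully
# legitimate, permanently assigned character.
print(unicodedata.name(ghost))      # CJK UNIFIED IDEOGRAPH-5F41
print(unicodedata.category(ghost))  # Lo (Letter, other)
```

Which is the point of the article: once a mistake is encoded, stability guarantees mean it is there for good.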

AUDREY: Just sort of a glitchy interpretation. And part of what I thought was interesting about this article was that among the Japanese characters, there are a few that are place names, characters not used in any other context, so they are fairly unique. And mistranscribing them doesn’t make it any easier to write a place name, but it does sort of explain why it wouldn’t have been captured differently.

CHRISTIE: Because it’s not used in a lot of contexts.

AUDREY: Right. Unless you were making another atlas, you might not even need to reference it.

CHRISTIE: I’ve done manual paste ups like that. So I was like, “Oh yeah, you sometimes get like the little shadows.” I just thought that was kind of interesting. I didn’t have the chance to watch this video you linked in here.

AUDREY: As it was talking about the process of what characters end up in Unicode, how an error from 1978 in Japanese text encoding could eventually result in ghost characters in Unicode, it reminded me of this talk called Decolonizing Unicode that I saw at AlterConf Portland a while back. The theme of the talk is that Unicode is sort of portrayed as this idea of representing all of the written language of the world. But there are actually very commonly spoken and written languages that aren’t included in that. And it reflects the political process that Unicode is built on, the ways that different languages are represented in the…by political, I just mean like the institutional process. And I thought that was really interesting and this was sort of…the ghost characters are kind of a mirror of that in a way, it’s sort of a reflection of who helps create Unicode.

CHRISTIE: And that there’s a certain human fallibility that gets translated into it whether it’s a manufacturer of a character due to a physical glitch or are not including a very prominent language.

AUDREY: In some cases, just major oversights in terms of written languages that people use.

CHRISTIE: And then there’s also an article; it looks like the same person that gave the AlterConf talk had an article in Model View Culture titled: I Can Text You a Pile of Poo, But I Can’t Write My Name. And they are talking about Bengali not having good representation in Unicode.

AUDREY: Right. That there are things that you could write by hand that you can’t write on the computer. And it is sort of interesting that pictorial Unicode characters through emoji have become this big, big, big thing that everyone uses. And they’re good, they’re useful. But at the same time, the novelty hasn’t extended to just…I don’t know, [inaudible] languages with a little bit more history.

CHRISTIE: I have my own troubles with the emoji because quite often, I mostly don’t know what they mean. And a lot of the ways, when I can even figure out how to get the emoji viewer, the emoji keyboard to display, I can never figure out how to get it to tell me…because they all have descriptions. But I can never figure out how to get those descriptions to display. So I sort of have to interpret.

AUDREY: What the image you’re looking at is?

CHRISTIE: Yeah. And when they’re like 16 pixels by 16 pickles…pixels…now I’m going to think of all pixels like pickles.

AUDREY: We’ll just make a pickle grid.

CHRISTIE: Yeah. I often try to participate in the emoji thing but I always feel like I’m doing it very wrong. And then half the time when I see people using emojis, I’m just like…I have the same problem with animated GIFs. So, I always feel like I’m not in on the joke.

AUDREY: Two kinds of complexity to this, aside from whether you can actually see the character. One of them is that emoji are not always used literally. There are lots of emoji that are used to represent other things.

CHRISTIE: That’s probably a big part of the issue.

AUDREY: Well, and the other one is that it depends on what device you’re using, what website you’re looking at, like the actual graphics themselves are different. And some of them acquire different connotations because of differences in the illustration.

CHRISTIE: Right. In the same way that an alphabet has different fonts, emojis kind of do too because they have different representations. I was trying to think of different examples. Something going around about like the burger emoji and the ordering of the things on the burger.

AUDREY: Yeah.

CHRISTIE: Some of these are trivial but some of them are not. One of the other differences was the gun emoji: one platform’s looked like a fairly realistic revolver and another’s looked like a water gun. That’s kind of different.
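The descriptions Christie was trying to surface do exist: every assigned emoji codepoint has a fixed formal name in the Unicode Character Database, even though each vendor draws its own artwork (which is exactly how the revolver/water gun split happens under a single name). A quick Python lookup, as an aside; these three examples are ones mentioned or alluded to in the episode:

```python
import unicodedata

# The formal name is standardized; only the glyph varies per platform.
for char in ["\U0001F4A9", "\U0001F354", "\U0001F52B"]:
    print(f"U+{ord(char):04X}", unicodedata.name(char))
# U+1F4A9 PILE OF POO
# U+1F354 HAMBURGER
# U+1F52B PISTOL
```

Note this only works for single codepoints; multi-codepoint ZWJ sequences (like family emoji) don’t have a single character name.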

AUDREY: Yeah, those are going to be very different things when you use them.

CHRISTIE: One’s maybe playful and one’s maybe a threat. Anything else about Unicode?

AUDREY: No. Just I encourage you to watch that talk.

CHRISTIE: So next topic, we have this…this thing annoyed me a little bit. We have this article in the MIT Technology Review, and I was happy that I found this retort, this counterpoint piece to it. There’s this piece by Mariana Mazzucato in the June 27 issue of MIT Tech Review. And it says: Let’s make private data into a public good.

AUDREY: I think we get to talk about the tragedy of the commons again a little bit.

CHRISTIE: Oh, God. Do you want to summarize this?

AUDREY: Sure. The article has a very good analysis of the impact of surveillance capitalism. That companies are allowed to use our personal and private data in a way that an aggregate is very financially beneficial for them. But because it is an aggregation that adds financial value for the company but does not transmit any value to us other than our ability to use and access services that are funded by it, that there is a fundamental mismatch there. And it’s exploitative and all sorts of things. So what the author proposes is that really what we need is a data commons for this kind of private data that would allow us all to benefit in that financial sense from it. And whatever information gathering that is also useful through these services, the way that they aggregate data. It has the idea that we could all benefit from that, if there was sort of a public, I don’t know public…I can’t think of it now…that aggregation.

CHRISTIE: Well she basically says the government should aggregate the data and sell it to private companies.

AUDREY: Yeah.

CHRISTIE: That’s the solution proposed here.

AUDREY: It’s sort of arguing that the problem with surveillance capitalism is not the surveillance but the capitalism, I think.

CHRISTIE: Or that the government isn’t participating in the capitalism?

AUDREY: Yeah.

CHRISTIE: Aral Balkan had a response to this titled: Out of the frying pan and into the fire. I thought this was really good. He summarized it well too, so if you don’t want to read the original MIT article, which actually isn’t that long…

AUDREY: It’s not that long.

CHRISTIE: You could also just read this. But basically, he pointed out most of the issues I found with it: that the issue isn’t data collection itself but the violation of privacy, that we shouldn’t continue to normalize that, that there’s no way we can give consent for the level of data collection that we are subject to; basically, that we’re farmed for information.

AUDREY: And I think that’s something that we’ve talked about a lot before that you can’t give meaningful consent to the use of your data when you can’t examine the aggregate, when you can’t really assess the aggregate impact of that.

CHRISTIE: There basically are no viable alternatives; there are no alternative tech products and services that aren’t surveillance capitalism.

AUDREY: [Inaudible] [Crosstalk]

CHRISTIE: So it’s not a real choice. Then it says, “That’s not to say that we cannot have a data commons. In fact, we must. But we must learn to make a core distinction between data about people and data about the world around us.” And then it says, “This is the rule of thumb: data about individuals must belong to the individuals themselves. Data about the commons must belong to the commons,” meaning that there’s a difference in scientific data that is collected about a particular field of study or whatever versus our own personal data. Then they talk about the alternative is building free and open decentralized interoperable systems where data originates in a place that you as an individual own and control.

AUDREY: And there are ways to re-examine that aggregation. How we might examine when it makes sense to share things.

CHRISTIE: Yeah. There’s a little bit of a sideways pot shot at the cloud, and maybe that’s unfair given how accessible cloud computing has made computing. I don’t know how much. I think there are probably ways to store data and run services on servers that you’re renting versus owning. I’m not sure that we should continue to engage in that fight that everyone should have a server under their desk.

AUDREY: There are lots of ways that, that doesn’t scale and isn’t accessible for everyone.

CHRISTIE: Especially now that we actually have competition in the cloud computing market. They say there are two things we must do to create ethical alternative: regulate the shit out of surveillance capitalists, and then fund and build ethical alternatives. All that made sense to me. The one thing that this response piece didn’t mention that really bugged me was actually the last line in the original Tech Review piece. It says, “The digital economy must be subject to the needs of all sides; it’s a partnership of equals where regulators should have the confidence to be market shapers and value creators.” No, no, no, no, no.

AUDREY: It’s a pretty different view of what regulation means, what regulators are.

CHRISTIE: Right. It’s like saying…

AUDREY: It makes them something other than a neutral party.

CHRISTIE: Right. It’s like saying, “Oh, the judicial branch should have the confidence to shape and create legislation.” No. That’s what the legislative branch of the government does. So that bugged me a lot and that wasn’t specifically called out in the other piece.

AUDREY: As I was looking at this, I thought the reason that I said, “Oh, we get to talk about the tragedy of the commons again,” is that there’s this very fuzzy idea about what the commons here actually is. Like, is there a common value to personal private data in the first place? In a lot of cases, there isn’t necessarily, aside from how companies make money off of it.

CHRISTIE: And it becomes valuable to them in aggregate.

AUDREY: Like, is there any value to us aggregating some of these things? Phonebooks are, I guess, the example of that but there are lots of other things that maybe not so much.

CHRISTIE: I’m thinking in terms of like research and understanding greater patterns in society. But we have protocols for that.

AUDREY: Yeah, and I don’t think I would ever want to see a situation where somebody could do research that relied on personal private data that did not require talking to the participants first. And I think that first piece especially sort of indicates that there’s a possible outcome where you could go to the regulatory agency and say, “Well you know, I’d really like to get everybody’s…” I don’t even know, just some core thing about everybody’s shopping purchases, whatever. And that you could do research on that without having to get the individuals to be willing to share that information, to explicitly offer to share that information.

CHRISTIE: It’s like the Dropbox thing that we talked about last week where you might sign a Terms of Service that says Dropbox will use certain anonymized information about my Dropbox usage to make the service better. You don’t think that that also means giving it to researchers to do this other kind of research.

AUDREY: To judge how your collaboration practices affect your standing in university rankings.

CHRISTIE: Right. It’s this endless reselling and reusing of data for all these other purposes.

AUDREY: Ones that definitely don’t have a clear benefit to the user.

CHRISTIE: It’s also the linking of data because that’s the other thing. People are taking data from Facebook, from your credit card purchases, now from roadway surveillance information.

AUDREY: The ability to link lots of different kinds of pieces of data together.

CHRISTIE: Because that compounds a privacy violation.

AUDREY: And can be much more revealing than any of those individual pieces were. I think we both agree – more regulation. Very important.

CHRISTIE: This person has a new book: The Value of Everything: Making and Taking in the Global Economy. I’m really curious as to what else is in that book. I feel it will make me just as grumpy.

AUDREY: I don’t know that you need to do like, it’s not hate reading but like grumpy reading.

CHRISTIE: No, I just wake up grumpy.

AUDREY: Well, it’s a good thing that we’re going to talk about things we like on the internet.

CHRISTIE: Right. And I’m realizing I don’t know what I got. I know there was something. Oh, okay. Why don’t you go first?

AUDREY: Okay. I woke up this morning and I saw a great news story about goats.

CHRISTIE: Oh! I know what you’re talking about.

AUDREY: Is that yours?

CHRISTIE: No, that’s not mine. But I know what you’re talking about.

AUDREY: There were a whole lot of goats rampaging around Boise, I think.

CHRISTIE: Yup, a neighborhood in Boise. It was a lot of them, like 30 or 40, it looked like.

AUDREY: And there’s video, and it’s great. The first thing that I saw said something like that goats are out here and they’re eating everything in sight. And the video confirms that they are going after some [inaudible] and people’s lawns and stuff.

CHRISTIE: First I laughed and then I thought, “Oh my God! They would probably eat all my flowerbeds in like two minutes.”

AUDREY: [Crosstalk] [Inaudible]

CHRISTIE: But then on the other hand, it would obviate the need for mowing this really overgrown dead lawn I have. So, we made it pretty far in the summer. I might be okay with losing the flowerbed.

AUDREY: I would definitely let it go at my front yard right now.

CHRISTIE: The front yard, I did manage to mow once. It’s the backyard which has not been mowed all summer. And you know how we have [inaudible]? I don’t know where they are anymore.

AUDREY: And you have a lot of backyard to work with.

CHRISTIE: I know, I know. So, my thing. I follow Botanygeek on Twitter who is, I think…I don’t know why I think he’s a botanist for the Royal Horticultural Society. I think I might just be making that up. But he had this small tweet thread about where maize comes from. He says maize is an artificial species, created entirely thanks to human ingenuity. I had seen bits of this before. I think I saw an exhibit about it at the Natural History Museum in Mexico City. It says, “Its closest wild relative is a grass with tiny, barely edible, rock hard seeds called teosinte.” I think that’s the English version of that. It took thousands of years of tireless breeding work by Native Americans to get us maize. And then, “What about sweet corn?” Do you know where sweet corn comes from, Audrey?

AUDREY: I do but only because I saw this thing earlier.

CHRISTIE: Oh, okay. Did it blow you away? It blew me away. It says, “Well, the sweet corn we know today is essentially a post-war discovery. It’s a random mutation discovered in a batch of mutant maize strains sourced largely by exposing bags of the seeds to nuclear bomb tests in the South Pacific.” And then Botanygeek tweets a screen cap of a paper and this just cracked me up. The paper is titled “Hereditary effects produced in Maize by radiations from the Bikini Atomic Bomb I. Studies on seedlings and pollen of the exposed generation.”

AUDREY: I’d known that it was a random mutation. I hadn’t realized that it had that particular history.

CHRISTIE: I had no idea sweet corn was so new.

AUDREY: Yeah, it’s a very modern thing.

CHRISTIE: Sherri has a really good way of cooking it, so it’s become kind of a summer special thing we do. And I had no idea that it was so new.

AUDREY: And the history of cultivation is really interesting that it went over this very long period of time to accomplish what we think of as just corn meal.

CHRISTIE: And corn is not the only thing. I think a lot of our staple crops are like that. And that just kind of boggles my mind how…I mean, people were doing this long before Mendel and the peas and stuff. So like, how did we do that?

AUDREY: I guess you can have good scientific process even if you don’t write it down.

CHRISTIE: Right. There’s value in like experiential observational knowledge without understanding it on an atomic level or whatever.

AUDREY: You don’t [inaudible] you don’t even have to understand DNA to look at the result of your breeding experiment.

CHRISTIE: And also because I’m learning more and more about plants and I’ll be like, “Oh, that’s in the blah, blah, blah family.” The same as some food crop. I’m like, “It doesn’t look anything like that.” And then it just helps explain that there’s been specific dedicated work to get our food crops to look like this particular thing. That’s why there’s so much divergence.

AUDREY: And I think Western colonial history erases a lot of this for us, that awareness of the history of how these things came to be.

CHRISTIE: And then further on in the thread, some people are talking about maybe it was just spontaneous mutation not radiation, but anyway. So, we’ll link to that. It’s kind of interesting.

AUDREY: Now I want corn.

CHRISTIE: Yeah.

AUDREY: I don’t do a lot of sweet corn in the summer.

CHRISTIE: Well, Sherri like soak boils it and then makes it on the barbecue in the husk.

AUDREY: I’ve had it, it’s good.

CHRISTIE: It’s a lot of work. I probably wouldn’t do it on my own, I admit. All right. I think that’s a show. Thanks, Audrey.

AUDREY: Thank you.

CHRISTIE: Thanks everyone. Signing off. Talk to you next week.

AUDREY: Bye.

CHRISTIE: And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to podcast@recompilermag.com or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.

The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.