Download: Episode 48.
This week Audrey and I chat about Strava’s heatmap, a newish iOS app called Verena, a new Amazon patent to track hand movements of warehouse workers, and how ICE now has a contract to access nationwide license plate recognition data. Enjoy!
- [01:44] Recompiler Issue 9: Hard Problems
- [05:32] Recompiler Newsletter
- [06:04] Strava’s heatmap data lets anyone see the names of people exercising on military bases | WIRED UK
- [13:37] @timmathews: “On my deployments…”
- [14:59] Sensitive Compartmented Information Facility (SCIF)
- [16:45] The Latest Data Privacy Debacle (NYT)
- [24:30] Verena, an iOS app to help protect people in abusive situations
- [28:36] Cell Phones: Friend or Foe? (Recompiler, Issue 7)
- [29:28] Amazon Patents Wristband to Track Hand Movements of Warehouse Employees
- [37:32] ICE is about to start tracking license plates across the US
- [48:03] Presenting the winners of the U.S. Wiki Science Competition – Wiki Education
- [50:55] The Racist Sandwich Podcast: Food, Race, Class, and Gender by Racist Sandwich — Kickstarter
Issue 9: Hard Problems, now shipping!
Our first issue of 2018 focuses on the hard problems we try to solve in our work and our careers. Get it in the shop now.
Now Broadcasting LIVE most Fridays
We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune in.
We love hearing from you! Feedback, comments, questions…
We’d love to hear from you, so get in touch!
CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.
Episode 48: Wouldn’t it be cool if we knew. This week Audrey and I chat about Strava’s heatmap, a newish iOS app called Verena, a new Amazon patent to track hand movements of warehouse workers, and how ICE now has a contract to access nationwide license plate recognition data. Enjoy!
This is Audrey and Christie. I’m Christie. That’s Audrey.
CHRISTIE: And we’re live. It’s Friday, February 2nd. Can you believe February is finally here?
AUDREY: I’m a little concerned, honestly, as much as I want winter to be over.
CHRISTIE: Yeah. I have mixed feelings too because I feel like I was like, “Ugh, January’s dragging.” And then I was like, “Oh my god, it’s February already.”
AUDREY: I’m pretty sure there were some things I was supposed to get done in January that did not happen as intended.
CHRISTIE: And then it’s a double whammy because February is such a short month. So we’re going to do it all over again. We’re like, “Oh my god, it’s March already.” But then we really will be three more weeks closer, or four more weeks closer to spring. And we’re live recording what will be Episode 48. Any announcements, Audrey?
AUDREY: Just that we have Issue 9 in the shop, getting it much closer.
CHRISTIE: Issue 9.
AUDREY: It’s getting ready to print. I’m looking forward to seeing some more of the design work this weekend. So, we’ll have it sent out. It will probably still say January on it. It might have been one of the things I was supposed to have done in January.
CHRISTIE: Remind us what Issue 9 is going to be about.
AUDREY: About Hard Problems.
CHRISTIE: Hard Problems.
AUDREY: Both technical problems and career problems and just understanding our work.
CHRISTIE: I really like the cover of this one. It looks very much like a literary journal, just probably why I like it.
AUDREY: Yeah. Our new designer had a really, really great pitch about how, let me think, there was a sort of feminist journal authenticity that we could convey with these design changes, and that she could really find ways to highlight the meaning of what we’re talking about through the design.
CHRISTIE: Nice, I like that.
AUDREY: Yeah. And so the cover is pretty cool too because circles kept coming up in people’s discussion. And we have an article in there about floating point numbers, too. So she’s been working with this circle motif. These pieces that go together to try to illustrate that. And I think it’s just very cool how it sort of shifts through the different articles.
CHRISTIE: Nice. Yeah, and there’s a combination of the half circles and different configurations. So, shop.recompilermag.com. You’re working on Issue 10, too?
AUDREY: Yeah. We’re starting to get the first drafts in from our contributors. And we have a guest editor on that issue. Her name’s Rachel Rigg. She just finished up a Ph.D. in…I have to remember the subfield. It’s like bioengineering, I think. And so, I’m really excited for her perspectives on what people are working on too.
CHRISTIE: Nice. That’s the science one, right? Did you already say that?
AUDREY: Science, yeah. And we are also getting ready to open the call for contributors for Issue 11, that will be very soon too.
CHRISTIE: Oh, my goodness. Do you know what the defining topic on that will be?
AUDREY: Yeah. We’re going to talk about love and romance.
CHRISTIE: Oh, my goodness.
AUDREY: So, matching and how the technology we use fits into those parts of our life.
CHRISTIE: Wow! So, you’re having to juggle three issues at once?
AUDREY: Yes. Well, parts of it. But again, guest editors. It’s really cool to get to be working with a lot more people this year.
CHRISTIE: Awesome. All right, good stuff. And so, if you go to shop.recompilermag.com now, you can subscribe and then get all of those issues, when they’re ready, in your inbox or your mailbox.
AUDREY: Yes, for sure. And if you’re not sure if you want to subscribe yet, get on our newsletter and then you’ll get e-mails when each one comes out.
CHRISTIE: And how do they get to the newsletter subscription?
AUDREY: There is a link on our home page. There’s two places. But one of them is that you click ‘News’ up at the top on the newsletter, and the other one is the big button right in the middle of that main block of text.
CHRISTIE: Awesome. And we can probably put a link to the show notes, too, I’m guessing.
CHRISTIE: Cool. All right, good times. And if you’re not interested in a subscription for whatever reason but you want to support our work, you can also just make a one-time contribution, or a monthly pledge, to support the podcast and all the other stuff we’re doing here.
We’ve got some interesting stuff for this week. So, Strava.
CHRISTIE: What is Strava? It’s a fitness thing, right?
AUDREY: It’s a fitness tracker, yeah. It’s really popular with cyclists and runners and people who do certain kinds of outdoor sports.
CHRISTIE: They don’t have their own hardware, I think, but they basically get the telemetry from whatever you’re using, whether that’s a Fitbit or a Garmin tracker.
AUDREY: Yeah, I think Garmin trackers are pretty common.
CHRISTIE: And it’s not just tracking and analyzing your own performance but there’s a social networking aspect to it where they have leaderboards and stuff like that.
AUDREY: Yeah. I mean, like if you and your friends are all training for a half marathon together, then it kind of lets you check in.
CHRISTIE: Ooh, and share your real time location during an activity.
AUDREY: Yeah, that’s one that we already know to ask questions about.
CHRISTIE: Right. And so, Strava came up in the news over the last weekend. This feature had been out for a little while, but someone had just recently noticed some interesting side effects, unintended consequences of this feature.
AUDREY: Well, there’s this heatmap visualization specifically that gives people access to, I think, what they thought would be reasonably anonymized data.
CHRISTIE: So, a heatmap is a kind of visualization that I think we’ve all seen, where there’s some kind of map or…yeah, we’ll just call it a map, and areas of use get highlighted, and they get highlighted in brighter or warmer colors the more often that area is touched. Am I describing a heatmap correctly?
AUDREY: Yeah. And we say heat because, yeah, brighter, warmer, making it glow. And so, in the case of Strava, what that could show you is, like: in Portland, there’s a popular biking trail called the Springwater Corridor. So, that would probably glow quite well, as would a cycling route that goes around Sauvie Island.
So you could get kind of an idea like, “Oh, these are really popular places to try.”
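A heatmap like the one they’re describing is essentially a count of activity per map cell. Here’s a minimal sketch of that binning idea in Python; the coordinates and cell size are made up for illustration, and this isn’t Strava’s actual code or data:

```python
# Illustrative sketch of heatmap binning (not Strava's actual code):
# snap each GPS point to a grid cell and count hits per cell.
# More hits in a cell means a "hotter" (brighter) spot on the map.
from collections import Counter

def heatmap_counts(points, cell_size=0.01):
    """Count activity points per grid cell (lat/lon snapped to cell_size)."""
    cells = Counter()
    for lat, lon in points:
        cell = (round(lat / cell_size) * cell_size,
                round(lon / cell_size) * cell_size)
        cells[cell] += 1
    return cells

# Made-up points: three clustered near one spot, plus one outlier.
pts = [(45.501, -122.650), (45.502, -122.651), (45.499, -122.649),
       (45.800, -122.100)]
counts = heatmap_counts(pts)
hottest = max(counts, key=counts.get)  # the cell with the most activity
```

Rendering is then just a matter of coloring each cell by its count, with warmer colors for higher counts.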
CHRISTIE: Or maybe like the Leif Erikson Trail in Forest Park is probably popular with runners.
CHRISTIE: And so, they released this heatmap feature where you could go and see overhead views, like satellite views, with the heatmaps overlaid on top. But they did this worldwide, and this one person found that the data was inadvertently revealing the location of military bases and other intelligence facilities. I don’t know. Do we consider, like, the CIA military…do we consider that?
AUDREY: They’re not officially, right? Lots of people use fitness trackers, and it’s not surprising that people who have physical jobs would use fitness trackers. But it is, of course, surprising that when Strava released this, they didn’t look at the major clusters that were being revealed. It’s sort of obvious in retrospect to go look for non-urban heatmap areas and see what they might actually be showing.
CHRISTIE: Right. Like, why does this one area in the middle of the Nevada desert have so much activity? Oh, because it’s around Area 51. And I think part of the problem was that, so there’s the heatmap thing, but there’s also these leaderboards. And even though Strava thought that the data was anonymized, through their API you could create a route and then upload it, and Strava would then tell you the leaderboard, with other users’ information, for that route.
AUDREY: And they could use the heatmap to figure out where your route should be and then use the leaderboards to connect that to people who are actually there.
CHRISTIE: Exactly, yeah. So once someone has gone through Strava’s records, they’ll be able to see the overall top 10 per gender/age group, when they ran it, and who they ran with. And if their profile isn’t locked down enough: which other military bases they’ve been on for runs.
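The attack being described can be sketched as a toy simulation. Everything below is invented for illustration (the record layout and the bounding-box query are hypothetical, not Strava’s real API), but it shows the mechanism: the heatmap tells you *where* activity clusters, and a leaderboard-style query over that area attaches *named* accounts to it:

```python
# Toy simulation of the de-anonymization vector (invented data, not
# Strava's real API). The "mystery hotspot" plays the role of a cluster
# spotted on the public heatmap.
activities = [
    {"user": "runner_a",    "route": [(34.10, -115.80), (34.11, -115.81)]},
    {"user": "runner_b",    "route": [(34.10, -115.80), (34.12, -115.82)]},
    {"user": "city_jogger", "route": [(45.50, -122.65)]},
]

def leaderboard_for_area(activities, lat_range, lon_range):
    """Return the users whose routes touch the queried bounding box."""
    hits = []
    for act in activities:
        if any(lat_range[0] <= lat <= lat_range[1] and
               lon_range[0] <= lon <= lon_range[1]
               for lat, lon in act["route"]):
            hits.append(act["user"])
    return hits

# Query the desert hotspot seen on the heatmap: anonymous glow
# becomes a list of account names.
names = leaderboard_for_area(activities, (34.0, 34.2), (-115.9, -115.7))
```

The heatmap alone only says “somebody exercises here”; it’s the leaderboard query that connects identities to the hotspot.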
AUDREY: Oh, yeah.
CHRISTIE: So there’s kind of multiple intersecting things happening here. There was the thinking that the data was anonymized when really there’s another vector to get at it, because they have this other feature. There’s the confusing privacy settings that Strava presents to users. And there was another Quartz article about that, kind of outlining all the different challenges there. And then, yeah, I guess those are the main two things.
AUDREY: Yeah. That your data is being collected, that your data is being aggregated, that your data is being displayed, and that there’s access to all those pieces.
CHRISTIE: And that anonymization isn’t a binary.
AUDREY: Right. There are lots of implicit ways that data can be de-anonymized. One of the examples I heard was, like, if you know somebody or if you live in a mountain cabin in a fairly isolated area and you have a favorite morning hike or something, that’s going to show up too. That’s going to make it so that people can find out who lives all the way out there, which maybe isn’t something that you want to get into.
CHRISTIE: Or if they know you live out there, they can derive things about you. Like you go on this route and if they can figure out what time you leave and how big that route is, they’re going to know how long you’re going to be gone or something.
AUDREY: But it’s a general loss of privacy, too.
This comes up too, like Glassdoor and those other employee review sites that are “anonymous”. Well, they’re only anonymous if you work for a big enough employer that yours isn’t going to stick out like a sore thumb.
AUDREY: They only release salary info, I think, I forget, it’s like after a certain number of people put in for that position. But generally, your comments, if you were the squeakiest wheel in that department, it’s going to come through.
CHRISTIE: Yeah. It’s really interesting too, because I saw some comments from former military people that were like, “When I was active duty, and it wasn’t that long ago…” The article says, “On Twitter, Tim Matthews, who has served in the US military, said during his time he was not permitted to use any non-military issued device with GPS tracking.” So I don’t know if it’s just gotten to the point where these devices are so ubiquitous, they can’t totally control access.
AUDREY: He was actually in the military not a contractor? Is that what it says?
CHRISTIE: Well that’s what the quote says.
AUDREY: Because there are tons of contractors on bases.
CHRISTIE: He’s talking about, I don’t know. He doesn’t really say. We can link to this Twitter thread. “On my deployments, if you used any non-military issue device with GPS tracking, or a non-secure communication device, it was an Article 15. We weren’t covert CIA spooks. We were just infantrymen using common sense.” That does not sound like a contractor to me.
AUDREY: I just wonder if contractors are being held to the same standards.
CHRISTIE: Yeah, that’s also a good…He also says, “Also, please stop taking selfies in the SCIF. Or at least turn off geotagging and iCloud sharing before doing so.” And a SCIF is that secure communications facility, like when the Joint Chiefs of Staff, I think, go into a meeting, they’re supposed to be in a SCIF. I forget what it stands for. But I was like, “Who’s doing that?”
AUDREY: Somebody. Somebody is doing that, yeah. It’s just one of those things where we take a useful thing which is tracking your own fitness. Again, if you’re training, this is actually really, really important, really helpful.
CHRISTIE: And so many athletes and other people wanting to improve their performance have been doing it for a while. It’s not necessarily a new thing.
AUDREY: Yeah. I mean, to me, this is something that you would have done with a stopwatch and a map. So it makes it a lot more accessible for a lot of people to use these training tools. I mean, that part’s totally fine, and maybe it doesn’t make sense that Strava would have to know what areas of the map to black out. But there’s just an aggregation problem where, whatever process they used to review this and release it, it didn’t consider this. It didn’t consider that there were people in places that wouldn’t want to be revealed, that it might affect somebody’s safety to be revealed.
CHRISTIE: Yeah and that’s even if there is a review process.
CHRISTIE: And I would be surprised if most companies, or even a small number of companies, have such processes. You alerted me to this article, and I guess it’s an opinion piece, in The New York Times about consent and data collection from…did we figure out the correct pronunciation?…Zeynep.
CHRISTIE: The Latest Data Privacy Debacle. And basically, Zeynep makes the case that…and I saw this. I saw these reactions to the Strava thing. I think it was even part of their official response as well: the users opted into this. And her argument is, she says, that the privacy of data cannot be managed person-by-person through a system of individualized informed consent.
AUDREY: Because you can never be properly informed without seeing the aggregate, without seeing the ways that the things can be put together and the ways that people can try to use them. Without clear insight into that, you can’t have informed consent.
CHRISTIE: So the reason we say informed consent is because it’s sort of like one has to come with the other. There’s no such thing as consent if you’re not informed about…
AUDREY: …the impacts, the implication. Yeah.
CHRISTIE: And her point is that that’s not possible, and she gives a couple of reasons. She says one of the problems is that it assumes companies have the ability to inform us about the risks, and they don’t. There are so many unintended consequences that a company can’t identify them all. And we’ve seen this time and time again.
AUDREY: Yeah, and it’s not even just in this moment. Like we might look at and go, “OK, well you can do this much with this data put together in this way with these tools.” But the way that machine learning has progressed, it’s hard to know what further kinds of aggregation can be possible and what further kinds of analysis can be possible. Somebody may come up with a question for their system that uses this data that reveals something entirely else.
CHRISTIE: Right. It says machine learning can take seemingly inconsequential data about you and, by combining it with other data, discover facts about you that you never intended to reveal. And we talked about this before, like how looking at Facebook likes might reveal things, or might infer things, about your sexual orientation.
AUDREY: And those are really simple examples. I mean, I think the creepiest ones are when somebody gets really focused on how to tell if someone’s about to kill themselves, or commit acts of violence, or become a terrorist. The kinds of things that you pull together to analyze that are really invasive.
CHRISTIE: And also the algorithms and the data sets for that matter that drive the machine learning is opaque, for the most part.
AUDREY: Yeah, and you could have a system tell you a result for that and not know how it got there.
AUDREY: And that’s the part that worries me the most, that we look at it and go, “Oh well, the system says…” But if you can’t follow that route to understand how it came to that conclusion then you can’t trust it.
CHRISTIE: And you have no due process which makes me wonder. I’m curious if this ever does make its way more through the courts. Is there a due process angle or something that that will rest on?
AUDREY: I don’t know. I mean, it’s already getting used the other direction, right? In sentencing and parole hearings.
CHRISTIE: Right. Yeah, that’s what I mean. And so, she goes on to say that because it can’t be managed at the individual consent level, it needs to be managed more in a regulatory framework. So, stricter controls and regulations about how data is used; data storage must move from being the default procedure to a step that is taken only when it is of demonstrable benefit to the user, with explicit consent and clear warnings about what the company does and does not know. And there should also be significant penalties for data breaches, especially ones that result from underinvestment in secure data practices.
CHRISTIE: So she’s basically saying: collect less data, and only collect it when there’s a clear benefit to the user, because I think the default now is to suck up all the data.
AUDREY: Sure. I mean, even if you’re not advertising focused. And as far as I’m aware, Strava isn’t, primarily. But even if you’re not advertising focused, you can get into this mode of thinking, “Wouldn’t it be cool if we knew some particular piece of information?” And the problem is that if you can’t anticipate what will happen when you collect that information, when you collect it from multiple people over time, then maybe it won’t be cool, but you can’t tell.
CHRISTIE: And a huge part of what’s driving that is the business model. Most of these things don’t charge, and so in one regard, they need to suck up all this data because their plans to monetize revolve around it.
AUDREY: Yeah. I think I mentioned before that now that I’ve spent a little bit more time poking at Facebook’s business manager and Ad Manager interface, I can really understand why what they’re doing creates a powerful advertising platform. It gives you a lot to work with. And if what you’re doing is advertising, it’s a very logical outcome.
CHRISTIE: And also you said something very specific to me which was now that you are using those tools, you really understand who the platform’s actually built for.
AUDREY: Yeah, I finally felt like the customer.
AUDREY: And suddenly, all of the things that I hate about Facebook disappear.
CHRISTIE: Yeah, I think that’s really, really telling.
AUDREY: I read through some forums and some comments, and this didn’t come out fully formed this way. Facebook has iterated on this in response to what paying customers have told them, and this is how they solved those problems to create this particular platform.
CHRISTIE: Right. Anything more on the Strava stuff?
AUDREY: I think the impact of this is going to be hard for us to assess, a little, because of the military intelligence aspect of it. But I hope that there will be some follow-up reporting to kind of see and be on the [inaudible] if terrorists knew where that base was. I assume that the effects are going to be a lot more subtle.
CHRISTIE: Right. The military’s not going to come out and say there were negative consequences. So, this next thing is kind of neat. There’s this iOS app called Verena. For one thing, it’s written by a young person, a teenager, Amanda Southworth, and this is not her first app. She did an app called anxietyhelper. She got to attend the most recent Apple Worldwide Developers Conference and even wrote about it for Teen Vogue. This kid’s very precocious.
AUDREY: It’s pretty cool.
CHRISTIE: It is cool.
AUDREY: Both her past work and the thing that we’re going to talk about I think really show like why we want more people to code, somebody solving problems that they see in front of them in their community and having the tools to do that.
CHRISTIE: So have you heard of this app before? It’s been out for a little bit, it sounds like.
AUDREY: No I hadn’t, until you mentioned it to me.
CHRISTIE: So it’s designed to protect and help guide you through situations like domestic violence, hate crimes, abuse, and bullying. It was developed with the LGBTQ+ community in mind, but can be used by anyone facing abuse or harassment.
So, you create an account and develop a network of emergency contacts who can be alerted without leaving a trace on your phone. There’s an emergency feature to guide you through a problem giving you the resources to get out of the emergency safely. You can create incident logs to keep track of abuse, hate crimes, or bullying which is really important if you’re going to get any kind of help from law enforcement or government officials, you need lots of documentation.
AUDREY: Yes. I like this timer feature. She’s written in a safe-call feature where you set a timer and say, OK, if I don’t check in safely by this point, contact my emergency people. It’s something that addresses a pretty common problem.
CHRISTIE: And then there’s also an incognito mode that changes the interface. So it’s not as obvious if the abuser is looking at the phone and it has a shut down mode where it’ll disable the app until you redownload it. So, cool stuff there.
AUDREY: Yeah. It just seems like a very well thought out tool.
CHRISTIE: Verena takes its name from a German name that means protector, which is kind of cool. It’s nice to be able to share something positive.
AUDREY: Yeah, productive stuff and the opposite of super invasive data collection.
CHRISTIE: According to this article, she was going to school, but it wasn’t allowing enough time for her to code. So now, she’s homeschooling for the last bit of high school. She says she codes five hours a day and does schoolwork for about two.
CHRISTIE: And wants to get a job at SpaceX.
AUDREY: That is a very cool goal.
CHRISTIE: It is.
AUDREY: I think tracking is our theme this week.
CHRISTIE: Yeah. I started thinking about whether this app could be used by an abuser. Like, could they set the timer? Could it be turned around on you? But then I started thinking, well, there are a lot of other apps that are more direct about it; that’s what they’re designed to do.
AUDREY: Yeah, for sure. Things that are intended to track or monitor in some way. And this sounds like it’s meant to be something that the user can install really discreetly. And I know we had a previous article about cell phones and safety, and the author had talked about some of the techniques that people use, like maybe setting up something through a messaging app they already have, so that it doesn’t look like they’ve installed anything new.
AUDREY: But I like the idea that this thing pretends to be a calculator. I think a Words With Friends type deal would also work really well, like, “Oh, it’s just a new game that I’m playing.” Make it look very innocuous.
CHRISTIE: Because that’s come up with, like, “Oh, you use Signal for encrypted messaging.”
AUDREY: If somebody is monitoring your phone really directly then they’re going to have an idea of what’s going on.
CHRISTIE: So speaking of tracking. You’re right, we do have a theme. Amazon’s got a new patent for a wristband to track the hand movements of warehouse employees. And it can provide haptic feedback.
AUDREY: I think that’s where we’re like, “Oh, that’s creepy.”
CHRISTIE: Yeah. And so, you could think of the haptic feedback as like, “Hey, you’re near the wrong bin. You’re putting something in the wrong bin or taking from the wrong bin,” and other things like break time, or I don’t know. Think of all the things that make your phone buzz.
AUDREY: Sure. I’m imagining it a little bit like those restaurant coasters that they give you when you put your name on the list for a table. Even if it just does sort of a light and an alarm, a “hey, get back to work after your break” kind of thing, it just sounds so invasive. Like, even if there’s something that they can get from it in terms of warehouse optimization, the idea that it provides direct feedback to the user means you’re changing their sensory experience, you’re changing their ability to interact with the warehouse.
CHRISTIE: Right. So I have no doubt that they’d also be tracking the movements and looking at different optimizations. But yeah, that ability to…yeah. And Amazon has demonstrated before that these things are not really consensual. Like the searching of workers when they leave the warehouse to look for theft. People have to do that after they clock out. And so sometimes, they’re waiting in these really long lines at shift change, and there was a whole lawsuit, which Amazon won. So, Amazon has the power to just impose this on employees, and they don’t have much recourse.
AUDREY: Right. And there are definitely places in the country where this is the best work you can find, Amazon warehouse work. And definitely people for whom this is the best work that they’re going to be able to find and that locks you in. That doesn’t give you the choice to just walk away from something like this.
CHRISTIE: Yeah. And the legal frameworks, when they are there, protect the employer’s ability to do this, to track their employees.
CHRISTIE: For technology workers, it’s software on their computers to track everything they do. You’re right about changing the sensory experience. And then also just like, where your hand is, is kind of personal.
AUDREY: The article mentions tracking badges. Make sure that you’re in the right area of the warehouse. I mean, we’ve had lots of workplaces that have some kind of access control in terms of needing your badge to get into the right area or just anything like that. But yeah, the wristband does seem just a lot more personal especially the kinds of things that it seems to be able to track.
CHRISTIE: Right. And I mean presumably, they can figure out if you’re close to a bin, they might be able to figure out like how many times you scratched your junk on your shift or something, which is like not really any of any employer’s business, right?
AUDREY: Sure. Or nose blowing, if you’re sick a lot.
CHRISTIE: Right. Or if you have allergies.
AUDREY: Oh, geez. Yeah.
CHRISTIE: I mean, we already know the environmental conditions in the warehouses aren’t great.
CHRISTIE: And I will say this is a patent. We don’t know that they’re implementing this.
AUDREY: But generally, you can’t file the patent until you can show how it works. So, in the case of hardware, that generally means that you’ve prototyped it.
CHRISTIE: Right. And why wouldn’t they? I mean, this sounds exactly like something that they would do. It does make me think that throughout human history, when a thing happens, there’s a reaction. And I wonder, like…sabotage is very much in human nature.
AUDREY: This is something that we talked about years ago at Wherecamp, about geographic tracking. We had kind of an ongoing session that someone would bring up every year about how to lie with your data collection. I mean, the idea that if you have GPS tracking on things and you have location tracking on things, lying is not just a matter of turning it on or off. Lying is creating a reasonable insertion of data that masks what you’re actually doing.
CHRISTIE: Misdirection, right.
AUDREY: Because yeah, if you go off the grid and you come back on the grid, well somebody can infer that you did something in between. So if you really want to lie in the course of that collection, then you need something else that you can put in there.
CHRISTIE: Yeah. In a spy movie, when you discover the tracker, you put it on another moving thing; you don’t throw it in the river.
AUDREY: Yeah, I don’t know. Like, in one of these sessions, we talked through the logistics for a while. If you did leave your phone on the bus and the bus just kept going, if you’re being monitored in that way and it’s ongoing, you need to get that phone back, which means that the actual route you’re taking and your phone’s route have to intersect at some point. And so, it’s a little bit more complicated than the spy movie, I guess, is where I’m going with this.
AUDREY: It’s something that takes a significant amount of effort.
CHRISTIE: Now I’m going to be thinking about different ways to circumvent that kind of tracking. And also, the thing with computing is it’s become much more real-time, and the speed of these things matters too.
AUDREY: Yeah, sure. If they all went back to the office at the end of the week and then they did some analysis on the workflow in the warehouse that they may change for next week, that’s very different from being able to get hour by hour or minute by minute evaluations.
CHRISTIE: And also, if the feedback is driven by machine learning too.
CHRISTIE: Like, maybe the manager viewing this stuff isn’t a human but a computer, and it has determined you’re not sorting fast enough or whatever, and you get even more of…maybe it’s not just a little buzz, maybe it hurts.
CHRISTIE: And also people have wildly different sensory thresholds for things, too.
AUDREY: Yeah, that’s part of what I was thinking about that this makes being a warehouse worker that much less accessible for people with sensory processing issues, and people who, for whatever reason, find that more than distracting.
CHRISTIE: And probably, I don’t know. I’ve found this the most disturbing of our tracking stories today.
AUDREY: Well, at least it doesn’t have the immediate short term impact of our next one.
CHRISTIE: I mean, our next one was the most disturbing.
AUDREY: Oh, OK.
CHRISTIE: That was my poor transition, Audrey.
AUDREY: Sorry. I don’t know if I’m awake enough for that.
CHRISTIE: No, I think I threw you a crap ball and you didn’t catch it. But that was on me for throwing a crap ball. So, Immigration and Customs Enforcement.
AUDREY: They’re awful. They are the scariest thing in the United States and that’s even with our health care situation.
CHRISTIE: I would agree. And they are now contracting with Vigilant Solutions, which is a company that sucks up license plate recognition data. And I think this actually dovetails, well, ‘nicely’ is a bad word, but it builds on what we were talking about with the Strava thing and sort of unintended consequences, because companies collect our data and then they sell it to other companies. It’s not just one layer or one entity. And so, there are all these different things across the United States that scan for license plate information and collect it. And then either they sell it or give it to this one company, Vigilant, and then Vigilant sells access to that database, and not just access but tools to search it, too. So they have this thing called the hot list, where you can upload a list of license plate numbers and get alerted when they show up places.
AUDREY: So, if you’re trying to track everybody that you have a warrant on or something like that, then you could find out if they’d show up in your jurisdiction.
CHRISTIE: Or if you are trying to track the known associates of someone.
AUDREY: Oh, sure.
CHRISTIE: I listened to a podcast, and I need to do better at bookmarking these, that talked with someone who works as a bail bounty hunter. So it wasn’t exactly immigration. But a lot of that work is done by looking at known associates, recognizing that humans are social creatures and we have to stay connected to somebody. So I think it’s important to understand that they’re not just looking for “bad guys”. The net they cast is much wider than that.
AUDREY: Yeah. I mean, this is what we talked about with Strava too. It’s not just an individual data trail; when you put them all together, you can learn a lot. How much you can actually get from that expands more than linearly.
CHRISTIE: And there’s time and location data correlated with this. So if you wanted to construct someone’s schedule, you could do that with access to this. And it’s a private contract, of course, so it’s not really subject to any kind of government oversight.
AUDREY: Yeah. That was the part of the article that really jumped out at me. By contracting this out rather than developing it in-house, they affected how effective oversight could be.
CHRISTIE: Right. And that’s another pattern we see over and over again.
CHRISTIE: Earlier, we talked about the NYPD asset forfeiture database and how that was contracted out. Presumably, it was poor work that was done or something and they didn’t have access to it.
AUDREY: Here, I don’t think access will be a problem for the license plates. I mean, beyond just the problematic aspect of surveillance itself, we know that what ICE is trying to do is deport whole blocks of people. Not just people who have broken the terms of their immigration status, but anybody they can potentially sweep up all at once.
CHRISTIE: And they have specifically stated that their goals, like their quotas, are much, much higher.
AUDREY: Yeah, I find that especially disturbing.
CHRISTIE: And then I think when you combine that with the burgeoning attempts to redefine citizenship, it’s even scarier. I mean, I don’t think we should be deporting undocumented immigrants on this scale anyway, but also I don’t mean to say that it only matters to me if they start targeting citizens.
CHRISTIE: I think this is only the beginning for what they’re going to try to do.
AUDREY: There are just all these layers of vulnerability. Like, we’re the least vulnerable, in that we’re white and we’re…I forget what you call the kind of citizenship where you’re born here. We’re not naturalized citizens; we started here. And so the ways this can affect us are mostly in terms of search and seizure, proximity to border enforcement. But there are ways a lot of other people can be affected: naturalized citizens can have their citizenship reviewed, which is something immigration enforcement has looked at doing, and which data collection affects. And then we get into people being criminalized for helping or having contact with undocumented immigrants, people being criminalized for being in a sort of [inaudible] status, like DACA. So there are just layers and layers and layers of this stuff, and ways that surveillance can feed into it.
CHRISTIE: And there’s no sense here of due process or of unreasonable search and seizure. It’s like those concepts have not caught up to mass data collection. The precedent is that observations about you made in public are fair game.
AUDREY: But it’s one thing to have an individual set of staff go and write down license plate numbers outside of the 7-Eleven or whatever. It’s very different to do that than to have a networked computer system that photographs, analyzes, and shares that data.
CHRISTIE: And that’s where I think, if this is going to be successfully challenged in the courts, it’s going to have to be due process based or something like that.
AUDREY: Aside from commentary from the ACLU here, I didn’t see a lot more about what kind of limitations might be enforced or what kind of legal challenges could happen.
CHRISTIE: It’s another thing where the chain of accountability is so stretched out and obfuscated. They’re collecting license plate data from all these different sources, so each individual one of those is just a tiny link in the chain, and it doesn’t necessarily make sense to attack any one of them. But then it’s going through this private company and then to the government. It really shows you how insidious the relationship between the private and public spheres can be.
AUDREY: There’s this sort of data laundering that happens. It’s completely reasonable that your tollbooth system photographs license plates so that you know who has paid and who hasn’t. But if you’re holding onto that data and you’re selling that data, then it becomes something else.
CHRISTIE: [Inaudible] it’s also accountability laundering, I think.
CHRISTIE: You mentioned the thing you were talking about at Wherecamp; this is even more on target. So, do we just make it a habit to put in random side stops? Do we start running random errands in different parts of town just to dirty the data, add noise to it?
AUDREY: For somebody who’s being tracked, like in the case of people who are helping undocumented immigrants, tracking is how ICE gets to the people it deports. So it’s not like you can just opt out. If you’re driving that car, are you going to just give up your car, or change cars every day to disrupt the tracking? That sounds really difficult. How do you actually disrupt that when your home and your work are static things?
CHRISTIE: Hard problems.
CHRISTIE: Fortunately, we have things we like on the internet this week.
AUDREY: And not just enormous legal and regulatory difficulties and things that we can’t even figure out how to protest.
AUDREY: I’m glad that there are things that we can like.
CHRISTIE: So my thing that I like involves really awesome photos. Wiki Education ran a…or Wiki Science? I don’t know exactly who ran this. They ran a photo contest, basically, to encourage the creation and free sharing of science-related images. And they just announced the results. There are some really cool photos, including one of an eclipse.
AUDREY: Oh, I’m looking at it now. Yeah, this is really something.
CHRISTIE: And I know it’s Creative Commons, but I really want to see if I can get a high-res version of this or buy a print from him, because it’s so cool. It’s one of those multiple-exposure photos, so you can see the texture of the moon, the solar prominences, and the rays from the sun during a total eclipse.
AUDREY: Yeah, it’s very cool. I also like this one of the Ebola virus that’s after it.
CHRISTIE: Yeah. They had a category that was basically microscopic photos. So there’s a general category, the one with the eclipse, and if you click through from the blog post there are some other really cool photos, including one for People in Science, which was another category I thought was cool, and then Microscopy. There’s a colorized scanning electron micrograph of Ebola virus from a chronically infected African green monkey kidney cell; that’s the blue one. That’s amazing.
AUDREY: Yeah. There’s also a really up-close photo of a fly. Bad guys.
CHRISTIE: It’s a little spooky. It’s like it needs a Jeff Goldblum voiceover. There’s also a really cool colorized one, or maybe it’s not colorized, of pollen. Pollen is really cool looking close up.
AUDREY: Yeah, it is.
CHRISTIE: And then one of lice, or a louse. Is louse just the singular of lice, or are they different things?
AUDREY: Yes, singular.
CHRISTIE: I think it’s funny because it says, “Obtained from a post-doctoral fellow’s jacket.” I think that person shall remain anonymous. Like, “What is that on your jacket?” “Oh, let’s put it under the microscope and take a photo of it.”
AUDREY: That sounds pretty normal actually.
CHRISTIE: And then they have a category called non-photographic images, which seemed to be videos, and then image sets. There’s this really cool one of tornadoes.
AUDREY: Yeah, I saw that.
CHRISTIE: So, check those out. We’ll link to it. Some cool stuff.
CHRISTIE: What have you got?
AUDREY: My favorite thing is that another podcast that I think is super awesome got funded on Kickstarter.
CHRISTIE: Awesome. We talked about this before, The Racist Sandwich Podcast.
AUDREY: Yeah. And I think I first heard about them because they are, or were, based in Portland, at least one of the participants. And yeah, they just do these really cool interviews and cool discussions about food and racism, food and race. I find it super insightful. I love that they got funded. They went over their goal, and all of us who pledged at a certain level are going to get a little pin that looks like a burger, like their logo.
AUDREY: They’re pretty neat, yeah.
CHRISTIE: So this is to fund basically their second season? It says they’re going on the road?
AUDREY: Yeah, get outside of Oregon and get to see more people.
CHRISTIE: It’s exciting for them. Cool.
CHRISTIE: All right. I think we got our show.
AUDREY: Well, all right.
CHRISTIE: So, thanks everyone for listening. And thank you, Audrey, for joining me again this week. And we’ll talk to you all again soon.
AUDREY: Thank you.
CHRISTIE: And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to firstname.lastname@example.org or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving a message.
The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.