Download: Episode 60
This week Audrey and I chat about AI and predictive policing, domain fronting, how Facebook does a lot of emotional labor for us, and more!
- [00:47] The Responsible Communication Style Guide is headed back to the printers! – The Responsible Communication Style Guide
- [01:43] Issue 9: Hard problems – The Recompiler
- [03:00] A pioneer in predictive policing is starting a troubling new project – The Verge
- [17:49] Blocking-resistant communication through domain fronting
- [19:37] Signal >> Blog >> Amazon threatens to suspend Signal’s AWS account over censorship circumvention
- [30:39] I tried leaving Facebook. I couldn’t – The Verge
- [40:14] Jive Software: An inspiration and cautionary tale for Portland tech | OregonLive.com
- [45:57] Emergency Response Guidebook app
- [48:57] Help Us Solve This Debate About What “IMHO” Stands For
The Responsible Communication Style Guide is headed back to the printers!
When we sold out of print copies of The Responsible Communication Style Guide last fall, we promised to do another print run in early 2018. We’re happy to announce that we’re ready.
If you’ve been waiting to pick up a printed book (or enough for the rest of the office so they stop filching your copy), this is your chance. Order now!
Issue 9: Hard Problems is now shipping!
You can still get your copy in the Recompiler Shop.
Now Broadcasting LIVE most Fridays
We broadcast our episode recordings LIVE on most Fridays at 10am PST. Mark your calendars and visit recompilermag.live to tune in.
We love hearing from you! Feedback, comments, questions…
We’d love to hear from you, so get in touch!
You can leave a comment on this post, tweet to @recompilermag or our host @christi3k, or send an email to email@example.com.
CHRISTIE: Hello and welcome to The Recompiler, a feminist hacker podcast where we talk about technology in a fun and playful way. I’m your host, Christie Koehler.
All right, we should be on air.
CHRISTIE: Hi, Audrey.
AUDREY: Hi, Christie.
CHRISTIE: Happy Friday.
AUDREY: Yay! Friday!
CHRISTIE: It’s May 4th. And it’s almost 10:15, running a little late today. This is the live broadcast for Episode 60 of The Recompiler podcast. How about some announcements, Audrey?
AUDREY: All right. We are still taking orders, preorders, for the second printing of The Responsible Communication Style Guide. And there’s still a couple of weeks left to get yours in. Once we hit our minimum order, we’ll send that off to the printers and have shiny new copies of the book for people to work with and pass on to your co-workers. So I hope that if you’ve been planning to order, you get your order in.
CHRISTIE: Awesome. And that’s RCStyleGuide.com. We’ll have a link in the show notes. You can also get to that from the Recompiler Shop, shop.recompilermag.com, where there’s all kinds of good stuff, including some updates and corrections from the first printing.
CHRISTIE: And yeah, so get those orders in. And then you’re shipping Issue 9, aren’t you? Or has it shipped?
AUDREY: I have shipped most of the copies of Issue 9 but there are a few left to buy and I am reserving a few for new subscribers. We’ve been hearing a lot of really positive feedback about it. This is our Hard Problems issue. So again, if you’ve been thinking about getting a copy, you’ve got just a little bit longer to do that before we run out.
CHRISTIE: All right. Any other announcements, Audrey?
AUDREY: Well, we’re going to have a special Kickstarter announcement next week, but we have to wait on that one.
CHRISTIE: All right. So stay tuned for exciting Kickstarter announcement for a new-ish project where a second…yeah.
AUDREY: Something that people may have heard about but we’re taking it forward.
CHRISTIE: Yes. I’m not entirely awake yet.
AUDREY: Well hopefully, by the time we get to the end of this hour, you will be alert. And I don’t know, maybe our next topic will wake you up a little.
CHRISTIE: Right. I sort of filed this under, I guess, algorithmic violence, which is something we talked about on a previous episode. And so this particular version is about predictive policing. It says, “A pioneer in predictive policing is starting a troubling new project. Pentagon-funded research aims to predict when crimes are gang-related.” Oh, boy!
AUDREY: Yeah, there’s a lot of things to think about there. So there’s this researcher at UCLA, an anthropology professor, who has adapted his Pentagon-funded research on forecasting battlefield casualties in Iraq to predicting crime for American police departments. Of course, he patented that research and founded a for-profit company. Isn’t it great how you can get government grants at a university and then create for-profit companies?
AUDREY: I think that there are entire departments or some departments that teach you how to do that.
CHRISTIE: I think there’s also legislation that mandates it. I always forget the name of that. I’ve got to go look that up.
AUDREY: But yeah, there are a lot of programs from what I can understand that teach academics how to do this.
CHRISTIE: And so, this Jeff Brantingham guy is spinning up a new project. He uses machine learning, LAPD criminal data, and an outdated gang territory map to automate the classification of “gang-related” crimes.
AUDREY: And I guess the idea is that that would change how they investigate those crimes and possibly the kinds of charges that they try to bring.
CHRISTIE: Yeah, the article doesn’t…let’s see. There are definitely different statutes that apply for gang-related crimes and sentencing and things like that. And it does talk about how if you’re identified as a gang member, you are subject to different potential restrictions and scrutiny. So they presented this paper called Partially Generative Neural Networks for gang crime classification at the Artificial Intelligence, Ethics, and Society conference.
AUDREY: It does definitely involve ethics and society.
CHRISTIE: So part of this project came from funding from the Minerva Initiative, which is a Pentagon research program that, it says, is intended to improve the military’s understanding of social, political, and behavioral drivers of conflict. And the thing that got me about this, Audrey, was that it says they’re testing the accuracy of this neural network’s ability to predict and classify crime data without one key feature: the narrative text description of the crime, which is the most time-consuming data for the police to collect. So that’s where that whole partially generative thing comes in. They’re basically using other data about the crime. I had to read this a couple of times because I was like, “Did I read this correctly?” And then it algorithmically writes a crime report based on the other three features in the training model. And then that generated text is fed back into the final prediction.
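To make the pipeline Christie describes concrete, here is a purely illustrative Python sketch. Both functions are made-up stand-ins, not the paper’s actual neural network; the only thing it reflects is the structure: a narrative is generated from the structured features, then fed back into the final classifier.

```python
# Toy stand-ins for the "partially generative" pipeline: generate a
# narrative from three structured features, then feed that generated
# text back into the final classifier. Nothing here reflects the real
# model; the logic is deliberately trivial.

def generate_narrative(crime_type, weapon, num_suspects):
    # Stand-in for the generative half: invents the missing
    # narrative text from the structured features alone.
    return f"{num_suspects} suspect(s) in a {crime_type} involving a {weapon}"

def classify(crime_type, weapon, num_suspects, narrative):
    # Stand-in for the final classifier: consumes the structured
    # features plus the machine-generated narrative.
    return "gang-related" if num_suspects > 2 else "not gang-related"

features = ("robbery", "handgun", 3)
narrative = generate_narrative(*features)   # generated, not observed
label = classify(*features, narrative)      # generated text fed back in
print(label)  # → gang-related
```

Even in the toy you can see the critics’ point: the generated narrative is derived entirely from the other inputs, so it can only echo whatever assumptions went into the generator, not add independent evidence.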
AUDREY: When I read that, the first thing that I thought of was the AI Weirdness experiments, like the generative D&D characters and some of that kind of stuff that Janelle Shane has worked on just for fun. It had a little bit of that idea that we could take incomplete information and fill in the blanks and something will come out. But there’s also a certain amount of really boring news writing that already does this, and I think the stakes are potentially a lot lower. But it does happen, like in financial reporting.
CHRISTIE: You mean text generated by computer?
AUDREY: Yeah, based on some already collected factual information. I don’t think using it for policing is the same thing in the slightest.
CHRISTIE: No. I feel like I’ve heard them doing that with sports reporting too. So there’s some issues with this. The first is that…I don’t know how to rank them but one of them is that they’re using an outdated gang territory map. So they’re using data from like 2014 to 2016 and a 2009 LAPD map of gang territory.
AUDREY: Right. So that just seems like it’s noise at best. It’s not going to be very useful.
CHRISTIE: And then of course, how was that map even created? Because I mean, I definitely know very, very little about this, but I would imagine that within gang territory, there are regular people that live there too.
AUDREY: Yes actually, that’s one of the biggest, most widespread problems that occurs with this kind of gang targeting. Just the fact that you live inside something that’s been designated gang territory can make you somebody who “associates” with gang members because those are your neighbors. And in policing, there often isn’t a clear distinction between active gang membership and the sort of incidental interactions.
CHRISTIE: So that’s part of it. Another part of it is once again, we have a potential where the model is basically just reinforcing the existing biased judgments. The people involved in the research don’t really seem to have a lot of introspection or questions about like should they be doing this work.
AUDREY: Well in any time we’re talking about using a neural net type process for generating information to use, like the traceability or the inspection of it is always an issue. If it generates a report and that report turns out to not be very useful or very accurate, you don’t necessarily know how it got to that. It’s a lot of work actually to construct that information. And so I mean, even if you see that it’s encoding bias in it and you want to do something about that bias, it may not be obvious how to change the outcome.
CHRISTIE: Yes. There’s some bits in here from this Christo Wilson who co-organizes the Fairness, Accountability, and Transparency in Machine Learning conference. It says, “If I train a model to predict people’s height, we know how to interpret the output and gauge its accuracy.” But Wilson noted gang-related is a complex subjective determination. “So the algorithm is accurate at predicting what? Whether LAPD officers will label a crime as gang-related. Now, maybe the LAPD is 100% objective in their determinations of what is and is not gang-related. But if they’re not, then the algorithm is going to reproduce their errors and biases.”
AUDREY: Yeah and I would say again that it’ll probably make it…the use of this would make it harder to do something about that bias.
CHRISTIE: Wilson…let’s see…later on, he talked about how the paper fails to incorporate documented approaches to evaluating biased outcomes in machine learning. And they talk about an algorithm that achieves statistical parity across races and ethnicities.
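For reference, the statistical parity criterion mentioned here is simple to state: the rate of positive labels should be the same across groups. A toy Python calculation with entirely made-up data:

```python
# Made-up predictions (1 = labeled "gang-related") and group labels,
# just to illustrate how statistical parity is measured.
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]

def positive_rate(group):
    # Fraction of members of `group` who received a positive label.
    labeled = [p for p, g in zip(preds, groups) if g == group]
    return sum(labeled) / len(labeled)

# Statistical parity holds when the positive-label rates match;
# the gap measures how far apart the groups are.
gap = abs(positive_rate("a") - positive_rate("b"))
print(gap)  # → 0.5: group "a" is labeled positive three times as often as "b"
```

A gap of zero is parity; anything else means one group is being labeled positive at a higher rate, which is exactly the kind of audit Wilson says the paper never runs.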
AUDREY: I’m especially interested in the map aspect of this. It seems like they’re saying, “Well, if we know what kind of crime it was and…” I forget. There were a couple of elements, like what kind of weapon or something else.
CHRISTIE: Yeah, it was type of crime, type of weapon and number of people involved, I think.
AUDREY: Which is interesting too. But maybe let’s say that they think that the more people involved, the more likely it’s a gang thing. But once you introduce a faulty map, you really are just scrambling the rest of that, especially looking at Los Angeles. Gentrification is an active factor; the way that people move through neighborhoods is really impacted. So it just seems like one of the weirder things that they could do with this. I wonder what a verification process would even look like, what an evaluation or review process would look like on this.
CHRISTIE: Right. I can’t find it now, but the article says how it’s just a recent thing that Californians were granted the right to appeal their status, their documented status as a gang member. It’s like there aren’t even oversight or appeal mechanisms for these things. This was an issue here in Portland recently. They talked about…I can’t remember the details of it, but they have these lists and they’re talking about getting rid of them. But did they really get rid of them?
AUDREY: I mean, they did officially discontinue the gang listing. What they did was created a second kind of listing that is not officially the gang listing but still holds on to some similar pieces of information.
CHRISTIE: Well, that’s nice. Okay.
AUDREY: Yeah. I mean, gang-related crimes are considered escalations in terms of charges and sentencing. So this has a really immediate impact on people who, again, just live in the wrong area. And one of the things I know historically in Portland is that the gang territories that we’ve had overlapped significantly with the areas where redlining occurred because the economic and social suppression are incubators for all sorts of things including crime. And so to police people for living in a historically disadvantaged neighborhood that also has other problems, it’s just like a way of just aggravating everything else that’s happened. It pushes people down, collectively pushes people down instead of giving the neighborhood a chance to repair itself and heal.
But the big reason I see why this research continues is not just kind of an “Oh, neat” factor; it’s that police departments always think that they’re understaffed.
AUDREY: They always think that they don’t have enough people and enough ability to investigate and charge people with crimes especially the sort of community crimes, I guess. And so lots of police departments are going to look at this and say, “Oh well, let’s get more done. It makes it easier for us.” But they’re erasing any kind of responsibility in the process.
CHRISTIE: And it’s another form of the militarization of our police too. This comes from Pentagon-funded research that was originally used on the battlefield.
AUDREY: Yeah, and whatever they’re trying to understand about terrorism. Gangs are not terrorists. It’s not the same kind of thing.
CHRISTIE: They’re not enemy combatants either.
AUDREY: No! These are your fellow citizens, your neighbors, people who live in your city with you. You may not be living alongside each other very easily or peacefully, but that doesn’t mean that that kind of othering is okay either. And there are lots of ways that cities can look at gang problems and deal with gang problems that don’t involve this.
CHRISTIE: And a lot of it is through economic development and opportunity development. That came up when we were talking about Palantir in NOLA, right?
AUDREY: Yeah. The kinds of things that community members and activists want to see happen do not look like pulling people aside continuously to inspect whether they are doing gang-related activities.
AUDREY: But yeah, I don’t know. I just think about the impact of this a lot with Portland and with our obviously very problematic history both in terms of racially biased policing and the stuff that happens around mental health and the way that the mayor and the police department keeps saying, “Well, we need a bigger budget. We need to hire more people. We’re understaffed. Nobody is really that excited about being a police officer in Portland,” which is great. But yeah, the solutions they look for are things like this instead of looking for ways that communities can care for each other, the neighborhoods care for each other that aren’t about putting more police officers on the street.
CHRISTIE: And for me I think another take-away from this is that if you’re helping to build this stuff, I think there’s an obligation to look at the ethical issues.
AUDREY: Yeah, it’s ridiculous to say, “Oh, we’re just experimenting.”
CHRISTIE: Or “we’re just an engineer.”
AUDREY: Yeah. Or it’s not ready for primetime. Somebody will take this live. Once you start showing how this can work and showing any kind of result, there’s somebody that’s going to go implement it.
CHRISTIE: Especially if you’re a guy who’s already started a company to sell some of your previous research like he was.
CHRISTIE: I learned about domain fronting this week. So I copied this bit from the paper. It says…okay, this is a little bit long but I think it’s a good explanation. It says, “We describe domain fronting, a versatile censorship circumvention technique that hides the remote endpoint of a communication. It works at the application layer, using HTTPS to communicate with a forbidden host while appearing to communicate with some other host, permitted by the censor.” And so you use different domain names at different layers of communication. So basically, you use one domain name at the layer that the censor can see and another domain name at the layer that they can’t see, which is the actual host. And we’ll link to the paper if you want a super technical explanation for how this works.
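A rough sketch of those two layers, using Python’s standard library and hypothetical hostnames (no request is actually sent): the URL’s hostname is what ends up in the DNS lookup and the TLS SNI field that a censor can observe, while the HTTP Host header rides inside the encrypted payload.

```python
from urllib.request import Request

# Hypothetical hostnames for illustration only.
FRONT = "allowed-front.example.com"       # outer layer: visible to the censor (DNS, TLS SNI)
HIDDEN = "forbidden-service.example.com"  # inner layer: the real destination, hidden by TLS

# Build (but don't send) a request whose connection target and
# Host header name two different domains.
req = Request(f"https://{FRONT}/", headers={"Host": HIDDEN})

print(req.host)                # what the censor sees on the wire
print(req.get_header("Host"))  # what the front's servers route to
```

Whether this actually works depends entirely on the provider: the front’s edge servers have to be willing to route on the inner Host header, which is exactly the behavior Google and Amazon moved to disable.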
AUDREY: The paper gives a lot of technical background on circumvention of censorship in general.
AUDREY: That I really appreciated. Yeah.
CHRISTIE: Yeah, I only read the abstract of it.
AUDREY: Yeah. I did read through some of the other sections. I thought it was really interesting, just the different kinds of approaches that they talk about, and specific places and the kind of censorship that occurs, and the strategies they’re using, what works, what can or can’t be evaluated. I didn’t know about domain fronting before we started talking about this this week. So I thought it was really interesting. It seems very clever that within an HTTPS request, there is this ability to do kind of a handoff.
CHRISTIE: And this came up this week because Signal has been using this technique to get around censorship in, is it Egypt, Oman, Qatar, UAE?
AUDREY: That sounds right.
CHRISTIE: I don’t know if that’s an exclusive list. So they’ve been using domain fronting through Google App Engine and the ability to use domain fronting relies on specific implementations by service providers. So you can’t just do this with every provider.
AUDREY: One of the things that the paper makes clear is that you have to be hosted on their services. So if you don’t have an AWS account, then you can’t use Amazon for domain fronting. You can’t use Google for domain fronting unless you have a Google Cloud services account. It has to happen like coming from within their IP addresses.
CHRISTIE: And the reason you would use something like Google or Amazon, aside from the fact that their implementation allows it, is that what you do is you basically make it look like it’s a request that’s going to their main service, which the censor presumably would not want to block because then it will block all kinds of stuff.
AUDREY: Yeah, blocking all of Google or all of Amazon probably is going to get a little bit more widespread backlash than just blocking Signal.
CHRISTIE: Right. It’s sort of…we’ve talked about that with Russia and Telegram.
AUDREY: Right. What a mess that’s been.
CHRISTIE: And so Google App Engine is changing the way that they do this.
AUDREY: Google or Amazon?
CHRISTIE: Google. So the reason they started looking at Amazon was because they got a notice from Google that internal changes would stop domain fronting from working. So then they started looking around at different options, and were looking at AWS. And then AWS got [inaudible] that they were looking at doing this. And that’s when Amazon reached out to them.
AUDREY: And if I remember correctly, it seems like the fact that Signal is an open source project was sort of a factor in this, that somebody saw the discussion that they were having about the implementation and posted it on, I think, Hacker News.
CHRISTIE: Yeah, it says, “We’re an open source project, so the commit switching from Google App Engine to CloudFront was public. Someone saw the commit and submitted it to Hacker News. That post became popular and apparently people inside Amazon saw it too.” And then they lay out how they don’t think they’re violating the terms of service because they’re not using SSL certificate of any domain but their own and they’re not falsifying the origin of traffic.
AUDREY: Of course, Amazon can kick them off of the service for whatever reason. But yeah, I don’t know. It just seems a little…I sort of wondered what the company’s motivation for stopping this is. Is it just that they think that they will really get blocked, or that it affects their ability to do business in the countries that Signal is circumventing? Or are they just sort of offended?
CHRISTIE: Well, they are making their own changes to CloudFront that wouldn’t allow domain fronting. I mean, it’s a hack. So, it does obscure the source of traffic.
AUDREY: But I don’t think it’s a security problem in terms of allowing somebody to spoof a domain.
CHRISTIE: What if it weren’t Signal? What if it were child porn?
AUDREY: Well, we could say that about Tor.
CHRISTIE: I do say it about Tor. I think it’s a really good argument. But that’s what I’m saying. That’s the tradeoff we have to discuss.
AUDREY: About circumventing not censorship in a specific sense, but sort of evaluation: a provider’s ability to evaluate the content that’s coming through and the traffic that’s coming through.
CHRISTIE: Right, meaning the hack that Signal was using could be used for something we don’t think is good. Meaning that censorship is valid sometimes.
CHRISTIE: People use domain services so their kids can surf the internet without having access to certain things that are age-inappropriate. And I think as a service provider, I could see as a service provider not wanting to allow this kind of stuff.
AUDREY: Yeah, I don’t know. I’m just thinking about whether this is a security thing, whether it’s just about how much we want service providers to be aware of what’s passing through. I mean, it’s not like it prevents AWS from finding out what Signal’s…from knowing that it’s Signal. They would be able to inspect where in their cloud service this was coming from.
CHRISTIE: Right, because the internal layer has the target host name.
AUDREY: So there’s a limit on what people could relay through it in that respect.
CHRISTIE: Right. I think it’s just as a service provider, how much do you want to be culpable in this kind of circumvention. It’s weird to me. I mean, I guess that’s what Tor is for. But I’m just thinking with TLS there’s always that someone has knowledge of which server you’re talking to. So it’ll never be totally private.
AUDREY: Yeah. And one of the things that this paper talks about with Tor specifically is that if all the exit nodes are known, they can all be blocked or all be monitored.
AUDREY: So they keep some in reserve. They keep some exit nodes that are not documented and that you can’t get a full list of from any single place so that they have some more options there.
CHRISTIE: There’s another thing that stuck out to me that they say at the end. They basically say domain fronting seems to be a technique that they can’t use anymore, so they’re considering ideas for a more robust system. They only have a few people, so if you feel like you want to help, they’re hiring. And so I went to look at who they’re hiring and there’s some interesting…but it’s like highlights: We’re not VC funded, we’re not a business. We’d rather think creatively than think about the dough.
AUDREY: But they have funding.
CHRISTIE: Yeah. I would rather critical infrastructure think of themselves like a business.
AUDREY: Or some kind of model that encompasses the concerns that businesses have like when they think about their whole scope of activities. Not thinking about the money does not actually make the money irrelevant.
CHRISTIE: No. Anyway, that’s all…
AUDREY: I mean, I think that Signal circumventing censorship is in general a good thing. And I think that…I don’t know, it seems like anytime we talk about encryption and secure communication, there are going to be people who use it for bad things. And I don’t know that trying to monitor communication is ever going to be, to me, a really satisfying solution, because I don’t think that there’s a single law enforcement agency, or a service provider even, that we can expect to differentiate between activist speech and what they consider terroristic speech, to always make a good and safe assumption about that. And so, I think we kind of have to pick which direction we want to lean. Is it toward more secure communication and more avoidance of censorship, or is it toward more monitoring and more open communication?
CHRISTIE: Well, I am with you on that but also I’m uncomfortable lumping the two things so closely together.
AUDREY: Which two?
CHRISTIE: I think censorship is a distinct concern from encrypted communication.
CHRISTIE: Because this domain fronting thing doesn’t compromise the encryption of Signal at all. What it does is allow countries, or rather closed networks, that don’t want people to have access to it, to block that access. Do you know what I mean?
AUDREY: Yeah. I don’t know. The reason that they’re shutting off the entire services is because they can’t see the communication.
CHRISTIE: Right. They’re related but just in terms of like looking for solutions. I think it’s helpful to just think about them as slightly different categories. I don’t want people who can access Signal now to think that somehow it’s less secure because of this issue.
AUDREY: Oh yeah, that’s definitely not the case. And there maybe…I mean, we’re talking about this from like what Signal can do. There may be ways that people inside say Egypt are finding, they’re finding ways to access that aren’t just about what Signal can offer.
CHRISTIE: Right. And I’m wondering is there a way to do the Whispernet thing as a more decentralized way. What’s the [inaudible] version of this, I don’t know.
AUDREY: I’ve seen a little bit of discussion of that but I haven’t looked into the details.
CHRISTIE: So Sarah Jeong tried to leave Facebook and couldn’t. The subtitle here I think is huge. It says Facebook is an emotional labor machine, and if you want to leave it, you’re going to have to start doing a lot of work.
AUDREY: I read it this morning and it definitely connected with a lot of things that I’ve seen, and it got me to reflect on how that emotional labor has been handled in my own life and in my family growing up, some of the things that she talks about, like, “Well, what did your parents do?”
CHRISTIE: Yeah, there’s a bit where she says we’ve probably forgotten or never noticed just how much work our parents did to maintain their social networks. It says basically, Facebook has picked up that labor. I really appreciate this, “It’s hard to pin down what Facebook is because the platform replaces labor that was previously invisible. We have a hard time figuring out what Facebook actually is because we have a hard time admitting that at least part of what it supplanted is emotional labor – hard invaluable work that no one wants to admit was work to begin with.”
AUDREY: Right. I mean, that is the most direct thing that you’ll see revealed if you suddenly stop using Facebook and a lot of people you know are still on there: that it leaves you doing all of the work that the platform used to do for you. And it changes your relationships because you have to go back to thinking about these as one-on-one interactions, to some extent, unless you all have a common gathering place where you catch up, whether that’s like a church or a bar or a soccer game. Without that, you have to think about these more as one-on-one interactions and maintain them really actively, in a way that the platform just handled for you.
CHRISTIE: And we don’t have a lot of those [inaudible] places where people…the amount of time people are spending at church and other places like that continues to decrease. And I think there’s also the thing where you set up an event on Facebook or whatever and you broadcast it. That’s high leverage, and you don’t have to do all those individual communications or whatever.
CHRISTIE: And I think that it’s not just the time cost in that but I think there’s an emotional or a fear of rejection or I think there’s a high emotional cost more than the time cost to just like calling everyone.
AUDREY: You mean like…so when I was in grade school, if you had a birthday party, you had to invite everybody in your class? I mean, it could be gender segregated, like invite all the girls, but it was just the expectation. And it’s much easier when everyone gets invited; nobody feels left out. And that’s why that was kind of a rule. So if you are having kind of a big open party and you put it on Facebook to all your friends, then you didn’t slight anybody. You didn’t make decisions about who’s in and who’s out.
CHRISTIE: And in reverse, too.
AUDREY: Yeah. And the alternative is that you do. You have to not just make those decisions and do that work but you all have to, I don’t know, like have a better social understanding of each other. Like if I don’t get invited to somebody’s birthday party, I have to think, “Is that because they don’t like me? Or is that just some other constraint that I’m not aware of?” But I have to think about it a little bit differently.
CHRISTIE: Yeah I miss out on a lot of things not being on Facebook.
AUDREY: When I deleted my account, there was a significant number of people who just dropped out of my life. And I’ve come to realize that, I don’t know, I’m not that upset about it anymore, because I think the people that I am close to and that I care about, we work to stay in each other’s lives. And I don’t know that we need thousands of weak ties in that way. Like I don’t know that that’s actually beneficial. I think part of what people talk about, that’s been studied, around using Facebook is that we sort of over-amplify those weak ties. And instead of people that you just sort of see once in a while and casually, we become kind of over-involved in things that we’re not actually that invested in.
CHRISTIE: And maybe shouldn’t be. I don’t mean that in a judgmental sense but just like…I think that all makes sense. This article also got me thinking about other successful products that have monetized or taken the place and therefore monetized emotional labor for us. And I actually think that my…I’m still thinking this out. But I actually think that might be a common thing in some of the platforms that have really monetized open source communities. But I think Facebook isn’t the only thing that has done this.
AUDREY: Oh yeah, definitely.
CHRISTIE: I think it parallels to like what GitHub is doing.
AUDREY: Okay. Just to go back to the personal and family history. For various reasons, there weren’t a lot of people in my family that did that kind of emotional labor very effectively. But my grandmother, my mom’s mom, was really the person that kept it all together. And what she ended up doing was creating a newsletter, like an actual physical newsletter. And she took submissions for it every month and printed it and mailed it out, and then it became kind of more of an e-mail thing. But she found a way to do that without it having to be…she also spent a lot of time on the phone and wrote a lot of letters. But that wasn’t the only way that we got information. It wasn’t just sort of a web of telephone calls. I think that that’s really funny, especially because there’s a certain set of us that have gone back to doing newsletters, have gone back to the personal newsletter. And I would be delighted to see groups of friends congregate around that, like our family newsletter that had a list of birthdays in the back every month and some reports on things that family members had done together and some running fiction.
I remember she had a rule that you had to write the first three parts of a serial [inaudible], you couldn’t just write part one. Yeah, just some very silly fun writing. I don’t know, the personal newsletters I see now, I think, are a little bit in that same vein.
CHRISTIE: We should make an app for that.
AUDREY: Well that’s why everyone is upset about the idea that TinyLetter might go away or get integrated into MailChimp because I do think that we need those kinds of tools and that at least in a world where most people have to work full time jobs, we aren’t going to find the ability to do all of the interpersonal social labor that keeps our communities together.
CHRISTIE: Right, because we’re stretched pretty thin. Most households have both parents working. Most of the adults in households these days I think are generally trying to work full time or some version of it.
AUDREY: And for me, to keep it manageable, to keep track of things, I do send out a weekly newsletter to friends and I do let Apple remind me when people’s birthdays are and try to email them something. I don’t send cards usually because that’s more work and I never get around to it. And I also have a thing of trying to think of somebody I haven’t seen every week and make a plan with them.
AUDREY: And that’s been really good for me the last year to think about it that way too, like who’s in town that I haven’t caught up with lately.
CHRISTIE: All right. So you wanted to talk about Jive Software. What is Jive?
AUDREY: So Jive was a company that created kind of an internal social networking tool. Hold on, let me open the link.
CHRISTIE: So things like Yammer or that kind of stuff?
AUDREY: Yeah, I’m opening the article. I don’t remember if it says specifically which companies use this. But they came to Portland (they weren’t actually founded here) in 2004. And in 2004 in Portland, there weren’t a lot of big tech companies. And honestly, Jive wasn’t a big company at that point, but it was able, especially as they got funding, to be a little bit of a bigger player, to have a bigger impact. And so this article is about how their office is finally closing. The company’s been sold, they’re closing everything down. People walked out with the microwaves. Anyhow, the article reflects on the industry effects that it had in terms of the speed of its growth, and the various managers and product managers and so on that came out of it and came to influence other companies. And the thing that I realized that I knew that wasn’t in here is that Jive, even though they weren’t really an open source company, had an impact on the open source/open collaboration aspects of what we were doing here. Because when they came to Portland, they said, “We don’t actually know Portland. We don’t know anybody, so why don’t we host things? Why don’t we sponsor things?” And they really just put themselves out there and hosted the meetups that led to having our first BarCamp, and were really supportive of some of that early growth in the mid-2000s.
CHRISTIE: Nice. Yeah, I think I was not super cognizant of that at the time.
AUDREY: When did you move here? What year?
CHRISTIE: Later that year.
AUDREY: So when they were getting that first investment, yeah.
CHRISTIE: Right. And then we had the recession.
AUDREY: But I think that’s about when I remember them kind of at their peak of involvement and hosting. They were hiring a lot. Obviously, tech companies sponsor because they’re recruiting. That was a big factor for them too. But like I said, they realized that they didn’t have ties to the community, so they did something that not every company does: they looked for ways to support and sponsor what was going on around them.
CHRISTIE: I think they were hosting stuff up until their acquisition. I don’t know if they continued to do that until they decided to close. I feel like they had a big exodus right after they got sold.
AUDREY: Yeah. And there was something about…
CHRISTIE: Monitoring software?
AUDREY: Yeah, the monitoring software that the company wanted to use. But yeah, I knew several people who worked there. They hosted user group meetings. The BarCamp organizing started off with just a monthly meetup for people who thought that running an unconference, for Portland to have kind of a big collaborative unconference, would be cool. And they gave us beer and snacks, and that fueled a lot of it.
CHRISTIE: Yeah, it just goes to show the importance of that kind of community involvement by companies. I just wish it weren’t so tenuous but I guess that’s pretty standard unless you specifically build an institution to do this kind of community work. And even then, that’s really hard.
AUDREY: Yeah, maybe it’s okay to see this as transitory.
AUDREY: And if other companies can learn from it, learn how they feed their community from it then that’s good.
CHRISTIE: Cool. Anything else with this?
AUDREY: I just wanted to say that I appreciate the role that they had in Portland. And it’s just interesting to see the shape of that.
CHRISTIE: There’s a related article on the same page, because apparently everybody does that now: Oregon’s tech industry appears to have plateaued.
AUDREY: Yes. Let’s see, a bit farther down: “Oregon’s software industry hasn’t provided the same comparative advantage to the state that the hardware sector did in an earlier era.” I know they have new numbers on this, but this is not actually a new story.
CHRISTIE: Right. I was going to say I feel like people are perpetually disappointed in Oregon…
AUDREY: …the growth of our software industry relative to the previous growth of our hardware industry?
AUDREY: Yeah, that’s something I’ve been hearing for 10 years or so.
CHRISTIE: Because we like our lifestyles here.
AUDREY: Not because the economy has changed drastically in the last 20 years, and you may not be able to expect what happened in the 1980s and 1990s to be reflective of what’s happened since 2000.
CHRISTIE: Okay. Things we like on the internet this week, what have you got, Audrey?
AUDREY: I have a HazMat app.
CHRISTIE: Hazardous materials?
AUDREY: Hazardous materials. So I went to a training this week. I have probably mentioned on the podcast before that I volunteer with Portland’s Neighborhood Emergency Team. We’re a group of trained disaster responders. We mostly focus on earthquake response, because Portland sits at this great confluence of fault lines, and there is someday expected to be a giant subduction zone earthquake that will rattle Portland and overwhelm the ability of professional responders like the fire department to handle what happens. So there’s a group of us, and we go through training. They give us all an orange vest and a hardhat, and we pick up our pry bars and get ready to go. One of the things that we don’t cover in very much detail in our basic training is hazmat, hazardous materials, because understanding what to do with different chemicals is a fairly complex thing. It’s applied chemistry. So I got to go to this all-day training this week that was sort of that first level, or next level, of awareness of what different kinds of materials we might encounter and how to find out what to do about them. So there’s a guidebook that every emergency response team, every fire department has. And you basically look up, either by the name of the substance or the number code that goes on the truck, what it is, and it gives you initial response information that tells you how far to evacuate. Not everybody can get a copy of the book necessarily, but you can download a free app.
AUDREY: So if you ever look at a rail car or a tanker truck and you wonder what is in that, there is a standardized system for this.
CHRISTIE: And what to do if it tips over and spills.
AUDREY: Yeah. The book, I really appreciated that. One of the things I love about emergency management and emergency response is that it is assumed that you can’t remember everything but what you can remember is your process. So maybe I don’t know exactly what to do if diesel spills. But I can remember the process of I pick up the book, I look up diesel, I follow the instructions. And so there’s a lot of materials created with that structure.
AUDREY: Yeah. And I figure there’s somebody out here. Somebody listening who will enjoy being able to look this stuff up.
CHRISTIE: We’ll link to that. I guess this is the thing I like. It’s something that amuses me because I feel like the internet provides ample opportunities for me to question what I know.
CHRISTIE: So this week, Audrey, the acronym IMHO, what does the H stand for?
AUDREY: Honest.
CHRISTIE: Oh my God! You’re kidding me! You’re an honest person? I know you’re an honest person. Okay.
AUDREY: It’s an old Usenet thing.
CHRISTIE: I’m on team humble. So does that mean you never use…what is that? So that probably means…did you ever use IMNSHO, “In My Not So Humble Opinion”?
AUDREY: No. I don’t think I did.
CHRISTIE: Right, because with “honest” it wouldn’t make sense.
AUDREY: And I am [inaudible] anyhow in my opinion. But the humble, I don’t know, it always seemed like false humility to me or maybe a little bit ironic.
CHRISTIE: I think you could say some other thing about saying honest.
AUDREY: Sure. They were just going for blunt, like sort of [inaudible] your blunt opinion.
CHRISTIE: But then you say TBH.
AUDREY: To be honest?
CHRISTIE: Yeah, I think that might be a later acronym. Anyway. At first I was like, “Oh, maybe it’s the younger people who think it’s honest.” And then people across a wider range were writing back to me: “No.” “Maybe it’s people’s relationship with the internet.” They’re like, “No.” Okay, all right. Anyway, I haven’t looked at the BuzzFeed poll since they closed it. But when I last looked, honest was winning by a lot, which is very confusing to me.
CHRISTIE: I’m looking to see if they closed it. Where is the poll? Oh my God, there’s too much stuff on this web page.
AUDREY: I think that’s BuzzFeed in general.
CHRISTIE: Yeah, I don’t know where the poll is. I’ll have to dig it up. They did a Twitter poll too.
CHRISTIE: So anyway, that’s my…
CHRISTIE: I think that’s our show this week. Thanks everyone for listening. Thanks Audrey for joining.
AUDREY: For commenting.
CHRISTIE: I’ll talk to you all again soon.
CHRISTIE: And that’s a wrap. You’ve been listening to The Recompiler Podcast. You can find this and all previous episodes at recompilermag.com/podcast. There you’ll find links to individual episodes as well as the show notes. You’ll also find links to subscribe to The Recompiler Podcast using iTunes or your favorite podcatcher. If you’re already subscribed via iTunes, please take a moment to leave us a review. It really helps us out. Speaking of which, we love your feedback. What do you like? What do you not like? What do you want to hear more of? Let us know. You can send email feedback to firstname.lastname@example.org or send feedback via Twitter to @RecompilerMag or directly to me, @Christi3k. You can also leave us an audio comment by calling 503 489 9083 and leaving us a message.
The Recompiler podcast is a project of Recompiler Media, founded and led by Audrey Eschright and is hosted and produced by yours truly, Christie Koehler. Thanks for listening.