Preventing your product from becoming a tool for abuse, with Eva PenzeyMoog

Audio (mp3: 85.1MB, 35:23)

Published: 12 August 2021

First, enable no harm

Gerry Gaffney

This is Gerry Gaffney with the User Experience podcast. My guest today is a principal designer at 8th Light, where she designs and builds custom software and consults on safe and inclusive design strategy. She is the founder of the Inclusive Safety Project. She’s also worked as a domestic violence educator and rape crisis counselor.

Her book Design for Safety is recently out, and that’s the reason that I wanted to talk to her today.

Eva PenzeyMoog, welcome to the User Experience podcast.

Eva PenzeyMoog

Yeah, thanks so much for having me, excited to be here.

Gerry

I’ll remind listeners that as always, a transcript of this episode is at uxpod.com.

And I’d also like to alert listeners that this episode includes references to violence and abuse.

Eva, you write that ‘the harsh reality is that abuse and violence is a commonplace occurrence within families, friendships, workplaces, and intimate relationships.’

Why do designers need to be mindful of this?

Eva

Yeah. Because, you know, domestic violence is such a commonplace thing, but it’s something that in most societies we just don’t really talk about. And that includes at work. But it’s pretty much a guarantee that some of our users are going through some really, really intense situations in terms of their interpersonal relationships. It’s something that we’re not really taught to consider or think about when we’re designing the products that they’re going to use. So it’s important to consider these really sad, harsh realities and make sure that the tech that we’re creating isn’t going to become yet another tool in the arsenal of their abuser.

Gerry

Yeah. And you describe really a landscape of how technologies currently are used for abusive and coercive purposes. Can you tell us a little bit about those uses or rather abuses of technology currently?

Eva

Yeah. So throughout writing the book I was able to break this down into three main categories. Two of them are pretty straightforward.

There’s, you know, issues of location data that lead to stalking. Most people are pretty familiar with that, although there are lots of really sneaky ways that this actually plays out in terms of the tech that we use. And it’s a lot more than just slipping a Tile or some other tracking device into someone’s purse or car; there are all these really sophisticated ways that people can use Google Maps or different things that have any type of location data in them and share them with themselves, either sneakily or covertly or just out in the open, to stalk their partners.

The other one is surveillance, which has always been a thing; you can use cameras and different things to remotely surveil people.

But the rise in sort of everyday home surveillance technology has made this problem so much bigger, especially in the domestic violence context, where someone’s able to surveil their victim while they’re at work, while they’re away. And that person is not able to have privacy, especially when it comes to reaching out to their support network or things like seeing a therapist, much less trying to contact a domestic violence shelter or get that very real and essential support.

And then the third thing, which is a little trickier to get at, is issues of control when there are multiple users. And this one comes across in so many different ways. So for example, shared bank accounts are a big one. Financial abuse is really common; in something like 99% of domestic violence relationships there’s an element of financial abuse. And in a lot of ways, shared bank accounts are enabling one person to take control over another, because they often actually give one person more control. But then this also plays out a lot in IoT devices, the internet of things, where ostensibly there are multiple users, you know, every adult in the home has access to the Nest or whatever it is to change the temperature. But actually one person is taking control over the device and using it to surveil, harass and gaslight the other person, all these different things.

Gerry

Yeah. And it really struck me. I mean, the term gaslighting, I think, derives originally from that movie Gaslight from 19-whatever it was. [1944.] But it really applies so much to things like the Nest device. And when you describe a situation where somebody has been turning up the heating on a former partner, because they’ve still got access, or turning it down, or basically, you know, abusing the technology to exert that sort of control and, I guess, be deliberately cruel and abusive, there’s a huge opportunity there, isn’t there, within the internet of things?

Eva

Yeah, definitely. There are so many different opportunities. And gaslighting is one that is still, you know, a relatively new term in the grand scheme of the history of domestic violence: the ability to make someone really mistrust their own experience and their own mind, and the sort of psychological torment that goes on with that. Not being able to say, no, I had this experience and I know that’s real. You know, the ability for someone to take that sense away from you is very much being enabled by these products.

Gerry

And one of the things that struck me very forcibly when I was reading the book was when you talk about consent. You know, you say that most technologies and most apps, I guess, assume that once consent has been given the problem is solved, but you point out that that’s not necessarily the case, and you say ‘just as consent isn’t a given, it is also not a constant.’ Can you tell us a little bit about those ideas?

Eva

Yeah. There’s so much about consent, and how anti-sexual assault practitioners think about consent, that I think designers and technologists should borrow. The thing about consent in the past not equalling consent in the future is something that, you know, is pretty basic in terms of healthy sex practices, but I think we should definitely be borrowing it. So an example of that is sharing your location with someone on Google Maps. It was maybe safe in the beginning. You know, abusers don’t start off abusing on the first date, because they wouldn’t get very far, the person would just end the relationship. They ingratiate themselves. They make themselves a big part of someone’s life and, you know, build up a lot of dependence and emotional attachment and good feelings. And then they start to amp up the abusive behaviour.

So it might’ve been safe in the past to share your location with a significant other. And then, you know, fast forward a few years, things are a lot more abusive and it’s no longer safe to do that, but the person doesn’t remember that they’ve done this. And that’s just one example. And Google has attempted to fix this a little bit. They send an email after 30 days with a summary of people who you’ve shared your location with. Although a lot of people report not getting that; I’ve tried to figure out exactly how this works and it’s kind of unclear, like I think there’s some other stuff going on in terms of who actually gets this. I was just talking to a friend the other day about this exact thing, and she told me she had told a friend she had plans with, oh, I just left the house, be there soon. And her friend was like, no, you didn’t. Because she had shared her location with her, like, years ago. She didn’t even remember why she had done it, but she got totally caught. And she has never gotten this email from Google. So I’m not quite sure how it works, but they are making an attempt. But I think this is the type of thing where, why not, once a week or even every few days, have the person reconsent: hey, do you still want to be sharing your location? You can say yes or no. It might be a little tedious, but I think that small amount of annoying a user is definitely worth the trade-off when it comes to their safety.
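To make that idea concrete, here is a minimal sketch in TypeScript of the kind of periodic reconsent check Eva is describing. It is an illustration only: the LocationShare type, showYesNoDialog and the weekly interval are assumptions for the sketch, not any real product’s API.

```typescript
// Hypothetical sketch of periodic reconsent for location sharing;
// every name here is illustrative, not a real product's API.

interface LocationShare {
  sharedWith: string;   // who can see this user's location
  consentGivenAt: Date; // when the user last confirmed sharing
}

const RECONSENT_INTERVAL_MS = 7 * 24 * 60 * 60 * 1000; // e.g. re-ask weekly

// Placeholder for whatever in-app prompt mechanism the product actually uses.
declare function showYesNoDialog(message: string): Promise<boolean>;

async function promptReconsent(share: LocationShare): Promise<boolean> {
  return showYesNoDialog(
    `You're still sharing your location with ${share.sharedWith}. Keep sharing?`
  );
}

// Keep only the shares the user has recently confirmed or just reconfirmed.
async function enforceReconsent(shares: LocationShare[]): Promise<LocationShare[]> {
  const stillActive: LocationShare[] = [];
  for (const share of shares) {
    const age = Date.now() - share.consentGivenAt.getTime();
    if (age < RECONSENT_INTERVAL_MS) {
      stillActive.push(share); // consent is still fresh
    } else if (await promptReconsent(share)) {
      stillActive.push({ ...share, consentGivenAt: new Date() }); // reconfirmed
    }
    // If the user declines, or never answers, the safe default is that
    // sharing ends rather than quietly continuing forever.
  }
  return stillActive;
}
```

The design choice the sketch encodes is the safe default: consent that isn’t actively renewed lapses, instead of persisting indefinitely.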

Gerry

And you do, in a few instances in the book, point out specific tactics from a UI design perspective that designers could use. And I can’t remember whether it’s Google Maps or Apple Maps, where you show the little blue bar at the top of the screen that says, hey, you’re currently sharing your location with whomever.

Eva

Yeah, Google Maps does that. If you’re actively using the map, like to get directions someplace, and then you go somewhere else, you close the actual app but it’s still running, it’ll show you that little bar across the top. And it’s like, yeah, that’s great. Just have some sort of omnipresent design when you’re doing this stuff so that people always know. I think that would really be the ultimate in terms of safety, and probably also the most annoying for people who are just trying to share their location with their partner and it’s not a domestic violence context. But again, I feel like, for me at least, the safety pros definitely outweigh the cons of annoying the user.

Gerry

Yeah. I can hear, you know, product managers around the world sort of saying, valuable real estate is being consumed by this, what they would call an edge case. I mean, are these security and safety concerns edge cases, do you think?

Eva

Yeah, absolutely not. They’re definitely not edge cases. You know, in the US it’s one in three women and one in four men who have been victims of serious physical domestic violence. And when we broaden that out to include physical, emotional and psychological violence, it becomes even more common. And I was just looking this up, it’s the same in Australia, essentially. You know, there are not many places in the world where there’s significantly less domestic violence. There are definitely places where there are better resources and a culture that supports people more, but it’s a problem all over the world. And it’s so common. Like I was saying, those statistics are really startling. We can assume that around a third of our user base has been through this or will go through this, and we need to be planning for those people.

Gerry

In the movie Blade Runner Harrison Ford’s character Deckard says ‘Replicants are like any other machine, they’re either a benefit or a hazard. If they’re a benefit, it’s not my problem.’ But you’d reject that approach from a design perspective, I guess.

Eva

Yeah. I love that reference. Yeah, first of all, I feel like that’s such a reductive sort of statement, like, oh, it’s good or bad. You know, and in the movie, that’s not the case, it’s not so simple, and definitely in design it’s not so simple. I think, you know, with one exception, which is stalkerware, which is, you know, secret monitoring software you can put on someone’s device that doesn’t really serve any legitimate purpose other than to spy on them. Other than that, all of the tech products that I talk about in the book and sort of focus my work on are a mix, and they’re mostly good. Like, they serve these really useful purposes, and when used in the right context they’re incredibly useful and very helpful in people’s lives. But, you know, it’s never as simple as being purely good or purely bad, because abuse cases aren’t taken into consideration, and then they’re also sort of just passively enabling this really, really awful stuff.

Gerry

Going back to surveillance, which you’ve mentioned already. At one point in the book you write, ‘There are very few cases where secretly monitoring the digital activities of others is safe, or ethical,’ and that ‘we can start from a place of assuming that surveillance is unsafe and unethical rather than from a mindset that it’s a benign industry standard.’

I mean, that’s really interesting, I guess, in the case of, you know, parents monitoring their children. I mean, you would start off with parents monitoring a baby, right, and there’s a baby monitor in the room to check that the baby is doing okay or crying and so on. But then, once you’re monitoring them as they progress through life, and monitoring their arrival at various locations that you’ve predefined, and then, you know, monitoring their usage, isn’t that a norm, isn’t that what parents do with their teenage kids?

Eva

Yeah. It is. It is becoming much more of a norm. And I get into this a little bit in the book, but I’m certainly not an expert. But I’ve read a lot from experts, there’s a lot of expert psychology in this space, discussing the impact that that has, especially on teenagers, for whom, psychologically, it’s so important to have some level of privacy and be able to explore their identities. And having a parent watching very closely every step of that process can have a really big impact in terms of the development of a teenager. And then I’ve also come across instances where it doesn’t necessarily end when the teenager becomes an adult, like reading about college students whose parents are still monitoring their location and noticing when they leave campus and saying, like, why are you leaving campus on a Thursday night?

Like, shouldn’t you be in your dorm? And, you know, this person is 18 or 19, but their parent is helping with tuition. So they’re kind of like, well, I still get to do this because I’m still this important part of your life. So, you know, it doesn’t even necessarily end once the person is an adult, these sort of problematic surveillance things tend to continue on. And it can be really psychologically damaging, and then it can also be legitimately unsafe, especially in terms of LGBTQ kids and youth. I talk about this in the book, about youth who are being outed, you know, before they’re ready. And it’s especially dangerous when they’re being outed to a family that is not supportive, has biases and is gonna, you know, do something about it. There’s the risk of being kicked out of the home, and homelessness among LGBTQ+ youth is a really big problem. There’s a disproportionate amount of homelessness, and it’s largely to do with parents being unsupportive and forcing the teenager to leave the house. So in these situations it’s really important for the teen to be able to explore this stuff in private, without their parents discovering what they’re doing.

Gerry

I guess, even in a situation where the parents might be supportive of, you know, the gender diversity or whatever it is to do with the kid, I mean, it should be up to that person to choose when to disclose that and when to discuss that with their parents and their family members.

Eva

Yeah. That’s a really good point. Yeah. Very true. Like, these things are so personal, and even with a supportive family you want to be able to choose the time that feels right to you, and do it in the way that feels right to you, and have some agency over it. And if your parent just kind of sees your Google history or sees that you’re chatting about this stuff, that takes away so much of their agency and the ability for them to come out how they want to come out.

Gerry

I’m glad you used the word agency there, because it was kind of my next question or topic. I guess most people feel that their device, particularly when we talk about a mobile phone or a cellphone, most people see that as being very much associated with their own person. And there is a really strong association between that and an individual’s sense of agency. And once we start to surveil, we take that away, don’t we? Or we certainly damage that.

Eva

Yeah, definitely. Yeah, you’re really right. People really see their own personal cell phone as an extension of themselves, and in a lot of ways it is, because that’s the place where you’re, you know, putting your social media together, you’re researching different things. No two people have the same app configuration, it’s such a personal thing. And when people start to get into that, it’s just, yeah, incredibly disruptive and takes away a lot of agency.

Gerry

To change topics slightly. There’s a bit of a conflict between user experience design and design for safety, isn’t there, at times?

Eva

Yeah, there definitely is. Like I was talking about earlier, I think a lot of times, if we’re going to design for safety, then we’re going to be sort of bothering some of our users in a way. Like I was talking about with making it clear who you’re sharing your location with, or, you know, having the person reconsent at regular intervals. And for a lot of people that would be really frustrating. But like I’ve said before, I feel like we have to center the most vulnerable users first to keep them safe, even if it bothers some of the other users.

Gerry

And of course any user can become a vulnerable user, right, due to change of circumstances, I guess.

Eva

Yeah, totally. Yeah, right, it can flip. Like, you could be the person who’s annoyed and then suddenly you’re like, thank God that this alerted me. And something I want to point out that I really like as an example of this is lots of modern cars that have the sort of fancy interface in the middle. There’ll be something when you first turn on the car that says something about wearing seatbelts and not driving distracted. And it usually fades away after a minute, and if you want to be able to use the GPS or do something right away, you have to click a dismiss button. And I see this a lot when I borrow my mom’s car, it has this, and I’m always like, oh my God, it’s so annoying. Just let me turn on Taylor Swift already. And I hit the dismiss button. But that’s a place where I think the designers of that interface have made the decision to prioritize safety, reminding people that it’s really important to not drive distracted and to wear their seatbelts, with, you know, the very minor cost of annoyance. They’ve made that trade-off. So there’s a lot of precedent for this kind of thing.

Gerry

Talking about cars, you have an amusing, or perhaps salutary, story about a friend of yours who purchased a Tesla, and the ownership situation. Do you want to tell us about that? Because I think it’s quite interesting.

Eva

Yeah. It’s so interesting, and it’s so sad in many ways. And it’s not an abusive relationship, it’s two friends of mine. The guy is very into Teslas and he’s wanted one for ages, and their car, you know, is kind of on its last legs. And so his partner is finally like, all right, fine, we’ll get the Tesla, you’ve wanted one for so long. And they use her iPad to reserve it, you know, they put down the $100 deposit with her iPad. And then, months later, when they get their cool new Tesla, they find out, when they add the app on their phones, because it’s all controlled through an app, that she has way more power, and she has some extra features that he doesn’t. And he’s like, what the heck? We got this for me, I’m the one who was so excited about this. And she’s pretty indifferent to the whole thing. But because they used her iPad and her Apple Pay, in the background, without ever telling them that this was happening, much less giving them a chance to change it, it assigned her as the sort of primary user. So she has the power to add her husband or kick him out. Obviously she adds him, and it’s fine, but he’s very frustrated because he wants to be the sort of power user, he’s the one who is so excited about the Tesla. And they’ve searched far and wide for a way to fix this and they can’t find anything; apparently Tesla doesn’t really have a customer service department, or it comes and goes, according to things I’ve read online.

Gerry

Or it’s on Mars.

Eva

Yeah. That’s probably it. But yeah. So this is a situation where I think the safety issue is pretty clear, even though it’s not playing out here, this relationship is a healthy one where it’s not a factor. But, you know, in an abusive relationship, being able to cut off someone’s access to the car that you share is obviously gonna be a really, really big deal. And this is, I think, a really good example of when I say that centring the most vulnerable people, the most vulnerable users, is going to make a better experience for everyone. Like, if safety was considered in this aspect and they made it so that, actually, if a couple is sharing the car they both get access to all these features, it would be a lot safer, but it would also help a lot in this situation with my friends, where there’s not a safety issue but there’s this sort of other thing going on.
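To illustrate the alternative Eva is pointing at, here is a minimal sketch of a shared-vehicle access model in which every household member gets the same capabilities and nobody can silently remove anyone else. The types and functions (SharedVehicle, addMember, removeMember) are hypothetical, a sketch of the idea rather than Tesla’s actual system.

```typescript
// Hypothetical sketch of a symmetric shared-vehicle access model;
// the types and functions are illustrative, not any carmaker's real API.

type Capability = 'unlock' | 'start' | 'climate' | 'locate' | 'manageSharing';

interface Member {
  id: string;
  capabilities: Set<Capability>;
}

interface SharedVehicle {
  vin: string;
  members: Member[];
}

const ALL_CAPABILITIES: Capability[] = [
  'unlock', 'start', 'climate', 'locate', 'manageSharing',
];

// Whoever happened to pay the deposit doesn't matter: every member added
// to the vehicle gets the same, full capability set.
function addMember(vehicle: SharedVehicle, memberId: string): SharedVehicle {
  if (vehicle.members.some((m) => m.id === memberId)) return vehicle;
  const newMember: Member = { id: memberId, capabilities: new Set(ALL_CAPABILITIES) };
  return { ...vehicle, members: [...vehicle.members, newMember] };
}

// Removing someone else requires that person's confirmation (or an
// out-of-band support process), so one partner can't silently cut off
// the other's access to a car they both rely on.
function removeMember(
  vehicle: SharedVehicle,
  requesterId: string,
  targetId: string,
  targetConfirmed: boolean
): SharedVehicle {
  const removingSelf = requesterId === targetId;
  if (!removingSelf && !targetConfirmed) {
    throw new Error('Removal needs confirmation from the affected member.');
  }
  return { ...vehicle, members: vehicle.members.filter((m) => m.id !== targetId) };
}
```

The design choice here is that access is symmetric by default and removal involves the affected person, which is one way of centring the most vulnerable user without hurting anyone else’s experience.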

Gerry

The book includes several examples of tech that has enabled abuse or harm, from Tesla, which you just mentioned, to the Strava fitness app, and of course Facebook. And you’ve got some particularly harsh words for Amazon, whom you say is ‘leading the way with abuse-ready surveillance products.’

What does this all say about where we are going with 21st century capitalism?

Eva

Yeah. I’m so glad you’re giving me the chance to talk about capitalism, and Amazon in particular, because, yeah, I am pretty harsh on Amazon. With most of the companies in the book, like Strava, for example, I have an example about how they were facilitating stalking, and they got called out online and then they changed it, they fixed it. And it’s sort of like, okay, that’s good, thank you for being responsive. Wouldn’t it have been great if you could have figured this out upfront and just avoided the bad PR and the safety issues? But Amazon, I feel, is a company that does not really do that. They don’t really prioritize fixing things until it becomes a much, much bigger deal and there’s some safety stuff going on, which we saw with the Ring cameras not requiring people to create a unique password, which just enabled hackers to get in. And there are these really creepy videos of people hacking into Rings in children’s bedrooms and talking to them. But yeah, Amazon is rough in particular. And I think it is really illustrative of where we’re at in the current, I guess, hellscape of late-stage capitalism, which is just so rough, especially when it comes to tech. And the approach right now is, let them regulate themselves, which, as we’ve seen over and over again from other industries, just does not work. And I have something in the book, I’m really obsessed with the history of the seatbelt and car safety as a guide for this, about how the auto industry in the fifties and before was just allowed to self-regulate.

And it was really, really, really bad. So many people were dying unnecessarily, cars were so unsafe, until in the 1960s Ralph Nader, you know, had some activism around it and wrote a book, and it started to change. And then it took a bunch of laws for things to change. So I feel like Amazon is very similar to the auto industry pre-1960s, where it’s allowed to self-regulate, and that result is just really dangerous for users. There’s a prioritization of profits over user safety, and the result is that users are just being harmed so much. And it’s also preventable.

Gerry

Yeah. I guess, you know, well, it’s well and truly discredited now, but Facebook had that motto of ‘move fast and break things’, or whatever the expression was.

Eva

Yeah. Right. Which, yeah. What a way to guarantee that things are going to be dangerous. Yeah.

Gerry

You do have a section in the book specifically about designing for at-risk communities, and what we’ve spoken about so far is design in general, I guess.

And you have said, and I don’t have the quote to hand, but you say if your app can be abused, it will be abused, which, you know, was quite confronting in some ways. But you do talk then about designing for at-risk communities. And writing about designing for vulnerable groups, you reference traditional ways of engaging with users, the sort of participatory design and engagement that we would undertake, and you say, ‘When designing a product specifically for a marginalized group, the extractive nature of this process is exploitative, unethical and unjust.’ Would you like to talk a little bit about that for us please?

Eva

Absolutely. So I want to shout this out: a lot of what I learned about this comes from a book called Design Justice by Sasha Costanza-Chock, and it’s a great book. They talk a lot in the book about how these sort of mainstay, common design practices that especially, you know, UX designers use can be very exploitative. So for example, engaging the group that is going to use the end product and having them help you come up with the solutions and, you know, really define the problem and understand the problem to find the solutions. And then you sort of, you know, say, okay, thanks, bye, and you make the product and you package it up and then you sell it back to them for a profit. It’s one thing when, you know, that’s a social media app or something that people are using as part of their business at their work or whatever.

But especially when it’s a group that’s marginalized in the first place, it can be very exploitative, because you’re not giving that group any agency in the product, you’re not giving them control over the outcomes, and you’re not giving them any share in the profit that gets generated. And often all those things are only possible because you got the solutions from this group in the first place. And, you know, not participating with the impacted group at all is usually not going to get you a very good solution. So it really is important to work with the group at hand. But I think a lot of times, if you don’t do it right, it can be very exploitative, and Design Justice gets into this in a much deeper way. But I wanted to include it in the chapter about tech that’s specifically for at-risk groups, because, you know, if you’re a vulnerable group, and especially, you know, domestic violence survivors, who have already had their agency taken away over and over and over again in so many ways, it’s really important that a designer who means well isn’t just reproducing that experience and being another person who’s sort of taking away their agency.

Gerry

Tell us about your Process for Inclusive Safety.

Eva

Yeah. So the Process for Inclusive Safety is, I think, all the different practices that I’ve used over the past few years to try to get at this stuff, packaged up in an easy-to-digest process that designers and teams can use as needed. They might not need all the steps, or it might not make sense to do all of it all the time. But it’s a way to conduct research into the issue, to brainstorm novel abuse cases that haven’t been captured elsewhere but are possible with your product, to define, you know, who the abuser is and who the survivor is, and then to test out your product to make sure that you are preventing the abuser from reaching their goals and helping the survivor reach theirs.

And it has very specific estimates of hours, as well as very specific instructions on when you might do this in your process. And this was really important to me for a couple of reasons. First, because stakeholders are often going to be resistant, so there are also some tips on how to help convince reluctant stakeholders, how to show them that this isn’t going to be just a huge time suck. I think the most you might spend is about a week, and obviously, again, it depends on the company and the context and how big the product or feature is, but it doesn’t have to be this huge time suck and hugely expensive add-on to the process that you’re already doing.

And then it also gives very specific instructions because I was always really frustrated with like, okay, I’ve learned about how tech enables gaslighting and then the sort of advice is like to just ‘consider gaslighting.’ And I was always like, when, like when do you do that? What does it look like? I can’t consider gaslighting like eight hours a day. You know, that’s just not a realistic solution. So it was really important to me to provide these like very specific things that designers can take away.

Gerry

You mentioned a week there, and you talked about the amount of time. And I was surprised when I looked at the model. I can’t remember what the baseline number of hours is, but it’s only something like 12 or 18 hours or something like that, for the baseline version, right?

Eva

Yeah. Right. Like, if you want to do the sort of minimum amount, you still do all the steps, but, you know, maybe you limit your discussion time to two hours instead of six for brainstorming novel abuse cases, or you limit your research time. I think maybe 18 hours is what it is, but it’s not a lot. And I think being able to show your stakeholder exactly where this is happening and exactly what you’re doing is gonna make them more able to say yes to it than something more open-ended.

Gerry

Now, given that UXpod has got a bit of a bent for sci-fi, I’d be remiss if I didn’t ask you this question: what’s a Black Mirror brainstorm?

Eva

Yeah. So this is one of my favourite activities, and a lot of other designers have done similar things, but basically you’re trying to write an episode of the show Black Mirror. So you’re thinking about, you know, the tech at hand: what is the most out-of-control, just really ridiculous, it doesn’t have to be realistic, form of harm that might come from this piece of tech?

And these are actually usually really fun. And I always tell people it’s okay to have fun. We’re engaging with really intense stuff, but the Black Mirror brainstorm is usually where there’s a lot of laughing and really interesting conversation going on. And then that’s a really good jumping-off point for the rest of the brainstorm, which is, okay, let’s reel it in a little bit, let’s use this as a starting point to think of some more realistic forms of harm that might come from this product. But starting with a Black Mirror brainstorm is always a good way to get the juices flowing and to have some fun.

Gerry

Besides reading this book, which I think is a very important book, what concrete steps can designers and technologists take to ensure that they are giving due weight to avoiding harmful use of their products and services? Sorry, I know that’s kind of a big question.

Eva

Yeah. So I think being aware that, you know, you don’t know everything, and this is really important. Someone was asking me the other day, like, how can you know that you’re not making assumptions, because I have a whole chapter about assumptions. And I was like, there’s no silver bullet to be like, okay, I am now assumption-free. That’s just never going to happen. Or, like, I have now mastered designing for safety. I certainly don’t feel that way. So I think embracing the fact that it’s going to be a lifelong thing, and that you’re always going to have to kind of be on your toes and looking out for, what is the next thing that I’m not even thinking about? I’m sure there’s a lot of stuff out there that’s going to come out over the years that we aren’t thinking about now, where we’re going to be like, oh my God, I can’t believe we were designing without thinking about this stuff. So just having that self-awareness of, I know that there’s a lot that I don’t know, and I need to be open when it comes to learning about it.

Gerry

When you’re talking about assumptions, there’s a quote from the book where you say, ‘When it comes to designing products for sensitive groups, every foundational fact that seems to be a given must be researched and validated,’ which I thought was interesting.

Eva

Yeah. I think, yeah. I mean, there’s certainly a tendency, and I guess I’ll just speak to my experience as a White person. You know, you want to be inclusive of other races, so you’re like, okay, we’re going to make sure that we’re designing this to be inclusive of Black people, for example. But then it gets really tricky, because, especially if you’re designing for vulnerable users, it’s really important to be like, okay, but, you know, Black people are not a monolith. Like, Black women and men have very different experiences, and then LGBTQ Black people, that’s going to be different from the experience of an LGBTQ White person, because there are a lot of cultural things, and, you know, the way that the government might respond is going to be very different. So it’s important to not make assumptions like, oh, well, I understand, you know, LGBTQ people as a White person, because actually, no, it’s going to be really different with different race groups.

So, you know, understanding that you do have those assumptions, and then saying, okay, well, I need to validate that it’s actually going to be the same. And then through that research you would expose that, actually, no, there are all these different cultural things going on and different ways that, you know, the state responds to people of different races who are LGBTQ. So going through that, and understanding that you have a bias or an assumption, it doesn’t have to be as bad as a bias, but then having that self-awareness, like I was saying earlier, that you need to do some research to figure out if that’s true or not, so that you can sort of identify the ways that you’re making an assumption.

Gerry

You’re saying in the book that you’d like to see tech oriented educational spaces incorporate, you know, some sort of coverage of designing for safety in their curricula. Any sign of that happening?

Eva

Not so much yet. I am hopeful. I’ve been talking to some university instructors and professors about this stuff, about how to get this content in front of people, because it would be great if, you know, before you’re even in the industry, you’re talking about this stuff. And I’ve been talking to someone at a bootcamp as well. I think there’s a lot of interest in incorporating this stuff, so I am hopeful that that’s going to happen. And I just hope that it becomes a norm that people are learning about this stuff, because, and obviously that’s a bias I have, I think it’s very important and should be taught along with, you know, design thinking and user research interviews and all that kind of stuff.

Gerry

I’ll remind listeners that Eva’s book is called Design for Safety. And I think it should be an essential in any designer’s library.

Eva PenzeyMoog. Thanks for joining me today on the User Experience podcast.

Eva

Thank you so much for having me.