
Product Research Rules: An interview with Aras Bilgen and C Todd Lombardo

Gerry Gaffney

Download (mp3: 85.4MB, 37:20) Working as a team to improve the effectiveness of your product research





Transcript

Gerry Gaffney

This is Gerry Gaffney with the User Experience podcast.

I have two guests today. One has led the experience design and front-end development teams at Garanti BBVA, managed digital product teams at Lolaflora and Monitise, and worked as a UX planner at Intel. He teaches experience design courses at Kadir Has University in Istanbul.

My other guest has worked as a scientist, engineer, designer, professor and product manager. He’s the founder of ProductCamp Boston, and currently leads the product and design teams at Openly in Boston.

Aras Bilgen and C Todd Lombardo. Welcome to the User Experience podcast.

C Todd Lombardo

Thank you, Gerry. It’s great to be here.

Gerry

I should confess to listeners that at this point we only have C Todd and we’re going to hopefully splice Aras in later on.

And I’ll remind people that as usual, there’s a transcript available at uxpod.com.

Now the recent book that I invited you guys to talk about is called Product Research Rules. How did you come to collaborate on this book?

C Todd

Yeah, great question. Aras and I met when I was brought in as a consultant for BBVA; I did some facilitation work and some, I guess, design strategy work with them, and we hit it off. We very clearly had a similar mindset and a similar approach to things.

And then we met a few years later at Mirror Conference in Portugal, where we were both speakers. We were talking at the speaker dinner, and the topic came up of, what happens before you do a design sprint? What are the inputs? I wrote one of the design sprint books, the Google guys had their own, and was there something before that, that we could think about? The hosts of the Mirror Conference ran a bunch of design sprints with their clients, so I think that was the topic of conversation. And Aras and I were like, yeah, there’s something here. You do it this way, I do it this way, this is what I’ve done, what have you done? We knew we were doing things, but we didn’t necessarily have it codified in a way that was elegant enough for folks to use on a more repeatable basis. So we started to think about it a little bit more, and as we did, we were like, well, this is starting to become a list of methods.

And we didn’t really want to write that book. I said, well, you know, I’ve got a publisher if you do want to write a book, but I don’t know if this is the kind of book that they would want us to write, or that we would want to write. What we were both seeing, after enough conversations around it, was that there was a missing piece in the conversation. Yes, you have user research; yes, you have market research; and yes, you have your product analytics, and there are plenty of books that go deep on all of those. We didn’t want to write a book that was in-depth about that. We wanted to write something that says, how do you bring all these together? Especially for somebody who may not be a UX expert or a marketing expert or a product expert. From the product lens, you have to actually understand enough of all three of those to make good product decisions. So that’s kind of how we came up with it and started to work on it.

Gerry

Now, usually I highlight parts of a book that I’m reading, and often that will guide the discussion we have in conversations like this. But I had so many highlights in this instance; there were so many little gems in the book. So I thought what we’d maybe do is list the nine rules, since that’s the way the book is structured in any case, and talk a little bit about each of them. Some of them will have to get less attention, just given the time constraints. But before we get into that, we might try and define what we mean by product research.

Now this might’ve been one that Aras wanted to cover. So if so, we can just sort of skip it for now.

C Todd

I’m happy to talk about it, and maybe you should ask Aras the same thing and see where the overlap is. I referenced it a little bit earlier: there’s a lot of depth around user research, and that’s understanding the behaviours of your target users. Then there’s market research, which frames what those individual behaviours mean on a somewhat quantitative level: all right, are there a bunch of these people doing a similar thing, or having the same problem, on a grander scale? And what scale is that? Are there competitors there? How are those competitors solving the problem, or how are those competitors’ products doing? Do they solve the problem well, or not well? And then there’s a third element: if you already have a product in market, what are the analytics telling you? How do you digest that quantitative data and feed it back into influencing not only your user research, but also your market research and your ultimate product? It’s blending those three together that we’re really trying to coin as product research, because it’s more than just user, market or analytics. It’s really the three together.

Gerry

I think a number of people who have UX backgrounds tend to be weak on the qualitative stuff…

C Todd

Qualitative or quantitative?

Gerry

Sorry, on the quantitative stuff. My bad, yeah. And Aras has just joined us, so we’ll just let him in now and tell him that we’re nearly finished. He should be joining us momentarily. I just admitted him. Hey Aras, have you joined us?

Aras Bilgen

Hi, Gerry. How are you?

Gerry

I’m very well. How are you?

Aras

Good to see you. Thank you. I’m doing good. Okay.

Gerry

We kind of did the intros already, so that’s fine. And we were just talking about product research, and I just said to C Todd that in my experience quite a few UX researchers are weak on the quantitative stuff, and at that stage you joined us. I don’t know if you want to throw in your comment on that particular suggestion or not.

Aras

Okay, so let me step back. You’re wondering about the approach that designers take when they’re talking about quantitative versus qualitative research.

Gerry

It just seems to me that UX practitioners frequently are not as strong on the quantitative end of things. I don’t know if you guys agree with that.

Aras

I might have a mixed reaction to that. Some of them are extremely confident about quantitative methods. They know how to look at data, they know how to make sense of it, and they know when there are certain patterns they can’t make sense of, so they know when to work with someone who knows about statistics and data. But I would also say that there are some UX practitioners who feel that the numbers are only a part of the picture, and they may put a lot more emphasis on the qualitative side of things. They insist on finding someone and talking to people in situations where looking at the larger set of data would make more sense. So I’d say there are people on either side.

Gerry

All right, well, let’s get into the rules themselves, which is how the book is structured. Now, some of them we’re going to sort of skim over, I think, fairly fast. And some of them hopefully dig into in a little bit more detail.

And Rule 1 is “Prepare to be Wrong,” which I like. I guess that’s largely about having the right mindset. For example, you write that “ego is the biggest enemy of product research.”

Who’d like to talk to that?

C Todd

Aras, go ahead. I’ve been talking a bit already. You can take that one.

Gerry

Yeah. Todd talks a lot, man. It’s amazing. [Laughter.]

Aras

All right, thank you. I mean, it is also one of our favourite starting points; that is why it is featured at the beginning of the book. Failed research usually stems from someone saying, oh, I know better, I’ve been there, I have experience. Or even worse, some successful product teams think that because they have been right in the past, they will always be right in the future. And that is unfortunately one of the biggest things that creates that ego in the team.

With that ego you fail to learn; you don’t have that open mind. You can’t suspend your own thoughts and acknowledge that there may be other possibilities out there.

When you know that you may be making assumptions, when you don’t have that ego, then you can step in, ask the question and then just listen. Look at what your users are saying, what your participants are saying, what your data is saying, and then make more informed and humble decisions out of that.

C Todd

Being a little bit more blunt about the whole beginner’s mind, right? Beginner’s mind is to have that open mind, we’re saying, look, you’re going to be wrong. So we’re a little more blunt about it.

Gerry

Yup. And I guess that discussion of the ego trap leads fairly logically into the second rule, “Everyone is Biased, Including You,” which I guess is a salutary reminder to all of us.

C Todd

Yeah, absolutely. I think one of the hardest things is to call out bias, and to find ways to call out bias, because I think inherently we all think we’re not biased, but we are. One of the best things to actually help reduce that is to have somebody outside of your team come in and take a look. We told a story of Hope Gurion in that chapter, about how she stepped into a product team that was very biased and focused on a very small subsegment of their entire user base. They were making product decisions that were focused on really just 1% of their customer base, and it wasn’t a very diverse 1%. It wasn’t very representative of the entire base.

So she came in and called that out. We actually referenced a couple of studies from university research groups that basically validate that when you come into a situation with fresh eyes, you can say, oh, I think you’re missing something over here. That’s one of the ways we can help reduce that element of bias. But also, just acknowledging that there are biases and that you have bias can sometimes be powerful enough to reduce some level of bias. I don’t think you’ll ever eliminate it completely.

Gerry

And I think admitting to it in public, if you like, or within the larger organization that you’re working for, can be very empowering as well.

Rule 3, I thought, was very substantive. You say “Good Insights Start with a Question.” So can you tell us what’s an insight and what’s a good question?

C Todd

You want to take that Aras?

Aras

Sure. To us, an insight is a nugget that you can actually act upon. You can carry out research for very specific, you know, academic reasons, or you can carry out research for something almost like a local optimization hack. We don’t want to land on either side; we want to have something that can be useful for the entire product team, as well as something that would be useful for informing other decisions. And to be able to do that, you need to start your research endeavour with a question. Many teams start their research cycles with, hey, let’s just check out what the users think. Or they may be in love with a particular method; they might have read an article about the fact that everyone has personas, so they set out and say, oh, we need personas, or we need customer journey maps. Instead, you need to focus on what you want to learn, state that as an unbiased question, and then carry on the rest of the research to reach actionable insights.

Gerry

To put you on the spot, can you give us a specific example of a good question?

Aras

Well, I will pick the teleconference, you know, the remote work realm, as an example.

“How do people manage their home duties while they are taking a video conference?”

That is an open-ended question. It does not ask what the challenges are, because then we will only see the things that are wrong. It doesn’t ask what are the great features that we have given our users to manage their own lives, because that’s coming from wanting to be right; we want people to say, oh, I love your product. Instead it just asks: what is their experience? What are they doing? And then hopefully you will match that with an equally open mind and carry out that research.

Gerry

And this is a question that’s close to your heart at the moment, ’cause you’ve got a two-year-old that you’re juggling in the house at the moment, I believe. Is that right?

Aras

Yes, exactly. I am hoping that he’s asleep and we’ll see throughout our chat, if he’s not.

Gerry

Yeah. Now, you said during that answer, you know, we need a customer journey, or we need personas or something, and this is something we see so often with clients, I think. So tell us about the output trap.

C Todd

Yeah, we definitely see this a lot, and I think part of it is that we are often trained and judged on output; even historically, as an industry, we were judged on output, right? Manufacturing was, and even today their profits are based on how much output they create and can sell. And given the global market, if you can produce more, you can sell more. So more output is a good thing.

That creeps into a lot of digital product development: oh, if we can produce more things, that must be a good thing, right? So more features is better; more artifacts are going to help us understand, and we’ll get more and better features by doing that. So I think there’s a macro trend, a historical trend, of output being the thing that we do. And I can understand it; in many senses it’s very, very valuable, but you don’t want to miss the forest for the trees and only be focused on the output itself. You want to ask, well, what does that output help customers do? What does that output help our teams do? So if you’re just focusing on personas and journey maps as outputs, what are they going to help your teams do? Are they going to help your teams make better decisions, and better decisions about which aspects are really important?

There may be some very thin slice of that journey map that actually is good to focus on because yeah, this is the problem area that we need to understand. Cool. Now those things are put in more context and can help deliver an outcome, which is, Oh, we can help the team make better decisions to make better products. Okay, cool. What does that mean for this particular product in general?

Take Aras’ example around managing young children while on a video conference. If you understand that context better, then you can think about better solutions for it to drive better outcomes. So that’s the output trap, and I think there are a number of other angles on it. Melissa Perri wrote a book called [Escaping] the Build Trap, which talks a lot about that, like, you just go off and build stuff. There’s another book called Before You Code, which says, hey, before you even start coding anything, do these things to understand. So I’m glad to see there’s more and more of that out there, but yeah, we definitely wanted to touch on it here.

Gerry

Now I have to confess to a personal bias before I mention the next topic. I was amused that you have rather harsh words to say about NPS. You write that “NPS scores are at best worthless. At worst, they can be dangerously misleading.” Now I know that later in the book you’re a little gentler about NPS, but what’s wrong with NPS? I think Aras this is one that you wanted to touch on.

Aras

NPS has been around for a while, and I believe that the marketing around NPS is actually stronger than the academic evidence that shows it works as the only metric. Again, there’s nothing wrong with asking people what they think about a product, or whether they are going to recommend it to someone, but the NPS question has so many assumptions built in, and can have so much variance, that it may only give you reliable, actionable insights under certain conditions.

It’s far from being the universal single number to optimize for. And there are so many things that product teams unfortunately mis-associate with NPS that they put many optimizations towards making NPS better. Well, that’s managing a metric, that’s managing an assumption, and not necessarily providing a better product experience.

Instead of NPS, we recommend that teams use other methods to answer the particular questions that they have.

Gerry

Now Rule 4, and consequently chapter four of the book, is “Plans Make Research Work,” and this has great coverage of different research methods and when they’re appropriate. It also describes the screening and recruitment process. But there’s one particular aspect that caught my attention. I was interested in your thoughts on note-taking, you know, when there’s a facilitator and a note-taker working as a team. I know personally I tend towards the extreme end: I take these incredibly detailed notes of everything, and your advice is not to do that, but to be more organized and templated, I guess, in the approach. Is that right?

Aras

If you are able to maintain your presence and your attention towards your fellow participants and still take, like, verbatim notes, you know, all the better. The focus there is not on the note-taking itself, but on the conversation. And in terms of progression, working on a structured note-taking process, with certain templates and certain shorthand, even skipping certain areas, knowing that you can always go back to the transcript, will help you keep better notes. It will make that conversation a lot closer to the person that you’re talking to. And it will also make analysis better, because then you’ll only have the pieces that are worth analyzing, as opposed to things that may not have a lot of value.

Gerry

Now Rule 5 is one that’s pretty close to my heart. “Interviews are a Foundational Skill.”

So what makes a good interviewer? I think C Todd, you wanted to talk to that one.

C Todd

Yeah. I think interviewing and listening are really, really critical. So one, it’s a little bit of harping on, you know, you’ve got to have a good research question, but you also have to know how to ask a good question, preferably an open-ended question, and what I would call a targeted or calibrated open-ended question, meaning it’s not so broad. For example, take the example that Aras mentioned earlier around conference calls and childcare. Notice that it wasn’t just, how do you deal with video calls, and it wasn’t so broad as, how do you deal with childcare. Those are incredibly broad, and there can be a lot of elements around them. It was a little narrower: how do you manage video conference calls with young children at home? So there’s a level of focus there, and any good interviewer can do a similar kind of thing, asking a question that’s a little more focused, but still open-ended enough that it allows the participant to expound on things.

And also, once you’ve asked that question, it’s literally just, you know, shut up and listen, and then ask a follow-up question based on what you just heard. That’s the thing I see a lot: people might be able to create that good question, but then they’re not digging in and saying, oh, they said something really interesting in that response, I want to dive in deeper. Maybe it’s not something that’s on your script, right? The ability to go off script and dive in deeper based on what somebody just said is incredibly valuable. So, you know, the opposite of talking is really listening, right? It’s not just silence; you really have to be actively listening, and then acting on and reacting to what they say.

Gerry

It’s very hard, in my experience, for people to develop the ability to be silent, because our conversational norms are that we fill in the gaps, right? And a new interviewer will typically start to speak before their respondent has actually finished saying what they want to say.

C Todd

Yeah, exactly. It’s hard to do that, and we do want to fill in the gaps, and that’s where being okay with the slightly uncomfortable pause, the slightly uncomfortable silence, comes in. Somebody will keep talking… And it’s amazing, when you let that pause happen, what somebody will say.

Gerry

Do you think anybody can become a good interviewer?

C Todd

Sure. I think it just takes some time and practice. Like anything, you know, we’re not born anything other than human, and there’s a lot of potential in what we can do. So like anything, it can be trained and taught, and evolve; it may not happen overnight. Certain people may have more of a propensity for it than others, but I think anyone could.

Gerry

Now, you describe different conversational styles in the book, and you suggest that empathetic is the most appropriate for product research. I think Aras wanted to talk to this. Can you describe that type of conversation for us?

Aras

There are so many. As C Todd mentioned, once you open your mouth as a human, you start different types of conversations. Some of them are super leisurely; you just say, hey, what’s going on, how’s it going, and you don’t really have a goal other than entertaining yourself and having a good time. Sometimes, especially when you are in a product role, you open your mouth and you start selling your product: hey, I know that this is going to solve that problem. In the book we cover some of these, and these are easy traps to fall into for people who are starting to talk to their users. In the beginning it is very hard to control our approach, and unfortunately there’s no one way to say, hey, this all has to stop. But eventually we want to switch from those conversational styles into an empathetic style, where the only thing that we care about is what the users are feeling, what they’re thinking, what they have been doing, and being open to their interpretations. Instead of trying to, say, crack jokes to break that silence, convince them that our product is good, or, on the other side, apologize for their bad experiences. Being empathetic doesn’t mean that we have to have pity towards them. It doesn’t mean that we have to be their psychological coach. It’s just being there, acknowledging their feelings and their thoughts, and being open to their interpretations of the world.

Gerry

Rule 6 is “Sometimes the Conversation is Not Enough.” And in this chapter you describe other techniques you can adopt during interviews. Can you describe some of them for us? I think C Todd, you wanted to talk to this?

C Todd

Yeah. Steve Blank talks about getting out of the building, and I think this is a great thing, and a little hard sometimes now in this pandemic era, where you have to be much more remote in many senses. We actually learned that ourselves in writing this book; we delayed publishing a little bit because of the pandemic and said, hang on, we probably need to think more about this, given this new way of the world where you’re not necessarily going to be able to travel and go to places and get out of the building as easily as before. So, get out of the building: what does that mean, right? I go out of the building, then what do I do? It isn’t necessarily go straight down to your coffee shop and start interrogating the next person you see, interviewing them about something that may have no relevance to them. There could be, like, ride-alongs. A colleague of ours, Michael Connors, who helped us with some of the writing and a lot of the design of the book, worked for a design agency that did a project with a trucking company, and their designers basically spent a day or a couple of days riding alongside the truck drivers to understand their world. They just spent a day with the truck drivers, you know, driving around in the truck, seeing what that was like, taking notes and understanding what was going on. Similarly, in my past role at MachineMetrics, managing the teams there, I had one of my designers go spend a day with a machine operator in a factory. He only did that maybe twice, I think, but the qualitative insights were amazing, because he spent an entire day getting to really understand what their daily journey was: how do they interact with our product, but also how do they interact with everything else on the factory floor?
And you start to realize that your product is usually only a very small slice of somebody’s entire day. It’s those kinds of things, going well beyond what an interview would tell you, that can help you better understand what your target audience’s, your target personas’ lives are like, and then draw insights from that. And it really starts to add to those empathetic conversations that Aras mentioned earlier: not being judgemental, not having pity on them, just really trying to better understand where they’re coming from.

Gerry

Yeah. I think you can get really profound insights from doing that sort of work that are just inaccessible to you in any other manner.

Rule 7. “The Team that Analyzes Together Thrives Together.” Tell us about this.

Aras

I’m sure there are many, many teams who have done good cycles of, you know, field work or data analysis and come up with their final presentation, only to be thrown back to the drawing board because someone who was listening to that presentation shared a very particular insight that no one had actually raised before. But if that person had been a part of the analysis, or even better a part of the research itself, things would have been a lot different. You would have had much richer conversations, you would have saved a lot of time going back and forth, and their perspective would probably have given you a lot more insight to enrich the other pieces of information that you gathered from that research. So do the analysis phase together: if doing the field work together is not possible, doing at least the analysis together is a secret weapon. It brings different perspectives together. It brings different realities of your product into the same room, or into the same virtual space, so that you can see the same data from multiple angles in a very short amount of time.

Gerry

And I guess very closely related to this is Rule 8, which is “Insights are Best Shared.”

And my experience is it’s very common for researchers to perform poorly at communicating their findings, which is kind of unfortunate. Any suggestions for how to approach this problem?

C Todd

I think Aras has what I consider one of the most brilliant ways of doing this. So I’ll let him talk about how he came up with this and it’s amazing. Like I was like, wow, what a brilliant idea.

Aras

So the problem that I faced when I was working more at the design level was this: we would come up with a question, we would do the research, we would come up with a few solutions, we would put them in a prototype and then, you know, distribute it. And everyone, instead of looking at the research report or research presentation, would click on the prototype and say, hey, I didn’t like this, without really understanding why we did it that way. So instead we chose to narrate the prototype. We chose to distil the research results into super specific pieces, maybe so short that they would fit on one or two screens, and we would put that at the beginning of the prototype. So whoever clicks on the prototype first has to go through a version of our final research presentation, and then start using the prototype. That becomes a very easy way to consume relevant pieces of research, and it doesn’t create overhead or any additional maintenance over your research artefacts. And I’m glad, C Todd, that you were able to use that in your work as well.

Gerry

Now, Aras, you also mentioned in a conversation previously that Facebook and Spotify teams have been using small museums. Can you tell us a bit about that?

Aras

Yeah. So in the book we mention that one of the worst ways of sharing research is to actually write a report. Writing a report is hard; not many can do it well. And especially in this time where everything is quote-unquote agile, where everything is super fast, people, especially people in decision-making roles, rarely have time to go through seven-page reports. So instead we recommend doing presentations. And we don’t mean take your research report, paste it into a PowerPoint document, and then send it as a PPT.

It’s an actual presentation: tell a story, share those moments, make sure that you’re actually reaching the audience that you’re presenting to. If you’re presenting the same research to operations teams, to software development teams and to finance teams, your presentations should be different.

The teams at Spotify coming up with museum-like experiences for sharing their research, which I heard about at the UXinsight conference, are an example of this. They wanted to share the experience of people who listen to Spotify on their way to work, for example. So what they did was create a New York scene and a Stockholm scene, which are very different from each other, and then have people actually experience how people would walk through that city in the rush hour. What are the other things that they hear outside their Spotify playlist? What interrupts them? When do they have to pause? When do they have to stand between people? Those are things that you would not get from a research report; they need to be more experiential. And it is very important to bring those aspects to your sharing, so that your research becomes relevant and actioned.

Gerry

Rule 9 is “Good Research Habits Make Great Products.” C Todd, how can a person or a team develop good research habits?

C Todd

Habit formation is interesting, and tricky and not tricky all at the same time, in that if we want to form a new habit, first we have to have the motivation to be able to do it. But also we shouldn’t try to… you know, it’s like, how do you eat an elephant? One bite at a time. Don’t try to do the whole thing. Don’t boil the ocean, right? Don’t try to do everything at once. Change one little thing, change one little cue. James Clear talks about this in his book, I think it’s called Atomic Habits, about finding these little things that can cue you and trigger you into doing something slightly differently, and you can start to build on those over time. We tell an example in the book of my colleague Jeff Vincent, who at the time was working at Appcues, I think he’s moved on now. He and another colleague, Tristan Howard, basically set up this little habitual research practice. I think it was once a month. I can’t remember exactly when during the month, but once a month they were planning to do research. It was, I don’t know, the second Thursday or something like that, a particular day of the month that they would do this. And they would announce it to the whole team by Slack, and any time a customer would sign up, that Slack channel lit up. And of course with Slack, you have all sorts of fun little mechanics: Oh, I got a message. I can respond to it. I can give a reaction to it. I can comment on it, et cetera. So they worked on using those mechanics of habits, even just starting with the two of them, to say, Hey, we started this; if you want to learn what we’re doing, you can check it out here. You can see this stuff in the Slack channel. Oh, and if you want to attend some of these sessions, you’re welcome.
Here’s the Zoom link, feel free to hop on board. Or actually, it’s going to be down in the cafe; we’re going to have a couple of customers come in, you know, check us out. And so they were able to establish this. James Clear talks about four steps: there’s a cue, there’s a craving, there’s a response and a reward. And those are the things that basically form a habit. So in this case, with Jeff and Tristan, their cue was the calendar month, and the craving was the Slack channel creating the anticipation. People were like, Oh yeah, cool, this is great. It would generate some buzz.

And then the response was when all the different team members would comment and share their thoughts on it, or even share, Hey, we’re going to do just some basic generative interviews, or we’re going to be doing some card sorting sessions with customers. And the reward would be learning what happened after: what was the outcome of that research? And they would share it in a similar fashion. So that’s sort of how they created this habitual practice, using the mechanics of an individual habit, but as a team. And I think it’s just one small example of how you can apply a similar thing in your organization.

Gerry

I’ll remind listeners that the book is called Product Research Rules: Nine Foundational Rules for Product Teams to Run Accurate Research That Delivers Actionable Insight. And it really is an excellent guide for any individual or team that’s looking to either up-skill or begin conducting product research. I thoroughly enjoyed it. And I would certainly recommend it, it’s very accessible and easy to read as well.

Aras Bilgen and C Todd Lombardo, thanks for joining me today on the User Experience podcast.

Aras & C Todd

Thank you. Thank you, Gerry, so much.
