
Design Better live in Austin at the UserTesting THiS conference
Jason Giles
Traditionally, if you request a research project or have to reach out to another organization to get an answer, it's a lot of friction. It could take weeks, maybe even longer. And that idea that it's right at your fingertips, I think that really could change the behavior of how companies are making decisions.
Aaron Walter
Jason Giles is tuned into the habits of successful product design teams, not only because he's been leading them for 15-plus years, but also because his team at UserTesting makes essential tools used by top design teams around the world. Jason thinks of collaboration, and the process around it, on his team as similar to a jazz band, where improvisation and exploration go hand in hand. That piqued our curiosity to learn more.
Eli Woolery
Jason joins us today for a special live episode recorded on stage in Austin, Texas at the UserTesting Human Insights Summit.
Aaron Walter
This is Design Better where we explore creativity at the intersection of design and technology. I'm Aaron Walter.
Eli Woolery
I'm Eli Woolery. And you can learn more about the show and listen to our conversations with guests like David Sedaris, Eileen Fisher, the band OK Go, and Pixar co-founder Ed Catmull at designbetterpodcast.com.
Aaron Walter
Jason Giles, welcome to Design Better.
Jason Giles
Thank you guys. So nice to be here.
Eli Woolery
Yeah.
Aaron Walter
Welcome to our studio here.
Jason Giles
Studio here. I know.
Eli Woolery
Just so y'all know, this is Aaron's actual office. He happens to have a really great record collection featuring a few things here.
Jason Giles
So I saw the Mingus up there. That's pretty slick.
Eli Woolery
Well, speaking of jazz, we've been talking with you for a little while about the work that you and your team are doing. And this is top of mind for us too, because we just interviewed this amazing jazz musician, Kamasi Washington. What we wanted to talk with you about is that you take a jazz band approach to your work with your team at UserTesting, one that favors flexibility and learning through doing. Why do you think this model works so well for you and your team?
Jason Giles
I've done this a few times at a bunch of different companies, and traditionally I think about this as: how do I scale insights across the organization? I'm a big fan of the people that are making the day-to-day decisions actually getting that direct customer feedback. So it's part of what I believe in. But I've tried a lot of different approaches, and effectively what I see is there are two major approaches. There's what I call the orchestra model. That's really good for maybe larger organizations with a little higher risk aversion, and it requires training individual designers or PMs to be really proficient at user research. And that's great. The other end of the spectrum is the jazz band. And that's good for environments that might have a culture of learn-by-doing, maybe have a higher appetite for a little bit of risk, maybe sometimes make some mistakes. We've played with multiple different approaches. I'm very thrilled with where we've landed, but we're continuing to riff. I mean, this is jazz; we're constantly refining.
Aaron Walter
So as a student of jazz, and having talked to a lot of jazz musicians, one thing that we've noticed is that jazz bands are very generous: they create a lot of space for others to collaborate and kind of explore on their own. Presumably, if you're taking this less structured approach where more people can be involved in the research process and the design conversation, that's going to have cultural implications. How do you see that changing the culture of a team?
Jason Giles
For sure. And sometimes there's outright resistance to having that kind of flexibility. One of the things that we've done is build it in, even to who we hire. It's part of the hiring process. When we're bringing in a designer, we set that expectation: hey, we're going to expect you to do this. For the designers and the PMs, part of what is expected as a competency in their career model is to be able to get their own customer feedback. So that helps from a structural perspective; you're bringing in new folks with that attitude. But then there's also the aspect of, you know, some of our research folks that weren't familiar with this, there's a little hesitancy as well. And it's like, hey, let's give it a go. I think the key thing for those folks is to really focus on the why. We have some amazing researchers, big throbbing brains. I don't want to apply them to evaluating prototypes or doing usability. We've got big problems. And so often the researchers are spending a lot of their time on stuff that just doesn't have the highest impact. So when I frame it in that way, that tends to get them on board, at least to try it. And then as you start seeing results, that kind of clears the way.
Eli Woolery
Maybe you could talk us through a little of the risks and the benefits of this improv approach. I imagine you get broader team involvement, but maybe it comes at a cost, perhaps less rigor or certainty. You mentioned larger companies that might be more risk-averse might have trouble with this approach. So tell us how you balance all that out.
Jason Giles
So I've been at UserTesting for five years, and when I joined, this idea of enabling non-researchers to get feedback wasn't new. The founders did that as they built the original product. And so I came in and was like, okay, this is great. But what I quickly learned, as I looked into design decisions that were being made, was: that doesn't look right. I had access to the tests, and as I cracked them open, I was seeing biased questions, and I was seeing faulty analysis in some cases. It was almost like they were weaponizing the research. So clearly just handing a bunch of instruments to a room of kids and expecting that you're going to make something good wasn't the right approach. And in fact, we then went to the other extreme, really into documentation and structure, where we gated activities behind certifications. And that also didn't give us the results that we wanted. So we pulled that back, and now we've gotten back to this place where there are enough guardrails and we're able to have oversight and see the results. But, you know, it took some calibration to get us to the right place.
Aaron Walter
So I founded the UX and research teams at Mailchimp, and I wish we would have had UserTesting back then, because it would have made a lot of what we were trying to do easier. We used hacks like Evernote to get all of our data into one place so we could search it. And what I noticed was that when we brought more people into the research process, when they could ask crazy questions and get an answer, that caused more people to ask questions about the product. Talk to us about how that works at UserTesting and how that influences innovative thinking.
Jason Giles
What we find is that the more folks are involved in the process, the more committed they are, either to solving the customer problem or just to having more engagement and including the customer as part of the thinking. And this has happened over the years, but there are these certain moments where I'll be in a meeting and I'll hear a tester or a QA person or a developer in a discussion say, well, I remember seeing in that customer interview that the customer did this, this, and this. And to me, that's this huge milestone. I'm like, yes, we're driving that kind of customer centricity through the organization. And they're excited and they're proud and they're more motivated to actually deliver the right experience. It's kind of critical in developing that culture, because if you're just keeping it within the context of your design or research or product team, you're leaving so much opportunity on the table. We talk about customer-focused companies. You need to think about the entire company: start from the middle and work out.
Eli Woolery
So part of the vision that's clear from the demos that were shown earlier is this idea that you can centralize your research in one place, whereas before it was very fragmented and you might not even be aware that a study was done over here or over there. With this new system, you can essentially tap into research from across the company, across many years. So how do you see customers mining that research and making it actionable, even if they're not on the research team per se, or are maybe in adjacent roles?
Jason Giles
Yeah, I mean, knowledge is power. Just being able to open up access, be able to ask questions, and know what's available. I know for my research team, we get asked the same questions a lot. So to be able to have a self-serve interface where folks can come in, but then also just starting to build that muscle in an organization of knowing that answers about your customers are just a click away. Traditionally, if you request a research project or have to reach out to another organization to get an answer, there's a lot of friction. It could take weeks, maybe even longer. And that idea that it's right at your fingertips, I think that really could change the behavior of how companies are making decisions.
Aaron Walter
So one thing that was alluded to in the product keynote today, and I have heard this so many times in product teams: that's qualitative data from 10 people. How do we know that that is factual, that this is enough to invest our time and resources into developing this or solving this problem? When more of the company has access to qualitative and quantitative data, do you see any change in perspective in how, say, product managers or engineers or even executives, who might have a more analytical approach to their work, think of qualitative research? Do they think of it differently?
Jason Giles
It's a good question. I mean, I think in the role of a designer or researcher or product person, you want to get a solution out to market. What does that require? It requires influencing the people who make the decisions. Traditionally in a business you're going to have a lot of folks that are very data-oriented. They're about the numbers; that's where they feel comfortable. They've been trained in understanding quantitative methods. You couple that with stories. I'm a designer by trade; I'm a big qual-at-heart guy. I like the stories because they bring that data to life, and I think the power is really in the coupling of those two together. I remember the first time, when I was at Microsoft, we had the formal labs, and I'm watching over the glass, and I watch a test participant start to cry because my design is so bad. That emotional response, I still remember what she looks like, you know. And these customer stories that we see, this is the power of the video and the storytelling that we all do as designers. Now couple it with the data, and depending on who you're trying to influence and really support, you lead with one or the other.
Eli Woolery
We've all made people cry from the designs that we've created over the years.
Jason Giles
I found out I'm not alone. Yeah.
Eli Woolery
So we've been running this series of workshops with companies. Some of them are tech companies that are actually working on artificial intelligence and generative AI. One insight we're getting is that even on those teams, they don't have a lot of time to play with these tools. Part of the exercise we run them through is just mapping AI to different parts of the design thinking process, and people often highlight risks, or maybe opportunities, where they want to use it. One thing that often comes up is this idea that maybe eventually the AI will be smart enough that you can throw a design at it and it can give you qualitative feedback and see where the errors might lie. Now, I think you may have already answered this question with your last story, but I'm curious about your thoughts, because it's clear that UserTesting over its history has had a very human-centered focus. You're actually watching humans interact with products, getting real-time feedback that's not just quantitative but very qualitative: are they frustrated, angry, delighted? So how do you make the argument that, yes, AI is a great copilot and tool, but we have to keep humans as part of the loop in the design process?
Jason Giles
Yeah. I mean, at the end of the day, it's human-centered design. That's what we all believe in, and it's how we need to be delivering products. And you can't take the human out of that, either with testing with real people or with the discernment, the creativity, the understanding that a human is going to bring to ideating solutions. That said, we can superpower our staff with these capabilities. It's not going to be long before, oh, by the way, the video happened to notice that this person, even though they said this, was uncomfortable. Things that used to be sci-fi we're now realizing with the advancements of technology; things are moving really quickly. So from my perspective, I'm seeing bionic researchers that are just, you know, superpowered. And at the same time, I think maybe it was Michelle who talked about this yesterday, just applying it to all the time-consuming stuff that nobody wants to do anyway. So I'm quite excited. As a designer, I'm a skeptic by nature, but I've seen enough, and I see the path forward, and I'm really, really excited about what's coming down the pipe.
Aaron Walter
So one of the challenges of winning the research battle, of getting more people in the company involved in it and asking questions, is that research teams are bombarded with questions. They get up to their eyeballs in tasks and they just have to start saying no. That can be frustrating, and it can keep the team focused on the short term: what are we doing this quarter, what's our KPI for this? And you lose sight of the big-picture thinking that is one of the great powers of a research team, looking at horizon two and beyond. How do you think about balancing those? Like, let's work on refinement and sanding off the edges of a product, but let's also keep our eyes to the horizon of changes in culture, changes in devices, that could totally disrupt a business.
Jason Giles
Oh, 100%. I mean, we were just talking about how technology is changing so fast. It's interesting, because we've been looking at and talking to a lot of folks in this room about attitudes around AI, and I will tell you, six months ago it was highly skeptical. And in a recent study that we just did, we're seeing sentiment of, oh yeah, well, we expect there to be an automated summary for this and this and this. I mean, it's happening quickly. That's all to say, you need your research team focused on changing behaviors and where things are going. For us to do that, we run our scale program, where we set a goal that 80% of any evaluative research is done by a non-researcher, to allow our research team to focus on roadmap planning and changing behaviors, which is what they're focused on now. And it was really exciting because in March we did an operations review, who's doing what tests and what type of tests, and we hit 81%. I had thrown that out there just as a high-level goal. It was pretty exciting to know that our researchers are now spending so much of their time on the stuff that really impacts the business at a larger scale.
Eli Woolery
So part of the potential for these products, as we've been discussing, is to democratize the design and research that happens within an organization. At the same time, you know, researchers go through some amount of training to understand how to effectively present insights or, you know, build a survey or whatever the task is. So how do you train people that are outside your research team to contribute without diluting the quality of the insights?
Jason Giles
I mean, this kind of goes back to the orchestra versus jazz approach. I think Caleb, awesome dude from U.S. Bank, was talking yesterday about their really awesome structured program, and it reminded me: oh, that's a great example of an orchestra approach. On the jazz band side, we tend to focus on micro-learnings, because we're not a huge team, so we can personalize it a little bit. It's like teaching kids how to learn music: start simple, celebrate wins, and then build out from there. From a quality perspective, we've set up a cadence and a structure, and I would say that's the biggest unlock we had versus other failures that we've had before. We have a really strong relationship between our head of design and one of our product leaders. They meet bi-weekly to review all the questions that are coming in together. They do a triage: this is tactical research, great; or we already know that, so we don't need research; or here's the big strategic stuff and we're going to have the researcher take the lead on that. Not only is that great for making sure the right people are looking at the right questions, but they have visibility into the results. So our research team, they're meeting twice a week, they see all the tests that are running, and they're doing the coaching. There are little pairings between a researcher and multiple design teams, so every designer and PM knows who their researcher is. But there's visibility into what happens, and there are course corrections: that was a leading question, okay, let's work on that next time with this particular designer. So it's a little bit high-touch. This is why I love the jazz approach. We figure out, oh, you hit a wrong note. Okay, let's refine that a little bit, but let's just keep going.
Aaron Walter
Anyone who plays jazz knows there's no wrong notes. It's just something you hit on the way to another note.
Jason Giles
Yeah, as long as you're not making a big business decision on that wrong note.
Aaron Walter
Well, we always ask all of our guests: what are you reading, listening to, or watching that's interesting and inspiring outside of your work? Because we're more than just our work; we're curious in other ways.
Jason Giles
Yeah, we are. I will say, a couple years ago I moved to Edinburgh, so I work out of our Scotland office, and we have a very geographically diverse team. I should have read this book five years ago, but I didn't; I finally read it just recently. It's The Culture Map. It was just a game changer, not only for understanding how people are processing information. I love managing, I love being a leader and designing teams, but understanding the nuances, not just the geographic differences but also the why: because the German education system teaches people to think like this; oh, because the Spanish, you know, because of their history. Anyway, I'm sure you've probably all read it. I'm late to the game.
Aaron Walter
I have not read it. This sounds fascinating.
Jason Giles
It is excellent.
Aaron Walter
Is anything influenced by the weather? I have this running theory that our dispositions are largely shaped by the weather.
Jason Giles
You know, it doesn't touch on that. I don't recall that being mentioned. But coming from LA and then moving to Edinburgh, I can tell you my disposition has changed dramatically based on the weather. So, for sure.
Aaron Walter
That sounds like a great book. Anything you're listening to or watching?
Jason Giles
Wish I had something like really exciting. You know, What I've gotten back into is so I'm a fake drummer by heart and I rediscovering tulle and I'm.
Aaron Walter
Just, oh, we got tool fans in the audience here.
Jason Giles
I'm telling you. And I'm also late to the game. I've got vinyl now. So that's what I do in cold Scottish weather: I come home, pour a whiskey, put some Tool on, and that's my life.
Aaron Walter
Jason, it sounds like you're winning at life.
Jason Giles
You know, I'm not going to complain.
Aaron Walter
Yeah, fantastic. Well, thank you so much for joining us on Design Better for this live episode. And thank you wonderful audience for being here with us.
Jason Giles
Thank you guys.
Aaron Walter
This episode was produced by Eli Woolery and me, Aaron Walter, with engineering and production support from Brian Paik of Pacific Audio. If you found this episode useful, we hope that you'll leave us a review on Apple Podcasts, Spotify, or wherever you listen to finer shows, or simply drop a link to the show (designbetterpodcast.com) in your team's Slack channel. It'll really help others discover the show. Until next time.
Design Better: Bonus Episode Featuring Jason Giles, VP of Product Design at UserTesting
Hosted by Eli Woolery and Aaron Walter
In this bonus episode of Design Better, hosts Eli Woolery and Aaron Walter sit down with Jason Giles, Vice President of Product Design at UserTesting. Recorded live at the UserTesting Human Insights Summit in Austin, Texas, the conversation delves deep into effective collaboration models, democratizing research, the integration of AI in user testing, and maintaining a human-centric approach in design.
Jason Giles introduces the analogy of collaboration models by comparing traditional team structures to orchestras and his preferred method to jazz bands.
Jason Giles [02:16]: "There are two major approaches. There's what I call the orchestra model... The other end of the spectrum is the jazz band. And that's good for environments that might have a culture of learn by doing, maybe have a higher appetite for a little bit of risk, maybe sometimes make some mistakes."
Giles explains that while the orchestra model suits larger, risk-averse organizations requiring highly trained designers and product managers, the jazz band approach fosters flexibility and improvisation, encouraging teams to learn through experimentation.
Aaron Walter highlights the cultural implications of adopting a jazz band approach, emphasizing generosity and collaboration within the team.
Aaron Walter [03:52]: "Presumably, like if you're taking this less structured approach where more people can be involved in the research process and the design conversation, that's going to have cultural implications."
Giles responds by discussing the challenges of implementing flexibility, such as resistance within teams and the importance of structural support during hiring and onboarding to instill a collaborative mindset.
Jason Giles [03:52]: "For the designers and the PMs in their career model, part of what is expected as a competency is to be able to get their own customer feedback."
The conversation shifts to the benefits of centralizing research efforts, making insights accessible across the organization.
Jason Giles [08:43]: "Knowing what's available... the idea that it's right at your fingertips, I think that really could change the behavior of how companies are making decisions."
Giles emphasizes that reducing friction in accessing research results in more informed decision-making and fosters a customer-centric culture throughout the company.
Eli Woolery raises concerns about balancing broader team involvement with maintaining the rigor and certainty of research findings.
Eli Woolery [05:06]: "Maybe you could talk us through a little bit of the risks and the benefits of this improv approach."
Giles shares his experiences at UserTesting, highlighting the challenges of ensuring quality while empowering non-researchers to contribute. He notes that striking the right balance involves setting guardrails and providing oversight to maintain the integrity of the research.
Jason Giles [05:22]: "We kind of pulled that back and now we've gotten back into this place where there's enough guardrails and we're able to have oversight and see the results."
The discussion explores the potential impact of artificial intelligence on user testing, questioning whether AI can replace the nuanced insights derived from human interactions.
Eli Woolery [11:28]: "One thing that often comes up is this idea that maybe eventually the AI will be smart enough that you can throw a design at it and it can give you more qualitative feedback."
Giles advocates for a hybrid approach where AI serves as a co-pilot, enhancing the capabilities of human researchers rather than replacing them. He underscores the importance of human discernment and creativity in the design process.
Jason Giles [12:34]: "You can't take the human out of that either with testing with real people or by having the discernment, the creativity that a human's going to have in ideating solutions."
Aaron Walter addresses the challenge of research teams being overwhelmed with ad-hoc requests, potentially diverting focus from strategic, long-term projects.
Aaron Walter [14:38]: "How do you think about balancing those? Like, let's work on refinement and sanding off the edges of a product, but let's also keep our eyes to the horizon of changes in culture, changes in devices that could totally disrupt a business."
Giles shares UserTesting's approach to mitigating this issue by enabling 80% of evaluative research to be conducted by non-researchers, allowing the core research team to concentrate on high-impact, strategic initiatives.
Jason Giles [15:49]: "We set a goal that 80% of any evaluative research is done by a non-researcher... our researchers are spending so much of their time on the stuff that really is impacting the business at a larger scale."
Ensuring the quality of insights from a democratized research process is crucial. Eli Woolery inquires about training methodologies that maintain high standards.
Eli Woolery [16:14]: "How do you train people that are outside your research team to contribute without diluting the quality of the insights?"
Giles explains that UserTesting employs a high-touch, personalized training approach akin to teaching music in a jazz band. Regular bi-weekly meetings between design leaders and product leaders help triage research questions, ensuring that only valuable and relevant insights are pursued.
Jason Giles [16:14]: "We do tend to focus on micro-learnings because we're not a huge team, so we can personalize it a little bit... We have a really strong relationship between our head of design and one of our product leaders."
In the concluding segment, Jason shares personal interests outside of work, highlighting his love for music and how relocating to Edinburgh has influenced his lifestyle.
Jason Giles [18:23]: "I've got vinyl now. So that's what I do in cold Scottish weather: I come home, pour a whiskey, put some Tool on, and that's my life."
The hosts wrap up the episode, encouraging listeners to engage with the show and share their experiences.
This insightful conversation with Jason Giles offers a deep dive into modern approaches to product design and research. By embracing a jazz band model, democratizing research access, integrating AI thoughtfully, and maintaining a strategic focus, organizations can foster a more innovative and customer-centric culture. Giles’s experiences at UserTesting provide valuable lessons for design professionals aiming to enhance collaboration, creativity, and efficiency within their teams.
Notable Quotes:
Jason Giles [02:16]: "There are two major approaches. There's what I call the orchestra model... The other end of the spectrum is the jazz band."
Aaron Walter [03:52]: "Presumably, like if you're taking this less structured approach where more people can be involved in the research process and the design conversation, that's going to have cultural implications."
Jason Giles [08:43]: "Knowing what's available... the idea that it's right at your fingertips, I think that really could change the behavior of how companies are making decisions."
Jason Giles [12:34]: "You can't take the human out of that either with testing with real people or by having the discernment, the creativity that a human's going to have in ideating solutions."
Jason Giles [15:49]: "We set a goal that 80% of any evaluative research is done by a non-researcher... our researchers are spending so much of their time on the stuff that really is impacting the business at a larger scale."