Hello and welcome to the Bottom Up Skills podcast. I'm your host, my name is Mike Parsons, and I am the CEO of Qualitance. And we have come to a special moment, ladies and gentlemen: we have arrived at the last part of what has been a 14- or 15-part series on design thinking. Oh my gosh, it's been a journey.
And if you can believe this, the next series that I'm going to do for you is going to be on agile ways of working. So no rest for the wicked here, but we're going to wrap this up with an episode dedicated to how to share your product insights. Now, this goes to one of the biggest challenges we have when we're in [00:01:00] teams and we're designing, creating, building, whether it be a product or a service or an entire business.
The truth is, we never really build a product on our own. There's always a couple of people involved, or at least half a dozen or a dozen. And as soon as there are two people involved, there becomes a question of alignment: do people really get what we're trying to build? As you've heard me talk about so much, design thinking is not about guessing what's going to be nice. It's not trying to be the ultimate tastemaker with a fantastic aesthetic, picking something on a whim or as a result of your sheer brilliance. No, we love to work hard and serve users, to test and learn with users. But once you've done all that hard work, there's one final step.
And I'll tell you why I think this matters so much. I have seen [00:02:00] so many wonderful, creative designers, UX specialists, UI specialists, you name it, struggling to share what's inside their head, what they may have learned with users, with the broader team. The classic case might be between a UX or UI designer, a BA, or a developer, who are all coming at the challenge of the product in slightly different ways. So if you're doing any sort of design thinking work, sharing your testing insights is a key opportunity to get everybody on the same page. And what I hope to do now is to help you tell your product story, a product story rooted in user insights, user testing, learning, and validation.
And one of the key things we can talk about is: [00:03:00] how are we going to capture what we learnt? How are we going to put it in a vehicle, an asset? How are we going to present it? Now, one of the things you can definitely do is share your output, the sort of conclusions you can draw from your research, by updating your user personas, which you should have started at the beginning of any project and which should continue to grow and evolve throughout the entire process.
You should have something like a user journey map. Again, we've covered what that is in this design thinking series, so just go to one of the older podcasts; you'll see it in the list. Or you might even have something like wireframes, which can help you bring the product to life. If you've tested well, you can update those.
If you want to get a little bit more strategic, you could use a user matrix, an affinity diagram, or an empathy map. These are all classic [00:04:00] tools of design thinking gurus, who are trying to capture the learnings and the insights in a particular way. And, you know, there's quite a science and an art to picking the right one, but you should check out our masterclasses at bottomup.io, where we get a bit more into those.
And we talk about how to do them. If I was to pick one thing that would be really essential, one that I do on every single product I work on, I would use a value proposition canvas, where you essentially outline what pains are experienced by your users, what gains they're looking for, and what jobs they're trying to get done. That's all on the right-hand side. And on the left-hand side, you match, map, and correlate these with pain relievers and gain creators that define, determine, and inspire the feature set, the product set, [00:05:00] that you're working on. The value proposition canvas is so good because it's incredibly simple, but it forces you to map to the gains and pains your users experience: are you actually building gain creators? And if the user experiences pain, are you relieving that pain? I love the simplicity of this. The value proposition canvas, for me, becomes a simple, essential tool that any executive or designer or anyone in between can refer to and ask:
Are we getting it right? Do our features map to the jobs to be done? Do the gains really get created, and are we relieving the pains? The value proposition canvas, for me, is a great exercise because it makes you really work on the stuff you learned in your user testing, and it really makes you ask yourself: hmm, have I really validated the gains and the pains?
And it provides you with a really good baseline. Now, one of the things you might like to do, either before or after your value prop canvas: you might have conducted a big piece of research, and everybody might be really eager to learn, okay, what did we learn from this? For example, I'm working on a product where we've already done two rounds of quant surveys and two rounds of qualitative interviews, and we're about to do one of the first user tests, where we're going to test the first designs of the brand new product. Now, what's essential is that, whatever we get out of this upcoming test, we need to find a way to tell the story.
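To make the pains-and-gains mapping of the value proposition canvas concrete, here's a minimal sketch of it as a data structure. This is purely illustrative and not from the episode: the field names and the example entries are my own assumptions, not a real product's canvas.

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    """Right-hand side of the canvas: what we learned from users."""
    jobs_to_be_done: list[str]
    pains: list[str]
    gains: list[str]


@dataclass
class ValueMap:
    """Left-hand side of the canvas: how the product responds."""
    pain_relievers: dict[str, str]  # pain -> feature that relieves it
    gain_creators: dict[str, str]   # gain -> feature that creates it


def unmatched(profile: UserProfile, value_map: ValueMap) -> dict[str, list[str]]:
    """Pains and gains no feature addresses yet -- the gaps to work on."""
    return {
        "pains": [p for p in profile.pains if p not in value_map.pain_relievers],
        "gains": [g for g in profile.gains if g not in value_map.gain_creators],
    }


# Hypothetical example: one pain is still unaddressed.
profile = UserProfile(
    jobs_to_be_done=["book a service quickly"],
    pains=["slow checkout", "confusing pricing"],
    gains=["one-tap rebooking"],
)
value_map = ValueMap(
    pain_relievers={"slow checkout": "saved payment details"},
    gain_creators={"one-tap rebooking": "recent-orders shortcut"},
)
print(unmatched(profile, value_map))  # {'pains': ['confusing pricing'], 'gains': []}
```

The point of the `unmatched` check is exactly the question above: for every validated pain, is there a pain reliever, and for every validated gain, a gain creator?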
And so I've got a three-part approach. It's a bit of a model that you can break your presentation down into. [00:07:00] What I love about it is that it has a very clear separation of data from insights from recommendations. And the reason I use it is that it really helps create clear outcomes, and it helps everybody stay on track on why those outcomes are so important, why those next steps are important.
So let's dig into this three-part model of presenting your design and product ideas. First of all, data. For me, a great example of a piece of data is: 22% of our customers are aged 18 to 24. When you present data, you try to be as black and white, as clear, as binary, and as specific as possible. And you don't have to draw any insights or recommendations from it. I quite like the idea of just handing out the raw data. [00:08:00] If it was a survey, each question has a clear outcome, a clear table or chart that tells us the data. So just present data first. Then you have the interesting job of the insights. You might ask, well, what is an insight? You might say: it's fascinating that East Coast customers have a much stronger preference for mobile in this age group than those on the West Coast. That's actually a really, really powerful insight, because you might need to go and validate why there's such a discrepancy, since it's unusual. But that's what we'll get to in a second with the recommendations. So you've got the data, and then you can draw a really interesting insight from it. Now, one of the key things is that you're going to come up with lots of data points and insights.
If you're presenting the results of a big test or a big piece of product [00:09:00] work, I would challenge you to come with just 10 data points, 10 insights, and 10 recommendations, so essentially 10 ideas that run through those three vectors. The reason why is that I rarely see even the smartest people I've ever worked with able to hold their attention beyond that.
And that's a very good model. If you can get it under 10, even better. So we've got data, we've got insight. Now, your recommendations could be many and varied. I mean, we've been talking about this variance among 18-to-24-year-olds in a particular study. Maybe you want to go and experiment, try some different things: so let's do a pop-up store in New York and a pop-up store in LA. Or, alternatively, you might say we don't know enough and we want to conduct another study. That's the range of recommendations you could come out with. But by separating the data from the insight from the recommendation, what you have [00:10:00] is this: maybe someone doesn't like the recommendation, but they love the data.
And they love the insights, they think they're really good, but they have different interpretations of what to do based on them. That is totally awesome. What you don't want to do is say, hey guys, we should do these six things, and everyone's like, whoa, hang on, they're not tracking. So what you often find is that the reason people are not aligned, not supporting a recommendation, is that there's a break somewhere, usually either in the insight or the data. And sometimes, on really sophisticated products, you almost have to have an agreed approach to how you're going to look at data and insights and how you're going to focus them.
So that's what happens when you work on really big products. A great example: if you're building an app and it's going into, say, 10 or 12 English-speaking app stores, then you're like, okay, we really have to process this data across a [00:11:00] lot of vectors, a lot of different points and criteria.
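The data, insight, recommendation separation can be sketched as a tiny structure, here populated with the episode's own running example. A hedged sketch: the `Finding` class and the 10-item cap are just my encoding of the guidance above, not an official template.

```python
from dataclasses import dataclass


@dataclass
class Finding:
    data: str            # black-and-white observation, as specific as possible
    insight: str         # what's interesting or unusual about the data
    recommendation: str  # proposed next step drawn from the insight


def validate_deck(findings: list[Finding]) -> list[Finding]:
    """Cap the presentation at 10 findings -- attention fades beyond that."""
    if len(findings) > 10:
        raise ValueError(f"{len(findings)} findings; trim the deck to 10 or fewer")
    return findings


deck = validate_deck([
    Finding(
        data="22% of our customers are aged 18 to 24",
        insight="East Coast customers in this group prefer mobile far more than West Coast",
        recommendation="Run pop-up experiments in New York and LA, or commission a follow-up study",
    ),
])
# Each layer can be challenged separately: agree on the data first,
# then the insight, then debate the recommendation.
```

Keeping the three fields distinct is what lets a sceptical teammate accept the data and the insight while proposing a different recommendation.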
Now here's the thing. If you are presenting your ideas, you might want to start with the data, or you might want to start with a recommendation and work your way backwards. That's something you need to choose based on your audience. But the beauty of design thinking is that everything you have done has started with empathy for the user. So there's going to be a ton of testing you will have done to make sure that the data is strong and robust. You know, we'll see all sorts of questions around sample sizes: how did you interview? What was the task you gave them? All of those things need to be cleared up before you present your ideas.
So you have an agreed approach. And if you have this three-part approach, data, insight, recommendation, it's such a great practical way to talk about your product. It's a great way to say, look, we're all on the same [00:12:00] page. Now, a great example I have often seen in the earliest stages is: hey guys, we think we've got some pretty good validation of the core offering, the core feature. It really does address a pain; it's a pain reliever. However, there are some particular things about how it might work that we've still got to do a lot of work on. And for some of the other stuff, we have no idea at all where we're at.
That's okay. What you want to get to, by the end of your product work, before launch, is that everybody has really embraced the data and the insights and is fully on board with the recommendations. If people aren't adopting your recommendations, ones that are really rooted in design thinking, then just fall back. Peel it back: say, look, let's go back to that insight. And if you can't get anywhere there, go back to the data itself. Sometimes there might be misinterpretations of the data, or, what I often find is, hey, [00:13:00] we just don't have the data. So let's go out in the next round of testing and test those things. And all of a sudden you can create some movement.
Well, I hope all of this has got you moving towards design thinking, and I hope you're typing bottomup.io into your browser to get all those free masterclasses. I truly love sharing all of these concepts, mindsets, frameworks, and methodologies with you. This has been a wonderful series, breaking down design thinking, this huge thing, into bite-sized parts. And really, that's what we're about: bite-sized skills for designers, creatives, builders, entrepreneurs, you name it. So I hope you've enjoyed this final part of our design thinking series, and I hope you're ready for the upcoming agile approach: the agile software development masterclass.
Okay, that's it for [00:14:00] me on design thinking. I hope you've enjoyed the Bottom Up Skills podcast. That's a wrap.