How I Tested That

In this conversation, David J Bland interviews Andi Plantenberg about her experience with testing and experimentation, particularly in the context of NASA missions. They discuss the importance of testing early stage ideas, the challenges of navigating a culture of perfection and safety, and the need for fluidity and adaptability in the face of uncertainty. Andi shares insights from her work with NASA, including the use of simulated mission control rooms and the iterative design of software. She also emphasizes the importance of systems thinking and the need to address complex problems in a holistic way. Overall, the conversation highlights the value of experimentation in driving innovation and addressing the pressing challenges of our time.


Is your innovation pipeline clogged?
  •  Uncover the risks, bottlenecks, and gaps holding your best ideas back.
  •  With the EMT Diagnostic, you'll get a clear, actionable plan to fix them.
👉 Book a free discovery call at https://www.precoil.com/innovation-diagnostic

What is How I Tested That?

Testing your ideas against reality can be challenging. Not everything will go as planned. It’s about keeping an open mind, having a clear hypothesis and running multiple tests to see if you have enough directional evidence to keep going.

This is the How I Tested That Podcast, where David J Bland connects with entrepreneurs and innovators who had the courage to test their ideas with real people, in the market, with sometimes surprising results.

Join us as we explore the ups and downs of experimentation… together.

David J Bland (00:01.363)

Welcome to the podcast, Andi.

andi plantenberg :D (00:03.547)

Thanks David, I'm glad to be here with you.

David J Bland (00:06.611)

Yeah, I'm so excited. I was going through my short list of people to have on this podcast. And I remembered our work even back at NEO. I think it was all the way back in 2013 when we first started collaborating. And I was always impressed by how you thought through early stage ideas and kind of blending all your design and product background to really create these thoughtful tests to go figure out if we're on the right track or not. I was like, I have to get Andi back on here to talk about.

andi plantenberg :D (00:16.795)

Mm-hmm. It's true.

David J Bland (00:35.922)

what she's been up to, some cool things she's been working on. And so maybe just give our listeners a little bit about you and your story and how you got pulled into this world of experimentation.

andi plantenberg :D (00:47.547)

Well, now at this stage in my life, I realize that I'm just very hungry for learning in all sorts of environments. You know, my first foray into higher education was to get a philosophy degree. I'm always doing art, you know, I'm an artist at heart, but the philosophy degree morphed into a theology degree, which morphed into a run at a

physics degree, which I did end up backing out of for cultural reasons. But I did want to go into the space sciences at one point and was working towards that. And just this hunger for knowledge and what's going on. Entrepreneurs always excite me because they're coloring outside the lines and are ultimately highly creative folks. So that's what landed me

in this line of work. Plus, I was living in San Francisco in the very early 90s, so I was the first wave of people who were building the internet. And that was just a very exciting time. So the rest is history.

David J Bland (01:57.775)

Yeah, I have to say I was still on the East Coast during that time, although in startups, and I didn't get to experience the West Coast flavor of dot-com, which I'm a little jealous of. Although I can just say, if it's anything like what I experienced, it must've been amazing in very many different ways.

andi plantenberg :D (02:16.155)

Well, there were these fiefdoms of people who would run around and do startups, sell them. And then I was in the wave of people who were doing different companies to sell back to AOL. Like, let's do something that gets bought out by AOL.

David J Bland (02:40.174)

Yeah, yeah. So we met at Neo, and for those listeners who don't know about Neo, it was this really interesting little agency, almost like Lean Startup as a service, I would describe it. I mean, we were building MVPs, testing with entrepreneurs, testing with corporations; there were little cross-functional teams. There were a lot of great principles we were applying there. And I was always really amazed when I got to pair with you on anything because...

andi plantenberg :D (03:00.187)

Hmm.

David J Bland (03:08.589)

Like the way you go through that process, and not everyone can really, really embrace the uncertainty of early stage stuff. And I loved how you were always, you know, thinking through and helping people kind of test things and pushing back a bit when you feel as if, hey, maybe I can just skip this step because I already know all the answers.

andi plantenberg :D (03:28.891)

And that's what's so fun about the work is I've always gotten in trouble for asking too many questions. And so to have the chance to show someone their confirmation bias and come back with real data very quickly that they're involved in gathering, and then they make that switch themselves, is huge, because we don't see that in business much. So it actually gave me hope that there was a place in business for

a weirdo like myself.

David J Bland (04:02.411)

So moving on, so Neo was acquired by Pivotal, and we all eventually kind of moved on to different things. And you've been taking a lot of this work forward, you know, a lot of this work that was sort of informed by Eric Ries and Jeff Gothelf and Josh Seiden and Giff Constable, like this whole group we had.

Being able to kind of take that and move it forward, I'm really just fascinated to hear a story that you can share with our listeners about how you're kind of navigating that and going through helping people kind of test this early stage stuff.

andi plantenberg :D (04:34.619)

Absolutely, because these principles need to adapt for the current stage of business and the problems at hand and not get codified into a dead process. The thing that comes to mind, and I believe this person was first introduced to me through NEO, would be Jay Trimble at NASA Mission Control. I joined the NASA Ames campus for a couple weeks.

I've worked with them several times since, but the first time around was definitely very exciting. I was brought in to help them work out what the best software, and processes for developing and designing the software, would be for a robotic mission to the moon to mine water from the surface, because we know that there's water on the surface. So I think I said earlier that I was

actively looking to get into the space sciences and study cosmology. So I was just thrilled to be at NASA mission control with this problem set to dig into. And to the point of not letting these things get codified, what we ended up doing was a customized Google design sprint. We took two weeks to do it, and we changed it a little bit.

And this is something that's interesting, because with large corporate clients this is very true as well: you can't just, like, unfurl it. The great thing about the Google design sprint is that there's a very, very clear playbook, but rarely is a large organization able to follow it closely. You do need to kind of morph it into your culture a little bit.

David J Bland (06:20.103)

Yeah, I agree. It's a lot of, well, anything we're taking from Lean Startup and Agile and design thinking and business model innovation. There are some really common tools, but the way you put those together, it's not just something you rubber stamp out across orgs. Every org has a different culture, a different focus.

andi plantenberg :D (06:37.115)

Mm-hmm.

David J Bland (06:38.022)

I mean, NASA, I mean, I've only visited that campus a few times, one to meet with folks and give a talk, another I was doing some advising at Singularity University, which I had a really great time doing as well. And yeah, and the idea of like failure is not an option, right? It's NASA. And so how do you introduce this idea?

andi plantenberg :D (06:48.731)

Mm-hmm.

andi plantenberg :D (06:52.155)

super fun.

andi plantenberg :D (06:58.939)

Failure is not an option. And there are signs up all over that remind people that if you're not doing your very best, someone will die. So if you say, hey, we're going to be fast and loose, we're going to go for the big 80%, you meet with scowls. And these aren't just scowls, they're really piercing scowls too, because they're very knowledgeable, educated scowls.

And at that time I was a woman in my early to mid forties, and everyone in the room was a very accomplished, older, you know, baby boomer male. And it was intense. You have to be very careful about the things that you do morph to fit a company, because most people want to

take away the uncomfortable things like, well, we can't really listen. We can't really obey this data or we can't be precise about our assumption and then change it. This confirmation bias exists at scale in organizations and it pushes back really hard. The things I got pushed back with right away from NASA were, first of all,

Experiments means something very specific in the scientific community. What we call experiments in the business community, I think it's right to call them experiments, because you're putting something out; it's like the Richard Feynman definition: we make a guess and find out whether it's not true. You can't prove that you're right. You can only prove that this is not gonna work, because we tried it and it didn't work.

So NASA is very, as many big organizations are, you know, we've all learned to think, think, think very clearly about something, and then pull the trigger. And that's what keeps people safe, right? However, there's also a saying at NASA, which is: say it and then sim it. So they're really big on simulating,

andi plantenberg :D (09:21.691)

because they understand it's when you simulate it that you learn what's wrong with it. So when I came in through that angle, say it and then sim it, and piggybacked on that, then it started to make sense. This is also true, I've done some work with General Electric as well, and it's the same. I mean, I think it's very active in a lot of organizations, you know, with design: don't really

figure out early where you're wrong by putting it in front of customers and having them kind of be like, that's not ready.

David J Bland (10:00.063)

Yeah, I think we talk about experiments, and I'm guilty of this to an extent because I wrote an entire book about experiments, and always people ask me, well, that's not an experiment. That's doing research or other things. And one of the reasons I use that really broad term of experiments, and the way I use it very broadly is, well, we have these guesses, right? We have these assumptions we're making, and we have to go find out if they're true or not. And there's a...

andi plantenberg :D (10:25.691)

That's the foundation of the scientific method right there.

David J Bland (10:30.079)

Yeah, and if I have to say anything I would alter, I would say, well, it's probably closer to a social science in many cases because it's very similar in how we're going about experimenting with people and on people. Yeah, and so I still use the word experiment quite a bit and I do see it becoming more popular inside big companies, but I do get some pushback on, well, that's not technically an experiment.

andi plantenberg :D (10:44.539)

because of the qualitative data.

andi plantenberg :D (10:51.739)

Mm-hmm.

David J Bland (10:59.901)

And I think what we're trying to do in our work is make it more accessible, make people feel more comfortable trying to find out if they're on the right track sooner versus later. And if you put all of that together in a way that makes people really defensive, it just digs their heels in even more of, no, no, I'm not gonna test this. I'd rather just like build the whole thing and then find out if I'm right or wrong.

andi plantenberg :D (11:10.011)

Right.

andi plantenberg :D (11:19.963)

Mm-hmm.

andi plantenberg :D (11:24.763)

Right. And so that one graph that we've casually, you and I, referred to as the sawtooth graph, which I believe originated from LUXr with the Frasers, Janice and Jason, which shows that, you know, the earlier you test, you take more risk early on, but the risk is a very small bite-sized risk with very high learnings. And so messages like that do work

with people like at NASA and at General Electric and other places where there is a lot of process to keep people safe. And then also reminding them that we're not launching a rocket with someone on it, we're just showing a design early. And then you get to the real issue, which is that I don't look credible if I show something lo-fi to get fast learnings, which is gonna save everybody time and money and end up

David J Bland (12:06.651)

Yeah, and a lot of times the process is there for a good reason.

andi plantenberg :D (12:24.283)

making us more effective in our mission. There's a different cultural thing there to be unpacked, but I think it's important to get there and unpack those things.

David J Bland (12:34.266)

Yeah, so maybe we can dig in a little bit there. So you're introduced, and you're introduced as a woman, into a failure-is-not-an-option culture. Navigating that, what are some of the tests? Like, how are you getting people warmed up to do that? I know you're loosely following the sprint process. So maybe we can dig in a bit there, how we got started.

andi plantenberg :D (12:39.067)

Mm -hmm.

andi plantenberg :D (12:46.299)

Yep.

andi plantenberg :D (12:53.147)

Mm-hmm. Mm-hmm.

Oh yeah. So we based the two weeks on the Google design sprint and then adjusted it. We had a very big whiteboard so we could whiteboard things out. I'm also an artist, so I was very excited that I got to draw quick drawings of the moon and robots on the moon as, like, part of the thing. I was just, like, in heaven. One aspect that I didn't consider upfront, but that was a very real thing that I think went okay,

is that I just had to stand at the front of the room and just take it, you know, just take all the questions and all the doubts, and a style of communicating that I'm very familiar with, because I grew up in a big family with a lot of loud uncles who would rib each other constantly. So I just got out my, like, Plantenberg family elbows and I was like, yeah, well, you guys want to try this? I can leave now if you don't. And so, you know,

just kind of duked it out a little bit. But then you're like, if we agree to move forward, let's just move forward. And then because the process was so clear and it was endorsed by Google, I think that also helped them feel safe enough to do ridiculous things, like ask-me-anything questions, sketching out things when they have no business sketching because they're not an expert. If you're going to work at NASA,

you have some serious cred under your belt and you have many rubber stamps of approval. So I think working outside of the rubber-stamped approval, like outside of your sanctioned silo and discipline, is a big uncomfortable thing for a lot of people. It's not like the early days of NASA where you have all these crazy pilots just like, let's try this. You know, that's not what it's like. And thank goodness, right? Thank goodness.

David J Bland (14:51.637)

Yeah, thankfully it's evolved from that. So, okay, so you're going through that, and, so wanting a little bit more detail about what you're testing and how you began to kind of navigate that process of, you know, wow, we're going to the moon, how do we test our way through that without just putting something up there and finding out?

andi plantenberg :D (14:54.747)

And.

andi plantenberg :D (15:06.939)

Yeah.

The mapping part of the Google design sprint was tricky because of the problem statement. This was in 2017, and the problem statement was something like: how do we maintain situational awareness, temporal awareness, and procedural awareness in a real-time live mission with people distributed around the globe who are roboticists,

drivers, and scientists? And this is something that hadn't been done before, because, although we're all familiar with our lovely rovers on Mars that we just adore, right, it takes quite a while for a signal to go all the way out to Mars and come back. It doesn't have that real-time feeling. And with the lunar surface, it's just a little bit of a lag. So this is like a brand new problem space

for humanity right now. So it was very fun to be involved in that. Some of the other unique things about the problems we were solving: we have a rover driving around that collects energy through its solar panels, and we need to know where the shadows are at all times. And we don't know if we're going to drive over a sticky patch of sand. We don't know what the terrain is until we get there and we're looking at it.

andi plantenberg :D (17:02.683)

So we can't get in the situation where we go into a crater and lose signal because then we'll be stuck and the rover won't be able to charge and then it's dead, the whole thing is dead. So that was also very interesting.

David J Bland (17:20.433)

So it feels like there's all these things that can go wrong potentially once it gets there, or maybe even before it gets there. But once it gets there, there's an element of navigating this unknown. And so how do you test your way through that process of, well, what are some potential issues that could arise and how might we address those? How are you navigating, or how would you navigate that with NASA?

andi plantenberg :D (17:41.723)

Yeah. Well, what you don't do is just have the small number of people in the mission control room for this guess, and then do something really high fidelity, and then have, like, some big unveiling, and then get feedback. That's how you don't do it. What we did do, which is the more uncomfortable thing to do socially in a large, serious organization, but is much more effective,

is get the other brains involved, the mission scientists involved, the people who will actually be driving the rover, getting them involved and simulating something lo-fi in a faked-up mission control room in a week. So that's what we did. And it's very uncomfortable for people to do that, because you're coming out as someone who's very trusted in this very special organization. You're taking the time of someone else who's

a very special and knowledgeable person and having them look at something that's kind of clunky. You know, it's hard to do that. It's hard to do that. But when you get the learnings in, the efficacy is undeniable. And that's what really helps.

David J Bland (18:57.998)

So how do you?

andi plantenberg :D (18:58.587)

And as a practitioner, you got to go in there and be willing to stand up and put your name on that and hope it all goes OK, because you don't have control over the team. You don't know if they're just going to flame out, but you have to kind of do your best, take care of them in the process, and then the learnings come.

David J Bland (19:16.462)

So how do you fake a mission control room? Like how do you test that?

andi plantenberg :D (19:21.627)

That was my favorite part. We took the mission control room at NASA Ames, which is really just a bunch of computer banks. It's not quite the thing you see, like, when they show JPL and there's that really big mission control room. It's more of a modest one. But we did Keynote mock-ups very quickly with pictures of the actual rover that they were testing at that time. And we...

put it on the moon, we faked it up on the moon, and faked the type of mapping. We had this, like, very interesting and cool mapping that Charles, who was the main designer there, had been working on for a while. And we put that in the right scenario. So it's having little bits of all the pieces that you need in a mission control room together, like the right lighting, the right types of things viewable up on the wall,

and land them in a problem space and say like, here's what you have to do and this is what you got, go. And then see where they get confused and where they're tripping up and just record that live. And then we would have all the people in the sprint also looking at that feedback from another room, watching it live. And as you know, David, that's when it lands. When you see the person struggle with a thing, all of your preconceived notions of...

why it won't be that way and all the internal kind of debates in your head and in the room, just go away. It becomes obvious. No one has any questions. You just learn it and you move on and you save yourself all the debates.

David J Bland (21:01.259)

Yeah, so you're not creating maybe a PowerPoint with bullet points of what happened. It's more of an experience.

andi plantenberg :D (21:07.419)

You're not trying to convince people and do your dog and pony show all across the organization. You just, everybody sees it and everybody gets it. And that's what I love about it. It's so self-evident, the learnings.

David J Bland (21:19.883)

So what kind of things do you learn by testing a mission control room for a mission? Like, what are some a-has out of that?

andi plantenberg :D (21:27.835)

You get to learn where your best thinking, on what the lead scientist is going to need right now at this point if the rover's here, is wrong. You think you know what they're gonna need, but you don't. Or there are a lot of things that they're used to using, like, because they have existing software for other things, they're used to having something over on one of the other monitors

that they have access to. And they're not used to seeing it as a tab in this place. So you learn a lot of very specific UI stuff.

David J Bland (22:52.647)

So it sounds as if in that testing situation of sort of a mocked up mission control room, you're learning a lot about behavior change, you know, from what it would take to maybe take something to Mars versus what it would take to take something to the moon. I mean, is that correct?

andi plantenberg :D (23:09.243)

Yeah, the behavior change with that is...

The primary behavior change there had to do with the lag time of getting information back. And that was the new thing in this specific project: it was live and ongoing, with a distributed team around the world.

andi plantenberg :D (23:34.203)

And what were the needs going to be there? Yeah, which leads me to another aspect of it, because it's just kind of happening live and ongoing. What if someone needs to take a break and have a nap, and then they have another shift coming up? Well, you're going to want to reach them in the cafeteria, you know, in the commissary. So there we were exploring ideas of a mobile app to go with this for

mission execution.

David J Bland (24:07.941)

Okay, so it's like, you would think maybe as an outsider looking at this, it's like, well, this should be easier, right? Like, we're going to the moon instead of Mars, so that should be easier. But I think testing this thing through our sprint and through kind of a testing process, you're understanding all the little nuances that make it different enough that if we skipped over that and designed something based on how we thought everything would behave, you know, we could potentially be...

andi plantenberg :D (24:19.867)

Mm-hmm.

andi plantenberg :D (24:28.347)

Yeah.

andi plantenberg :D (24:35.611)

Mm-hmm.

David J Bland (24:37.604)

I honestly have fatal flaws in the mission because we're just skipping over some stuff, some assumptions maybe that we should have tested.

andi plantenberg :D (24:48.443)

Right, and there's the opportunity to be a lot more fluid and reactive on a mission like this. So you build in that opportunity to learn. Like, the mission plans are very, very scripted, but everything you encounter is a surprise, you know? And the idea is that based on what we discover as we're driving this rover around and measuring the soil underneath, because you're measuring it and then seeing the likelihood of water, and then drilling down, and then baking the

earth of the moon, the moon of the moon, in the belly of this robot to get the water vapor out of it. So as we're driving around, we're learning and making decisions and annotating the map, and then making kind of live adjustments based on what we find. And there is no, like, two-hour downtime of now we wait for the rover to talk back. It happens right like that. So being able to have some fluidity within

the teams that are running this mission is important, and something rather new.

David J Bland (25:57.505)

So it sounds as if, you know, you're driving this, like, everything's going well. You're able to drive this rover around, I think at, like, the South Pole of the moon, and you're able to kind of detect water in some way, or ice, and then drill down, pull it up, heat it up, or some nature of that. How do you test some of that locally? Or do you test any of that locally? Or do you find out, hey, we think this all, like, works in the sim and it's just gonna work when it gets up there? How do you test your way through that?

andi plantenberg :D (26:24.187)

Well, the rover gets simmed differently. At NASA Ames, there's a little, you may have seen it, a roped-off area by the giant wind tunnel where they test the aerodynamics of big craft. There's a fenced-off area with rovers in it and rocks and little hills and stuff. And the roboticist on this project, his name was Matthew, I don't remember any last names right now, but he was a brilliant roboticist who was living in San Francisco. And...

he was iterating the robotics during the same sprint. So I think the rover looks very different now than it did then. And all those pieces get hooked up after that. So there are many technical approval gates, and as you unlock those gates, there are certain levels of testing. So I wasn't involved in those kind of later stages. I was very early upfront, like, let's design this software in a way where

it's going to be really usable for the needs of this mission, and see how much we can just learn upfront and cheaply.

David J Bland (27:30.334)

So how do you iterate on software in the sense of, was it a case where you bring them into this simulated mission control, you see how people behave, short feedback cycle, right? Because it's just, it's low fidelity.

andi plantenberg :D (27:42.971)

Mm-hmm. Yeah, we brought in one person at a time. Like, we tested a set of drivers, a set of scientists, and I can't remember offhand the other set.

It's been a while. But anyway, we would bring them in by discipline and one at a time, and make sure we captured a certain number of sets. So in NASA, the comm loop is very important. Like, you're talking on the comm loop and you're like, everything's nominal, and, you know, so and so. You know, you're chatting, which is interesting. You know, it's kind of helpful to just have a comm loop where everybody knows to be quiet unless it's their turn to talk. So it's fun simulating those sorts of things. But you gather the data you need on

probable use cases for those primary sets of users.

andi plantenberg :D (28:36.923)

And then we affinity mapped on the wall. We wrote down all the observations, stickies on the wall, affinity mapped it, retros, all of the things that you'd expect.

David J Bland (28:50.428)

So you take all that on the wall, visualize patterns. For those of you at home listening who don't know what affinity mapping is: you're putting the like items together and seeing what themes emerge. And so I can imagine you go through that process, and that low fidelity, really quick cycle of testing, I'm assuming it actually impacted the design of the software.

andi plantenberg :D (28:53.531)

Mm-hmm.

andi plantenberg :D (28:59.739)

Mm-hmm. Mm-hmm.

andi plantenberg :D (29:12.635)

Absolutely, to a big degree. And one thing, so there's a distinction that I think is really important, or that I think is very valuable, between NASA and some other large organizations. The healthy thing about NASA is they will debate, and it will be out loud, and all of it gets out. And you have to be able to hold your ground and be able to be in a conversation like that. But things that could have been

just debates and debates and debates and debates, it's just like, we all saw it, we move on. And so it shortens that debate cycle, which I think I mentioned before. In other large organizations,

that debating isn't necessarily allowed. And then people will just kind of subvert it or something later. And you don't get to have it all out. So I think it takes a bit of, like, good leadership and safety to be able to kind of have those conversations. So we would find things where there would be debates about, like, well, how should we design it? It should go like this. No, it should be over here. And internal debates like that. But because of the rigorous

time block, we just went with whatever option, and then the learnings were super clear.

Some people, I think, really survive on their ability to debate and win. Those people have a hard time in this process, because it really is about having the work be the appropriate output for what the needs are, and doing it through experimentation, and the debating is not important.

David J Bland (30:57.336)

Wow, there's so much to cover there. So everything from how do people collaborate on a different type of mission, how do you test your way through that, how do you set up a mission control center and test your way through that to help inform software design, how do you get the rover tested locally so that when it gets to the moon, you've covered all the scenarios. Are there any other kinds of tests you ran or anything other like insights or nuggets from that experience?

andi plantenberg :D (31:27.067)

The real insights were how to do this at a place like NASA, you know, with the legacy, the credibility, and the dedication to safety and perfection. How do you iterate quickly and maintain credibility is a real issue, and people don't really talk about it that much, but it's such a core issue. I know you've seen it a lot, right, David? Like, I've seen this everywhere.

It's like, but my credibility is on the line if it's not looking very beautiful right out of the gate. And so I really think that comes down to trust, you know, with the people you're working with, and to really be a trustworthy person on the team, that you take care of each other through that.

andi plantenberg :D (32:23.003)

Because this ivory tower legacy is real and deep.

David J Bland (32:31.701)

So how has this experience testing with NASA influenced other work that you've done throughout the years or has it?

andi plantenberg :D (32:39.675)

It took me down a peg in terms of like the startup machista or machismo. You know, I would say like the late teens or mid to late teens were just the height of brodom. And so it gave me empathy for those kinds of issues.

You know, where people's credibility is on the line, they've built a career and a name for themselves that the care of their family is tied up into it. You know, it's what puts a roof over your head. It's what pays the insurance. You know, you don't trifle with that stuff lightly. And you have to like, work with people thoughtfully through that, and not just shove them through the process, which I think we were.

You know, one of the reasons we were successful at NEO is the challenger sale. Like, we would just shove people through the process a little bit. Be like, yeah, your idea is not going to work. Let us show you that first.

David J Bland (33:37.843)

Gently nudge them through, to an extent. But yes, I think...

David J Bland (33:45.234)

Yeah, and that challenge remains to this day. That's not something that's gone away. It's still a lot of, well, you said you were going to help me build a thing, why are you questioning if it's the right thing to build? You know, that culture is still there. So what's exciting next for you? How are you taking this work forward? What gets you excited about the next five to ten years of experimentation?

andi plantenberg :D (33:53.403)

Mm-hmm.

Right.

andi plantenberg :D (34:06.395)

What really gets me excited are, you know, we're in a very interesting time in history. This is real talk now. We are post global pandemic.

where there are many people still vulnerable to that. We've had a giant racial reckoning in this country that has opened lots of eyes. We're experiencing climate destabilization. There are multiple genocides happening throughout the world. You know, David, you and I used to go around and a lot of people in our profession would talk about the rate of change is accelerating.

It's underpinned by technology. Get ready, your business will be disrupted. Well, this is also what disruption looks like, and it's underpinned by technology. It's underpinned by the industrial revolution. It's underpinned by the digitization of everything. We have mass migrations that have never before existed at this scale. So millennials and Gen Z are

going to be moving the needle in some very big ways. And we don't know what that's going to look like right now, but it's going to be in the direction of equalizing some of this, and better stewardship. And there's a lot of efforts for decolonization going on right now, efforts that empower regular people. There's a real imbalance of power. And I think it's going to look like a lot of

small, very interesting ventures addressing all of the very real problems that exist right now. I'm very interested in helping people learn what they need to learn as quickly as possible to do things that are going to help in this very destabilized time in the world. And I think it's going to apply to every sector, but it's not going to come out of the large organizations.

andi plantenberg :D (36:16.411)

It's not going to be top down, you know, it's going to be bottom up in some way. So I'm looking for ways to dovetail with those folks. I do cheap-to-free experiment crafting for female entrepreneurs of color, or people who need it,

who have great ideas. If I get put in touch with them, we just go to work. So that's what I'm very excited about. I don't have active paths into that right now, but that's what I'm looking out for, because I think that's going to be coming over the horizon pretty soon.

David J Bland (37:00.461)

Something that struck me about what you're saying, and I think I picked this up from you, being in one of your workshops too, is systems thinking. What are the systems going to look like, and how do we test our way through systems and see what kinds of impacts we have, and maybe unintended impacts on systems? And maybe we can start to wrap up on this note. How do you view systems thinking? Do you think it's going to be crucial moving forward? Do you think it's going to re...

andi plantenberg :D (37:08.955)

Mm-hmm.

andi plantenberg :D (37:15.675)

Mm-hmm.

David J Bland (37:28.461)

have a resurgence in popularity? Do you think people are going to start understanding systems more? Because it feels as if systems are a key part of all of this, not just experimenting at a local kind of maximum. So what are your thoughts on that?

andi plantenberg :D (37:41.243)

Yeah, I think systems thinking is closer to the ground and internalized in the generations that are coming up behind us. You know, I think that we're a waning generation. The Gen Xers are hopefully the last generation that's really stuck in either-or thinking, or right-or-wrong thinking. I think that Gen Z especially, and certainly some millennials, just kind of have more of a systems way of thinking anyway.

You know, it's less black and white. It's less binary. It's more intersectional and adaptive. So my hunch is that systems thinking is not so much an ivory tower part of the waterfall experience of making and building things, so much as just an organic approach

that is embedded in new companies and new ventures that are trying to make a difference. That's my hope.

David J Bland (38:48.49)

I share that hope as well. I keep thinking, you know, we can't just keep experimenting away where we're not considering the system and the impacts on an overall system. And that goes from corporations to funding and all that, but also just climate and how we live and everything else. It's...

andi plantenberg :D (39:07.259)

And we've got to let some things go. Some things that aren't working, that were tried and true, we've just got to let them go. You know, everything has its season.

David J Bland (39:17.16)

I agree, and I appreciate you sharing that perspective. This was an amazing conversation. We ranged from how do you test missions to the moon, to systems thinking and where we're headed with all this. If people want to reach out to you, and they have questions about how you approach testing, or maybe they need help with testing things, where would they go to find you?

andi plantenberg :D (39:38.491)

Well, my website has a form on it. Don't use it, because you can email me directly. My email is andi, A-N-D-I, at futuretight.co. So it's future tight, just how it's spelled, with no hyphen or anything. So futuretight.co is how to get a hold of me.

David J Bland (39:58.599)

Thank you, thank you.

David J Bland (40:05.927)

That's amazing. So all you listeners who really resonate with Andi's message here and want to reach out, I highly encourage you to do so. Thanks so much, Andi, for just sharing all these great testing stories with us and how you approach it. I really, really appreciate your time. Thank you so much.

andi plantenberg :D (40:20.891)

It's always lovely to talk to you, David. Thank you for having me.