Always Be Testing

Guiding you through the world of growth, performance marketing, and partner marketing.
We sit down with growth and marketing leaders to share tests and lessons learned in business and life.

Host: Tye DeGrange
Guest: Andrew Covato
Hype man & Announcer: John Potito

Andrew has worked on the buy side and the sell side of various ad platforms, focusing on measurement and data at Netflix and Snap. He also has ad tech and measurement experience at VC-backed start-ups, Google, eBay, and Meta.

Timestamps:
00:06 Meet Tye and Andrew
00:51 Discussing Growth by Science
01:40 Andrew's ad tech insights
03:04 Ad spending and ROI
04:13 Ad buying mistakes
05:41 Smarter measurement tools
07:36 Testing's role in marketing
09:07 Netflix's testing culture
11:23 Geo-testing techniques
14:32 Conversion optimization pitfalls
18:32 Ad tech platform challenges
22:15 Strategies for effective testing
29:33 Networking and website invite

What is Always Be Testing?

Your guided tour of the world of growth, performance marketing, customer acquisition, paid media, and affiliate marketing.

We talk with industry experts and discuss experiments and their learnings in growth, marketing, and life.

Time to nerd out, check your biases at the door, and have some fun talking about data-driven growth and lessons learned!

Welcome to another edition of the Always Be Testing podcast with your

host, Tye DeGrange. Get a guided tour of the world of growth, performance

marketing, customer acquisition, paid media, and affiliate marketing.

We talk with industry experts and discuss experiments and their learnings in growth,

marketing, and life. Time to nerd out, check your biases at the door, and

have some fun talking about data-driven growth and lessons learned.

Hello. Welcome to another edition of the Always Be Testing

podcast. I'm your host, Tye DeGrange, and I am just slightly

excited to talk to a good friend, Andrew Covato, today. What's up, Andrew?

What's up, Tye? Thanks for having me on here. And, I have to say, I love the podcast

name Always Be Testing. I can't tell you how many times I've said those exact words in various

roles. So you have knocked it out of the park with that framing of the

material here. So I'm excited to get into it, man. Love it, man. Appreciate it. And,

I'm just gonna throw out more compliments. Your new firm, is it

Growth Science? Is that right? Growth by Science. We grow using science. Growth

by science. Yes, sir. I have to just throw a compliment your way also.

I kind of feel like we're of a similar mindset because the naming on that is

stellar, in my opinion. Thank you. Heck yeah. Love science,

man. It's a passion, and I think that, you know, testing and

science go hand in hand. I think we're definitely gonna get into that. So excited

to talk about that. Love it. So for folks that don't know, Andrew is

kind of a badass and a great guy and very scientific in

how he's approached things and how he's approached ad tech and growth. He has

worked on the buy side and the sell side of various ad platforms. He's got a ton

of measurement experience, a ton of data experience, a ton of team-building

experience. And with some brands that maybe you've heard of: Netflix,

Snap, Google, eBay, Meta, to name just a few. No big deal. Yeah. I've

had a lot of great fortune, a lot of great companies to work for. So I'm really grateful for all

those experiences. Heck yeah, man. So maybe, like, to kick things off

and make it super easy for folks to understand kind of what you do, maybe

could you share: if the audience were a fifth grader, what do you do?

Yeah. Great question. And it's always hard to articulate, you know, such a

nuanced topic like ads measurement, but we'll give it a go here. Companies

want to sell their products and services. And in order for them to

sell their products and services, they need to talk about their products and services and share

them and, you know, inform people that these things exist and that they can be

bought. And so they spend money to have their products and services be put

in front of people that they think will potentially buy their products.

So what I do, primarily, is help those

companies understand if the money that they're spending on promoting

their products and services is actually resulting in people

buying things that they would not have bought otherwise, that they would not have bought

except for the fact that they saw those ads. And so that's really what it kind of boils

down to. It boils down to ROI, but specifically to get a little bit

outside of the fifth grader into the jargon world, I focus specifically on incremental

ROI and figuring out how to calculate that. And so there's a lot

of math and methodology that goes along with it, a lot of technology

that goes along with it as well. And then, you know, believe it or not, a lot of politics,

especially as it comes to, you know, talking about how to implement and how to

change the perspective on media buying. There's a lot of politics involved in

that too. So, it's weirdly Love it. A specific niche area that has,

I think, actually, broader economic implications. For sure. I love

this because it's something that comes up so often for people

engaging in ad buying. As you know, we've seen this explosion of digital where everything

can be measured and people get super excited about that. Yet, unfortunately, as you and I

know, just so many brands actually do it wrong, are chasing the

wrong things, are either fretting about a lot of that spend

not being incremental and maybe not being able to act on it, almost to a fault.

Some of them are able to actually get it right, and so many just don't have the visibility that

they want. And there are partners and internal and external teams that don't always

see it, nor can they act on it. So for you to be able to talk about it, educate

people about it, cut through the noise is really exciting. Yeah. I mean,

I think visibility, you mentioned that. It's really interesting, because I think there's a trade-off:

having access to more data in a lot of ways inspires

people to do dumb things when it comes to measurement. And, you know, I'm sure

this will come up a lot in our discussion, but privacy and a lot of the regulatory

changes and OS changes like ATT have sort of killed a lot of the

data accessibility that marketers enjoyed in the past. I

thought that that would result in people kinda going to

smarter measurement methodologies and tooling and things like that, and it has, to

a certain extent. But I don't think we've evolved as much as I think we

could evolve. And so, really, what I'm trying to do today is help move that

along a little bit and encourage more platforms, marketers, you know, really the

whole ecosystem to lean into some of these more appropriate methodologies versus

things that maybe won't give you the right visibility into that ROAS, or that incremental

ROAS, that incremental ROI. Absolutely. And maybe a good segue: what are you seeing as some of

those things that people are getting wrong about incrementality testing or that

incremental ROI that you're speaking of? Yeah. I mean, I think the first thing that they're getting wrong is they're not

doing it. There's still SKAdNetwork,

last touch, you know, anything that's kind of post-exposure

attribution. So what I mean by that is you look to assign credit to a marketing

channel based on, you know, a touchpoint that a user has had and a

subsequent conversion, and you're assigning credit based on that pathway. That's,

first of all, pretty broken in a post-ATT world, App Tracking Transparency,

for those who aren't super familiar with that, which is basically Apple's way of shutting down the

ability to make those path-to-conversion connections. So

anybody that's looking at post-exposure is not doing incrementality. That's number one.
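
To make that contrast concrete, here is a minimal sketch, with hypothetical numbers, of how post-exposure (last-touch) attributed ROAS and holdout-based incremental ROAS can diverge for the same campaign; the user counts, conversion rates, and spend below are illustrative assumptions, not figures from the episode.

```python
# Hypothetical illustration: attributed ROAS vs. incremental ROAS.
# All numbers are invented for the example.

spend = 100_000.0          # campaign spend ($)
aov = 50.0                 # average order value ($)

# Randomized holdout: eligible users split into exposed and held-out groups.
treated_users = 1_000_000
control_users = 1_000_000
treated_conv_rate = 0.0105   # conversion rate among users who could see ads
control_conv_rate = 0.0100   # baseline conversion rate in the holdout

treated_conversions = treated_users * treated_conv_rate
control_conversions = control_users * control_conv_rate

# Post-exposure view: every converter with an ad touchpoint gets credit.
attributed_roas = treated_conversions * aov / spend

# Incrementality view: only conversions above the scaled control baseline count.
incremental_conversions = (
    treated_conversions - control_conversions * (treated_users / control_users)
)
incremental_roas = incremental_conversions * aov / spend

print(f"Attributed ROAS:  {attributed_roas:.2f}")   # ~5.25 here
print(f"Incremental ROAS: {incremental_roas:.2f}")  # ~0.25 here
```

The post-exposure number looks healthy because it credits everyone who converted after a touchpoint; the holdout comparison isolates the purchases that would not have happened otherwise, which is the quantity being described here.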

Now for those that are doing incrementality, I think a lot of people get kind of afraid that

there's this opportunity cost and so they kind of dip their toe in, they don't really go all in

on it, they maybe do a test here and there, there isn't a lot of structure around it, maybe

they're trying a lot of different tools and kind of switching tools and this and that. I think

it's really important to pick a tool, pick a methodology, stick with it, invest the

time, have a structured testing approach, and really, you know, not

dabble in it a little bit, but make it the center of your marketing program

and the center of how you assess the performance of your marketing. Love that.

It's not something that you can just treat as an afterthought. It's not easy to do,

and so it shouldn't be an afterthought. But really getting the organization

around that culture of testing and incrementality, that's the foundation of everything. And then,

you know, those other things that I said, the structured learning agenda and, you know, commitment to the testing, sort

of follow from that. But if you don't have that foundation and you don't have buy-in from leadership

and people are doubting that that's even a good place to start Mhmm. You've kind of

lost before you've even gotten off the ground. Well said. And maybe a good segue, you know,

talking about the Netflix experience a little bit and how you kind of built out that paid growth

engine. You touched on so many interesting things there, like education, leadership, buy in,

alignment, culture of testing, but not to, like, take the words out

of your mouth, but, like, how did you kind of approach that Netflix experience, and your thoughts?

I mean, Netflix was a phenomenal experience for me, working there. It was the first time that I worked

on the buy side in a big way. Everything that you mentioned, like, everything that we

were kinda talking about as it relates to incrementality was totally at the core of what we were doing at

Netflix. But first of all, I mean, Netflix, for those who don't know or aren't

familiar, is inherently a very test-heavy culture. And they test

everything. And I love it. You know, if you read their culture doc, which is available, you know, online, and, you

know, there's books written about it as well. But really, there's a lot of kind of solicitation

of dissent, kind of the peer review process, which is a lot of what I put into

Growth by Science as our ethos. I've kind of borrowed from that and taken the things I've

loved from that. But, yeah, there's this great culture of being scientific, and I don't think they actually

call it out, but there's testing, there's updating your priors with new information,

soliciting dissent, disagreeing, committing. There's all those kind of themes that

relate back to being a scientist and having a scientific mindset that are at the core of

Netflix. And that's why, taking that cultural foundation and

translating it into the marketing side, it was the best possible petri dish

to develop a really, truly scientific paid growth program. Love it. And so

I still believe that to this day. I still think Netflix is one of, if not the top,

performance marketers that exist out there from, you know, a sophistication perspective. You know,

what we built was what I would say should be the North Star for all

performance marketers out there. And of course, you know, Netflix is gonna have tons of bells and whistles, and

I'll mention a little bit of what that kind of looked like. But this general direction of what

they've done is, I think, what all growth marketers should be aspiring

to get as close to as they can, given their resources and size. And so what we

built there, essentially, was a cross-platform media

buying tool, which we built bespoke in-house to the way that we were buying media, all connected with the

APIs. It was just as good as, if not better than, some of the tech that I've seen at agencies

or third-party buying platforms, and we had this all in-house. We had testing

automation built into that. We had learning agendas that we would want to

put forth to help guide our marketing decisions and we had that built into the

optimization and into the deployment engine. All of that fed back into our in-house

buying sorry, our in-house measurement Mhmm. tooling, which was built

by, you know, some of the smartest folks in the industry. And funny enough, back then,

what that measurement platform entailed is a lot of what you see

the best incrementality tooling out there sort of doing, something similar.

And some of that great tooling that's out there, a lot of it uses geo testing to inform

a media mix model, more of an econometric model, that can be always-on without having

holdouts. And so I think that's a fantastic set of tools for

growth marketers to kind of look at. So all of this tied together, we had this

feedback loop. And one of the most exciting things from that

whole tech investment that we made and that great kind of MarTech stack

and growth stack was that we got some fantastic insights

about what worked and what didn't work. And a lot of it, you know, was very surprising and, I would say,

antithetical to what platforms and sort of common-knowledge

growth principles were and what the platforms were telling us to do, but it worked. And

it goes to show you that if you invest in incrementality and you measure things from an

incremental, causal perspective, you will find that the way that

you buy media can be totally different from what's out there. And I happen to have a

perspective not solely informed by, you know, the work we did at Netflix, but really

informed by all of the research that I've seen and that I personally conducted, you know,

even with clients today at Growth by Science, on what a good approach to

growth marketing is. And I think it's a lot simpler than what people

think. So I'll tell you my principles. I mean, I think it's a good time. I'll

just share them. Yeah. Drop them on us, baby. Let's go. Great creative, number one.

You gotta have great creative. It's gotta be bespoke to the platform that you're using it on. You gotta nail your

targeting and only rely on first-party targeting. A lot of the other targeting

is sort of broken, third-party data exchanges, all of that. Just assume it's broken or will be

broken soon enough. So rely on, you know, kind of platform targeting, especially if

you're doing something like paid social. Optimize for reach and frequency. Don't look

at any conversion optimization. This is the spicy one. I'm sure you're gonna get some people

that really come back and say, no, this dude's off his rocker. But I can

promise you, if you like last click and you like post-exposure, then use

conversion optimization. If you like incrementality, do not use

any kind of conversion optimization. It does not drive incrementality in most cases.

Not to say that it never does and not to say that you can't get things to be incremental if you use it,

but I would not start there. I would exhaust other options first and then, you know, move

to that if you're still not seeing incrementality. Can you double-click on not to interrupt you, because you're

on a roll here, but can you double-click on the conversion rate

optimization? Just the definition of that, because I think that could get For sure.

Yeah. So not conversion rate ops. I think if you're just talking CRO, that's different. That's, you know,

tweaking landing pages and Yep. stuff. So definitely do that. But as it relates

to conversion-optimized ad delivery, where you're passing feedback through a pixel

or CAPI or something to an ad platform, and then the ad platform is using that to optimize.

The way that the optimization systems work at these ad platforms, they optimize

on a post exposure basis, meaning the signal that they get back that they use

as a feedback loop is did somebody see an ad and then convert. Okay. That

seems to make sense. Now the problem is the delivery system is

good at predicting people that will convert. And so you're showing ads to

people that would convert anyway, and you're saying, okay. It works. I'm gonna feed that back. And so

that's why there's a lack of causality from these conversion optimization algorithms.
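
Here is a small, purely hypothetical simulation of the feedback loop being described: a delivery system that targets the users most likely to convert anyway racks up post-exposure credit even when the ads add almost nothing. The propensity distribution, lift, and budget are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users = 1_000_000

# Each user's probability of converting with no ad at all.
baseline = rng.beta(a=0.5, b=50.0, size=n_users)

true_ad_lift = 0.001     # assume ads add a tiny, uniform bump in probability
budget_reach = 100_000   # we can afford to show ads to 100k users

# "Conversion-optimized" delivery: pick the users predicted most likely to
# convert (here, simply the highest-baseline users).
opt_idx = np.argsort(baseline)[-budget_reach:]

# Reach-style delivery: pick a random slice of users.
reach_idx = rng.choice(n_users, size=budget_reach, replace=False)

def simulate(exposed_idx):
    exposed = np.zeros(n_users, dtype=bool)
    exposed[exposed_idx] = True
    p = baseline + true_ad_lift * exposed
    converted = rng.random(n_users) < p
    attributed = int(converted[exposed].sum())   # post-exposure credit
    incremental = true_ad_lift * budget_reach    # expected truly causal conversions
    return attributed, incremental

for name, idx in [("conversion-optimized", opt_idx), ("reach-based", reach_idx)]:
    attributed, incremental = simulate(idx)
    print(f"{name:21s} attributed ~ {attributed:,}   truly incremental ~ {incremental:,.0f}")
```

Both buying styles have the same true lift in this toy setup, but the conversion-optimized arm reports far more post-exposure conversions simply by finding people who were going to convert anyway, which is exactly the missing causality being pointed at.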

Love it. And I've seen it time and time again. You put this thing out there. And, by the way, a lot of companies are

putting research out that shows that the

platform ROAS is up here and the incremental ROAS is way down here for all these conversion-optimized

products that are out there. And I've seen time and time again, you cut it off, you switch to something more generic, you

switch to either engagement optimization or reach and frequency

and manage, you know, kind of the delivery with that, and it

almost all the time will work better from an incrementality perspective

than a conversion optimization perspective. The pitfalls of chasing that CAC

and that CPA. Right? Oh, yeah. People are overly dogmatic about that. Yes, sir. They could

lose out on incrementality. They could lose out. And the funny thing is when you

switch to that kind of reach and frequency style, what I just described, your CAC goes through the roof.

Your last click CAC goes through the roof. And so people get freaked out. They go, oh, this ain't working.

No, it's not that it's not working. It is actually giving you people that would not have converted

had they not seen the ads, which is what you want. You don't wanna show people ads who are already

gonna convert. Ideally, you wanna selectively show it to people who are kind of just on a

threshold, and have most of your budget not spent on people that are

gonna convert anyway. So it's actually the opposite of the way that conversion-based

optimization delivery works. You wanna do the opposite of what it's

doing. Love that. You're on a roll. These principles are gonna be tacked up

on the wall in a number of places, I think, in here. We'll see, man. We'll

see. I'm sure somebody's gonna find something contentious to

say about it, but I welcome it. There may be some comments. Yeah. We welcome the discussion. That's part of

being scientists. Right? Absolutely. Yep. I welcome being proven wrong if somebody can show

me data. I love it. And maybe a good segue too. You've laid out some principles. You've

talked Netflix. With Meta, Snap, and Google, obviously, a ton of interesting learnings

there. What can you share based on that experience? I think the big

takeaway I have from all these ad platforms is sort of related to the principles, which

is they're kind of a victim of their own success a little bit, right? You know,

this massive, multi-hundred-billion-dollar digital marketing industry has been built on the

back of post-exposure optimization and measurement. Because back in the

early days of search advertising, it actually worked. Like, there was a good case

to be made that if you clicked on an ad and then you converted, you probably converted because

you clicked on the ad, because that's just the way the ecosystem was back then. It's gotten

way more complicated now, and that logic doesn't hold anymore. But you've got

these big behemoth ad tech stacks that are built

on that foundation, and you can't really pivot off of it, you know, to do

something more sophisticated, like, say, something more causal. But there's also

not a great motivation to do that, I would say, from the ad platform side and not that this

was ever explicitly talked about. This is me kind of just observing. Yeah. I think I know where you're

going with this one. Advertisers aren't demanding big changes, you know,

fundamental changes, in the ad tech stack. In fact Yeah. A lot of them are demanding

better last-click measurement. Can we keep that alive in a post-ATT world, things like that? My

numbers look good. Look at all the revenue I'm in charge of. I can hire more. I can have more

tools under my belt. There's a whole ecosystem built around that, and that's what makes it

really hard for the ad tech platforms to move away from that. Not to mention, it's objectively a hard

problem to solve. Yep. Love it. You said something earlier in your principles that I

think relates to the work you're doing with Growth by Science. I'd love to maybe double

down on that concept of simplicity. We've talked about this a little bit, but can you share more

about what simplicity means for you in what you bring to the

table for your clients at Growth by Science? Simplicity, I think, is super important,

and, you know, it's a part of a lot of the culture docs and

the sort of approach that I've tried to bring to, you know, different places that I've worked

and even now at Growth by Science. One thing I really

believe in is simplicity and specificity. And I think you can

do a lot with really specific tests that are simple,

that are kind of coarse-grained. You can really get a ton of value out of those

before having to do anything crazy. I think a lot of marketers overestimate

the complexity that's required to get value out of an incrementality test.

And not that the setup, the test infrastructure and the tooling, yes, that

can be very complicated. But the tests themselves, to get a lot of value out of them, you don't

have to do anything crazy like some multivariate, full-factorial test where you've

got this creative variant and this targeting and blah blah blah. Like, yes, there can be a place for

tests like that, but you'd be surprised at how many marketers don't

do basic tests like a full marketing program holdout test.

Something like that can be hugely powerful. And in fact, whenever I am working with

a growth marketing client, that's the first test that we run: a full

marketing holdout across all digital marketing, ideally all marketing if it's not just

digital. But at least across all digital, we do a holdback, and we see what the

efficacy of the overall marketing program is. And, you know, that can lead you down a couple

of different paths. Because if it's working, you don't need to make big

changes to it. Right? You just need to find out how to make it work better. But if it's not working, if it's

below the ROI, the minimum acceptable ROI that you need,

then you have to make some bigger tweaks. You have to potentially think about, well, let's shut down some

channels. Let's see if there's any kind of directional info about where we're bleeding and

what's kind of bringing that incremental CAC up and what's pushing it forward. And so

you can have almost, like, two different learning agendas: one that's more iterative if the

overall program is working versus one that's more aggressive if it's not working. Well,

I think something like that is a great place for most people to start, just with

that concept. I love that. And when you say holdout, Andrew, are you

talking about, like, a geo holdout or a segment holdout? In this day and age,

geo holdouts are the best and, in some cases, the only way that you can really do a

cross-channel incrementality test, which is what you need to do. And so I'm a big fan of geo

holdouts. You know, there's a lot of great tools out there that can help you with that. There's

also, you know, open-source tooling out there as well that you can use.
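
For a sense of how coarse-grained that readout can be, here is a minimal, spreadsheet-style sketch of a geo holdout analysis in Python; the geo names, revenue figures, and spend are hypothetical, and a real test would layer proper geo matching and confidence intervals on top of this.

```python
# Spreadsheet-style geo holdout readout (all numbers hypothetical).
# Treatment geos keep spending during the test; holdout geos have ads shut off.

pre = {   # revenue per geo in the pre-period, before the split
    "geo_A": 120_000, "geo_B": 95_000,   # will be treatment
    "geo_C": 110_000, "geo_D": 100_000,  # will be holdout
}
test = {  # revenue per geo during the test period
    "geo_A": 131_000, "geo_B": 104_000,
    "geo_C": 112_000, "geo_D": 101_000,
}
treatment_geos = ["geo_A", "geo_B"]
holdout_geos = ["geo_C", "geo_D"]
test_period_spend = 15_000  # media spend in the treatment geos during the test

# Scale the treatment geos' pre-period revenue by the holdout geos' growth
# to estimate what treatment revenue would have been with no ads.
holdout_growth = sum(test[g] for g in holdout_geos) / sum(pre[g] for g in holdout_geos)
expected = sum(pre[g] for g in treatment_geos) * holdout_growth
actual = sum(test[g] for g in treatment_geos)

incremental_revenue = actual - expected
incremental_roas = incremental_revenue / test_period_spend

print(f"Counterfactual revenue: {expected:,.0f}")
print(f"Actual revenue:         {actual:,.0f}")
print(f"Incremental revenue:    {incremental_revenue:,.0f}")
print(f"Incremental ROAS:       {incremental_roas:.2f}")
```

This is essentially the kind of calculation that can live in a spreadsheet: a counterfactual built from the held-out geos, an incremental revenue estimate, and an incremental ROAS.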

And, you know, I've done pretty sophisticated geo holdouts just in Excel,

with the analysis just in Excel. So, you know, you don't have to invest in a big

fancy tool to get this stuff done. You can do a lot of it in a pretty hacky way and get, you know, a

pretty good answer out of it anyway. Yeah. And I just wanna say, like, I think what you're kind

of attempting to do with Growth by Science, and there's just so much learning

and power and experience in what you've achieved and thought of and been through. And I feel

like you're kind of distilling it down into these, you know, simple principles for a lot

of teams to take advantage of with Growth by Science, in a way that doesn't require

as much bloat and cost as I think some people try to approach it with. Is that fair to say?

A hundred percent. I mean, you can always start small. A lot of this stuff is really easy to start small

with, and it can grow to be complicated. And you can build a really sophisticated

thing like what I described with Netflix and build that tech and that feedback loop. But

you don't have to start there, especially if you're a smaller advertiser. In fact,

one mistake I see a lot of advertisers make that are, you know, kind of just getting started and

maybe not at, call it, let's say, you know, seven figures a year sort

of ad spend, or even a little bit less: they try to spread that around too many different

platforms. That's totally unnecessary. Pick two or even just one platform that you feel

aligns with the audience that you're trying to reach and get started there and spend a little bit

less. You concentrate your spend in smaller areas, smaller geos, see how it

works, and you'll end up learning a lot more if you approach your growth marketing like

that, versus something, you know, right out the gate that's just spreading dollars across multiple

platforms but not getting a ton of saturation into the different segments

of markets that you're approaching. You're not gonna learn what's working and what isn't working. I

think the key is to find something that you can hang your hat on, that you know is

driving incrementality, and then build off of that. And, you know, start small and build

exponentially. Like, once you're driving incrementality in an ROI-positive capacity,

even if it's just on one platform, guess what? You can reinvest that money and you can be confident. You know,

it's not wasted dollars. It's dollars that you've earned, that you've re-earned from doing something

in a smart way. And so I think that's a great approach,

especially for smaller marketers. Yeah. I love it. It echoes, like, a lot of times what we see when we look under

the hood of a company. It's just amazing how much money we can save them and how many

things we can rebuild for their benefit that are just a little bit smarter. So kind of wrapping up,

I know you have to hop, Andrew, but what are some of the biggest mentors or sources of

inspiration for you? You've achieved so much in your career and bring a lot to the table. What are some that have inspired you? You know, I said this

before, but I'll reiterate it because I do think

every time I reflect on it, I realize another way in which it's influenced my career. But I

think the culture at Netflix and not all of it, by the way. There were certain parts of it that didn't, you

know, fully jibe with the way that I, you know, like to approach work. But a lot of the kind of

language and approach to experimentation, disagree and commit, those kind of

principles, freedom and responsibility. Disagree and commit is not a Netflix one, but

there's, you know, the same kind of ideas that are pervasive in it. But freedom and

responsibility, context not control, these are sort of some of their principles.

And everything around testing and the general testing culture, I think, has really, really

shaped a lot of the way that I've approached not just technical challenges

and kind of products and strategies that I've built, but I think some of the

softer skill type principles also shaped my management style. At

Snap, I was leading a pretty big team, the measurement team globally, which had a bunch of subteams.

And I really, really loved applying some of those principles, at least the ones that resonated

with me. And we kinda made them our own and, you know, adapted them to the Snap culture, which

is kind, smart, and creative. And, you know, I think it worked out really well. I think there's a lot of portability

to some of those principles, and, you know, I really hope to apply what

we built over there, from a cultural perspective, as well as the Netflix stuff,

into, you know, subsequent roles and into future kind of cultural opportunities.

I love that. That one comment you had earlier in the pod about soliciting dissent, is that

what you said? Yeah. Soliciting dissent. Yeah. Again, it's not explicitly

the Netflix one, but it is, again, built in there, and

I think it's so important. As a scientist, like, if you look at the scientific community,

there's this concept of peer review, and that's literally solicitation of dissent. Here's

my study. Here's my paper. Tell me what I did that was wrong. What did

I miss? What do I have the blinders on for? Right? People will go in there and

shred it, basically. Right? And that's just gonna make your hypothesis and your experiment

that much stronger. And I don't think enough people do that, because it can be, you know,

frankly, a blow to your ego. It's not easy to do. One hundred percent. Right? You say,

what's wrong with this? And somebody finds ten things that are wrong with it, you're gonna feel like, oh, did I fail? But I think

what it boils down to is changing your perspective. It's not a failure. I've just made

myself stronger, more resilient, more robust. I made my work more robust against,

you know, I've just made it more robust in general. I love that. It's such a I feel

that so inherently. I love that growth mindset and willingness to be coached, willingness

to take criticism, willingness to learn, you know, in the face of things that

you hold very dear in your work and in your findings and your experimentation. I'll say, like,

you know, one thing that I've had managers tell me, there's multiple managers telling me this,

but, is Andrew's got strong opinions, weakly held. And I don't know if

that's meant as a compliment or as, you know, constructive criticism,

but I always take it as a compliment, because, you know, I love having strong opinions

because I've gotten to those strong opinions by doing what I believe to be, you

know, thorough analysis. Right? That's why I feel strongly about something. But they're weakly held, because if somebody

says, hey. Actually, you didn't consider this, and you, you know, you're missing

this part of it, I'll be very quick to kind of go back on it. I think the challenge that I

have is I'm a pretty forward guy, a pretty assertive guy. So I

don't always invite, or I sometimes unintentionally uninvite,

that dissent, which I don't ever wanna do, and I try to clarify that as much as I can. I

want people to prove me wrong, but I don't always come across that way. So it's something for me I love

that. to work on. I love that, dude. Love that EQ and self-awareness and

growth and being willing to acknowledge it. And when you bring a lot of,

you know, intelligence and wins and research and facts to

the table, it's understandable. And I love that concept of those strong opinions loosely

held. Andrew, I've been just blown away and appreciative of your time today.

For the audience out there that wants to look up Andrew Covato and all things you, where can they

find you? Yeah. I'm on LinkedIn. I'm trying to put more content on

there, just Andrew Covato. And, of course, you're welcome to check out our website at growth by science

dot com for a flavor of kinda what we do and our approach. And, yes, if you hear anything on

the podcast that you agree with or, more importantly, that you disagree with, I would love to hear from

you. Please, you know, shoot me a post on LinkedIn or a message or email, and I'd love

to start a conversation. Legend. Even in his sign-off, he's soliciting dissent,

living up to those cultural values. Andrew Covato, you're the man. Love the conversation

and can't wait to get this out there. Talk soon. I appreciate you, man. Thanks for having me. Always.