Runpoint: AI Business Transformation Podcast

Unlock the practical side of AI adoption in private equity. 🤖💼
Matthew Hall and Sam Gaddis break down Zapier’s widely shared AI Fluency Framework and then rebuild it for PE—covering deal sourcing, diligence, fund ops, and the coding workflows that actually ship AI products.

What You’ll Learn
Why Zapier made AI fluency non-negotiable for every new hire—and what that means outside tech.

A four-level ladder (Unacceptable → Transformative) tailored to PE functions:

Deal Sourcing

Deal Evaluation & Diligence

Fund Operations

Value Creation & Investor Relations

Concrete tool stacks: Clay, Replit, Claude Code, GitHub Issues, custom GPTs.

The “white whale” of PE ops: a chat interface that understands every deal doc—and why we’re this close.

Sam’s two-terminal setup that turns AI agents into reliable teammates (and kills downtime).

Links & Resources
Zapier AI Fluency Framework → https://zapier.com/blog/zapier-ai-first-hiring-leaning/

Sam’s coding-workflow video → https://www.youtube.com/watch?v=v0o50r4hz24&t=259s

Full PE AI-fluency matrix & examples → coming soon on runpoint.ai

Subscribe for more AI-in-business deep dives → 🔔

Chapters
00:00  Intro
01:00  Zapier’s AI Fluency Framework explained
10:00  Deal Sourcing—spray-and-pray vs adaptive agents
17:00  Diligence workflows with Replit & contract analyzers
24:25  Fund Ops dashboards, data warehouses & the ‘impossible’ chatbot
32:00  Sam’s multitasking Claude Code + GitHub flow
35:00  How you can score your own firm (and help us refine the model)

What is Runpoint: AI Business Transformation Podcast?

Hosted by Runpoint Partners’ founders Sam Gaddis (tech entrepreneur & AI builder) and Matthew Hall (PE operator & growth strategist), Runpoint Podcast strips the hype from artificial intelligence and shows you how to turn it into concrete business results—fast.

Matthew Hall (00:00.609)
Hello everybody, welcome to the Runpoint podcast. Hello, Samwise Gaddis, how you doing?

Sam Gaddis (00:05.612)
Hey, pretty good, Matthew. How are you?

Matthew Hall (00:08.19)
I'm great. Sam, you a Lord of the Rings guy? I don't think I've ever asked you that before.

Sam Gaddis (00:13.326)
I'm more of a Harry Potter guy, I'm not as sophisticated.

Matthew Hall (00:16.279)
I understand, I understand. Poor you. Okay, we have, I think, a great show.

Sam Gaddis (00:21.506)
Wait, where did that come from? Why did you ask me if I was the Lord of the Rings guy?

Matthew Hall (00:24.533)
I called you Samwise, like Samwise Gamgee.

Sam Gaddis (00:26.99)
Okay, that makes sense. See, that's how much of a Lord of the Rings guy I am. It just went right over my head.

Matthew Hall (00:29.024)
Haha

You don't even get that? It just rolls right off you? That's wild. So we are going to talk about AI fluency today across private equity. And I want to introduce something to you. I'm sure you've seen this before. I am sure you've seen it before, because we just looked at it. So let me share my screen.

Matthew Hall (01:00.599)
But so a few weeks ago, the CEO of Zapier, guy named Wade Foster, posted this, I think on LinkedIn first, but then it just made the rounds all over the internet on Twitter and I think other people commenting on it. But it's Zapier's AI fluency framework. So it's how they evaluate potential new hires on AI fluency. And this is a mandate from the top down for the organization that they want pretty much every new role across all of their departments.

to be pushing AI fluency. And so across engineering, product, support, people slash HR, and marketing, they have what unacceptable, capable, adoptive, and transformative mean for those roles. So to give you a sense of this: in engineering, unacceptable is something like the engineer calls AI coding assistants too risky. And then you go all the way up to adoptive and transformative.

It's an engineer that you hear say things like, or in their resume can explain things like, that they're chaining LLM calls with fallback and retry logic. They know Claude Code, Cursor, Windsurf; they're using all of those tools. And then at the very end, the kind of top of the mountain in transformative, you see stuff about how they've actually learned the hard lessons from using those tools. Stuff like shipping LLM-powered features that monitor live metrics, and building AI-first dev pipelines, guardrails, RAG docs, et cetera. So they're kind of pushing

Matthew Hall (02:26.615)
pushing the frontier in those scopes. Sam, have you seen this? What are your thoughts on this?

Sam Gaddis (02:31.32)
Yeah, I was going to say the same. I was going to ask you the same thing. I've absolutely seen this. I've come across both types. Engineering is kind of the easiest one to think about for me, because I have run into engineers at our clients who say exactly that: it's too risky, LLMs cannot produce good code, I like to do everything myself. In fact, when we've hired engineers as contractors to do very specific things,

I run into this with them a lot. In some ways, the more senior the engineer, the harder it is to get them to use AI. And I use it enough (I would not call myself a great engineer, but I use it enough) to know what it can do. And it's been, on occasion, like pulling teeth to get them to use Claude Code, for instance. So I've had to buy the $200 plans for engineers to use Claude Code and then sort of mandate that they use it, and...

In every instance where I've done that, they come around, but it is hard to do. On the other end, there's one guy at one of our clients who I just love working with, because he is completely open-minded about this and has really enabled us to go as far as we can, doing things that we weren't as familiar with before, like deploying infrastructure,

doing real DevOps stuff with AI and we've been collaborating on that. So yeah, it really does run the full gamut and I think this is a great framework.

Matthew Hall (04:02.039)
Well, so my question to you, though, is this: what strikes me about it is that it's across the entire organization. The CEO of Zapier is basically telling the company that they don't want to hire anybody in any function, right, that isn't at least on the AI fluency ladder. And one, you know, that's a big bet. That's a top-down bet. But it raises the question: does every organization need something like this? Or is this unique to businesses like

Zapier or is it a bad bet altogether?

Sam Gaddis (04:34.093)
Every organization could certainly benefit from something like this. I mean, I think I'm preaching to the choir if I say something like AI can actually improve every aspect of the organization. You and I both believe that. I think most people who are paying attention believe that at this point. But more specifically to the question of...

Matthew Hall (04:52.055)
Well, I'll actually push back a little bit, maybe just for the sake of the podcast, because this is a thing I really do vacillate on, back and forth. There are days when I'm fully bought in that AI is going to change literally everything about life and business going forward. And then there are days on the other side where I think, you know, it has really great use cases that are specific right now to tech and engineering, and it's promising in other areas like customer service and that sort of thing.

Sam Gaddis (04:54.881)
Okay.

Matthew Hall (05:20.757)
And I think maybe that'll mature a bit, and it'll be great at certain things, and it will never actually get great at other things. But what I have no time for is the idea that this whole thing is a bubble and we're going to walk all this stuff back in a few years. I think that ship has sailed completely. But I do wonder, you know, for people in HR, or, I don't know what the functions are, or really, are we going to completely...

Is support going to be this thing that is completely overrun with AI? Because you could also make the case that actually, for the best support and customer service, there's going to be a premium put on the human touch now, because everything is going to be so automated with AI going forward and all that kind of stuff. So that would be my pushback. What do you think?

Sam Gaddis (06:06.413)
Well, sure. I mean, of course, yes, the top-level premium support plan will include a human, and that human is going to be using AI, by the way, to assist you. But tell me one facet, give me one example across the organization where AI can't provide some value, and I'll buy into that. Otherwise I'm on the complete opposite side of the spectrum. Yes, it can't solve every single engineering problem. We've run into a bunch of these where, you know, RAG is just not up to the task yet.

If you're following this closely, though, and you're seeing that every single week some new technology comes out that enables a capability that wasn't there before, it's hard not to think that it's going to eventually get there. But even if we say that AI will never be able to do this specific engineering task or this specific HR task, sure, that's fine. That doesn't mean that the person responsible for that task shouldn't be an adaptive AI leader,

because there's 10 other things that are in their purview that AI can absolutely help with.

Matthew Hall (07:07.231)
Right. All right. Well, I think we're more or less in agreement. Good for the conversation anyway. I want to talk through something. So our pitch for this show is we wanted to make a version of this for private equity, which is the niche we focus on. And so Sam and I spent a little bit of time putting together an AI fluency framework specific to private equity. And it's a little bit different than the Zapier one in that

it's not based on hiring. We're not trying to solve hiring for private equity; it's more about mapping the maturity of the organization across the primary jobs to be done in a private equity fund. And so we're going to go back and forth here and talk about what the spectrum of unacceptable to transformative means with AI across deal sourcing, deal eval and diligence, fund ops, value creation and talent search, and investor relations and fundraising. Before we jump in, Sam:

Are these the right rows?

Sam Gaddis (08:08.429)
I think so, for the lens that is private equity, and certainly venture capital as well, and any kind of fund management. I think this is the right lens to look through. What's been interesting for me to see is how many of our private equity clients are also interested in just the run-of-the-mill business functions, like expense management, things that would not necessarily fall into these categories. That would be the category of running your business.

And so I think that category can remain implicit for the sake of this discussion, but certainly there's more rows than what we see here.

Matthew Hall (08:48.235)
Yep. All right. Well, I have a similar read on that, I think, as well. Going through this exercise, I didn't know where to put some of the day-to-day uses that I get so much value out of with AI: using it as a thought partner, bouncing ideas off of it, strengthening my thinking, having it poke holes in a draft I wrote, all those sorts of things. That's just life, that's just work, and it doesn't neatly fit any bucket.

Sam Gaddis (09:16.729)
Here's the thing about that, though. Take your organization, or yourself, or any individual in the organization. It doesn't have to be every single person going all the way to transformative on the AI side, but if you can take them to the level of adaptive or transformative on any one of these elements, then

they're going to automatically use it across all domains, because the ways that you can will just be so obvious. So don't think that you have to focus on how you're going to do finance management for your fund operations. That's just going to happen. If they are good enough to be building stuff in Claude Code for fund ops, then you're going to get all that stuff for free.

Matthew Hall (10:01.227)
Yep. Yep. And I don't know if we meant to do this, but something we sort of drafted off of the Zapier one: these are almost behaviors, things that would indicate that your team is on the adaptive-to-transformative path. And I think if they stick there, they will adopt whatever the latest and greatest tool is to actually get there. So the tools are sort of agnostic. Does that make sense to you? So like, you know, if you're using Claude Code to spin up sub-agents to do your tasks,

you're gonna hop on whatever the next version of Claude Code is, or whatever, like you're kind of thinking, right? So yeah, all right. So let's dig in. I'll kick things off on deal sourcing and then let's alternate going down. Does that work for you? Okay, cool. So on the deal sourcing side, obviously this can mean a lot of different things to a lot of different types of PE funds. I think it's probably more applicable to the higher-volume, smaller-deal-size funds that are going through

Sam Gaddis (10:32.193)
Yeah.

Sam Gaddis (10:41.514)
Yep, sounds great.

Matthew Hall (10:58.421)
more deals, because there's just more volume there than in the ones that are maybe super niche or almost super offline, the ones that may target businesses that are less B2B, for instance. On the unacceptable end, though, I think a lot of these are gonna be similar. Unacceptable is kind of the spray-and-pray manual outreach. No AI enrichment whatsoever. You're sending out a bunch of undifferentiated, unthoughtful emails to people who might fit your thesis spec, and probably getting a pretty low conversion rate from there. Moving up the ladder to capable, your team is starting to use ChatGPT at least to help write emails and be more thoughtful about them, or improve your templates, those sorts of things. And I think you're dabbling with tools like Deep Research, or Perplexity's version of that, where you're doing more

upfront diligence on prospects before you ever reach out, to learn more about the founders, the market, and the history of the business itself. Moving into adaptive, and I think adaptive and transformative are kind of where we come into play. I don't think anybody's really hiring us to get to capable on any of these metrics, Sam. So we can speak more to the blue and the green here. But adaptive, I think, is when you're starting to use

multi-model agents. I wrote ClayGence here because Clay is sort of the leader in this space, as we learned a couple weeks ago in our conversation with Shane. But a lot of these tools do similar things: they use list building and research and AI to do enrichment and scoring of new prospects. So how does everything that we can learn about a business online match up with our thesis and what we're looking for? And can we get an AI

to write an analysis about where this company should fit in our priority stack? And then I think another great thing you would see in the adaptive space here is moving from just vanilla ChatGPT to custom GPTs that can adopt your style, either your personal style or the outreach that is most effective for you across your business, if you have best practices, things like that. I think getting all the way to transformative in deal sourcing is when

Matthew Hall (13:21.749)
the system starts to get better. So you're not only researching and writing emails with AI, you're evaluating the performance of those and tweaking the prompts along the way. So the prompts are either getting trained for you personally as the employee at the fund, or based off of win rates and best practices across the entire fund. And then the other one that I think is really interesting in terms of transformative is that

Matthew Hall (13:50.101)
you're dealing with this problem we're seeing everywhere, which is that everybody's working off the same lists. If everybody's reaching out to the exact same prospects, how do you get there first? How do you get there better? Or how do you think differently about your lead sourcing altogether? How can you start mining data sources that aren't the same three or four providers of B2B businesses? So we have a couple of great examples in this space. On the simplest end, we built a tool that

takes those market maps, the ones everybody's seen with every business in a specific niche sector, pulls out the names from the logos, structures them into a database, gets the domains, and then triggers the enrichment flow from there. Another way we've done this is with YouTube. It's really easy now to get transcriptions from YouTube videos. And so if there's rich data about prospects on YouTube, or in government meetings, or...

Matthew Hall (14:49.015)
We are predicting for commercial real estate clients, Sam, you'll recognize this, what future legislation is going to do to a specific market in Texas. And so we are using AI to infer where the opportunities are going to be, kind of classic skate-to-where-the-puck-is-going, as opposed to where it is right now. Do you have any thoughts on deal sourcing and AI, Sam? How does that sound to you?

Sam Gaddis (15:09.355)
Yep.

Sam Gaddis (15:18.275)
I would just say that I don't see many people coming in in the red, in the unacceptable, among those we talk to. Of course, that's self-selecting: the people who are interested in AI are the ones who reach out to us. But I would venture a guess that most people are already in this capable area, and it's the top 20% that make it to transformative.

Matthew Hall (15:39.123)
Yeah, well, I do think there are certain of our clients, people we've talked to, where deal sourcing isn't a problem. They're at that level, or they have the personal network, or they get deal velocity through other means, and so it's not actually reliant on any of these things. So I think for those listeners: you know where your strong muscles and your weak muscles are across your organization. That's just kind of part of the game.

Sam Gaddis (15:41.003)
Maybe top 10.

Sam Gaddis (15:48.342)
True, true.

Sam Gaddis (15:54.112)
Yes, absolutely.

Sam Gaddis (16:09.109)
Cool. So I'll talk about deal evaluation and diligence now, where I think I'll err more towards the concrete. The way that I thought about this was mostly through the lens of tools. So unacceptable, to me, would be: you have an analyst copying and pasting data from CIMs into Excel to keep track of them, and maybe occasionally they'll ask GPT for some information about terms, or

something that they don't know from the CIMs. Capable, which is where I imagine most of our clients come in, would be maintaining a GPT project where you have a lot of the workflows predefined: this is what we're interested in, this is the schema that we want you to output every time we upload a PDF. And so you upload a CIM, it outputs something in a structured data format, and then you probably copy-paste that somewhere.

Adaptive would be using something like Replit to build a custom drag-and-drop contract analyzer that extracts the key terms that you're interested in and compares them to your desired ideal contract. Replit is really interesting. I've started to get some clients on this and move from the engagement model where we are building everything to one where we're empowering them to build stuff. And we're just starting to do that. It's really fun to watch, because

Replit is kind of like the starting point of building your own real tools. So I think that's super adaptive.

Matthew Hall (17:39.223)
Totally. I'm on a similar thing with a client on the deal sourcing side of things. The first iteration we built for the researcher and email drafter was centralized; it was prompts that I ran on my accounts and my machines. And now we're in a place where

every analyst that does outbound for them will have their own agent in their own tool, using Relevance in this case, that they can personalize themselves and make better. So, you know, they're getting on the ladder of actually building tools themselves. Yeah.

Sam Gaddis (18:06.858)
Yeah.

Sam Gaddis (18:14.079)
Yep. And so you'll see this mirrored across all of the transformative column, but I think for this area it would be using the advanced tools, like Claude Code, or maybe the Grok CLI when it comes out soon, or GPT-5, which is also coming out soon, to do full deal-lifecycle tracking. So uploading all of the documents, analyzing them, building the investment memos, and that kind of thing.

And this is building real software at this point, so you're getting into super-nerd territory. This is something that business users can do. You don't have to be an engineer or have an engineering background. But this is a person who's spent dozens of hours playing with this stuff, because they actually have a curiosity about it and enjoy it.

Matthew Hall (19:03.201)
Totally.

Matthew Hall (19:06.743)
Okay, let's move on to fund operations. So this is all of your internal processes, but also where you're tracking your success as a business. Fund ops is where a lot of our products have come from, which I don't think was our original thesis in this business, but there seems to be a real need in this space

and a lot of room for improvement. On the unacceptable side, it's all manual reporting and KPI roll-ups. Your portcos are sending in their results, and you've got people manually building Excel sheets and dashboards and slides. On the capable side of things, standardizing with GPT, I think, is the big one. So you're taking all of those disparate inputs from your different portcos,

and you're getting them into standard tables and formats for your slides, and you're using AI to do some of the heavy lifting there, replacing some of the analyst grunt work. In adaptive and transformative, you're much more flexible, but you're also starting to build workflows that are specific to your businesses and your reporting and what you care about there.

You're building dashboards that can accept reporting in any format, no matter where it comes from, get it into a standard structure, and run analysis on it, but also run those idiosyncratic workflows that are unique to your fund, to what people care about in your business. And I think those little pieces of Python, those little automated workflows that just get


Matthew Hall (20:49.419)
the report in exactly the way that your leadership team wants, monthly or quarterly: those are the types of things that are easier to do now than they've ever been. And it's often where we're jumping in. It's like, yeah, we can build this dashboard exactly how you want it pretty quickly; let's start there and then we'll build off of that. On the transformative side, it's sort of the dream. And I think we've been asked for this a lot, and

I don't actually know how possible this transformative one is, but I want to talk about it here, Sam. The dream of transformative fund ops is that you have this data warehouse of every deal you've ever done, every deal you've evaluated, every deal after the fact, every piece of paper that's ever come into your business. And you can talk to it in natural language and have it run reports and analyses and all that sort of stuff. There are SaaS providers that are selling this. There are people who are trying to build this.

Sam Gaddis (21:21.446)
It's almost possible.

Matthew Hall (21:46.583)
We've pretty consistently been enticed by the potential of this. And it's almost like we're just not quite there; it's just out of arm's reach. But I wanted to put it on here because I think it's coming soon. What do you think about this?

Sam Gaddis (22:01.256)
Yeah, I totally agree. I've tried to build this. I've said many times, this is my white whale. I will build this. It's really interesting to me that there are companies selling it because we've talked to clients who have bought that software and not been happy with it.

Sam Gaddis (22:19.848)
The primary technology that is used for this kind of thing is RAG, retrieval-augmented generation. And there are just some downsides to that, because you're essentially breaking the documents up: they can't all fit into the LLM's context window, and you have to put all this data somewhere. So you put it into a vector database, but then you have to worry about chunking and how the embeddings are done, and there are a million different parameters for that. And it just is not there yet. If somebody listening to this has found it, please reach out, because we will figure this out one way or another. It is just around the corner. I know that within a year we will have this ability.

I was hoping that with tools like ChatGPT Enterprise and Teams accounts, that kind of thing, you would have that built in, but it's not able to do it. And what am I talking about here? I'm talking about: if you've got 50 or a hundred documents associated with a given deal, and they're Excel and PDFs and Word documents, representing the initial CIMs, the pitch decks, then the transaction documents, then the diligence stuff,

the investor updates, and ultimately the exit of those companies, can you build a standardized model from that and get reliable information out of it? Just asking questions like: what was the revenue of the company for the periods contained within our data? That's a harder problem than it sounds. I've been fooled by the simplicity of how that sounds. So it'll get there; it's not quite there yet.

I still have ideas that I want to try on how to build this, but for the moment, I'm sort of waiting to take another stab at it.
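The chunk-and-retrieve mechanics Sam describes can be sketched in a few lines. This is a toy illustration only, not a production approach: it uses bag-of-words cosine similarity in place of a real embedding model and vector database, and the sample "deal documents" are invented.

```python
import re
from collections import Counter
from math import sqrt

def chunk(text, size=40):
    # Split a document into overlapping word windows. Real pipelines tune
    # chunk size and overlap carefully; these are among the parameters Sam mentions.
    words = text.split()
    step = max(size // 2, 1)
    return [" ".join(words[i:i + size]) for i in range(0, len(words), step)]

def embed(text):
    # Bag-of-words term counts as a stand-in for a learned embedding vector.
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, docs, k=2):
    # Only the k most similar chunks (not the whole corpus) would be
    # pasted into the LLM prompt alongside the question.
    chunks = [c for doc in docs for c in chunk(doc)]
    scored = sorted(chunks, key=lambda c: cosine(embed(question), embed(c)), reverse=True)
    return scored[:k]

# Invented example documents standing in for deal-room files.
deal_docs = [
    "The CIM reports trailing twelve month revenue of twelve million dollars and EBITDA of three million.",
    "Investor update: headcount grew to eighty five and churn fell below two percent this quarter.",
]
top = retrieve("What was the revenue of the company?", deal_docs, k=1)
```

The failure mode Sam is pointing at lives in these same steps: if the chunking splits a table in half, or the similarity measure misses a relevant chunk, the LLM never sees the right context, no matter how good the model is.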

Matthew Hall (24:04.713)
Yep, totally. All right.

Sam Gaddis (24:06.664)
I think for the rest of these, maybe we let people grab this from our site, because the remaining two, value creation and talent search, and investor relations and fundraising, are pretty similar. It's along the same lines, but we have some specific examples that I think we'll share. Are you putting together some kind of documentation for this on the website?

Matthew Hall (24:25.545)
Yep, yep. I like what you're thinking here. Yeah, we're going to pull together kind of a longer resource you can catch on our website. We'll put it in the show notes for this on YouTube as well, so you can see examples of all these things and deeper thoughts on each of the levels across the private equity spectrum here. And so, yeah, across here I think we have active or finished projects

in at least four out of five of these rows, so we can kind of bring this to life with products we've actually built, or we can call out other folks who are doing this well. And so we'll put all that together. And I also want to mention: we want feedback on this. So if anybody listening thinks that this is wrong, thinks that the rows are wrong, thinks that there's something we're missing, or that we're completely overestimating the role of AI in something,

let us know. We're trying to get better at all this, and we're trying to act in as good faith as possible, trying to use AI to actually improve these elements of the business. So hit us up.

Sam Gaddis (25:30.758)
I'm almost tempted to make a scoring wizard for this where you can have a conversation and score your company on this too.

Matthew Hall (25:39.019)
Yeah, yeah, yeah, totally. All right, well, I'm going to stop sharing, Sam.


Matthew Hall (26:05.729)
All right, Sam, I know you wanted to talk about a video you recently made about your coding setup. And the way that I would position that is: we're talking about AI fluency and how it's a horizontal skill across an entire organization. And what are comparable things to that in the history of business? Because you couldn't really say that about most skills; most technologies aren't actually applicable to every organization, every sub-department within an organization.

Sam Gaddis (26:34.952)
Yeah.

Matthew Hall (26:35.607)
One that struck me as the best comp for what we're talking about with AI, though, is maybe not self-evident, but it's delegation. You know, the ability for people to clearly state what they want, to manage teams, build consensus around those teams, get things done with the people under you, and keep everybody happy. The longer I work with AI, and the more these agentic workflows and sub-agents and things like that come up,

that's what this work feels like. And I think you are such a great example of somebody who excels at delegation. And even this video you put together about how you use Claude Code to multitask is an example of delegation. So why don't you walk us through that?

Sam Gaddis (27:26.459)
It reminds me of the old Harvard Business Review article, the classic one about the monkey on your back. And it talked about the four different levels of delegation, starting with: you hire somebody, and they come and ask you about every single thing. What do I do now? And you have to come up with it and tell them what to do. And then the next level of delegation is

I think they come up with it and tell you, I think I'm gonna do this and you approve it. And then the next one is they do it and tell you that they did it. And then the next one is they just do everything and you don't have to even think about it. And we're getting to the point of level one right now, not level zero, which is fantastic. And the big insight for me, which I think was actually your idea and I just kind of ran with it, was moving to...

Sam Gaddis (28:17.001)
issue tracking, actual proper issue tracking in GitHub. So what I'm going to start talking about now is super-user stuff. If you are trying to be in that transformative column, and you are actually building things on the leading edge with the absolute best technology you can using AI, it's that kind of stuff. If you're one level before that and you just want to start experimenting with building tools for your business, try Replit. R-E-P-L-I-T; people always ask how to spell it.

Try Replit, it's great. It's the perfect starting point for that. But if you want to go beyond that, the answer currently, and I'm sure it won't be this in a month, is Claude Code. And the thing that we figured out to pair with Claude Code that just makes it really sing is GitHub Issues. And so the workflow that I've outlined in this video essentially uses two terminals. Everything now is terminal-based. So if you've never gotten into the Mac terminal or the PC terminal,

it makes you feel like a hacker. It's really cool. It's fun. You can essentially do everything without even looking at your code base. You'll still use an IDE to understand what's happening with the code and put in your API keys and things like that, but for the most part you're operating in the terminal. And my workflow now is essentially having two terminals open for the same project. One of them is the project management and GitHub issue tracker;

that terminal is not creating any code at all, it's just managing issues. The other one is actually doing the code. So I will come up with a list of issues, sometimes ten, in the issue tracker: I need to migrate my database, I need to do a design review, then I need to implement the outcomes of the design review, and whatever else. And then over in the actual code runner, I will say,

I actually have a custom command that's just "go", so slash go, and then I type in the issue number, so /go 64, and it'll just run with it. It'll go grab the issue from GitHub, and because the issues are written by Claude Code, they're deeply thorough, well structured, and perfectly clear, and it'll start. This has been really transformative for me, and one of the things that makes it work so well is you're taking...

Sam Gaddis (30:43.961)
all of the documentation out of your code tree. Every time you add a new file to your code tree, you're expanding the context, expanding the token usage of that code tree. That makes things less clear for the LLM, and you reach your usage limits faster. Most of that stuff doesn't need to be in there. You need your CLAUDE.md file and your README file, and that's it. If you are doing some big refactor or

adding some new feature and you need to create documentation for it, there's no reason that documentation needs to live in the code base itself. It should live somewhere else, and GitHub is the perfect place for it. Now you can manage all of that through there. What else should I say about that? I think that's the basic overview. It's covered in detail in the video.
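[Editor's note: for anyone replicating the setup, Claude Code picks up custom slash commands from Markdown files under `.claude/commands/`, where `$ARGUMENTS` stands in for whatever you type after the command. A minimal sketch of a /go-style command follows; the prompt wording here is our assumption, not Sam's exact file.]

```shell
# Sketch of a custom Claude Code slash command like Sam's /go.
# A file at .claude/commands/go.md defines /go; $ARGUMENTS is replaced
# with the text typed after the command (e.g. the issue number in "/go 64").
demo="${TMPDIR:-/tmp}/claude-go-demo"
mkdir -p "$demo/.claude/commands"
cat > "$demo/.claude/commands/go.md" <<'EOF'
Fetch GitHub issue #$ARGUMENTS (for example with `gh issue view $ARGUMENTS`),
read the full issue body, plan the change, implement it, and reference the
issue number in your commit message.
EOF
cat "$demo/.claude/commands/go.md"   # confirm the command prompt is in place
```

With that file inside your project's `.claude/commands/` directory, typing `/go 64` in Claude Code expands the prompt with `64` substituted for `$ARGUMENTS`.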

Matthew Hall (31:29.559)
That's the basic overview, but that video has helped me quite a bit, and I think most people could benefit from it if they're at all interested in taking their AI usage to the next level and adopting things like Claude Code. The comp that I see, though, is what we all see on LinkedIn: these kind of

bullshit artists showing you diagrams of every role in a business they've made an agent for. You've probably seen it: it's like an org tree with 35 agents. There's your PR agent, there's your marketing ops agent. Exactly, exactly. I don't think any of those are real, frankly, but this is the first step to actually having a team of things that each have their own use cases.

Sam Gaddis (32:02.908)
Yeah, yeah, yes, so annoying.

Sam Gaddis (32:09.158)
It's running my whole marketing organization. Yeah.

Matthew Hall (32:23.933)
And it's so much simpler than you think. You're not organizing every job an agent could do. You're just starting with two: a PM and your developer. This could be different for your business, so what is it for you? Maybe it's a copywriter instead of a developer, but it's somebody whose job is to keep track of the work and clearly spec it, and somebody whose job is to actually do the work itself. And if you can start there and have them share

a shared punch list, a shared task list. That's all GitHub issues is, and it works because both of them handle it well, but you could swap it out for something else. So you've got somebody who's always making sure the documentation is clear, who never writes code and just organizes the work, and somebody doing the work. It unlocks multitasking. I've seen you do this: you've got three different products going at once because you've eliminated all the downtime of watching

the AI write code. You can just go across all these issues across three different projects.

Sam Gaddis (33:28.85)
Yeah, I mean, the time an AI can spend writing code used to be something like 30 seconds: it would churn on something for 30 seconds and maybe write a file or two. That has gotten so much longer. Now we're at five or ten minutes, sometimes a little more, so you don't have anything to do while it's building out your database or whatever that major task is. So you can either go to Twitter and doomscroll, or you can organize your project

Sam Gaddis (33:58.024)
or work on some other project. And if you do get into the coding, the one thing I will say you have to learn if you're trying to go into the transformative side of actual coding (like you said, you could probably do this for things that don't even involve code or GitHub, maybe through Trello or something like that), the one old-school prerequisite you just have to slog through is Git and GitHub. You just have to learn what commits are,

what remotes are; just learn the basic technology. Make yourself a GPT that teaches you, or watch some YouTube videos. It'll take you a few days to pick up and a month to get good at, but once you do that, it unlocks so much. I'd say it's worth it for anybody who's really interested in this stuff.
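[Editor's note: the bare-minimum Git vocabulary Sam mentions, commits and remotes, fits in a few lines. The remote URL below is a placeholder, not a real repository.]

```shell
# Minimal Git vocabulary in action: a repo, a commit, a remote.
repo="${TMPDIR:-/tmp}/git-basics-demo"
rm -rf "$repo" && mkdir -p "$repo" && cd "$repo"
git init -q .
git config user.email "you@example.com"   # an identity is required before committing
git config user.name  "You"
echo "hello" > notes.txt
git add notes.txt                          # stage the file for the next commit
git commit -q -m "Add notes"               # a commit: a saved snapshot plus a message
git remote add origin https://github.com/you/demo.git  # a remote: a named URL you push to
git log --oneline                          # lists the single commit we just made
git remote -v                              # lists the remote we just registered
```

From there, `git push origin main` is what actually sends commits to GitHub, which is where the Claude Code plus GitHub issues workflow picks up.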

Matthew Hall (34:43.905)
Well, great. I think that'll do it for our conversation today. Quick recap: we'll put together and publish a lengthy article with the Zapier AI Fluency Framework, our version for private equity, and examples of those things. And you can grab the video Sam shared on his workflow. Maybe we'll turn that into a document as well for people who don't want to watch the video, and we'll share that too.

Sam Gaddis (35:08.21)
Yeah, great idea.

Matthew Hall (35:13.387)
Please send us any feedback you've got. And yeah, if anybody can help Sam solve his white whale, the natural-language chatbot (for lack of a better term) that has access to every deal doc you've ever done and can run analyses on them, let us know. We can sell it. If you've got it, we can sell it. So let us know. All right.

Sam Gaddis (35:34.662)
Yep, absolutely. All right, see you Matthew.

Matthew Hall (35:38.54)
Yeah, bye.