Chuck Yates Got A Job

AI isn’t just a buzzword here; it’s reshaping how oil and gas gets work done. Collin McLelland breaks down what he’s building at Collide, from tackling the nightmare of legacy systems to creating smarter, vertical AI tools that actually fit the industry. He and Chuck swap stories on why community and knowledge sharing matter as much as the tech itself, how AI speeds up decisions and cleans up workflows, and what the future of jobs in energy might look like when machines and people work side by side. It’s equal parts practical advice, behind-the-scenes insight, and a good dose of Chuck’s trademark humor.

Click here to watch a video of this episode.


Join the conversation shaping the future of energy.
Collide is the community where oil & gas professionals connect, share insights, and solve real-world problems together. No noise. No fluff. Just the discussions that move our industry forward.
Apply today at collide.io

00:00 - What is Collide
02:28 - The Catalyst for Collide
04:26 - What is RAG
08:56 - Collide vs. Collide AI
12:40 - Community Over Everything
14:55 - Collin’s Well Spacing Story
17:05 - Collide Community
19:57 - Collide AI Enterprise Software
24:34 - AI in Oil and Gas Today
28:20 - AI: Correlations vs. Causations
30:28 - AI and Questioning Techniques
32:40 - Value of AI: Search and Correlations
35:10 - What AI Can Do for You Today
35:55 - What’s Next for Canopy
37:08 - Future of Work Insights
41:40 - Extracting Tribal Knowledge
46:14 - Market Size of AI in O&G
49:25 - Chuck Got a Job

https://twitter.com/collide_io
https://www.tiktok.com/@collide.io
https://www.facebook.com/collide.io
https://www.instagram.com/collide.io
https://www.youtube.com/@collide_io
https://bsky.app/profile/digitalwildcatters.bsky.social
https://www.linkedin.com/company/collide-digital-wildcatters

What is Chuck Yates Got A Job?

Welcome to Chuck Yates Got A Job with Chuck Yates. You've now found your dysfunctional life coach, the Investor Formerly known as Prominent Businessman Chuck Yates. What's not to learn from the self-proclaimed Galactic Viceroy, who was publicly canned from a prominent private equity firm, has had enough therapy to quote Brene Brown chapter and verse and spends most days embarrassing himself on Energy Finance Twitter as @Nimblephatty.

0:19 So I do love the fact that I was hanging out at Digital Wildcatters/Collide for three years before I actually got a

0:30 job. Yeah, no. Yeah. You made it a long time. That was a long internship, apprenticeship. You tried fighting the inevitable as long as you could.

0:41 You know, it's funny I think people appreciate this, but Chuck's been working without a salary for a while now. And Chuck is like, Hey, look, someday I probably want to get paid, but right now

0:56 I don't want a salary because I don't want to be told what to do. I don't want to have to come to the leadership meeting and all this, and you still do it anyways, but you're just like, I don't

1:03 want to be told to do it.

1:06 Fair enough

1:10 You know, the serious side of that is, at the end of the day, we just have to go raise money.

1:18 Being a VC-backed company and all. And yeah, if paying me takes a month off our runway, that doesn't do me any good, you know? Yeah, for sure. Yeah. Well, we got you on health insurance now.

1:30 Yeah, we did. We're making it work. And it's pretty good health insurance. Yeah, we got good health insurance. I went to the ENT the other day and, fair warning, way too much information here, went to the urologist

1:43 the other day. And then it was back to the ENT.

1:48 And now they put me on testosterone, I've just become the old man and - Yeah, you got your testosterone max. And I'm gonna go ahead and give you this for you to beat me over the head with

1:57 it. Yates males, historically - I've always been low testosterone. My doctor's like, I don't treat numbers though, I treat patients. And so finally I went on testosterone, and the insurance

2:09 actually pays for it. Oh, nice. Yeah, we should do an episode where we analyze your hormone levels and optimize you. Yeah, you should do like a six month program of, 'cause you're on creatine now.

2:20 I'm on creatine, you've got me broing out. You're gonna be a beast here in six months. Yeah, yeah, not exactly. So do this, I think a lot of people have heard, but tell people what we're doing

2:33 now at Collide, because I still bump into people that are like, Hey, the podcast, the Energy Tech Nights. That's a hell of a transition to go from testosterone to what we're doing with Collide. Don't

2:46 forget that you're all a just. Yeah, I think it's good to actually take a step back, especially for people that have followed us along the journey with Digital Wildcatters. It's actually

2:58 cool. I had a friend text me this morning, and I was looking through my calendar for something old, and I found this meeting invite from 2020, coming to talk to you, and you were telling me

3:07 about the initial ideas for what became Collide, and it's just cool seeing it come to fruition, 'cause we have been working on this for a long time. But, you know, background: 2020, it

3:19 all kind of starts with the great crew change, you know, we had this bimodal distribution of age in the workforce. You had boomers and millennials and not a lot of Gen X in between. And so

3:27 this created - For the record, I am Gen X. I made it by three years. I'm not a boomer. Chuck is one of the few Gen Xers in oil and gas. And this opened up a few, I'd say, problems and opportunity sets.

3:40 And one of those problems was that the industry operated off of legacy software, software that was developed in the 90s and 2000s. You go to conferences and people would be showing off their

3:49 software solutions and they just looked old, and you contrast this with what was happening in Silicon Valley. It's just like, you know, the industry was always 10 years behind in digital software.

3:58 And so you started seeing these oil and gas tech startups start popping up. And so we started the podcast back in 2018, the Oil and Gas Startups podcast, and really were trying to fill that

4:10 void, because there was no community or platform for tech companies in the industry. And obviously, you know, we were really successful in that and built community around the Oil and Gas Startups podcast

4:22 and Digital Wildcatters, just building a community of people that wanted to change the industry. And around 2020, 2021

4:31 is when I started getting the idea for what is now Collide. And there were a couple of different reasons for starting it, but the main one that I saw was that there wasn't a place on the internet for oil and gas

4:42 professionals to hang out. So you had LinkedIn. LinkedIn sucks, everyone hates it. You know, the majority of the industry is not on LinkedIn. You had little pockets on other social media platforms like

4:53 Reddit and Facebook and Twitter, you know, you had energy FinTwit growing over on Twitter. And one day I saw an engineer get on Twitter and make a post, and he said, hey, I have a lot of iron carbonate

5:04 buildup in my saltwater disposal well, does anyone know why that is? Two other engineers come in and give really technical, valuable answers. And then two days later, that information was just lost into

5:13 the ether. And that was really the catalyst. It was like, hey, why don't we have a community knowledge sharing platform where engineers and ops can come, ask questions, share information, build

5:22 their network, and then we'll index all those conversations and we'll be able to query those and use those in the future. And there's a platform that I took a lot of inspiration from, which is

5:34 called Stack Overflow. And Stack Overflow is used in the software engineering world. And you go there, you ask a question about your code. The community would answer it and then Stack Overflow

5:43 would index those answers. So the next time someone had a question, they could ask that question. And to be specific on that, it's like a Microsoft engineer will hand a Google engineer some code.

5:53 Yeah. I mean, that open source - Yeah, you know, the tech industry is so much further ahead than oil and gas when it comes to collaboration and an open source mindset. And so I took that and was

6:05 really inspired by it. And so we started building out that part of the platform, the community knowledge sharing platform, which - Sometimes you and I should record a podcast on how many

6:16 iterations that took, and the failures. You know, it took us three times of launching to gain traction, and that's just - I always say shittiest, shittier, shitty. Yeah. Those are

6:26 the previous three versions. That's the progression. And the whole idea is, you know, I remember, Chuck, I remember a conversation with you. I mean, this is back in like 2022. And you're like,

6:39 man, here's the story, we're gonna be doing all this with AI. And I was like, hold on, hold on, hold on. We're not doing that stuff with AI yet. But that was always the vision, is that, hey,

6:46 we're gonna be able to take information and be able to find it and access it with AI. And, you know, we've been using AI for a while in the business. You know, all the way back in 2018, we're

6:55 using AI to transcribe our podcasts. And, you know, everyone wanted transcriptions or blogs. And back then, the only way to do it was to pay people to do it. And you had like services, I think

7:04 one of them was called like RevRiding or something like that, where you could pay like a dollar a minute to get a transcription. Well, essentially what they do is they just hire teams in India and have

7:14 them sit there and manually transcribe it. We started using AI, and back then, it just wasn't good. It wasn't accurate, you know, it was about 70% accurate, and it was expensive. And so a joke I

7:25 always like to tell is, in our podcast, every time we mentioned Anadarko, in the transcript it would be Banana Darko. So I just have Banana Darko imprinted in my brain. But when ChatGPT 3.5

7:35 came out, I saw a couple of different things that were about

7:42 to happen. The first was that immediately I saw that the way that content and data was structured around the internet was about to change forever. These models were really good at answering

7:52 questions about coding because they had siphoned all of the data from Stack Overflow's community and trained on that, taking all the content from Twitter and Reddit. And so immediately I was like,

8:03 look, these companies are gonna get extremely strict with API access, like that is their data. That's what's special to their platform. It's like search is gonna change. Google search business is

8:14 potentially at risk and all the people that depend on search are gonna be at risk because traffic's not gonna come to those websites anymore, those blogs, things of this nature. So just saw that

8:26 the way that data and content was structured around the internet was gonna change. And then the thing with large language models being used at the enterprise level was that they just couldn't be used.

8:36 And there's several different reasons. Back then, they were really good at generating an answer, but they didn't have any transparency. They were a black box. They couldn't tell you how that

8:44 answer was generated or what the source content was. You had problems with hallucinations. We still have problems with hallucinations today. You had problems with data security and privacy. And so

8:56 I started really researching and a couple years ago found a technology called RAG, retrieval-augmented generation. And RAG is this framework that uses large language models. But what I really

9:08 liked about RAG was, oh hey, we can cite sources for our answers, and we can also give instructions and tune models to say, I don't know, when they aren't able to access information to answer that.

9:20 And that's something that was super important in oil and gas: to return an answer and say, I don't know, instead of hallucinating and giving some bullshit answer. And so we started building out our

9:29 version one of - How we talk about RAG is, it's providing context and, in effect, some rules for the interaction with that foundational model. Yeah, I mean, it's a good way to think about it. That's

9:43 what I liked about it was we could essentially set parameters of, hey, you need to access this information, you only answer a question if you have confidence in the information that you're

9:54 sourcing, things of that nature. And so we built out the first version of Collide AI, which was an energy-specific RAG. And our sources of content for that first public version were, hey, we're gonna

10:08 take public information that's out there. So think energy-related podcasts, blogs, articles, earnings reports, things of that nature. If it's in the public sphere, boom, we're gonna scrape

10:17 that and bring it in. Then you add the second source, which was third-party data and content. So we went and got deals with the Society of Petroleum Engineers on their abstracts, the Society of Exploration

10:28 Geophysicists on their white papers. And then the third source would be the conversations that happen within the Collide community forums. And so you could think, someone would ask a question,

10:37 hey, I'm having issues with this electric submersible pump in 4½-inch casing, Collide AI would give you an answer, and then it would cite its sources. And the sources may be a book, it may be

10:47 an SPE paper, or it may be a conversation between two production engineers that happened in Collide. And so we started building that. And what ended up happening is we took that augmented search

10:57 technology and started deploying it into E&Ps and OFS companies on their proprietary data. And you go back to that generational great crew change and the industry operating off of legacy software.
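The retrieval pattern Collin describes, only answer when retrieval clears a confidence bar, and cite every source, can be sketched in a few lines. This is a hypothetical toy, not Collide's implementation: the corpus, the keyword-overlap scoring, and the threshold are all made up, and a real system would use embeddings for retrieval and a large language model for the generation step.

```python
# Toy RAG-style sketch: retrieve scored snippets, refuse to answer below a
# confidence threshold, and cite sources. All documents here are invented.
CORPUS = [
    {"source": "SPE paper (hypothetical)",
     "text": "iron carbonate scale in saltwater disposal wells often indicates CO2 corrosion of tubing"},
    {"source": "Collide forum thread (hypothetical)",
     "text": "we treated iron carbonate buildup with a continuous acid program at the disposal well"},
    {"source": "Earnings call transcript (hypothetical)",
     "text": "capital spending guidance for the quarter remains unchanged"},
]

def score(query, text):
    """Crude relevance score: fraction of query words found in the text."""
    q = set(query.lower().split())
    return len(q & set(text.lower().split())) / len(q)

def answer(query, threshold=0.3):
    """Return (answer, citations), or 'I don't know' when nothing clears the bar."""
    hits = [d for d in CORPUS if score(query, d["text"]) >= threshold]
    hits.sort(key=lambda d: score(query, d["text"]), reverse=True)
    if not hits:
        return "I don't know", []  # refuse rather than hallucinate
    # In production, the retrieved snippets would be passed to an LLM as
    # context; here we just return them alongside their citations.
    return " / ".join(d["text"] for d in hits), [d["source"] for d in hits]

ans, cites = answer("why is there iron carbonate buildup in my saltwater disposal well")
```

The refusal branch is the part the transcript emphasizes: when retrieval finds nothing relevant, the system says "I don't know" instead of generating an unsupported answer.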

11:11 So data lives in all of these legacy software systems and then the majority of it actually lives in CSVs and

11:17 PDFs. And so oil and gas engineers and operations teams spend 30 to 40% of their time just searching for information to use in their daily workflows. So the whole idea for us coming into the E&Ps and

11:29 OFS companies was, hey, we're going to take all of your data. We're going to take all your PDFs, all your CSVs, all your unstructured data, and we're going to index it and make it to where

11:32 your ops and

11:39 engineering teams can find information in a matter of a couple seconds instead of hours or potentially days. And as the concept and the technology started evolving, we really realized that augmented

11:52 search was the foundation for building truly agentic workflows. The thing is, you know, we're going to talk AI all we want, but at the end of the day, it's still a data problem, and you have to have

12:01 good data ingestion and organization. And then you can start automating workflows. So our first workflow that we've automated is doing regulatory filings for the Texas Railroad Commission, W-10s,

12:13 forms like this. So the way that it works today is the Texas Railroad Commission will send a notification to an E&P, say, hey, you need to file W-10s on these wells. The engineering team will take that

12:24 notification, they'll go find the wells in their production systems, get the well files, they'll find the data, then they enter that data into a W-10, hope that they don't fat-finger the data in

12:32 there, and then send that off. We've automated that process to where Collide can go in and do that in a matter of a few seconds. Let's do this real quick, just break this down, 'cause I get this

12:44 a lot. We have Collide.io, which is our knowledge share forum, if you will, community, and we have some tools there for folks to use, but then Collide is also our AI enterprise software offering

13:04 that we sell to customers. Yeah, so Collide AI is true enterprise software. This isn't SaaS that you can just pick up off the shelf and run day one. And so, when we deploy in the enterprise, I

13:18 mean, with some of our customers, we're deployed 100% within their Microsoft Azure environment. I mean, this is a pretty heavy lift, both from the CapEx that we spent on our core technology and on

13:30 our deployment teams to get it deployed. And, you know, Blair over at Mercury Fund has been telling everyone that, you know, we're the Palantir of the energy industry. And I think it's actually

13:40 a pretty good - So Charlie said, oldest daughter, Charlie, when I was going through what we did, she goes, Oh, you're a Palantir for energy. Yeah, and, you know, you look at Palantir,

13:47 they have their core product, which is Foundry. And then they have their deployment teams that go in and deploy Foundry into their clients. It's kind of the same thing. You know, we're

13:54 productizing 80% of Collide AI.

13:57 As we move into enterprise clients, there's 20% that's custom implementation and integration based on their workflows and fitting it to their workflows. So there's that. But then, yeah, you look

14:08 at this public facing side of the platform, this community and knowledge sharing platform. You know, it's pretty funny because people used to, the story used to be really confusing to people a

14:21 couple of, you know, two to three years ago when I would tell people the vision of having this community driven platform 'cause, oh, what? It's like Reddit for oil and gas, but you're doing

14:30 enterprise AI. You know who's helped out a ton with that? Elon Musk building xAI and Grok off of Twitter. And actually last week on a phone call, some guy was like, Wait, Collin, sorry -

14:40 are you gonna use your community and the content to make better AI models for oil and gas? It's very Elon Musk of you. And I was like, Hey, to be fair, I came up with this idea before Elon, he's

14:49 just got a bigger balance sheet than I did.

14:53 And so, you know, with the Collide community platform, there's a couple of thoughts that I have on this. And, you know, I remember one VC asking me last year, he's like, Hey, you know, like,

15:05 we're bought into the enterprise, but like, the Collide community, like, is this real? Is this actually needed? And I got pretty defensive about it, because this is needed for the industry. And,

15:16 you know, we do these community onboarding calls every week, and it's so cool listening to them, of listening to why people are coming on, and what purpose that it serves. And,

15:24 you know, the public facing platform, it's free to join. You have to apply, we're trying to keep it an extremely high-quality community of energy professionals, so you have to fill out an application, but

15:34 it's free to join. And we're gonna keep putting money and time and capital into that, because I think that it's needed. I think that it's important for the industry, and I feel an

15:45 obligation to build things that are valuable. I mean, literally our first core value at our company is community over everything. And so I think about how can we deliver value to people and

15:54 build something that's going to help them. So if I can help one person make a connection that gets a job - I'll give you a real story. A guy got on there and he's like, I've

16:03 heard you talk about Collide on Twitter. He's like, What finally got me on here is some E&P engineer was asking for a new chemical vendor and my friend sent it to me. He's like, We're now on the

16:14 third step of the process of getting them as a client. He's like, That's why I'm in Collide. He's like, That's so much value. We're doing that to just create this place for engineers and people in

16:29 the energy industry. Don't get

16:33 me wrong, we're going to benefit from that too. We're going to be able to build better AI products and AI models that we monetize at the enterprise level based off of data that comes from that.

16:44 But this is what I love: it gives us such aligned incentives of like, hey, look, I don't gotta go monetize the community and bombard you with advertisements or things of this nature. We're

16:53 not gonna data mine to send you targeted ads. Like we just want this to be a place that is such a high value exchange for everyone involved. And then we're gonna be able to use that to build better

17:04 AI products in the future. And the thing that gets me about that, as someone who got fired because we couldn't figure out spacing - I mean, I think you've heard this before. I've never heard why you

17:16 got fired. I just thought it was 'cause you're a jackass at times, but I've never heard that it was due to well spacing or anything else. Yeah, well, you're more than welcome to have

17:27 my former partners on to tell that side of the story. But no, in 2019,

17:35 We spent 951 million dollars

17:40 in CapEx across the portfolio. And it was basically all D&C costs, 'cause we had bought our acreage. And if you used end-of-year reserves, the IRR on our

17:54 drilling was 33%. The only problem was, given what we had paid for acreage, we needed about

18:03 45%. And so that was us just taking our lumps on spacing. We literally did it better than anyone else in the industry, but obviously not good enough for me to keep my job. Yeah. Yeah.

18:18 And my point there is, in hindsight, we should have been going to people, our offset operators, saying, Hey guys, let's run a pilot together. Let's share all our data. Let's lose a little bit

18:32 here so we can all benefit from this. Yeah. And we could have been more thoughtful about it. I'll tell you a couple of anecdotal stories. I'm sitting at a private equity fund and one of the partners

18:44 there asks a question, he's like, Well, Collin, if we invest in Collide, will that keep other private equity firms from participating in the platform? And one of the younger guys there

18:52 is like, No, you don't get it. It's about open source and collaboration and knowledge sharing. And I asked, I was like, Hey, how much capital did your portcos burn trying to figure out well

19:02 spacing? And everyone kind of smirked a little bit. It's like, imagine how much more advanced the industry would be, and all boats rise with the tide, if we're sharing best practices on

19:11 operational challenges. There is not a lot that is proprietary in shale anymore.

19:20 And so now it's about operational efficiency. And the other thing that I think about too is, I was at another private equity fund, and at the end, they're like, Hey, sometimes we

19:30 get calls from our portcos, and they're having a really bad operation, a fishing job, and by the time they're calling us, it's pretty bad, 'cause we're finance bros, we're not a lot of

19:35 help with this, but it's more so about, like, hey, how much money do we want to spend on this? He's like,

19:42 and what we do is we get all of our portfolio company management teams together, and we get on a conference call and just, like, hey, you know, what happened up to this point? You know, what are

19:50 potential avenues forward? And he's like, that'd be a perfect conversation to happen in the Collide forums. And so that's really what the idea is for the Collide community: sharing information

20:03 on operational challenges. These aren't things that are proprietary. Well, and you can't arbitrage them. I mean, if you figured out how to get wells back online in nine days instead of 15

20:16 days, you're not gonna go buy a property 'cause you can do that better and make significant money. But we all benefit if you share that with everybody. I don't know, I think what's

20:25 interesting is that this is the way the industry is. I mean, you go back to the great crew change, with millennials taking leadership positions, this mindset is shifting. I mean, and

20:37 it's not even just millennials, you know. Take one of our clients on enterprise for Collide AI, a very, you

20:47 know, storied oil man in his 60s, maybe 70s, I don't know his age. And even he's kind of giving the nod of, like, hey, someday I may be willing to anonymize our data if it goes to the, you

20:60 know, collective pool of others that are - And cut my software bill. Yeah, give me a discount, yeah. And so I think that this idea of collaboration and knowledge sharing is actually gaining a lot

21:11 of steam, and a lot of that's being driven by the great crew change. But yeah, so that's a big thing for me, is just this collaborative open source mindset and building a platform where people can

21:23 come build their network and find opportunity. You know where we got the name Collide for the app? Some people think that it's after Collin, and it's not. It never even, like, dawned on me that it

21:36 looked like my name. But when we did events for Digital Wildcatters, we had a thesis around collisions and, like, removing friction. Like, you'd go to these old oil and gas conferences and they're

21:46 just stuffy and stale. And it's awkward to walk up to a group of like five people on a showroom floor and like talk to them. And so our whole thesis around events was like, how could we make them a

21:57 low friction environment, make it like a party where you'd inject content into it and make it in a really close environment where you had to bump into people. And that was kind of like the culture

22:07 was like, okay, it's all right to just talk to random people. And anyways, those collisions that would happen between people were the catalyst for collaboration and ideas and opportunity. I can't

22:18 tell you how many stories I have from our events of people doing deals, people partnering up on businesses, people getting jobs from running into other people at our events. That's what we wanted

22:28 to do in a digital environment with Collide: give people a place where they could have those collisions and interactions. And you know, the other part of it too is I saw that people in this

22:39 industry didn't have a good

22:43 way to actually show their true ability and skill. And so you'd look at the software engineering industry, and if you wanna hire someone, it's like, hey, you know, send me your GitHub,

22:54 take this coding test. You know, you had proof of work that you could do to see if someone was actually a good software engineer. You didn't have that ability in oil and gas. All you had was a

23:03 static resume that said, hey, Susie worked at Pioneer for five years and Endeavor for three years, but that doesn't actually tell you anything about someone's true ability and skill. And so what

23:14 we started doing was building out, spending a lot of time on user profiles so that people could develop a dynamic resume. So you can get on there and you can see, You earn points for all of your

23:27 contributions. They're called Watts, they're the little lightning bolts. So every time that you make a post, you like something, you comment, you get Watts, and now Susie goes to apply for a

23:37 job at Diamondback. Diamondback can say, oh, hey, she worked at Pioneer for this many years, Endeavour for this many years, but man, she's got a million Watts in collide. She has a ton of

23:48 social proof and validation from her peers. She's super involved in the community. You go see a history of her posts, and so it gives you a much more qualitative element on which to base someone's true

23:60 ability and skill than just a static resume. And so that part of the platform is something that I'm super passionate about, because I think that it's so needed in the industry. But all that

24:09 does is help fuel our enterprise AI product. And

24:15 I don't think that there's a lot of companies out there that are truly community driven, and I think now that Grok and xAI exist, it's a pretty good analogy of using Twitter and the user

24:28 generated content to train better models. It's something similar that we'll execute on, and so that's how we think about the whole platform coming together. So,

24:41 chat through the AI enterprise software: what are we doing there, what are we thinking there? Yeah. I wrote a blog post on this the other day, that enterprise AI is hard. I saw a Silicon Valley founder

24:56 get on Twitter and talk about how they were pivoting their business. After three months, they were no longer building enterprise AI because it's too hard and it is. I mean, these are very, very

25:06 non-trivial things that we're doing. I don't know if you saw this MIT report that came out a few days ago. I haven't read it yet. I just read the journal article about that. Okay. Just kind of

25:16 the short version of it is they said that they surveyed all these companies, and 95% of enterprise AI pilots are failing. And most of these pilots are internal teams that are trying to build with GPT

25:29 and co-pilot. And they're just not finding success. And it kind of goes back to what I was talking about originally is that large language models just wouldn't be good at the enterprise level

25:38 because you can't solve for that last mile problem. Like, yeah, you'll see like flashes of brilliance and like, oh man, it'd be cool. Like if you could do this, but it can just never quite get

25:47 you there. And so

25:52 you also look at that report, and it showed that the companies that were working with vertical-specific AI vendors, like Collide, are having 65% success. So I have a deep belief that vertical AI

26:04 platforms are going to win. Like, you're gonna have your foundation models, and platforms like us will use those foundation models within our - you know, we use GPT and Claude and all these different

26:14 technologies that are out there. But when it comes to a vertical AI, you have to have deep domain expertise to understand the problems that the industry deals with. Our team has really deep domain

26:25 expertise in oil and gas. We understand these problems. And so what we're doing when we go in is we're really focused on operations teams in upstream and now in midstream. And so there's a couple

26:41 of different applications. Let's talk about a few of them. One E&P is using it to extract terms out of their midstream contracts. The other thing they're doing that I think's really cool: the

26:54 biggest problem with midstream contracts when you're an E&P company is what acreage is dedicated where. Yeah. And so he takes the drilling schedule, goes in and says, this location, which

27:10 contract is it dedicated to? Boom, because this company, I believe, has 35, 40 midstream contracts, marketing contracts. Oh, it's dedicated here. Then he looks at the rate and says, okay, that's a pretty

27:22 good rate.

27:24 Or he sees the rate and he goes, man, that's not a very good rate, I should get an exception. You know, it's not going to be profitable for them to lay a line to me, whatever the case may be.

27:34 Yeah. And the way he used to do that is literally a conference room table, all 35 marketing contracts open to Exhibit A, where it has the designated acreage,

27:45 or dedicated acreage. And you know, where is it? Yeah. Yeah. And there are workflows and problems like that where they're just extremely inefficient. And it's actually not value add for the subject

27:58 matter expert to do that work, right? Yeah. And so it's actually like, I think this is a big, we should talk about this more in depth later. But, you know, everyone talks about AI taking jobs

28:08 and I just do not see that as the future, especially in our industry. I see Collide as augmenting teams to allow them to focus on the important work, take all the trivial work off their plate. Someone that knows how to

28:19 do your job with AI is going to take your job, not AI. It's a fair statement. And so you look at, you know, the midstream contracts, you know, people are using it for general well file search.

28:30 Like, hey, you know, give me... actually, I'll tell an anecdotal story. An engineer at a big E&P was like, man, I get handed this well. I've got two fish in the hole. It's got all

28:40 these problems. I have to give a recommendation on next steps, but I have no historical context of what's happened in this well. So I go spend two days going through WellView, going through email

28:49 chains, going through Microsoft Teams, just trying to paint a picture of what's happened in this well. He's like, if Clyde could tell me the story of that well, it'd save me two days of work. And

28:58 so some people are using it for, you know, well file search of like, hey, give me a summary of this well. Hey, what ESPs are out on that well? Who provided the wellhead?

29:07 Just being able to ask questions and query data really quickly. We have a mineral fund that's working with us on extracting data out of their revenue statements and matching that against other

29:18 accounting and production systems we have.
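That statement-to-system matching can be sketched as a simple reconciliation pass. This is a hypothetical illustration: the field names, well names, and tolerance are assumptions, not any client's actual schema.

```python
# Hypothetical sketch of reconciling extracted revenue-statement line items
# against a production system of record. All names and numbers are invented.

extracted = [  # what the AI pulled out of PDF revenue statements
    {"well": "SMITH 1H", "month": "2024-06", "volume_mcf": 10_250, "revenue": 24_600.00},
    {"well": "JONES 2H", "month": "2024-06", "volume_mcf": 8_100, "revenue": 19_440.00},
]
production_system = {  # what the production/accounting system shows
    ("SMITH 1H", "2024-06"): {"volume_mcf": 10_250},
    ("JONES 2H", "2024-06"): {"volume_mcf": 9_000},  # disagrees with the statement
}

def reconcile(extracted, system, tolerance=0.01):
    """Flag line items whose volumes disagree with the system of record."""
    discrepancies = []
    for item in extracted:
        key = (item["well"], item["month"])
        recorded = system.get(key)
        if recorded is None:
            discrepancies.append((key, "missing from production system"))
        elif abs(recorded["volume_mcf"] - item["volume_mcf"]) / recorded["volume_mcf"] > tolerance:
            discrepancies.append(
                (key, f'volume mismatch: statement {item["volume_mcf"]}, system {recorded["volume_mcf"]}')
            )
    return discrepancies

for key, issue in reconcile(extracted, production_system):
    print(key, "->", issue)
```

The same flag-the-mismatch loop is the shape of a double-invoicing check: group invoices by (vendor, amount, date window) and surface anything that appears twice.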

29:23 And there's an E&P that is using it to find double invoicing and invoicing discrepancies. And then you look at ProFrac. ProFrac's using it to upskill their field engineers. You can be out in the field

29:36 and say, hey, you can ask anything: how do I prepare a pump for downtime in cold weather? And boom, in two seconds, Clyde AI will take all of ProFrac's internal data and documents and return

29:48 an answer. Say, hey, this is how you need to prepare a pump for downtime if it's short-term, less than 12 hours. Here's what you need to do if it's longer than 12 hours. And then hyperlinks to

29:57 the citations, and you can click on them and boom, it'll open up ProFrac's standard operating procedure right there in the window. And so those are the types of applications that we're working on.
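The retrieve-and-cite pattern behind that answer can be sketched in miniature. Real RAG systems use learned embeddings and a vector database; plain word-overlap scoring stands in for them here, and the SOP filenames and text are invented.

```python
# Toy retrieval sketch in the spirit of RAG: score document chunks against a
# question and return the best matches with citations. Word-overlap cosine is
# a stand-in for learned embeddings; everything here is illustrative.

import math
from collections import Counter

DOCS = [
    {"source": "SOP-014 Pump Winterization.pdf",
     "text": "To prepare a pump for downtime in cold weather under 12 hours, circulate fluid and insulate lines."},
    {"source": "SOP-022 Rig Up.pdf",
     "text": "Rig up procedure: spot the pump, connect iron, pressure test to plan."},
]

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 1):
    """Return the top-k chunks most similar to the question, for citation."""
    q = Counter(question.lower().split())
    scored = [(cosine(q, Counter(d["text"].lower().split())), d) for d in DOCS]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [d for score, d in scored[:k] if score > 0]

for doc in retrieve("how do I prepare a pump for downtime in cold weather"):
    print(f'{doc["text"]} [cite: {doc["source"]}]')
```

In a production pipeline the retrieved chunks would be passed to a language model to compose the answer, with the `source` fields becoming the clickable citations described above.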

30:08 And I think, if you... One thing that I've noticed, a point I wanna make, 'cause when we're out there selling software, we have to do two things. We have to level set where the technology is today,

30:22 but then we also have to sell what we can do in the future, where this is going. And I think, level setting the technology today, you have your well search. I also think at this point, AI does a

30:37 good job of showing you correlation, not necessarily causation. It's not telling you why, but it can find things. And my great anecdotal story is, I have a room of 15 people, and you're gonna hate

30:50 this as, I guess, chief revenue officer. Being a good software salesman is kind of like being a lawyer. You don't ask a question unless you know the answer. I've kind of thrown caution to the

31:01 wind. So, and a geophysicist - Chuck's a liability for us, right? Don't tell HR. No, but

31:10 a geophysicist had a very thoughtful question. She goes, Chuck, you're telling me I'm the subject matter expert. You're

31:18 telling me that AI is allowing me to get my answers quickly. But you're also telling me that when I connect disparate data sources, I find relationships that even me, the subject matter expert,

31:28 didn't know existed, and I can't wrap my head around that. And I said, why don't we try it? So I had a database of six wells. And I just went in and said, how would we compare and

31:41 contrast all these wells? And hit enter, not knowing what it was going to do. Well, it was really cool. It came back and said, well, you could look at casing design, you could look at

31:50 completion recipe, you could look at formation, you could look at, you know, where it's located. It also came back and said, I would compare

32:01 EUR and IP30 by the different vendors that worked on the well. And I mean, back at Kayne, I did 125 early-stage assets, kind of lease-and-drills. Yeah, never once did I sit there and go, is

32:13 company A better at fracking than company B? You know, we looked at the recipes and compared recipes with each other. And so the geophysicist went, oh, I see what you're saying. I should go

32:25 look at that. I've never really looked at that. So I think that's the cool thing too. It's not just search. If you can ask a question and get an answer really quickly, you ask 47 questions

32:38 instead of five. And that 43rd question may be a rabbit hole that's totally worth exploring. Yeah, it increases bandwidth and allows us to do work that was previously uneconomic to do. You know,

32:52 I was talking to one E&P. They want to use our document generation on their workover procedures. And he's like, man, if you can generate our workover procedures, we get to move to a four-day work

33:00 week. But not only that, it's like, we can put so much more information in our workover procedures than what we do now, but it's just too much of a burden for our engineering teams to do

33:08 it. And so I think what AI is really good at right now is being able to search across

33:16 a large volume, just a vast data set, and it's able to find those correlations, like you mentioned. But this is where the magic happens to make that happen. You

33:28 know, for us, it's so funny. Like when people are talking shit to me on Twitter, they're like, oh, you're a GPT wrapper. I'm like, actually, if we're anything, we're an Azure wrapper. If you're gonna call us

33:36 anything, at least get it right. Like, you have no clue what you're talking about. But where our magic sauce happens is in our data pipelines: how we process data and documents, how we ingest that,

33:45 how it gets stored, and ultimately how we have really accurate retrieval of that information. Because, you know, the question that I get a lot when I go into an E&P or an

33:58 OFS company is, why can't I just do this with Copilot or GPT? And it's a very fair question. And there's a couple of different things. You know, one, big tech cannot move into these industry-

34:12 specific niche workflows. And just like we talked about earlier in that MIT review, 95% of AI pilots that are built internally with these tools are failing. And even if you just look at

34:23 like basic search, I mean, there's so much research being done on what's called context stuffing. And so context stuffing is essentially like, hey, can I just use a large language model,

34:33 you know, like GPT, and put all my documents in it and search across it? And I do this all the time with Claude. You know, I'll put legal contracts into Claude, but I'm talking

34:43 like, you know, three, four, or five contracts, not thousands of documents. The first pilot that we ever did was 20,000 PDFs. You're talking hundreds of thousands, sometimes millions, of pages of

34:51 technical documentation. And to do this, this is where you have to use retrieval augmented generation and have these data pipelines where you're embedding documents and storing that in a

35:02 vector database. And then, you know, it's not just one form of search. There are multiple methodologies that we're stacking on top of each other when it comes to search. So these

35:11 are really intricate systems that we're building. But, you know, that's the thing: once you get data stored and processed, and the ability to query it, that's what sets the stage for these

35:26 truly agentic workflows to be able to retrieve information and use it. But yeah, to your point, it's like, hey, now instead of asking five questions, you can ask 50. You're still the subject

35:34 matter expert at the end of the day. And you know what information you're looking for, but instead of having to go thumb through, you know, potentially even a physical well file, you can

35:43 just query the system and find it. And so, yeah, I think that all the value in AI today sits in search and then being able to find weird correlations. You know, one question that I get from all the

35:55 old-timer CEOs is like, okay, so can this AI tell me how to drill the best well we've ever drilled? And I always laugh, and I tell them, I'm like, okay, not yet. And when it does, I'm gonna

36:05 charge you a lot more money for it. But what it can do today is your engineers can sit there and be like, hey, what was the highest producing well?

36:16 Okay, it was this well. All right, hey, what casing design did they use? What frac design did they use? You know, what type of proppant, how much proppant? You know, what - Was it a parent

36:25 well? Was it a child? Yeah. And you can ask all of these questions. And as long as it has access to that data, it can get you the answers really quickly and get you to that point of like, okay,

36:32 hey, what does the optimal well design look like for this well? And so, you know, I think that that's really how it needs to be thought about and utilized by different teams and companies. 'Cause

36:48 it's augmented search. And my twist on that is, it's finding correlations. We're generating documents today, like you said, with the regulatory filings. I'm sure we're gonna build somebody's

37:02 workover procedures sooner rather than later. What are we guesstimating in the future? You can go first and I'll go. So, six months, a year. Yeah, I mean, you look at the AI industry, it is

37:18 moving so fast, weekly. I mean - My answers have changed. Me out selling software, my answers have literally changed in a month. Yeah, I mean, new models, new technologies, new

37:30 methodologies get dropped every single week. That's where a large amount of our time - Six months ago, if you would have said, can it handle handwriting? I would have said, nah, can't do it.

37:42 Today, yeah, it does. If you were building retrieval augmented generation 18 to 24 months ago, you were in the early stages of it, and the whole idea of RAG has evolved. It's gone

37:57 from these kind of rudimentary systems to

38:00 very intricate, purpose-built systems. And I think, I can't predict where the industry is going as a whole, but I can predict where we're going. You know, where a lot of our time and money in R&D

38:17 is gonna be spent. It's gonna be in some of our backend technology. So automating a lot of our data pipelines, like our document classification, fine-tuning our embedding models based on specific

38:29 types of documents, things of that nature, to where we can really streamline data ingestion and

38:37 get even higher accuracy on that. Then agentic orchestration: fine-tuning, let's say that we fine-tune an instance of Llama 3.5 on land and title, and it's just really good on

38:51 that thing. And when a user asks a question on land and title, it knows to route through that land and title model

38:59 and has the workflows that are associated with that. So, getting into agentic orchestration. And then one thing I'm really excited about is the computer vision capability of the

39:09 system. What you just brought up is like the handwriting. I mean, it's cool that, you know, you can take an old handwritten lease and title and be like, hey, who's

39:18 the signer on this? And boom, it'll tell you the name, and it pulled that from a cursive signature. Um, you know, one example I like to give, this is even in our early days, which just kind

39:29 of blew my mind, was I asked a question on one of our clients' data sets, and in the answer it returned a picture. And the picture was of a server rack, and it had like three computers on the server rack.

39:40 And each one of those computers had a serial number on it. It was like IN2978, IN2979. And

39:48 I just typed in, I did a new query and I typed in IN2979. And it was able to go find that picture within all that data. And I was like, shit, think about the future

39:57 of being able to query information on your, you know, surface equipment or well equipment by bringing in picture data. You know, one of our clients, or potential clients, has a WhatsApp channel

40:11 where they take pictures of every ESP

40:16 that's run downhole, every piece of equipment, and it just sits in these WhatsApp

40:18 chats. And I was like, man, we bring that into Clyde and start indexing that, and now you can search. You know, it's multimodal. It's not just text, but it's across pictures and charts and

40:28 graphs and well schematics, things of this nature. That's where it gets really powerful. Like, I think, you know, what's cool is, the way that we're building these automated workflows is you can

40:38 get a quick ROI

40:40 associated with that in due time. But really, where these systems become powerful is when you connect everything. What random piece of insight are you gonna find, you know, through the

40:51 correlation of, you know, well files with production systems? With safety, you know. With safety, yeah. I've been thinking about safety a lot lately, and how do we actually

41:01 make a clean user experience that is actually valuable to people in the field, to make the operations safer. And so, yeah, these are all just very complex things, both from the underlying

41:16 technology and then how do you make a product that is actually loved by the people that are using it. And these things can't happen in silos. Like, you could have the best performing

41:24 technology of all time, but if your product sucks from a user experience perspective, it'll never get used. And so we have to put a lot of time and brainpower into both design, how we build

41:36 things and how they look and how they operate, and then the underlying technology as well. So I think what we're gonna see are these little bots based on different disciplines. So like, if you're

41:49 a finance bro or an ops guy or whoever, that just automate a lot of the 15 things you do a day that are copying and pasting, whatever that might be. Yeah, I think those are gonna happen

42:04 automatically in the background, and you're not even gonna be thinking that it's AI. It's literally just happening for you. So I think that's gonna happen. I think the second thing that's gonna

42:17 happen, and you've alluded to this, is a lot of data is gonna be anonymized and shared. I think we're gonna see, we're gonna build Well Failure Bot, and I know I'm the master marketer here, so

42:30 I will come up with a better name than Well Failure Bot. I mean, we'll get some snazzy name, you know. But yeah, I think folks will get to the point where they're all sharing that information.

42:47 They're participating in those bots, and we'll all get a lot better for that. But it's gonna be wild, all it can do in terms of just the mundane stuff we do every day. Yeah, the one worry I have.

43:03 And this is, I'm gonna kill a lot of brain cells on this, is we need the old crusty people, you know, the guy or gal that could put their hand on the machine and go, ah, here's what's wrong.

43:16 Yeah. You became that person because of all the grunt work you did. And the one thing I do worry about is automating the grunt work. How are we gonna develop the old crusties? Yeah, how

43:29 do you capture that intuition and tribal knowledge? I mean, look, at the end of the day, when I wrote this white paper for Clyde way back in the day, 2020, 2021, I talked about how the biggest

43:41 problem in the industry was knowledge retention and transfer. And that has several different facets to it. The first facet is that the gray hairs are leaving and taking a ton of tribal

43:51 knowledge with them. Every room that you're in, there's that old guy in the corner that just knows everything. He can put a wrench to a wellhead and tell you what's happening downhole by, you

44:01 know, just listening to the vibrations. And, um, so you have that element. The second facet is operational teams work in silos within companies. So like in an E&P, for example, drilling doesn't

44:14 talk to completions. Completions doesn't talk to production. Uh, you know, the person that drilled a well five years ago is no longer there. So you go to work over the well, you have no

44:22 contextual understanding of what they saw. You know, you can just go down the list of the bad communication that happens between departments. And then on the M&A process, I see this a lot at E&Ps.

44:32 You'll go acquire an asset and you just get data dumped, a large volume of well files, none of it standardized. You know, we have some clients that just have terabytes of data sitting on-prem on

44:43 their servers from acquisitions, but they have no idea what's even in there. And so you have all this knowledge that's lost in that

44:52 data, and then lost because maybe some of the team doesn't come over with that asset too, right? And so you see this across the board. And I think - Silicon Valley doesn't believe it when

45:03 we tell them that we use less than one percent of the data we have. No, because you think about it, dude. Look, get on my phone, get on your phone. Look at Facebook, look at Instagram, look at

45:12 Twitter, look at YouTube. They track every single little movement. They're watching your eyes, they're watching your mouse, they watch how long you hover over some link, and they're just

45:22 constantly processing and crunching data for what? To serve you the best ad that they can. Does this actually make the world a better place? No. Then you come into industries like oil and gas,

45:34 energy, and medical that just throw away data, and don't use any data, and could use all of this data to make the world a better place. And so we have big tech over here on one end of the spectrum that,

45:47 I mean, they do not let a piece of data get through the system without it being crunched, utilized, and leveraged. And then you have this other side of the spectrum, which are the industries

45:56 that actually run the world, that don't utilize any of the data that we have. And that is a problem that has to be solved just for humanity and society, is that these legacy industries have to get

46:09 better on these things. And kind of going back to the gray hairs leaving, I think about this problem all the time: how do you extract that tribal knowledge and the domain

46:25 expertise out of people's heads? This is what I really geek out on within a company: how can we go in there and create this deep brain, or this internal knowledge base? And talking to some of the

46:37 engineering firms that do reserve reports, it's as easy as, hey, when you're sitting there creating type curves in a reserve report, screen record and walk through - Shoot a podcast. Yeah, it

46:48 really is. I'll tell you a funny story in a second on that comment, but screen record, record your workflow. And hey, you know, Susie's recording, and Susie says, hey, this is why I use this

46:59 b-factor. And 12 months later, when that engineering firm goes to pick up that well again, they can open it up and be like, hey, why did Susie use that b-factor? And they don't have to

47:07 sit there and guess. They can just query the system, and it's like, oh, you know, Susie used this b-factor because she said she, you know, did this and this. Yeah. You know, I was talking to a

47:16 senior ops person at a big offshore

47:20 E&P. And he's like, you know, we have an incident out in the Gulf of Mexico. We have our entire incident reporting process that we go through. And we do that, it gets filed away. And he's like,

47:32 but you know what we don't have? We don't have just a conversation with a 30-year-experience drilling superintendent of like, hey, what are your thoughts? What did you think that we did wrong here?

47:42 What could we have done better? And he's sitting there talking to me, and he's like, honestly, dude, this sounds just like a podcast. Like, we should be recording a podcast on it. Voice to

47:49 text is so good. It's so good. I told the story earlier of us trying to transcribe our podcast with AI, because I've seen firsthand how much better voice to text is today than it was

47:59 back in 2018. So we now record all our meetings, and we send them out at the end of each week, and you can get in there and query it. And it's good enough where I can say, what did Collin mean

48:09 by this? You get a summary. He gives you a take. You know, here at Clyde, we record all of our meetings, and so at the end of the week I get a summary, hey, what's happening across engineering?

48:17 What's happening across product? What's happening across sales? And so, we don't just build AI-first products; the way that we're building it is AI-first as well. And really, what my focus is on:

48:30 I care a lot about talent density. So how can I get the best people in the world to work at our company and then augment them with AI? So I don't think that it's good to use headcount as a KPI

48:42 for the health of a company anymore. You should be looking at talent density, and how can you get the best people, comp them well, and then increase their bandwidth and their throughput with AI. And,

48:51 you know, in oil and gas and

48:55 E&Ps, I think this is, how do you take the 60 to 70 percent of their work that's very trivial - searching for information, copying and pasting information, generating a report - and automate that so

49:03 that they can actually focus on the high-impact decisions that move the needle and increase revenue or decrease costs. And so that's what the focus should be. Because you and I kind of, and

49:15 speaking of the urologist, I have to pee, so we'll wrap this up, but

49:21 you and I think about this a little differently in terms of how we quantify the market size. You think of kind of G&A and what percent of savings we can have or something. I actually think there are

49:36 kind of three buckets. You basically have CapEx spend and we can definitely make CapEx spend better. We can drill better wells, get pumps back online faster.

49:51 Then there's risk mitigation, you know, whether that's making safety better, whether that's just flat-out the amount of insurance we have to pay, actual cash outlays for it, marketing, hedging,

50:04 et cetera. And then there's also the enabling of the workforce, and whether that's saving money or allowing you to do more with less. Well, that's what, you know, so many people look at it as

50:15 a zero-sum game, of AI is going to take away, but almost all technology enables us to do more, you know. Some people don't know the story of the Luddites, but literally, blood was shed

50:28 because automation was coming to textiles and manufacturing, and it was like, oh, they're taking our jobs, the robots are taking our jobs. And they literally had a skirmish and revolution over this where

50:39 people died. And what you come to find out is that, you know, we went from hand-sewn garments to, well, we produce so many clothes in the world now that they end up in landfills. Automation

50:51 just allowed us to make more clothes. My favorite is always the third-world countries where somebody's wearing the Kansas City Super Bowl winner shirt. And you look at that applied to oil and

51:07 gas. For example, one of our clients always has 60 wells that sit offline in perpetuity. And they just never have the engineering hours to get over to them. And they have five to 10 well failures

51:18 every week. And so all their engineering hours go to those new well failures because they're higher-producing wells. What if you can free up engineers' time to where they can go focus on those 60 wells

51:27 that are offline? Now you get those wells back online, you're producing more, and revenue's going up. And so really looking at it through this lens, you know, and I'm sure it was the same in

51:37 finance back in the day with, like, Excel and - Dude, Lotus. Lotus 1-2-3. Yeah, like, I'm sure that you had the same exact conversation. You know how I tell our coders, dude, I

51:47 coded back in the day,

51:50 in Lotus 1-2-3. No, for real. You know, it was all keystrokes. Yeah, no, for real. So it was simple coding. No, for real, it was. And I guarantee, like, you were there, but I

51:57 guarantee, like, you go back and there was that same type of conversation. But in all reality, what those programs allowed us to do is just do more deals, and bigger deals, and more complex deals. And so

52:08 I think AI is the same exact way. It's just going to increase our volume and throughput, and we'll be able to do work that was previously uneconomic. So now that I have a job, I think what we're

52:17 going to do is rebrand this, Chuck Yates Got A Job, and we're going to talk about AI kind of every week, just different aspects of it. 'Cause I mean, it's crazy. Like you said, we need to talk

52:29 about data and how you get your data in shape. We need to talk about data security and all the issues there. And it's literally changing enough each week that I think I'm going to be, you know,

52:41 chock-full of content on this. Yeah, no, I mean, the industry's moving quickly. Um, you know, what I'm really proud of the oil and gas industry for is being a quick adopter. This is the

52:52 first time ever I've seen a digital technology be adopted like this. And I think it's just because everyone sees what can be done with ChatGPT, and it's enough of a

53:03 game changer. That's what I've always said, you know, in the oil and gas business, when it's a game changer, like drilling a three-mile lateral and a 97-stage frac, we do it, and we do it

53:14 technologically as sophisticated as anything. We do it fast. Yeah, because it's a game changer. If it's not a game changer, you know, then we don't do it. Not yet, anyway. Yeah. So. All right.

53:25 Cool. All right. Um, I'll have you back maybe in December, give me my performance review. Might be a little brutal, but I'll be able to put it on. All right.