Brought to you by Verissimo Ventures, The Very True Podcast features candid startup insights and conversations with early-stage founders, operators, and investors shaping the future of tech. From behind-the-scenes startup stories to hard-earned lessons on fundraising, scaling, and staying resilient, each episode offers a window into what it really takes to build something bold.
Alex (00:01.29)
Okay, I am here today with Gabby Steele, who is the founder and CEO of Preql, which is one of our fund one portfolio companies. And I am going to hand it to her to introduce herself.
Thank you, Alex. It's a pleasure to be sitting with you. I will share a little bit about what I do and what Preql does, or Preql AI, I should say, in 2025.
But my co-founder, Leah Weiss, and I founded the company after spending a lot of time in the world of data engineering, trying to support less technical users in their journey to understand data in a meaningful way and drive ROI with that data. And we've watched over the past three years, especially with the wave of AI and automation, how data has changed. What we set out to do was automate the process of structuring data for less technical users. Today we support CFOs, the office of the CFO, with a lot of their data cleaning challenges. They've been a great audience for us, and we do agentic data cleaning now, thanks to all of the models and technology that are available.
So I'm going to push on that a little bit and ask: what were you doing right before this? How did that lead to starting the company, and how has that developed? We invested three and a half years ago, and every company goes through its evolution. So yeah, I'd love to hear that over the last four years, basically.
Yeah, I can go back a little bit further as well. And it's always a pleasure to be sitting with someone who's been on the journey with us since at least when we started on the venture side. So before founding Preql, Leah and I led a data engineering consulting business. We ran, I'd say, a real company, a profitable business, which I look forward to Preql being someday. And before that, we led data engineering teams at WeWork. I won't talk about WeWork and profitability in some of those stories, but we watched a very early stage...
Alex (02:13.901)
...very well-funded organization grow from 50 employees, I think it was number 50, to 16,000, and how that affected them from a data landscape perspective. We left that business in 2019, founded Data Culture, which was the consulting arm of what we do, then hired a CEO to run that business and founded Preql to build the productized version of this data modeling and address that challenge.
So I think one of the most interesting trends that I'd love to double click on is that people talk about software as a service. My view has always been that software is a service. It's just a digital service. And because it's run by a computer, it can scale infinitely, especially with AWS and Azure and everything out there. And there's been a big push to make that as repeatable and scalable as possible. But there are some areas where that just doesn't make sense. And usually that's when two things are present. You could almost turn this into a two by two: on one axis is the willingness and ability to pay, and on the other axis is how complicated and customized it needs to be based on a person's or company's on-the-ground use cases. And you've lived in both worlds, watching that pendulum swing from, well, we're just here to solve your problems, let us know what you need, over to, here's what you need, which is what a good product company is able to say. Salesforce is a shining example: you need the CRM that's set up this way, and that is exactly how your sales team is going to run. I always say the most value accrues to companies that actually define a work methodology. Atlassian is another fantastic example of this. And then there's some middle ground, right? If you look at a company like ServiceNow, for example, which is king of modern enterprise software.
Big customers, customized implementations. I guess I'd just love to hear how you think about that two by two, where you fit, and how you found that balance in your own journey.
Yeah, this is a very relevant topic, not just for what we're working on today and not just my personal journey, but for the age we're in. And I've brought up AI twice now and Alex hasn't, and it's been a few minutes. So there's this idea of the Palantir model and forward deployed, and I realize that isn't a company you mentioned. But when we set out to build Preql, there was obviously this massive question, and it was coming from investors. There's a set of questions you get from investors, and then there's a totally separate set of questions you get from customers who actually want to pay you. And they don't overlap as much as one would hope, I would say.
But the investor question is: okay, you're going out to build a productionized version of something that you've sold millions of dollars' worth of services in, but that's different. How do you create something that truly isn't reliant on too much manual onboarding, data engineers, etc.? That was the first thing that we needed to focus on. Today, we're seeing a lot of companies only be successful by having that forward deployed model, especially when you're selling to enterprise.
Teaching customers what they don't know, and what they don't know how to do, is critical, and so is having really strong teams. So we created a very serious church-and-state situation with our consulting business. We completely disconnected when we started Preql, and we only wanted it to be this product business, solving a challenge that we had seen in consulting, but better, faster, cheaper, and more owned by that business user that previously was reliant on us. So even though they were paying us $50,000 a month or something in that kind of range...
Alex (06:47.562)
Now they were paying $50,000 a year, sometimes for the same kind of service, but they could own it better. That was the initial days of Preql. Today, we work almost exclusively with enterprise and legacy enterprise businesses that are spending much more than that, but they would like their hands to be held by an account manager or a forward deployed engineer. We've come full circle, and the questions I get from investors today are far less critical of having those actual engineers as part of it. If anything, we're being encouraged to move into that model. So I will say we've made many mistakes in the past three and a half years, but one thing we've stayed close to, from running a business where we were selling those services, is that we know very well how to handhold our customers, and we can move very quickly because we understand how to sell that forward deployed piece of it. The pendulum swung. So I'm curious what you think about how acceptable it is, because it was not acceptable in 2022, and it is far more accepted in 2025, if anything encouraged.
Yeah, this is actually taking me back to an analysis I did in 2014, I want to say, on behalf of one of our portfolio companies when I was at NEA, which was about this services component. Generally, if you look at S-1s, 10-Ks, and 10-Qs, software companies break this out and they have a chunk of services. Sometimes they'll also break out the gross margin of these things; sometimes they don't. And there was this oversimplification, and obviously the tech world and VCs are never guilty of oversimplification and generalization, but it was: services are bad, just don't do them. And then there was this epidemic in enterprise software for years that was: so we won't have services. Well, not that we won't do services, we just won't charge for them, and then we'll pretend it's sales and marketing. And it's like, okay, that doesn't make any sense, because you need services. Or: services are low margin. Well, it all depends how much your people cost and how you price it. That's very subjective. I remember I was advising a cybersecurity company years ago that had a six figure ACV, and the issue was, we need to get people up to speed on using this tool and so on. And I said, okay, so just bill it out at $800 an hour. That's fine, it can work. So it always comes back to the math, and services are much more recurring than people think they are. If they have good margins, great. But there's this other piece which is way more important, where you think about the broader picture of unit economics and contribution margin. What's a variable cost? What's a fixed cost? What's a one-time cost? What's a recurring cost? And then the same thing on the revenue side. What's a...
Alex (09:15.178)
What's recurring revenue, what's non-recurring revenue? You can map that out. And again, it's a spectrum, not an absolute of which one's which. But the most important part is: what creates stickiness? Quality of revenue has two ingredients, which I've mentioned on my What is ARR podcast, and they're implied by that term that's so strongly associated with enterprise software: the gross margin aspect and the durability of the revenue.
So there was another moment, which I would say ended in 2022, but from let's say 2017 to 2022, when this whole high velocity software thing took over and you had these basically consumer software businesses that were selling stuff for 80 bucks a month. And they were able to draw these growth curves up and to the right. But it was, wow, it's growing 20% week over week. Yeah, from $80, so who cares? And by the way, what's the durability of that revenue? Everyone's paying monthly. And by the way, it's actually not that cheap to deliver. Sometimes it was, sometimes it wasn't. But the churn rates on these things were terrible. So the truth is, it's easy come, easy go; hard way in, hard way out. Now, if you can go easy in, hard out, which a handful of these companies really did do, that's the epitome of the PLG amazingness, where you become the system of record and you become the work methodology. I mentioned Atlassian earlier; that's magical, but most companies can't do that. And you kind of have to brute force the services, brute force the implementation, make them use it properly and get the value out of it, and then you're good. So this pendulum keeps swinging, where people recognize: wait, if we actually want this to work, can you do this for us? You said it so well: I'm trying to sell them a product that's going to save them $450,000 a year, let's say, in total cost, if they could just get one person to do this one thing that already works for them. But they can't. That's how companies work, right? They're not infinitely logical and economic. And so how much investors appreciate this has to be secondary versus what really works for the business and what is sustainable. Now, if you're building complicated stuff and you're selling complicated stuff, with long sales cycles and long implementations, you need a bunch of money to support that. But I said this earlier today and I've stuck with it for a while now: sustainable revenue growth sets you free. And that word sustainable, again, it's not profitable, it's not cash flow positive, it's sustainable. Part of that might be, can you generate cash in other ways? Right?
Alex (11:37.564)
From investors or wherever it may be. Can you get paid favorably by your customers? Whatever it may be that you need to do to be sustainable. But if you can sustain, that's how you accrue a tremendous amount of value.
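To make the unit economics framing concrete, here is a minimal sketch in Python with made-up numbers; none of these figures come from the conversation, and the category split is purely illustrative:

```python
# Illustrative only: hypothetical numbers, not figures cited in the episode.
# Quality of revenue = gross margin x durability; contribution margin separates
# what scales with each customer from what doesn't.

arr_per_customer = 50_000    # recurring revenue ($/yr)
services_revenue = 20_000    # one-time implementation, billed
cogs_recurring = 10_000      # hosting/support that recurs with the customer
services_cost = 25_000       # forward deployed engineer time (variable)

gross_margin_recurring = (arr_per_customer - cogs_recurring) / arr_per_customer
services_margin = (services_revenue - services_cost) / services_revenue

print(f"recurring gross margin: {gross_margin_recurring:.0%}")  # 80%
print(f"services margin:        {services_margin:.0%}")         # -25%

# The point made in the conversation: a negative services margin can still be
# fine if it buys durable recurring revenue; what matters is the blended,
# per-customer contribution over the life of the account.
```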
Yeah, I mean, I'll return to what it looks like today. I think that's absolutely accurate. I think the lesson that we've seen play out many times, and I was having this conversation with Leah earlier today because we were looking at a lot of our friends, data friends specifically: we came up in an era of the modern data stack, the Snowflake era. We're very familiar with every product that hit the market then, and they did very well, charging by row, by monthly active rows of data.
And then you've also got all of these BI tools; there's a mix of what's out there. Most of them are not sold that way. Some are sold at the PLG level you're describing, where it's $12.99 a month. And it's very hard for organizations that are selling a business intelligence dashboard of that nature to ever get to a place where their revenue is going to match their valuation. In recent weeks, and I'm not entirely sure when this will go out, there was a huge merger in the data world between Fivetran and dbt, right? They were these two big organizations, and they were calling it a merger, but Fivetran was sort of the bigger player revenue-wise because dbt is open source, and a lot of different things happened, and they're now on a path to IPO. They're both founder friends and angel investors in Preql. But what we saw with this is: how do you reach the IPO level with companies that maybe have that low ACV, or are bringing on users in that way? Today, that's just not something that works. So I think it's fair to say that's a mistake Leah and I avoided. We never sold a product that way, with some help from investors, but also because we knew, since we had been in consulting and had seen that beautiful hockey stick go up for ourselves. Our cash went up every single month. Then we started this business, and Leah is still mad at me about that and how that's not the way it works. But the...
Alex (13:42.939)
The idea of selling anything at that low cost, knowing, yes, sometimes I pay for Calendly every month, and I have a huge SaaS budget of my own. Today, there's a push for founders and companies to be converting that SaaS budget to AI homegrown tools and things, and we could even move into the Lovables of the world or AI revenue. But I think we actually dodged a huge bullet by simply never, ever having a product that was, I don't even know, I think it's fair to say a product that was beautiful enough or good enough, because you really need a product that's great for some person at some company, especially lower tier employees who don't have big budgets, to say, I'm going to swipe a card, or, I'm at home and I'm going to buy Figma. It's a different kind of situation. Or Canva, which is an amazing product. There are a lot of products that have sort of figured this out, but I still think they get the huge chunk of revenue from enterprise.
And then the other thing about sustainable revenue that you were mentioning, I mean, I like those terms, I haven't heard it put that way so much. With sustainable revenue, there's also a human component to it. Even in the AI age, I think if you're selling a $900K platform deal and you want to bring on a customer of that level, you definitely want someone on the other end of the phone should they need anything, maybe even a text message. You still need that hand-holding, and it's not necessarily going to rip into your profitability to have that person. So figuring that out, I think, is really meaningful. That being said, yes, you can't have a product that's run on a manual process. And with Preql, that's pushed us every step of the way to ask: what can we automate in the product so that we can scale? And we hit a moment over the summer, well, there were many moments that happened, but we finally felt...
Alex (15:59.81)
The opportunities within generating text-to-SQL, and some of the data things that we needed to do in order to actually automate our product, had become available. And I think we were lucky that we also hit a moment where companies had bigger AI budgets, and we kept with that enterprise motion. So I'm riding that. But how long will it be this kind of setup and structure? I don't know. Things change really fast. We spoke earlier about being too early; it's not acceptable to be too early, and I think we felt some of that too. So yeah, I'll let you take us to the next question.
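Since reliable text-to-SQL is the hinge of that automation story, here is a generic sketch of the idea of wrapping a non-deterministic generator in deterministic guards. This is not Preql's implementation; the table names and checks are assumptions for illustration:

```python
# A generic sketch of guarding LLM-generated SQL before it touches finance
# data. The generator itself is assumed to live elsewhere; these checks are
# the deterministic wrapper around its non-deterministic output.
import sqlite3

ALLOWED_TABLES = {"transactions", "refunds"}  # whatever the model may query

def is_safe(sql: str) -> bool:
    """Deterministic checks around the non-deterministic generator."""
    lowered = sql.lower()
    if not lowered.lstrip().startswith("select"):
        return False                           # read-only queries only
    return any(t in lowered for t in ALLOWED_TABLES)

def run_trusted(conn: sqlite3.Connection, sql: str):
    if not is_safe(sql):
        raise ValueError(f"refusing to run generated SQL: {sql!r}")
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (amount REAL)")
conn.execute("INSERT INTO transactions VALUES (100.0), (250.0)")
print(run_trusted(conn, "SELECT SUM(amount) FROM transactions"))  # [(350.0,)]
```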
So there's just one other thing which I feel I have to mention. We mentioned a few companies, but if you go back 10, 12 years ago, there were these magical companies that were doing basically direct to consumer marketing of software tools, but those consumers were developers. DevOps was this magical space where the adoption patterns are like high schoolers' in terms of what's popular. These tools spread like Snapchat spread in high schools. Developers can try them out, they can use them sometimes for free, sometimes it's open source. And then you get into six and seven and eight figure contracts at some of these enterprises once there's enough usage, which is magical. It's amazing. It's perfect. But not everyone can do that.
So I'll point to that as one thing, which again is hard to do, but it's kind of that holy grail. The other thing is that you mentioned Canva and Figma and some of these companies. Those are perfect products. Yes, they're perfect products with perfect timing. And I'm a really early stage investor; it's hard to bet on that. There were 50 companies doing the same thing as Canva. I don't know, maybe not at the same time. You mentioned being too early.
Some of them were probably a little bit too late; Canva already kind of had the brand recognition. There were so many companies that kind of let you do this, and Canva is just that much more beautiful, or Figma is that much easier to use, or whatever. But okay, you could have said that about a lot of products really early, and maybe it was true, maybe it wasn't. There are so many things that have to come together perfectly for that really to work that it's not really a strategy. If it works out, wonderful, but it's not really a strategy.
So the other thing that I'll push on here, and it'll lead us to our next subject: we talked about BI a little bit, and what I think most people who've interacted with BI recognize is that it's garbage in, garbage out, as any kind of data modeling is. And what matters much more, when you're talking about data being the new oil and whatever, is that if the data is not structured properly so that both humans and computers...
Alex (18:24.293)
Can read it and understand it and do something with it, then it's useless. And so that layer before the charts matters. Humans are really good at reading charts. This is what I learned in investment banking: make pretty charts, put them in S-1s, whatever, get creative, make it clear, consistency but nuance, great, fine. But if the data is not clear, then none of that matters. So yes, you need BI tools that help you turn the data into a canvas and so on. Wonderful. But if the definitions aren't correct in the data structures, and things aren't flowing properly, and you have duplication, whatever it may be, none of that matters. And so that's really that human-computer interface: can the human be smart enough to figure out how the computer thinks, and can the computer then do things that are adaptable enough to be useful for the human, so the human can be part of that value-added loop? And I guess what I'm interested in is your take on this: obviously we have all these AI tools. I've found that they're really good at open ended, language related things. I personally use AI tools for shopping comparisons; it's just much faster than Googling around and finding reviews, and it's good enough and it's fine. When you ask them to do calculations, though, and I'm mostly in chat for my AI usage, I treat them like a really smart 16 year old intern. They're really smart, they have access to the world's information, but they don't actually know anything. And so if you don't teach them the skills and/or give them very explicit instructions, then they're going to mess it up. And it might look right, because you told them, I want it in this format, but you didn't tell them how to get it in that format. So I'm interested in your take on how hard it is, and what you have learned about how to do it: how do you actually start leveraging the AI to fill that gap when it comes to structuring things, not just, here's what I wanted to look at on the output side?
Yeah, this is a very timely question. And I think what you're touching on here is kind of an opportunity, and this MIT report that I love to reference opened up the opportunity for this conversation, I'm going to say. But AI has not been great at data in recent years. AI has not cracked...
Alex (20:46.691)
The data problem. Neither have humans, because humans also are not great at data. The challenges that you're describing, duplicates in any kind of data set, or different definitions across tools like Salesforce and HubSpot, even when we're talking about transactions, where there are hundreds of definitions of transactions, mean it really doesn't work for anyone. It limits a lot of people, and we waste a lot of money on it. What Preql does, and why it's been fun to be on this journey, is that the evolution of where technology has gotten has not put us in a place where we've had to abandon this idea at all. Preql sits at that intersection of trying to take something that is non-deterministic, the ChatGPT that's like a 16 year old intern, and finally solving the challenge with some support, because it wasn't so easy to generate any kind of text-to-SQL, or any kind of code from AI, that was reliable. And bear in mind, I work with finance teams now. I think this is one of the things Leah and I have done throughout our careers: we run towards fires and hard problems. And we've heard this a lot: people don't necessarily churn from products because there's a bug. You talk about Figma and Canva, and I agree completely those are perfect products, and let's say they don't have these challenges so much. But if there's a small bug, customers will stay with us. If they see an incorrect number, they're gone.
All our product does is promise correct numbers, promise an essential source of truth. Okay, you've got to weigh in. But I'll just say, I have to fit this in there: if you're a lawyer or an accountant, the best you can do is correct. Everything else is downside. 100% is acceptable; anything below that is unacceptable. And again, as you said, these are closed-end problems. Humans are not that efficient at them, but we've developed our own working methodologies to make it work. So sorry, I'll hand it back to you.
No, I think that's totally right. I've been spending time with some auditors, and I mean, I spend a lot of time with CFOs now. How we ran at this challenge was exactly that: the best you can do is correct, and if you do worse than correct, you're out. Which is also why I don't want to be a lawyer. I don't want any of those jobs. I think it's much more fun to do this.
Lawyers get away with being incorrect all the time.
They do. Maybe lawyers have a bit more flexibility, but for CFOs, there's been a lot of reference to this. I was at an event last week talking about the Lyft earnings call of 2024 and these mistakes that literally get remembered. It's a stain on your record that a data error was made, even one that was unavoidable; I mean, it's some error somewhere. And any spreadsheet large enough, any Excel sheet with...
Alex (22:58.21)
With enough tabs or columns, is going to have errors. It's not possible otherwise, unless robots are creating those sheets and AI is generating the data, and not just generating the data but actually cleaning it, handling the garbage in, garbage out. What we are trying to do is create something that is reliable enough, and it hasn't been built yet. And the other piece of this that we notice from working with all these folks, because there's so much riding on these decisions if they're wrong: if an FP&A person comes and gives a number, there are going to be a million questions afterwards, even if it's a correct number. Let's just say it's a correct number. So how do you give some sense of where that number came from? And that's where going back to data infrastructure and modeling, and an idea called the semantic layer, has helped us a lot, because the product we're developing today references a lot of these maps. You can ask questions of the AI like: where did you get that number, and who decided on it? And who am I even to be asking the question, with governance and all of those things?
But the bottom line of all of this is that this is not a challenge that has been solved. And I think it took a long time for people to understand, as you're describing, that the chatbot you talk to can answer basic questions about what type of lawnmower to buy, but you don't want it to tell you what your revenue is. And that question is a shockingly hard question for most companies to answer. That's what we're really focused on today. It's exciting that it's still so relevant, but it's a really hard problem too.
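For readers unfamiliar with the semantic layer idea Gabby mentions, here is a minimal sketch of what a metric definition with lineage and governance metadata might look like. The schema and field names are hypothetical, not Preql's:

```python
# A generic sketch of a semantic-layer entry, not Preql's actual schema.
# The idea: a metric is defined once, with its source, owner, and logic,
# so an AI (or a human) asked "where did that number come from?" can answer.

REVENUE_METRIC = {
    "name": "net_revenue",
    "definition": "SUM(amount) - SUM(refunds)",     # the one agreed-upon formula
    "source_tables": ["billing.transactions", "billing.refunds"],
    "owner": "office_of_the_cfo",                   # who decided on it
    "allowed_roles": ["finance", "executive"],      # governance: who may ask
    "last_reviewed": "2025-06-30",
}

def explain(metric: dict) -> str:
    """Answer the 'where did you get that number?' question from metadata."""
    return (f"{metric['name']} = {metric['definition']} "
            f"from {', '.join(metric['source_tables'])}, "
            f"owned by {metric['owner']}, reviewed {metric['last_reviewed']}")

print(explain(REVENUE_METRIC))
```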
I'm curious. Yeah. I'm just having a, I don't know what to call it, a flashback moment, a PTSD moment maybe. When I was in banking, I was working on the Facebook IPO, and I was tasked with identifying every non-GAAP metric that 10 different internet companies had disclosed in their 10-Ks, 10-Qs, S-1s...
Alex (25:22.524)
Earnings presentations and earnings calls over the last year. Now, this is actually maybe a pretty good AI question to ask, but I did it manually. I basically sat in my chair for 40 hours straight, without really getting up for more than a couple of minutes at a time, and printed them out. Every single filing; I had a whole filing cabinet full of these filings. And I read through every single one, identified every metric, highlighted it when I found it, and then entered it in my spreadsheet. This was a very manual, brutal task back in late 2011, early 2012. And then I was finished with it, and wow, I had this big 11 by 17 slide with all the information on it. And then my associate came running back to me saying, this is wrong. I said, wait a second. He said, you have to redo the whole thing. This is wrong. This number's wrong. They didn't disclose this.
I said, what are you talking about? He said, well, I went to the senior guy and he said they didn't disclose this. Right. And I calmly looked at my filing cabinet, pulled out the printout, found the page with the highlight, and said, nope, there it is. Okay, you don't have to redo the whole thing. Right. Now, that's one of these banking lore stories, fine, and certain parts of that can be accelerated meaningfully. But again, how do you get the AI to add that self-auditing layer? I mean, when I think about that, it doesn't feel that hard. That feels like a solvable software architecture and semantic problem of: I know what you're capable of because I designed you, so now let me give you the instructions. And I say 'I designed you', but these LLMs and neural networks, we don't know how they work, so it's a little bit harder. But can't we figure out how to give good enough instructions to tell them how to solve closed-end problems? I haven't really tried this that much, but there are stories where you ask Claude, what's two plus two, and it'll tell you five. But if you say, what's two plus two, please use the calculator function, then it'll get it right. It's the same thing if you ask the kid. If they're trying to do the math in their head, you say, no, just use a calculator. Okay, it's four, right? Now I can take the square root of 799, right? I can do that now, because I have a calculator.
Alex (27:50.559)
Whereas if you just ask me to do it, I'm going to come up with some weird methodology that probably won't work. So I guess, again, how hard is that? To me, it feels like that's what everyone should be focused on right now: how do we talk to these things in a way that they can understand us? I've been big on training and teaching junior people for a long time. Isn't that how we should be approaching this problem now?
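A minimal sketch of the calculator pattern Alex describes: route arithmetic to a deterministic tool instead of letting the model guess. The `calculator` and `answer` helpers here are hypothetical stand-ins; real tool-calling APIs formalize this handshake:

```python
# Sketch of tool routing: arithmetic goes to a deterministic function,
# so the "smart intern" never does mental math.
import math

def calculator(expression: str) -> float:
    """Deterministic tool: evaluate a restricted arithmetic expression."""
    allowed = {"sqrt": math.sqrt}
    # Demo only; a production tool would use a real expression parser.
    return eval(expression, {"__builtins__": {}}, allowed)

def answer(question: str, expression: str | None = None) -> str:
    # Hypothetical routing: if the prompt maps to arithmetic, call the tool
    # and hand back the exact result rather than a model's guess.
    if expression is not None:
        return f"{question} -> {calculator(expression)}"
    return "model free-text answer (unverified)"

print(answer("what's two plus two?", "2 + 2"))      # -> 4
print(answer("square root of 799?", "sqrt(799)"))   # -> 28.266...
```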
It's a good point. And I think, I mean, it's an appropriate time to talk about how AI is the worst it will ever be, and it's only going to a place that's better. I was on a panel recently where some of these fellow data friends and founders were talking about how they're counting on AI. We need this to succeed. We need to meet the moment, because there's just so much riding on this, and it'll be a way worse world if this whole thing falls apart, even though there are dangers there, obviously. And yes, I do believe in training as you're describing it; it's sort of, how do we meet the AI where the AI is? And I think it's twofold: it's turning that, again, non-deterministic bot or LLM into something that is going to be deterministic. Where our product sits is very much on that side. Yeah, for a simple calculation, you can change the way that you speak to it and it'll tell you, okay, I'll operate like a calculator. But we're talking about hundreds of thousands, billions of rows of data, coming from not just unstructured places, because the unstructured PDF parsing tools have gotten pretty far in the past three years, I'll say; that's not a completely unsolved problem. But taking Salesforce data and creating a metric that matches their CRM and their HRIS system and everything else within a business, and then let's talk about it at the level of a massive organization like First Corporation that has 500 companies within it: we can't just expect to solve that problem, even if AI is in a place where it can begin to create those rules. And that also speaks to the more technical, agentic development. What we're doing is data cleaning agents. It's not one agent; each agent can do one thing, you pass the information along, and it's getting better and better. But I think there needs to be partnership. To reference the modern data world again: Snowflake had a very successful IPO.
Alex (30:09.375)
They did well. How did that whole stack do so well? Partnership was a critical piece. It was a huge piece for me personally: Data Culture, our consultancy, sold more Fivetran than any other consultancy in 2021. We didn't know at the time; it was probably just a really good year for a lot of people. But anyway, today, AI needs to figure out ways to partner. And I think the data garbage in, garbage out problem is not something everyone wants to solve for. There are plenty of AI technologies that are going to be focused on automating a challenge that one individual is really good at doing, even the story that you tell of those 40 analyst hours. So we can start acting as agents in the product world of AI as well, passing things from one to another. We want to provide the pipelines to get you that good, not-garbage data, data you can rely on, that's auditable. Then the next AI tool, whether that's your FP&A product or whatever, can operate that much better. Expecting Claude to do everything, where we just talk to Claude in a better way, will bring some improvement, but that's kind of my take for 2026 at least.
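A sketch of the single-purpose agent pipeline described above, with each "agent" as a plain function that does one cleaning task and passes its output along. In practice each step might wrap an LLM call plus deterministic validation; the names and rules here are illustrative:

```python
# Sketch: a chain of single-purpose data cleaning "agents".
# Each does exactly one thing; the pipeline passes results forward.

def dedupe(rows: list[dict]) -> list[dict]:
    """Agent 1: drop exact duplicate rows."""
    seen, out = set(), []
    for r in rows:
        key = tuple(sorted(r.items()))
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

def normalize_amounts(rows: list[dict]) -> list[dict]:
    """Agent 2: coerce amounts like '$1,200' to numbers."""
    for r in rows:
        r["amount"] = float(str(r["amount"]).replace("$", "").replace(",", ""))
    return rows

def audit(rows: list[dict]) -> list[dict]:
    """Agent 3: record lineage so every number stays traceable."""
    for r in rows:
        r["_cleaned_by"] = ["dedupe", "normalize_amounts"]
    return rows

pipeline = [dedupe, normalize_amounts, audit]
data = [{"amount": "$1,200"}, {"amount": "$1,200"}, {"amount": "300"}]
for agent in pipeline:
    data = agent(data)
print(data)  # one duplicate dropped, amounts numeric, lineage attached
```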
Well, yeah, that's really interesting, because that was my take in 2018. People talk about garbage in, garbage out. You can build a wonderful model, and by the way, a lot of models are not wonderful, but you can build a wonderful model, and if the inputs are garbage, the outputs are garbage. I've said this before: some people expect to put nothing in and get something beautiful out, and I'm like, who do you think you are? Companies would ask, what's our pipeline? I'd say, well, let me see your Salesforce. No one's touched Salesforce in two weeks. How am I supposed to tell you what your pipeline is? Again, it comes back to, sadly, if there are a couple of things we can rely on, it's that people are lazy. People will not do the right thing. If you show it to them and you shove it in their face, they'll still say, nah. That's what I...
If you walk around, we're in New York right now, you walk around the subways and you see how certain signage is done. It's actually genius, because people will miss stuff. It's crazy. I saw a sign just today in the subway that said 'boarding' with just an arrow, and it felt like an extra sign. And I thought, that's not an extra sign. And it might be, but I don't know. You have to make this so easy for people that they actually enter their data, and you give them something and...
Alex (32:28.241)
I've looked at companies in the past that are helping people use a text messaging interface to update their CRM: hey, I just had this call with this person, this is how it went, whatever. But a lot of these problems are just that people won't actually do it. I'll give you another little anecdote here. I had a company that I was advising years ago, and I asked, what's going on with this outsourced accounting firm we're using? I need decent financials that have things broken up into sales and marketing, G&A, and R&D expenses, and each one of those needs to have the headcount associated with it, and other expenses and so on. And the answer was, we just don't have it. I asked the founder, what's going on? He said, well, they sent me this spreadsheet of questions to clarify these things. Okay, let me look at the spreadsheet. The spreadsheet was 400 lines long with questions they had about characterizing expenses. And the founder said, I'm just not going to do this. Now,
In theory, those kinds of judgment calls are things AI is pretty good at. Actually, I was just listening to Kareem from Ramp on a podcast, and obviously they're just amazing and have done great things in this space of how you cleverly use AI to meet people where they're at. And again, some of this can be just verification: are these two things the same? I mean, really my first investment when I was at NEA, back in late 2013, was a company called Tamr, which was doing this exact thing of data mastering.
So it goes through and asks, are all these parts the same? Well, the computer sees that they're kind of the same. Okay, so then it just generates an email, sends it to an engineer who should know, and asks them, are these the same?
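A sketch of that data mastering loop: score candidate matches, auto-merge the confident ones, and route the ambiguous ones to a human. The scoring function and thresholds are illustrative stand-ins, not Tamr's method:

```python
# Sketch: triage candidate entity matches into auto-merge, human review,
# and confident non-match buckets.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def triage(pairs: list[tuple[str, str]], auto=0.9, review=0.5):
    merged, ask_human, distinct = [], [], []
    for a, b in pairs:
        score = similarity(a, b)
        if score >= auto:
            merged.append((a, b))                      # confident: merge
        elif score >= review:
            ask_human.append((a, b, round(score, 2)))  # ambiguous: email the engineer who should know
        else:
            distinct.append((a, b))                    # confident non-match
    return merged, ask_human, distinct

pairs = [("Bolt M6 steel", "bolt m6 Steel"), ("Bolt M6 steel", "Hex bolt, M6")]
merged, ask_human, distinct = triage(pairs)
print(merged)     # near-identical pair merged automatically
print(ask_human)  # ambiguous pair queued for human review
```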