[00:00:00] Once again, we have not made any amusing errors in this episode.
Although we wouldn't be so crass as to suggest that this indicates just how swiftly human redundancy is approaching.
Because then you'd probably switch us off.
Hello and welcome to the Cloud Realities highlights episode.
We'll be looking back over the past season, picking our favorite conversations
and our most thought-provoking guests. Let's start with Bart Groothuis, a member of the European Parliament.
Thanks for having me, Dave and Rob. It's really an honor to be here.
I used to be a cybersecurity practitioner. I love the internet. I love computers. I love IT. And I love technology. And you know what? Technology is becoming more and more important for [00:01:00] geopolitics than before. It used to be geography. Why should we care what is happening in China, far away from home?
Now technology is entering our homes, entering our vital infrastructure. I want to legislate for that. I want to make it prosperous and secure and safe for the future. That's my mission. And my best mission is to make the invisible threat visible, the invisible threats in the digital domain, and to make good legislation so we stay prosperous and safe in the future.
Bart gave us real insights into how tech legislation makes its way through the European Parliament.
And on exactly how ransomware providers price their ransoms.
Well, it was an entre-deux, in good French, between me, a cybersecurity practitioner by practice, from the Ministry of Defense in the Netherlands, and now in the European Parliament.
It's good to bring something [00:02:00] along from your previous professions. And on the other side, from the Commission, it was Commissioner Breton, who had been CEO of Atos, one of the big software firms you might know. And the European Council was also very interested in building resilience. But together, Breton and me, we said this legislation needs to have teeth.
Like I said, if I was a CEO and I didn't have any personal liability, for example, I'm not sure I would invest in cybersecurity; I would see it as a cost. So we said it needs to have some form of teeth. So I called my friends from the Dutch Team High Tech Crime police, some of those practitioners who negotiate with Russian cybercriminals.
And I said, we have about two years to negotiate this file, and I would like you to ask those Russian criminals, when you negotiate with them, how do you set a price? You hack into a hospital, you hack into a company. How do you set it? And we built an Excel sheet, and it was between 1.4 and 2.0 percent of their [00:03:00] yearly revenue.
Wow. So we said, if it has to have teeth, I said to Breton, give me that. For important entities, it's 1.4% of yearly revenue as a fine, and the fine for critical infrastructure, for essential entities, is 2.0%. And why? You have the choice. It influences the boardroom, the C-level suite, because now it's a calculation.
Do we give 2 percent of our yearly revenue to cybercriminals in Russia? Do we pay it as a fine from the European Commission because we don't do enough? Or shall we just invest a minor part of it in cybersecurity to keep things running? That's the calculation.
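To make Bart's boardroom calculation concrete, here is a minimal Python sketch using the 1.4% and 2.0% figures he quotes; the revenue number is a hypothetical example, not something from the episode, and the real legal text has further thresholds.

```python
# Illustrative sketch of the calculation Bart describes: ransom demands of
# roughly 1.4-2.0% of yearly revenue versus fines of 1.4% (important entities)
# or 2.0% (critical infrastructure). The revenue figure below is hypothetical.

def ransom_range(yearly_revenue: float) -> tuple[float, float]:
    """Typical ransom band the negotiators reported: 1.4% to 2.0% of revenue."""
    return yearly_revenue * 0.014, yearly_revenue * 0.020

def max_fine(yearly_revenue: float, critical_infrastructure: bool) -> float:
    """Fine ceiling as described in the episode: 1.4% or 2.0% of yearly revenue."""
    return yearly_revenue * (0.020 if critical_infrastructure else 0.014)

revenue = 500_000_000  # hypothetical EUR 500m yearly revenue
low, high = ransom_range(revenue)
print(f"Likely ransom demand: EUR {low:,.0f} to EUR {high:,.0f}")
print(f"Maximum fine (critical infrastructure): EUR {max_fine(revenue, True):,.0f}")
print(f"Maximum fine (important entity): EUR {max_fine(revenue, False):,.0f}")
```

Either way, the cost of doing nothing lands in the same order of magnitude as the cost of investing properly, which is exactly the boardroom pressure Bart describes.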
I love the fact that hackers actually have a calculation engine for working out how much to ask for.
We're thinking about their business model. They've got a business case and a model behind it. Do you reckon they have a meeting where somebody pitches: we should hack this one, and here's the return we're likely to get?
Obviously, they're hacking into [00:04:00] and scanning infrastructure, seeing where there are vulnerabilities in certain software and hardware, and wherever they get in, they look at what the revenue is.
And it's important that we're looking at it more and more like that. We used to look at the business model of critical infrastructure. What is it that you really have? It's water, telecommunications and energy, right? But now we say it's about delivering essential services, essential entities. And why does the European Union say that? Because of the business model of the hackers.
They are interested, they gain money, whenever they gain access to a hospital. That's the logic of their business, their business model, and that's what we're using.
I'm just slightly disappointed, given some of the sophistication in the cyber world, and you know there's a lot that goes on there.
They're using Excel, and it's like...
A pivot table. Business-developed apps again, aren't they, and the pain of having to manage them. Yeah. [00:05:00]
It was great to hear from Jack Hansen about updating outdated IT systems at the DWP.
And about how most of the problems they had were caused by Dave.
I think you've probably pinpointed my biggest couple, but I would say they kind of fall into two camps for me. One is around the technology itself.
So we have products coming to end of life and end of support technically, which is quite difficult to handle. Obviously, the older things are, even if they're in support, the harder they are to patch and to maintain. And keeping security is paramount, because we're working with customer data.
That is absolutely first and foremost in all of our minds. So those kinds of technical issues are part of the risk that we carry in the legacy estate that I have to manage, as well as our maturing workforce, I should say. The other side of it is the more business-[00:06:00]focused side of it. And that, I think, is actually where the cost is to the organization.
So rather than risk, there is a cost. In my head it looks like an inverted pyramid. At the bottom are the kind of tech monoliths that we've built that are not adaptable. They're closely coupled within themselves, to the business logic, to other systems; they just sit there at the bottom of my little inverted pyramid.
They do what they do, and over the years, as we have kind of accreted more change and built more things into those systems, or new policies have come along because of, you know, government initiatives. Yeah. Because we can't change the systems themselves too much, we've built another layer around them to kind of fill the gaps.
Right. And
then beyond that, going up, the pyramid starts to grow out into additional business processes we have to put in place. And then above that, the [00:07:00] additional staff it takes to run those processes.
Yes.
And handle the kind of base IT, the core IT and the extras.
It's become the tail wagging the dog.
Yeah.
Yeah.
I mean, what a problem, beautifully articulated there. So let's zoom back 15 years, or go back further if you want, because I think I was there. I mean, a lot of these problems were in the environment even back then. Are we about to find out
that you created all the problems we've just been talking about?
I'm positive it was your fault. I think we found the culprit. We found the culprit. All this pain you're now suffering with legacy transformation is Dave's fault. That's the conclusion we're going to come to, isn't it?
I'm going to tell my team that; they're going to feel so much better.
We've found patient zero for the problem, yeah, yeah.
We're just now going to say, just blame Dave. Yeah,
new motto for the department, blame Dave, I like it. It'll be all across the notice boards, like the health and safety thing, and then just blame Dave. [00:08:00]
Let's remember that one, Rob. We can use that.
Yeah. It's much like on the show, to be honest.
We had author and futurist Bernard Marr on the show
to tell us all about the top trends he'd noticed over the last decade.
Yeah. But all of these trends always go through these hype cycles: they get hyped up immensely, people then get disillusioned a little bit, and then they take off. And for me, the first one that definitely fulfilled its potential is around data, and data becoming a true business asset, especially the use of big data.
That was a big topic, and I've written a few books on this, on data strategy and on what big data means for organizations, and what I've learned is that data now is the foundation of so many other trends we're [00:09:00] talking about today. The next trend, then, the big one, was the metaverse, or the more immersive world.
And again, that was then crowded out by the explosion of generative artificial intelligence. Right, right. It's been pushed off the pitch, exactly, but this is still going on in the background. And for me, generative AI actually has a potentially huge influence on things like the metaverse. I don't particularly like the word metaverse; I prefer the term extended reality, really making our digital world more immersive and more real and 3D, and this has huge implications in so many different aspects.
And what generative AI can actually do is help us create those models. So if a company wanted a digital destination in the past, they needed video game designers to design their shop front in the metaverse for them. Now we can just write a text prompt or [00:10:00] talk to a generative AI tool and say, hey, build me something cool in the metaverse, and it will do it for us.
So those are probably some of the key trends. Then, underlying all of this, we have another trend, which is a more decentralized world. This is interesting for me: blockchain technology, which is also bubbling under the surface at the moment, has huge potential, and it will be very interesting to see where this is all going.
But for me, it's the explosion of data, the more immersive digital world, more decentralized technology, and then artificial intelligence right in the center of all of this.
So you think then, Bernard, that the sort of future that's emergent, beyond some of the original views of what the metaverse might look like, which is kind of awkward-looking avatars standing around in a virtual room pointing at whiteboards, trying to mock up
a real-world [00:11:00] experience in a way that is never really going to feel like a real-world experience, is actually going to look much more like an augmented world.
Absolutely. And there are so many examples. I mean, if we now all had Meta Quest headsets on, or Apple Vision Pro headsets, that would make our conversations much more 3D and real, which is pretty cool.
But we talked about the retail example. I think one of the reasons we go into shops to try on clothes, for example, is because we want to see ourselves in them and see whether they fit well. So in the future we can combine things like the generative AI capabilities and the augmented reality capabilities of our phones to scan your body.
And then we can use AI to try on different variations of clothes. So you can combine a dress with shoes, a t-shirt with jeans, with shoes. [00:12:00] And then you can actually go beyond what is possible in a dressing room, because you can go into a shop and try on different combinations, but then you might want to wear this outfit for a cocktail party.
And you want to see what it actually looks like. So you can not only try clothes on and see whether they fit, but you can then transport yourself into a cocktail party, fly around yourself, and ask, will I look cool in this outfit? And this is the kind of future I get excited about, that this technology is starting to have these kinds of capabilities.
Well, the thing that is amazing me at the moment, and it almost does every week, is how sci-fi what you have just said sounds, and how sci-fi a lot of what you see, particularly in the development of AI, which we'll come on to in a second, sounds. But I keep having to remind myself that this isn't sci-fi: some of the innovations that are coming to market are [00:13:00] happening right now, but then some of the bigger
ethical questions, which feel like stuff that gets explored in sci-fi, are no longer the realm of fiction. They are the realm of, you know, society today and of what should be governance today. I wonder if that's occurred to you and if you've got a point of view on that.
Absolutely. And I think sci-fi to some extent is us humans imagining a future, and therefore this is a trajectory that we very often then start building.
And we imagine a better world where technology can help us, or destroy us. And many of these scenarios are becoming real in many instances. I guess my job is to look into the future, so I'm maybe less shocked, but even I regularly get amazed, because I'm lucky enough that I work with all the leading companies in this space, and they take me behind the [00:14:00] scenes and show me stuff that they're developing, and I'm regularly blown away by how fast this is all developing and how amazing these capabilities are becoming. So yes, a bit of both.
Well, I'm sure you had to sign NDAs for all of that stuff, so without really giving anything away, do you have a moment that's specifically resonant with you over the last, let's say, two years, when things have really accelerated, where you saw something behind the scenes and were like, wow, I didn't see that coming, or at least not this quickly?
Yeah, so for me, two things. One is the crazy capabilities of generative artificial intelligence, especially the capabilities to create multiple types of content. We are all familiar with text and chat, but the capabilities to produce music, for example, are amazing. I've recently been taken behind the scenes at one of the largest tech [00:15:00] companies on the planet, and they showed me a music tool where you simply write a text prompt saying, hey, write me a song, or you can even upload your own lyrics and say, produce me a song that sounds like Adele,
that is 1990s rock, whatever you want. And it then produces a song. I was completely blown away by how much it sounded like the artist, and by how it produced three or four different versions of these songs. It is just mind-blowing. And they're not releasing this, for obvious reasons, because of copyright issues, but the technology is there, bubbling in the background, while we're figuring out how we can responsibly release it. The other capability that I
am continuously blown away by is combining AI with [00:16:00] robotics, and the ability of robots to learn by themselves. I've recently visited the robotics labs of Google DeepMind in London. In the past we had to program robots to do things; now they learn by themselves.
So I watched a robot trying to figure out how to pick up a really complex piece, almost like a really complex Lego piece, and then figure out how to put that shape into a matching hole, a bit like what toddlers do when they learn. Rob's trying to get his head around this as we speak.
Trying to work out Lego forever, Dave.
So that was fascinating: I watched a robot almost go through the stages of a toddler, becoming almost capable of doing something. And then they accelerated this. There was another [00:17:00] robot next to it, and a person used basically a glove that would
replicate the robotic arm, and basically showed the robot how to pick this piece up and put it in there. That bit of information was then fed to the robot next to it that was struggling to figure out how to pick it up, and it learned from this so quickly. Within minutes it was able to replicate this and pick the piece up in any kind of situation.
And this, for me, is super fascinating: we now have physical robots that can learn from experience and from others. [00:18:00]
We were also joined by Baz Kampoos from Wingspan.
And he gave us a huge amount of information about the dark arts of private equity.
The new parameter, I believe, is actually only one thing: is it profitable? Right.
Yeah.
Okay. Yeah. Mind you, private equity is the more mature form of private ownership.
Well, actually, before you move on: that might sound, to a listener who's not been in this world, like a really obvious thing, but in the past, am I right in saying that organizations that weren't profitable have also been invested in, because there's a route to profitability over a five-year period, right?
Yeah. For startups, one of those rules is 3, 3, 2, 2, 2, which means that in the first five years of operation the company triples in size in year one and year two, and then continues on a very rapid growth cycle, doubling in years three, four and five. That [00:19:00] was one of the metrics that made unicorns, if you will.
Right. Now, for private equity, it's down to profitability again. And you see that even the VCs are looking: is it profitable? Because if it's profitable, it can weather the storm. You just keep operating and, presuming it's a good product that customers get the right value from, it's okay.
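For anyone who wants to see the arithmetic behind the 3, 3, 2, 2, 2 rule Baz mentions, here is a tiny Python sketch; the starting revenue is a made-up figure, and the rule itself is a rough industry heuristic rather than anything calculated in the episode.

```python
# Compounding the "3, 3, 2, 2, 2" startup growth rule: triple in years one and
# two, then double in years three, four and five. Starting revenue is hypothetical.
multipliers = [3, 3, 2, 2, 2]
revenue = 1_000_000  # hypothetical starting annual revenue

for year, factor in enumerate(multipliers, start=1):
    revenue *= factor
    print(f"Year {year}: {revenue:,.0f}")

# 3 * 3 * 2 * 2 * 2 = 72x over five years -- the trajectory that minted unicorns.
```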
It sounds like, when you're talking about it there, there are core patterns with tech startups; they all seem to follow the same pattern.
Timing might be slightly different. Is that fair? Or is each one very different, so you have to judge it? Or can you almost predict when certain events are likely to occur within the life cycle of a new organization, or of something that you're buying out?
There's a very deliberate plan, Rob.
It's called the investment thesis. Right. As you buy a company, you think you're going to be able to do better than the current ownership, because you tend to believe you have a number of top-line initiatives, [00:20:00] right? And there are very obvious ones: we're going to be able to increase the pricing.
If we're a European company, we're going to extend into North America. We're going to create more meaningful and impactful partnerships. And that's all top-line focused, right? Or, and this is a favorite of mine, we're going to cross-sell. We're going to buy two companies and, on the customer base of both, we're going to actually generate a solid sales channel into those customers.
That one, we could do an entire podcast on, because I don't think it ever happens. But that's the playbook. And then on the EBITDA side: I'm going to put them on a single CRM system like Salesforce, I'm going to consolidate all these finance departments onto one NetSuite, and I'm going to install Workday too, right?
Those are some of the plays.
I mean, a lot of that sounds like common sense, in the sense that these are the strategies [00:21:00] you would use to grow a company. Do you think there's a blocker somewhere in the ethos of these companies? Why haven't they thought about this already? Because it's a consistent pattern, a consistent set of plays. Is there something in the mindset that just doesn't get them to the place you're at, where you say, well, for this company you just do all these things? It's almost obvious, the way you say it.
Well, it is obvious, but mind you, what you buy is a company that has genuinely tried to do all those things in the past as well. So it's the
expertise you're adding. You're adding the expertise of how to execute on those strategies. That's the key difference.
Yeah, and we're talking PE in general right now, right?
I think different flavors of PE are out there in the market. There are PE companies that are very hands-on: they have entire teams in the private equity firm, phenomenal individuals who know how to run and operate and scale a software company. [00:22:00] You also have very hands-off investors, and they say, you guys keep doing what you're doing.
I will buy you right now; let me know what we need to invest in, and I'm going to look for company two, three and four as add-on companies to increase the size of it. That's my job as the private equity owner; I'm not going to tell you how to run your business, right? Both flavors exist. Then there are the headwinds that you were asking about, right?
Cost of capital, the FOMO has left the building, and the buyer-seller expectation gap. There are also a couple of tailwinds that are phenomenal in the software industry, and I'm highlighting three. One is that cloud is not just the new normal, it's the standard
right
Data, especially in the context of gen AI, is the only defensible moat you have around your software business.
It's the only thing that you can protect. Because if gen AI evolves into some of [00:23:00] the doomsday scenarios, which I don't believe in, but I do believe gen AI will be able to observe how an application works and rewrite
it. Baz, Gen AI has been observing Rob for quite some time now. It's still confused about me.
We're all worried. Yeah, you're right to bring that one up, as I can tell.
So data is actually the only thing that you control, and you can defend it and you can monetize it. And we've seen great examples of companies that have taken that successfully to market, or at least are better off for it.
And then the third one is gen AI itself, right? As a topic, the real question with gen AI is who is going to monetize it. But the fact that it's here, and it's here to stay, is without a doubt. Now, what do you do if you're a private equity owner and you own a software solution? You know that [00:24:00] startups have a much lower cost of getting started and potentially the ability to understand what the market really needs.
And you have all this technical debt sitting there. If you still ship software to customers on-prem, gen AI is not even a starting conversation. Now, you have to have a deliberate strategy as well. But the three tailwinds are very significant, right? The good news is lots of innovation at an ever-increasing pace.
I don't think we've ever seen things moving forward at the pace they are today, but it will never be as slow as it is today either, so there's no time to waste.
We had a lot of really interesting guests.
Like Mark Butcher from Positive, who had some thoughts about whether cost could be a proxy for sustainability.
I see a lot of crossover chat [00:25:00] between FinOps and GreenOps about whether cost is a proxy for sustainability. So, just as an intro to the practical measures we can take around this today, what are your thoughts on cost as a proxy?
My first thought is that you're deliberately trying to trigger me today. So, cost as... cost as...
He's good at it, isn't he? He does it to me every day. I mean, Rob's ranting against capitalism, isn't that the trigger? Yeah, yeah.
It was a very difficult question to ask Mark, uh, Rob.
That was the macro question of the week awards, Rob.
So that's one that recently really did set me off. The Amazon CTO, sorry, the AWS CTO, stood up on stage and said, we think cost is a proxy for sustainability. And it's like, it's not, in no way is it, at all, in the slightest. How can you use that? If I take an example: I'm buying two different server types.
Both cost me a dollar per unit. One of them is a really sustainable server build, running really [00:26:00] low power, really efficient, in a low-carbon zone; it could have total emissions, in reality, of about one gram. You compare it with the same one-dollar server in a really high carbon-intensity zone, built in a really inefficient way, which could have an emissions factor of 10,000 kilograms.
Yeah. How is that same dollar converted back into the same metric of sustainability? Cost as a proxy is just an excuse for not having data, for not having meaningful data. You can't align the two in a meaningful manner. Conversely, GreenOps and FinOps do align in the sense that if you spend less, you generally reduce your emissions, but it's not a proxy for a calculation. You can't align those two things, because the biggest challenge people have in FinOps is that they want meaningful data. They want data they will trust, and they will not make decisions with garbage data.
So if you want to optimize your services in the context of carbon as well as the context of cost, you can't use cost as the proxy to make a different decision, because all you're doing is looking at: well, is that the right [00:27:00] decision? If I change from X build model to Y build model, if I go from monolithic to serverless, if I switch from an R5XL to a C5, what impact will that have? Well, your cost's gone down by 50%. Well, have my emissions gone down? Probably not.
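Mark's two-servers example can be put into numbers with a small Python sketch; the energy and grid-intensity figures below are illustrative assumptions, not measurements from the episode, but they show why identical spend can hide wildly different emissions.

```python
# Two workloads with the same cost but very different emissions. Figures are
# illustrative assumptions only: energy use in kWh and grid carbon intensity
# in grams of CO2e per kWh.
workloads = {
    "efficient build, low-carbon region":    {"cost_usd": 1.00, "kwh": 0.02, "grid_gco2_per_kwh": 50},
    "inefficient build, high-carbon region": {"cost_usd": 1.00, "kwh": 20.0, "grid_gco2_per_kwh": 800},
}

for name, w in workloads.items():
    emissions_g = w["kwh"] * w["grid_gco2_per_kwh"]  # grams of CO2e
    print(f"{name}: cost ${w['cost_usd']:.2f}, ~{emissions_g:,.0f} gCO2e")

# Same dollar, roughly four orders of magnitude apart in emissions:
# cost alone cannot stand in for carbon.
```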
Justin Keeble told us about how Unilever is using AI to revolutionize its environmental impact.
Nice. If you look at Unilever, for example, they have a goal of 100 percent sustainably sourced raw materials for their products, which is a huge challenge when you think about the thousands of product lines.
And Unilever is using Google Earth Engine, one of Google's core products, to monitor the forestry around all of [00:28:00] the palm oil in its supply chains, and we can provide live alerts to Unilever if there's a risk of deforestation. Right, right. So that's a good example of where there's a measurement challenge, because you need to know which plantations are in the supply chain; we actually use AI to predict that, right?
And then there's an optimization challenge around how you drive efficiency in your supply chains.
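As a flavour of what deforestation monitoring with Earth Engine can look like, here is a minimal sketch using the public Hansen Global Forest Change dataset via the Earth Engine Python API; the dataset version and the area of interest are assumptions for illustration, and this is not Unilever's or Google's actual pipeline.

```python
# Minimal sketch: estimate tree-cover loss inside a hypothetical area of
# interest with the Earth Engine Python API. Asset version and coordinates
# are illustrative assumptions.
import ee

ee.Initialize()  # requires an authenticated Earth Engine account

# Hansen Global Forest Change dataset (check the current asset ID/version).
gfc = ee.Image("UMD/hansen/global_forest_change_2023_v1_11")

# Hypothetical polygon around a plantation (longitude/latitude ring).
aoi = ee.Geometry.Polygon([[
    [101.00, 0.00], [101.10, 0.00], [101.10, 0.10], [101.00, 0.10], [101.00, 0.00],
]])

# The 'lossyear' band encodes the year of tree-cover loss per ~30 m pixel (0 = none).
loss_mask = gfc.select("lossyear").gt(0)

# Convert loss pixels to area and sum them over the area of interest.
loss_area_m2 = (
    loss_mask.multiply(ee.Image.pixelArea())
    .reduceRegion(reducer=ee.Reducer.sum(), geometry=aoi, scale=30, maxPixels=1e9)
    .getInfo()
)
print("Approximate tree-cover loss (m^2):", loss_area_m2)
```

A live alerting pipeline would rerun a query like this on a schedule, or draw on a near-real-time alert dataset, and notify the supply-chain team when new loss appears inside sourcing areas.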
For our Christmas episode we had a very special visitor.
Father Paolo Benanti, the Vatican's advisor on AI, told us all about how they are thinking about this next frontier in tech.
Well, the first one that raised this problem was the MIT Lab, because they made a survey and asked people in the United States: do you think that autonomous vehicles should be [00:29:00] regulated?
Almost 100 percent said yes. Should they be regulated by the producer? 93 percent said, we don't trust them. Should they be regulated by the government? 94 percent said, we don't trust them. Okay.
Love it. You don't trust them, but they've got to regulate. It's like, that doesn't quite work.
At this point, the MIT Labs guys, who are really smart, made an online survey.
It was like the book you have to study when you take your driving license, with a lot of situations, no? So, you are alone in the car and there are two people on the street. Should the car kill you, or kill the two people on the street? And then, flipping the question, you are with another person on the street and there is a car with one man inside: who should die?
I suspect self-preservation played out quite well in that question.
But the most interesting thing was the example in which they put two young people crossing the [00:30:00] street and one elderly man in the car, and another example with the same situation, but the two younger kids were crossing on a red light.
And so the really interesting thing was that we don't have a shared common solution, but we can split the solutions according to culture. Asian people are much more inclined to preserve the young life; Westerners, the elderly; and in the Northern countries of Europe, if you cross on a red light, you are dead. Can you imagine a Mediterranean like me, where in Rome everyone does what they want, and things like that?
And so this was a huge point, because someone said, okay, there is no common norm valid worldwide, so there cannot be any ethics. But, you know, this is the wrong focal point on ethics, because ethics is not just norms. Ethics is also values. Ethics is also virtues. [00:31:00] So if I ask everyone here in this room, do you prefer a just or an unjust AI?
Well, this is a virtue: everyone would like to say just. Then we can discuss how just is just, but just is universal. So we have to define different things when we talk about ethics. The most important things for reaching an international agreement are not norms, are not values, but are principles. A principle is the tool you use when you are in a dilemma.
So if you have a dilemma in which you can kill one or five, the principle is: minimize the loss. And everyone can find an agreement on that. So when you move to a global scale, you have to change the perspective. In a community that is characterized by some sort of common culture, you can also find norms, values and other things, and that could be a company.
For a company, it's really easy to [00:32:00] have a common culture. If you go to a worldwide, global community of human beings, then principles are enough, and a principle covers a lot, because you can have a really effective design with a principle like minimize the loss, for example.
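To show how a principle like "minimize the loss" can be wired into a system design, here is a toy Python sketch; the scenario and the casualty numbers are entirely hypothetical and are not drawn from the MIT survey or from Father Benanti's work.

```python
# Toy decision rule: among the available options in a dilemma, choose the one
# with the smallest expected harm. Scenario and numbers are hypothetical.
options = {
    "swerve": 1,          # expected casualties if the vehicle swerves
    "stay_on_course": 5,  # expected casualties if it does not
}

def minimize_the_loss(options: dict[str, int]) -> str:
    """Return the option with the lowest expected harm."""
    return min(options, key=options.get)

print(minimize_the_loss(options))  # -> swerve
```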
Let's finish with our great chat with Roisin McCarthy,
who told us all about the techniques they have used to promote women in tech.
Some of the really cool stuff that we've done: there have been things like hackathons, bringing all-female teams together to hack on female-driven societal problems.
Whether it be health, whether it be safety. The other aspect where we've done some really cool stuff is around partner exchanges. So, you know, bringing like-minded [00:33:00] organizations and thinking together that naturally would never be in the same room for any purpose ever. Non-competitive, of course, but, you know, bringing like-minded, brilliant women together who have the same challenges but would never walk the same paths in terms of industry.
So yeah, those sorts of opportunities: we're fostering and managing those relationships to really build better futures for both industries and partners.
Also things totally unexplored. If you remember the first event we did on health, it actually was just an idea to talk about the gender data gap in health, how we can address it, and the paper that came out of it.
I think I still have people contact me and just say, what happened with that? Are we doing more? And of course I point them in the direction of the new chapter that Women in Data set up for health on the back of this, but also other activities that we've been doing internally. So sometimes it's just about an idea, a passion. [00:34:00]
So there you have it: the highlights of our third season. Many thanks to Bart Groothuis, Jack Hansen, Bernard Marr, Baz Kampoos, Mark Butcher, Justin Keeble, Father Paolo Benanti, and Roisin McCarthy.
We had fascinating insights, good times, great guests, and we hope you'll come back to join us for Season 4.
[00:35:00] Bye!