Stupid Simple

Kamil and Brian host a live roundtable with winners from the REVEL × LycheeAI × NVIDIA Isaac Sim Hackathon. Lukas Knak, Jan Olsson, Pascalis Trentsios, Loan Bernat, and Zakariea Sharfeddine share how they built a humanoid butler on a Unitree G1 with GR00T, a physics-heavy claw machine rope sim, and a chess-playing arm trained with a VLA. We dig into cloud setup on Brev H100s and H200s, giant Docker images, data and domain randomization, teleop vs. imitation learning, and the stubborn Sim2Real gap. Plus lessons for Hackathon 2.0 and what builders need next, like one-click train compute.

What is Stupid Simple?

The REVEL Podcast dives into startups, robotics, and venture capital through the lens of building our own company from the ground up. Each episode blends unfiltered conversations, lessons from our journey, and insights from guests who are shaping the future of tech. Expect raw stories, sharp takeaways, and a front row seat to making the real world robot ready.

Brian Walker (00:00.27)
have an apartment in Nice. So we're going to stay there, right on the water, for a little bit. OK, Lukas, let's test it.

Kamil Dusejovsky (00:08.737)
Can you hear me?

No, he can't hear me. But you can hear me. Okay.

Lukas Knak | Packing Panthers (00:12.217)
I can hear you.

Brian Walker (00:13.55)
You can hear Kamil?

Lukas Knak | Packing Panthers (00:15.905)
No, Kamil, no. But you, Brian.

Zakariea Sharfeddine | LeTeam (00:18.289)
Yeah.

Kamil Dusejovsky (00:18.359)
All right, we're just gonna roll with this. We're just gonna roll with this. Brian, tell him we're just gonna roll with this.

Pascalis Trentsios (00:19.181)
No.

Brian Walker (00:27.15)
How can that be? Did you like mute it somehow? Come out? Something? What's going on?

Kamil Dusejovsky (00:32.767)
That doesn't make any sense to me. It's fine, it's fine. We'll be fine.

Brian Walker (00:40.408)
We'll be fine.

Kamil Dusejovsky (00:41.621)
Yeah, you guys have to relay the messages from me to Lukas.

Kamil Dusejovsky (00:48.503)
I'll let you start the podcast, Brian. Yeah.

Brian Walker (00:51.958)
Intro? Yeah, well, since people can't hear you, you know.

Kamil Dusejovsky (00:55.511)
Okay, go ahead.

Brian Walker (00:59.054)
All right, well, OK. Welcome, everybody. This is our second episode of the Revel Stupid Simple Podcast. Joining me here today is Kamil, our CTO at Revel. And we have some really, really great guests today. And surprise, surprise, we got the winners and the runner-ups of the REVEL × LycheeAI × NVIDIA Isaac Sim Hackathon, which finished last week. We had the judging on Tuesday. We had the

livestream announcement on Thursday, which was a lot of fun, because the guys actually didn't know who won at all, other than Zakariea, who won the community vote. The public voting ended on Wednesday, so he knew he was coming to take first from the community. But other than that, you guys had no idea who was going to win. And I remember that you guys, I think Pascalis was like, when he realized that he was going to be a

Lukas Knak | Packing Panthers (01:47.001)
Yeah.

Brian Walker (01:56.566)
that he's going to take the first place. He was in the background waiting to join the livestream, like, what just happened? So that was fun. Gentlemen, thank you so much for joining us. We were super, super amazed and just in awe of the talent and the work that you guys put into the hackathon. Thank you so much for learning about it and then being sort of

Pascalis Trentsios (02:04.623)
Yeah, yeah, yeah.

Brian Walker (02:23.266)
brave enough to sacrifice some time, because we know how time-consuming this stuff is, to join and put your best foot forward and put yourself out there and say, hey, I'm going to do something. So thank you so much for doing the hackathon, and congratulations to all of you. And of course, one more time, congratulations to all of the other teams that participated. Thank you so much to all the judges and sponsors and people that helped us put this together, including LycheeAI's Momir,

who's running this really cool robotics community in Europe. He's an educator, he's running his own YouTube channel where he basically shares sort of the how-tos in terms of Isaac Sim and GR00T and NVIDIA Omniverse and so on. And I would actually love to start with the question: had any of you guys been aware of the LycheeAI community prior to the hackathon, and if so, have you used any of the

educational videos that he posts on his YouTube channel? You know, when you maybe ran into some trouble, have you been able to kind of go to his YouTube channel, look up some how-tos, and follow a tutorial of kind of step-by-step things? So I see Zakariea kind of nodding. Maybe we'll start with you.

Zakariea Sharfeddine | LeTeam (03:42.609)
Yeah, like I knew him before. I think I found him like three, four months ago, when I wanted to look into Isaac Sim, how it works. And I really think that he's doing great educational content which goes into depth. So it's not like surface tutorials. It actually answers questions. And for me, it was a lot of help. And I had like another project that I wanted to do with Isaac Sim.

It was like a simple reinforcement learning task, using the SO-100 to push a box onto a red target. It was part of a uni project. And yeah, the YouTube channel was a lot of help.

Brian Walker (04:27.98)
Yeah, super. He's going to be thrilled to hear that. Guys, if I can just ask you, when you're not talking, mute your mics, please. I hear a bunch of background noise coming from somewhere. Maybe Kamil, actually, or someone. OK, who else used some of the tutorials? Loan?

Kamil Dusejovsky (04:39.703)
I'm muted too, so it can't be me.

Loan Bernat | Hospital 4.0 (04:45.578)
Yeah, like Zakariea, I discovered the LycheeAI channel two, three months before the hackathon. It was because I was doing a little project on, how can I do like quadruped robots patrolling in a factory, and it needed to detect some waste on the ground and pick it up. And so I started watching some of LycheeAI's videos to understand how that thing works, because it's a

really dense software with a lot of things possible. And so I started watching some of the videos, and in fact I discovered the hackathon itself by receiving a notification of its announcement on YouTube. And so I saw the video, and so, yeah, registered for the hackathon like the last Sunday before the deadline. So yeah, it's mainly thanks to LycheeAI and its content that I discovered the hackathon and discovered Revel.

Brian Walker (05:44.094)
Incredible. Lukas, what about you guys? I know your partner Jan is missing today. But Jan's actually non-technical. Jan is in the VC space. He's the one who usually brings the money to the project. How did you guys get together? It's such an interesting combination.

Lukas Knak | Packing Panthers (05:48.589)
Yeah.

Lukas Knak | Packing Panthers (05:54.049)
Right.

Lukas Knak | Packing Panthers (06:02.671)
Yeah, so we've been best friends for a very long time. We've known each other since high school. For the hackathon, so, I found out about LycheeAI and the YouTube and Discord community, et cetera, just about two weeks before the hackathon. More like...

I stumbled upon LycheeAI randomly when I was looking up something for the SO101 arms that I wanted to personally start a project with. I think he started some series with the SO101 and Isaac Sim. And that's how I found out about LycheeAI. And then, I don't know, one or two weeks later maybe, the hackathon announcement dropped

on LinkedIn, I think. That's where we saw it. Well, Jan and I actually saw it independently from each other, and we thought, yeah, let's, I don't know, let's do this project together, let's reconnect on this project. And yeah, it was a great time for us.

Brian Walker (07:12.718)
Amazing, amazing. So did you use any of the tutorials? Because you guys picked a humanoid as the robot of choice. And you guys did GR00T training. And you even set up your project with the cloud. You needed some cloud computers. You started on an H100 from the NVIDIA Brev cloud. And then you actually needed to speed up the process because the deadline was approaching, so you guys switched to the H200.

Lukas Knak | Packing Panthers (07:19.385)
So.

Lukas Knak | Packing Panthers (07:38.147)
Right, yeah.

Brian Walker (07:39.788)
So did you watch any video tutorials on that, or did you kind of have experience with that prior and already knew kind of what to do? And also, if you can kind of unpack the complexity of that, because, as you know, we've talked about it briefly with you guys, what Revel is working on. But if you can kind of talk about the pains of what it's like now to get your project into the cloud and get compute for it.

Lukas Knak | Packing Panthers (07:55.406)
Yeah.

Lukas Knak | Packing Panthers (08:07.267)
Yeah, for sure. So I've had a lot of experience in the past with cloud computing, setting up clusters to train models on the clusters. But it was actually my first time using a vision language action model like GR00T for fine-tuning. Of course, I've used LLMs and VLMs a lot in the past, but just for inference when I've used them. But yeah, now...

During the hackathon, I kind of wanted to give Jan like a full round trip, showing him everything that's going on in the domain of robotics right now. And that's why we decided, well, let's go all in. Let's do a humanoid. Let's do vision language action models for like optimal control stuff, et cetera. So...

Brian Walker (09:00.206)
Real quick, was there a debate about the Unitree G1? Or were you guys pretty set on that robot and its open-source files? Or did you look into other humanoids as well?

Lukas Knak | Packing Panthers (09:15.927)
We started actually with, what's the name again, the Fourier Intelligence GR-1, I think. That's the name of the other one. That's the one that we started with, actually. I'm not 100% sure why we switched, exactly, but at some point we decided to go with the Unitree G1.

and we stuck with it. And actually, since you mentioned the cloud computing, that was quite a mess, to be honest, because we started with trying to fine-tune GR00T locally. But of course, we didn't have a big enough GPU to fit it on the GPU in full. So we had to do low-rank adaptation fine-tuning. And it didn't work out as we expected. So...
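The low-rank adaptation (LoRA) fallback Lukas mentions trades trainable-parameter count for capacity: instead of updating a full weight matrix, you train two small low-rank factors beside it. A back-of-envelope sketch of why that fits on a smaller GPU; the layer size and rank below are illustrative, not GR00T's actual configuration:

```python
# Back-of-envelope: why LoRA shrinks the trainable footprint. For a
# d_out x d_in weight matrix, full fine-tuning updates d_out * d_in
# parameters; LoRA freezes W and trains two factors B (d_out x r) and
# A (r x d_in), so only r * (d_out + d_in) parameters are trainable.
# Hidden size and rank here are hypothetical, for illustration only.

def lora_trainable(d_out: int, d_in: int, rank: int) -> int:
    """Trainable parameters for one LoRA-adapted linear layer."""
    return rank * (d_out + d_in)

def full_trainable(d_out: int, d_in: int) -> int:
    """Trainable parameters for full fine-tuning of the same layer."""
    return d_out * d_in

d = 2048   # hypothetical hidden size
r = 16     # hypothetical LoRA rank

full = full_trainable(d, d)
lora = lora_trainable(d, d, r)
print(f"full: {full:,}  lora: {lora:,}  ratio: {lora / full:.2%}")
# → full: 4,194,304  lora: 65,536  ratio: 1.56%
```

The catch, as Lukas found, is that the frozen base plus low-rank updates can underperform full fine-tuning, which is what pushed the team to a bigger GPU in the cloud.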

We had to find a way to train it on a bigger GPU. That's why we went to Brev. And it was a lot of hassle setting that up. So just to give you a brief...

intro on how I did it in the end: I used Docker, actually, to put all of the Isaac GR00T stuff into Docker, and then put that on Brev. I spent some time trying to figure out how to set up the container optimally, so I didn't have to load up the 50-gigabyte container from my PC into the cloud every time. That took a bit of time, and then also trying to figure out how to get data onto the cloud computer. That was also a bit of

a hassle, actually. I guess it's not the intended way, but I did it the fastest and hackiest way. So I put all of the data, in the end, also inside a Docker container, then just pushed it onto GitLab's Docker registry. And then I could easily pull it from there. Right. Yeah.
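The "hacky" dataset trick Lukas describes, baking the data into an image layer and pushing it to GitLab's container registry so any cloud instance can pull it, could look roughly like this. Every image name, tag, and path below is hypothetical; this is a sketch of the shape of the trick, not his actual files.

```dockerfile
# Hypothetical Dockerfile: stage the demo dataset as an image layer on a
# tiny base, so it lives in the registry next to the training image.
# All paths and names are made up for illustration.
FROM alpine:3.19
COPY ./demo_dataset /data/demo_dataset

# Push once from the local machine (hypothetical registry path):
#   docker build -t registry.gitlab.com/team/hackathon/dataset:v1 .
#   docker push registry.gitlab.com/team/hackathon/dataset:v1
#
# On the cloud instance, pull and copy the data out of a throwaway container:
#   docker create --name tmp registry.gitlab.com/team/hackathon/dataset:v1
#   docker cp tmp:/data/demo_dataset ./demo_dataset
#   docker rm tmp
```

The design point: a registry pull from inside the cloud provider's network is usually far faster than re-uploading tens of gigabytes from a home connection each time an instance boots.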

Brian Walker (11:09.176)
So you were legit hacking throughout the hackathon, just hacking away and hacking everything, hacking everything. If you had to ask.

Kamil Dusejovsky (11:15.915)
How big did the container end up being, Lukas?

Brian Walker (11:19.31)
Well, he can't hear you. I was going to say, I would have Kamil ask you a few questions, but you can't hear him. So he asked how big the container ended up being. Do you remember how big the container ended up being?

Lukas Knak | Packing Panthers (11:32.057)
So the Isaac GR00T container actually was 50 gigabytes,

because I completely built it and built all of the libraries in that container, so I didn't have to build it from scratch every time I booted up the Brev instances. It was mainly because, well, we didn't have so much time to always let it build, and also, I mean, every hour costs money on Brev, so you have to be quick as well. And downloading the 50 gigabytes was faster than...

Brian Walker (11:47.886)
Mm-hmm.

Lukas Knak | Packing Panthers (12:03.319)
I think it took one hour to build.

Brian Walker (12:04.632)
Could you give me an estimate on the total time that you think you spent setting up the compute, like from when you were like, OK, we're going to need to run in the cloud? What would you say was sort of the time that it took? Are we talking a few hours, or did it take a few days?

Lukas Knak | Packing Panthers (12:18.799)
Hmm

So.

Well, I would say in days. So Friday, Friday was the deadline for the hackathon, yeah. And on Sunday, we decided, well,

low-rank adaptation wasn't going to cut it, so we had to go on Brev. So on Monday morning, I started setting up Brev, setting up all of the datasets with the containers, et cetera. So that took, I guess, one day. So the first training on Brev I started on Tuesday, I think. Then it was only like Tuesday, Wednesday, Thursday that we had for training. Of course, I mean...

you never get it right on the first try, so I had to, I don't know, do the same process two or three times, and it always takes some time to set up the Brev instance. I mean, once I had the Docker container, it was more like in the range of one or two hours to get everything set up and run the training. But for the first training, it was more than one day, I would say, to set it up.

Brian Walker (13:40.406)
OK, I'll come back to you, Lukas, with that. But it seems like one of the features that we're building at Revel for the training, which we call one-click train compute, would come in quite handy in that situation. Pascalis, have you had any sort of interaction with LycheeAI? I think you were part of his previous tournament. Yeah, yeah, yeah.

Pascalis Trentsios (14:04.547)
That's right. Yeah, I also participated in the previous project. Yeah.

Brian Walker (14:09.154)
Excellent, excellent. And so, in terms of the content that he's producing, have you ever kind of delved into it for some of the things that you're working on?

Pascalis Trentsios (14:19.759)
For sure, yeah. Actually, LycheeAI was for me like a great motivation to get started with Omniverse. His tutorials are great and a great start. He even covers how to install everything and set everything up in detail, which I think is well received in the community. One of his most viewed videos is actually how to set Isaac Sim up. So yeah, I also did a video with him about my previous project. He hasn't reached out to me yet

about this project. Maybe we can also do some future video where we go into detail, because I think every now—

Brian Walker (14:56.384)
I'm sure that's on the table. There is so much that happened around the hackathon, on the backend of everything, gentlemen. It's like we're still kind of catching up with messages and emails and all the things that have to come out.

Pascalis Trentsios (15:09.911)
I'm actually surprised how quick you were with everything. I think the deadline was on Friday, and by the next week we already had the NVIDIA call. You had done all the grading and set up the podium, the winners, the judges. So huge kudos to you. That was really amazing. I didn't expect it to be that quick.

Brian Walker (15:33.784)
Well, thank you. I really appreciate it. I didn't sleep for a month, but it worked out. OK. And let's talk a little bit about your project then. Obviously, you're the winner, so congratulations once again. Funny enough that you mentioned the videos and sort of the ease of some of the things that Momir puts out there.

Kamil Dusejovsky (15:36.895)
It was a fun weekend for sure.

Pascalis Trentsios (15:51.087)
Thank you, thank you.

Brian Walker (16:03.106)
The other day, actually, I came across a video of his. And we kind of both laughed at it in the end, because it was titled something funny. It was kind of like, hey, look how super easy it is to get started with NVIDIA Isaac Sim. And then the video starts. I mean, the first thing goes, OK, you have to go here, to this part of the website. Then you open this container. Then you scroll down here. You find this extension. You first need to upload this driver. Then you unzip this file,

then you transfer it into this container, and then you can start running the installation. During the installation, it's going to ask you to do ABC. And I was like, there's nothing simple about this. There's absolutely nothing simple about this. And I sent it to him and was like, hey, the title of the video doesn't quite match the first 60 seconds of where the video goes. But it was beautifully done, and he's so patient and

kind of calm in the explanation. So big shout-out to Momir one more time. We're putting up another one for you guys and for everyone who wants to participate in November. I think we're going to be opening the registration somewhere at the end of October. And as we mentioned on the NVIDIA livestream, our goal is to sort of

10x everything: 10x the prize pool, 10x the number of participants total. I think we had like 300 plus, so that would mean like 3,000 people. So the stakes are going to be hopefully also 10x and higher, and reach more robotics clubs, more universities, more researchers, more PhDs. And for us, this is great, guys. We're basically building in public and collecting data on what's working and what's not working in robotics, having these kinds of conversations with you guys. And then we're sort of building—

we're building Revel really for people like you that want to do cool things with robots, that want to train robots and do some kind of cool things. Lukas, real quick, going back to what you were talking about. This is something that, when I talk to a few people here and there, they get quite shocked by how small the models for robotics actually are. Like, controlling locomotion of a humanoid is something like 1.5 million parameters. And people are really shocked, like, wow,

Brian Walker (18:21.1)
that small of a model controls the sort of, you know, the basic locomotion, walking and whatnot. Do you remember when you guys finished training the model, what was the size of the model when you finished?

Lukas Knak | Packing Panthers (18:35.011)
You mean the size of the weights?

like in the file that we ended up with or?

Brian Walker (18:45.422)
Sorry, I muted myself. Yeah, do you remember, well, the model itself, do you remember how big it ended up being?

Lukas Knak | Packing Panthers (18:52.717)
Like on the GPU, it's about 10 gigabytes, and I think the file is between seven or eight, if I'm not mistaken. Or do you mean how many billion parameters? Well, I didn't change the number of parameters, so it's 1.5 billion, I think, right?
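Lukas's numbers roughly check out with simple arithmetic: bytes per parameter depend on precision, and runtime memory sits above the raw weight size. A quick sketch, assuming the 1.5-billion-parameter figure from the conversation (the precision breakdown is illustrative, not GR00T's actual checkpoint format):

```python
# Rough arithmetic for the footprint of a ~1.5B-parameter model.
# Real GPU memory use is higher than the raw weights once activations
# and the CUDA context are added, which is consistent with "~10 GB on
# the GPU" for a checkpoint file in the single-digit gigabytes.

BYTES_PER_PARAM = {"fp32": 4, "bf16": 2, "fp16": 2, "int8": 1}

def weights_gb(n_params: float, dtype: str) -> float:
    """Size of the raw weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

n = 1.5e9  # parameter count mentioned in the conversation
for dtype in ("fp32", "bf16", "int8"):
    print(f"{dtype}: {weights_gb(n, dtype):.1f} GB")
# fp32: 6.0 GB, bf16: 3.0 GB, int8: 1.5 GB
```

By contrast, the ~1.5-million-parameter locomotion policies Brian mentions earlier would be only a few megabytes by the same arithmetic, which is why people are surprised at how small they are.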

Brian Walker (19:02.254)
Do you know how many parameters define the model? Yeah, million or billion?

Got it.

OK, all right, cool. Loan, the hospital. I want to quickly talk about how that came about. You guys all registered for the hackathon, and you had no idea what assets you were going to be choosing from. I think we put about 30 assets into the library from Revel.

But you had no idea. You were kind of like, OK, I'd like to build something like this, in the registration form, but I don't know which assets will be available. And the rules were pretty straightforward: you had to use the asset in the pick-and-place motion of the robot. So let's talk a little bit about, when you opened up the library and realized what assets are in there, how did you start navigating what you could build?

Loan Bernat | Hospital 4.0 (20:05.194)
Well, in fact, to be totally honest, I did not have any idea of what I was going to build, and I just saw the different assets. Like, I saw the tray, and I was, okay, I want to do something which navigates with the tray. And so I was thinking about doing a bit like the Packing Panthers, with humanoids. But in fact...

I was thinking that I wouldn't have enough time to train the locomotion and also the manipulation of objects, because I know from work experience that it's not really easy, and most of the time the locomotion is still a bit jerky, so it

makes the tray move, so objects can fall from the tray. So I was thinking of using a mobile robot with a holonomic base, like no legs, just navigating on the ground. And then I saw the shelf and some of the assets which can be picked from a shelf. And so I

thought, like, okay, I can maybe do something where a robot takes something from a shelf and then puts it on the tray, and then the robot which has the tray moves it somewhere.

But I know from my experience that I cannot finish any project if I do not have a concrete context around it. And so came the idea of doing a hospital, when I was navigating the different assets that Isaac Sim also provides. There is an environment called hospital. And so I directly thought, maybe I can add language, where patients in their room

Loan Bernat | Hospital 4.0 (21:59.335)
can, like, speak into a microphone to ask the system for something. And as one of the

judging criteria was the usefulness of the app, I was like, okay. You know, nurses and doctors are always doing a lot of things. There are a lot of things to do, and we lack nurses in certain places. So it could be really important and impactful to do something like this, even in the real world. So I was like, okay, it could be great to make a simulation of the system. And so here was the idea. It took

me like maybe one or two days to really think about how I would do the system, how I would make everything connected together, and then I started working directly on it and finished on time. Not entirely, but I respected the deadline.

Brian Walker (22:54.794)
Is there somebody in your family, or someone that you're close to, that you've had these conversations with? Because it's so relevant, right? Like, there really is a shortage of nurses worldwide. This is a pretty common problem in America and Europe and other countries as well, of course. So, you know, having them focus on the human interaction with the patient, and having the doctors tend to the patients as well, while automating some of these kinds of very labor- and time-consuming

Loan Bernat | Hospital 4.0 (23:05.994)
Yeah.

Brian Walker (23:24.312)
tasks to robots, totally makes sense. I'm kind of curious, was there someone, a family member or a friend, where you picked up on this, like, hey, you know, a friend is a nurse? Or how did you learn about sort of the problem itself?

Loan Bernat | Hospital 4.0 (23:41.207)
In fact, a lot of my friends are going to be doctors or nurses; they are doing their studies for that, and they are at a point where they are working 40 hours a week, and they are just doing things like that. It's called an internship: they are interns, and they are working in a hospital and just helping patients, bringing them some...

bringing them what they ask for. So they are not really doctors or nurses yet, and they are working non-stop on this. They are always there, sometimes even staying the night to, I don't know, verify that everything is good. Sometimes they need to sleep inside the hospital, stay there, and if there is a problem with a patient, or a patient asks for something, they have to get up and bring what the patient asks for.

Just chatting with them, I understood that this is really problematic, because it's a really hard task. They are sleeping there. It breaks up family life a bit. They don't get home after work.

Yeah, just chatting with them, I understood that it could be something really problematic, and if I had to do this myself, I would probably be a bit sad working like this. But it's important; in fact, compassion for everyone. They are making this sacrifice to help society. I think where robotics could be the most important tool is in work like this, where people are doing a lot of important things that are really...

really not, how can I say, really not demanding, but time-consuming, so robotics can do this really simply. So yeah, that's why and how I came to the idea of doing this.

Brian Walker (25:46.286)
Thank you, and congratulations again on the success of what you built. When I was looking into the research on this topic, I was shocked that, I think it's about 70% of the time the nurse is called to the patient, it is sort of, bring me: can I have some crushed ice? Can I have an ice pack? Can I have an extra pillow? Can I have an extra blanket? Can I get a book, please?

Could you, you know, this or that. Sort of this bring me, take this away, bring me, take this away. Whereas, you know, when somebody has an accident and they need to be cleaned or bathed, washed, taken for an IV, or this or that, that's where you really want to have the human. But like, hey, I need an extra ice pack, can I have some extra water? This is a fantastic opportunity for usefulness of robots.

I think we have to make that hospital robot still kind of visually appealing and fairly friendly-looking, inviting, trusting, especially for elderly patients, so they would feel comfortable and not scared that there is a mechanical unit roaming around. And so maybe programming some voice and some funny LLM to

crack a joke when the robot enters, to lighten up the mood or something, and that can read the emotions of the humans, because people in hospital are usually in their most vulnerable position in life. So yeah, great. Really, really, really great job. Congratulations. And we'll talk more. But I love how you put everything together. I was really...

I was really waiting to see if anybody was going to pick the tray from IKEA, you know, the typical cafeteria tray. And actually, you, and Lukas and Jan, the Packing Panthers, also picked the tray with their butler. And I think the setting you guys picked for your butler was also a hospital. I think you guys also, Lukas had a...

Lukas Knak | Packing Panthers (28:01.839)
Yeah, we picked the hospital as one type of application where it could be placed in.

Kamil Dusejovsky (28:13.335)
Pascalis, I have one question. So you chose the infamous claw machine, right, which we've all experienced when we were little, maybe when we were adults too. What made you choose the claw machine? Have you had a lot of fun times with it when you were younger?

Brian Walker (28:13.356)
Okay, we got the... Yeah, go ahead, go ahead.

Pascalis Trentsios (28:29.86)
Sure, sure, I spent a lot of money on claw machines. I lost a lot of money there; I think I only won one, two times. Some toys, some fluffy toy. But actually, going back to Brian's question: previously you didn't show us the assets, but you made us also describe our project. So that's why I had the bark-and-bite idea and wanted to have some robot to give my dogs treats. But then when I saw the assets, I thought, hmm,

this is going to be hard with the assets provided. There were no treats in there. So yeah, that's why I thought, maybe just put them all in one place and try to grab them. And then gradually this idea came to life. So I started with a claw, and then I just thought, okay, how far can I push the limits of Omniverse, of Isaac Sim? And it turns out pretty far. So yeah, the software is pretty good at simulating reality, but you have to tweak a lot to get it right.

Kamil Dusejovsky (29:29.121)
You did a good job.

Pascalis Trentsios (29:29.156)
Yeah. Thanks. Thanks, Kamil.

Brian Walker (29:31.758)
The same goes for the presentation; yours was just so on point. It was like a little movie to watch. And I think we had a lot of technical judges on the team, but we also had a lot of people who really needed the explanation throughout the process. And as soon as the claw machine showed up—

Pascalis Trentsios (29:53.7)
Mm.

Brian Walker (29:56.32)
Lucas, who was one of our judges, maybe you guys are familiar, he's a pretty big voice on LinkedIn in robotics. The first thing that popped up as soon as your video started to play was this image from the Toy Story movie, with all the little toys on the bottom, and they're looking at the claw, if you guys remember the Toy Story movie, and they're like, he's here to choose, you know. And it was kind of like that.

And everybody just went immediately back to their childhood, with like Toy Story and claw arcades and stuff. So it was really great. I actually want to ask a technical question that we were talking about with Kamil, and that was the rope, and the simulating of the rope, which is basically a deformable object. Can you talk about that a little bit? Because you went one direction, you said it didn't quite work well, and then you went a different direction. Can you talk about that a little bit?

Pascalis Trentsios (30:37.06)
Mm-hmm.

Pascalis Trentsios (30:46.98)
Yeah.

So the rope. So when I researched it, I found a post on the Omniverse community forum where someone was showing how to create a rope. And what he basically did was a lot of little capsule rigid bodies, and then attached them one to another. So I created like a little helper script that did this with a prim-based approach. So with prims, you have them,

they're selectable in Omniverse. You can have them individually and you can select them. But there's also something called the point instancer, where you lose this ability to select them via the UI. I'm not 100% sure how it works, but basically, it's much better in performance. So if you have a lot of objects, a lot of physics objects, the best way, I think, or one way to go, would be to use this point instancer approach.

But then I had some other components, like the winch or the claw, and I wanted them to be prim-based. And I wasn't quite able to attach both, like the point instancer approach with the prim-based approach. So everything was just prims, and also just rigid bodies, but a lot of rigid bodies, actually. So that's why you get the feeling that it's a deformable body. But it's basically, I think, 50 or 60 individual rigid bodies.
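The geometry behind a rope built as a chain of capsules can be sized with simple arithmetic: split a rope of total length L into n capsule links, and put a joint anchor at each link boundary. The helper function below is a hypothetical stand-in for the script Pascalis describes; actually spawning the capsules and joints would go through the USD / Isaac Sim Python APIs, which this sketch deliberately avoids:

```python
# Sketch of "rope as a chain of rigid bodies": compute the link length,
# the center of each capsule link along one axis, and the anchor point of
# each joint between consecutive links. This only does the geometry; in
# Isaac Sim you would then create one capsule rigid body per center and
# one joint per anchor (a hypothetical helper, not the actual script).

def rope_links(length_m: float, n_links: int):
    """Return (link_length, link_centers, joint_anchors) along the x-axis."""
    link_len = length_m / n_links
    centers = [(i + 0.5) * link_len for i in range(n_links)]
    # one joint between each consecutive pair of links
    anchors = [(i + 1) * link_len for i in range(n_links - 1)]
    return link_len, centers, anchors

link_len, centers, anchors = rope_links(1.0, 50)
print(f"each link: {link_len * 100:.0f} cm, joints: {len(anchors)}")
# → each link: 2 cm, joints: 49
```

Fifty links over one meter gives 2 cm capsules, which matches the "few centimeters" estimate that comes up a moment later in the conversation.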

Brian Walker (32:11.896)
So the rope was more kind of a representation, almost like a chain of rigid bodies, essentially. Do you remember how small each link of the chain was? To get the result of sort of simulating the physics, or the appearance, of rope, do you remember, in terms of scale?

Pascalis Trentsios (32:16.387)
Yeah, yeah, it's basically a chain. Yeah, yeah, yeah.

Pascalis Trentsios (32:24.548)
Mm-hmm.

Pascalis Trentsios (32:29.088)
Yeah.

Pascalis Trentsios (32:33.893)
I don't know the dimensions, but you can somewhat see it in the video, where I showcased how I pulled it through the torus, through the small hole, to retract it. There you can see when you look closely. Yeah. So that's why I had like 50 or 60, but in total it was only like one meter of rope. So yeah, basically a few centimeters each. I don't know how much it is in US units. I don't know, some inches or so.

Brian Walker (32:45.422)
Yeah, it looked very small, looked like maybe a... Yeah, it looked very small, like a centimeter or something. Ah, okay.

Brian Walker (32:56.654)
Three centimeters.

Brian Walker (33:01.038)
It's OK, we can do it: one inch is two and a half, 2.54 centimeters. So roughly an inch, roughly an inch. Like one of those little cocktail sausages. Anyways. Zakariea, you are the winner of the community vote. And your selection was actually the most common selection. But we really felt like you sort of took it

Pascalis Trentsios (33:06.885)
Yeah, okay, a couple of inches. Roughly an inch. Yeah.

Yeah, basically.

Brian Walker (33:30.39)
to the next level with the chess. Can you talk about how you heard about the hackathon and why you decided to join? And when you didn't know what assets were going to be put in front of you — when you did open up the asset library — how did you decide which direction you wanted to go, and what challenges did you face throughout the process?

Zakariea Sharfeddine | LeTeam (33:56.153)
Yeah, of course. I heard about the hackathon through his YouTube channel — same as the other guys here, I was just following him, and when I saw the video I instantly knew I wanted to participate. Actually, in the beginning I wanted to do something like WALL-E. I loved that movie. I wanted to have a mobile robot that goes around,

picks up trash and can kind of sort it. But there was no trash in the assets, so I ended up doing something else. I saw the chess asset and knew most people were going to pick it, but I thought, okay, it looks fun — let's see what we can do with it.

Brian Walker (34:32.194)
I love that movie too, by the way. I hear you. It's one of the best.

Zakariea Sharfeddine | LeTeam (34:55.268)
Yeah, that's the reason behind it. My goal would be to transfer this policy into the real world too, because that could be really fun. The SO-100 arm costs only about 100 euros, so it could be pretty affordable to just attach it to a chessboard, and it could automatically put the chess pieces into the initial position for you.

Or you could use it to train new openings — it forces you to play the correct opening, and if you fail, it just starts from the beginning. I think there are a lot of ideas. You could also build two of them, so you can play with your friends physically, but from a distance.

So you don't have to go on your mobile app and press on those 2D chess pieces. You could do it in the real world. So there are a lot of applications and I think it's really fun.


Loan Bernat | Hospital 4.0 (36:07.958)
I have a question on what you did. I know that when you're trying to train a policy to pick up an object and put it somewhere else, it's in fact very hard to design a reward function for it. So did you use demonstrations — imitation learning — or did you craft the reward directly?

Kamil Dusejovsky (36:09.047)
Yeah, that would be great. Go ahead, Loan.

Zakariea Sharfeddine | LeTeam (36:38.096)
Yeah, I also fine-tuned the VLA model, like the Packing Panthers did. I think that's the easiest way. Initially I tried to have as many random positions as possible, but — like Lukas said — after initially deploying the policy, it didn't work at all. So I thought, okay, for the hackathon I'm just going to use two initial positions. The first one was super easy: move the pawn

Loan Bernat | Hospital 4.0 (36:41.502)
Okay. Okay.

Zakariea Sharfeddine | LeTeam (37:08.048)
to positions further away — that wasn't a big problem. But normally, I think you would need way more data samples. I only had 50 at the end to actually learn all the positions, or to learn how to move one chess piece to an arbitrary position.
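The 50-demonstration setup Zakariea describes is behavior cloning at heart: fit a policy so its actions match the demonstrated actions for each observation. A toy version of that loop — with an invented linear "policy" and synthetic demos standing in for a real fine-tuned VLA like GR00T — looks like this:

```python
# Tiny behavior-cloning sketch: learn an observation-to-action mapping
# from ~50 demonstrations. Everything here (the linear "policy", the
# synthetic expert) is invented for illustration -- a real VLA is a
# large pretrained transformer, not a 2x2 linear map.
import random

random.seed(0)

# 50 synthetic demos: observation = (board_x, board_y) of a piece,
# action = 2 joint targets produced by a hidden "expert" linear rule
TRUE_W = [[0.8, -0.2], [0.1, 0.5]]
demos = []
for _ in range(50):
    obs = [random.uniform(-1, 1), random.uniform(-1, 1)]
    act = [sum(w * o for w, o in zip(row, obs)) for row in TRUE_W]
    demos.append((obs, act))

def predict(W, obs):
    return [sum(w * o for w, o in zip(row, obs)) for row in W]

def mse(W):
    # mean squared error between policy actions and demonstrated actions
    return sum(
        (p - a) ** 2
        for obs, act in demos
        for p, a in zip(predict(W, obs), act)
    ) / len(demos)

# plain gradient descent on the imitation loss
W = [[0.0, 0.0], [0.0, 0.0]]
lr = 0.1
for _ in range(500):
    grad = [[0.0, 0.0], [0.0, 0.0]]
    for obs, act in demos:
        pred = predict(W, obs)
        for r in range(2):
            for c in range(2):
                grad[r][c] += 2 * (pred[r] - act[r]) * obs[c] / len(demos)
    for r in range(2):
        for c in range(2):
            W[r][c] -= lr * grad[r][c]
```

The same dynamic he saw applies here: with few demonstrations and little variety in starting positions, the fitted policy only covers the states the demos covered.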

Brian Walker (37:33.314)
Kamil, you had a question?

Kamil Dusejovsky (37:34.834)
Yeah, that would be great — two people playing chess, maybe voice control, eating food. They don't have to do anything; they just say where the arms move. That'd be great. My question is: you guys saw each other's submissions, and we're going to post the top 10 on our website, on the hackathon website. So later, when it's up there, you can go check out the other ones and see how you compare. How do you see the other submissions? There are four of you —

do you see yourselves as being on kind of the same technical difficulty level? Did you learn something from the other guys who placed in the top three or won the community award?

Zakariea Sharfeddine | LeTeam (38:18.712)
If I can begin — I think the difficulties are really completely different. I feel like Pascalis really pushed Isaac Sim: normally it only works with rigid bodies, but the way he used them, he could replicate a chain. I think that's really awesome. Me personally, I rather trained the model

and set up the environment. So I think that's, yeah, kind of the difference.

Pascalis Trentsios (38:57.41)
Yeah, I totally agree with Zakariea. I had no robot, no machine learning, no model that I trained — I focused only on, or nerded out on, the physics and how far I could push it. For example, I also liked Zakariea's post in the community, because you shared the community videos beforehand, and also the one from Dan where he basically threw an object. I think that was a cool approach, too.

And from the pro league, something I'm also interested in and would like to explore is this teleoperation as a baseline for imitation learning — what Lukas did. There was also another project where someone had a real robot, a real SO-100 arm, and used the embodiment of a different robot to teleoperate it in simulation, basically using reality as

a baseline that you can use later for imitation learning. I think those projects are very interesting, and I see a lot of potential in this.

Kamil Dusejovsky (40:11.551)
And if you guys don't have anything else to add, I have one more question. So most of the submissions came in about an hour or two before deadline on Friday. Were you guys one of those submissions or did you guys submit ahead of time?

Pascalis Trentsios (40:28.702)
I submitted, I think, one hour or half an hour before the deadline. It was only two weeks — they only gave us two weeks — and I have a day job, so I only had the weekends to work on the project, and a little bit of Friday putting everything together. So yeah, it was one of the late entries — but on time, though. I didn't go over time.

Kamil Dusejovsky (40:48.759)
I see, yeah. And if you could estimate — just a guess — what was the total amount of work you actually put into the project, in hours?

Brian Walker (41:06.218)
That's a really good question. I'm curious myself.

Pascalis Trentsios (41:07.648)
Yeah, it's hard to say. I think, like...

three days where I worked very hard on it — so a couple of hours each. I'm not sure exactly, because I started in small increments: I first did the claw and some physics testing there, and my day job also involves Omniverse. So it's hard to know what was for the job and what was for the hackathon, because, you know...

Kamil Dusejovsky (41:41.365)
Yeah, it all blurs together.

Pascalis Trentsios (41:43.392)
Yeah, basically. So yeah — a couple of days, and a couple of hours for each of those days.

Kamil Dusejovsky (41:52.887)
I just realized Lukas cannot hear me. Brian, could you ask him how long it took them?

Pascalis Trentsios (41:57.476)
Yeah.

Brian Walker (42:00.27)
Lucas, I'm going back to my translation role here. How long did it take the Panthers to work on it? How many hours total would you say the team spent on the project?

Lukas Knak | Packing Panthers (42:15.065)
So I counted my hours, actually — well, I stopped counting beyond 100 hours that I put into the project. And Jan maybe another day. So yeah, definitely more than 100 hours together.

Brian Walker (42:25.346)
Wow. Okay.

Brian Walker (42:33.206)
That's a lot. Okay. And Loan?

Loan Bernat | Hospital 4.0 (42:37.11)
I think I'm maybe more around 40 hours. Like Pascalis, with my day job I mostly worked on the two weekends that we had. So it was like 8am to 11pm,

with some breaks for lunch, going to the gym, playing a bit — but it was mostly all weekend long, and some nights too. In fact, since the deadline was Friday and I was scared I wouldn't have finished the whole video, I asked my job for vacation that Friday morning, so I just finished the video and submitted at mid...

at 12, so yeah — definitely less than Lukas. Two times less, I think: 40 hours.

Brian Walker (43:38.412)
No, sorry, go ahead. There was a previous question from Kamil as well: were you one of the last-minute submitters? We actually received the majority of the submissions half an hour to an hour before the deadline. We had so many participants, so many registered teams, and three hours before the deadline I think we had nine submissions. So we were like, wow, the fall-off. And then just —


Lukas Knak | Packing Panthers (43:53.455)
Alright.

Brian Walker (44:09.11)
You know, a crazy amount of submissions came in within the last hour or half hour before the deadline. Were you guys one of those teams?

Lukas Knak | Packing Panthers (44:21.453)
Yeah, actually — we read on Discord later that the deadline got pushed, right? By five more hours or so.

Brian Walker (44:29.25)
Yeah, we had — that's one thing that, you know, please excuse us, and I kind of wanted to talk about briefly. This was our first-ever hackathon as Revel, right? We're a startup building a product, a company talking to investors, with our own team and everything going on. And we basically went with the hackathon as a way to meet our potential customers,

address the issues in the market, the problems, and so on — we'll get to talk about that. But this was our first hackathon, and one of the mistakes we made was people not understanding the time zones, because we had registered people from all over the world, and some people were messaging us like, well, I thought it was eight o'clock my time — you know, is it Central European time or Pacific time?

So what we did was basically an extension to even out eight o'clock for everyone: it was eight o'clock in Europe, eight o'clock in New Zealand, eight o'clock in California. So there was about a 10-to-12-hour extension on the submissions, because people were like, hey, I'm running home from work, I want to submit the project — I thought it was my 8 PM.

So next time we've got to be clearer on a centralized time for everyone and for submissions. So — you guys submitted earlier, or?

Lukas Knak | Packing Panthers (45:58.915)
Yeah — no, in our time zone it was actually one minute before 8pm, so it was a last-minute submission.

Brian Walker (46:07.032)
So close. I wanted to comment real quick on the SO-100 arms. It's one of those things where, if you skip Starbucks for a week, you can just get one of these robots. It's an incredible little tool. Like I said, skip Starbucks for a week and you can have a pick-and-place robot on your desk,

which is great for the Sim2Real transfer. I want to quickly ask you guys — feel free to jump in — what is your level of experience with Sim2Real transfer? And can you speak to the failure modes between sim and real — if you do a simulation, how often does it translate well to the real world?

Pascalis Trentsios (47:01.122)
Yeah, maybe I can start. I did a Sim2Real project almost four years ago, I think. It was not with Isaac, not with NVIDIA, but with the Unity engine, in fact. I trained a robot in the Unity engine, and that's why I have some experience with modeling and simulating stuff. For me, the hardest gap was actually the visuals. The robot I used was small —

like a Roomba, a small wheeled robot — and it had a camera. Simulating the physics, the rigid bodies, was relevant, but not as relevant as the actual visuals. I had to match the visuals in simulation really closely to reality to be able to transfer it into reality.

Brian Walker (47:55.054)
Super cool. Does anybody else have experience with Sim2Real, or has it mostly been simulation thus far?

Zakariea Sharfeddine | LeTeam (48:02.114)
I also did it with Unity — my last project at university. And it didn't really work out that well. We didn't do enough domain randomization, and I think that's really important. It was just a matter of time; we couldn't finish the project.

Brian Walker (48:20.014)
Yeah.

Brian Walker (48:27.182)
When you mentioned the WALL-E movie — if you guys have seen it, it's really, really lovely — it made me think, when we were talking about your project, Zakariea, about the chess playing, and Kamil said, well, I could be having a sandwich and playing with my friend while the robots move the pieces. At the end of your video, the arm went quite haywire and started tossing the figurines all over the place, right? So it would be kind of interesting if you're just eating a sandwich

Zakariea Sharfeddine | LeTeam (48:27.416)
So nice.


Brian Walker (48:56.686)
and the SO-100 arm, or whatever robotic arm, just starts throwing chess pieces at you — like, you get hit. And also, to the reference of the WALL-E movie — I don't know if you guys remember, but in WALL-E the humans left the Earth, which is basically a pile of trash, and they're on this ship in these crazy chairs where they basically don't do anything. The demonstration of playing chess was wonderful, but I hope that

chess stays a game played against a human — the effort of moving a chess piece is not that big. But in terms of playing with someone across the ocean and having this robotic control, I think that's really, really exciting. Somebody just dropped off. I don't know — oh, it's Lukas.

Hopefully he can join back in. I don't know how to bring him back up. Kamil, if you're still here, I guess you're the host — if you see Lukas and he wants to join back in, can you bring him back up? Kamil's in Las Vegas right now; they're setting up a huge trade show.

Kamil Dusejovsky (50:12.609)
Yeah, I still see him inside, but... he's still in the session, so I cannot re-admit him. Looks like his video is just off.

Brian Walker (50:20.514)
Got it. Now I can't hear him, I can't see him — I don't know if you guys can. Can you see him, hear him? No? He disappeared. Yeah, Lukas is gone. In the last part of this podcast, guys, we'd love to talk a little bit about the current pain points of robotics training and simulation.

Pascalis Trentsios (50:29.335)
I can't see him.

Brian Walker (50:50.606)
I think we'll focus on the simulation — one of the core pillars we're solving at Revel, which is data and training. As you know, we're on a mission to make robotics training stupid simple. We even named the podcast after this mission — this podcast is called Stupid Simple. And what that means for us, at least in our vision, is similar to

where the building of apps and web applications has headed with products like Bolt and Lovable, where even not-so-technical founders can build prototypes and deploy demonstrations to a team. It's pretty awesome. They call it vibe coding, so we're calling this vibe training. Can you guys talk a little bit about

the difficulties — the biggest pain points and challenges in terms of the software stack? One of the things we see is that the software is built by incredibly smart PhD-level engineers, and it shows: it's got a lot of power, but it's therefore lacking a certain user-friendliness.

Can you talk about the biggest pain points — where would you like to see progress made? And knowing what you know about Revel and what we're working on, could you give us some pointers on what you would love to see solved? We briefly talked about compute, so maybe you can mention that. But are there other things? We'd love to hear from you — feel free to jump in.

Pascalis Trentsios (52:43.461)
Sure, I can start; you guys can join in later. For me it's also, as you said: the software is great. You have so many possibilities for configuration, so many things to do with Omniverse, with Isaac Sim. And the fact that you can build custom Kit extensions to your needs, either Python-based or C++, is awesome. It's just a really awesome tool.

But then again, as you said, it's maybe a bit overwhelming, and getting started is quite hard. That's why I'm super grateful to people like Muirma, who did a great job with all the content he provided to get people started in Omniverse. And you can basically build on top of that, as you mentioned, Brian — to maybe streamline certain workflows, to automate them, to make them a bit

easier or more user-friendly. I think this would be great and would help a lot of people get started with Omniverse — making, as you said, developing robots stupid simple. I think there's a lot of potential in going this way.

Brian Walker (54:02.125)
Loan.

Pascalis Trentsios (54:02.127)
Give the mic to Loan.

Loan Bernat | Hospital 4.0 (54:03.734)
Yeah. In fact, I never really trained a policy in Isaac Sim or Isaac Lab before. I was more on the model-based side of robotics — solving the equations directly to generate the trajectory. That's why I started with it directly during the hackathon.

Loan Bernat | Hospital 4.0 (54:30.27)
It's clear that data is the key point for robotics training. We saw this with LLMs, where we needed billions and trillions of quality data points. But for robotics, even if there are research efforts to bring more data for robot movement,

it's not at all at the same level as pure text. So being able to augment the number of trajectories using simulation — Omniverse, GR00T — which can

automatically increase the amount of data using randomization and things like this, and having a company capable of bringing really high-quality assets that look like real life, as you provide with Revel for example — it really helps a policy converge more efficiently toward

a behavior that will work in simulation but also in the real world, because the light and the quality of the assets in simulation look just like the real world. So I think it could bring impactful Sim2Real transfer without needing a lot of randomization. But I did not experience it myself — maybe Zakariea did, with the SO-100 arm he trained on

the chess task. You probably rapidly saw that it needs a lot of data to make a model like this converge even on a simple task of moving a piece from one place to another. Even for a simple task like this, you need a lot of demonstrations, a lot of randomization, and it only transfers to the real world if the chess set you use in simulation

Loan Bernat | Hospital 4.0 (56:28.734)
looks the same as the one in the real world. So providing assets of very high quality is, I think, the best way to limit the Sim2Real gap. But as I said, I haven't really experienced it myself.

Brian Walker (56:49.026)
Great, great. I see Lukas has joined us back. We can't see your video, but Lukas, are you with us? Maybe we can hear you.

Lukas Knak | Packing Panthers (56:55.605)
I can hear you, yeah. I don't know why it doesn't allow me to turn my video on again. Sorry — I think my connection dropped or something.

Brian Walker (57:05.716)
Yeah, maybe try hopping off and back in — that might fix the camera issue we had before. We're currently talking about the pain points of robotics training, going around the circle. So, Zakariea, do you want to hop in? I'd love to hear from you: where do you see the biggest pain points of robotics training that you've experienced, and where would you love to see improvement?

Lukas Knak | Packing Panthers (57:13.002)
Okay.

Zakariea Sharfeddine | LeTeam (57:34.583)
Yeah, definitely. Currently, GR00T Mimic already sounds really great just from its capabilities — you can change the lighting, the textures, and so on. But maybe correct me if I'm wrong: there's currently also something missing, like asset randomization. So I'm pretty sure, okay, maybe I made the Sim2Real work for this chess asset,

but I'm pretty sure it wouldn't work for another one where the pieces look a little bit different. I think something like this is currently missing, from what I've seen. Say you built a sandwich-maker environment: you have one asset for your patties, but what's missing is some variation in the assets.

So maybe that's something Revel could definitely look into — or maybe you're already building something like this — because it would be really beneficial.
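The per-episode randomization Zakariea is asking for — vary the lighting, the textures, and which asset variant gets loaded — boils down to sampling a fresh configuration on every environment reset. A minimal sketch of that sampler, with all variant names and ranges invented for illustration (in Isaac Sim you would apply these via Replicator or your own reset hook, not a standalone function):

```python
# Sketch of per-episode domain + asset randomization. All names and
# ranges below are hypothetical; the point is the pattern -- sample a
# new scene configuration every episode so the policy never overfits
# to one chess set (or one patty).
import random

PIECE_VARIANTS = ["staunton_set", "minimal_set", "wooden_set"]  # invented asset names
SCALE_RANGE = (0.9, 1.15)   # squish/stretch range, as in the patty example

def sample_episode_config(rng):
    return {
        "light_intensity": rng.uniform(300.0, 1500.0),     # arbitrary units
        "light_temperature": rng.uniform(3000.0, 6500.0),  # kelvin
        "table_texture": rng.randrange(8),                 # index into a texture pool
        "piece_variant": rng.choice(PIECE_VARIANTS),       # asset-level randomization
        "piece_scale": rng.uniform(*SCALE_RANGE),
        "camera_jitter": [rng.gauss(0.0, 0.01) for _ in range(3)],
    }

rng = random.Random(42)                # seeded for reproducible runs
configs = [sample_episode_config(rng) for _ in range(100)]
```

Swapping the asset variant itself, not just its texture and lighting, is exactly the piece Zakariea notes is missing from texture-only randomization.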

Brian Walker (58:40.012)
Yeah — randomization of similar objects, and identification. Absolutely, it's definitely one of the core things we're looking at. We also really want to have a very strong library of real-world assets — things that robots will most likely come in contact with. We actually have a whole agentic system doing research, looking at

the next three months, six months, twelve months, two years: what are the objects most likely to be useful in manufacturing, warehousing, hospitals, and so on — the ones we need to have in the library, with domain randomization. Hey Lukas, there you are. The domain randomization you were just talking about, Zakariea, and

the variety — and also having the models be able to recognize: no, this is still a patty, it just got squished a little in transfer, but it's still a burger patty; this is still a slice of cheese; this is still a piece of lettuce. Being able to still identify the objects correctly — I think that's definitely a very good point and something that's super needed. We're working hard on it. That's a great point. Lukas, we were talking about current

challenges in robotics training — where the biggest pain points are, and what you would love to see improved. Knowing a little bit about Revel and what we're working on, what are some of the things you feel really need to be addressed in robotics training and data?

Lukas Knak | Packing Panthers (01:00:29.549)
Yeah, I've heard a few keywords dropped here and there already. From my perspective — as I explained before — what was really a mess was trying to get everything training properly, with the data and the models and so on, on some cloud compute. So that's really an avenue where I see a lot of potential.

Brian Walker (01:01:00.046)
So taking away the complexity of setting up the training: you could either take the file or just log into our platform, drag and drop the file, get a pre-calculated cost of the compute — so you know what it's going to cost and how long it's going to take — and then just one-click pay, in tokens you've purchased or directly debited, something like that. Yeah, we hear that a lot. And it's probably the first feature we're going to go out with, actually.

Lukas Knak | Packing Panthers (01:01:28.119)
nice.

Brian Walker (01:01:29.07)
And if we can, we'll try to make it happen for Hackathon 2.0. If not for 2.0, we'll definitely launch it for 3.0 — where we'd have the training platform live alongside some of the assets, and allow people to say: I've built my project, drag and drop the file, one click, run compute if needed. OK, that's really, really great.

Is there something else you guys see? This next one is a bit of a bet on the future.

At Revel, we're looking hard at the generative world — generative assets, in terms of generatively simulating physics, center of mass, gravity, randomization, and so on. How do you guys feel about the intersection of physics engines and generative worlds? Have you paid much attention to it? I guess the question really is: we see that,

ultimately, ideally, robots would learn from imitation just by watching — the way humans learn. You can watch a tutorial, like watching Moomir's tutorial and doing something. That's one of the core features we're developing on the Revel training platform — and sorry, I'm circling around the name, we just haven't announced it publicly yet, but it's coming. One thing we'd love to get to is showing the robot

a synthetic training curriculum built from text and videos with simulated physics, labeling all the trajectories correctly, collecting as many tokens as possible from that curriculum, and then training a new model on that particular skill. Something as simple as: teach this robot how to use a hammer, teach this robot how to do ABC, or whatever.

Brian Walker (01:03:38.262)
Have you guys come across any research on this? I've been seeing a lot of different angles on it — on LinkedIn, from universities, in research papers, and so on. What is your take on the future of learning? Are you betting on physics and simulation engines like Isaac Sim, or do you think we're going to

eventually get to the point where robots can learn purely from imitation — even getting away from teleop imitation learning, and really just showing the robot hundreds of millions or billions of generative videos of that particular asset and that particular robot, from multi-angle viewpoints, and just collecting data, collecting data, collecting data on everything we show the robot as it's being generated?

This is something we're making a huge bet on at the core of Revel's training platform. So I'd love to hear your take on where that technology is and where it's going, or whether you've come across anything in that domain.

Loan Bernat | Hospital 4.0 (01:04:55.25)
I think the main difficulty of the approach you present — doing imitation learning just by watching — will be transferring the control. When you're doing imitation learning with a robot,

you are basically mapping each possible state of the observation space to an action — say, for each joint of the arm, you send it a position, if you're controlling it in position mode. And each robot has a different configuration: different joints, different body lengths between joints. So being able to transfer all the data from one robot to another

just like that — I think it could be really, really difficult to do properly, because it will need an adapter: something that can translate from the original embodiment.

Brian Walker (01:06:00.31)
Like a converter. So having the math figure out, from the source video — the training imitation video — all the distances in terms of movement, and then recalculating and adapting that to the different degrees of freedom and capabilities of a variety of robots. Is that what you're thinking? That there are so many varieties of robot that it would be quite complex to convert?
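One concrete form the "converter" being discussed can take is retargeting in task space: instead of copying joint angles between embodiments, keep the demonstrated end-effector path and let each robot's own inverse kinematics recover its joint angles. A toy sketch with a hypothetical 2-link planar arm (the link lengths and path are invented; real retargeting also handles orientation, reachability, and collisions):

```python
# Cross-embodiment retargeting sketch: one demonstrated end-effector
# path, translated into joint commands for two arms with different
# link lengths via analytic 2-link inverse kinematics. All numbers
# here are illustrative.
import math

def ik_2link(x, y, l1, l2):
    """Joint angles (elbow solution) for a planar 2-link arm reaching (x, y)."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))   # clamp against numerical noise
    q2 = math.acos(c2)
    q1 = math.atan2(y, x) - math.atan2(l2 * math.sin(q2), l1 + l2 * math.cos(q2))
    return q1, q2

def fk_2link(q1, q2, l1, l2):
    """Forward kinematics: end-effector position from joint angles."""
    x = l1 * math.cos(q1) + l2 * math.cos(q1 + q2)
    y = l1 * math.sin(q1) + l2 * math.sin(q1 + q2)
    return x, y

# a demonstrated end-effector path (from the source robot or video)
path = [(0.5 + 0.05 * i, 0.3) for i in range(5)]

# "translate" it to two embodiments: same task-space path, different
# joint-space commands for each robot's geometry
robot_a = [ik_2link(x, y, 0.4, 0.4) for x, y in path]
robot_b = [ik_2link(x, y, 0.6, 0.3) for x, y in path]
```

The joint trajectories `robot_a` and `robot_b` differ, yet both arms trace the same hand path — which is the sense in which the adapter "translates" a demonstration between robots.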

Loan Bernat | Hospital 4.0 (01:06:28.958)
Yeah, yeah — because there are a lot of unique robots, each with a unique configuration. So being capable of building this transformer — this translator, if you want — is more complicated. In fact, I think simulation can bring a lot of quality data,

as I already said: if you have the capacity to simulate something really close to the real world — the physics, data quality that looks like real life — it can really facilitate the training, the first pre-training part. And when that's all done, you can just transfer to your robot in real life with a bit of domain adaptation,

or maybe none at all, if you have enough simulation data that works correctly. So I think the easiest path is training with simulation, with generative capabilities for objects, diffusion, randomization, things like this. But

that doesn't mean the second option — being able to transfer everything to another robot — won't ever happen. I think it could work, but the path is more complex than the first one. So in the short term, maybe the first one will pop off, with really good things,

with really good movement. It's how all the current robotics companies are working: they do imitation learning on human movement, and then use simulation to randomize the robot's mass, frictions, physics. And right now we are seeing really great demonstrations of high-quality movement and high-quality manipulation from robots — I think about Figure AI,

Loan Bernat | Hospital 4.0 (01:08:29.792)
DeepMind, Google. So I think the easiest path is the first one, but it doesn't mean the second one won't ever work.

Zakariea Sharfeddine | LeTeam (01:08:39.408)
Can I maybe add on that — or rather, it's a question, I don't know how you guys see it. Couldn't it also be possible that we completely skip the simulation part in the future? You could just give a lot of people an Apple Vision Pro, create a training environment in the real world, and let people teleoperate using the Vision Pro for, I don't know, 10 euros an hour.

They could be collecting a lot of data, and then you would skip the Sim2Real problem entirely.

Brian Walker (01:09:14.414)
I think something similar happened in the early development of generative models, where cheaper labor was hired around the world to label millions and millions of pictures every day. So it is costly.

Loan Bernat | Hospital 4.0 (01:09:28.427)
Yeah.

Loan Bernat | Hospital 4.0 (01:09:34.721)
But it's really costly, and really time-consuming. If you have a different robot and you need to collect the data to control it, you'll need like 500 hours of teleoperation data — that can take three months of teleoperating all day long. So what simulation brings is that we don't need those long teleoperation sessions. But you make a good point, like —
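Loan's "three months" estimate checks out with quick arithmetic, combining his 500-hour figure with Zakariea's hypothetical 10 euros/hour rate (the 8-hours-a-day, 22-days-a-month schedule is an assumption added here):

```python
# Sanity check on the teleoperation-cost estimate from the discussion.
# 500 hours and 10 EUR/hour come from the conversation; the working
# schedule is an assumption for the back-of-envelope math.
HOURS_NEEDED = 500
HOURS_PER_DAY = 8
DAYS_PER_MONTH = 22
RATE_EUR_PER_HOUR = 10

months = HOURS_NEEDED / (HOURS_PER_DAY * DAYS_PER_MONTH)   # ~2.8 months
cost_eur = HOURS_NEEDED * RATE_EUR_PER_HOUR                # 5000 EUR
```

So one new embodiment would cost on the order of 5,000 euros and nearly three months of full-time teleoperation per 500-hour dataset — the economics behind the case for simulation.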

Pascalis Trentsios (01:10:02.635)
I think it's a multimodal approach — it will need a little bit of everything. You can have a few hours of simulation simulating years, you know. Maybe the quality is not as good as reality, but then again, it can give you a bit of a boost, and then you do some fine-tuning in reality.

Loan Bernat | Hospital 4.0 (01:10:05.14)
Yeah.

Loan Bernat | Hospital 4.0 (01:10:27.842)
There is also a lot of work on making these models — these learning algorithms — more efficient, so they can learn from a really small number of demonstrations. That's also a path research is taking; I saw a paper from Korea in 2025 on that. And it joins what Zakariea said: maybe at some point we'll be capable, with

very few demonstration to be able to learn something without needing a huge amount of randomization, things like this, but we are not at this point still, yet.

Brian Walker (01:11:11.182)
Lucas, you have thoughts on this that you wanted to share?

Lukas Knak | Packing Panthers (01:11:15.373)
Yeah, so I saw a really interesting video from NVIDIA recently. I don't know the name of the presenter, but he said that we basically need three paths: simulation, teleoperation, and then also just using GenAI to basically dream how the robot could perform some tasks. So I think all of the three.

Brian Walker (01:11:43.022)
Have you guys had any experience with NVIDIA Dreams, the process of... You've just recently seen it, okay. I recommend you guys go look into that, because that is essentially showing synthetic data from the POV of the robot, randomized. It feels like dreams for the robot, essentially,

collecting trajectories and data, but it's all synthetic. So we're kind of pushing that at Revel as well. Because I guess in robotics, we are all kind of on the path here to get to the GPT-3 moment, where we can say, now it kind of works. And then GPT-4,

when it was like, you can actually start building some useful things. And then GPT-5, arguably, it's not the best example. But you guys get the point I'm trying to make, I think: now we're really actually making applications and whole systems and businesses built around it. So I just recently saw the model from Google DeepMind, Gemini Robotics 1.5, which I think is

arguably sort of that similar stepping stone, in terms of maybe a GPT-1.5 or GPT-2. And I think we're going to see a similar trajectory in robotics, or physical AI. Have you guys seen Figure making and announcing the partnership with

Brookfield? It's basically a portfolio of about 100,000 residences, where they're doing something like what Zakaria basically talked about. They're giving people, I'm not sure about VR headsets, but I think they're giving them GoPros, to just record videos from their POV going around their houses, cleaning up, tidying up, tidying rooms. Figure is calling this Project Go-Big. And they are basically on a mission to collect

Brian Walker (01:14:11.03)
as much data as possible, to create the biggest pre-training set of data to show the humanoid, like, here are all the possible messes: toys being randomly thrown, and plates, and dishwasher loading and unloading. So that was an interesting approach, and one that, with their model Helix, hugely validated something we're working on in the background. And so, Loan,

where I was going with that is: imagine we take those videos that were recorded by humans, but what if we could just create those completely synthetically, you know? Instead of having 100,000 homes, we can have 100 million different homes, synthetically generated, and we can then take that data to fine-tune and train the model. That, in a nutshell, is how we're looking at the data bottleneck at Revel:

We have enough information about all of the assets that you interact with in the world, from things you buy in grocery stores to things you buy in hardware stores, anywhere basically. There's enough public data, in terms of pictures, videos, training videos, training manuals, the data itself, that we can use a lot of different LLMs that are quite powerful, text or generative, to then create a video where that generative world, that generated physics,

appears like it's obeying the laws of physics. It's generative, it's just pixels, but it looks like it got the center of mass of that object quite right. I don't know if you've seen Genie 3 from Google DeepMind and some of the simulated physics there, the water splashing and stuff. I mean, to me, that looked like the wake a jet ski would produce on a flat,

lake-like, solid body of water. And the demand on compute for something like that, versus calculating all the droplets and all the dispersions of the water in a physics engine, that's also where the cost comes in, the demand for compute. It's like a 100x difference, if not more, for something as complicated as a particle simulation, like water.

Brian Walker (01:16:33.09)
My background, gentlemen, is, I don't know if you know, but if you looked up some of my stuff on my profiles, I come from Hollywood, basically, or the film industry. So VFX. I would not want to be in the VFX business as of today, because they're definitely going to get heavily disrupted by generative models and AI. And they're going to have to adapt and...

and onboard all these tools into their pipelines. But essentially, think of the compute to simulate a VFX shot where there's some complex particle scenario happening. And that's not even real physics; it's just made to look good in a lot of these engines.

Pascalis Trentsios (01:17:19.589)
And that's, that's the point: how well can you then translate it into reality? How usable is it for actually training a robot that needs to do stuff in reality? I think that's the point that Loan also made previously.

Brian Walker (01:17:37.184)
I guess the answer to that is: we will see. But the idea is, with the generative process, you could essentially create so many randomized variations that you're kind of creating almost every instance of possibility, then training on that sheer volume of tokens, and then fine-tuning where you distill good from bad, kind of having a loop: you create a model,

Pascalis Trentsios (01:17:51.651)
Hmm.

Brian Walker (01:18:05.614)
and then you run it through some agentic system that looks at it and labels it, and says, well, that's not good, that's good, that's not good. So yeah, I really enjoyed the conversation, gentlemen. And again, I wanted to say thank you so much for participating in the hackathon. I hope you guys are going to join the next one. And I hear you, Pascalis, we'll give it more than two weeks. I think we're going to...

Pascalis Trentsios (01:18:31.109)
Okay, yeah, that's great.

Brian Walker (01:18:35.278)
We're going to give it a little bit more than two weeks for you guys to compete. So stay tuned for the details from Revel and on LinkedIn. But Lukas, Pascalis, Loan, Jan, who's not here, and Zakaria, thank you guys so much for participating. Congratulations again. Really enjoyed meeting you, really enjoyed talking to you guys as well. Any closing

statements on your projects, or maybe a little shout-out to what you're working on, where you're at, what you're doing, what's next for you? And then we can close it out.

Pascalis Trentsios (01:19:15.791)
Sure, definitely a shout-out to Muammar, to LycheeAI. It was actually the reason I joined everything. So yeah, huge motivation. Thanks, guys, for organizing it. That's pretty much my closing words.

Lukas Knak | Packing Panthers (01:19:32.577)
Yeah.

Brian Walker (01:19:32.898)
I hope to get to see you do something in the next one with your pup. I think that would be really cool and fun.

Pascalis Trentsios (01:19:41.061)
Yeah, we'll see. He's taking that now.

Loan Bernat | Hospital 4.0 (01:19:47.799)
Yeah, I think for the next hackathon I will try to participate, but it's in November, and maybe I'll have a very busy calendar. So I will try to build a team with someone else and compete like that, because I hope the hackathon will be available for teams.

Brian Walker (01:19:47.886)
All right, Lucas, Lone, Zacharia, all good?

Lukas Knak | Packing Panthers (01:19:50.679)
Yeah.

Loan Bernat | Hospital 4.0 (01:20:12.054)
But yeah, definitely trying to do something else, something maybe more complicated with learning, and trying to train my own models to do complete Zygatons. And yeah, that's pretty much it.

Brian Walker (01:20:28.984)
Sounds great. Lucas, what's next for you?

Lukas Knak | Packing Panthers (01:20:32.045)
Nothing in mind right now. I could just repeat what Pascalis said. So thanks to you for organizing it, and Muammar, of course.

Brian Walker (01:20:45.472)
Yeah, you're super welcome. Zacharia, when are you getting off of vacation? Back to... Okay.

Zakariea Sharfeddine | LeTeam (01:20:47.664)
I can also only add to that: thanks for organizing it. I had a lot of fun. It was a great learning experience. I would love to participate in the next hackathon as well. Maybe I'm going to buy an Apple Vision Pro so I can also teleoperate the humanoid. But I would be missing some credits, like NVIDIA credits for the BREF. It's going to be really expensive.

Brian Walker (01:21:15.854)
Yeah, I think you won some, and they're part of the prize pool. So I think we'll be sharing all that with you guys shortly. And the Apple Vision Pro, the VR headset, I think it's a great idea. I think we should throw a couple of these into the prize pool for the next one, because you guys all said you were pretty interested in doing some teleop or some GR00T Mimic. So we'd love to see that.

Zakariea Sharfeddine | LeTeam (01:21:26.916)
Let's go.

Brian Walker (01:21:44.142)
Yeah, I think those are definitely things we'd like to throw in, some SO-101 arms or maybe even some other robots. We want to 10x the prize pool, so we've even been talking about how cool it would be to maybe put the Unitree G1 into the prize pool. Who would like to have a humanoid at home? So gentlemen, yeah, there you go. Well...

Lukas Knak | Packing Panthers (01:22:02.167)
Wow.

Pascalis Trentsios (01:22:08.527)
Right here.

You can count me in.

Brian Walker (01:22:12.91)
You're going to have to outperform the claw machine. So thank you guys so much. Really appreciate your time jumping in here and making this work, even through the technical difficulties. And Lukas singling out Kamil, just not wanting to hear him talk. Gentlemen, thank you so much. It's been a pleasure. Hopefully we'll catch you guys on the next one. Again, thank you guys so much. Thank you.

Pascalis Trentsios (01:22:35.877)
Thank you, Brian.

Lukas Knak | Packing Panthers (01:22:36.109)
Thank you. Likewise.

Loan Bernat | Hospital 4.0 (01:22:37.46)
Yeah. Thank you, Brian. Thanks for the invitation. See you. Bye.

Pascalis Trentsios (01:22:38.809)
Thank you everyone, see you, bye.

Zakariea Sharfeddine | LeTeam (01:22:39.717)
Thanks a lot.

Lukas Knak | Packing Panthers (01:22:42.711)
Bye.

Brian Walker (01:22:43.02)
Bye guys.

Lukas Knak | Packing Panthers (01:22:48.525)
By the way, are we still on recording?

Brian Walker (01:22:53.91)
Yes, we are still recording, and you need to leave the window open so it uploads all of your video and audio.

Lukas Knak | Packing Panthers (01:22:58.261)
Yeah, that's the question I wanted to ask you, because I now have two windows: one from when I disconnected, and the new one from just now.

Brian Walker (01:23:08.118)
Yeah, just leave both of them running, and when it says it's good, you know, I think you should be good.

Lukas Knak | Packing Panthers (01:23:11.669)
Okay, then let's just hope it works out. Alright. Bye bye. Thank you. Bye.

Brian Walker (01:23:16.522)
Lucas, thank you so much. Bye bye.

Brian Walker (01:23:23.448)
Thank you, thank you, thank you. All right. Thank you, everybody, for listening and joining in. Everybody hopped off; Kamil was long gone. So yeah, really enjoyed the episode, really liked talking to the contestants. They really delivered some amazing projects. So stay tuned for the next one, and thank you for listening. Have a great day.