IIIMPACT is a Product UX Design and Development Strategy Consulting Agency.
We emphasize strategic planning, intuitive UX design, and better collaboration across business, design, and development. By integrating best practices with our clients, we not only speed up market entry but also enhance the overall quality of software products. We help our clients launch better products, faster.
We explore topics across product, strategy, design, and development. Hear stories and learnings from how our experienced team has helped launch hundreds of software products in almost every industry vertical.
The future of robotics is human. The AI wave doesn't replace UX. It actually makes it more critical.
Speaker 2:Simplicity isn't about fewer features. It's finding that frictionless access to the right functionality at the right moment.
Speaker 1:All right, everybody. Welcome back to another episode of Make an Impact Podcast. Podcast? Podcast.
Speaker 2:We can do both. We're flexible.
Speaker 1:Yeah. I'm your host, Makoto Curran, founder of Impact. We've been in business for over twenty years and we've launched hundreds of digital products. We provide the cheat code to successful software product launches, turning chaotic B2B enterprise and B2C retail projects into predictable wins through proven strategic workshops and de-risked roadmaps.
Speaker 1:And I've got my cohost here with me.
Speaker 2:We love doing it as well, I'll add that.
Speaker 1:So today, it's an awesome topic, something I went to university for: robotics. It's something I've been wanting to talk about, and we wanna jump right into how robotics, UX, and AI are revolutionizing human-machine interaction.
Speaker 2:And there's a lot in this one, so stay tuned to the end. We're touching on everything. Just a bit of background on this topic: Makoto, we've talked about AI and the sort of AI explosion, but not really robotics. And it kind of seems like the most obvious evolution of AI as it expands into forms outside of our computers, or at least our visual interfaces.
Speaker 2:There's so much viral footage going around, whether it's the Boston Dynamics Atlas robot or, recently, that cool Jake the Rizbot walking around Texas with his cowboy hat, which I think was from Unitree Robotics. But also, from years of different projects, robotics is close to our hearts, as we know firsthand how UX is such a critical component of successful robotics ventures. A few years ago, we worked with one of the largest robotics companies in the world, Yaskawa. I know you're going to give all the details, Makoto, so I'll just give a little intro.
Speaker 2:They're renowned for their high-performance industrial robot line. Most people have seen those robotic arms; you can program them for things like welding, assembly, palletizing, all sorts of different applications, even biomedical automation. But the complexity of their legacy teach pendants posed a significant usability barrier. They came to us wanting to democratize industrial automation, which up until that point had really been reserved for expert operators.
Speaker 2:With a company like Yaskawa, they clearly understood the value of UX: they needed to make the product more accessible and get their robots into the hands of anyone. That relied on cutting down the cost, which meant dropping an expensive teach pendant for a generic tablet in a custom housing, and obviously improving the user experience. So we're excited to take you back in time a couple of years to what that looked like. But first, maybe we look at why UX in robotics matters more in 2025 than ever before.
Speaker 1:You know, as NVIDIA CEO Jensen, as I like to call him when I text him regularly, put it earlier this year when they released their Cosmos and Isaac GR00T tools: it's basically the ChatGPT moment for robotics. When I was involved in robotics, a lot of it was either academic or industrial experimentation trying to reach mainstream adoption. There was a lot of semi-automation and lean manufacturing happening, as well as offshoring manual labor to China, because robotic automation was expensive. Now the cost is coming down and things are becoming far more efficient. But UX has always been central to it, because for the people who actually use these systems every day, setting things up, it's so important to make that as easy as possible.
Speaker 2:And I think we're seeing that with their tools. From my understanding, Cosmos and Isaac GR00T are spatial awareness tools that bring in AI as well, and that's the sort of tech that's going to take robotics to the next level, which is so exciting, and just make it even more mainstream.
Speaker 1:Yeah. And I think when we were approached, and we'll get into this a little more, companies generally reach this tipping point: they're super technical, great at making features efficient, at making their robots accurate. At certain points it does help to be more accurate and have more features, but only up to the point where it becomes detrimental to user experience and setup takes a lot longer. Usability is now the moat. That's the battleground everybody's on.
Speaker 1:Yeah. And most factories, something like 90%, haven't adopted automation yet. Not because of robot limitations, but because the interfaces are too complex. That's what we saw with Yaskawa. They were losing market share to competitors that were far less advanced but so much easier to set up.
Speaker 2:I mean, that's it. User experience is really that bridge between amazing engineering and real-world impact. And user experience was the key to the Yaskawa redesign. So maybe you wanna go into a bit more detail about that, Makoto?
Speaker 1:Yeah, definitely. So Yaskawa, let me back up and give you the story. They are one of the largest robotics companies in the world. A hundred years of innovation in motion control and robotics. They manufacture everything from robotic arms to specialized controllers for manufacturing environments.
Speaker 1:You've got these gigantic robots that you have to be afraid of. I used to work for Motorola, and we programmed these big cells that would program mobile phones, but you had to have safety curtains around them so nobody would get hit.
Speaker 2:What was the interface like on that one as well? I mean, wow.
Speaker 1:Oh, yeah. You would have to learn the different languages. I can't recall which language, but we used to program those, and the logic was interesting. It was a good visual-interface type of system, the kind that's probably needed today. And what I've seen, especially now, is the collaborative robots. Those are the robots actually being used more, because a collaborative robot can work alongside a human: if it comes even close to touching somebody, it immediately stops.
Speaker 1:With those safety features, you don't need the big safety curtains or have to worry about getting injured, and you see them more in modern videos where they're being used. This case has become kind of our blueprint for how we approach industrial UX challenges, whether in manufacturing, engineering, or anywhere powerful technology meets a complex workflow. And this is where their teach pendant comes in, the device that controls and basically programs the robot. I call it the BlackBerry of teach pendants: a massive thing with tons of buttons and a tiny LCD screen that you had to maneuver to program these highly complex robots in a manufacturing environment.
Speaker 1:And they wanted to convert all of that into a full touchscreen, kind of like using it on an iPad. That was the challenge. They had actually hired a different UX firm before us and didn't get what they wanted. When they approached me and our company, it was a serendipitous moment: it had been several years, probably over a decade, since I'd actually worked in the robotics field, and I was super excited to work with them. I thought, I'm going to do this project right. So that's how it got started.
Speaker 2:Yeah. I mean, it was a really great project. Just looking at the initial pendant: it was powerful, but wow, intimidating. As you said, like the BlackBerry: all these physical buttons, dense menus, a steep learning curve. So the fact that we were brought in and got to reimagine this interface with a UX-first mindset was great.
Speaker 2:The goals were really clear. We had to reimagine it as a purely touch-based interface, cut down setup times by simplifying the interface, and reduce the overall learning curve and friction wherever possible. I think we've discussed it before: this really was a UX dream project, as we got to flex our HMI design skills. Looking at some of the practical things we addressed: for it to be easy to pick up and use, we literally had to start from the ground up and focus on the tenets of simplicity and consistency.
Speaker 2:So everything needed to be as simple as possible, and the design had to be consistent: if you knew how to operate one screen, it would be pretty simple to move to the next one. And we got to leverage some of the OG UX principles, things like Fitts' Law, to improve the speed and accuracy of interactions. There was a big push for cost reduction as well, to make it more affordable, so certain models of these robotic arms shipped with just a default tablet in a custom housing.
Speaker 2:So we needed to look at how people hold that housing. What are the target areas? Which fingers are they going to use? You have to throw the usual assumptions out the window: it's not a mobile device where you know your thumb is going to do the work, or a typical tablet. It had different grips on it, and one or two physical buttons as well, I think a safety button, that sort of thing, in the custom housing around the tablet.
Speaker 2:Things like that were really interesting: deciding where to position things on the screen to allow for ease of use and prevent strain from repetitive tasks.
Speaker 1:Yeah. Another point before you jump to something else: progressive disclosure. The thing is, most engineers want to show everything, whether it's menus, buttons, features, all at once, to show off: hey, it can do all this stuff. But that's a big problem. We're always trying to pull things back, because you don't need to see everything at once. You need to see it in a contextual state.
Speaker 2:Absolutely, and that's a great point. That was about evaluating the cognitive load as well, because they also wanted to put this in the hands of non-technical users to reach a much broader market. A lot of those users are never going to use the most advanced functionality. They may want to install one of these robots and get it to do a simple task, like welding a certain joint or moving something from one production line to another. So as you're saying, progressive disclosure was a really vital part: making sure you only show things when users are actually interested in that function, rather than overloading the interface and making it difficult to understand where to go next.
Speaker 2:I think the other key one was visual hierarchy, which was vital for making it easy to understand: optimizing color contrast and layout for what could be diverse factory lighting conditions, which were another consideration. You could be using this in a number of different settings. So how does the tablet work with that? How are the colors responding?
Speaker 2:You don't want incredibly bright factory lights reflecting off an overly light interface so you can't see it clearly. All things like that were important considerations in the design, along with ensuring a clean visual hierarchy. Another thing: there was so much technical jargon. It was fine for a technical user, but it would leave someone without that background very confused. So we simplified those jargon-heavy labels into simple, action-based prompts and icons.
Speaker 2:And as with all good UX, we also designed heavily around feedback and system status: always letting the user know the status of the robot, front and center. And then, of course, error prevention and recovery, allowing non-expert users to recover easily and continue with their setup so they're not stuck on one particular problem. And the last one: safety was obviously a high priority. Even though these robots were soft-touch, so if they bumped into you they would stop, it was still about keeping safety front and center.
Speaker 2:And I think that's also where the custom housing with that emergency stop button came in. Yeah.
Speaker 1:I also want to talk a little about the process, especially when it comes to UX. I have to give Yaskawa credit: the director who brought us in understood it. He said, we've tried approaching this by making the UX look prettier, but that's not what we need. We need a very deep dive into how users actually interface with it. And credit to their competitors too: they didn't just make their software interface better, they improved the entire journey. The competitor understood everything from the moment the product leaves their factory, right down to the packaging, just like Apple does now.
Speaker 1:You know, when a box arrives, it's very easy to take out. Mhmm. Even the director admitted that just getting the Yaskawa out of the box wasn't user friendly, while the competitor's was super friendly right from unboxing.
Speaker 1:Everything, documentation included, was just how Apple does it. And I think a lot of companies are copying that same method, because the experience from unboxing to actually using it, and everything in between, is so important. That's all part of your user experience. People think it's just the product, but it's everything involved, especially when you start dealing with hardware and not just software.
Speaker 2:That's a good point. I think what we're discussing in the rest of this podcast is exactly that: UX goes beyond the screen. Everyone is used to UX meaning digital, visual user interfaces, and that's a key mind shift that has to happen. As we move into robotics, it's not necessarily visual. It's about everything, like you're saying: unboxing, interacting with the robots.
Speaker 2:As UX designers, we need to be thinking along those lines as well.
Speaker 1:Yeah, and to get into some of the process involved: I actually went into the whole facility and worked directly with engineers. We went through, step by step, what a user would go through to set up a robotic cell to pick and place certain pallets, certain objects, and how they would go about doing that. I was recording so many videos. I'm sure you remember us sitting there analyzing the videos: exactly what buttons they're pressing, what they're doing, and breaking that down into actual user journeys, what's important for them to see, and what they need to do over the course of a day if they're setting these up for a facility. Most companies want to jump right into the wireframes: hey, just make this part of the interface look better using new components or some modern-looking designs, and that's it.
Speaker 1:And that's not where you start. This was an earlier client of ours; we've since developed a more strategic approach where, if you're starting from scratch or redesigning something very legacy, we talk with leadership, get alignment with the business, help them understand empathy and the user's journey, and then create a roadmap of features: what's risky, what's not. With this one, we ran it more like a design sprint, building a hyper-understanding of what the users are doing and when, what that user journey is. From there, we got into how to make things more optimal for them on screen. And again, we were moving from physical buttons to an all-touchscreen approach.
Speaker 1:And that's something worth getting into: when people are used to doing something, especially with enterprise software, there's a workflow they know, and if you disrupt it, they're not going to like it. Put something new in front of a user who's done things the same way for years and their immediate reaction follows a change curve of emotions. How intuitive you make the new design determines how quickly they move through that curve. It's imperative for us to teach the client: look, this is probably what's going to happen. That matters for speeding up learning curves, because that was another thing with this type of redesign: when you slow down the learning curve, that's problematic.
Speaker 1:But there are steps you can take to speed it up, whether that's onboarding the user, making documentation much easier to access, FAQs, things like that, so they feel more comfortable. We knew it took weeks, even a month or two, to train on the old systems, because people had to learn where everything was. If you make it more intuitive, that learning curve diminishes. That's something we were really cognizant of as we designed the new interface.
Speaker 1:And I
Speaker 2:guess takeaways is what kind of simplicity maybe isn't about fewer features, it's finding that frictionless access maybe to the right functionality at the right moment. And that's maybe something that we can take away from that project and also apply to future projects and anyone can apply to their projects as we sort of move into robotics. And I guess now looking at UX as really a strategic differentiator in the robotics industry.
Speaker 1:Yeah, that's a good point to get into. Companies like Universal Robots, actually one of the big competitors, replaced code-based programming with easy graphical user interfaces. They created an entirely new market with these collaborative robots, with basically a six-month ROI. And UX-focused firms in general achieve 20 to 80% improvements in performance metrics versus technically superior competitors.
Speaker 2:I mean, the savings now are massive, and it just shows the right UX can make such a difference. Look at some real-world examples, like ABB's App Studio: they designed this cloud-based, almost drag-and-drop interface for programming ABB industrial robots, and they saw an 80% reduction in setup time just through UX improvements. I mean, that's huge. And that was through things like visual programming.
Speaker 2:Users are able to configure their robots by dragging predefined blocks and elements instead of writing any code, and they also have cloud access to these workflows, which can be built, shared, and edited remotely. That is an amazing saving. Then you've got things like KUKA's AI-powered programming, which uses natural language commands to replace what would have taken weeks of manual configuration. And another good example is Ready Robotics.
Speaker 2:They created this operating system called ForgeOS, and they literally had a one-hour learning curve, enabling pretty much any shop floor operator to program robots. It was done with a no-code task canvas interface, similar, I guess, to the ABB one. And the amazing thing about it is it's one interface: through standardization, they made it work across Yaskawa, ABB, FANUC, and lots of others, solving this whole fragmented-ecosystem problem.
Speaker 2:Pretty impressive what's being done with good UX.
Speaker 1:And integration, not hardware, is basically the biggest barrier to adoption. We're agreed: anybody that has good UX implemented, we know it just slashes those costs. And, not to get too political, with a lot of the tariffs and things happening in The United States, the goal is to move a lot of manufacturing back here. One of the objections is, oh, it's going to take forever to set up these manufacturing facilities. But nowadays it's far faster and easier to do that, especially if you're utilizing robotics and automation that's easy to set up, in an hour versus what would normally take weeks and highly skilled engineers.
Speaker 2:Absolutely. Yeah. It's exciting.
Speaker 2:Just changing the speed of operations incredibly.
Speaker 1:For sure. I'm curious. In video games, cheat codes let you skip months of grinding to unlock special abilities instantly. Have you ever wished for something similar for your software challenges? What if there's a way to instantly access twenty plus years of specialized expertise instead of developing it all internally?
Speaker 1:What if you could solve in weeks what might otherwise take months or years? Would you agree that most organizations face a steep learning curve when implementing new software solutions? At my company, Impact, we serve as that cheat code for companies looking to transform complex software into intuitive experiences that users love and that drive real business results. Would it be valuable to explore how this might work for your specific situation? Visit iiiimpact.io for a free strategy session focused on your unique challenges.
Speaker 1:So I guess let's get into the UX-AI convergence.
Speaker 2:There's a sweet spot right there.
Speaker 1:Yep. I mean, with the rise of AI, we can now imagine a completely new field of UX.
Speaker 2:Exactly. That's what we were alluding to earlier. We're so stuck on visual UIs; all the points we were discussing for Yaskawa were so visually based, and that's pretty much becoming outdated. The new UX frontier is one where your user interface is things like voice or natural language.
Speaker 2:Forget about the inputs and buttons we see on traditional UI layouts. These natural language interfaces are going to let non-technical, non-expert users control robots by saying simple things like "tidy this up" or "weld that in this way," and the robot can then run a whole complex task chain. For any UX designers listening, it's a complete mind shift. We're going to be designing for things like timing, pacing, and turn-taking rather than screens and buttons.
Speaker 2:We're going to be focusing on intent modeling, not just "this visual action triggers this." Designing non-visual feedback loops is going to be a big focus: how does the user feel confident that their commands actually worked? So again, a complete change from this 2D space of flat visuals to 3D, because robots live in a 3D space, not on 2D screens. We're going to be considering things like proximity, gaze, body position, environmental context. Is the robot facing the user?
Speaker 2:Is it too loud to hear things accurately? All these different elements. It's going to become a lot more complex, but potentially a lot more rewarding to design as well. We've started designing for conversational UIs today, but it's really different with robotics, because with a conversational UI you're asking a question and getting a response. With robots, it needs a different approach: at some point they're going to need to take the lead, or they're going to pause.
Speaker 2:So there's a whole different level of interaction there. It's taking it from these passive interfaces to designing what is essentially a trust-building agent. Our UX, how we design the robot, has to build trust, set boundaries, and ensure transparency about how it's working and thinking. Processing, I should say, rather than thinking. So it's really a move from aesthetics to ethics, with more emphasis on things like cultural norms.
Speaker 2:Personal space, for instance, comes down to cultural norms as well. Where are you operating? What is an acceptable personal space there? And what about data sensitivity now?
Speaker 2:What should it be listening to? What shouldn't it be listening to? What should it be recording? What shouldn't it be? And also consent to do that.
Speaker 2:So it's a whole different paradigm, I think, of how we're going to design.
Speaker 1:Yeah, and I think one of the most important points is that when you create a product, what you're trying to do is solve a user's pain point. You can have all the features in the world, but the biggest thing is solving a pain point. And the pain point in robotics is the setup time. You want to be set up and keep going; for a manufacturer, it's about how much you can output. If you can output more, great.
Speaker 1:So if it takes a long time to set up, that's cost, time, and expense. With AI, they've reduced setup and deployment time. There's one Berkeley-based AI startup called Jacobi Robotics that has achieved, I think, a 95% reduction in setup time compared to traditional methods.
Speaker 2:Yeah, I heard about this. Yeah, it's pretty impressive, wow.
Speaker 1:Yeah, and they have the Jacobi palletizer. For robots in these big palletizing cells, instead of taking weeks or months, integration now completes within a matter of hours, or a single day. That's amazing progress.
Speaker 2:That is. Wow. And there are similar things, like Microsoft's and DeepMind's frameworks. I think they've used tools to turn plain English into executable robot code. So you've got these frameworks demonstrating, again, that non-experts can write high-level robot programs using plain language, supported by code generation tools and conversational feedback loops.
Speaker 2:So it's also reducing the time and, I guess, the barriers to entry. It's no longer "we need a robotics specialist." Anyone can now, coming back to democratizing industrial robotics, just say "I want the robot to do this." All that complicated task mapping, which we did with Yaskawa as well, all that coordinate-based plotting, is now completed just by describing what you want.
Speaker 1:Yep. I remember in my master's classes, I had to learn how to do all that. And now you don't need a master's degree to figure these things out. Maybe if you're building one from scratch, but for the general public trying to program and integrate them, you don't. So what are the insights here? UX designers basically need to move away from purely visual UIs.
Speaker 1:And they now need to understand more about prompt engineering, task training, and voice feedback loops.
Speaker 2:Yeah, and I've heard predictions that something like 80% of robot programming will be AI-assisted or no-code by 2027. Just think how that's going to affect the market. It's huge, such a large part. And you've got these companies we've been talking about leading the charge there.
Speaker 2:But for a majority of it to almost be no code, it's quite an interesting shift.
Speaker 1:Check out our podcast on the AI job apocalypse; we talk a little bit about that there. But it's a good point to jump into our next segment: designing for human readiness, not just technical capability. The humanoid robot market is projected to be around $36 billion by 2035. Adoption definitely lags a little due to trust, usability, and cost, not really the hardware.
Speaker 2:I think that's the interesting point: the hardware is so advanced, but it's let down by trust, usability, and cost. Look at some examples, like the Tesla Optimus. You'd expect it to be cooking you dinner and cleaning your house by now, but it's stalled by control and overheating issues. Then you look at a company like 1X and their NEO, which, I don't know if you've seen it, Makoto, but wow, it's actually really interesting. Maybe we can show some visuals of it.
Speaker 2:Really interestingly designed robots, like a fabric body, and all the marketing's quite zen. But it excels in home use, and it's almost using bio-inspired motion and
Speaker 1:Oh, I thought that was just somebody in a suit.
Speaker 2:Maybe that's what they use for their marketing, but apparently the people who have tested it say it's been pretty good. So they've used bio-inspired motion and voice-based interaction. It's obviously not as flashy, although I kind of prefer it to the Optimus, and it seems more likely to succeed because it prioritizes usability, comfort, and trust, the UX essentials that are often overlooked. What I found interesting looking into this is, again, some lessons for designers around something called the uncanny valley, which we have to design for. It was a term coined by a Japanese roboticist called Masahiro Mori in 1970.
Speaker 2:His observation was that as robots become more human-like, people react more positively, but only up to a certain point. When realism gets too close but is still imperfect, reactions drop sharply into what he called the uncanny valley. It's almost like our brain says something is off, whether it's the skin texture, the way an eye moves, or subtle facial expressions; it just feels unnatural or eerie. And this is another challenge that we as designers have to solve for through motion, voice, and behavior.
Speaker 2:And it's interesting, you see it with some companies. I think back to, was it Yamaha, or what was the company that did the little friendly robot you saw a lot maybe four or five years ago? Toyota, maybe, I don't know. It was very cartoony and talked in a robotic way, and it's interesting how much you avoid by making it less humanoid; there's almost more risk the closer you get to human. So an interesting point is solving for that uncanny valley through your motion, voice, and behavior design. And then, I think, also building for fail-safes, which maybe, Makoto, you want to talk a bit about.
Speaker 2:Because just like we built fail-safes into the Yaskawa app, it's such a different field now. It's not just, alright, you're in operations, you need to hit the emergency stop, or you need your standardized error messages and your whole messaging system. It's now in physical reality. I think that's an interesting point to chat more about.
Speaker 1:Yeah. So fail-safes: why are they important? Beyond all the scary movies where the robots take over, fall in love with you, whatever the case is, and kill your family, we have to remember that robots can misinterpret commands. They can lose power.
Speaker 1:They can encounter different obstacles. So users must feel confident that mistakes won't lead to damage, danger, or misinterpreted emotions. Yeah, exactly.
Speaker 2:Like, you wouldn't wanna say to your robot, "Damn, I could kill for that," and it's like, "Sure. No problem. I'm going to go do that."
Speaker 1:I really hate that neighbor. I wish they were...
Speaker 2:Exactly. Yeah. If another pigeon lands on my porch, I'm gonna take it out. Yep.
Speaker 2:Yep. Gotta have those fail-safes.
Speaker 1:I mean, you know how many times I've thought about security robots roaming around? My dog does a great job, but if you have a little robotic dog running around and somebody acts up, now it's the robotic dog the Amazon delivery driver has to worry about instead of the real one. So, some design tips that obviously make sense. There are emergency stops, like the ones built into manufacturing robots.
Speaker 1:So something like that as a voice command that does an abrupt stop. Obviously proximity sensors too, back to that safety curtain idea. And you've got to have redundant safety layers: if one mechanism doesn't work or doesn't activate, another has to catch it. If something does fail, the design should degrade gracefully, falling back to a stop-and-wait instruction when it's not sure. And then some kind of error recovery that clearly states what users should do, so they're not guessing or letting it do things it's not supposed to.
Speaker 2:I could just imagine your humanoid robot staring off into the corner and you're like, "Are you still there?" It's having a major hardware failure. "Are you still going to do this task?" And it's just staring at you with its blank face. So yeah, you definitely need something there to say, "No, I'm experiencing an issue right now."
Speaker 2:"I'll be back in a second." So I guess the insight here is that robots just need to work. We've been focused on the highly functional side of them, but as they come out of industrial settings and into our homes, into our personal space, they need to feel safe. They need to feel useful.
Speaker 2:And most importantly, I think, intuitive. You need to understand what they're about to do or what they're busy with. It's going to be fascinating designing for these things.
Speaker 1:For sure. And to introduce the next segment: voice and gesture interfaces. I think that's going to be the new normal in robotics. Looking at the voice UI market, the data we saw says it hit around $16.17 billion in 2023, and basically a 20% CAGR is expected as well.
Speaker 2:Wow. And thinking about industrial settings now, I think they're achieving something like 95% voice recognition accuracy. And some of the technologies are really interesting. I think
Speaker 1:Unless it's Siri.
Speaker 2:Yeah. They didn't use any of these things for Siri. So please, anyone at Apple, if you're listening, please use some of this technology, because I'm so tired of it. Some of the interesting things they're using to get that 95% voice recognition in environments with machinery going: one is something called beamforming, which is a method of focusing microphone sensitivity in a specific direction. It uses a microphone array to focus in on sound coming from a particular direction.
Speaker 1:It's almost like you do with your ears, you point them at the sound.
Speaker 2:Exactly. You've got two ears; now imagine a bigger array that can pinpoint that sound even more precisely. And the other thing is adaptive filtering, a dynamic process where the system almost learns the background noise and subtracts it from the input signal. It measures it, filters it out, and the audio gets a lot clearer, which is pretty smart.
Speaker 1:Yeah, I think the next thing is touchless interaction, especially in messier, more dangerous environments: gesture control. That's predicted to be a five-to-six-billion-dollar market by 2033. That's important. And you play with a lot of this in VR and all that, so I'm sure it's going to continue to evolve.
Speaker 1:Yeah. And from iRobot to Bear Robotics in hospitals, they've all seen around a 20 to 25% productivity boost using natural interfaces.
Speaker 2:That's pretty decent. And from a design perspective, these UIs are there to reduce the barrier to entry, but they're always going to need thoughtful contextual design to avoid things like misinterpretation and overload. We've been designing 2D interfaces for over twenty years, and my mind always goes back there: I hear UI and I think of a screen. But it's not limited to screens.
Speaker 2:It's the entire system, like we talked about earlier, by which a user communicates with a machine and receives feedback. Even, like you mentioned, Makoto, before and after: unboxing that robot, if that's how robots are going to be delivered, or welcoming it at the front door as it takes a robot taxi to your house to introduce itself. It's interesting.
Speaker 1:Yes. So in the context of voice and gesture interfaces, the UI includes voice prompts and confirmations. Basically, the robot says "task started" or asks, "do you wanna continue?" Then we've got auditory feedback: tones or beeps, and spoken words are always helpful. And visual cues: lights, icons, augmented reality overlays, status LEDs. It all helps the human interaction.
Speaker 1:And robotic body language, you know: a head turn, an arm movement, a "what are you talking about" gesture, some type of light ring animation. And physical gestures or haptics: a robot nodding, maybe vibrating slightly. Sounds kind of dirty.
Speaker 2:We'll keep it above board here. Yeah. I think the next thing is looking at the opportunity for UX, that it's really a kind of million, oh sorry, trillion dollar UX opportunity.
Speaker 1:A trillion dollars.
Speaker 2:A trillion. I mean, that's a big number; even quantifying it is difficult. But the interesting points are that today only 12% of organizations deploy robots at any real scale, and 61% cite a lack of automation experience as their primary barrier.
Speaker 2:It's not cost, not technology: it's a lack of automation experience. So what happens when we make that automation experience simpler, reduce the time, make it accessible to everyone? And small-to-medium enterprises face an even greater UX gap. Those are the companies that could benefit the most from robotics, but they have fewer technical staff, higher return-on-investment expectations, and they're reluctant to adopt what are often seen as black-box systems. The companies that fix this get a piece of that trillion-dollar pie, and they're going to take market share from the slower, more tech-heavy players.
Speaker 1:For sure. And the takeaway for firms like us: whenever new technology arrives or advances, you always see this push for the feature, the push to integrate the tech, and companies become technology-centered instead of user-centered. Our job is not to just let that happen, to go in and say, sure, we'll design whatever you need, and that's it. We have to change the mindset and champion a user-centric approach with organizations, because you're not just designing some screens. You're unlocking automation for most of the market, 90% of the market.
Speaker 2:Yeah, that's true.
Speaker 1:And the UI and UX portion, that strategy piece, is now as essential as the mechanical and electrical engineering involved in the actual robotics. That mind shift is always going to be a tug of war with organizations that are really tech-heavy but don't stop to think: no matter what the tech is, whoever interfaces with it is your most important consideration. What problems are you trying to solve, and are you solving them as frictionlessly as possible?
Speaker 2:Those are good points.
Speaker 1:This is a good place to close. What do we wanna say? The future of robotics is human. The AI wave doesn't replace UX; it actually makes it more critical.
Speaker 1:And robots now can basically learn by demonstration. You see that with Tesla training their cars to self-drive based on the millions of hours of driving data they receive every day, and I think they're doing the same thing with the Optimus robot. So it's much faster to train, with a lot less programming involved. Understanding speech and gesture is a big part of that, and then they're optimizing their own paths using generative AI.
Speaker 2:Yeah, and that's all good, but without great UX, none of it reaches users. Your next robot isn't really waiting for better code. What it's waiting for is better design from you.
Speaker 1:Yeah. In the age of AI, UX isn't an add-on. It's the on-ramp.
Speaker 2:Absolutely.
Speaker 1:I think with that, it's been a pleasure talking about this with everybody in the audience. This is a wrap on an episode of Make an Impact.
Speaker 2:Yeah, it's been fascinating.
Speaker 1:We've been in business for twenty-plus years, seeing and doing this across a lot of digital products. If your company or organization is facing similar challenges, feel free to reach out to us. We'd love to help you create your own transformational story. Until next time, take care. See you then.
Speaker 1:Bye. Have you ever played a video game and discovered a cheat code that instantly unlocks abilities that would have taken months to develop? I'm curious. What would it mean for your business if you could access a similar cheat code for your software challenges? What if you could bypass months of trial and error and immediately tap into proven expertise?
Speaker 1:You know, I've noticed that many organizations spend years developing specialized software expertise internally, often through costly mistakes and setbacks. Would you agree that's a common challenge in your industry as well? At my company, Impact, we function as that cheat code for companies looking to transform complex software into intuitive experiences. Our clients gain immediate access to twenty-plus years of specialized knowledge and the experience of launching hundreds of digital products across many different industries, without having to develop it all internally.
Speaker 1:You might be wondering how this actually translates to business results. Well, companies we work with typically see go-to-market times reduced by up to 50%, their overall NPS scores rocket up, and their product and development teams' efficiency significantly improved. Instead of struggling through costly mistakes, they accelerate directly to solutions that work. This is why organizations from startups to the Fortune 500 partner with us for years. We consistently help them solve problems in weeks that might otherwise take months or years.
Speaker 1:If you're responsible for digital transformation or product development, wouldn't it make sense to at least explore whether this cheat code could work for your specific challenges? From boardroom ideas to code, this is what we do best. Visit our website at iiiimpact.io. You can see the link below to schedule a free strategy session. It's just a conversation about your unique situation, not a sales pitch.
Speaker 1:And you'll walk away with valuable insights regardless of whether we end up working together. Thank you.