Oxide and Friends

Yael Grauer joined Bryan, Adam, Steve Klabnik, and the Oxide Friends to talk about her recent Consumer Reports article on memory safety and memory safe languages. How do we inform the general public? How do we persuade practitioners and companies? Thanks for joining us, Yael!

In addition to Bryan Cantrill and Adam Leventhal, we were joined by special guest Yael Grauer, and Steve Klabnik.

Some of the topics we hit on, in the order that we hit them (experiment in turning the show live-chat into notes):

If we got something wrong or missed something, please file a PR! Our next show will likely be on Monday at 5p Pacific Time on our Discord server; stay tuned to our Mastodon feeds for details, or subscribe to this calendar. We'd love to have you join us, as we always love to hear from new speakers!

Creators & Guests

Host
Adam Leventhal
Host
Bryan Cantrill

What is Oxide and Friends?

Oxide hosts a weekly Discord show where we discuss a wide range of topics: computer history, startups, Oxide hardware bringup, and other topics du jour. These are the recordings in podcast form.
Join us live (usually Mondays at 5pm PT) https://discord.gg/gcQxNHAKCB
Subscribe to our calendar: https://sesh.fyi/api/calendar/v2/iMdFbuFRupMwuTiwvXswNU.ics

Speaker 1:

Alright. Yael, you're here, which is great. And your... Yes. And your audio's working. Excellent.

Speaker 1:

Well, thank you very much for joining us. We are really excited to speak with you. And I gotta tell you, Adam, did you check out this talk that Yael gave?

Speaker 2:

I really enjoyed it. Pretty delightful talk.

Speaker 1:

Almost. I loved it.

Speaker 3:

Found it.

Speaker 1:

Oh my god. I loved it. Yeah. So, I mean, we can obviously talk about mediocrity as well, but for others, the theme of the talk... this is, like, a very quick lightning talk. Right?

Speaker 1:

Yeah. This is like a 5

Speaker 3:

minute talk from, like, a decade ago. I hope it's not horrible. I haven't looked back at it.

Speaker 2:

It holds up just as mediocre. Yes.

Speaker 3:

Oh, boy.

Speaker 1:

No. Yeah. No. I think it's funny. Yeah.

Speaker 1:

It's wonderful.

Speaker 3:

Yeah.

Speaker 1:

No, actually, so, I mean, the whole theme of the talk is, like, by doing things, you can get better at things. And I loved your line. You had a bunch of great lines that I love. One is, "I promise no matter how bad a gardener you are, you will be able to grow food that you can eat." And I thought it was great.

Speaker 1:

Like, the whole message was like, hey, you can actually do things on your own. And I feel it's a message that kids especially need, because they are so used to this expectation that, you know, you shouldn't play soccer unless you're gonna be a soccer player in college. And it's just great to have folks encouraging us.

Speaker 3:

Yeah, there's a lot of stuff I'm really bad at that I enjoy. Like, I'm really bad at jujitsu, but I love it because I can get out of my head, and it kind of works muscles, not literally, but things that I don't normally work on. Like, I'm out of my head and breathing. So,

Speaker 2:

yeah, I loved your line that, like, you might be mediocre at playing guitar, but you'll be fun at parties.

Speaker 1:

That was a great one.

Speaker 3:

You must've had to go to, like, the 3rd page of Google or something.

Speaker 2:

Yeah. We could go for the deep cuts.

Speaker 1:

This is an early hit for you, I gotta tell you. I don't wanna say that this is gonna be your claim to fame, but this is an early hit for you. We're going that deep into the deep cuts, but it's a great talk. And so I'm going to attempt to make a segue from that excellent talk to this terrific paper that you wrote for Consumer Reports. Because in that talk, you are really trying to encourage more people to do things.

Speaker 1:

And in this paper, you are really trying to encourage more people to do a specific class of things, namely implement in memory safe languages. So I love the tone of this paper. But before we get to it, the question that is burning for me, and Adam, I imagine for you as well, and maybe Steve, you know the answer to this, but I'd love to hear it: what is the history of this? How did Consumer Reports conceive of doing this report?

Speaker 3:

On the last podcast, by the way, I don't think anybody has raved about anything I've written that much, so that was really exciting for me. But I think you had guessed that it was ISRG, but actually, I think it was reading Alex Gaynor's blog and, like, the Fish in a Barrel Twitter. Mhmm.

Speaker 3:

And just realizing, like, oh my god, this is a really high percentage of bugs, and that's alarming. And why isn't everybody talking about this? But also, my background is investigative reporting, and, you know, whenever there's a Pegasus or something like it, I'm always panicking. And this time it was a friend of mine who was helping me try to figure out if I had Pegasus on my phone, because I had done some investigative stories and I was a little bit worried about it. I think it was Jonathan Rudenberg.

Speaker 3:

And I was like, is this on my phone? And this was before it was easier. I think, like, Trail of Bits came out with an app, and, like, Amnesty International or something. This was before that had happened, so they really had to talk me through the steps of doing it. And I remember, at the end of our conversation, they said, you know, until we figure out memory safety, we can't really prevent the next Pegasus.

Speaker 3:

And I'm like, wow, that's alarming. And, you know, I manage Security Planner, which is a digital security tool that helps people figure out how to be safe online without necessarily needing a technical background. And I'm like, there's nothing we can tell people to do. There is no setting that they can click on to prevent this.

Speaker 3:

There's nothing they can buy. I find that alarming. And then there's just the intersection that it's had with things that I care about, like, you know, journalism, patient safety, human rights activism, etcetera, and just consumer safety. And then there was, like, the targeting of, like, reforms, and just the way that it intersected with, like, every vulnerability that I had heard of and was upset about.

Speaker 3:

And I just kept bugging people and being like, we should really do something about memory safety.

Speaker 1:

That's great. And I would actually like to understand a little bit, because Pegasus is something that was, you know, in the kind of sea of vulnerabilities, something I'd heard of but wasn't paying close attention to. But it sounds like Pegasus was much closer to home for you. Is that because of your experience as a journalist?

Speaker 1:

Were you worried that you were being explicitly targeted?

Speaker 3:

It was a possibility. I mean, you know, your worst nightmare is, like, you are working on a story and your source or somebody gets arrested or tortured or imprisoned. And NSO Group was specifically targeting activists, like my sources, or journalists. And so, yeah, I was totally panicking.

Speaker 3:

But it was my

Speaker 1:

Well, yeah. And that's interesting because, in your line of work, you've got, like, people's actual physical safety on the line here with the vulnerability. We don't always directly connect vulnerabilities to physical safety. But in this case, it's pretty easy to draw that line.

Speaker 3:

Yeah. I've been pretty lucky. Actually, I was traveling and I got a call or a text from an editor who was like, call me right away. And I was really worried that somebody had been tortured or arrested, but they just, like, wanted me to delete a tweet or something. And I'm like, oh my god.

Speaker 3:

Don't, like, tell me to talk to you immediately without... yeah.

Speaker 1:

Oh god. It's not the worst.

Speaker 3:

For a freelance assignment I was doing.

Speaker 1:

I I know. I did.

Speaker 3:

Don't, like, make me think there's an emergency when, like, I don't consider that an emergency.

Speaker 4:

So Bryan and I at least both share this kind of anxiety. And so we have a really good working relationship, because it's like, never send me a chat that's just like, can we talk? I'm always like, hey, Bryan, don't worry, this is not a terrible thing.

Speaker 4:

Like, you know, but also, you know, I'd like to, like, catch up soon about whatever and, like, vice versa. And so it's Yeah.

Speaker 3:

Yeah. Yeah. Establish that. Or set up a meeting without telling me what it's about first, and I'm like, oh no.

Speaker 4:

Yeah. Yeah.

Speaker 1:

Well, you can always tell, I mean, just bluntly, if people have led a team before, because it's the ones that have led a team that are especially sensitive to, like, hey, don't worry, it's nothing bad. And I do always try to do that, because, yeah, it's just easy to be like, hey, can you chat later?

Speaker 1:

It's like, okay, oh my god, I'm immediately in my worst nightmare. I actually had a manager once who was like, hey, and this is on a Friday afternoon,

Speaker 1:

Are you gonna be in on Monday? And, of course, this is in the era before remote work. So it's like, I'm in every Monday, and I have no... I mean, yes, I'm gonna be in on Monday. Like, is there something wrong?

Speaker 1:

It's like, we'll talk about it on Monday. This is, like, 4 PM on a Friday.

Speaker 3:

On Friday, I scheduled a bunch of emails for Monday, because I wanted to email a bunch of people and I'm like, I probably shouldn't send this to them on a Friday afternoon before the Super Bowl weekend, so they're panicking all weekend and, like, freaking out before Monday. So I'll just schedule it for, like, Monday at 8 AM or whatever.

Speaker 1:

Yeah, good on you. And yes, do always give people full context when you need to chat with them. And then if it's deleting a tweet, like, it doesn't feel like we need to make you sweat over the physical safety of one of your sources because we want you to delete a tweet that someone found to be... Yeah.

Speaker 1:

A little too raw.

Speaker 3:

It was weird. It was a freelance client, and it wasn't a tweet about anything that I was writing about, but I don't know. It was weird. But it happened.

Speaker 1:

So, part of why I think so many folks found this work so interesting and delightful is that you've homed in on something that we as practitioners really know well, which is that the choice of language definitely does affect one's ability to not have vulnerabilities. I mean, there is very much a correlation, and it is an underreported story. The folks that are talking about it tend to not be writing for Consumer Reports, to put it that way. So did you pitch this to Consumer Reports as something that, hey, this is something that we should talk about? How did you get Consumer Reports

Speaker 3:

on doing this? I was, like, reading, you know, Alex Gaynor's blog and, like, Chris Palmer's blog and the Fish in a Barrel Twitter, and just talking to people about this. And I wrote a memo and I made a PowerPoint, and I was like, we should really do something about this. And it wasn't like a traditional... I'm actually not on the content team anymore, so it wasn't like a traditional slick Consumer Reports front-page story or magazine piece or anything like that.

Speaker 3:

It was more on the sort of mobilization and outreach side, and I just kept, like, bringing it up. And I was like, we should really, really do something about this. And for a while, I couldn't really answer the question of who else is talking about this with an answer that anybody had heard of, but then everybody started talking about it and I'm like, this is the time. So I'd been talking about it internally for quite a while before we actually did something about it. And I was like, we should really do a convening.

Speaker 3:

'Cause what if we do something that misses the mark? Or what if we focus on something that doesn't really get at the issue? Or what if we, like, reinvent the wheel? So there's just that danger of it. And I'm like, we should get everybody we know, everybody we can think of, in a room and talk about it and figure out, like, how do we fit in and what should we be doing, that kind of thing.

Speaker 1:

And when you say a convening, is that like a Consumer Reports term? I mean, were you deliberately kind of bringing

Speaker 3:

those parties in? Yeah. We like to participate. And I have an amazing manager who, like, really got on board. And, yeah, we just invited everybody.

Speaker 3:

We're like, can you make this date? And then we did, like, a 2-hour discussion. And so basically the report that I put out is kind of a summary of everything that we talked about. I just kinda organized it a little bit. And because it wasn't, like, capital-J Journalism, I got to get feedback from everybody who was there, and that really helped strengthen it, because normally, yeah.

Speaker 1:

That helps a lot. Yeah. Because, again, it's very good. Technically, it's definitely on point, and you're raising a bunch of really good issues.

Speaker 1:

So, I mean, it's a methodology that worked really well.

Speaker 3:

I thought it was really cool that everybody had the chance to give feedback and, like, point things out. And so in the report I tried to point out, like, these are things people disagreed on, or word it in a way that everybody who, you know, had the chance to give me feedback could kind of live with.

Speaker 1:

Yeah. So I wanna get to some of the meat here. You've got a lot of really interesting suggestions that I definitely wanna pull at, and, Steve, I don't know if you had the same question, but I'm like, what about unsafe Rust? Are we gonna talk about unsafe Rust in this, or are we just gonna kinda pretend that we don't have an unsafe keyword?
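(A minimal sketch of the unsafe keyword in question, as a hypothetical illustration rather than anything from the article or the episode: ordinary Rust is memory safe by default, and operations like dereferencing a raw pointer have to be opted into explicitly.)

```rust
fn main() {
    let v = vec![1, 2, 3];

    // Safe Rust: an out-of-bounds index panics rather than corrupting memory.
    // let _oops = v[10]; // would panic: "index out of bounds"

    // Memory-unsafe operations require an explicit opt-in:
    let p = v.as_ptr();
    let first = unsafe {
        // Dereferencing a raw pointer is only permitted inside `unsafe`;
        // the programmer, not the compiler, vouches for its validity.
        *p
    };
    println!("{first}");
}
```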

Speaker 1:

So I've definitely got some questions around that. But part of what I love about the approach here is that you're really advocating for transparency above all else, is what I feel. I mean, correct me, maybe I'm reading into it, but that's a little bit what I wanna read into it. I love the appeal to public accountability and to transparency.

Speaker 1:

One of the things that was news to me is that when you talk about the Common Vulnerabilities and Exposures database, the CVE database, I did not realize, and I know Rick is here and Rick is just kind of rolling his eyes at me being so naive for not having realized this, I did not realize that that is, like, 100% opt-in, and there is no requirement, regulatory or, like, social contract or otherwise, that companies actually participate in the CVE process. And one thing that is just galling is there's no way for the consumer to know who is participating in CVEs and who isn't. Is that a problem that you've run across? Did you run across it in this work?

Speaker 1:

I mean, as you point out, part of the way we can get going on this problem is by really understanding which CVEs are due to, or would be less likely to occur in, a memory safe language. But of course, that means that we've gotta actually, like, adhere to the CVE process.

Speaker 3:

Really. Like, one of the things that came up was that people said they couldn't distinguish vulnerabilities, like, even if it was classified, they couldn't necessarily distinguish. There wasn't enough detail to know, like, oh, this is a memory vulnerability or this is a logic bug. And I'm like, that's bad. How do we measure this if the metrics we have, we only know them for certain parts of the industry?

Speaker 3:

So how do we even have, like, broad statistics on which vulnerabilities are due to memory unsafety? I was actually kind of surprised that we have as many as we do, 'cause I feel like people wouldn't really be incentivized to share that if they're not planning on addressing it.

Speaker 4:

There's a lot of flaws in this whole system, and that is definitely one of many parts of it.

Speaker 3:

Yeah. It's so funny, I have a textbook, I forget what it was for, it was for some certification, but I just got the book 'cause I thought it was interesting to look stuff up in it.

Speaker 3:

I think it was, like, Security+, and I started reading it. And this is a problem I have when I read textbooks, since, like, high school: the way that they talk about these systems is very different from how anybody in the industry talks about it. So you're almost learning this sort of sanitized version of things that, in reality, there's, like, vast controversy around or disagreement with.

Speaker 1:

Yeah. There's Larry McVoy's quote on that, Adam, where in school you always assume a frictionless surface, and then when you actually go to build things, all you work on is the friction. I feel like there's maybe an analog there.

Speaker 2:

Seems spot on in terms of how gritty things get in the real world. And, yeah, one of the things I was thinking about reading your piece was how some of the benefits of memory safety can feel abstract, especially for folks in university or learning or early in their careers, and then feel indelible and painful to people with more experience or later in their careers. But then, and you speak to this point, it can be hard to kind of get off of that train and hard to sell the benefits of changing tack, or, you know, accepting the risk, because there's not zero risk in making such a significant change.

Speaker 3:

Yeah. I wonder if it was easier for me to come to the conclusions that I have because I wasn't attached to these languages. Because I have friends that are very attached to these languages. And I was thinking about that when I was rewatching, again, Alex Gaynor's talk on memory safety and the stages of grief. I'm like, oh, I never had these stages because I was never attached to C++.

Speaker 1:

Yes. I mean, I think the answer to your perhaps rhetorical question is absolutely yes. It is easier for you, because you're also taking the perspective of the consumer. And it's like, why are we even debating this?

Speaker 1:

Like, of course we should have seat belts. And it's like, no, no, no, but these folks haven't had seat belts for their entire career, and they're really good drivers.

Speaker 1:

It's like, and they know they're not going to get into an accident.

Speaker 2:

That's like, do you hear yourself?

Speaker 3:

Greg finds it really enjoyable to code in C++. Like, he was telling me how much he enjoys it. And I'm like, I hope I don't, like, ruin his life or whatever. He's just a friend, a developer. But I mean, ultimately, I feel like these issues are more important.

Speaker 4:

You you

Speaker 1:

You know, the only person I've heard express that sentiment is Adam's teenager, and we have viewed that, Adam, as teenage rebellion. We view that as, like, we know that he's gonna

Speaker 2:

I mean, clearly, it's somewhere between teenage rebellion and poor parenting, maybe in equal parts.

Speaker 4:

I wanna make a very dry joke: the thing about behaviors that harm communities is that it's fine if you do them as long as your work doesn't impact other people. So if your friend is writing C++ in the safety of their own home and not distributing their code to anybody else, they can enjoy programming in C++ as much as they would like, as far as I'm concerned.

Speaker 1:

Steve just wants to make sure he doesn't

Speaker 4:

Rather do it at home than in some, God knows where, with friends of yours.

Speaker 1:

Absolutely. In some dirty alley. Absolutely. He's absolutely a teenager. Absolutely.

Speaker 1:

Just rather do it Exactly.

Speaker 2:

I just wanna know where he is

Speaker 1:

when he's doing it. That's all. Exactly. That's it. Yeah.

Speaker 1:

That is a sentiment that I think makes sense at some level just because... or rather, it represents youth more than anything else. You don't need to disclose how young your developer friend is, but I think it is natural for youth to go through a period of just loving complexity almost for its own sake. And then there's actually kind of the pivot that you're referring to, Adam, where you get these scars that develop. You realize, like, wait a minute, the complexity I loved in that time frame...

Speaker 4:

You know, it's getting into something, though, or maybe not. Again, maybe I'm reading into words slightly. But, like, one of the things that I hear from that sentiment, which I think is a big problem in this, is that you do have to convince people that, like, say, seatbelts are necessary and good. And there are some people who like not having them, and, no,

Speaker 4:

I don't know if being like, sorry, you know, you can't do that anymore or whatever, is the most effective rhetorical strategy for actually accomplishing the change that's desired to be brought about. Because, like, your friend is definitely not the only one, as much as we would joke about it. Like, because I'm me, this will not be a shot at anybody, but I have an r/cpp tab open right now. I'm reading.

Speaker 4:

And, like, there's this person arguing that, like, C++ is basically Python, and he can write C++ as easily as Python. And so, like, you know, whatever, etcetera. And those are arguments I hear all the time. There's a lot of people on the Internet, and a lot of people like different things. And so it is true that if, like, we want to make this change... I don't think this is specific to the Consumer Reports paper, which I think is very good. But there's this, like, weird... there was a comment the other day on the C++ subreddit, someone was like, you can smell the fear around here these days. And people did not take that very well.

Speaker 4:

But, like

Speaker 1:

Was that comment, obviously, by you or one of your alts?

Speaker 4:

I don't... well, actually, I do technically have an alt. It doesn't matter; the point is that you don't wanna get distracted. The important part is that, like, you know, we gotta figure out how to make that change occur, even though there's gonna be people... and a lot of the sentiment that I see right now in the C++ community is feeling attacked and, like, under siege, and therefore entrenching. And that's, like, not actually the way that I would wanna see this move forward.

Speaker 4:

And so, you know, I'll joke about, you know, your friend shouldn't program in C++ at all or whatever, but, like, I don't actually believe that. That's just, like, a funny way of putting that sentiment. But I also think that sometimes you make these jokes, or, like, there are some people who are a little more hardline about it, who are kinda, like, serious about doing that specifically in, like, all circumstances. And I'm not sure whether that's gonna, like, get us to the memory safe world faster or not.

Speaker 4:

I'm not sure. But it's, like, a thing I've been thinking about lately.

Speaker 1:

So yes. And this actually gets to a question that I had for you, Yael, because you talk about public accountability. Very important. But I will tell you that for my own personal journey, one thing that was very important for me personally was actually private accountability. In that a big wake-up for me... you know, I appreciated that C was dangerous, and I'm discarding C++ for the moment, because I was, and have been, a C++ conscientious objector.

Speaker 1:

But I appreciated that C was dangerous while also feeling like I could navigate it. And the wake-up call for me, the indisputable wake-up call, was a vulnerability in code that I had read, that I had thought a lot about. And the vulnerability was actually, and, Steve, this is what I mean when I talk about, like, integer safety in Rust, this is one of the things we don't talk about enough in Rust, the integer safety. Because when people think about memory safety, they're often thinking about, like, oh, well, I'm dereferencing pointers. It's like, well, it's actually more than that. You have got indices that you are using to index into memory.

Speaker 1:

And if those are going to be in any way untrusted and potentially malicious, it is actually really hard to write that code correctly. And lint won't necessarily help you. I was shocked at the fact that I effectively had an indexing error that lint was silent about. And that was a wake-up call for me, certainly. And so, yeah, what do you think about, like, connecting engineers?
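(To make the indexing point concrete, a small hypothetical sketch, not the code Bryan is describing: in a bounds-checked language like Rust, an untrusted index turns into a handled error or a panic rather than a silent out-of-bounds memory access.)

```rust
// Hypothetical illustration of the untrusted-index case described above.
// Slice access in Rust is bounds-checked, so an attacker-controlled index
// cannot silently read or write memory outside the buffer.
fn lookup(table: &[u8], untrusted_index: usize) -> Option<u8> {
    // `get` returns None for an out-of-range index; the panicking form
    // `table[untrusted_index]` would abort rather than corrupt memory.
    table.get(untrusted_index).copied()
}

fn main() {
    let table = [10u8, 20, 30];
    assert_eq!(lookup(&table, 1), Some(20));
    assert_eq!(lookup(&table, 9_000), None); // bad index is caught, not exploited
}
```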

Speaker 1:

It's like, yeah, I'm all for blameless postmortems and hugops and so on. But it's also, hey, you should know that 15 years ago, you wrote this code that you thought was correct, that introduced this really serious vulnerability, because it was, in fact, not correct. You know?

Speaker 3:

For some reason, this line of questioning reminds me of that blog post that just came out, the unsafe language doom

Speaker 1:

principle. Yeah.

Speaker 4:

Oh. You know, around It's

Speaker 3:

Like, you know, even if you do this perfectly, here's what could happen. Yeah. From my end, I don't know the individual names and faces responsible for types of vulnerabilities, but I kinda think that it's more companies that should... I don't know. I think it's more companies. But yeah, that's a really good question, 'cause it's like, how do you hold people accountable without them kind of doubling down and getting mad?

Speaker 1:

And I don't wanna, like, publicly persecute software engineers. I just wanna be like, hey, by the way, you should know. Like, we're not gonna put you in jail. We're not even gonna hold you financially responsible for all this damage that you caused, but, like, maybe you should stop arguing about defending C++.

Speaker 4:

I know you're telling the secrets.

Speaker 1:

You know what I mean?

Speaker 4:

I'll tell your mother you've been writing C++.

Speaker 3:

I'm one of those people who reads, like, every single response to my article. So I'm on Reddit and Twitter, etcetera, like, probably too much. But yeah, I've been reading there. I think they were, like, most upset that I didn't differentiate between C and C++.

Speaker 4:

That was a bunch of it. Yeah.

Speaker 3:

And then somebody accused me of posting a link twice even though I had never posted in that forum.

Speaker 1:

Posted a link twice, oh, how dare you. That is just... the gall.

Speaker 3:

And I'm like, really, I shouldn't chime in and be like, no, I didn't. You know?

Speaker 1:

Oh, I think you should chime in. No. No. No. I so okay.

Speaker 1:

By the way, Steve, you know that this is my approach to these things. Whenever you've got a discussion that's getting hot, I think it always helps for you yourself to appear. It really helps, I think, to humanize it. Be like, hey, actually, that asshole you're talking about is right here.

Speaker 1:

It's me. I posted the link twice.

Speaker 3:

Well, actually, I didn't post the link twice, just to... I didn't even...

Speaker 4:

For the record, just to clarify.

Speaker 3:

I've had really good discussions with people where, like, it started out with them yelling at me on Twitter for something they thought I did wrong, and we get on the phone and end up agreeing on, like, 90% of things. And I always find it really helpful. So I'm always, like, very nervous before I get on the phone, but yeah. It's hard to talk to people. I don't know. I read a book called Mistakes Were Made (But Not by Me), about all the ways that people trick themselves into thinking that they didn't actually mess up.

Speaker 3:

Like, everybody else is responsible. Yeah, it's kind of interesting. I don't know how people would respond to that, because I feel like people don't like to... most of my, you know, broken friendships have been from me pointing out something somebody did in what I thought was, like, a polite and non-confrontational way. They just don't wanna have that kind of discussion.

Speaker 3:

So

Speaker 2:

Yeah.

Speaker 1:

The key is that I want to avoid shame.

Speaker 2:

One of the things I love about this piece is that I think one of the ways to avoid shame is to kind of fill in the other side of it. You know, Bryan, we talk a bunch about the intrinsic motivation of engineers, or even the communal motivation of engineers. But Yael, your piece talks about the other side of it, the consumer side of it, you know, making analogies with The Jungle or with Nader's book, Unsafe at Any Speed. I love the idea of, like, nutrition labels. So, you know, maybe creating more awareness and more kind of desire in the public at large then makes it easier for practitioners to sort of get on board and to bend, because it's not just sort of admitting some failing or whatever, but it's more of a migration on both sides of it.

Speaker 3:

It's interesting. I do feel like I'm more open to it if I don't feel like people are yelling at me, if they're like, oh, you worded this in a certain way which I think could be harmful because of X, Y, and Z. We don't say dark patterns anymore, we say deceptive design. I'm like, oh, of course I will stop saying that.

Speaker 3:

But of course, I'm not attached to those terms the way that some people are attached to the way that they have been doing things. So I don't know. This is an interesting, like, theory-of-change discussion. I don't know if I'm great at it, but I think about it a lot.

Speaker 1:

Well, I mean, just by that, it helps you be much better at it. Right? Just the kind of consciousness of it. One point that we

Speaker 4:

Before we transition to the next thing, I wanna say that a great example of making this message connect, and then having it actually create the change, is a previous employer of mine, Cloudflare. Basically, when Cloudbleed happened, that was due to a memory unsafety, you know, situation. And that was treated as an existential-level threat to the business, and then basically ended up in a, like, if you start new projects, you need sign-off from the CTO. I don't think that was ever an actual real policy, but it felt that way, or that was kinda, like, the general attitude: please do not use a memory unsafe language unless you absolutely must. And, you know, now we see them deploying Rust and Go and other memory safe languages at scale, doing systems tasks that we would have previously only done in C and C++. And that, by them understanding as an organization that, like, this is a threat, caused the change that made that happen.

Speaker 4:

And so I don't know how many other organizations are, you know, willing to make those connections and have management buy-in and all that stuff. But just to put down one positive case I've seen of that actually happening, in a way that's, like, not shaming anyone individually. Like, they never said which engineer wrote that bug. And I worked there and had access to all the internal stuff, much later of course, but I never heard who wrote it, because that's not the point.

Speaker 4:

Right? And so I thought that was a good example of of that kind of change happening.

Speaker 3:

That's great. Yeah. And I really hope that people who are making the argument internally can use this. There's a lot of momentum happening, so this isn't the only thing that they can use, but I do hope it helps them make that argument. Because I feel like even just a year ago, people were like, companies will never do this.

Speaker 3:

It's expensive and hard. And I'm like, companies can do expensive and difficult things. Like, what if they see it as, this will help us stay competitive? Like, we can actually stand behind our promises around, you know, privacy, security, whatever it is. And then all the other benefits, like stability and speed, etcetera.

Speaker 2:

That's right. It's only expensive on one accounting. Right? It's expensive through one lens, but to your point, if you take the fully burdened cost of what it costs to maintain and own and fix and apologize and, you know, pay for recalls or whatever, then is it expensive, or is it a push, or are we actually saving money?

Speaker 4:

So, Adam, I'm extremely proud of this joke, but are you suggesting there needs to be a total cost of ownership analysis?

Speaker 1:

Oh my God. Oh, oh, why?

Speaker 4:

Feel free to edit that out before you publish the actual part.

Speaker 2:

Not a chance. That's gonna be... don't tell me.

Speaker 1:

No. No. You know no. No. Actually, no.

Speaker 1:

As a punishment for that, we are not gonna edit it out. Actually, we refuse to edit it out. Right. Well, so, yeah, it reminds me of something. And, Steve, I don't know if you had the same thought about this.

Speaker 1:

And, yeah, for folks of my vintage, I mean, I grew up actually with operating systems that lacked memory protection. And there does seem to be a bit of an analog, in that when I was growing up, you had the 2 operating system choices, namely Windows and Mac OS, and neither of them employed memory protection even though it was present in the microprocessor. And as a result, it was very bad for the consumer, and you would lose work all the time. Your machine would just reset, and it really was awful. And when reading this, I was kinda thinking, like, wait a minute.

Speaker 1:

Where was Consumer Reports? Because I don't think there was any real public agency taking this kind of duopoly to task and saying, hey, this is neglect, this is malpractice, in that you've got this capacity to actually isolate these things from one another using the microprocessor, and you're not doing it.

Speaker 1:

And ultimately, we don't live in that era anymore, and we got out of it in part because the competition stepped it up, and both Microsoft and Apple, and then, of course, Linux and Unix and so on, which were always memory protected, adopted memory protection. And the consumer is much, much better for it. So I think it is great that you're using Consumer Reports as a vehicle, to the point that it has me, like, demanding: my youth was robbed by Bill Gates because Consumer Reports wasn't calling him out on it.

Speaker 3:

I can't speak back that far, but we do... like, Consumer Reports has done a lot. We have the Digital Standard. There's a lot of stuff that Consumer Reports has done, that I think people don't know about, to, like, raise the standard for connected devices and evaluate consumer technologies and how they respect people's interests and needs based on, like, privacy, security, ownership, etcetera. And so there has been work. And I did a whole thing on VPNs. But with all of the work that I've done, it's always a little awkward, because most of the threads I decide to follow are because I talk to a lot of people who told me things.

Speaker 3:

And most of them don't want their name used, like, their employer might get mad, or they just wanna have, like, on-background or off-the-record conversations. But I feel like there's a huge... if you just take what technologists say and talk about and, like, adapt that for a consumer audience, there's a lot there. You know?

Speaker 1:

But there is a lot there, and it's underreported. So we're glad you're taking it up.

Speaker 2:

What's been the reaction from the non-tech community? Like, I've seen in my communities how people have responded, but what have you gotten from people who maybe didn't really have any awareness of this topic at all?

Speaker 3:

I guess I haven't heard from people who've never thought about it, but there are people I've brought it up to who, like, maybe weren't as interested. And then they were like, oh my god, we should be listening to this. This has gotten a lot of play, and, like, you know, the NSA is talking about it now. Like, CISA is talking about it.

Speaker 3:

This was, like, mentioned in the Washington Post. And so suddenly they're kind of getting more interested, which is really, really cool. But yeah, I haven't heard from people who hadn't heard about it at all. That hasn't happened to me yet.

Speaker 1:

And so what are some of the specific thoughts? Because you've got a bunch of concrete recommendations in here. And, you know, Adam mentioned earlier, you mentioned the nutrition label. I love the nutrition label idea. It does get really complicated in a hurry.

Speaker 1:

How do we navigate the fact that most systems, even actually, like, Rust systems, are actually a hybrid of safe and unsafe code? So how would we express that, whether it's a kind of a nutrition label or regulatory compliance? Do you have thoughts on how we would navigate that?

Speaker 3:

I think that that's gonna be really challenging. I tried to point out a lot of the challenges in the report, because I feel like there's just certain things you can't really compare, and they're really easy to game. And I think there's just such a fine line between something that would actually create change and something that would just look good on paper. Like, if we told companies, we want you to have a certain percentage of your code converted to a memory safe language, they would just prioritize based on what's easiest, and we want them to prioritize based on risk.

Speaker 3:

And so, yeah, I think this is gonna be a tough needle to thread. And I guess that's something I learned too when I was looking at VPNs, my last big report was on VPNs, and I'm like, it's really hard. There's ways that people can sort of game any metrics that you've set and make it look like they're making change. But when you actually look at the impact on the end user, there isn't any, or there's, like, very little.

Speaker 3:

And so, yeah, that's where I kinda lean on all the experts that I talk to, because I feel like that's the difference between us coming up with some kind of metric that makes sense and one that doesn't. And I read this really amazing book called The Tyranny of Metrics. It's such a good book, even though...

Speaker 1:

Oh, yeah. Interesting.

Speaker 3:

Even though I think the guy who wrote it is against public records, which I disagree with strongly. But,

Speaker 1:

well, based on his metrics, he's got metrics. So I'm just No.

Speaker 3:

He just talks about, like, ways you can set metrics and then, like, do well in the metrics, not even intentionally.

Speaker 1:

He's against public records? Sorry, I'm still processing that. How are you against public records? What does that even mean?

Speaker 3:

I think he thinks certain information should be, like, private, and that it undermines things when you make too many requests, because it stops people from talking publicly about certain things. And maybe there's a point there, but there's, like, a gradient.

Speaker 4:

Yeah. Like, I guess when I hear that, what I think of, and I'm not sure if this counts as a public record, but as an example of something like this: you know, a lot of my friends, being trans, have had to deal with legally changing their name. And due to the way that that works, you know, it's all by state, but usually you have to go to your courthouse, and part of that is also, like, taking out an ad in the paper being, like, I'm changing my name from X to Y, because it's been determined it's important that that's part of, like, the public record. But in that specific case, it also forces people to give private information out to literally everybody, and they may not want to for whatever reason.

Speaker 4:

And, like, you know, you can make arguments for and against all that, but I guess what I would imagine is that he probably has some sort of criticism along

Speaker 3:

those lines. Yeah. But it was about, like, public records requests from, like, government entities, which, I don't know, I feel like that should be more transparent. But, yeah, some of the things were like, well, let's say you create patient scorecards to rank physicians, and some physicians treat higher-risk patients.

Speaker 3:

So therefore the number that comes up for them is lower, because their patients have worse outcomes because they were worse coming in. Or, like, anybody who's ever worked in public education knows: sometimes the metrics don't measure what you think they do. So, yeah.

Speaker 2:

Oh, that's amazing.

Speaker 1:

Yeah. No. The metrics... I mean, ask any Xoogler about their OKRs. The metrics are terrible. And I'm excited; The Tyranny of Metrics looks great.

Speaker 1:

Have you ever heard of principles-based accounting versus rules-based accounting? My late father-in-law was a professor of accounting in Australia, and he would like to go on some delightful anti-American rants. First among them was our adherence to rules-based accounting. So in the US, we have rules-based accounting, which is, like, there are a bunch of rules. And so a really good CFO knows how to get around all the rules.

Speaker 1:

And that's not wrong. That's not considered to be a crime. These are the rules, and so you're not actually cheating if you abide by the rules. There's no spirit to abide by.

Speaker 1:

And in principles-based accounting, there are principles, and it is not based on rules. So you can't cheat the principles. And I kinda feel like for this, we would need something that is principles-based, not rules-based, because it's just too easy. As you say, it is way too easy to game. And also, software engineers as a group are, you know, pretty bright, curious, motivated people.

Speaker 1:

We don't wanna take all of that energy and apply it to how to game a bunch of rules, because they will definitely do it. I think we're gonna have to go to something that's principles-based. I don't know. What do you think, Yael?

Speaker 3:

No, I definitely hear people complain about, like, compliance-based rules all the time, like people in healthcare. And it's like, cool, you checked off all these boxes, but did it really actually create meaningful change? I don't think that boxes are always bad. Some of them can be meaningful.

Speaker 3:

Like, for example, one of the things in the Digital Standard is, do you agree to not sue people who tell you about your bugs, like vulnerability disclosure. So sometimes it does make sense, but other times it really doesn't. So yeah, I like the idea of a principles-based approach. I think one of the issues, though, is, well, this came up for me with VPNs. 'Cause I'm like, okay, if I wanted, like, the ideal VPN, my principles for what I want, nobody will do. Nobody is going to make their client-side and server-side code open source, which is, like, my principle that it should all be, you know, open source and auditable, and nobody's gonna do that.

Speaker 3:

So, I don't know, that's kind of the extreme idealistic version of it, where I'm like, oh yeah, everybody should do X, Y, and Z, and therefore I can't recommend anything, which is kind of where we're at right now for many things in memory safety. Like, I can't recommend any of these, you know? And so ultimately it's about creating that change, but yeah.

Speaker 3:

I think that's what a lot of the people in the convening talked about: how do you create incremental change that actually moves the needle, instead of just people saying they're going to do certain things that don't actually have impact? They're just checking off a box to get you off their back. Or, like, we wrote a blog post about memory safety but didn't change anything, so therefore we can check off our box as a company that cares about this issue.

Speaker 1:

Well, I really like your call on companies to be more transparent about the causes of bugs. I think we've gotta have some way of making clear which companies are refusing to participate in CVEs. I don't know how to do that exactly. And I know that there are a lot of problems, Steve, as you alluded to, with CVEs. But I do feel that getting the information about where bugs are coming from is a really important start.

Speaker 1:

You know, an easy way to do that is open source. And, as folks know, we are very, very pro open source. I believe in what open source yields. I mean, why am I pretending like this is a controversial opinion in 2023?

Speaker 2:

It's a question. See the episode we did with Stephen O'Grady a while back, where there's sort of people pulling the football out from under us in terms of open source. And, not to turn this into a therapy session, but you've been living in a world, the embedded world, that has lots of proprietary software, and, you know, as we go deeper down into the hardware, it feels like more and more proprietary software, as opposed to us floating up on the surface.

Speaker 1:

Yeah. And you call out IoT in here, Yael, but I do feel that it's a domain that desperately needs open source and memory safety and memory protection. I view that to be the triad yielding the best, safest possible artifacts: where you've got a memory safe language, and it's open source, and you've got memory protection. Which, I mean, as I mentioned, wasn't that resolved in the eighties? What was resolved in the eighties, nineties, and two-thousands for personal computing and for server-side computing is definitely not the state of the art, sadly, for embedded development, for IoT, and for other embedded devices.

Speaker 1:

It's actually very unusual, that memory protection. So I would love to have a way of, at least, can we evaluate, can we encourage that triad? Do we have some way of looking at some of these things, whether they're devices or products or what have you, and just having an idea of where they score on these three things? And now, I keep deriding metrics, and I just can't help myself when I look at that.

Speaker 2:

For a publication that scored products.

Speaker 3:

Yeah. We do have, in the Digital Standard, in the governance section, an open source category that says, like, we want to know if the product's source code is publicly available and reusable. And is it, you know, under a license approved by the Open Source Initiative? Like, what type of open source license is it?

Speaker 3:

But I don't know if they have actually used that for... I would have to check with that team. I don't know if that's actually been used for testing, but it's definitely one of the things I look for when we make VPN recommendations. I'm like, is this open source? So this is another kind of problem.

Speaker 3:

I think where things can get difficult with any kind of evaluation is, like, how do you rank things? How do you weight them? How do you hold them against each other? Like, open versus transparent, or governance versus ownership, etcetera. And it's kind of a hard problem, especially when you're comparing things where you can't really compare apples to apples.

Speaker 3:

So, yeah, these are open questions. I feel like a lot of people are kinda talking about it and trying to figure it out, but it might be a while before it really makes sense.

Speaker 4:

So I definitely agree with all this, but I also wanna slightly chase down a small thread from earlier: why does this feel controversial right now? I think I first heard Patrick Walton make this argument, but if you look at, like, the past 20, 30 years, memory safety has been winning. You go back to the eighties, and, like, everything was written in memory unsafe code.

Speaker 1:

I pressed it.

Speaker 4:

But, like, by now, you know, almost everything is by default written in something that is memory safe. So part of the reason why this is so contentious and so difficult is that, like, this isn't the, like, start of a battle exactly. This is like finishing the war. This is like capturing the enemy's headquarters stage of the conflict. I don't know.

Speaker 4:

I've been playing too much Call of Duty. I don't know why.

Speaker 1:

Steve would like to reiterate that he honors anyone who wants to write C++, and that when he says that this is raiding the compounds of the C++ dead-enders, he's speaking strictly metaphorically. No. But

Speaker 4:

And I think that contributes to this sense of why this is high stakes and why people have lots of feelings: because the scope of where memory unsafe languages have been has just slowly been shrinking. And there was sort of this island of, like, well, sometimes you need to use it because you have to, because of hardware. And now we're starting to see languages like Rust and other things demonstrating that that's not actually a law of physics; that's just an accident of history. And so now that little island that they've been able to live on is under threat.

Speaker 4:

And I totally get why people feel sensitive about that, but I think that's also a good reminder that history is already on the side of memory safety. It's good, and we should do more of it as much as possible.

Speaker 1:

Totally. I mean, there's almost something Lindy about it. Right? Where, if you are writing in a memory unsafe language today in 2023, you are almost tautologically at a layer of the stack in which performance is really important.

Speaker 4:

And also just, like, a lot of privacy and security stuff, you know, tends to be in these languages still, too, and that's why it's important. I'm sorry. Yeah. I was

Speaker 3:

Oh, no, you're fine. I just keep going back to the numbers. I'm like, 60 to 70% of browser and kernel vulnerabilities. Like, that's ridiculous.

Speaker 3:

Like, if somebody told you 60 to 70% of injuries are caused by X, you would immediately... like, I make changes in my life. I made a joke about this when I was on the Enigma panel on memory safety. I was like, I don't take a bath when it's lightning out. Like, I make changes based on risk, and this seems like

Speaker 1:

Is that actually true? Do you live in a lightning area? You live in Arizona. Is there a lot of lightning?

Speaker 3:

Not a lot of them, but, yeah, we could

Speaker 1:

No, I know. I'm just saying, I'm just admiring it. I mean, I'm not sure if you're speaking literally,

Speaker 3:

It's literal. Like, I will literally get out of a shower or bath during a thunderstorm, because I know that plumbing and other metal can be a conduit for electric current. And even though the chance of that happening is very small, it still changes my behavior. So then if somebody told me 60 to 70% of deaths are caused by lightning or thunder or whatever, like, I would definitely... I don't know.

Speaker 3:

I feel like you make changes to your behavior based on smaller risks. So

Speaker 1:

and Yeah. So you do.

Speaker 3:

It comes back to me. And that's why I kept kind of pulling on this thread. I'm like, this is a lot of vulnerabilities. That's a very high percentage. You know?

Speaker 1:

I would love to know the average or median age of code that has a vulnerability, because I think the other challenge is a lot of that code is not new. It is at this layer of the stack. I don't know. Maybe I

Speaker 4:

So Android's memory safety article had this statistic in it.

Speaker 1:

Oh, interesting.

Speaker 4:

For Android, they were saying that I think it's, like, actually the opposite: newer code tends to have the most problems. I'm trying to find it exactly. That's, like, one thing, you know, from one time.

Speaker 3:

But yeah.

Speaker 1:

To a degree, that's great. Because, yeah, the newer the code is, the more I think you can make that argument of, like, stop doing this. Because I think it's more challenging when you've inherited a code base where it's, like, yeah, I didn't write OpenSSL. You know, I have linked to it, and it's something that was written a long time ago and reflects that. And it's been evolved ever since.

Speaker 1:

So ends up

Speaker 3:

And that code base is just gonna get bigger, right? So why not? But I do think that if you want to ship a component of an existing project in a memory safe language, that would be valuable, because companies would kinda overcome that initial infrastructure investment.
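(A hedged sketch of what shipping one component of an existing project in a memory safe language can look like; the function and names here are made up for illustration: a small routine written in Rust and exposed over the C ABI, so existing C or C++ callers keep calling it unchanged. The C side would declare something like `size_t count_spaces(const unsigned char *buf, size_t len);` and link against the compiled Rust library.)

```rust
// Hypothetical example of an incremental rewrite: one routine moved into
// Rust and exported with a C-compatible interface.
#[no_mangle]
pub extern "C" fn count_spaces(buf: *const u8, len: usize) -> usize {
    if buf.is_null() {
        return 0;
    }
    // Trusting the caller's pointer/length pair is the one unsafe step;
    // everything past this boundary is bounds-checked, safe Rust.
    let bytes = unsafe { std::slice::from_raw_parts(buf, len) };
    bytes.iter().filter(|&&b| b == b' ').count()
}
```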

Speaker 1:

So Yeah.

Speaker 4:

You have to have long term thinking. Unfortunately, you know, quarterly reports and stuff.

Speaker 1:

No. You gotta be willing to be mediocre about memory safety, to get back to your mediocrity talk. Right? You got a great

Speaker 2:

point, Bryan. It's just dip the toe in. Right? Just go for it. Right?

Speaker 2:

It doesn't need to be... you don't need to be an expert on it. Even getting rid of one bug is enough bugs to get rid of.

Speaker 3:

There's like

Speaker 1:

Right. Yeah. If you garden, you'll be able to cook for yourself. You'll be able to grow food that you can eat. And I think if you write in memory safe languages, you will avoid one defect that you would've had otherwise.

Speaker 3:

I hope that, and I don't know about this because I'm not actually writing code, but I hope that the community would, you know, be welcoming and help people who are working on learning how to do this better, instead of, like, gatekeeping. I guess I just don't know enough, but I would hope that that's not a factor. And I don't know if there's anything, like, how do you connect people that need help

Speaker 1:

It's like

Speaker 3:

so that they can.

Speaker 1:

I think it's really important, and I think no one has been more at the forefront of this than Steve. It is very important to be welcoming. I think, even despite best efforts, language communities can be inadvertently unwelcoming. And, you know, Steve, you've been on a many, many year mission to make Rust as welcoming as possible.

Speaker 4:

We've got lots of problems too, but it's still better than anything else as far as I'm concerned.

Speaker 1:

And I do think, Steve, one thing that you have, and maybe I'm just, like, too inside this and can't see it, but it feels to me, Adam, and I'd love to get your take on this. It feels to me, you know, a couple of years ago, there was this idea that Rust had this almost Haskell-like aura of impenetrability. This idea, like, oh, Rust is really difficult to learn. Like, once you learn it, it's amazing, but, boy, it's really just crushing to learn. And I kinda feel like that has faded, and I don't think it's only because I have learned it since then.

Speaker 1:

I think it's been

Speaker 2:

let me give you an anecdote as a tipping point indicator. I was at a Super Bowl party yesterday, as one does. Someone was asking what my son was into, and I muttered C plus plus, and this person I'd never met before said, you know what he should check out instead of C plus plus? Rust. So I feel like

Speaker 1:

It takes a village, Adam. It takes a village. Everyone's brand new.

Speaker 4:

I just feel

Speaker 2:

like there's a certain, like, hurdle you have to get through where folks you're just meeting at Super Bowl parties are spreading the good news.

Speaker 4:

He wasn't even at my Super Bowl party. That person wasn't even me. So that's awesome.

Speaker 1:

It wasn't accurate.

Speaker 4:

I prepared someone to go to Adam's party specifically.

Speaker 3:

I I do still hear people complain that Rust is hard to learn. Sorry.

Speaker 1:

Oh, for sure. And no, it is. And I think, I mean, there's another challenge with Rust, and I'd love to get, Steve, your take on this as well. When we think about Rust pedagogically, I think just as we still teach assembly to undergraduates, we have to teach them at some level what memory is and how to be unsafe with it before we can be safe with it.

Speaker 1:

And as the parent of teenagers, this whole line of thinking makes me very nervous, by the way. But, I mean, I don't know. Maybe I'm being too traditionalist. Maybe you can't just go straight to Rust. It just feels like it's a very tough first programming language or early programming language. Fortunately, there are other memory safe languages that are, I think, gonna be easier as those first languages.

Speaker 4:

My headphones my headphones died, and I missed, like, half of the last sentence of what you said, but I think that maybe I got it, which is, like, is it easier to learn Rust when you don't know other languages first or not? Is that where you were going with that?

Speaker 1:

Yeah. Yeah. I mean, is Rust practical as a first language? It's kind of a...

Speaker 4:

Okay. So the answer is yes, but with a giant asterisk after it. So, for those of you who don't know, I used to, like, teach Ruby and Rails to both literal children and adults who had never programmed before. A lot of my earlier career was teaching programming, like, pretty expressly. And, like, so many people are, you know... and also my partner, Ashley, had done this as well.

Speaker 4:

And, like, she would say that with her students, who were, like, high schoolers, people would say, oh, you should be teaching JavaScript because it's, like, easier because there's no types. And she's like, they want types. They, like, think in types already. Like, the idea that there's, like, a kind of a thing

Speaker 1:

needs structure.

Speaker 4:

No. The idea that, like, there's a kind of a thing and you don't know what it is is, like, weird, basically. And so I think that a lot of programmers don't appreciate the degree to which knowing how to program has, like, warped their understanding of what it's like to not know how to program. And their estimation of what is difficult versus what is easy is wrong, because what's difficult or easy for someone who's been programming for 10 years is not the same as for someone who's brand new. I think types is a great example.

Speaker 4:

I don't think that every type system is equivalent. That's a whole big giant can of worms, obviously. But to get back to what I said specifically, the giant asterisk on Rust: there's not a lot of good material for learning Rust as your first programming language right now. And so that's, like, one example where I think it's just not the case. Like, there is no book that's, like, teach programming via Rust. Some people try to say that my book is that.

Speaker 4:

I disagree with them. We, like, assume that you know how to write code, and edit text in an editor, and compile it and, like, run it and stuff. We show you how, but, like, we don't explain, you know, okay, to do programming, you need to open a text editor and edit a file and then, you know, run it through a compiler. Like, all that stuff has to be taught. Right?

Speaker 4:

Yeah. Right. And so there's no resource that does that with Rust specifically. I do think that for a particularly motivated student with a certain kind of mentality or temperament or something, you can. And, like, I know there are people who have learned Rust as a first language, and I know some people that have, like, helped supplement that lack of express resources by, like, just teaching them or, like, answering questions or, like, helping out.

Speaker 4:

And they've generally said that they have found that those people have learned Rust relatively easily because they just, like, don't know any different. And so the idea of these rules that are so weird that they mess up your existing designs, like, people don't have those existing designs, so they just learn how to write code that works with the compiler rather than against it from the start, and so it ends up being, like, faster.

Speaker 1:

Oh, I'm sorry. What is a linked list?

Speaker 4:

Right. Exactly. Yeah. They don't even know what that is. They're not evaluating the programming language based on whether or not they can write a linked list, because they don't even know what that is.

Speaker 3:

So you

Speaker 2:

can be like, how could a doubly linked list even exist? What are you talking about?

Speaker 1:

Oh, like, what even is that? That's, like, walk

Speaker 2:

me through it?
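For readers wondering what the joke is about: a doubly linked list is awkward in safe Rust precisely because two plain owning pointers per node would create shared, mutable, cyclic ownership, which the borrow checker rejects. One common workaround, shown here as a minimal illustrative sketch (not code from the episode), is shared ownership in one direction and a weak back-pointer in the other:

```rust
use std::cell::RefCell;
use std::rc::{Rc, Weak};

// One node of a doubly linked list. The backward pointer is Weak so the
// two directions don't form a reference-counting cycle that leaks memory.
struct Node {
    value: i32,
    next: Option<Rc<RefCell<Node>>>,
    prev: Option<Weak<RefCell<Node>>>,
}

fn main() {
    let first = Rc::new(RefCell::new(Node { value: 1, next: None, prev: None }));
    let second = Rc::new(RefCell::new(Node { value: 2, next: None, prev: None }));

    // Link the two nodes in both directions.
    first.borrow_mut().next = Some(Rc::clone(&second));
    second.borrow_mut().prev = Some(Rc::downgrade(&first));

    // Walk backward from `second` through the weak pointer.
    let prev = second.borrow().prev.clone();
    if let Some(node) = prev.and_then(|w| w.upgrade()) {
        println!("before {} comes {}", second.borrow().value, node.borrow().value);
    }
}
```

Which is Steve's point: someone who never wrote a pointer-chasing linked list in C doesn't miss it, and doesn't judge the language by it.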

Speaker 4:

Yeah. So that's basically what I would say about, like, learning Rust as a beginner or, like, as a new programmer: if you really want to, I think you can do it. I don't think it's something I would, like, recommend to the average newbie who's learning to program today, but I don't think that means it's impossible in the future.

Speaker 3:

Oh, there should be a Rust version of... to out myself as someone learning Python, I'm doing the Georgia Tech, the David Joyner class online. It's actually about fundamentals of computer science, and they said they wanted to add other languages. Yeah. And so, to give somebody random extra work, it would be so cool if this also had Rust.

Speaker 3:

I mean, I don't know. It's pretty good. It's a really good program. Yeah.

Speaker 1:

I mean

Speaker 3:

I feel like I've gotten further in this one online program than in all my other past attempts.

Speaker 1:

Well, that's great. First of all, that's great. Good on you, and terrific. And it does feel, I mean, in part because of Rust's relatively young age, it does have fewer historical kinda warts that are hard to explain. It's like, no.

Speaker 1:

No. Sorry. You just have to do this. Like, we'll explain it in a HOPL paper a while from now. Don't worry about it.

Speaker 1:

Just do it. It feels to me like there's less of that stuff. And, yeah, it'll be interesting, Steve, to, like

Speaker 4:

Less but non zero.

Speaker 1:

You should do this experiment. Maybe you should do this experiment on yourself, Yael. You could actually go... I wonder if someone has attempted to take an introductory course, the Georgia Tech course... have they talked about actually doing

Speaker 3:

it? I don't think they have. I hope I'm not wrong on this, but, like, in the course, there'll be, like, here's how this concept applies to all these other languages. And I'm not sure they have Rust, Steve, on the list. I don't know how old this course is, but, hopefully, I'm not wrong on that.

Speaker 3:

But, but yeah.

Speaker 1:

I can't hear, embarrassingly. I'm so sorry. I'm not ignoring you. I can't hear. So I'll go do my audio.

Speaker 1:

I'll be back in a second.

Speaker 4:

Bryan still uses Linux, so he's got a lot of audio problems. I can make that joke because he can't currently hear me. So,

Speaker 3:

Yeah.

Speaker 2:

If you're calling in a new room.

Speaker 3:

Yeah. It's actually oh, no. That.

Speaker 2:

That is back.

Speaker 4:

I I am gonna be coming into work tomorrow. If you wanna have a meeting, it's like fine. I get yeah. Yeah.

Speaker 1:

Yeah. Are you yeah. Yeah. Do you have availability tomorrow? Just no.

Speaker 1:

Just checking. Oh, no. It's fine. We'll talk about tomorrow.

Speaker 4:

You want it before the all hands? Not after the all hands?

Speaker 1:

Yeah. If you could just kinda like a

Speaker 3:

a late night No.

Speaker 1:

I like,

Speaker 3:

I need to talk to you.

Speaker 1:

Exactly. No. No. No. No.

Speaker 1:

That was actually... I was feeling something very unusual for me, which is, Linux is being unfairly maligned, and I must rush to defend it. In my enthusiasm, I had actually just yanked my jack out of the wall and couldn't figure out where the plug was. So that was purely

Speaker 3:

Yeah.

Speaker 1:

Just me being a doof. Yeah. Exactly. So, yeah, one question I wanted to ask you was around, you know, you bring up these really interesting historical analogs, talking about Upton Sinclair and The Jungle, and Ralph Nader and Unsafe at Any Speed, and how we got some public recognition on these things. And as I look at, certainly, our food safety, medical safety, aviation safety, regulatory bodies played a really important role in that.

Speaker 1:

And I personally think no entity has done more for aviation safety than the National Transportation Safety Board. And there's a kind of famous and ongoing tension between the NTSB and the FAA, where the NTSB views the FAA as actually being too beholden to the commercial airlines. So first of all, as we look at potentially regulatory solutions, is there an NTSB equivalent that we could do here? Because that to me would be potentially interesting.

Speaker 3:

That's such a good quest... I mean, I feel like there are so many different organizations that think that they're in charge of, or have varying levels of responsibility for this, I should say. Like, there are so many different US government groups that think that, like, their mandate includes cybersecurity, and it probably does. So I don't know. Yeah. That's a good question.

Speaker 3:

I don't know exactly how that would happen. Like, is it gonna be the cybersecurity strategy out of the Office of the National Cyber Director? Is it gonna be... I know that, like, CISA and NSA have given some guidance, but yeah. I don't know. I'm interested to see how that plays out too.

Speaker 3:

I don't really have any predictions.

Speaker 4:

One, I don't know if this is, like, an overall possibility, but just, like, an interesting angle of this happening was when Facebook was going to be making that cryptocurrency and they got dragged out in front of the government. And, was it an FBI director, a senator, or somebody, had, like, asked them, because, you know, they're doing the whole person-being-interrogated-by-the-government thing, and you tune in, and you've got the senator or whoever being like, I see you're using a nightly version of the Rust programming language. Are you taking the engineering of this thing seriously? Like, how can we trust you? And we all in, like, the Rust world were like, holy crap.

Speaker 4:

Like, this is a government official, like, caring about the difference between the nightly versus non-nightly, you know, release channel or whatever. So, you know, like, that thing happens sometimes. And I think that was, like, yeah, they were doing this, like, security angle. But, yeah.

Speaker 3:

The FTC. It could be the FTC. I don't know. Like, what if the... I don't know. It's a good question, and I don't know, so I'm speculating.

Speaker 3:

But, like, what if... like, can failure to use memory safe programming techniques be considered an unfair or deceptive business practice, by putting people at unnecessary risk of hacks due to memory exploitation that would have otherwise been stopped? Like, I don't know. Maybe.

Speaker 4:

Adam joked about the SEC in the chat, and I replied with the Matt Levine thing, which is: everything is securities fraud. And, like, he, you know, often talks about how companies will sort of, you know, get sued, because if they say in their quarterly earnings call or whatever that, like, you know, their products are secure, and then, like, the next week they get hacked and their stock goes down, then is that, like, defrauding investors, because your product isn't as good as you said it was? Kind of like that sort of stuff happens. So maybe we'll see a hedge fund try to take out a giant short position based on someone's memory unsafety.

Speaker 3:

Oh my god. I would be so terrified to say... it's like saying, like, your product isn't hackable. Like, that's, like, the most dangerous thing to

Speaker 1:

say. Tamper proof is the other one. If you really want to, like, bring out everybody, say that your product is tamper proof, because it definitely is

Speaker 3:

Yeah.

Speaker 2:

I had a kind of wild one along these SEC lines, where we talked about, you know, the companies and their engineers and employees. We talked about consumers and consumer awareness and then about governance, but there's also the ownership of companies in the form of, like, shareholders and shareholder actions. And is that another way to, kinda, you know, you talk about theory of change, is that another way to make progress? Where, you know, say Apple or Microsoft shareholders, you talk about Apple and Microsoft in the piece, could, you know, ask the board of directors to make commitments, or ask leadership to make commitments, around these kinds of vulnerabilities, around progress on memory safety. Kind of a wild one, but I'd love to hear your thoughts on it.

Speaker 3:

Oh my God. That's fascinating. That's a fascinating idea. Yeah. I think you'd have to really make the business case like, you know, this, I don't know if we'd need more examples or what, but, like, this will save you money in the long run.

Speaker 3:

Yeah. That's super, super interesting.

Speaker 4:

Let's just agree to not pull a Paul Graham and try to, like, talk about the September 11th of memory safety or whatever, because that was in poor taste.

Speaker 3:

Oh, no comment.

Speaker 4:

Well, you don't remember this, Bryan?

Speaker 1:

How I

Speaker 4:

Literally, in September of 2001, he tried to talk about, like, Lisp is cool because you can keep code and data separate, and therefore a buffer overflow doesn't happen just like, you know, someone buffer overflowing their way into the pilot seat of a plane, which is just, like and, like, literally ends up Denver 1. So it just really, like yeah. Anyway. That's awful. That's not effective.

Speaker 4:

I definitely see. You know what, Steve? I'm not on the Paul Graham train anymore.

Speaker 2:

Yeah. Not a fan anymore because of that. That's it. You turned me.

Speaker 1:

That's it. Last straw. That's the last straw. It definitely was not his tweet today implying that no one should trust journalists. Yeah.

Speaker 1:

That has got nothing to do with it.

Speaker 3:

But this

Speaker 1:

biz no. No. That is

Speaker 3:

The Mastodon?

Speaker 1:

Or is

Speaker 3:

he back on Twitter now?

Speaker 1:

He's back on Twitter now.

Speaker 3:

Okay. Okay.

Speaker 1:

Okay. No. He's back on Twitter. He's back on Twitter. Unfortunately, Musk backtracked after he actually left.

Speaker 1:

That was actually the breaking point for Musk, which is unfortunate.

Speaker 2:

But the other thing

Speaker 4:

I was gonna say was, like, why pick one organization? Why not just have all of these people enforce memory safety? Like, who needs just one government body to make sure this happens? Let's just get them all, and then, you know, like, at least one of them will end up, you know, doing it right. Like

Speaker 1:

Yeah. And I would like it to not be punitive and kind of shame-and-enforcement based, if at all possible, just because I don't know that that's practical. I would love to find a way for it to be principles based and best-practices based. And, hey, this is gonna allow you to build better artifacts faster, which is true. Right?

Speaker 1:

And I think that, you know, what can we do to... I mean, you wonder, in terms of society's scarce resources, should we spend them on an enforcement body, a regulatory body, or should we spend them on promoting education standards that do have Rust as a first language? You know, I do kinda wonder. I mean, it is a bit of carrot versus stick. You know, Adam, as I think you observed to me once many years ago, it's like you can kind of beat people with a carrot. Like, wow, that carrot really hurts, actually.

Speaker 1:

What is that carrot made of? That carrot

Speaker 4:

Beaks awfully and

Speaker 1:

Ouch.

Speaker 4:

Behold a big carrot.

Speaker 1:

Yeah. I mean, but I do think it's like, you wanna find a way to get people there without shame and punishment, but we also kind of need people to know that this stuff is important and you need to get there. It's not enough to just kind of wander your way over here in terms of memory safety.

Speaker 2:

You know, so much of education is sort of short-circuiting experience. And I think, Bryan, you were telling me about Canadian engineers wearing the iron ring. Am I remembering that correctly? I remember vaguely the history of that, but,

Speaker 1:

Oh, I mean, any Canadians here, and I'm sure there are plenty, they've got O Canada going through their heads right now. I mean, this is the iron ring that engineers wear, from the collapsed bridge in Quebec. And it is putatively made of this collapsed bridge, to remind all engineers of their obligations to society. And there is a terrific podcast on exactly this that I'll drop in the channel. But yeah.

Speaker 1:

So, no, this is an important part of Canadian engineering education, and you wear that iron ring. Yeah. I mean, are you suggesting that software engineers should wear one?

Speaker 2:

We should have... we need, yeah, the iron ring of unsafety, right? The kind of watershed moments that kind of short-circuit that experience. Right? Because that's the point of this. Not that everyone needs to build a bridge that falls down, but rather everyone should learn from this historical moment.

Speaker 3:

Should I have memory safety challenge coins?

Speaker 4:

It's been 5 years since I've written unsafe code.

Speaker 3:

People will do anything for challenge coins. That's that's

Speaker 1:

That's... okay. No. This is great: we need memory safety challenge coins. And then, so, yeah, because it feels like challenge coins seem like a bigger thing in the public sector. Am I making that up?

Speaker 1:

Wait. Can you describe a bit of, like, how challenge coins are used? Because

Speaker 3:

I don't think I realized that they were, like, super special until somebody got mad at me, because he said he would trade me something for one. And I gave him some stickers and took the coin, and he yelled at me at DEF CON, like, many, many years ago. And I'm like, oh, this is, like, important and special. But I don't know.

Speaker 3:

I feel like, the ones I've gotten, it's for, like, you actually have to do something to get them, like either actually solve a challenge or, like, I've gotten them for speaking. But I don't know. Like, people collect them and display their coins, but then people will just give them out sometimes too, where they're like, I don't want to take these home, or they just kind of randomly hand them out. But I know it has military origins. I know it's, like, used differently in the military, where, like, there's a hierarchy of coins, and, like, you have to buy a round.

Speaker 3:

If you pull out a coin, then somebody has to buy you a drink. But if somebody has a higher coin, then you have to buy everybody drinks or something. Like, I don't know.

Speaker 4:

Oh, it's a PVP MMO. I understand that.

Speaker 3:

But I haven't gotten one in, like... I have a little collection, and I'm like, I think I need to get a few more to be able to display it. And, like, I don't know. And then I also feel guilty when people just give me one when I hadn't earned it. They're like, oh, we had challenge coins for everybody who solved this really hard crypto challenge, but, you know, we don't feel like taking them home, so here, take one. Like, I didn't earn the coin.

Speaker 1:

I do love the idea of Steve having Ferris challenge coins that he can give out to those that he feels deserve them. And then, you know, what would we have, kind of, you know, to, like, protect safety or something? I'm trying to figure out, like, what is gonna be the embossment on the Rust challenge coin? I think it's a good idea, though. I mean, you need those kinds of incentives.

Speaker 1:

Right? You think, like, you know, what is the origin of that? It's like a small token, but it's meaningful to people because it represents something substantial that they've actually done. Like, can we do things like that? And, I mean, Adam, to your point about the iron ring, it's like, the iron ring is meaningful.

Speaker 1:

And what can we do that would be meaningful to say, hey, yes, you have made this commitment to the craft, to implementing in a safe language. I mean, like, maybe there's an oath. I don't know.

Speaker 1:

We're gonna

Speaker 2:

We'll workshop that.

Speaker 3:

Good point, though, about incentives instead of just getting yelled at for writing in C.

Speaker 1:

Yeah. I mean, I think you do. I think that you need some incentives, and even from a regulatory perspective, I think you need to structure that as incentives. I mean, we just went through this FCC compliance exercise, which we talked about last week, and which was really interesting and educational for me.

Speaker 1:

Part of what I actually like about that is it's a bit of a nuanced process, because they know that, like, there's no practical way for the FCC to audit everything that you ship and put everything that you ship in a chamber. It's like, that's not gonna work. So they've gotta find a different kind of process, where they've got these other private bodies that they trust, and then those evaluate you. And it felt like a more nuanced process, but it didn't lose its kind of quantitative artifacts.

Speaker 1:

It's like, you have to achieve these kind of quantified goals. So I'm not sure if that's an analog or not, but it does feel like there is some kind of public-private partnership that is possible here.

Speaker 3:

I totally wonder too if, like, I think transparency has a big role. Like, say you bring your percentage of memory safety bugs down from, you know, 70 or 80 percent to a lower level because of this implementation. Nobody's going to know that if you don't make those statistics public. And just posting statistics, like we've seen, is kind of meaningless, when people are like, oh, I posted publicly that, you know, 90% of our staff are white men or whatever. Like, just posting that isn't enough. Yeah. So interesting.

Speaker 1:

Yeah. That goes to the earlier point about metrics. But it feels like, I mean, I think all of this starts with getting more public attention on this issue, and allowing more people to kind of appreciate this. Hopefully we're getting Rust brought up at more Super Bowl parties. So, Steve, good going on your alt there. But, yeah, I mean, the work you're doing is really essential here, and thank you very much for being willing to get a bunch of practitioners in the room and find a forum where they could kinda give you unvarnished information, and then you could collect that.

Speaker 1:

And, you know, you kind of self-disparagingly said that it was not capital-J journalism. But this is a great piece that you've got here. And I would just, I think on behalf of all software engineers, thank you for it, and encourage you to explore more in this vein, because there's a lot of good... Yeah.

Speaker 3:

And if people have ideas about what we should be doing, and some of the things we talked about on here, I definitely would love to hear them.

Speaker 1:

Well, I think mediocre memory safety has gotta be... that would be my challenge. I will give you this Rustacean challenge coin if you can give a talk on mediocre memory

Speaker 2:

safety. Talk.

Speaker 1:

I think it'd be great. Well, Yael, thank you very, very much for joining us. We really appreciate it. And hopefully you can come back, and, you know, good luck with your future work. Are you gonna continue to do work in this kind of vein?

Speaker 1:

I mean, it's just been have you gotten good feedback on this? Is this stuff you're gonna do more of?

Speaker 3:

Really great. Like, we got more of a response than I expected. So I definitely wanna keep working on this, and of course other things like it, and yeah. Thank you so much for having me, you know. I don't know. This is really cool.

Speaker 3:

I listened to your podcast last week and I was like, wow. Like, I don't usually get that kind of feedback. So it was pretty cool.

Speaker 1:

Did you feel that I say god bless too frequently? Because there are certain people in this room, who remain nameless, who believe that I say god bless, what would you say, too frequently. You know what?

Speaker 2:

It's just frequently. Frequently.

Speaker 1:

I think, okay. Okay. Technically, you merely asked me the question, do you know how frequently you say god bless?

Speaker 2:

Tonight, you've moved to good on you, and that's not escaped my attention.

Speaker 1:

I often say good on you. Is that, in my view, another one of these things that I'm the only one that says?

Speaker 4:

I think I've maybe heard another person say it before probably.

Speaker 1:

You've heard 8 other persons. Well, good on them.

Speaker 4:

I'm not sure how good

Speaker 3:

on them.

Speaker 4:

the quality of the audio was.

Speaker 1:

You all.

Speaker 4:

I I don't think that they sounded like a millennial podcast when they were saying it, though. So, you know, really.

Speaker 3:

Yeah. Exactly. Just happy you didn't ask me any questions about unicorns.

Speaker 1:

So that's my feedback. Exactly. Yeah. God bless unicorns, and good on you, Rust. Alright.

Speaker 1:

Thank you so much. Thanks everyone.