Oxide and Friends

Oxide and Friends Twitter Space: September 20th, 2021

Theranos, Silicon Valley, and the March Madness of Tech Fraud
We’ve been holding a Twitter Space weekly on Mondays at 5p for about an hour. Even though it’s not (yet?) a feature of Twitter Spaces, we have been recording them all; here is the recording for our Twitter Space for September 20th, 2021.
In addition to Bryan Cantrill and Adam Leventhal, speakers on September 20th included Land Belenky, Toasterson, Cole Frederick, and Simeon Miteff. (Did we miss your name and/or get it wrong? Drop a PR!)
Some of the topics we hit on, in the order that we hit them:
  • John Carreyrou on Theranos 
    • “Bad Blood: Secrets and Lies in a Silicon Valley Startup” 2018 book
    • “Bad Blood: The Final Chapter” podcast as the trial proceeds (announcement), on Apple Podcasts, Spotify
  • Cole’s tweet linking to a ~5min video of a would-be Theranos competitor commenting on its collapse > The lone inventor is a dangerous impression to give people.
  • Related: Brian Fitzpatrick and Ben Collins-Sussman “The Myth of the Genius Programmer” 2009 talk ~55mins
  • [@9:47](https://youtu.be/YWdk9CKML2g?t=587) Companies that drive scientific people nuts 
    • uBeam “claims to be developing a wireless charging system to work via ultrasound. Scientists have strongly criticised the plausibility under physics of this proposal.”
    • uBiome > To innovate, you have to balance the world as it is with the world as it isn’t.
  • [@13:44](https://youtu.be/YWdk9CKML2g?t=824) Theranos’ fantastical vision. European attitudes around business and innovation. 
    • PCR, the polymerase chain reaction, invented in 1983 by Kary Mullis.
  • [@18:39](https://youtu.be/YWdk9CKML2g?t=1119) Fake it till you make it? 
  • [@23:57](https://youtu.be/YWdk9CKML2g?t=1437) Whistleblower Avie Tevanian. Smoke and mirrors, giving the board the run around.
  • [@29:05](https://youtu.be/YWdk9CKML2g?t=1745) “Everyone was relying on someone else to do their due diligence” 
    • Tech risk, venture capital
    • Cerebras Systems wafer scale processors
    • Ellen Pao NYT editorial “The Elizabeth Holmes Trial is a Wake-up Call for Sexism in Tech”
  • [@35:20](https://youtu.be/YWdk9CKML2g?t=2120) Software cure-all 
  • [@40:14](https://youtu.be/YWdk9CKML2g?t=2414) Founding myths 
  • [@44:06](https://youtu.be/YWdk9CKML2g?t=2646) Tesla “Autopilot”, Uber self driving 
    • Anthony Levandowski > Judge Alsup: This is the biggest trade secret crime I have ever seen. > This was not small. This was massive in scale.
  • [@48:21](https://youtu.be/YWdk9CKML2g?t=2901) March Madness of Silicon Valley Fraudsters 
  • [@59:02](https://youtu.be/YWdk9CKML2g?t=3542) Levandowski jeopardizes employee 
  • [@1:04:35](https://youtu.be/YWdk9CKML2g?t=3875) Warning signs of fraudulent companies 
    • Transparency, celebrity boards
    • Optane
    • Inconsistency between board and leadership on what the coming milestones are
If we got something wrong or missed something, please file a PR! Our next Twitter space will likely be on Monday at 5p Pacific Time; stay tuned to our Twitter feeds for details. We’d love to have you join us, as we always love to hear from new speakers!

Creators & Guests

Host
Adam Leventhal
Host
Bryan Cantrill

What is Oxide and Friends?

Oxide hosts a weekly Discord show where we discuss a wide range of topics: computer history, startups, Oxide hardware bringup, and other topics du jour. These are the recordings in podcast form.
Join us live (usually Mondays at 5pm PT) https://discord.gg/gcQxNHAKCB
Subscribe to our calendar: https://sesh.fyi/api/calendar/v2/iMdFbuFRupMwuTiwvXswNU.ics

Speaker 1:

Are you going? Is it Theranos o'clock yet? Yeah.

Speaker 2:

Yeah. Yeah. Let's go. Let's go.

Speaker 1:

Oh, man. Alright. So I feel that when I tweeted out "we can resist temptation no longer," I might not have been speaking for both of us. I feel No.

Speaker 1:

I

Speaker 2:

I've been so you I mean, Bryan, you do all the heavy lifting around here, such as we have heavy lifting, and you've come up with a lot of great topics. And, honestly, like, I've been listening to the new Carreyrou podcast and thinking about it a lot, and I tried to kinda hold up my end of the bargain, and I kept on coming back to, let's talk about Theranos. But can we? Like, is that in our lane? Is that

Speaker 1:

too far out of our way? We can, and in fact, we must.

Speaker 2:

Digs there.

Speaker 1:

Okay. So you obviously I mean, I feel like everybody read Bad Blood so quickly after I mean, you read Bad Blood, obviously, when I came

Speaker 2:

to this.

Speaker 1:

Yes. Yes. How long did it take you to read?

Speaker 2:

I mean, like 15 minutes. Right? I couldn't put it I couldn't put it down. It was simply I And and at the end, as I recall, I was also I can't remember. What would when did that come out?

Speaker 1:

2017, 2018? Okay. Okay. Okay. Because, I was driving down Page Mill,

Speaker 2:

pretty frequently at that time period, and I won't go into the detailed reasons why. But passing by Theranos on, like, a pretty regular basis, which which made it a little more delicious.

Speaker 1:

Oh, that makes it much more delicious even. Yeah.

Speaker 2:

Yeah. Yeah.

Speaker 1:

So you're, like, watching all the cars, like, watching the parking lot go from packed to to not so packed.

Speaker 2:

Yeah. Yeah. And and I've been passing it for years. I mean, there was a time when I was, you know, on a daily basis driving down to, you know, 3 blocks away from

Speaker 1:

there. So I had was it I think, I mean, there's a there's a bunch to talk about here. I had this idea that I wanted to have, my former housemate, Tim, and longtime friend join us. You know Tim, Adam.

Speaker 2:

Yeah. Yeah. Yeah.

Speaker 1:

Wait. So, Tim and I went to school together. We lived together. We shared a house with some sketchy roommates in Menlo Park for many years. And Tim is a biomedical engineer.

Speaker 1:

And Tim works in this parallel Silicon Valley that I feel we don't necessarily see that frequently, namely the biotech Silicon Valley that is very much out here. And I feel like we don't really see that much, in part because they are not self-aggrandizing assholes, generally. Or they seem to you know what I mean? Like, I just feel

Speaker 2:

No. Totally.

Speaker 1:

I just feel like they are and, like so I had this idea that, like, Tim could join our Twitter space. And Tim has a doctorate in biomedical engineering. He works for Abbott Labs currently. And I was kind of running this by him. And he's like, yeah.

Speaker 1:

I don't I'm not really on Twitter. It was just like one of these things, like, I need to stop talking. I'm gonna embarrass myself. But have you tried to explain Twitter Spaces to someone who's not on Twitter?

Speaker 2:

No. But no. No. We've talked about this in the past, about how I mean, the closest analog is, like, it's like a conference call with, like, a bunch of listeners. And, like, it doesn't sound that sexy.

Speaker 1:

Right. And Tim's like, don't you have, like, a company and a family? Like, why are you are you choosing to do this, or is someone making you do it? I'm like, no. No. Alright.

Speaker 1:

Look. This is not working. But the fact that, like, Tim's got the self-control to not be on Twitter is exactly why we don't necessarily and you know what? This is all you need to know about Tim. Tim and I both lived, again, with these sketchy roommates.

Speaker 1:

One of whom and you met my anyway, one of whom definitely enjoyed trolling us. She was maybe 20 years older. She had a bunch of social mores that she had a hard time with. Anyway, so the difference between Tim and me is I remember vividly we're in the kitchen, and she's like, you know what? I've been thinking.

Speaker 1:

I really think that that science really is like a religion. Like any other religion. And why don't we treat science like good night, everybody.

Speaker 2:

It's like, I've never thought of it that way. Well, whatever.

Speaker 1:

Right. Cool. Good night. And I'm like, I can't extricate myself from this conversation that I know I shouldn't be in. So that's but I do feel that there is this other side of the biotech Silicon Valley, the Genentech, the I mean, there's tons of companies.

Speaker 1:

I mean, PCR was invented in Silicon Valley. Right? We don't and so I don't feel we've got a very good insight into that. So I I just wanna get into that. I don't know if if if folks are are in, in biotech.

Speaker 1:

But maybe, Cole, I thought maybe you could kick us off with the link that you posted, which I thought was mesmerizing. Maybe you give some more context around, Larry Gold. I thought that was a really interesting talk he was giving.

Speaker 3:

Yeah. Totally. So that's the Gold Lab Symposium in Boulder. The connection there was my ex. Her dad was actually, like, one of their early system architects that did a lot of the computer systems for that company.

Speaker 3:

Anyway, so they have been working for, like, over a decade, like, really hard on the science, to try and make these assays, basically diagnostic assays, and had made some progress, but, you know, it's hard. And then Theranos kinda came and was, like, eating their hype lunch. You know, suddenly they were, like, kind of in the same space and claiming just truly incredible results, and kinda made them look a little bit bad. Anyway, so they were kinda watching this and saying, man, how are they doing this?

Speaker 3:

You know, how could this be possible? Like, their science must be incredible. And then it turns out it wasn't. And they were justifiably kind of vindicated in their kind of, like, long-road approach. You know, Larry Gold often talks about, you know, needing to work on things, like, for a good 20 years, kind of in the space, to really, like, make big societal changes.

Speaker 3:

And so, with this kind of, like, Johnny-come-lately approach raising tons of money, they definitely had some Schadenfreude, I guess, in seeing them fail. So it was kind of an interesting thing.

Speaker 1:

So yeah. And I think there's a bunch in there. One is this idea that I mean, science takes a long time, and hard problems take a long time. And they also require a lot of people. Like, it's not single people. It's really hard for a single person to be and I feel that, like, that is not something that is well understood outside of these domains.

Speaker 1:

I feel like we give people the impression that kind of, like, single people move the world forward and from a technical and scientific perspective, and I'm not sure how accurate that is.

Speaker 3:

I think a lot of times in scientific things, I mean, you look at, like, their papers and whatnot, and there'll be, you know, 10, like, primary researchers. And that's just, like, how many people it takes to make interesting progress in a lot of the kind of medical and, you know, biological sciences. So it does feel very starkly different from computing and, frankly, startups around computing.

Speaker 1:

But I don't think see, here's what I wonder. I don't think computing is all that different in that regard, in that, like, it takes a lot of people to solve a hard problem. And I think there's a danger whenever we give people that impression. The kind of lone inventor, I think, is a really dangerous impression to give people.

Speaker 3:

For sure. Kind of that, like, myth of the, whatever. You know, kind of the the

Speaker 2:

Yeah. This this cult of the hero entrepreneur.

Speaker 3:

The hero. The hero programmer. Yeah.

Speaker 1:

The yeah. And I think it's because when we give people that misperception that it is kind of single individuals, then you don't do your homework when you have this single plausible-sounding individual. I mean, there's a degree to which you've been immunocompromised to people telling lies if you're gonna believe that single people are able to advance science.

Speaker 3:

Yeah. For sure. I I think, you know, kind of, it reminds me of kinda like Linus Torvalds and kind of some people like this where we kinda kinda think like, oh, they invented it all out of nothing. Like, there was nothing, and suddenly there was Linux. And it's, it's not true.

Speaker 3:

It's always really building on other things. But we can definitely kinda get caught up in these, like, cult of personality. We kinda, like, we would want one singular person to kinda lay our praise on almost.

Speaker 1:

We do. I think Yeah. I think you're right. I think we we we do seem to have, like, a bias for it. And I can understand, like, the the I mean, these charismatic I mean, this is where, like, this is the danger of charisma.

Speaker 1:

Charisma can get people to believe things, which is great, but it's also really dangerous when what they're believing is just wrong, is not supported by the science. So, Cole, did you encounter Theranos before this whole thing broke? Had you No. Okay. So you No.

Speaker 1:

Not at all. Because, definitely, the folks that I knew that had I mean, it was definitely on a list of companies that drive scientific people nuts because and it's not the only one, I would like to say. So, I I mean, I don't know. Do you have, like, a list of these? Because I've got I I do have, like, a list of companies.

Speaker 1:

Okay. So do you remember uBeam? Oh, no. uBeam was going to do charging of batteries via ultrasound.

Speaker 2:

Wow. Okay. Is that a thing?

Speaker 1:

No. It's not a thing. No. It's like and this is just like look. You know, I'm a software guy, but come on.

Speaker 1:

This is odd. Like, no. That doesn't make any sense.
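(A rough back-of-envelope sketch of why the physics crowd balked at through-the-air ultrasonic charging. The numbers here are illustrative assumptions, not uBeam's actual specs: roughly 115 dB SPL as an airborne-ultrasound exposure ceiling, and a phone-sized receiver.)

```python
# Back-of-envelope: how much acoustic power could an airborne-ultrasound
# charger deliver at a plausible exposure limit? All values are assumptions.

SPL_DB = 115.0      # assumed exposure ceiling for airborne ultrasound, dB SPL
P_REF = 20e-6       # SPL reference pressure in air, Pa
RHO_C = 415.0       # characteristic acoustic impedance of air (~1.2 kg/m^3 * 343 m/s)
RX_AREA_M2 = 0.005  # assumed receiver area, ~50 cm^2 (roughly a phone's back)
CHARGER_W = 5.0     # ordinary 5 W phone charger, for comparison

p_rms = P_REF * 10 ** (SPL_DB / 20)   # ~11 Pa RMS pressure
intensity = p_rms ** 2 / RHO_C        # plane-wave intensity, ~0.3 W/m^2
incident_w = intensity * RX_AREA_M2   # power hitting the receiver, before any losses

print(f"incident power: {incident_w * 1e3:.1f} mW")            # ~1.5 mW
print(f"shortfall vs. charger: {CHARGER_W / incident_w:.0f}x") # ~3000x
```

Even before transducer and conversion losses, the incident power is three orders of magnitude short of a basic charger, which is roughly the objection the scientific critics raised.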

Speaker 4:

Wait a sec. Wait a sec. Wait. Wait, Brian. Say that again.

Speaker 1:

You heard correctly. The premise was well, and I think, again, it's just kind of, like, a convenient idea that, like, boy, I don't wanna plug my device in. So wouldn't it be great if I could charge my device from across a room? And it's like, okay. But

Speaker 4:

Wait. Are are you sure that's charging or is that detecting charge state? I Because, my my my older brother who who you know well, he works for, for a battery company, and I used to work for an ultrasound company.

Speaker 1:

And we

Speaker 4:

actually had someone come in and propose

Speaker 1:

Doesn't make any difference.

Speaker 4:

The ability to test charge state with ultrasound, which actually makes sense.

Speaker 1:

Okay. I feel like the messenger is about to be shot at here. So, just as the messenger, I wanna say that, no, I am not misrepresenting it. That is what they actually wanted to do. And if you look into this company at all I mean, they have realized that it doesn't work, but after having raised tons of money.

Speaker 4:

What what was the name of this company?

Speaker 1:

uBeam. Lowercase u.

Speaker 4:

So, like, MicroBeam.

Speaker 1:

I guess. I'm not sure what the and okay. Adam, I got another one for you. uBiome. Do you remember them?

Speaker 2:

No. Man, you've been hanging around some

Speaker 1:

Okay. Yeah. Okay. Yeah. No.

Speaker 1:

This is I am okay. Now wait a minute. Am I the only one? Do I collect these? Now that I'm being I gotta listen to these too.

Speaker 1:

Like, we're not even you remember uBiome? No. Okay. These are the people that were going to take stool samples I don't

Speaker 4:

mind the canal.

Speaker 2:

I do remember this.

Speaker 1:

Okay. Thank god. God, I was, like just I'm describing it, I'm like, now okay. Now I'm gonna really see this.

Speaker 2:

Dream? No. No. I remember.

Speaker 1:

And they were just totally crooked. A hundred percent.

Speaker 3:

Shut down for insurance fraud, it says.

Speaker 1:

Yeah. It's a hundred percent crooked, and it's the same kind of thing, where what they were trying to do is, like, you know, there are actually established industries that do this. And you're being kind of willfully ignorant of them because you find the constraints of the problem inconvenient. And then you manage to raise a lot of money on it, which is the real that is kind of the shocking thing. And, you know, I do feel like the challenge and I don't know.

Speaker 1:

You've heard me say this before. But, like, I feel like to really innovate, the challenge is that you need to balance the world as it is with the world as it isn't. You know what I mean?

Speaker 2:

Yeah. Yeah. I mean, you do I saw something of, scientific inventors being asked about science fiction. And the quote was saying, nothing's ever been invented that no one imagined first. So, like, having that imagination of what could be is certainly important.

Speaker 2:

But but you're right. It needs to be moderated by some dose of reality.

Speaker 1:

Well, you almost have to, like, overcorrect on reality. Especially when the vision is compelling, you've gotta, like, now we need to really go to reality. We need to be even more reality focused. And I think that is well, in these companies, they lose that tethering to reality. Or worse, I think in Theranos' case, they suppress those who do understand the reality.

Speaker 1:

I mean, they're just I mean, I don't know, Adam. Did you have any so I certainly know that my wife, I think, was a little bit frustrated when I was reading Bad Blood, because I was exclaiming so frequently while reading it.

Speaker 2:

Yeah. No. I mean, I I I had the same kind of visceral reaction to that. And I and I think in particular, some of the, you know, the imagineering that went into it. The, you know, because Theranos didn't start as this pinprick testing.

Speaker 2:

It was a patch that you'd slap on and it would diagnose and cure you. And it was it was almost, you know, childlike in its in its sort of like fantastic view of what could be.

Speaker 1:

Yeah. I guess I'd forgotten that. That that was Yeah. It was a patch. It was a curing patch.

Speaker 1:

I want one of those.

Speaker 2:

Yeah. Yeah. It was a curing patch. No. And and and it's like

Speaker 5:

How many people how many people are actually working in the valley with theory of innovation versus theory of science? Because I know in my studies in Europe, we had a lot of theory of how to take something existing and basically innovate on that.

Speaker 1:

Yeah. I do think and maybe this is what you're getting at that oftentimes, there's this idea of, like, we are gonna destroy the old way of doing it, and the experts are wrong. It's like, well, that may just be why they're called experts, actually. The experts are not actually wrong. And one thing I've been haunted by, Adam, is, you know, a couple weeks ago, we had G. Pascal Zachary talking about this East Coast versus West Coast thing.

Speaker 1:

And these were like the East Coast companies were really, like, grounded in making real things. The West Coast companies were kind of, like, flighty, basically. And I'm like, wait a minute. I've always kind of felt myself as really, like, grounded, but I've been out on the West Coast for my career. Like, am I on the wrong coast?

Speaker 1:

I you know, if I they got it up. Maybe I'm

Speaker 2:

No. I mean, I think think there is something like would you

Speaker 1:

I mean, when you explain

Speaker 5:

My study material in sorry. In my study material, I've seen a lot of whenever it was about innovation, that somebody's wrong, it was all about the business process is wrong. It was never about the technology is wrong. If I see somebody go the science is wrong, it's usually like 20 to 30 years until they have proven it enough to actually claim what is actually wrong. This is what I have kind of seen in my study material.

Speaker 6:

I think that there's something there's a connection there between West Coast culture and science. So I used to hang out in the Berkeley Lab cafeteria with summer students who would come in the summer and spend 2 or 3 months at the university lab working in somebody's lab on some kind of scientific experiment. And I'd ask them, you know, what is it like compared to Italy or Germany or wherever you came from? And, invariably, they'd always tell me the same thing, which is: in Europe, you have to conclusively prove that what you want to do is not going to fail, whereas what I'm finding awesome here, you know, in Silicon Valley, is the attitude is: try it. If it fails, that's okay.

Speaker 6:

I guess that's the other side: there is an advantage to that risk appetite, but, obviously, it goes wrong sometimes.

Speaker 1:

No. That's a very good point. There's totally an advantage to that risk appetite. And I think that the you know, I think it's a very good point too about, like, when people are trying to disrupt the the science versus trying to disrupt organizations or businesses or or or what have you. I mean, I think it's actually just on points, I mean, with with what you're saying.

Speaker 1:

I was reading a quote from the inventor of PCR, the polymerase chain reaction, which is, like, extremely important, a very big breakthrough. And he was a big advocate of, like, that the breakthrough came from playing around, and that he never would have gotten kind of funded for kind of rigorous inquiry into this, because it was not something that people really thought to be possible.

Speaker 5:

Well, a side note to this: we had in studies, like, I've studied in Europe, so we had, as an innovative school, really extensive psychological training in how failing can be a good thing, to the point where we had classes on things like improv theater and so on.

Speaker 1:

And that, I think, is great, but it's like, the objective is ultimately not failure. You know what I mean? It's like, I do feel that we need to somehow be unafraid to experiment and unafraid of the social consequences of that failure. But at the same time, we need to actually like, if we wanna make this stuff real, we need to understand why technically it's not working.

Speaker 2:

Well, you're getting into the fake it till you make it aphorism. Right? I mean, I think that's kind of at the crux of it: how long you're able to kind of maintain that optimism and paint that picture of what could be, like, the imagination, but then moderated by reality. And when you call it quits, when you say, actually, we're not achieving this goal, it becomes a lot harder when you've got a billion dollars behind you betting that it's gonna work.

Speaker 1:

Right. And so alright. So, Adam, that's the question. So do you I mean, do we need to fake it until we make it? Is that somehow endemic in innovation?

Speaker 1:

You know what? I mean, not

Speaker 5:

Not necessarily. The point is what you want is the positive psychological effect of not getting absolute anxiety if things fail. Like I've seen the opposite effect of people where it's a culture the opposite culture where people really get anxiety when you try to make something new. So how can you combat the anxiety without trying to fake it?

Speaker 1:

Yeah, I think that that's it. Sorry, Adam, you were saying about

Speaker 2:

Oh, just, you know, I think it's so tough because, you know, fake it and make it are such absolutes in a sense. But then, you know, in all of these conversations, whether at Oxide or at previous companies I've been in, there is always that selling the vision. Right? Going to customers and not telling them strictly what you have, but also what you could build and how that would work. And it's not faking it.

Speaker 2:

You know, it it it's clearly not making it. We haven't made it yet.

Speaker 1:

Yeah. So my view on that is, like, anyone, Oxide, anyone, you just gotta be completely transparent about where you are and where you aren't, even if that's not what people want to hear. The problem is people want to hear that you're done. Right? Or that you had the breakthrough, and that's not always the case, or often not the case. And I feel that you've got to be transparent about where you are and where you're not.

Speaker 1:

And it would simply never occur to me that you would do anything else. I just feel that, like, despite its other faults, Sun was always relatively honest. Relatively honest in this regard. Now I'm thinking of all the kind of, like, the frameworks that we're gonna end all the frameworks and right? We, never mind.

Speaker 2:

You've got picoJava We're honest with everyone except for ourselves.

Speaker 1:

That's true. That's right. But, you know, and Adam, I can't remember if I told you this, but, like, after I joined, with the CEO who was fired sometime thereafter, I was in a board meeting with him, and he is referring to all this stuff as completed that was very much not. And after the board meeting, I'm like, what the fuck was that? Like, you and I both know that we haven't done all those things that you just told the board that we have done.

Speaker 1:

He said, Bryan, have you ever heard of the optative voice? Like, where is this going? It's like, in Greek, there is the optative voice. And in the optative voice, you refer to something in the future as if it's in the present. And I'm like, what the fuck is that?

Speaker 1:

Is that that is the worst explanation for lying. I mean, it's like, you're just lying. Like, you don't have an optative voice in English. There's nothing in the verb conjugation. Like, the board has we've lied to the board.

Speaker 1:

You have lied to the board. And I feel like that is but he definitely had this idea that, like, I can't tell people the truth because the truth is, like, bad news. And, god, I've definitely seen this in my career too, where people start off with something that is, like, just kind of an exaggeration of where we are. And then that exaggeration builds on itself: we exaggerate on the exaggeration.

Speaker 1:

And you get to the point where it's like, okay. Now we're nowhere near where we actually are. And now we've got, like, a big lie that needs to be undone. And my belief has always been, like, we should be and that's why I kinda think and, Cole, I think this kinda came up either in your tweet earlier or I saw someone else talking about this. The secrecy of Theranos, to me, is one of many, but that should have been a red flag to a technologist working at Theranos.

Speaker 1:

You when a company has enshrined secrecy to that level, especially one that's working on a hard technical problem, I think you really need to ask the question, why are we being secretive about it? Mhmm.

Speaker 2:

Well, I mean but you've got Apple and then, you know, obviously, with Elizabeth Holmes' hero, Steve Jobs. But I think that the company, much more than Apple, that Theranos resembled was NeXT. And NeXT, obviously, was not putting people's lives or livelihood at risk to the same degree. But, you know, reading about the history of NeXT, it really paints a very different picture of the hero entrepreneur Steve Jobs.

Speaker 1:

It totally does. And Jobs I mean, I know have you listened to the Avie Tevanian episode of Yes.

Speaker 2:

Yes. Yes. So this is Bad Blood: The Final Chapter, where John Carreyrou is discussing kind of new facts in particular as they've come up at trial.

Speaker 1:

Which is great. And, you know, I'm so glad that you are as strung out on this thing as I am. This is an episode that came out, like, 36 hours ago. But I was, like, looking forward to household chores over the weekend so I could listen to the check that out. But so they talked to Avie Tevanian, who was an early whistleblower.

Speaker 1:

He invested a million dollars into Theranos, was on the board, and realized that, like, this is bullshit. We're just, like, we're seeing excuses, not seeing the actual demos, the technology. And I thought he was interesting because they talked about Steve Jobs in particular, and he contrasted Jobs to Elizabeth Holmes.

Speaker 2:

But to a degree, I found that his I mean, so Tevanian was at Apple and at NeXT, and obviously, like, a close associate of Jobs. He drew comparisons, but I thought there was a lot of white space that he wasn't mentioning. So, like, you know, first, there was a lot that he was saying about you know, he, on the board of Theranos, described deals and partnerships that were being talked about as being imminently done. And at the next board meeting, they were talked about again as being imminently done. And that certainly felt familiar to me from, you know, when I was sitting in the boardroom at Delphix when I was the CTO of that company.

Speaker 2:

And I'm sure, Bryan, that sounded very familiar to you from your Joyent boardroom experience.

Speaker 1:

Yeah. Well, it's just, like, the way you kind of how do you speak of a deal?

Speaker 2:

That's right.

Speaker 1:

How do you represent a deal?

Speaker 2:

What's the degree to which, you know, you're being optimistic and presenting that optimism to the board, and also feeling, you know, when things were not going well, as obviously they weren't at Theranos, and there must have been some realization at some level that they weren't, the need to invent proximity to revenue or to closed deals or to product completion or those kinds of things. But then where it really diverged was Tevanian's story of smoke and mirrors. I mean, really, really Wizard of Oz stuff. Well, I love when he

Speaker 1:

was like, hey. I'm on the board. You're you're in this deal. You're in this negotiation that's going on for, like, way too many board meetings. Can I, like, just I'm just curious?

Speaker 1:

Like, what's what's the latest on where we are with those negotiations? And have you gotten to this point in the podcast, Adam? Where

Speaker 2:

Yeah. Send me the PDF. Right.

Speaker 1:

Right. You send me the PDF. And she's like, oh, no. I can't because, like, it's illegal. It's like it's illegal.

Speaker 1:

It's like

Speaker 2:

it's like of course, it's illegal.

Speaker 1:

Of course, it's illegal.

Speaker 2:

It's like it's like a legal document that we're

Speaker 1:

like, lawyers on both sides reviewing, presumably. Also, I'm a fiduciary of the company. Like, I get to see whatever I want as a board member. Certainly, that has always been my view. It's like, this is like we should be and I also kinda feel like, you know, I don't know if you talked about this explicitly, but I always feel like my litmus test for a CEO is how do you deal with bad news?

Speaker 1:

Because as a CEO, like, you have to deal with bad news. And, like, how do you deal with that bad news? I feel like Elizabeth Holmes does not deal with bad news very well. No.

Speaker 2:

But in that respect, I mean, Tevanian describes Jobs as being receptive to

Speaker 1:

to Yes.

Speaker 2:

His pushback, but not generally. I mean, and that is both the reputation, but then also, you know, at NeXT in particular, him being totally immune to the facts or the truth.

Speaker 1:

And then what did you think about Larry Ellison lurking behind the curtain?

Speaker 2:

I I I'm glad that some of his money went into Theranos.

Speaker 1:

Not enough though. You know? Because when you look at I mean, the delightful link that Cole sent out. I mean, this guy, I really admire. I mean, of course, because this seems like a very earnest boots-on-the-ground scientist who's dedicated his life to a company that's kinda being, like, paved over.

Speaker 1:

As you said, their hype is being stolen. And so he actually shows a list of the investors in Theranos. And he's like, you know I'm not up here to, like, laugh at these people, even though it's, like, very clearly why he's showing you the slide. But you look at who the big investors were, and they were not tech investors. You know, it's like Rupert Murdoch and, you know, Betsy DeVos and, you know, these folks that really did not and they're investing a colossal amount of money.

Speaker 1:

Oh my god. I mean

Speaker 2:

Yeah.

Speaker 1:

Putting it like, Murdoch put in, like, what? 120 or something like that? 150?

Speaker 2:

Well, and then to the degree that they were I mean, some of them were still investors, and tech investors, but not biotech investors.

Speaker 1:

Well And, actually,

Speaker 2:

again, again, like NeXT, there was that sort of familiarity with Ross Perot, you know, like, writing a blank check.

Speaker 1:

Okay. So, actually, this is another good point. The kind of Ross Perot of Theranos because everyone was relying on someone else to do their due diligence. And it all goes back to that professor at Stanford who everyone was kind of deferring to, ultimately. All paths kind of led to him.

Speaker 1:

But just like, how does one do do because, like, the the folks that I knew in Silicon Valley who looked at Theranos were like, no. This is obviously we're obviously not investing in this. This is this is not this is not this can't work. So I there there there's a certain argument to be said that, like, actually, we you you can blame lots of things for Theranos, but you probably should not blame venture capital. That's or it's not that simple anyway.

Speaker 2:

Which is not to say that venture capital doesn't have tons of problems, just perhaps not this one.

Speaker 1:

Well, I'm just gonna scroll down to see if our investors are on the call before I okay. I'm kidding. I'm kidding. I know. And actually, you know, I do feel that, honestly, most venture capitalists don't actually want tech risk, and this was clearly a lot of tech risk.

Speaker 1:

And so it was kinda natural that it was only gonna be a fit certainly for biotech investors. But I think for those that do embrace tech risk as a differentiation, I think those investors are very upfront about what it takes to solve a hard problem. Refreshingly so, honestly.

Speaker 2:

Yeah. But I'm not even sure the degree to which Theranos was presented to these investors as the tech risk that it actually

Speaker 1:

was to the investors who actually invested. I mean, they felt that they were investing in something that was effectively a done deal. And the more money that was in that company, the more people thought, like, all these people can't possibly be wrong.

Speaker 2:

Yeah. You're right. That that circle of trust that actually had no fundamental, like, you know, connection. That that it was never grounded anywhere. Right.

Speaker 1:

And I think that you gotta figure out, like, when you because this is the challenge. We got these big breakthroughs because, you know, we've got a sibling company of ours, funded also by Eclipse, Cerebras, that is doing wafer-scale silicon. And I mean, that is an amazing company, an incredible technical breakthrough. But there would've been a lot of people, and I feel I would've been among them, to say that wafer-scale silicon has all these technical issues that are not gonna be really resolvable. And, you know, you can actually solve hard technical problems as a startup.

Speaker 1:

It's just unusual.

Speaker 2:

Yeah. It's interesting because you're right that Theranos, at least ostensibly on its face, did have hard technical problems underpinning it. Whereas, you know, Ellen Pao had this,

Speaker 1:

this, editorial in the

Speaker 2:

New York Times recently

Speaker 1:

Yeah.

Speaker 2:

Where she was contrasting, you know, WeWork and, Uber where there were I mean, certainly, there were familiar problems in those companies, but also different in these regards. In in regard that there was some, you know, ostensible technical innovation, at at least that was the dream.

Speaker 1:

I also feel that the Ellen Pao piece erred in that. I understand what you're trying to say. Terrible headline. The piece was actually not as bad as the headline. But I feel like it also erred in that, like, you've got a regulator present here.

Speaker 1:

And when you've got a regulator present, if it's FAA, if it's FCC, if it's FTC, if it's FDA in this case, and you were gonna defraud that regulator, to me, like, you're out of the WeWork, despite its many other failings, did not attempt why am I defending WeWork here? But that's the gravity of the problem. There is no I mean, there's no regulator per se. This is not life and death. It's office space.

Speaker 1:

That's right. And Theranos is life and death. You know? And it's, like, it's amazing how badly it worked.

Speaker 2:

Yeah. Which is even yeah. I mean, it it it's amazing how many people were working at Theranos and how, like, broadly that must that secret must have been known. Right? Because you're you're working in a place where fundamentally none of the technology is achieving any of what you want.

Speaker 2:

And as as real scientists by and large, you know that you're circumventing all of the known processes and all of the science and all of the scientific methods that you have been taught through your, you know, 20, 30 years of education.

Speaker 1:

Yeah. And we got it. The episode where they describe going into the lab and discovering that it's like a chicken coop. And what was it they ended up blowing through the building? I can't remember if it was a reagent, but they ended up basically having this, like, lab error where they ended up blowing this substance through the building that is infamously hard to get out in any kind of biotech setting.

Speaker 1:

I'm sorry. Do you actually know what I'm talking about?

Speaker 2:

No. No. I don't. I'm sorry.

Speaker 1:

Keep keep humming a few more. Yeah. Exactly. Right. Exactly.

Speaker 1:

Damn it. I know you. That's why you're just watching me drown in front of you. Yeah. Yeah.

Speaker 1:

Fine.

Speaker 2:

Yeah. No. Sorry. Was that that was in the podcast? I

Speaker 1:

don't know. Yeah. That was in a podcast. But just to what you're describing about how people who are coming into this are coming from other domains, and they know this isn't the right way to do it. And I think that the I mean, god.

Speaker 1:

It must have been so difficult. I mean, it was difficult for those people. They would wash out and then clearly there was the suicide, which is just, I mean, horrific. Yeah. Absolutely horrific.

Speaker 2:

But then Sunny deciding that this was actually a software problem that, like, we see all around us, of saying that all of these things are software problems. And that, you know, if only they could gather data, gather telemetry, and do productive machine learning, then the cost then, you know, patient outcomes would something something.

Speaker 1:

Yeah. I know. I think you there's definitely something to this because I also feel like there's this idea that software has this incredible trait that it's malleable after it's done. You know? And Yeah.

Speaker 1:

That leads us down this primrose path where we ship it before it's ready. Because, oh, we can just, like, upgrade ourselves into a functioning system. And that's powerful but dangerous. You know? Because, like, the one that I look at as an example of where I'm worried that software thinking is infecting other domains is the 737 MAX.

Speaker 1:

Mhmm.

Speaker 2:

Right.

Speaker 1:

Where that just feels like a fake it till you make it, move fast and break things kind of thing, from a company that's been developing aircraft for, you know, whatever it is, 80, 90 years. And it's like, I feel like is software making you worse? I'm so sorry. Are we vectors for some terrible disease that we're spreading?

Speaker 2:

Don't you feel that way all the time when you get your new Internet-enabled toaster oven and, like, it's down because of a software update? Don't you feel like this is the opposite of progress?

Speaker 1:

It does feel that way. And it does feel like we're still, like, grappling with this other thing that is software, that is neither pure information nor pure machine. It's this, like, paradox in the middle, and I feel like there's all these, like, outgrowths of us not understanding that. And one of them is the 737 MAX, and another is Theranos thinking that, like, software is magically gonna solve this. It's like, no.

Speaker 1:

This is a physical problem. You're not gonna solve a microfluidics problem with software. You know? I feel they had that idea.

Speaker 2:

I I think as ludicrous as it sounds, I think that was almost verbatim the the operating plan.

Speaker 1:

Was to pull signal out of what is basically it's like it's interstitial fluid. It's a it also, can I just get this off my chest? Like, I hate pinpricks. I hate finger sticks. Like, I this is a part of just, like, the business plan that I didn't understand.

Speaker 1:

Like, I would do a blood draw before doing a finger stick. Am I the only person who feels that way? I don't know.

Speaker 2:

I don't know. I mean, I think the invented genesis of Theranos being Elizabeth Holmes being afraid of needles, you know, I think that one really sticks in my craw too. The, you know, kind of not enough to have the hero entrepreneur, A

Speaker 1:

Point of vulnerability. Yeah. I see what you're saying. That, like, she's like it it, like, triggers your own mirror neurons.

Speaker 2:

Yeah. Yeah. I mean, how can I blame you for solving this, you know, this deep ingrained, you know, animal brain kind

Speaker 1:

of, rehab? God. Yeah. You're right. Because she then she also has, like, the uncle who died of cancer that she'd like to talk about and how he would have been saved by it's, like, god.

Speaker 1:

Really? I mean, come on. But but

Speaker 2:

but this also gets to, like, I mean, some of the I don't know. What I think of is the problem or the challenge of entrepreneurship in the Bay Area, of the hero entrepreneur, of the person who looks like Steve Jobs or Elon Musk or Jeff Bezos. And you also wonder, for this Elizabeth Holmes, how many folks were of the same makeup and the same, you know, scientific acumen, which is to say pretty minimal, but just didn't quite manage to raise, you know, a million or 10 million or 100 million or a billion. Right. Because there's gotta be tons of them.

Speaker 2:

Right? Like, like, the ground is probably thick with them around Palo Alto.

Speaker 1:

I mean, uBeam, uBiome, I mean, I feel like I was rattling them off earlier. We not That's

Speaker 2:

right. Were you not listening?

Speaker 1:

We don't I we don't pay attention? Yes. No. I think you're right. And the other thing is that I feel like they are modeling themselves on what they perceive which is just to your point, Adam, about these kinds of myths being dangerous.

Speaker 1:

And I think founding myths are very dangerous in companies. And, like, Holmes is modeling herself off of a myth of Jobs. Like, she never worked with Steve Jobs. What she is modeling herself off of is what she perceives to be key to his success. And stuff like that I mean, I personally believe that Jobs' secrecy made him less successful, not more

Speaker 2:

successful. Yeah. I mean, she also shares a trait with Jobs: his incuriosity about history and his assumption that his own experience was, you know, replicable to everybody else.

Speaker 1:

Right. Exactly. That this is did she, like, wanna hire Jony Ive or something to do this? She was

Speaker 2:

I think I I remember this from the book vaguely, but certainly the the look of the product sounding, very familiar with the Steve Jobs fascination with, like, the foot by foot cube and the particular matte black that was Right. You know, unmanufacturable.

Speaker 1:

Unmanufacturable matte black. And I feel like the first time I really encountered Theranos, like, critically was Jean-Louis Gassée. So do you remember this?

Speaker 2:

No. I do remember Jean-Louis Gassée.

Speaker 1:

Okay. So Jean-Louis Gassée was at Apple and then founded Be, and he has I can't remember what the health issue is, but he needs to have regular blood draws. So he sees Theranos. He's like, alright. This is like a I'll go check this out.

Speaker 1:

This sounds good. And so he has this blog where he kinda describes various things. And he described this total check's-in-the-mail goose chase of trying to use this thing. And it was pretty clear that, like, yeah, it doesn't exist. Like, it was just very clear that this is being just totally misrepresented.

Speaker 1:

And they were really good blog entries. So it'd be interesting to get his perspective on on all of this. Having insight I mean, being willing to, like, try the new thing is really important. But, boy, you've gotta be, you've gotta be reality centric.

Speaker 2:

And to your point earlier, I mean, just candid in terms of the efficacy and the state of the product. I mean, I know that we sort of live with busted-ass software all the time. But at least folks often have the dignity of calling it beta software. At least they give you that. At least when you're using Gmail for a decade, Google has told you it's a beta.

Speaker 1:

I think in the Carreyrou and it, like, looked good on him too hey. Listen. You know, sun shining, make hay. But he definitely at one point, like, just vilifies the term beta.

Speaker 2:

Yeah. I mean, and the fact that, you know, we have a process for that: it's like a, you know, double-blind controlled experiment that we would publish in a journal and then report the results. I mean, maybe the fervor around, like, an mRNA vaccine was so high that people would have taken the beta product sight unseen. But I'm glad I got the post-beta version.

Speaker 1:

Yes. Yeah. So, obviously, yes, it's important to not have the beta version because we don't have the ability to but I also think me, again, I'm conflicted about software, because I don't think we should wait till software is perfect to make it available when it's Gmail.

Speaker 2:

And but it's never perfect.

Speaker 1:

Right. Right. I mean, so you wanna have the ability to improve it over time, but that's not appropriate for all use cases, especially not when you've got lives on the line. I think that is the best. That is just like and I feel the same.

Speaker 1:

We're seeing this right now with Tesla. Right? And I mean, I feel that what is unconscionable to me about Autopilot is its name. Yeah. Yeah.

Speaker 1:

It's like look. You literally have Tesla has what would be the safest car on the planet if they represented it properly.

Speaker 2:

You mean, like, if you kept your hands on the wheel or something?

Speaker 1:

Yeah. But you have to drive this thing like you have to drive any other car. Oh, by the way, this thing will if you have a heart attack at the wheel, this is the only thing that's gonna pull over when I mean, this is, like, we are not gonna represent this as something that drives for you. We're gonna represent this as a technology that makes you safer.

Speaker 2:

This is a great analog because, you know, there's the case I think Carreyrou talked about with the Uber automated I mean, autonomous driving Yeah. Car killing a pedestrian, you know, killing a jaywalker, because the notion of jaywalking, you know, talking about West Coast culture, was not built into the system. And I do wonder the degree to which, you know, if we see a more software-oriented fraud perpetrated along these lines, if it's not gonna be in autonomous vehicles, you know, just because there is real science there, and I think that we are all being a little bit hasty in this regard.

Speaker 1:

Well, we are and it's like you look at Tesla's position around not using LIDAR, which basically boils down to an economic argument: it's gonna make the car too expensive. It's like, okay. Well, then maybe let's not call it Autopilot.

Speaker 2:

Can we? That's right. That's right. I mean, I I mean, I'm willing to pay a lot for my my car not to roll over

Speaker 1:

Well, and I just I do feel like this is the other kind of failure mode you see. It's like the economics are this is why, like, doing something new and doing something innovative is really hard because the economics are an important part, and you can't ignore the economics. And you also can't ignore, especially if you're gonna do something that is gonna be safety critical. You definitely can't ignore the safety. And, you know, I think that the, the I mean, I don't know.

Speaker 1:

God, have you looked into, like, the Anthony Levandowski behavior at Google when he

Speaker 5:

was at Google? He almost

Speaker 1:

killed he almost killed one of his own engineers. You should look into this incident. This is, like, so distressing. And so this is the guy recall that he was totally crooked. He misrepresented his company or the I mean This is Waymo or what? This is Otto, sold to Uber.

Speaker 1:

Right. So he is at Google. He then leaves Google. He starts Otto, sells it to Uber for, like, $600,000,000. Turns out he stole a bunch of Google IP.

Speaker 1:

And this is the guy who tried to claim the 5th in the trade secrets case, which is, like, he definitely can't do. Judge Alsup ordered Uber to fire him, and they refused to do it. But, you know, well, ultimately, they actually acceded, and he was fired. And I mean, it ends with him being pardoned by Trump. Sorry is this all ringing a bell?

Speaker 2:

You know what? I I I I've been asleep for the last 4 years. So

Speaker 1:

Okay. God. Did you really in everything else to be upset about, did you miss being upset about this? This is extremely upsetting.

Speaker 2:

Yeah. Yeah. Yeah. Yeah. No.

Speaker 2:

I did. I I took a pass on being upset about this.

Speaker 1:

Okay. But now I you should get upset about this. So this is the

Speaker 2:

so You got me there.

Speaker 1:

Okay. Well, I no. No. No. I think we're gonna give you this.

Speaker 1:

Not enough. Not enough. We're gonna give it no. So basically, he committed larceny at Google and ultimately went to jail. And then was pardoned. And that pardon actually if you want everything that's wrong with Silicon Valley

Speaker 1:

The fact that people in Silicon Valley were lobbying for that pardon is gross as hell. So I I

Speaker 2:

do remember this now saying that, like, you know, this, this is a miscarriage of justice because people should be able to take risks and should be able to play fast and loose with some of these things.

Speaker 1:

Okay. I think Anthony Levandowski bothers me more than Elizabeth Holmes. Fight me.

Speaker 2:

So I'm I'm it's hard to fight through a Trump pardon.

Speaker 1:

I'll just put that I okay. There you go. I got

Speaker 2:

And, and, but the, you know, the the number of people impacted by Theranos is is pretty pretty tough to beat. And the duration of the fraud is also pretty impressive.

Speaker 1:

Okay. So to be yeah. But maybe

Speaker 2:

we need some kind of bracket, like, NCAA NCAA style tournament

Speaker 1:

Oh, that's a great idea. Of

Speaker 2:

of fraudsters.

Speaker 1:

A March Madness of Silicon Valley fraudsters. Oh, this is great. No. I think they're definitely so you're saying that is a final four matchup is what you're trying to say. Oh, oh, definitely.

Speaker 1:

Right. Definitely. Like, could go either way. One seeds.

Speaker 2:

Yeah. Absolutely.

Speaker 1:

But they're both coming in as one seeds. Okay. Yeah. No. I agree with that.

Speaker 1:

Yeah. They're definitely high seeds. And then so just so I get the whole bracket, are we putting in, like, uBiome and uBeam? Are they coming in as, like do they have a play-in game?

Speaker 2:

Those 2 there's, like, some those are Cinderella stories.

Speaker 1:

Cinderella story? Okay. Because they're in my bracket. I got uBeam going all the way, so I'm really excited. Yeah.

Speaker 1:

Well, you know Exactly. I'm not putting any money on it, but the Yeah. No. Okay. I like your your your bracket of crookedness.

Speaker 1:

So okay. So here's why I think this. So one, I think Levandowski did more harm than Holmes, because this thing so doesn't work that it was never gonna clear the end of the runway. And, I mean, I know John Carreyrou, like, rightfully thinks that he cracked this whole thing open, which is true to a degree. But he cracked this whole thing open because a blogger who was following it much more closely basically gave him everything.

Speaker 1:

And I I think that it was never going to work. You know?

Speaker 2:

Yeah. You mean, I I guess you're you're saying the fact that it was never gonna work is sort of, a point, I guess, in favor or

Speaker 1:

Yeah. Yeah. Yeah. Exactly. Yeah.

Speaker 1:

No. In this tournament, its lack of plausibility meant that you just weren't gonna be able to test that many people, because it, like, literally doesn't work at all. So,

Speaker 4:

hey hey, Bryan. Can I can I nominate a couple

Speaker 1:

Yeah? Absolutely. Yeah. What I mean is so where are are these high seeds, low seeds? Where are they?

Speaker 1:

You have to you have to just reveal what you think their seeds are when you nominate them. Okay. So so,

Speaker 4:

Pretty low seeds. Solyndra.

Speaker 1:

You know, okay. So I was wondering about Solyndra. So, Land, explain Solyndra for people who have not heard of Solyndra.

Speaker 4:

Well, Solyndra was a rooftop solar company that had a pretty good technology. They spent a lot of money on R&D. They did produce a working product, but, you know, they came into the market right at the time that rooftop solar was crashing in price, a lot of Chinese manufacturing coming online, really high volume. So I don't actually know if they were corrupt, but they certainly got labeled as corrupt, because, you know, they took a lot of government money.

Speaker 1:

Yes. They took a lot of government money.

Speaker 4:

They had some high-profile, politically connected investors on both sides of the aisle. So, I knew some really good engineers who worked there, and, like, I wouldn't accuse them of corruption or, you know, faking it until they make it. I thought they were doing their best. It's just, you know, not every product makes it in the marketplace, and sometimes companies fail.

Speaker 1:

That's an early exit in the tournament, I gotta tell you. I don't know. I just don't think I I I don't know, Adam. I don't think they're gonna go deep. What do you think?

Speaker 2:

Yeah. I don't think I don't think they've got the longevity this year. Maybe next year.

Speaker 1:

Maybe next year.

Speaker 5:

Alright. Yeah.

Speaker 1:

They'd have to be a lot more crooked. So the...

Speaker 4:

So then the other one, and I will take my answer offline because this is gonna get a lot of heat: Tether.

Speaker 1:

Oh, yes. Yes. Definitely. Okay. Yes.

Speaker 1:

I mean, just in general, the book is not yet written on crypto, and we are gonna need to wait. This is the Warren Buffett line, right? You don't know who's wearing a swimsuit until the tide goes out.

Speaker 1:

And I feel like the tide has not yet gone out on crypto. But has Tether failed yet, Land? I'm not, like...

Speaker 4:

Like, within the next 24 hours, it certainly could.

Speaker 1:

Hasn't that been true for, like, the last 4 years? I've definitely thought that for the last 4 years.

Speaker 4:

Well, I feel like with this Evergrande situation, which may or may not be directly related to it, it's certainly peripheral.

Speaker 1:

Adam, do you know anything about Tether? Land, you should explain Tether, because I think...

Speaker 2:

You know, I'm just looking it up now. No, I'd love to hear about

Speaker 4:

it. Alright. So, like, Tether is not exactly a crypto. It's what they call a stablecoin, which is to say it's a crypto coin whose price is permanently pegged one-to-one with the US dollar, which right there means it doesn't do anything. Right?

Speaker 4:

It doesn't go up in value. The reason it exists is to act as an intermediary between US dollars and these fluctuating cryptos like Bitcoin. Right? So basically, the way it works is if you wanna buy Bitcoin, you put your dollars into Tether, and they give you Tethers at a one-to-one rate.

Speaker 1:

Right. You just give me your dollars; I take your money.

Speaker 2:

Okay. Good. Now I'm out $100. Got it. Okay.

Speaker 2:

Good. Next.

Speaker 4:

Then they give you Tether, and you use your Tether to buy your Bitcoin, and then you play the Bitcoin market as it goes up and down, whatever. If you want to get out of Bitcoin, you trade your Bitcoins for Tether, and then you trade your Tether for dollars, and then you're out of the...

Speaker 2:

So how is this different than when I go to, like, the amusement park, and I take my money and put it into the machine, and it gives me tokens that I can only use at the amusement park?

Speaker 4:

So here's how it's different: Tether is totally invested in self-promotion, which means they're doing all these sorts of things where, if you do all these transactions, they'll give you free Tether. If you get your friends to sign up, they'll give you free Tether. Like, everything you do, they'll give you free Tether. And then you take those Tethers and pump them into Bitcoin, and the price of Bitcoin goes up and up.

Speaker 4:

And so now there's a huge amount of Tether that's been issued. It's not clear how much money they have, because at first they said all their Tether was backed one-to-one with US dollars. Then they admitted that's not true. Then they said they're holding commercial paper. Then they said their commercial paper may not be any good.

Speaker 4:

So, like, it appears to me that the sole purpose of Tether is to break the linkage between dollars and Bitcoin, so that you can do things with Bitcoin without feeling it in your dollars, for now.
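
(For anyone following along at home: a minimal sketch in Python of the round trip Land describes, with a toy one-to-one peg plus a promotional bonus. The exchange, the promo rate, and the prices are all illustrative assumptions, not Tether's actual mechanics.)

```python
# A toy model of the dollars -> Tether -> Bitcoin round trip described
# above. Everything here is illustrative: the promo bonus, the Bitcoin
# prices, and the clean one-to-one redemption are assumptions, not a
# description of Tether's actual books.

class ToyStablecoinExchange:
    def __init__(self, promo_rate=0.01):
        self.promo_rate = promo_rate  # "free Tether" for transacting
        self.issued = 0.0             # total toy-Tether ever minted

    def dollars_to_tether(self, usd):
        # Pegged one-to-one, plus the hypothetical promotional bonus.
        minted = usd * (1 + self.promo_rate)
        self.issued += minted
        return minted

    def tether_to_btc(self, tether, btc_price):
        return tether / btc_price

    def btc_to_tether(self, btc, btc_price):
        return btc * btc_price

    def tether_to_dollars(self, tether):
        # The peg promises one-to-one on the way out too,
        # if the reserves actually exist.
        return tether

ex = ToyStablecoinExchange()
usdt = ex.dollars_to_tether(100.0)                # in: $100
btc = ex.tether_to_btc(usdt, btc_price=40_000.0)  # buy Bitcoin
usdt = ex.btc_to_tether(btc, btc_price=44_000.0)  # sell after a 10% rally
print(f"out: ${ex.tether_to_dollars(usdt):.2f}; "
      f"toy-Tether minted: {ex.issued:.2f}")
```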

Speaker 2:

Well, this might be kind of a sleeper pick in the tournament. I do like the shell game. I do like the evolving story. I think that's got a lot of traction.

Speaker 4:

Oh, yeah. The story has definitely been evolving. For a long time, Tether wouldn't even disclose who their principals were. They, like...

Speaker 1:

Always a good sign.

Speaker 4:

Yeah. They won't disclose where their money is or where it's held. And then they constantly do things where, you know, they push a couple buttons on their keyboard and generate another $1,000,000,000 worth of Tether. $500,000,000 of that goes into a Bitcoin investment, and the other $500,000,000, no one quite knows where that

Speaker 1:

went. I'm saying they're here to play. They can go deep.

Speaker 2:

Yeah. For sure.

Speaker 1:

So the thing about Tether is that it's obviously extremely vulnerable: any run on Bitcoin will presumably capsize Tether. But they seem to have defied gravity for so long, and that's the problem. This is true for Theranos too. Theranos went on for way longer than it should have. But when these things correct, they correct way faster than anyone thinks possible.

Speaker 1:

So I don't know. Land, do you think Tether's in trouble?

Speaker 4:

Well, you know, my brothers and I always say that the market can stay irrational longer than you can stay solvent.

Speaker 1:

Yeah. That's the old Keynes line, and it's definitely true.

Speaker 4:

Yeah. And I mean, I would love to take a short position on Tether, but I looked into it a little bit, and the only currency you can denominate a short position on Tether in is Tether. You know, it's in Tether currency.

Speaker 1:

I mean, you have to admire that at some level. That's an act of genius. There's a dark genius there somewhere.

Speaker 4:

Yeah. I mean, this thing could crash in the next 24 hours. It could crash 10 years from now. I have no idea.

Speaker 5:

Isn't there a government lawsuit going on to crush it?

Speaker 4:

The last I heard, the government said they were going to look into whether or not it needed regulation. I hadn't heard, you

Speaker 1:

know, that. I just wanted to say, just, like, please, God. My 17-year-old has been working a job. Like, please let him not be investing the proceeds from his job into... But, you know, maybe there's only one way to learn. There's only one way to learn.

Speaker 1:

And, you know, having lost hard-earned money...

Speaker 4:

There are several ways to learn.

Speaker 1:

Yeah. Well, this is a good pick, and I think there are. But sorry, not to get us back too much on Levandowski, but I just wanted to wrap up this Prius thing. Google felt the Prius was unsafe at highway speeds.

Speaker 1:

Adam, Levandowski more or less just secretly had this thing drive at highway speeds, and he got into an accident and almost killed the engineers in there.

Speaker 2:

Oh my goodness.

Speaker 1:

And, like, that is... I do feel that that is worse. There's a level at which it's worse because it's more plausible. I do feel that way, and yet it's still just as dangerous.

Speaker 2:

Yeah. This is a bleak bracket, the more entrants I start thinking about.

Speaker 1:

Right. Have you heard of a company called Better Place?

Speaker 2:

No.

Speaker 1:

They spent... I think they burned through, I wanna say, $800,000,000 in capital, which is a large number.

Speaker 2:

Yeah. That's impressive.

Speaker 1:

Yeah. Exactly. Well, it's like, how do you get the $800,000,000? That's the real question when someone burns through that kind of capital.

Speaker 1:

They had this idea that, instead of recharging... so you've got a car. It's an electric car. You wanna recharge the battery. Wouldn't it be great if you could just swap the battery out for a charged battery?

Speaker 2:

I read about... is this an Israeli company? It's

Speaker 6:

an Israeli

Speaker 1:

company. Yeah.

Speaker 2:

I read about this in The Economist, like, a billion years ago. Yeah.

Speaker 1:

Did you read about it in an obituary?

Speaker 2:

No. No. I read about it in births, I think.

Speaker 1:

Do you read The Economist regularly?

Speaker 2:

I did before I had a 4-year-old. Or a second child.

Speaker 1:

Did you read the obituary?

Speaker 2:

I did. Yeah. Back in the day.

Speaker 1:

I'm sure. Yeah. I've probably read an obituary to you. Yeah.

Speaker 1:

The obituary in The Economist is definitely the best part of it, and the part that I miss most about The Economist. I'm with you on the whole... kids and The Economist sadly don't mix.

Speaker 2:

It's tough.

Speaker 1:

It's tough. Unless somehow the kid is into The Economist, and I do not seem to have any of those. Nope. Alright. So... sorry, go ahead. Yeah.

Speaker 1:

Sorry. Go ahead.

Speaker 4:

What about that juice company?

Speaker 2:

Oh, Juicero. Juicero. You know what? I gotta share one of my greatest regrets. I was sitting on a San Francisco bus, and the person across from me had a Juicero backpack.

Speaker 2:

And I spent the whole bus ride thinking about offering that person whatever it would take to get that backpack, and I will always regret not making that offer.

Speaker 1:

Adam, you know, I wanna make it easier on you, but I just can't.

Speaker 2:

No. It's just a mistake I have to live with.

Speaker 1:

It's a mistake you gotta live with, and we can't undo it. You know what I mean? It's made, and it's something you're gonna bear forever. But a Juicero backpack, man, that would've been... I mean, I don't wanna, like, sharpen the dagger here.

Speaker 1:

I'm so

Speaker 2:

upset. I feel terrible. Yeah.

Speaker 1:

Got it. Would've been, like... was it a good-looking backpack?

Speaker 2:

It was fine. It was fine. It was not, like, A-tier swag, but, like, obviously, you know.

Speaker 1:

Well, and Juicero, that guy had... I mean, this is where you also get into trouble with people who've made a bunch of money, and they go to start the next thing. And this is where you get, like, the Melt. You ever been to the Melt? Do you ever go to the Melt?

Speaker 2:

Oh, yeah. I go to the Melt all the time. My venture-funded... with grilled cheese?

Speaker 1:

Venture-funded grilled cheese. Sure. Because the guys had sold Flip for whatever it was, 600-plus million dollars, to Cisco, which made no sense. Do you remember Flip?

Speaker 2:

This was like the video streaming device. Right? Yeah.

Speaker 1:

Yeah. It was a really nice device, actually. And he sold that to Cisco, and, like, why is Cisco buying it? Cisco, like, 3 months later, is like, why did we buy this?

Speaker 1:

Where was I on... the last thing I remember is, like, we were at the bar together, and then I woke up with Flip. What happened?

Speaker 2:

The most sober explanation I heard of that... at Delphix, we were actually selling stuff to Cisco at the time. They're like, well, you know, streaming video, lots of network bandwidth. I was like, uh-huh. And, like, can you actually...

Speaker 1:

can you finish that sentence for me? Because I see sort

Speaker 2:

of where you're going, but Okay.

Speaker 1:

Can you

Speaker 2:

Just... I know I'm being a little dumb here, but just finish the pitch.

Speaker 1:

Just walk me through. I know. It makes no sense. But I do feel like this is the success-teaches-you-nothing department, where you've got people who have made their investors a lot of money. If you've made your investors a lot of money, they're gonna fund the next thing no matter how, like, obviously wrong it is.

Speaker 1:

And that's where people lose a lot of money, I feel: when it's that second thing. Because there's a degree to which an entrepreneur... I mean, actually, we've got a friend of ours who worked for a company where the founder had had 2 successful companies, and he was on his 3rd. And I'm like, that is not necessarily good. That could be very, very dangerous, because this person thinks that starting a company is not hard. And that company imploded.

Speaker 1:

Because he's like, I don't get it, this doesn't seem that hard. And it's like, no. No. It's actually really, really, really hard, especially if you're solving a hard technical problem.

Speaker 2:

Yeah. Absolutely.

Speaker 1:

So, is anyone else in your bracket, Adam? You know what? I don't know that I

Speaker 2:

got anyone else who came to play with the Juiceros and, you know, the stool samplers.

Speaker 1:

uBiome. Yeah. The Cinderella story. Come on. uBiome.

Speaker 1:

That's right.

Speaker 2:

No. No. I don't think I have

Speaker 1:

anyone who came to play with those big boys. Okay. So then the next question is: as technologists, what would be some of the warning signs that you would look for? Like, I actually think this is a compelling vision, but I don't think this company is real.

Speaker 2:

Oh, you nailed it on transparency. Right? Because there are ostensible reasons for secrecy, but the most urgent reason for secrecy is secrets. Right?

Speaker 2:

For, you know, there being aspects of the product that are deceptive, or that aren't working, or where the truth is harmful for the company. I mean, Apple is the elephant in the room, but I can't think of any other company where secrecy has been done in some sort of, like, notionally healthy way, and even that is to forgive Apple in ways that I'm not really ready to.

Speaker 1:

Yeah. And I know most of us in here are inside of tech, but those outside of tech may not realize that even someone refusing to tell you the details of what they're working on for fear of industrial espionage, that's a total red flag. We are at a point where the hard problems are not stealable.

Speaker 2:

Yeah. In particular in software technology. Right? That may be different if you're at, like, a battery materials company or something, where the materials you're using could give away the game, I'd imagine.

Speaker 1:

I say no. No. No. No. I say no.

Speaker 1:

No. No. No. I say no. This is my problem with Intel and Optane.

Speaker 1:

Right? Intel was always super secretive about Optane. They would never tell us how it worked. It's presumably phase-change memory, but we never really got details on it.

Speaker 1:

And as it turns out, that was a problem. I like the way you phrased it, though: the purpose of secrecy is often the secrets. That secrecy was a real problem.

Speaker 1:

So, no. Okay, fine, I'm an oversharer. But I actually think that transparency is something we should seek in all domains. I think secrecy is always a red flag.

Speaker 1:

Oh, yeah. Absolutely. The other one has gotta be board composition. And this is the one, Cole, that Dr. Gold flashed in the presentation that you'd linked to.

Speaker 1:

I think this is another big one. Whenever you are at a company that's taking a big swing, solving a hard problem, you need to look at that board to see: does this board understand what's involved in a hard problem?

Speaker 2:

Yeah. Does this board make sense for the problem and the space and the market? Because you see these companies that have these celebrity boards, and, you know, I'm sure they're good for marketing material, but they're not good for actually guiding the company, especially as it faces these technical challenges.

Speaker 1:

They are not good. Because, in particular, when you're solving a hard problem, you're gonna need to deal with a lot of disappointment as a board. Things are hard. That's what it is. It's a hard problem.

Speaker 1:

If you're not dealing with disappointment, it wasn't a hard problem. But for a hard problem, there's gonna be disappointment, and you want a board that has been there, has seen it, knows how to grind through it, and knows how to hit that right balance between positivity, optimism, and realism. So, Adam, on board composition: one thing I heard recently is that people, when they're looking at a new job, should ask to speak to the board.

Speaker 2:

Like, to all of them?

Speaker 1:

I don't know. It was kinda left open like that. Yeah. Unclear.

Speaker 2:

That's an interesting one. I, you

Speaker 1:

know, I think I've always...

Speaker 2:

I think for a lot of positions, that feels like a big ask. Especially now that I've worked with a board in a couple of different capacities, it's sort of like, do you really want me to spend one of my bullets that way? I mean, I can. Right. But they feel like bullets.

Speaker 2:

It's an interesting idea, but what's that gonna get you? Unless you're really crisp about what you're going to ask those folks and how you're gonna interpret the answers. Because boards are very important for companies, but they're not day-to-day operating the company. They're not even day-to-day thinking about the company.

Speaker 2:

For a lot of boards and a lot of board members, they're thinking about the company every 6 to 8 weeks, you know, maybe 24 hours in advance of the board meeting, when the CEO is frenetically sending out slides at 2 in

Speaker 1:

the morning. First of all, it's not 2 in the morning. I'll thank you to know that it's 1:15. Please, we're civilized here.

Speaker 2:

For what it's worth, I was talking about me.

Speaker 1:

There you go. Exactly. Okay. So here is the question, though. I think you're right, and you gotta be very focused.

Speaker 1:

And I agree with you that if someone asked us for this, and we would do it, it would definitely be a little bit like, alright, wait, what are you asking? The question to ask, I feel, is: this is a company that's solving a hard technical problem, so what are its next milestones, from your perspective on the board?

Speaker 1:

Yeah. That's good. Seeing if, you know, on

Speaker 2:

both sides there's clarity on that point.

Speaker 1:

That's exactly right, for that exact reason. To see if it's like, hey, you know what? I talked to your board, and their idea of your next milestone and your idea of your next milestone are not related to one another.

Speaker 1:

And that would be a big red flag. That'd be a big problem.

Speaker 2:

Yeah. And a place to tease out whether there's an overly optimistic translation going on between the technical work that you've come to an understanding about and what's

Speaker 1:

being presented to the board. Yes. Now, at a much more boots-on-the-ground level, I think you,

Speaker 2:

you know, that's a

Speaker 1:

that's a big problem. And if you're not getting that, that's a big problem. Yeah. So, I mean, you almost wanna get into, like, their GitHub issues.

Speaker 1:

You really wanna get, like... you wanna get dirty, I feel.

Speaker 2:

In terms of evaluating a company? Yeah. In particular with regard to whether they're being fraudulent.

Speaker 1:

Yeah. I'm just trying to put myself in the position of someone who is evaluating something that sounds like a Theranos, or might be a Theranos. Like, how do you keep yourself from investing in that? In many ways, we're all investing in these companies, because we're investing in them with our time. And I see we're getting on in time, I think.

Speaker 1:

We're getting a little bit over here. Adam, I do wanna ask you for predictions, not just the bracket predictions. Of course, I still think that uBiome can actually go the distance and surprise us all. Does Elizabeth Holmes go to jail?

Speaker 2:

I think so. I think so. I think the question is for how long. But yeah, I think she's going to jail. I think they're gonna be able to demonstrate that she had an awareness that she was committing fraud.

Speaker 2:

It wasn't just Silicon Valley over-optimism or, you know, the Svengali defense. How about you? What do you think?

Speaker 1:

I think she's going to jail. I agree with that. I mean, good god, if she's not, there are no consequences for anything. But then, because we will ultimately need to be upset and trolled, I think she'll be pardoned by some future joker.

Speaker 2:

No. Well, that's a terrible possibility.

Speaker 1:

That's terrible. But no, that's terrible. That's a joke. But, I mean... wait a minute.

Speaker 1:

Yeah. It's I

Speaker 2:

But then she'll be president and pardon herself.

Speaker 1:

Oh my god. That is chilling. Right now, that just feels way too plausible. Alright.

Speaker 2:

Yeah. Alright. With that happy

Speaker 1:

On that happy note: Elizabeth Holmes, pardon yourself. Lord, save us all. We'll talk to you next time.

Speaker 2:

Thanks, everyone.