WEBVTT

NOTE
This file was generated by Descript 

00:00:00.000 --> 00:00:02.329
Joe: There's reality
which is loving awareness

00:00:02.460 --> 00:00:05.929
Sam: unconcerned by the arising
and passing away of phenomena

00:00:06.149 --> 00:00:07.999
Ali: And then there are the 10,000 things

00:00:21.993 --> 00:00:24.393
Sam: Hello and welcome
to the 10,000 things.

00:00:24.723 --> 00:00:25.893
My name is Sam Ellis.

00:00:26.054 --> 00:00:26.416
Joe: I'm Joe

00:00:26.416 --> 00:00:26.654
Ali: Lowe.

00:00:26.804 --> 00:00:27.974
And I'm Ali Kriti.

00:00:28.514 --> 00:00:31.604
Joe: Today on the show, who is your guru?

00:00:31.658 --> 00:00:37.858
Uh, we live in an age of online gurus,
public intellectuals, cult leaders,

00:00:38.088 --> 00:00:41.748
influencers, and uh, we thought
we'd have a chat about it today.

00:00:41.758 --> 00:00:48.928
It came about from Sam's frequent
references to Slavoj Žižek,

00:00:48.928 --> 00:00:49.498
Sam: Slavoj Žižek, I guess,

00:00:49.599 --> 00:00:53.059
Joe: just maybe, and I've had a
few over the years, Ali, I'm not

00:00:53.059 --> 00:00:54.899
sure, have you ever had a guru?

00:00:54.974 --> 00:00:58.274
Ali: not one in particular, but there's
certainly been people I've resonated

00:00:58.714 --> 00:01:02.508
with and aspirationally wanted to
sort of, yeah, the things that I've

00:01:02.508 --> 00:01:06.968
taken from them, but, but I wouldn't
say I've necessarily put all of my.

00:01:07.533 --> 00:01:08.093
Eggs in one

00:01:08.093 --> 00:01:09.743
Sam: basket, Nigella and Robbie.

00:01:09.893 --> 00:01:10.053
Joe: Yeah.

00:01:10.053 --> 00:01:14.723
So, so, because what I found
thinking about why Sam often wants

00:01:14.723 --> 00:01:16.313
to refer back to Žižek is like.

00:01:16.313 --> 00:01:17.463
An

00:01:17.463 --> 00:01:17.953
Sam: appeal to

00:01:17.953 --> 00:01:18.523
Joe: authority.

00:01:18.583 --> 00:01:22.203
Why do you want to filter your
thoughts through someone else's?

00:01:22.233 --> 00:01:22.903
Uh, no.

00:01:22.933 --> 00:01:26.073
Or why do you want to refer back
to someone who most of us wouldn't

00:01:26.073 --> 00:01:27.583
know who he is and whatever.

00:01:27.643 --> 00:01:30.073
Sam: No, your complaint
is that he's too famous.

00:01:31.058 --> 00:01:36.128
Joe: Uh, my complaint is he's
incomprehensible, and erratic, and

00:01:36.408 --> 00:01:40.048
self contradictory, and basically, what
I don't get is you're a much clearer

00:01:40.048 --> 00:01:44.708
communicator, and as far as I can tell,
thinker than he is, so why do you want

00:01:44.708 --> 00:01:49.878
to defer to him, and when I jump online,
what I see is him sitting in front of

00:01:49.878 --> 00:01:52.703
rapt audiences, making basically no sense.

00:01:53.093 --> 00:01:57.213
And then once every 20 minutes, he says
something that basically anyone down

00:01:57.213 --> 00:02:00.041
the pub could say like, Oh, it was good.

00:02:00.041 --> 00:02:00.420
It was good.

00:02:00.420 --> 00:02:04.013
Like it was good when Trump
handed out those $2,000 checks.

00:02:06.553 --> 00:02:08.983
And when he does that, he says
something vaguely comprehensible

00:02:08.983 --> 00:02:10.223
that the crowd goes wild.

00:02:10.233 --> 00:02:11.013
They're like, Oh, wow.

00:02:11.013 --> 00:02:12.073
Žižek made a point.

00:02:12.103 --> 00:02:17.603
And it's like, Okay, so, confusing times
we live in, a lot going on, incredibly

00:02:17.603 --> 00:02:21.633
confusing man, that you want to refer
things in this podcast back to him.

00:02:22.053 --> 00:02:26.163
So then it got me thinking, well, he's
obviously a bit of a guru for Sam.

00:02:26.723 --> 00:02:30.183
And we were going to explore his ideas,
but you weren't up for that, which makes

00:02:30.183 --> 00:02:31.633
sense because they're incomprehensible.

00:02:32.663 --> 00:02:33.615
I'll tell you what I wasn't up for.

00:02:33.615 --> 00:02:36.993
So, I thought we could explore the
broader concept of, of wanting to

00:02:36.993 --> 00:02:37.793
have, like I'm having fun already.

00:02:38.303 --> 00:02:39.873
Who are the public intellectuals?

00:02:39.873 --> 00:02:43.973
Why do we want them in an age of YouTube
where you can absorb like literally

00:02:43.973 --> 00:02:45.623
hundreds of hours of someone's thoughts?

00:02:45.663 --> 00:02:46.533
Yeah, that's right.

00:02:46.873 --> 00:02:50.603
They can jump on YouTube and
talk about the latest thing in the news.

00:02:50.613 --> 00:02:51.573
So you can filter.

00:02:51.623 --> 00:02:51.953
Mm hmm.

00:02:52.403 --> 00:02:55.513
Sam: Well, my, my, um, I grew
up with gurus, like, you know,

00:02:55.563 --> 00:02:56.623
like the Hare Krishna religion.

00:02:56.623 --> 00:02:56.863
Sure.

00:02:56.883 --> 00:02:58.243
Krishna's the, the boss.

00:02:58.663 --> 00:03:03.743
He's at the center of a non-binary
God at the center of creation,

00:03:04.343 --> 00:03:06.868
but it's all through the Guru.

00:03:07.558 --> 00:03:10.668
The Guru is the, you know,
the way to access the truth.

00:03:11.398 --> 00:03:16.068
And you cultivate, and it also
veers into, as these things often

00:03:16.068 --> 00:03:20.398
do, into Guru worship, which then
becomes part of the faith itself

00:03:20.398 --> 00:03:21.718
and not just an expression of it.

00:03:21.748 --> 00:03:25.918
So, like, we can't worship God directly.

00:03:26.788 --> 00:03:30.708
Well, we can, we can worship these
representations in the form of deities,

00:03:30.708 --> 00:03:35.796
you know, like the statues, the idols,
as a Catholic might call them, and we can

00:03:35.796 --> 00:03:42.481
worship their earthly servant, the guru,
and in fact, we should serve each other,

00:03:42.501 --> 00:03:46.231
and that's how, we should be humble,
and we should, you know, et cetera,

00:03:46.231 --> 00:03:49.271
and that's how we'll become enlightened
and spiritual and find God and stuff.

00:03:49.511 --> 00:03:54.332
So you have to— Like, some people
will place Jesus at the heart of their

00:03:54.332 --> 00:03:58.112
Christianity and others are kind of
more directing their thoughts towards

00:03:58.702 --> 00:04:00.722
a slightly more impersonal God.

00:04:00.810 --> 00:04:04.745
So in the Hare Krishnas, it's like the
guru is, God is the saviour ultimately,

00:04:04.745 --> 00:04:09.169
but the guru is the in-between
saviour and they're gonna, they're

00:04:09.169 --> 00:04:10.829
gonna help get you through this mess.

00:04:10.949 --> 00:04:14.129
Like you're in a crisis and the,
that being born on this earth

00:04:14.149 --> 00:04:16.429
is a crisis and the guru is
going to help you through that.

00:04:16.539 --> 00:04:17.839
So the guru has the answers.

00:04:17.919 --> 00:04:18.719
The guru has the answers.

00:04:18.829 --> 00:04:19.359
They're, they're,

00:04:19.359 --> 00:04:22.039
Joe: they're an authorised representative.

00:04:22.039 --> 00:04:25.549
It sounds like something set up for
inherently for horrific things to

00:04:25.549 --> 00:04:25.799
Sam: happen.

00:04:25.839 --> 00:04:26.369
Oh, of course.

00:04:26.379 --> 00:04:31.489
It's a recipe for disaster, but it's
remarkable how often it doesn't do that.

00:04:31.609 --> 00:04:35.949
And, you know, it's not an, it's not
a complete unmitigated disaster.

00:04:35.959 --> 00:04:37.119
In the West, it was.

00:04:37.179 --> 00:04:42.389
So when you take this guru thing and then
combine it with Western values and like

00:04:42.439 --> 00:04:45.459
marketing and, you know, big budgets.

00:04:46.499 --> 00:04:49.579
Well, you end up with guru scandal.

00:04:49.759 --> 00:04:54.309
And that's— But India, they have guru
scandals there too, but a lot of the

00:04:54.309 --> 00:04:57.999
gurus over there, they're small time,
like they've got their little local

00:04:58.259 --> 00:05:04.719
band and, you know, you can go see
some of the, you know, here's this tiny

00:05:04.719 --> 00:05:09.849
little building where this guru met with
his followers and it's nothing grand.

00:05:09.939 --> 00:05:12.249
It's like a, basically a dirt hut.

00:05:12.699 --> 00:05:14.369
And it's a little bit better than that.

00:05:14.399 --> 00:05:18.329
You know, it's got cow dung floor
and, you know, it's clay rendering.

00:05:19.094 --> 00:05:20.184
Mud brick or something.

00:05:20.214 --> 00:05:22.814
It's pretty basic and like they
could have fit maybe 20 people

00:05:22.814 --> 00:05:26.134
in there, and that's it, that
Guru is looking after 20 people.

00:05:26.134 --> 00:05:26.644
That's it.

00:05:26.944 --> 00:05:30.856
That seems sustainable to me. When
they've got, like, the Hare Krishnas

00:05:30.866 --> 00:05:34.836
had thousands of followers around the
world. Thousands. And you've initiated

00:05:34.836 --> 00:05:39.706
every last one of those people in a fire
sacrifice, a full, proper Vedic ceremony,

00:05:40.566 --> 00:05:43.535
and they've said vows in Sanskrit.

00:05:43.835 --> 00:05:49.555
Like, I will place my spiritual
progress in your hands, and

00:05:49.915 --> 00:05:51.405
I will take your instruction.

00:05:51.495 --> 00:05:53.385
I will do what you tell me.

00:05:53.755 --> 00:05:57.115
And that is the most, like, yeah,
explicit and condensed form.

00:05:57.535 --> 00:06:03.625
So why do we talk about influencers or,
you know, Douglas Murray or, uh, you

00:06:03.625 --> 00:06:06.340
know, the Weinsteins or Jordan Peterson.

00:06:06.380 --> 00:06:07.890
Uh, why do we call them gurus?

00:06:07.940 --> 00:06:09.780
Well, because people do a similar thing.

00:06:09.790 --> 00:06:10.890
They go, that's it.

00:06:11.050 --> 00:06:13.410
I'm going to do everything
Jordan Peterson tells me to do.

00:06:13.500 --> 00:06:16.090
I'm going to think everything
this person tells me to think.

00:06:17.290 --> 00:06:21.940
Now with Žižek, I'm referring to him a lot
like I would a scriptural authority, not

00:06:22.010 --> 00:06:29.180
so much to back up my points, but more to
say there is something here I can explain.

00:06:29.180 --> 00:06:31.740
And as you say, I can explain
it more clearly perhaps.

00:06:31.895 --> 00:06:35.055
But, if you want to know more
about this, go to the source.

00:06:35.155 --> 00:06:39.644
Go to the guy who has thought about
it more than me and go, I'm really,

00:06:39.644 --> 00:06:42.654
I'm hoping people will go and dig
in and find something in there

00:06:42.654 --> 00:06:44.744
because I've taken bits and pieces.

00:06:44.763 --> 00:06:49.350
But I actually just want to pass
people along to a higher power.

00:06:49.350 --> 00:06:49.850
But

00:06:49.850 --> 00:06:50.620
Joe: he is your

00:06:50.630 --> 00:06:51.320
Sam: favourite, right?

00:06:51.540 --> 00:06:52.780
He's one of my favourites, yeah,

00:06:53.050 --> 00:06:53.390
Joe: yeah.

00:06:53.620 --> 00:06:55.270
Could you simply outline

00:06:55.367 --> 00:06:55.957
a few of his

00:06:55.957 --> 00:06:56.397
Sam: ideas.

00:06:57.017 --> 00:07:01.237
He's had a very long career, um, covering
a lot of different things, but I think

00:07:01.237 --> 00:07:06.569
it's fair to say he's a philosopher, first
and foremost, and he's interested in, you

00:07:06.569 --> 00:07:13.947
know, I think worthwhile problems, like
how do, you know, how do the forces of,

00:07:14.002 --> 00:07:19.669
ideology, media, money, how does all that
impact the consciousness of, the masses?

00:07:19.829 --> 00:07:24.329
So he's interested in consciousness, he's
interested in culture, he's interested

00:07:24.329 --> 00:07:29.909
in films and what films say and how
they speak to us and how ideology

00:07:29.909 --> 00:07:34.259
works through cinema, how it tells
us what to desire and tells us what

00:07:34.259 --> 00:07:36.809
to value and shapes that powerfully.

00:07:36.870 --> 00:07:37.370
But what

00:07:37.410 --> 00:07:40.320
Joe: he's interested in being the person
to go to, if you want to know what's

00:07:40.660 --> 00:07:44.590
the interpretation of what's, say, going
on in the news, which is when I turn

00:07:44.590 --> 00:07:50.480
on YouTube, what I see is him talking
about Gaza or Ukraine or whatever.

00:07:50.510 --> 00:07:53.640
And what I find is he, he,
uh, he's talking nonsense

00:07:53.640 --> 00:07:54.820
actually a lot of the time.

00:07:54.830 --> 00:07:55.220
Yeah.

00:07:55.280 --> 00:07:58.440
Sam: Sometimes he speaks in riddles
and paradoxes because like all good,

00:07:58.780 --> 00:08:00.080
many, many good philosophers do.

00:08:00.300 --> 00:08:03.440
Joe: Talking about climate change, he was
talking about four or five degrees of

00:08:03.755 --> 00:08:08.415
warming and London being underwater and
I'm like, Oh, that was 10 years ago.

00:08:08.415 --> 00:08:09.865
That was a possibility, but.

00:08:10.445 --> 00:08:15.365
It's not anymore, so the conversation's
moved on, so, but he finds it fun, right?

00:08:15.365 --> 00:08:17.305
To him, it's fun and it's funny.

00:08:17.785 --> 00:08:18.335
Yeah, okay,

00:08:18.335 --> 00:08:19.495
Sam: we could, look, we could

00:08:19.495 --> 00:08:20.305
Joe: probably, yeah.

00:08:20.305 --> 00:08:23.935
So they're the kind of people that I
find, they're the people I get really

00:08:23.935 --> 00:08:27.825
angry at because a lot of people are
actually listening to them and they're

00:08:27.825 --> 00:08:30.675
talking, they're spreading fear and lies.

00:08:30.725 --> 00:08:30.935
Alright,

00:08:31.035 --> 00:08:34.155
Sam: well let's do the, let's,
okay, the UK has been experiencing

00:08:34.185 --> 00:08:38.780
a rapid and frightening,
apocalyptic increase in flooding.

00:08:39.210 --> 00:08:41.050
They're getting belted every other year.

00:08:41.051 --> 00:08:41.320
No, no,

00:08:41.320 --> 00:08:42.950
Joe: no, he's talking
about sea level rise.

00:08:43.190 --> 00:08:45.740
Sam: Okay, there's two things, yes.

00:08:46.010 --> 00:08:46.873
Sea level rise and flooding.

00:08:46.873 --> 00:08:47.078
Let's not

00:08:47.078 --> 00:08:49.810
Joe: get into it because four
or five degrees is nonsense.

00:08:49.820 --> 00:08:50.520
He's talking nonsense.

00:08:50.790 --> 00:08:51.050
Okay.

00:08:51.060 --> 00:08:52.310
Four or five degrees is nonsense now.

00:08:52.370 --> 00:08:52.910
Right, so.

00:08:52.920 --> 00:08:53.470
We're not looking at that.

00:08:53.480 --> 00:08:55.110
What, is it an old clip?

00:08:55.111 --> 00:08:56.790
No, no, this is like
a couple of weeks ago.

00:08:56.940 --> 00:08:57.990
Look, so.

00:08:58.050 --> 00:08:59.070
He finds it funny.

00:08:59.280 --> 00:09:01.390
There's new books about
the world's already ended.

00:09:01.815 --> 00:09:05.145
There is no future, so let's just
have a play around with the past,

00:09:05.185 --> 00:09:06.325
that's basically what he's saying.

00:09:06.326 --> 00:09:08.135
It just came out,

00:09:08.135 --> 00:09:08.845
Sam: 2023.

00:09:08.846 --> 00:09:09.755
Okay, yeah, I haven't read it yet.

00:09:10.975 --> 00:09:15.255
I mainly access his work through
the people that interpret it.

00:09:15.465 --> 00:09:18.925
So I listen to people that read his
stuff, and then—

00:09:18.925 --> 00:09:22.390
Joe: And if that's not a guru, what
is? Like, there's
people out there listening to

00:09:22.390 --> 00:09:23.140
Sam: his words.

00:09:23.140 --> 00:09:23.870
I listen to

00:09:24.270 --> 00:09:24.987
Joe: the interpreters of the scripture.

00:09:24.987 --> 00:09:27.250
To me, just on a common sense level,
his words are basically nonsense.

00:09:27.250 --> 00:09:30.270
Yeah, but no, but I've— And then people
are interpreting them on podcasts,

00:09:30.270 --> 00:09:33.720
and then you're listening to the
interpretations of the utterings of Žižek.

00:09:34.030 --> 00:09:34.390
Sure.

00:09:34.670 --> 00:09:37.480
And then you're going, this is a
great way to understand the world.

00:09:37.520 --> 00:09:40.180
And I'm going back to first
principles and looking at him going,

00:09:40.180 --> 00:09:41.310
no, no, he's talking nonsense.

00:09:41.490 --> 00:09:43.260
Sam: Well, I'll tell you, I'll
tell you what he's, what I've

00:09:43.260 --> 00:09:44.460
gathered from his principles.

00:09:44.630 --> 00:09:48.250
He's interested in justice and
freedom, but not in any simple

00:09:48.260 --> 00:09:49.330
version of what they are.

00:09:49.900 --> 00:09:54.480
He's interested in maximum human
dignity, and the difficult,

00:09:54.520 --> 00:09:57.150
very, very difficult problem
of how we actually do that.

00:09:57.620 --> 00:10:02.810
Not always starting with like practical
proposals, let's tax this or do that.

00:10:03.160 --> 00:10:04.460
He's interested in

00:10:05.365 --> 00:10:09.085
unpicking the difficult problem of how
to even talk about it in the right way.

00:10:09.585 --> 00:10:14.025
And I think you and he would entirely
agree that these are difficult

00:10:14.055 --> 00:10:15.805
conversations to do in the right way.

00:10:16.185 --> 00:10:20.795
So he's just trying to get through all
the, the baggage and ideology and delusion

00:10:20.795 --> 00:10:24.035
and the rest of like, what's actually
going on in like our little homo sapiens

00:10:24.075 --> 00:10:28.885
brains and like the, how do we respond
to the information environment we're in

00:10:29.405 --> 00:10:33.135
and he's, and to analyze all of that,
he's drawing on Marx, he's drawing on

00:10:33.600 --> 00:10:40.330
Lacan, who was drawing on Freud,
and Hegel, you know, so Hegelian

00:10:40.331 --> 00:10:45.300
dialectics, how does history work,
how does the human consciousness work,

00:10:45.300 --> 00:10:49.320
so he's trying to address those sort
of two problems at the same time.

00:10:49.496 --> 00:10:53.499
So, and he's putting out books at a
huge clip, it's impossible to summarize

00:10:53.499 --> 00:10:55.009
his output in any meaningful way.

00:10:55.299 --> 00:11:02.105
What I get attracted to is when I hear
other people reporting back with

00:11:02.105 --> 00:11:07.485
a sense of excitement, and, like, I had a
breakthrough on this, my understanding

00:11:07.485 --> 00:11:11.555
of this thing, and it was reading this.
So I keep hearing smart people say

00:11:11.575 --> 00:11:16.042
interesting, provocative things that
get me thinking, and have, like, long

00:11:16.092 --> 00:11:19.282
complicated discussions about it, and
I don't follow all of it. But then their

00:11:19.282 --> 00:11:23.672
source material is Žižek, so in the end it
doesn't matter to me all that much what

00:11:23.682 --> 00:11:29.702
he says. It's a universe of discourse that
he's created, and I'm part of that, and

00:11:29.702 --> 00:11:31.372
I'm interested in being part of that.

00:11:31.479 --> 00:11:36.939
And what he does, what he seems to do for
thousands of people, is just provide

00:11:37.098 --> 00:11:38.488
pretty much the opposite of what you said.

00:11:39.028 --> 00:11:43.048
Those people would say, he really
gives me this no nonsense, cuts

00:11:43.158 --> 00:11:47.812
through, he attacks wokeism more
effectively than anyone on the right.

00:11:48.417 --> 00:11:51.167
And he attacks all the sacred
cows of the left more effectively

00:11:51.167 --> 00:11:53.097
than anyone on the right.

00:11:53.947 --> 00:11:57.857
And, for example, one of the famous
things where he was misunderstood was

00:11:58.197 --> 00:12:00.357
explaining the value of racist jokes.

00:12:00.488 --> 00:12:00.758
Right?

00:12:00.963 --> 00:12:04.453
And this is one that I'm not going
to attempt to do because I don't

00:12:04.453 --> 00:12:06.043
necessarily know if I agree with him.

00:12:06.438 --> 00:12:09.058
But, it's a powerful
argument that he makes.

00:12:09.468 --> 00:12:14.188
Or another one: he said once, Oh,
people think I disagree with,

00:12:14.238 --> 00:12:15.858
you know, the gender movement, right?

00:12:15.939 --> 00:12:19.339
And, I've maybe not made
my position clear enough.

00:12:19.399 --> 00:12:21.269
He said, I'm interested
in a common morality.

00:12:21.269 --> 00:12:23.349
There's nothing that complicated
about what I'm saying.

00:12:23.869 --> 00:12:27.289
And I am 100 percent in solidarity
with the gender movement.

00:12:27.779 --> 00:12:30.199
I just think they don't go far enough.

00:12:31.219 --> 00:12:33.929
They say they want freedom
and they need to get it.

00:12:34.334 --> 00:12:37.114
Like, others should allow
me to define my gender.

00:12:37.794 --> 00:12:42.504
Of course they should, but you should
be asking for so much more than that.

00:12:42.864 --> 00:12:46.854
So you're wrong in the sense
that your target is too small.

00:12:46.904 --> 00:12:47.654
Gender movement.

00:12:47.804 --> 00:12:48.454
Go bigger.

00:12:49.324 --> 00:12:52.894
You know, so he's just, he's all
about prodding and pushing and

00:12:52.894 --> 00:12:55.874
just telling everyone constantly
go bigger, go bigger, go bigger.

00:12:56.994 --> 00:12:57.524
Joe: I like that.

00:12:57.544 --> 00:13:01.954
What I've observed in the culture
is that people sort of once

00:13:02.629 --> 00:13:07.589
filtered their ideas through a news
outlet like a New York Times or read

00:13:07.589 --> 00:13:10.319
the Age in Melbourne and it's kind
of, everyone's kind of generally

00:13:10.319 --> 00:13:13.519
reading the same thing, and it's been
replaced with—

00:13:13.519 --> 00:13:15.589
Sam: And our gurus were those boring
columnists that we would read every

00:13:15.589 --> 00:13:15.619
Joe: week.

00:13:15.619 --> 00:13:16.299
I think so.

00:13:16.359 --> 00:13:18.539
It's been replaced with individuals.

00:13:18.979 --> 00:13:22.889
So, like, at one point Žižek debated
Jordan Peterson but they're two

00:13:22.889 --> 00:13:27.869
sides of the same coin. Like, you put,
yeah, like you and people that you're

00:13:27.869 --> 00:13:32.529
listening to put a high value on the
utterances of this one individual.

00:13:33.079 --> 00:13:33.549
Right?

00:13:33.669 --> 00:13:35.699
I look at it and think,
wow, this is nonsense.

00:13:35.979 --> 00:13:41.619
And then, and then another person comes
along and puts a very, a lot of people

00:13:41.619 --> 00:13:44.409
put a very high value on Jordan Peterson.

00:13:44.899 --> 00:13:47.779
But then he says other
things which are nonsense.

00:13:47.829 --> 00:13:48.079
Well, he's

00:13:48.139 --> 00:13:49.989
Sam: one, he's one great
thing Jordan Peterson said.

00:13:50.384 --> 00:13:53.104
Really resonated with me, and this
was fairly early on, but I was

00:13:53.104 --> 00:13:55.144
already aware of the many problems.

00:13:55.874 --> 00:13:57.534
I was getting a sense of
where it was all gonna go.

00:13:58.014 --> 00:14:01.564
But I must admit, I'm like, man, this
is a brilliant quote, and I don't

00:14:01.614 --> 00:14:03.204
mind sharing it and attributing it.

00:14:04.229 --> 00:14:07.679
It's your duty to make your
children acceptable to others.

00:14:09.109 --> 00:14:12.479
And I'm like, yeah, yeah, you're
a fascist, you're a weirdo,

00:14:12.519 --> 00:14:15.059
you're a reactionary, but there
is some truth in that, you know?

00:14:15.339 --> 00:14:18.839
There's no use pretending that if you
let your kids be anything and everything

00:14:18.839 --> 00:14:22.129
and you don't tell them there's any
need to be accountable for anything and

00:14:22.129 --> 00:14:26.009
don't let anyone tell you anything, it's
like, well, then it's not going to work.

00:14:26.209 --> 00:14:26.469
Yeah.

00:14:26.959 --> 00:14:27.499
He's right.

00:14:27.729 --> 00:14:30.329
They do need to be acceptable
to society on some level.

00:14:30.379 --> 00:14:32.869
Joe: But I guess what I'm
saying is, I think it's worse.

00:14:33.334 --> 00:14:38.955
I think it was better when, like, people
had similar sources and they were, they

00:14:38.965 --> 00:14:43.880
had editorial, they had editors, they had
editorial boards, they had... Now that it's

00:14:43.880 --> 00:14:50.350
individuals, it's like you and your Žižek
bros versus Jordan Peterson and his bros.

00:14:50.560 --> 00:14:53.730
They do tend to be men; there's
groups of them all over the internet.

00:14:53.730 --> 00:14:56.550
Žižek has way more female followers
than Jordan Peterson, but anyway.

00:14:56.610 --> 00:14:59.610
I just went and had a look this
week at the ones that I've looked

00:14:59.610 --> 00:15:04.476
at before, like Yuval Noah Harari,
Douglas Murray is another one.

00:15:04.526 --> 00:15:10.086
You know, Russell Brand has tried to be
that, but because he's always been so

00:15:10.136 --> 00:15:15.626
dodgy, there's always, I don't know, like,
he's one who tried to go even beyond guru

00:15:15.846 --> 00:15:18.126
and become like a, uh, a messiah, right?

00:15:18.126 --> 00:15:18.926
Oh, a hundred percent.

00:15:18.976 --> 00:15:22.186
So, but it's amazing how far they get.

00:15:22.196 --> 00:15:25.216
They get all, they're
all, they're all thriving.

00:15:25.276 --> 00:15:25.406
And

00:15:25.406 --> 00:15:28.386
Sam: the thing that brings them
down is either a financial scandal,

00:15:28.486 --> 00:15:32.238
a sex scandal, or people just
eventually go, just move on.

00:15:32.398 --> 00:15:32.908
Oh, but see,

00:15:33.128 --> 00:15:33.388
Joe: yeah,

00:15:33.658 --> 00:15:33.788
Ali: yeah.

00:15:33.788 --> 00:15:37.608
They lose their relevancy, but
it's the thing at the moment with

00:15:37.678 --> 00:15:40.578
everybody with access, well, you
know, most people with access to a

00:15:40.578 --> 00:15:43.078
phone can create their own platform.

00:15:43.088 --> 00:15:43.588
That's right.

00:15:43.718 --> 00:15:47.538
And so, you know, anyone could start a
TikTok or YouTube channel off their phone

00:15:47.538 --> 00:15:49.948
and start, you know, sharing their ideas.

00:15:49.948 --> 00:15:51.598
And so, yeah.

00:15:51.618 --> 00:15:54.968
Whereas before, if it was coming from
a place of, say, someone who's been

00:15:55.403 --> 00:16:00.053
educated to become a journalist, to then
write in a column who's, you know, coming

00:16:00.053 --> 00:16:03.950
from a really, it's, it's a much more
limited sort of view that you're going

00:16:03.950 --> 00:16:07.100
to get if, and then everybody reading
the same thing versus now when it's just

00:16:07.120 --> 00:16:08.930
anybody, it's a bit of a free for all.

00:16:09.190 --> 00:16:11.330
Sam: Yeah, and I don't,
and that's what you find

00:16:11.330 --> 00:16:11.650
Joe: alarming.

00:16:11.650 --> 00:16:14.740
I don't think we're in a, in a
healthier intellectual environment

00:16:14.750 --> 00:16:15.810
than say our parents were.

00:16:15.810 --> 00:16:16.430
But can we, can,

00:16:16.580 --> 00:16:16.950
Sam: okay, but.

00:16:17.375 --> 00:16:20.315
Can we settle, can we bring
reality into it for a second?

00:16:20.685 --> 00:16:23.685
So back in the good old days when
there were editors and consensus

00:16:23.685 --> 00:16:27.065
reality was in better shape, and in public
epistemology there was no crisis.

00:16:28.145 --> 00:16:28.895
I disagree.

00:16:29.005 --> 00:16:31.725
There were multiple crises throughout
the 20th century, but we're just

00:16:31.725 --> 00:16:33.205
forgetting about all of them for now.

00:16:33.565 --> 00:16:34.415
We didn't live through them.

00:16:34.645 --> 00:16:35.595
We're living through this one.

00:16:36.075 --> 00:16:39.405
Now, do I agree there's
a crisis epistemologically?

00:16:39.405 --> 00:16:39.725
Yeah.

00:16:39.935 --> 00:16:41.585
And I'm absolutely fine with it.

00:16:41.665 --> 00:16:44.155
I think there has to be an
epistemological crisis because

00:16:44.165 --> 00:16:46.015
the previous shit isn't cutting

00:16:46.015 --> 00:16:46.135
Joe: it.

00:16:46.185 --> 00:16:48.935
Can you define epistemology
for the audience?

00:16:49.435 --> 00:16:50.335
Sam: How we know.

00:16:51.085 --> 00:16:54.905
So, you know, sort of the, some
of the two, the two basic sort of

00:16:54.905 --> 00:16:56.375
problems in philosophy, you know.

00:16:57.265 --> 00:17:00.225
The problem of existence and the problem
of knowledge, and you know, so Descartes

00:17:00.455 --> 00:17:03.705
deals with both of those in Meditations
on First Philosophy, and he decides that

00:17:03.705 --> 00:17:05.605
God solves both problems, for example.

00:17:05.985 --> 00:17:09.685
Well, "I think, therefore I am" solves
the problem of existence, you know,

00:17:10.105 --> 00:17:13.425
and philosophies of, different
philosophies deal with these problems

00:17:13.425 --> 00:17:15.685
in different ways, different religions
deal with them in different ways.

00:17:16.320 --> 00:17:20.850
And people have expressed the moment we're
living in as an epistemological crisis.

00:17:20.860 --> 00:17:23.450
A crisis in how we know.

00:17:23.710 --> 00:17:25.840
Others have even said it's
an ontological crisis.

00:17:25.890 --> 00:17:28.780
People are doubting their
own existence on some level.

00:17:28.780 --> 00:17:33.630
And social media is a way of,
we're sort of praying that others

00:17:33.630 --> 00:17:34.950
will validate our existence.

00:17:35.150 --> 00:17:39.361
And I find all of that does not
induce the kind of moral panic in

00:17:39.361 --> 00:17:40.631
me that it does in other people.

00:17:42.131 --> 00:17:46.381
To be honest, I was experiencing moral
panic before consensus reality broke

00:17:46.381 --> 00:17:49.851
down, and my panic was based on the
fact that consensus reality just was

00:17:49.861 --> 00:17:52.031
not keeping up with real reality.

00:17:52.261 --> 00:17:55.696
And so, with the New York Times,
for example, the paper of record,

00:17:56.446 --> 00:17:59.106
has never gotten it wrong, apparently,
or just only with small things, and

00:17:59.106 --> 00:18:00.306
they've always issued corrections.

00:18:00.446 --> 00:18:01.706
Complete bullshit.

00:18:02.476 --> 00:18:04.606
New York Times were
all the way on Vietnam.

00:18:04.926 --> 00:18:10.766
So, that conflict... basically, their
support was instrumental in that

00:18:10.766 --> 00:18:12.266
conflict lasting as long as it did.

00:18:12.606 --> 00:18:17.877
They defended it and protected it in so
many ways, and were constantly on the

00:18:17.877 --> 00:18:23.497
record as this is a justified war, it's
a viable war, and, you know, we will

00:18:23.497 --> 00:18:25.167
prevail and there will be a good result.

00:18:25.410 --> 00:18:27.651
I think we can all agree
that just isn't so.

00:18:27.651 --> 00:18:29.352
So that prompts an uplift.

00:18:29.352 --> 00:18:33.435
The penny drops for the public
about every 20 or 30 years.

00:18:34.025 --> 00:18:36.806
Oh, all the big commentators got it wrong.

00:18:36.932 --> 00:18:39.412
And everyone looks around and
goes, well, who should we listen

00:18:39.412 --> 00:18:41.212
to then, if not the New York Times?

00:18:41.652 --> 00:18:48.062
And now what about Iraq? 2001, Twin
Towers, the Pentagon says: perfect.

00:18:48.132 --> 00:18:50.822
We had some plans in the top
drawer for just this occasion.

00:18:51.042 --> 00:18:52.862
Let's invade two countries at once.

00:18:53.212 --> 00:18:56.322
Afghanistan, not that hard
to get that over the line.

00:18:56.832 --> 00:18:58.182
Iraq, much, much harder.

00:18:58.472 --> 00:18:59.392
Let's distort.

00:19:00.702 --> 00:19:04.182
Uh, let's really put our thumb on the
scales here at the UN and with the

00:19:04.182 --> 00:19:05.342
mass media and all the rest of it.

00:19:05.342 --> 00:19:05.872
Well, guess what?

00:19:05.872 --> 00:19:07.812
You didn't need to tell CNN to support it.

00:19:08.132 --> 00:19:09.232
They were happy to queue up.

00:19:09.672 --> 00:19:14.172
So the so-called left-wing
media, absolute nonsense.

00:19:14.202 --> 00:19:15.162
They totally backed it.

00:19:15.162 --> 00:19:15.882
They're not left-wing media.

00:19:15.882 --> 00:19:16.682
They're corporate media.

00:19:17.132 --> 00:19:22.662
Fox, CNN, NBC, all the big
papers, all queued up to support

00:19:22.662 --> 00:19:24.872
it 100 percent of the way.

00:19:25.697 --> 00:19:28.797
All the way with Bush, just like in 'Nam.

00:19:29.187 --> 00:19:31.267
And then all those veterans
went, Yeah, we're off to get

00:19:31.267 --> 00:19:32.347
the weapons of mass destruction.

00:19:32.477 --> 00:19:34.417
We're off to liberate
the women of Afghanistan.

00:19:34.517 --> 00:19:37.107
Then they all come home, busted
and fucked up, and go, well,

00:19:37.107 --> 00:19:38.467
all those people lied to us.

00:19:38.660 --> 00:19:39.370
Then what do they do?

00:19:39.610 --> 00:19:41.500
Trump comes along and says, Fake news.

00:19:41.550 --> 00:19:43.090
And they go, yeah, damn right.

00:19:43.740 --> 00:19:44.280
It is.

00:19:45.120 --> 00:19:46.020
There's no denying it.

00:19:46.070 --> 00:19:46.990
It was fake news.

00:19:47.020 --> 00:19:48.610
There's no fucking denying it.

00:19:49.130 --> 00:19:50.380
Trump didn't cause this problem.

00:19:50.380 --> 00:19:51.270
He exploited it.

00:19:51.270 --> 00:19:53.270
Yeah, but a broken clock

00:19:53.270 --> 00:19:54.260
Ali: is right twice a day.

00:19:55.140 --> 00:19:59.020
Sam: Those broken men turned to fascism,
Joe, and that's the situation we're in now.

00:19:59.080 --> 00:19:59.670
But is that

00:19:59.720 --> 00:20:04.974
Ali: sort of in, like in a post sort
of fake news world where that is, The

00:20:04.974 --> 00:20:09.294
information that, yeah, you cannot be
trusted that we, you know, we get whether

00:20:09.294 --> 00:20:11.234
even from mainstream news sources.

00:20:11.234 --> 00:20:11.784
Yes.

00:20:11.834 --> 00:20:15.054
How does that play out though now
with like, when you, we've got

00:20:15.054 --> 00:20:18.684
another two wars on two fronts in
Gaza and Ukraine and like, Yeah.

00:20:19.174 --> 00:20:21.374
Sam: And the mainstream media
dropping the ball on this one too.

00:20:21.434 --> 00:20:21.594
Yeah.

00:20:22.219 --> 00:20:26.559
Ali: So, so why did, so we
gravitate towards these outliers.

00:20:26.579 --> 00:20:27.679
Joe: Go to individuals.

00:20:27.729 --> 00:20:30.879
And then what happens it's so it's
more flawed than the New York Times.

00:20:30.879 --> 00:20:33.405
But the individuals, you know,
you can trust Brian, trust Brian.

00:20:33.405 --> 00:20:36.329
I can't trust anyone, but I
can trust Glenn Greenwald.

00:20:36.329 --> 00:20:40.729
Then Glenn Greenwald goes weirdly pro
Russia and then at what point do you go?

00:20:40.939 --> 00:20:41.819
He's gone full MAGA.

00:20:42.009 --> 00:20:44.079
Oh, how do I get off him?

00:20:44.079 --> 00:20:48.449
Because he was the one person because of
what Sam said about the mainstream media.

00:20:48.654 --> 00:20:49.674
Well, Glenn was the guy.

00:20:49.684 --> 00:20:51.494
All I can trust is this one guy.

00:20:51.534 --> 00:20:52.744
Well, you have to have a varied diet.

00:20:52.894 --> 00:20:54.324
But, like, I was going

00:20:54.324 --> 00:20:58.164
Ali: to say, with all gurus, or like,
and you see if, by extension, like

00:20:58.164 --> 00:21:01.214
a cult leader, when it becomes like
a cult, it's that sunk cost fallacy.

00:21:01.224 --> 00:21:01.464
Yeah.

00:21:01.554 --> 00:21:03.344
It's like, I have invested so much.

00:21:03.344 --> 00:21:03.844
Yes.

00:21:03.874 --> 00:21:06.684
And I feel I've invested so much in
this person, and they've been right

00:21:06.754 --> 00:21:07.994
so long, they will get it right.

00:21:08.044 --> 00:21:10.554
It's just, you know, and
you have your blinkers on.

00:21:10.584 --> 00:21:11.334
Yeah, it's awful.

00:21:11.344 --> 00:21:14.564
You lose the ability to think about
it critically, even when they fuck up.

00:21:14.864 --> 00:21:16.499
And so, And that's what it is.

00:21:16.499 --> 00:21:17.499
It's this sunk cost fallacy.

00:21:17.499 --> 00:21:18.169
It's the same.

00:21:18.339 --> 00:21:18.579
Yeah.

00:21:18.579 --> 00:21:23.059
In a, in a cult setting as it is when we
deify the, you know, people who, you know,

00:21:23.069 --> 00:21:27.609
with these ideas that, you know, You know,
it's like, well, how could they be wrong?

00:21:27.609 --> 00:21:29.609
Like, they were right about all
these other things, or like,

00:21:29.609 --> 00:21:30.749
I'm going to keep persisting.

00:21:31.069 --> 00:21:31.459
Joe: That's right.

00:21:31.459 --> 00:21:34.859
Yeah, and then what often happens
is people go outside their area of

00:21:34.859 --> 00:21:39.079
expertise, like I say, a Noam Chomsky,
or there's plenty of them, and they

00:21:39.079 --> 00:21:42.279
go not only beyond their area of
expertise, they go way beyond, and

00:21:42.279 --> 00:21:45.539
they stop talking about their area of
expertise, and all they talk about is

00:21:45.549 --> 00:21:47.189
something that they're not an expert in.

00:21:47.279 --> 00:21:47.609
Okay.

00:21:47.629 --> 00:21:50.949
And that happens, and so I can look
up people whose books I've read, say

00:21:50.949 --> 00:21:55.204
like Steven Pinker, and get his latest
thoughts on the latest things and at

00:21:55.204 --> 00:21:58.834
least I've read a couple of his books
So I have some idea how his mind works,

00:21:58.834 --> 00:22:04.554
which is good, but it doesn't make him
infallible. No, but I had the experience.

00:22:04.574 --> 00:22:07.684
Sam: Oh, he's he's a he's he's a lowly.

00:22:07.714 --> 00:22:12.884
It's a bloody He's a chum of old
mate in the prison, you know Epstein

00:22:13.294 --> 00:22:18.847
and he is a sus cool liberal guy who
believes in freedom and progress, but

00:22:18.847 --> 00:22:24.592
is actually Pretty much standing by,
while fascism takes over his country.

00:22:24.682 --> 00:22:26.882
And Steven Pinker is useless, that guy.

00:22:26.892 --> 00:22:27.562
He's worse than

00:22:27.562 --> 00:22:27.822
Joe: useless.

00:22:27.852 --> 00:22:31.082
Is one, is a good example,
but there's a lot of them.

00:22:31.712 --> 00:22:37.132
See, I had to reckon with the fact
that in 2018, 2019 His pointless

00:22:37.132 --> 00:22:41.782
Sam: liberalism is what enables fascism to
thrive, because it's full of lies and half

00:22:41.782 --> 00:22:41.832
Joe: truths.

00:22:41.932 --> 00:22:45.242
I read a bunch of different public
intellectuals in the last week.

00:22:45.386 --> 00:22:49.146
and his latest interview, I read
that, it actually still made more

00:22:49.146 --> 00:22:50.496
sense than a lot of the other stuff.

00:22:50.556 --> 00:22:50.996
Sure.

00:22:51.086 --> 00:22:54.076
I, I read Yeah, your

00:22:54.076 --> 00:22:56.406
Sam: no opinion thing, or
this guy, or that guy, you've

00:22:56.406 --> 00:22:57.626
followed these individuals too.

00:22:57.626 --> 00:22:59.156
And I guess that's what you're saying.

00:22:59.366 --> 00:23:03.566
Joe: I'm saying I've done it, and
it's mostly been very bad for me.

00:23:05.316 --> 00:23:09.826
You know, the safest place for me if
I'm going to trust one person's take

00:23:09.826 --> 00:23:15.386
on things, honestly the safest and
calmest place is probably Matt Yglesias.

00:23:16.726 --> 00:23:20.426
It's just a center left Democrat
who wants Biden to get re elected,

00:23:20.436 --> 00:23:25.786
doesn't want the end of the world,
whatever, but I've stopped paying

00:23:25.896 --> 00:23:27.066
him for information as well.

00:23:27.086 --> 00:23:27.976
Noah Smith's different.

00:23:27.986 --> 00:23:30.846
Noah Smith, like, wants a war with China.

00:23:31.876 --> 00:23:36.186
Like once it became obvious how excited he
gets about military spending, you're like,

00:23:36.226 --> 00:23:38.956
all right, I can't, I can't read this.

00:23:39.066 --> 00:23:40.256
And he's a classic example

00:23:40.406 --> 00:23:42.216
Sam: where he's

00:23:42.216 --> 00:23:46.746
Joe: trained as an economist,
but what gets him the most clicks

00:23:46.766 --> 00:23:50.356
is when he theorizes about China
starting a war with America.

00:23:50.406 --> 00:23:52.066
So he starts writing
about that all the time.

00:23:52.626 --> 00:23:55.236
But he's an economist, like he's
not in a foreign policy expert.

00:23:56.086 --> 00:23:58.766
So I'm constantly trying to
triangulate these people.

00:23:58.776 --> 00:24:03.276
So then I go and look at like, John
Mearsheimer, and him talking about

00:24:03.366 --> 00:24:07.426
realist politics and how you can
understand rationally what Russia's done.

00:24:07.866 --> 00:24:12.136
But what I wanted to talk about was the
one positive of looking to individuals

00:24:12.136 --> 00:24:18.356
was 2018, 2019, my side of politics,
the left, was lying to me more and more

00:24:18.356 --> 00:24:22.894
about climate change, particularly with
the, Extinction Rebellion movement saying

00:24:23.094 --> 00:24:24.494
humans are going to become extinct.

00:24:24.794 --> 00:24:27.694
And then basically you can
just pick up the Guardian and

00:24:27.704 --> 00:24:29.044
be told it's kind of that,

00:24:29.064 --> 00:24:29.424
Sam: right?

00:24:29.514 --> 00:24:32.434
I didn't say, I never said
humans were going to be extinct.

00:24:32.434 --> 00:24:33.774
Joe: You're not in Extinction
Rebellion as far as

00:24:33.774 --> 00:24:34.109
Sam: I know.

00:24:34.109 --> 00:24:37.224
But here's the thing, we're driving
everything else to extinction.

00:24:37.224 --> 00:24:38.364
But this is the thing.

00:24:38.774 --> 00:24:38.974
That's

00:24:38.974 --> 00:24:40.134
Joe: what XR is really about.

00:24:40.134 --> 00:24:41.414
That's not what humans, that's not what.

00:24:41.714 --> 00:24:44.234
Extinction Rebellion are saying,
they're saying humans will be extinct.

00:24:44.594 --> 00:24:47.034
Anyway, it's the big
lie on the left, right?

00:24:47.324 --> 00:24:48.924
That climate change will kill us all.

00:24:49.074 --> 00:24:49.971
And do you rank

00:24:49.971 --> 00:24:54.304
Sam: that lie as equally
pernicious as election?

00:24:54.384 --> 00:24:54.464
For

00:24:54.654 --> 00:24:58.604
Joe: my mental health, for my
mental health as an individual,

00:24:58.604 --> 00:25:03.374
it was really damaging, right?

00:25:03.844 --> 00:25:08.224
So I finally, and I'll never know how I
got out of the ideological straitjacket

00:25:08.224 --> 00:25:12.719
I was in, but I finally had a look at
like a Michael Schellenberger and a Bjorn

00:25:12.719 --> 00:25:15.099
Lomborg and what, what are they saying?

00:25:15.109 --> 00:25:18.809
I know I'm not allowed to look at them
because I'm a lefty, but I had a look.

00:25:19.379 --> 00:25:23.339
And suddenly this whole world opened
up of people who weren't lying to me.

00:25:23.399 --> 00:25:24.009
Yeah, yeah, yeah.

00:25:24.009 --> 00:25:24.409
Right?

00:25:24.449 --> 00:25:25.249
Sam: And we can build nuclear

00:25:25.249 --> 00:25:29.019
Joe: power plants that rocked my
world because my side of politics

00:25:29.019 --> 00:25:31.519
was telling me the big lie,
the other side of politics was

00:25:31.639 --> 00:25:36.439
Sam: Stop calling it the, the big
lie has only one use and it's that

00:25:36.459 --> 00:25:38.499
we should let Trump run the world.

00:25:38.824 --> 00:25:39.674
That's the big lie.

00:25:39.764 --> 00:25:40.974
There's more than one big lie.

00:25:40.984 --> 00:25:42.654
No, that's the only one
worth talking about.

00:25:42.726 --> 00:25:46.564
It's actually not even worth talking about
in itself, but this idea that climate that

00:25:46.874 --> 00:25:50.959
Joe: Extinction Rebellion is engaging
in the same the big lie in this,

00:25:50.999 --> 00:25:54.139
in the Orwellian sense, if you say
something that's outrageous enough,

00:25:54.139 --> 00:25:58.219
it's probably not so hard to believe,
but why would they say it unless it was

00:25:58.219 --> 00:25:58.979
Sam: true, you know?

00:25:59.299 --> 00:26:00.199
Look, Ali, you were trying to

00:26:00.529 --> 00:26:05.729
Ali: No, I was just saying, like,
why does, why can't you Remove

00:26:05.729 --> 00:26:07.629
yourself from the situation.

00:26:07.939 --> 00:26:11.599
Why does the big light like why does
why can't you see that for what it is?

00:26:11.619 --> 00:26:11.929
And

00:26:12.259 --> 00:26:16.049
Joe: why do you expect I needed these
individuals to to write books that I could

00:26:16.049 --> 00:26:20.259
read or particularly Schellenberger's book
Apocalypse Never, to read it thoroughly

00:26:20.259 --> 00:26:24.859
and look just through the UN's own
climate data and projections and stuff

00:26:24.859 --> 00:26:30.714
So just thoroughly sit down and debunk
15 years of, of left, the left wing, I

00:26:30.724 --> 00:26:32.334
understand what they're doing politically.

00:26:32.334 --> 00:26:34.634
They're trying to raise
awareness and get people alarmed

00:26:34.654 --> 00:26:36.164
so that we act on emissions.

00:26:36.224 --> 00:26:36.864
I get it.

00:26:37.234 --> 00:26:41.554
But it got to a point psychologically
where I was gonna, I was like a teenager

00:26:41.564 --> 00:26:42.774
who didn't think there was a future.

00:26:43.284 --> 00:26:43.624
Right?

00:26:43.684 --> 00:26:44.054
Yeah.

00:26:44.054 --> 00:26:44.334
Sure.

00:26:44.354 --> 00:26:47.154
Ali: But also, but why is
that that everyone else's

00:26:47.214 --> 00:26:48.684
responsibility and not yours?

00:26:49.124 --> 00:26:49.924
What do you mean?

00:26:49.924 --> 00:26:54.054
Like why, like why, why can't
you, like, I mean, obviously

00:26:54.054 --> 00:26:55.134
the issue then is with you.

00:26:56.539 --> 00:26:56.959
Whenever

00:26:56.959 --> 00:26:59.459
Joe: I'm disturbed, the
problem is always in me.

00:26:59.569 --> 00:27:00.729
It's actually not the world's problem.

00:27:00.729 --> 00:27:07.089
But what I'm saying is the
balm that I found was a guru.

00:27:07.369 --> 00:27:10.289
But then Schellenberger, straight
away, off the deep end again, he

00:27:10.289 --> 00:27:15.299
gets obsessed with like, um, right
wing conspiracy theories, and

00:27:15.299 --> 00:27:17.119
obsessed with crime in San Francisco.

00:27:17.769 --> 00:27:21.159
Runs for governor of California like
straight off the deep end, but the

00:27:21.159 --> 00:27:26.649
book still stands as a really good
corrective to that particular decade

00:27:26.649 --> 00:27:28.779
and a half of climate propaganda, right?

00:27:29.169 --> 00:27:31.699
So, but he can't be my
guru because he's a nutter.

00:27:31.809 --> 00:27:32.299
Yeah, right.

00:27:32.509 --> 00:27:34.269
Your Lomborg's probably a nutter.

00:27:34.269 --> 00:27:37.022
Yeah, like You're really good at

00:27:37.022 --> 00:27:40.398
Sam: getting off the train
when you should it seems But

00:27:40.398 --> 00:27:43.769
Joe: all I'm trying to
find is the truth Yeah.

00:27:43.809 --> 00:27:45.809
And I can't get it from
reading the Guardian.

00:27:45.999 --> 00:27:48.589
So that's where my strongly
recommend against it.

00:27:48.879 --> 00:27:53.779
But that's where my, my epistemology
broke down when I would have thought

00:27:54.039 --> 00:27:57.649
like growing up, my dad used to get the
Guardian weekly from the actual, it was

00:27:57.649 --> 00:27:58.559
Sam: probably okay back then.

00:27:58.569 --> 00:27:58.859
Yeah.

00:27:58.859 --> 00:27:59.169
From the

00:27:59.169 --> 00:28:01.639
Joe: actual like news agent.

00:28:01.639 --> 00:28:02.089
Yeah.

00:28:02.319 --> 00:28:05.359
Because it was so much
better quality than the age.

00:28:05.359 --> 00:28:10.559
So I come into this, this age going,
well, I can trust it, but I can't because

00:28:10.559 --> 00:28:11.819
they're just trying to get clicks.

00:28:11.869 --> 00:28:12.219
Right.

00:28:12.219 --> 00:28:12.459
Yep.

00:28:12.469 --> 00:28:13.199
So, yep.

00:28:13.684 --> 00:28:14.904
Capitalism borked it.

00:28:14.984 --> 00:28:15.384
But because

00:28:15.674 --> 00:28:15.874
Ali: I'm safe.

00:28:16.284 --> 00:28:20.284
I mean like even with like, like with
the ABC like I've there's been things

00:28:20.294 --> 00:28:25.524
recently in particularly like some of
the coverage I'm like even that's that's

00:28:25.534 --> 00:28:29.014
always been my safe space or at least
like you know at least I get something you

00:28:29.014 --> 00:28:29.254
Joe: know.

00:28:29.254 --> 00:28:30.514
Some of the coverage of what?

00:28:31.194 --> 00:28:31.684
Sam: You name it.

00:28:31.684 --> 00:28:33.244
I

00:28:33.244 --> 00:28:34.074
Ali: was going to say
particularly the war in

00:28:34.074 --> 00:28:34.504
Joe: Gaza.

00:28:34.784 --> 00:28:37.434
What if I said to you there's
one guy who knows all about

00:28:37.434 --> 00:28:38.134
Sam: the Middle East.

00:28:38.194 --> 00:28:38.754
Immigration.

00:28:38.874 --> 00:28:39.454
Triple J.

00:28:39.454 --> 00:28:45.049
Triple J led a story with We heard
first from a Sky News person about

00:28:45.049 --> 00:28:47.029
immigration and why we need to lock

00:28:47.269 --> 00:28:47.519
Joe: everyone up.

00:28:47.519 --> 00:28:51.469
But Ali, what if, like, if I said to
you, there's, there's one person that

00:28:51.479 --> 00:28:55.699
you can read and watch his YouTube,
read his book or her book, they're never

00:28:55.699 --> 00:28:57.459
women for me, which is instructive.

00:28:58.029 --> 00:29:00.769
And they'll explain the Middle
East to you or give you what you

00:29:00.769 --> 00:29:01.469
want to hear about Palestine.

00:29:01.469 --> 00:29:03.709
Fuck the ABC.

00:29:03.749 --> 00:29:05.229
Stop, stop watching the ABC.

00:29:05.229 --> 00:29:08.599
Just follow this one person and they'll
post every day or every two days.

00:29:08.629 --> 00:29:09.039
But Ali, you're

00:29:09.049 --> 00:29:13.519
Sam: probably getting updates
from like a hundred, 150 sources.

00:29:13.759 --> 00:29:16.489
Ali: Yeah, it's not just the one,
but like, but yeah, no, I've just.

00:29:17.199 --> 00:29:20.229
Yeah, sometimes I would, I mean, I just
remember it was Adam and I were sitting

00:29:20.229 --> 00:29:24.039
there and watching the news the other day
and both of us, both of us were just like,

00:29:24.809 --> 00:29:26.319
Sam: it's useless, like yeah,

00:29:27.409 --> 00:29:29.069
Joe: we were just quite,

00:29:29.569 --> 00:29:30.149
Ali: yeah, it was really

00:29:30.149 --> 00:29:30.559
Sam: appalling.

00:29:30.559 --> 00:29:31.759
It's an awful program.

00:29:31.809 --> 00:29:32.149
Yeah.

00:29:32.349 --> 00:29:32.699
Yeah.

00:29:32.709 --> 00:29:33.809
It's the worst show on TV.

00:29:34.259 --> 00:29:38.419
Ali: And yeah, it just, I don't
know, it really, yeah, my, I suppose,

00:29:38.419 --> 00:29:42.149
yeah, your trusted gurus when
they, they disappoint you or they

00:29:42.149 --> 00:29:42.579
Joe: fail you.

00:29:42.579 --> 00:29:43.871
So you've never had one though?

00:29:43.871 --> 00:29:44.086
Yeah.

00:29:44.086 --> 00:29:46.883
Sam: Well, I'd say, I'd say Oh,
the ABC was like our guru, Ali.

00:29:46.883 --> 00:29:47.744
I see what you're

00:29:47.744 --> 00:29:47.959
Ali: saying.

00:29:47.959 --> 00:29:48.174
Yeah.

00:29:48.174 --> 00:29:48.390
Yeah.

00:29:48.390 --> 00:29:49.896
That used to be my For sure.

00:29:49.896 --> 00:29:50.111
Yeah.

00:29:50.111 --> 00:29:50.326
Absolutely.

00:29:50.326 --> 00:29:53.339
Like as in the way that the
Guardian used to be yours or

00:29:53.349 --> 00:29:54.312
Sam: A kind of institutional guru.

00:29:54.312 --> 00:29:55.382
It's an institutional

00:29:55.382 --> 00:29:55.739
Ali: guru.

00:29:55.739 --> 00:29:55.979
Yeah.

00:29:55.979 --> 00:29:56.376
But it was

00:29:56.376 --> 00:29:58.629
Joe: like, it was at least the one
I'll go to my grave saying that

00:29:58.629 --> 00:30:00.689
was, is better than one individual.

00:30:00.779 --> 00:30:01.249
Oh no, no.

00:30:01.319 --> 00:30:01.589
I'm all

00:30:01.589 --> 00:30:02.419
Sam: about institutions.

00:30:02.419 --> 00:30:02.729
Yeah.

00:30:02.729 --> 00:30:02.779
Yeah.

00:30:02.779 --> 00:30:07.574
But at the moment Most of them are failing
or not fit for purpose in some way, even

00:30:07.574 --> 00:30:10.724
if they're surviving really well, they're
not serving us correctly, and there's an

00:30:10.724 --> 00:30:14.474
awful lot of institutions that need to
be reformed, and there's others that will

00:30:14.504 --> 00:30:15.964
just naturally collapse and be replaced.

00:30:15.964 --> 00:30:16.254
But you have a

00:30:16.254 --> 00:30:18.614
Joe: look at the ABC and you go,
alright, well that's kind of like

00:30:18.654 --> 00:30:20.204
the mainstream, vaguely normie

00:30:20.234 --> 00:30:23.174
Sam: I'm not suggesting the ABC should be
replaced, but God, it needs some reform.

00:30:23.534 --> 00:30:23.714
Joe: Yeah.

00:30:23.714 --> 00:30:28.359
But you know that you're getting like
some kind of like Yeah, I don't know.

00:30:28.419 --> 00:30:32.569
Compared to reading The Economist,
reading the ABC News world page is

00:30:32.569 --> 00:30:36.379
like, I don't know, a hundredth or
something of what you're getting

00:30:36.409 --> 00:30:38.389
about world news in The Economist.

00:30:38.449 --> 00:30:38.819
You find it

00:30:38.819 --> 00:30:40.669
Sam: quite parochial, the ABC world news.

00:30:41.344 --> 00:30:46.054
Joe: I read it now because I don't, I want
to know much, much less about the world

00:30:46.174 --> 00:30:50.464
after a couple of years of full immersion
with The Economist, I just went, I'm

00:30:50.464 --> 00:30:51.764
going to live in Melbourne for a while.

00:30:51.794 --> 00:30:53.674
Sam: But again, The Economist
is not full immersion,

00:30:53.674 --> 00:30:56.004
Joe: like full immersion
from the center, right?

00:30:56.084 --> 00:30:56.884
Pro capitalists.

00:30:57.324 --> 00:30:57.684
Sam: Yeah.

00:30:57.694 --> 00:30:59.304
Also, I would direct your attention.

00:30:59.304 --> 00:31:03.200
Like I said this before, to, I've
come across like countless interviews

00:31:03.200 --> 00:31:07.440
that have really given me a much,
much better feeling of like, I

00:31:07.440 --> 00:31:08.260
think I actually have a much

00:31:08.310 --> 00:31:11.539
better understanding of, not perfect
understanding of what's going on, but

00:31:11.539 --> 00:31:15.549
it's really shifted me away from what
I'm getting in the information space.

00:31:15.879 --> 00:31:19.599
So the New Books Network, I've said
this before, they've got like, I don't

00:31:19.599 --> 00:31:25.389
know, 150 channels, like, you know,
genocide studies, South Asian studies,

00:31:25.439 --> 00:31:27.639
like every single academic discipline.

00:31:27.964 --> 00:31:32.344
And it's a fire hose of new books, and you
can just, and they're all academic books

00:31:32.344 --> 00:31:35.914
on a single topic by a single author,
sometimes they're edited volumes, but the

00:31:35.914 --> 00:31:41.734
point is, every last person there is an
expert, and they've focused on this one

00:31:41.734 --> 00:31:45.564
area for at least a few years, and they're
coming from a disciplinary background,

00:31:45.914 --> 00:31:50.444
and you can evaluate the disciplinary
background, you can evaluate what they

00:31:50.444 --> 00:31:54.554
say, but here's the thing, you don't have
to take any one person for it, because

00:31:54.804 --> 00:32:00.519
in the next week, you could listen to
uh, 50 interviews with experts on any

00:32:00.519 --> 00:32:05.389
given topic from the New Books Network
and your sophistication of understanding

00:32:05.409 --> 00:32:06.999
will have gone up a thousand,

00:32:06.999 --> 00:32:08.172
Joe: two thousand percent.

00:32:08.172 --> 00:32:10.909
Is your favorite stuff the Žižek stuff?

00:32:10.909 --> 00:32:14.069
In terms of pleasurable, like,
ah, that information's enjoyable.

00:32:14.359 --> 00:32:16.049
Oh, Žižek, you've done it again.

00:32:16.459 --> 00:32:18.769
That kind of stuff you only
get from your guru, right?

00:32:19.159 --> 00:32:19.629
Uh, sometimes.

00:32:19.679 --> 00:32:21.999
You don't get that from
some bland academic you've

00:32:21.999 --> 00:32:22.369
Sam: never heard of.

00:32:22.369 --> 00:32:24.509
Look, to be honest, yeah, I
think he is operating in the

00:32:24.509 --> 00:32:26.319
mode of entertainment at times.

00:32:26.609 --> 00:32:26.979
Yeah.

00:32:26.979 --> 00:32:28.049
You want some information

00:32:28.049 --> 00:32:30.719
Ali: from your guru, you want to, yeah.

00:32:30.799 --> 00:32:31.049
Sam: Yeah.

00:32:31.139 --> 00:32:34.319
And I think a lot of information,
a lot of informational content is

00:32:34.319 --> 00:32:39.059
actually mainly entertainment after
all, is consumed as entertainment.

00:32:39.089 --> 00:32:42.299
So like some of my favorite pods are
very informative, but I'm actually

00:32:42.299 --> 00:32:44.119
there to hang out with those people.

00:32:44.939 --> 00:32:46.419
And they can talk about
whatever they want.

00:32:46.769 --> 00:32:47.869
And like, that's the other thing.

00:32:48.519 --> 00:32:53.969
We get the burden of individual
choice is like, A, unsustainable,

00:32:54.029 --> 00:32:56.289
B, non existent in a real sense.

00:32:56.479 --> 00:32:59.529
Like, I keep saying this on the show,
like, we have to live as individuals,

00:32:59.529 --> 00:33:01.119
but we can't live as individuals.

00:33:01.119 --> 00:33:04.599
Like, we, we can only survive
collectively, and we have

00:33:04.599 --> 00:33:06.349
responsibilities to others and vice versa.

00:33:06.839 --> 00:33:09.169
And that's actually the
good news, not the bad news.

00:33:09.869 --> 00:33:17.669
And I can't shoulder alone this
responsibility of deciding what to trust.

00:33:17.789 --> 00:33:19.889
It's not a job for one person to do by

00:33:19.889 --> 00:33:20.499
Joe: themselves.

00:33:20.539 --> 00:33:20.899
Yeah.

00:33:21.369 --> 00:33:24.399
I mean, I do, the best I can
come up with is read whole books.

00:33:24.462 --> 00:33:26.472
So I've read a couple of
Douglas Murray's whole books.

00:33:26.482 --> 00:33:30.982
So then when he, whatever he talks about
the most recent thing, I can go, at least

00:33:30.982 --> 00:33:33.632
I know at, at length what this guy thinks.

00:33:33.722 --> 00:33:33.962
Yeah.

00:33:33.962 --> 00:33:35.729
But he talked about, Well, yeah.

00:33:35.909 --> 00:33:39.119
He talked about going to talk to groups
of people, which he's always done.

00:33:39.239 --> 00:33:39.599
Sure.

00:33:39.879 --> 00:33:44.508
And in an auditorium, 10 years ago,
he used to know, He used to be able

00:33:44.508 --> 00:33:48.618
to assume that everyone had read
vaguely similar stuff like whether

00:33:48.618 --> 00:33:51.378
it was the New York Times or you
know He's in he's a right winger.

00:33:51.448 --> 00:33:53.328
So he's working moving
in right wing circles.

00:33:53.328 --> 00:33:55.548
Maybe that all read the
spectator, whatever.

00:33:55.578 --> 00:34:00.398
Yep, right He used to be able to assume
that and he said now I stand in front of

00:34:00.398 --> 00:34:05.098
a crowd of 500 people Like there's almost
no commonality like everyone's coming

00:34:05.098 --> 00:34:07.998
from completely different perspectives
You have no idea what they've been

00:34:07.998 --> 00:34:09.248
reading what they've been watching.

00:34:09.268 --> 00:34:09.698
So he's just

00:34:09.698 --> 00:34:10.078
Sam: saying

00:34:10.743 --> 00:34:15.013
Joe: He says, he goes, I don't, I
can't assume a set of common facts

00:34:15.013 --> 00:34:15.073
Sam: anymore.

00:34:15.073 --> 00:34:17.073
Can't someone, can't
someone make my job easier?

00:34:17.273 --> 00:34:17.853
Well, no, no.

00:34:17.953 --> 00:34:20.643
I just want to trot out a speech and
assume it'll land the same everywhere.

00:34:20.663 --> 00:34:21.103
Joe: No, no.

00:34:21.193 --> 00:34:21.693
What a lazy, lazy man.

00:34:21.693 --> 00:34:25.133
He wasn't saying it, he wasn't even
complaining, he was just sort of saying

00:34:25.143 --> 00:34:27.493
Sam: You've got to respond to every
situation, you've got to be in the moment,

00:34:27.493 --> 00:34:28.083
Joe: dude.

00:34:28.253 --> 00:34:30.783
Yeah, but it's Engaging

00:34:30.783 --> 00:34:31.253
Ali: dialogue.

00:34:31.263 --> 00:34:34.383
Your guru can't keep up with the plurif

00:34:34.763 --> 00:34:35.083
Joe: Yeah.

00:34:35.313 --> 00:34:38.703
I think, I think as a bi I know, I
I think for me as a Pluriferation.

00:34:40.273 --> 00:34:40.533
Ali: Yeah, yeah.

00:34:40.533 --> 00:34:41.673
Of information and gurus.

00:34:42.223 --> 00:34:46.133
Joe: No, but, but what he's pointing to
Yeah, one guy trying to figure it all out.

00:34:46.133 --> 00:34:51.153
Yeah, the lack of agreed upon facts is
very unstable and dangerous, right?

00:34:51.193 --> 00:34:52.893
Like very, very No, but

00:34:52.923 --> 00:34:57.198
Sam: who said Who said agreed upon
facts have to stay agreed upon?

00:34:57.208 --> 00:34:57.828
It's not static.

00:34:57.868 --> 00:35:01.468
Well, I think from my perspective we have
to continually disagree on established

00:35:01.468 --> 00:35:02.658
facts that turn out to be wrong.

00:35:02.668 --> 00:35:04.198
I think from my perspective
Slavery is fine.

00:35:04.238 --> 00:35:07.978
As, uh, Christianity is the
only true religion, etc.

00:35:07.988 --> 00:35:08.908
I mean, come on, man.

00:35:08.968 --> 00:35:09.048
I

00:35:09.048 --> 00:35:12.298
Joe: think from my perspective as a
bipolar person, because my brain can

00:35:12.318 --> 00:35:14.998
organically leave consensual reality Yeah.

00:35:15.048 --> 00:35:17.338
And then no one has a clue
what I'm talking about.

00:35:17.378 --> 00:35:17.718
Nah,

00:35:17.718 --> 00:35:18.618
Sam: I've understood everything.

00:35:19.988 --> 00:35:21.308
Joe: Yeah, because I'm on my meds.

00:35:21.358 --> 00:35:27.168
But like, because, because I can do
that organically, I find it terrifying

00:35:27.178 --> 00:35:30.158
the idea that no one has a fucking
clue what's going on in the world.

00:35:30.198 --> 00:35:34.298
And, and honestly, I don't think right
now, there is a guru that has a clue

00:35:34.348 --> 00:35:36.648
that that can encapsulate it all.

00:35:36.948 --> 00:35:40.198
You know, I'm looking for the simplest.

00:35:40.198 --> 00:35:40.678
Sam Harris

00:35:40.848 --> 00:35:42.258
Sam: actually gets close in some ways.

00:35:42.258 --> 00:35:43.778
The simplest, clearest.

00:35:43.798 --> 00:35:44.868
Your boy, Sam, go back

00:35:44.868 --> 00:35:45.218
Joe: to him.

00:35:45.448 --> 00:35:47.908
No, I've never liked him
as a public intellectual at

00:35:47.908 --> 00:35:48.278
Sam: all.

00:35:48.278 --> 00:35:50.408
I've shit canned him so hard to you.

00:35:50.798 --> 00:35:51.108
I

00:35:51.348 --> 00:35:53.218
Joe: like him on the
meditation stuff, but I

00:35:53.438 --> 00:35:53.768
Sam: don't, yeah.

00:35:53.798 --> 00:35:58.158
Out of all the gurus out there, he is
the least self appointed in a weird way.

00:35:58.158 --> 00:35:59.818
It's so strange of me to say that.

00:35:59.818 --> 00:36:00.068
His ego's

00:36:00.118 --> 00:36:00.448
Joe: too big.

00:36:00.448 --> 00:36:01.868
I don't want his opinions on everything.

00:36:02.428 --> 00:36:03.658
Sam: I've heard him backtrack.

00:36:03.713 --> 00:36:07.990
And I'm like, man, that's all I need
from a public intellectual, in any case.

00:36:07.990 --> 00:36:08.161
I don't

00:36:08.161 --> 00:36:09.373
Joe: want to be a Sam Harris, bro.

00:36:09.443 --> 00:36:13.243
I actually don't want to be a guru, but
what I want is the simplest, clearest,

00:36:13.313 --> 00:36:15.143
most honest explanation of things.

00:36:15.213 --> 00:36:15.553
Sure.

00:36:15.613 --> 00:36:20.683
Ali: But not one person is going to be
able to fulfill that tip for you in the

00:36:20.683 --> 00:36:24.903
same way that not one person is the answer
to anybody's problems in any capacity.

00:36:24.993 --> 00:36:25.143
That's right.

00:36:25.213 --> 00:36:27.243
And not one guru is going to have.

00:36:27.553 --> 00:36:30.353
The answers to every situation
that you could possibly pose to it.

00:36:30.403 --> 00:36:30.553
But

00:36:30.563 --> 00:36:32.523
Joe: is that what a dialectic is, Sam?

00:36:32.523 --> 00:36:32.713
Yes.

00:36:32.733 --> 00:36:35.273
Because I don't understand a
dialectic, but let me give you this.

00:36:35.653 --> 00:36:36.203
Go, go.

00:36:36.203 --> 00:36:40.573
Like, since the Ukraine war started,
I didn't read any John Mearsheimer.

00:36:40.623 --> 00:36:40.953
Yeah.

00:36:41.193 --> 00:36:44.763
And lefties loved John Mearsheimer because
he basically blamed America, right?

00:36:45.573 --> 00:36:48.479
So, then eventually, I, Chomsky
should stick to linguistics.

00:36:48.479 --> 00:36:51.433
Eventually, I'll watch a John
Mearsheimer speech and I'm like, wow.

00:36:51.818 --> 00:36:55.668
This really appeals to me, because
he's explaining something clearly,

00:36:56.108 --> 00:37:00.918
he sees Putin as a rational actor,
which is very reassuring on the question of: is

00:37:00.918 --> 00:37:03.758
the world going to blow up, because
a rational actor is much less likely

00:37:03.758 --> 00:37:05.778
to do that than an irrational madman.

00:37:05.788 --> 00:37:06.058
And we don't

00:37:06.058 --> 00:37:08.608
Sam: need no clash of civilizations
nonsense, no thank you.

00:37:08.698 --> 00:37:14.528
Joe: So, the clarity of Mearsheimer's thought
is very reassuring to me, even though

00:37:14.528 --> 00:37:17.228
it's all bad news, of course, because
of course everything in geopolitics

00:37:17.228 --> 00:37:19.558
is bad news from an American hegemony.

00:37:19.608 --> 00:37:20.018
Of course.

00:37:20.018 --> 00:37:23.488
From his perspective, he
wants America to win, right?

00:37:23.548 --> 00:37:27.928
So it's all bad news, but it's clear
and states are acting rationally still.

00:37:27.938 --> 00:37:31.848
So I want to believe that, but then
because I'm curious, I read more

00:37:31.848 --> 00:37:35.308
and I read critiques saying, John
Mearsheimer gets everything wrong

00:37:35.308 --> 00:37:36.398
about everything all the time.

00:37:37.208 --> 00:37:40.888
And so the guru part kicks in where
I'm like, no, I don't want it.

00:37:40.918 --> 00:37:44.148
Is it a dialectic to
triangulate those things?

00:37:44.168 --> 00:37:44.688
Yes.

00:37:44.828 --> 00:37:48.678
As opposed to my instinct is just like,
all right, how about if I only read

00:37:48.678 --> 00:37:50.158
John Mearsheimer for the next year?

00:37:50.628 --> 00:37:55.368
And I keep thinking that states act
rationally, you know, for a year, imagine

00:37:55.368 --> 00:37:59.188
that, read his book, just watch his,
whatever pops up on YouTube, become

00:37:59.188 --> 00:38:03.438
a foreign policy realist, become, and
what I'm really good at is observing

00:38:03.448 --> 00:38:06.718
John Mearsheimer thought till eventually
I could come on here and I'll give

00:38:06.718 --> 00:38:08.713
you like great Mearsheimer, right?

00:38:08.863 --> 00:38:10.813
But at the moment, I've only
known who he is for two weeks.

00:38:10.913 --> 00:38:11.813
You could be Mearsheimer on the

00:38:11.813 --> 00:38:12.853
Sam: GPT for sure.

00:38:12.863 --> 00:38:15.683
Joe: Triangulating it and it
makes me more and more confused.

00:38:15.683 --> 00:38:18.863
But again, I do it because I'm
curious and I do it because I

00:38:18.953 --> 00:38:22.983
don't trust that I'm getting closer to
reality by just trusting one person.

00:38:22.983 --> 00:38:26.483
Ali: I mean, you can't trust just
one person and we have more and

00:38:26.483 --> 00:38:28.573
more people than ever before.

00:38:28.793 --> 00:38:31.368
What is your favourite.

00:38:31.368 --> 00:38:34.003
We all have different stories.

00:38:34.203 --> 00:38:34.473
But I

00:38:34.983 --> 00:38:38.353
Joe: think what you naturally did as
a saner, healthier person than me,

00:38:38.413 --> 00:38:41.513
is you were able to read The Guardian
for 10 years, but with a grain

00:38:41.513 --> 00:38:44.343
of salt, say yeah but not really.

00:38:44.823 --> 00:38:45.173
Over

00:38:45.173 --> 00:38:45.683
Ali: and over again.

00:38:45.693 --> 00:38:45.783
Yeah.

00:38:45.783 --> 00:38:48.483
The Guardian's actually never really,
I mean, yeah, I mean, occasionally

00:38:48.483 --> 00:38:51.073
I'll read something and be like, jeez,
but, but for the most part it doesn't

00:38:51.073 --> 00:38:52.633
bother me, I don't take it personally.

00:38:52.633 --> 00:38:53.353
I mean, like I said.

00:38:53.853 --> 00:38:56.723
Occasionally I do get let down,
like I got let down by the ABC,

00:38:56.723 --> 00:38:59.353
but like, but it's not the But it

00:38:59.353 --> 00:39:01.283
Joe: never, does it ever
make your stomach flip?

00:39:01.333 --> 00:39:04.163
Does it ever make you
grip the chair in terror?

00:39:04.453 --> 00:39:08.403
Ali: Truthfully, the only time I'm ever
gripped with fear and terror is when it's

00:39:08.403 --> 00:39:13.203
a completely irrational, bonkers sort
of thing that is not based in reality,

00:39:13.203 --> 00:39:14.233
and it's usually when I'm not well.

00:39:14.233 --> 00:39:15.353
So,

00:39:15.733 --> 00:39:17.063
Joe: so like volcanoes?

00:39:17.593 --> 00:39:21.023
Ali: Yeah, or something like some sort
of like Baba Vanga kind of like doomsday

00:39:21.053 --> 00:39:25.213
Nostradamus sort of thing, absolutely
not based in any reality whatsoever,

00:39:25.313 --> 00:39:26.603
Joe: like the Mayan calendar.

00:39:26.873 --> 00:39:27.973
Ali: Yeah, yeah, yeah.

00:39:27.973 --> 00:39:28.623
Like something like that.

00:39:28.623 --> 00:39:29.553
And I'll yeah.

00:39:29.753 --> 00:39:33.513
And I'll have a moment of
absolute despair and paranoia.

00:39:33.513 --> 00:39:33.793
Joe: Yeah.

00:39:34.143 --> 00:39:37.193
So I'm like that with a bit about
things that other people are, and

00:39:37.193 --> 00:39:39.313
Ali: then that's not based
in any sort of reality.

00:39:39.433 --> 00:39:42.058
And it's, it's whereas,
and that's just like.

00:39:42.148 --> 00:39:45.308
me letting my brain play in that
space for a little while until

00:39:45.308 --> 00:39:46.428
I'm like, okay, you, you Yeah,

00:39:46.428 --> 00:39:47.588
Joe: I do it because I'm bored too.

00:39:47.628 --> 00:39:49.308
I scare the shit out of
myself because I'm bored too.

00:39:49.778 --> 00:39:50.808
It's a form of entertainment.

00:39:50.818 --> 00:39:51.168
Sam: Yeah.

00:39:51.328 --> 00:39:51.658
Yeah.

00:39:51.698 --> 00:39:52.348
Look, that's the thing.

00:39:52.348 --> 00:39:57.248
So beware of the feelings, I think,
and we can't always be guided by

00:39:57.258 --> 00:40:00.888
the feelings we have in reaction to
things that we've read or looked at.

00:40:01.338 --> 00:40:06.408
And I actually think The guru
instinct is not entirely misplaced.

00:40:06.448 --> 00:40:08.438
Like we want to trust people and I'm fine.

00:40:08.758 --> 00:40:10.208
I think we can't operate without it.

00:40:10.328 --> 00:40:13.671
So, I think what's helping is
what we're doing right here.

00:40:14.461 --> 00:40:16.421
And I think that's the
best way to triangulate.

00:40:16.941 --> 00:40:19.811
So you can, you bring along
your Mearsheimer, I'll bring

00:40:19.811 --> 00:40:21.781
along, uh, whatever else.

00:40:21.811 --> 00:40:23.231
I don't have anything on hand really.

00:40:23.271 --> 00:40:23.881
And Ali's

00:40:24.511 --> 00:40:28.671
just brought in some good stuff and I
think, you know, useful perspective and

00:40:28.671 --> 00:40:31.261
the triangulation occurs in dialogue.

00:40:31.481 --> 00:40:35.431
I think when we try and triangulate, we
try to get the dialectic going between

00:40:35.431 --> 00:40:40.211
like ourselves and then this and this
multiplicity of media that doesn't work.

00:40:40.451 --> 00:40:45.335
It's sort of like, all I mean by the
dialectic is thing A, that's true.

00:40:45.845 --> 00:40:48.161
Thing B, oh wait, maybe that's true.

00:40:48.411 --> 00:40:49.156
Thing C.

00:40:49.756 --> 00:40:52.456
You know, historical force
meets other historical force.

00:40:52.486 --> 00:40:53.246
What happens?

00:40:53.616 --> 00:40:55.996
It's just, it really,
it's just one plus one

00:40:56.056 --> 00:40:59.286
Joe: equals I don't think, yeah,
unfortunately I, I think I'll go to my

00:40:59.286 --> 00:41:00.836
grave not knowing what a dialectic is.

00:41:00.836 --> 00:41:01.486
It's as simple

00:41:01.486 --> 00:41:03.046
Ali: as But having like, yeah,
just like a friend check in.

00:41:03.046 --> 00:41:04.156
Do you know what a dialectic is?

00:41:04.206 --> 00:41:06.516
We'll just have, yeah, like I think
like Sam's saying, just have someone

00:41:06.516 --> 00:41:09.631
just check in like For the rational.

00:41:09.631 --> 00:41:09.991
I don't know.

00:41:09.991 --> 00:41:10.171
Yes.

00:41:10.171 --> 00:41:10.321
Yeah.

00:41:10.321 --> 00:41:11.821
Like you need, you need like, so

00:41:12.031 --> 00:41:14.041
Sam: yeah, I'm holding this thesis.

00:41:14.071 --> 00:41:14.341
Yes.

00:41:14.341 --> 00:41:16.501
Do you have an antithesis for this?

00:41:16.501 --> 00:41:16.711
Yeah.

00:41:16.711 --> 00:41:17.276
And then they say,

00:41:17.276 --> 00:41:19.381
Ali: yeah, like, am I being, it's like
when you go to a friend and you're

00:41:19.381 --> 00:41:20.791
like, am I being rational about this?

00:41:20.791 --> 00:41:22.381
Am I being exactly like, am I?

00:41:22.501 --> 00:41:23.556
But yeah, and that's the thing,

00:41:23.556 --> 00:41:24.361
Joe: like I'm, it's just having

00:41:24.361 --> 00:41:26.791
Sam: someone just, but you help me
evaluate this situation at work.

00:41:26.791 --> 00:41:29.671
Ali: I'm buffeted, and someone's
opinion that you value, the gurus.

00:41:29.831 --> 00:41:31.331
Then is your friend in
that moment always there?

00:41:31.331 --> 00:41:31.371
But

00:41:31.371 --> 00:41:34.171
Joe: I'm buffing it around and I'm
looking for chances to act too.

00:41:34.211 --> 00:41:37.161
Like I almost went to an Extinction
Rebellion meeting and then all they

00:41:37.161 --> 00:41:41.001
were going to serve there was daal
and I'm like, no, I'm not going.

00:41:41.051 --> 00:41:43.351
That's how close I got to
joining Extinction Rebellion.

00:41:43.351 --> 00:41:43.901
XR, if

00:41:43.901 --> 00:41:46.301
Sam: only you'd serve
cheeseburgers made from proper

00:41:47.271 --> 00:41:48.321
Joe: climate change beef.

00:41:48.901 --> 00:41:52.411
Three years on, it's a millenarian,
you know, death cult.

00:41:52.856 --> 00:41:54.926
But three years before, I
was willing to maybe join it.

00:41:54.936 --> 00:41:59.996
So I'm constantly being buffeted
by this stuff that I look around

00:41:59.996 --> 00:42:02.306
and I'm like, well, everyone else
just kind of stays where they are

00:42:02.776 --> 00:42:04.246
and it doesn't bug them that much.

00:42:04.266 --> 00:42:05.836
And then they just go make a cup of tea.

00:42:06.126 --> 00:42:10.513
I nearly go and join a death cult and then
I turn on them and become a right winger.

00:42:10.513 --> 00:42:12.946
And it's like, I just
watch it happen to myself.

00:42:12.946 --> 00:42:15.187
It's like watching a fucking
leaf be blown around in the wind.

00:42:15.187 --> 00:42:15.970
You can generate electricity

00:42:15.970 --> 00:42:16.196
Sam: from this.

00:42:16.196 --> 00:42:18.616
Ali: I think there's so much of
that is to do with our state of

00:42:18.616 --> 00:42:21.811
mind and all mental health and.

00:42:22.241 --> 00:42:22.631
Žižek

00:42:22.631 --> 00:42:23.801
Sam: would be nodding to all of this.

00:42:24.371 --> 00:42:25.281
My mum would

00:42:25.281 --> 00:42:28.671
Ali: have said to me that her biggest
fear for me would have been that I would

00:42:28.671 --> 00:42:30.501
have joined a cult or something like that.

00:42:30.951 --> 00:42:33.751
That she thought I was so
susceptible to joining a cult or

00:42:33.761 --> 00:42:34.801
running off and joining a cult.

00:42:34.801 --> 00:42:35.131
What, you've just

00:42:35.131 --> 00:42:36.641
Sam: got that credulous instinct?

00:42:36.641 --> 00:42:38.741
Ali: There's something about, like
she just thought that I, that was

00:42:38.741 --> 00:42:41.531
Joe: she I've been told not
that, like not that long ago

00:42:41.531 --> 00:42:43.071
that I could be a cult leader.

00:42:43.071 --> 00:42:43.399
Sam: Well there you go.

00:42:43.399 --> 00:42:43.871
Well yeah, indeed.

00:42:44.861 --> 00:42:47.621
Maybe I've joined the 10,
no, you would rock at that.

00:42:47.891 --> 00:42:50.846
I think Ali would You'd almost be immune,

00:42:50.966 --> 00:42:51.416
Joe: wouldn't you?

00:42:51.416 --> 00:42:57.366
Ali: No, no, no, but I think there
is some sort of, there is something

00:42:57.366 --> 00:43:00.436
lacking or something or some sort
of, that she's been able to sense.

00:43:00.466 --> 00:43:04.176
I can totally see it too, that where I
would completely just abandon reality

00:43:04.206 --> 00:43:07.366
and just throw myself into something
because it just seemed like a fun

00:43:07.376 --> 00:43:08.786
thing to do because of the boredom.

00:43:09.056 --> 00:43:12.606
I think that's something that would
have, and I think that's, you and

00:43:12.606 --> 00:43:15.556
I, we get bored and like, it's like,
you know, I'm just going to play

00:43:15.576 --> 00:43:17.526
with it, whether it, when we were
younger, it's like, I'm just going to.

00:43:17.701 --> 00:43:20.161
You know, when it was drugs, alcohol,

00:43:20.161 --> 00:43:22.211
Joe: even as a kid, just
imagining horrible things.

00:43:22.551 --> 00:43:22.841
Yeah.

00:43:23.031 --> 00:43:23.441
Dangerous

00:43:23.691 --> 00:43:26.041
Sam: ideas, I think it's
like I play video games.

00:43:26.921 --> 00:43:27.071
Well,

00:43:27.101 --> 00:43:28.841
Ali: I remember as a kid,
I'm going to scare the shit

00:43:28.841 --> 00:43:30.711
out of myself, just for fun.

00:43:31.101 --> 00:43:31.561
Sam: Yeah, exactly.

00:43:31.581 --> 00:43:33.051
Let's spin around until I feel sick.

00:43:33.361 --> 00:43:35.771
Joe: I remember as a kid being
mortified, thinking about spontaneous.

00:43:35.911 --> 00:43:36.391
Spontaneous combustion.

00:43:36.651 --> 00:43:37.871
Sam: Oh yeah.

00:43:37.871 --> 00:43:38.711
That one got me.

00:43:38.931 --> 00:43:39.881
Joe: What happened in the eighties?

00:43:39.881 --> 00:43:40.651
Why were we all being

00:43:40.651 --> 00:43:42.331
Ali: told about spontaneous combustion?

00:43:42.901 --> 00:43:45.741
That seemed like a much more
scary thing that was actually a

00:43:45.741 --> 00:43:47.511
viable way to die than it actually

00:43:47.521 --> 00:43:51.081
Sam: When I heard satanic child sacrifice,
I was like, what a load of shit.

00:43:51.321 --> 00:43:52.261
Spontaneous combustion.

00:43:52.311 --> 00:43:53.221
Oh my God.

00:43:53.341 --> 00:43:54.301
That could totally happen.

00:43:54.301 --> 00:43:55.821
Totally happen.

00:43:55.821 --> 00:43:56.581
Joe: Yeah.

00:43:56.581 --> 00:43:58.061
Which actually gets to the root.

00:43:58.211 --> 00:43:58.531
Yes.

00:43:58.581 --> 00:44:01.001
Root cause of the fear is
actually the fear of death.

00:44:01.001 --> 00:44:02.081
Oh, a hundred percent.

00:44:02.091 --> 00:44:02.371
But

00:44:02.521 --> 00:44:04.041
Sam: yeah, like, yeah.

00:44:04.041 --> 00:44:04.711
Joe: Or the neurosis.

00:44:04.711 --> 00:44:05.441
Yeah.

00:44:05.441 --> 00:44:10.561
The only thing that saved me really from,
from being dragged off to some extreme is

00:44:10.561 --> 00:44:12.541
my sense of irony and my sense of humor.

00:44:13.351 --> 00:44:16.976
So, you know, like, but yeah,
I'm at a point now at the

00:44:16.976 --> 00:44:18.696
end of 2023 where I can't.

00:44:19.276 --> 00:44:23.586
Thank God, but I can't find a guru who
can actually explain the world to me.

00:44:23.586 --> 00:44:24.696
It's too chaotic.

00:44:25.007 --> 00:44:26.617
no one, no one can predict it.

00:44:26.617 --> 00:44:27.657
No one can explain it.

00:44:27.767 --> 00:44:29.367
Uh, that's fine.

00:44:29.367 --> 00:44:30.777
That's actually a healthy place to be.

00:44:30.777 --> 00:44:31.567
I think it's healthy.

00:44:31.577 --> 00:44:34.852
Ali: And like, and to place the,
go back to sort of like, yeah, as.

00:44:35.462 --> 00:44:38.332
And I think that's and having like
the area of expertise and putting our

00:44:38.332 --> 00:44:41.522
trust into somebody who's got the area
of expertise and we've talked about

00:44:41.522 --> 00:44:45.092
this before on the show where people
have jumped out of their lane to

00:44:45.092 --> 00:44:48.032
talk on other things and they, that's
where they get derailed, but they're

00:44:48.032 --> 00:44:50.932
sort of playing to the algorithm
and you know, that's what's selling

00:44:50.932 --> 00:44:54.792
Joe: the, Mearsheimer talking on
foreign policy is his area of expertise.

00:44:54.812 --> 00:44:57.292
It's just that he happens to have got a
lot of big things wrong over the years.

00:44:57.372 --> 00:44:58.472
Ali: And so I think, yeah, we

00:44:58.472 --> 00:44:59.272
Joe: need to sort of.

00:44:59.522 --> 00:45:02.882
But if I believed him, I would sleep
better at night because he basically

00:45:02.892 --> 00:45:06.712
said a month ago he doesn't think China
will attack Taiwan anytime soon, which

00:45:06.732 --> 00:45:08.202
is something that keeps me up at night.

00:45:08.342 --> 00:45:08.562
Yeah.

00:45:08.562 --> 00:45:10.632
Because that's the one where
unfortunately, you know,

00:45:10.702 --> 00:45:11.942
our lives are at stake.

00:45:12.102 --> 00:45:16.422
Sam: Also, beware of the people that are
attacking Mearsheimer in a one eyed way.

00:45:16.907 --> 00:45:18.017
What's their agenda?

00:45:18.097 --> 00:45:20.087
Why are they so invested
in tearing him down?

00:45:20.247 --> 00:45:21.167
I mean, think about it.

00:45:21.297 --> 00:45:22.687
Joe: And it gets confusing very quickly.

00:45:22.688 --> 00:45:23.667
Sam: Yeah.

00:45:23.667 --> 00:45:26.897
I'm not saying he's a perfect guy,
but I really feel like he has said

00:45:26.897 --> 00:45:28.357
useful things from time to time.

00:45:28.377 --> 00:45:30.097
And I think that's all we can really ask?

00:45:30.117 --> 00:45:31.277
Like, I mean,

00:45:31.697 --> 00:45:33.687
Joe: I don't, he's more than a stopped
clock getting it right twice a day.

00:45:33.687 --> 00:45:34.447
I don't think he's

00:45:34.447 --> 00:45:38.572
Sam: deliberately being, trying
to be evil, I don't think.

00:45:38.572 --> 00:45:42.367
Joe: No, no, and look, in defence of
your mate Žižek, he seems like a person

00:45:42.407 --> 00:45:43.697
I think he's a very genuine person.

00:45:44.007 --> 00:45:45.077
Yeah, he seems like a nice enough guy.

00:45:45.077 --> 00:45:50.987
He just, he's The stuff, when he
goes off, off piste about stuff,

00:45:51.147 --> 00:45:54.257
it's, it's, it's stuff that's
basically just reading the news.

00:45:54.267 --> 00:45:56.447
Sam: I mean, the problem is
defining piste, I guess, but

00:45:56.447 --> 00:46:01.097
Joe: he's just reading the
news and then theorizing.

00:46:01.097 --> 00:46:04.517
It's not, he's not doing whatever
rigor he puts into his books when

00:46:04.517 --> 00:46:05.827
he goes out and does these public.

00:46:05.837 --> 00:46:06.987
Sam: Yeah, no, that's interesting.

00:46:06.997 --> 00:46:08.127
I think like throwing

00:46:08.127 --> 00:46:08.867
Joe: stuff out there,

00:46:09.467 --> 00:46:11.107
Sam: but academics operate
on different levels.

00:46:11.397 --> 00:46:14.677
There's the serious, quiet, lonely work

00:46:15.107 --> 00:46:17.077
of footnoting and referencing.

00:46:17.187 --> 00:46:21.117
And then there's the giving lectures,
which is, everyone agrees a little

00:46:21.117 --> 00:46:22.557
bit more of a popular medium.

00:46:22.767 --> 00:46:23.797
It's okay to make a joke.

00:46:23.817 --> 00:46:26.597
It's okay to be a little bit more
loose depending on the lecture context.

00:46:26.867 --> 00:46:30.897
Then there's the dialogue with, there's
a formal dialogue context, the informal

00:46:30.897 --> 00:46:33.977
dialogue context, popular outreach.

00:46:34.257 --> 00:46:35.237
You have to do all of that.

00:46:35.267 --> 00:46:38.557
Joe: As you've said, your therapist
said to you about your whole life from

00:46:38.567 --> 00:46:42.182
childhood onwards, is you've always
befuddled people and confused people.

00:46:42.342 --> 00:46:42.852
That's true.

00:46:42.922 --> 00:46:43.162
Right.

00:46:43.192 --> 00:46:46.852
You found the adult world befuddling
and you decided to befuddle people.

00:46:46.852 --> 00:46:47.962
To me, that's Žižek.

00:46:47.982 --> 00:46:48.852
To me, that's Žižek.

00:46:48.892 --> 00:46:51.892
He's like, I sent you one
of the most recent clips.

00:46:51.912 --> 00:46:52.272
It's 15 minutes.

00:46:52.302 --> 00:46:52.862
He is the

00:46:52.862 --> 00:46:54.282
Sam: avatar

00:46:54.282 --> 00:46:57.602
Joe: minutes where he doesn't finish a
single, he doesn't finish a single point

00:46:57.632 --> 00:46:59.192
and nothing he says makes any sense.

00:46:59.427 --> 00:47:01.257
Sam: I don't know, clearly
he has ADD or something.

00:47:01.258 --> 00:47:02.397
Yeah, yeah, so

00:47:02.397 --> 00:47:04.217
Joe: he doesn't Maybe
that's why I like him.

00:47:05.067 --> 00:47:06.177
It's 15 minutes of this.

00:47:06.217 --> 00:47:07.227
It's incomprehensible.

00:47:07.227 --> 00:47:11.277
It's posted on YouTube a week ago
as The Life and Work of Žižek.

00:47:11.297 --> 00:47:13.047
That's what it says,
and then you watch it.

00:47:13.077 --> 00:47:16.587
It's just 15 minutes from a debate of,
or whatever, from a public lecture.

00:47:16.887 --> 00:47:17.287
Sam: I love you two.

00:47:18.237 --> 00:47:19.397
I'm going to get you two to wrap it up.

00:47:19.397 --> 00:47:20.727
I've got to go and grab a couple of kids.

00:47:21.637 --> 00:47:24.427
Joe: What I'm saying is that's
the reason you're drawn to Žižek.

00:47:24.652 --> 00:47:28.662
Because he's, he, unlike, he's not
bringing clarity, he's bringing confusion.

00:47:28.862 --> 00:47:29.952
That's what I'm up for.

00:47:30.102 --> 00:47:33.622
And you want more confusion, I want
more clarity, let's finish it up.

00:47:33.632 --> 00:47:36.312
Sam: Yes, simple answers to
hard questions, not boring.

00:47:36.312 --> 00:47:38.782
Joe: There is unlimited confusion
out there, there's very limited

00:47:38.782 --> 00:47:39.862
clarity, that's what I would say.

00:47:39.872 --> 00:47:40.682
Sam: Yeah, yeah.

00:47:40.762 --> 00:47:42.742
In the confusion lies
the clarity somewhere.

00:47:42.812 --> 00:47:44.182
Clarity's false.

00:47:44.442 --> 00:47:45.492
Go get the cube, Sam.

00:47:45.592 --> 00:47:46.822
Okay, love you guys.

00:47:47.342 --> 00:47:48.832
What was that great intro you did?

00:47:48.962 --> 00:47:49.502
Joe: Outro.

00:47:49.807 --> 00:47:52.267
That's been another one
of the 10,000 things.

00:47:52.327 --> 00:47:53.117
Whoa, slick.