WEBVTT

NOTE
This file was generated by Descript 

00:00:11.029 --> 00:00:13.329
Joe: There's reality,
which is loving awareness,

00:00:13.409 --> 00:00:16.639
Sam: unconcerned by the arising
and passing away of phenomena.

00:00:16.939 --> 00:00:18.389
Ali: And then there are the 10,000

00:00:18.389 --> 00:00:18.789
things.

00:00:19.404 --> 00:00:21.504
Sam: Hello and welcome
to the 10,000 things.

00:00:21.504 --> 00:00:22.584
My name is Sam Ellis.

00:00:22.934 --> 00:00:23.584
I'm Joe Lowe.

00:00:23.794 --> 00:00:24.264
And I'm Ali

00:00:24.264 --> 00:00:24.964
Joe: Kachamatos.

00:00:25.534 --> 00:00:27.974
Today on the show, Feelpinions.

00:00:28.322 --> 00:00:28.892
Definition.

00:00:28.942 --> 00:00:29.122
Definition.

00:00:29.122 --> 00:00:30.942
What about Ali?

00:00:31.012 --> 00:00:31.252
Ali

00:00:31.252 --> 00:00:31.412
Sam: might

00:00:31.412 --> 00:00:31.662
Ali: have...

00:00:31.682 --> 00:00:32.362
Feel opinion?

00:00:32.392 --> 00:00:36.582
I feel like it's, well, it's a,
it's an opinion I think deeply

00:00:36.582 --> 00:00:39.342
rooted in an emotional response.

00:00:39.432 --> 00:00:40.452
There's an emotional...

00:00:41.987 --> 00:00:43.197
Behind the opinion.

00:00:43.647 --> 00:00:45.857
So it's not necessarily based in fact.

00:00:46.147 --> 00:00:47.247
Joe: Hmm.

00:00:47.267 --> 00:00:47.927
It's what?

00:00:47.957 --> 00:00:50.267
95% of the internet.

00:00:50.777 --> 00:00:51.697
Sam: Yeah, for sure.

00:00:51.897 --> 00:00:54.237
Ali: It's 95% of what
we all say all the time.

00:00:54.237 --> 00:00:56.707
I don't think we're all fact
checking ourselves most of the time.

00:00:56.707 --> 00:00:58.227
It's just fact checking's for other people.

00:00:58.247 --> 00:00:58.627
Yeah.

00:00:58.657 --> 00:00:59.487
It's the ABC.

00:00:59.907 --> 00:01:00.367
Sam: That's right.

00:01:00.527 --> 00:01:03.257
Well, no, fact checking's for
people you don't like, you know,

00:01:03.267 --> 00:01:04.587
but like, yeah, that's right.

00:01:04.587 --> 00:01:08.167
So what we're saying is not
because we need to separate.

00:01:08.522 --> 00:01:12.727
We need to separate the kind of ordinary
subjectivity from like the normal, the

00:01:12.727 --> 00:01:16.967
normal sense of bias like that we ascribe
to people for like, Oh, well that person

00:01:16.967 --> 00:01:19.927
would say that cause they're rich or
they would say that cause they're poor

00:01:19.927 --> 00:01:23.907
or they would say that cause they're this
uh, this identity, but like deeper than

00:01:23.907 --> 00:01:24.167
Joe: that.

00:01:24.177 --> 00:01:28.067
But just to set up the
topic, yeah, feelpinions.

00:01:28.797 --> 00:01:32.567
I'm going to say, if your
opinions are a problem, do we

00:01:32.567 --> 00:01:33.737
agree that they're a problem?

00:01:33.747 --> 00:01:36.157
Sam: Well, I'm not, I'm not, I
don't, I'm going to hedge my bets.

00:01:36.817 --> 00:01:37.377
Joe: Right.

00:01:37.407 --> 00:01:40.447
Cause what I would like is more facts.

00:01:40.457 --> 00:01:40.797
Sure.

00:01:40.817 --> 00:01:41.687
No, no, no.

00:01:41.687 --> 00:01:42.697
Like facts are great.

00:01:42.707 --> 00:01:44.827
Imagine if, say... I enjoy facts.

00:01:45.232 --> 00:01:48.972
Imagine Vladimir Putin had a few
more facts and a few less opinions,

00:01:48.972 --> 00:01:50.702
he might not have invaded Ukraine.

00:01:51.282 --> 00:01:56.372
For example, same with say, the
US and Afghanistan, like facts are

00:01:56.752 --> 00:01:59.152
good things, I'm a fan of facts.

00:01:59.182 --> 00:02:03.852
Ali: They are, but I think, so in DBT,
and we've talked about this before, the

00:02:03.852 --> 00:02:09.742
dialectical behavioural therapy, and
they have the concept of the wise mind,

00:02:09.752 --> 00:02:15.182
and all decisions come back to the wise
mind, which is... it's a Venn diagram and one

00:02:15.182 --> 00:02:18.062
part is emotion and one part is rational.

00:02:18.302 --> 00:02:23.032
And so that would be your facts, logic,
the things that are real and then how you

00:02:23.032 --> 00:02:25.582
feel about it is also just as important.

00:02:25.582 --> 00:02:26.192
It's actually real.

00:02:26.292 --> 00:02:26.722
It's actually

00:02:26.732 --> 00:02:27.252
Sam: really important.

00:02:27.252 --> 00:02:28.622
It's an objective fact itself though.

00:02:28.632 --> 00:02:28.932
Yeah.

00:02:29.182 --> 00:02:31.362
Because that's a state that
the body is experiencing.

00:02:31.362 --> 00:02:31.662
Ali: Yeah.

00:02:31.832 --> 00:02:32.192
And so.

00:02:32.202 --> 00:02:32.222
Yeah.

00:02:32.897 --> 00:02:38.347
The wise mind, as a way to proceed mindfully
through decisions, is somewhere in the

00:02:38.347 --> 00:02:42.437
middle where there's overlap, where
we're taking into account how you feel

00:02:42.437 --> 00:02:46.537
about something just as much as the facts
and that's probably the right decision.

00:02:46.777 --> 00:02:49.587
It can't all be emotional all the time; it
can't be all rational all the time.

00:02:50.417 --> 00:02:53.697
Like there'll be, there'll be moments
and some things where it will obviously

00:02:53.697 --> 00:02:57.977
all be your feelings or all be all facts,
but for the most part, both actually play

00:02:57.977 --> 00:02:59.947
a role in the decisions that we make.

00:03:00.177 --> 00:03:01.007
But social

00:03:01.007 --> 00:03:03.337
Sam: media, that's, that's the
app for today, everyone, Ali's

00:03:06.047 --> 00:03:06.127
Ali: cross smash.

00:03:06.127 --> 00:03:07.847
That's an A plus from a
psychologist from that one.

00:03:07.847 --> 00:03:08.317
I remember that.

00:03:08.397 --> 00:03:08.497
But

00:03:08.727 --> 00:03:10.517
Joe: we're all, we're not
all getting treatment for

00:03:10.517 --> 00:03:12.037
borderline personality disorder.

00:03:12.037 --> 00:03:12.417
No,

00:03:12.417 --> 00:03:13.757
Sam: no, no, that's true.

00:03:13.757 --> 00:03:14.237
But because

00:03:14.237 --> 00:03:15.927
Joe: that is a treatment
for that, isn't it?

00:03:15.977 --> 00:03:16.907
It's used for lots of different things.

00:03:16.907 --> 00:03:18.467
I thought it was almost always

00:03:18.692 --> 00:03:19.362
for borderline.

00:03:19.382 --> 00:03:19.872
No, no.

00:03:19.872 --> 00:03:21.332
I mean, it's, it's used for that too.

00:03:21.342 --> 00:03:22.522
It's gold standard
treatment for borderline.

00:03:22.522 --> 00:03:23.222
For borderline,

00:03:23.222 --> 00:03:24.952
Ali: but it's, but it's
used for lots of different,

00:03:25.002 --> 00:03:25.022
Joe: yeah.

00:03:25.022 --> 00:03:26.282
Dialectical behavioral therapy.

00:03:26.282 --> 00:03:26.602
Sorry.

00:03:26.602 --> 00:03:27.672
I'm the one doing tangents.

00:03:27.832 --> 00:03:28.082
Sam: No, no, no.

00:03:28.082 --> 00:03:28.512
You're right.

00:03:28.532 --> 00:03:30.582
No, but I think DBT is exactly what's...

00:03:31.432 --> 00:03:31.582
It's

00:03:31.782 --> 00:03:33.722
Ali: repackaged cognitive
behavioral therapy, really.

00:03:33.732 --> 00:03:33.902
Yes, it

00:03:33.902 --> 00:03:34.142
Sam: is.

00:03:34.192 --> 00:03:34.242
Yeah.

00:03:34.252 --> 00:03:35.082
Um, that's right.

00:03:35.122 --> 00:03:36.522
But I think maybe there's a...

00:03:36.802 --> 00:03:40.942
I think the idea of getting at the
wisdom of the subject's judgments

00:03:41.012 --> 00:03:42.322
is part of therapy, right?

00:03:42.442 --> 00:03:45.222
So, let's all agree that it doesn't
matter what kind of therapy you're

00:03:45.222 --> 00:03:49.832
doing, one of the most powerful things
you can do is recognize your previous

00:03:49.872 --> 00:03:53.092
errors, and we've talked about this
before, and like taking account of

00:03:53.092 --> 00:03:57.992
them, like rationally taking account
of them, but also being accountable

00:03:57.992 --> 00:04:03.116
for them and accepting responsibility,
and also accepting that which you were

00:04:03.126 --> 00:04:05.006
not responsible for at the same time.

00:04:05.006 --> 00:04:05.186
Right?

00:04:05.186 --> 00:04:11.236
So rationally separating out culpability
and responsibility into piles of like,

00:04:12.156 --> 00:04:15.799
otherwise you're trapped between the
twin poles of, uh, which I discovered

00:04:15.799 --> 00:04:19.939
in therapy: exalted being and worthless
being, which basically corresponds

00:04:19.939 --> 00:04:21.939
perfectly to my opinions are right.

00:04:21.979 --> 00:04:23.529
How dare you challenge them, and...

00:04:24.004 --> 00:04:26.314
Oh, I'm actually a dumbass who
doesn't know anything, right?

00:04:26.314 --> 00:04:26.994
Yeah, yeah, yeah.

00:04:27.004 --> 00:04:30.994
And so basically, most people on the
internet are swinging wildly between

00:04:33.474 --> 00:04:34.924
Joe: those two poles.

00:04:35.104 --> 00:04:36.344
Have you ever thought that about
yourself, Sam, that you're a

00:04:36.344 --> 00:04:37.454
dumbass who doesn't know it all?

00:04:37.824 --> 00:04:38.374
Absolutely.

00:04:38.404 --> 00:04:38.484
But

00:04:38.484 --> 00:04:40.164
Sam: what's the flip
side of being a know it

00:04:40.164 --> 00:04:40.444
Joe: all?

00:04:40.814 --> 00:04:41.174
Why?

00:04:41.244 --> 00:04:41.854
But why is that?

00:04:41.854 --> 00:04:43.104
Why would there be a flip side?

00:04:43.514 --> 00:04:44.134
Because, no,

00:04:44.484 --> 00:04:46.789
Sam: because I've pretty much said that
from the start of this show that all

00:04:46.789 --> 00:04:49.544
extremes are mutually constitutive, right?

00:04:49.554 --> 00:04:54.579
So you've got, if you've got like
a... Basically, MAGA people need the

00:04:54.579 --> 00:04:57.059
Libtards and vice versa, right?

00:04:57.099 --> 00:04:57.669
They create each other.

00:04:57.669 --> 00:04:57.889
They're

00:04:58.079 --> 00:04:58.639
Joe: complementing each other.

00:04:58.739 --> 00:04:59.439
Ah, yes.

00:04:59.469 --> 00:05:04.079
Ram Dass said, cops create hippies and
hippies create cops, hippies create cops.

00:05:04.109 --> 00:05:04.859
Got it in one.

00:05:05.209 --> 00:05:05.869
I don't get it.

00:05:05.919 --> 00:05:11.719
I, I, I can say what Ram Dass
said and I can dig that that was

00:05:11.719 --> 00:05:14.149
cool, but I don't quite get it.

00:05:14.249 --> 00:05:15.809
Sam: What it's saying,
what it's saying is.

00:05:16.424 --> 00:05:20.004
The hippies are kicking back against
the established order and cops are

00:05:20.004 --> 00:05:26.504
reactionary at best and fascist at
worst and they, uh, automatically

00:05:26.504 --> 00:05:31.384
perceive any threat to just the
established order as automatically bad.

00:05:31.514 --> 00:05:32.534
They create each other.

00:05:32.644 --> 00:05:34.094
They create each other to a degree.

00:05:34.124 --> 00:05:38.224
Obviously the cops pre-exist
hippies, but the, the problem of

00:05:38.839 --> 00:05:43.439
needing fascist bully boys is a problem
created by the fact that people just

00:05:43.449 --> 00:05:46.449
won't do as they're told from the
point of view of authoritarians, right?

00:05:46.449 --> 00:05:49.889
So you've got to solve that
problem by like having this like

00:05:49.929 --> 00:05:51.389
stick to hang over people's heads.

00:05:51.449 --> 00:05:54.134
And by the way, ACAB 101.

00:05:54.194 --> 00:05:56.884
Cops are not there to solve
crimes and prevent crimes.

00:05:56.984 --> 00:06:00.564
They're there to protect private
property from the masses.

00:06:00.604 --> 00:06:01.844
All right, so that's just
sort of anarchism 101.

00:06:01.844 --> 00:06:02.164
And you are

00:06:02.164 --> 00:06:02.324
the

00:06:02.324 --> 00:06:05.504
Joe: only private property owner
on this podcast, so That's right.

00:06:07.434 --> 00:06:09.634
Me and Ali have got like nothing to

00:06:10.074 --> 00:06:10.114
Ali: steal.

00:06:10.724 --> 00:06:10.834
That's right.

00:06:11.184 --> 00:06:11.614
That's right.

00:06:12.204 --> 00:06:13.844
When I call the cops, they come.

00:06:14.114 --> 00:06:14.714
She always worries.

00:06:14.864 --> 00:06:15.804
She's like, Oh, you know, you're off.

00:06:15.814 --> 00:06:16.374
You always leave.

00:06:16.374 --> 00:06:18.134
Cause I'm always leaving the house
unlocked and things like that.

00:06:18.134 --> 00:06:20.124
And she's like, I'm like, they're
literally going to steal nothing.

00:06:20.124 --> 00:06:20.564
I have nothing.

00:06:21.164 --> 00:06:21.964
Like, please take

00:06:21.964 --> 00:06:22.354
Sam: my stuff.

00:06:22.354 --> 00:06:22.994
I said, Oh bro.

00:06:23.004 --> 00:06:26.854
Like if the, when the junkie comes
in your front window, you say.

00:06:27.424 --> 00:06:29.734
There's where the
valuables are, off you go.

00:06:29.824 --> 00:06:30.044
Yeah,

00:06:30.044 --> 00:06:32.854
Joe: I'd be dark on the laptop
and that'd be about it. I really

00:06:32.854 --> 00:06:35.384
Ali: have nothing that they could
take that I would be angry about.

00:06:35.644 --> 00:06:36.134
No, just like

00:06:36.514 --> 00:06:39.394
Sam: that guy needed it more than
me Yeah, I'll help him carry it out.

00:06:39.894 --> 00:06:40.294
Yeah, pretty

00:06:40.294 --> 00:06:43.104
Joe: fucking Zen Capitalist pig dog.

00:06:43.254 --> 00:06:47.934
Ali: I've been robbed twice, like, home
robbery, and um, the only, and the first

00:06:47.944 --> 00:06:53.413
time they stole my grandmother's jewelry
and so that, so for, so part of me was

00:06:53.413 --> 00:06:56.133
really heartbroken about that because
she died only about 12 months before.

00:06:56.143 --> 00:06:59.743
So I was still quite, it was,
it was, it was fresh and I

00:06:59.883 --> 00:07:00.913
was really upset about that.

00:07:01.223 --> 00:07:03.003
And then the second time we got robbed.

00:07:03.353 --> 00:07:05.943
It was Christmas Eve and they'd
actually stolen all the presents

00:07:05.943 --> 00:07:06.873
under the Christmas tree.

00:07:07.433 --> 00:07:11.883
And we had, and we had a little, like,
this is when my son was really little.

00:07:11.943 --> 00:07:17.433
So he still very much believed
in Santa and we had nothing to

00:07:17.433 --> 00:07:20.533
like on the morning, like we
had to like go and wrap up some.

00:07:21.093 --> 00:07:23.853
It's just some odds and ends just
to have something under the tree.

00:07:24.673 --> 00:07:27.463
It was really, it really
was, it was so horrible.

00:07:29.293 --> 00:07:32.313
How do you get to the point in your life
where you steal all the presents?

00:07:32.313 --> 00:07:32.343
Yeah.

00:07:33.943 --> 00:07:34.360
Joe: Oh my God.

00:07:34.360 --> 00:07:34.983
It was horrible.

00:07:35.573 --> 00:07:36.773
As if a drug could possibly

00:07:37.113 --> 00:07:37.533
Ali: be worth that.

00:07:37.563 --> 00:07:37.623
Yeah.

00:07:38.013 --> 00:07:41.513
So I mean, no matter how bad of a
situation I was in and how annoyed

00:07:41.513 --> 00:07:43.913
that like, yeah, Christmas presents
had been stolen, I was not the

00:07:43.913 --> 00:07:45.653
person who was resorting to stealing.

00:07:45.663 --> 00:07:48.033
Stealing someone's Christmas
presents under their tree.

00:07:48.053 --> 00:07:50.253
Like I feel for that person, really.

00:07:51.228 --> 00:07:51.368
That

00:07:51.748 --> 00:07:52.258
Sam: is brutal.

00:07:52.278 --> 00:07:52.688
Joe: Yeah.

00:07:52.838 --> 00:07:53.118
All right.

00:07:53.228 --> 00:07:54.578
So back to the topic.

00:07:54.598 --> 00:07:55.118
Feel opinions.

00:07:55.128 --> 00:07:55.648
What

00:07:56.168 --> 00:07:58.408
Sam: I feel like, and of course
the cops were no use in those

00:07:58.418 --> 00:07:59.528
circumstances, were they?

00:07:59.578 --> 00:08:01.508
Not before, not during

00:08:01.548 --> 00:08:02.188
Joe: or after.

00:08:02.288 --> 00:08:06.348
But see Sam, the way you
present things right, and maybe

00:08:06.358 --> 00:08:08.678
it's because you're autistic.

00:08:08.898 --> 00:08:10.618
Is that yes, you know facts.

00:08:10.708 --> 00:08:10.868
Yes.

00:08:11.288 --> 00:08:14.728
So you can explain what a
police officer is, right?

00:08:14.948 --> 00:08:17.388
Look, I shouldn't, I shouldn't
talk. What you just said,

00:08:17.398 --> 00:08:19.408
most people probably wouldn't agree with.

00:08:19.428 --> 00:08:20.018
No, that's true.

00:08:20.348 --> 00:08:22.748
Let's just say, let's just
limit it to Australia.

00:08:22.798 --> 00:08:24.868
Most people in Australia wouldn't agree.

00:08:25.318 --> 00:08:29.178
that cops are, okay, I'll tell you why
the difference on a negative thing, right?

00:08:29.878 --> 00:08:33.368
The way you present your stuff is
as if it's a fact, which is why I

00:08:33.378 --> 00:08:35.288
wanted to talk about feel opinions.

00:08:35.318 --> 00:08:39.408
Cause it's like your classic
example of basically what we're

00:08:39.408 --> 00:08:42.598
all doing, which is having these
feel opinions, presenting them as...

00:08:43.213 --> 00:08:46.043
To get to facts, you've got to
back shit up with like numbers

00:08:46.543 --> 00:08:47.013
Sam: and stuff, right?

00:08:47.183 --> 00:08:47.913
Ali: Look, I just said it was an app.

00:08:48.143 --> 00:08:52.933
I think that's part of it, but I do think
also speaking with a sense of authority

00:08:52.943 --> 00:08:57.323
in an authoritative tone can also convey
something in the same sort of way, in

00:08:57.363 --> 00:09:01.273
the way you are feeling like it's a
fact, when it might not be based in that.

00:09:01.473 --> 00:09:03.653
Joe: But as a straight white
guy, I talk about things in

00:09:03.693 --> 00:09:05.113
authoritative tones all the time.

00:09:05.123 --> 00:09:05.343
Yeah.

00:09:05.553 --> 00:09:05.693
So,

00:09:05.693 --> 00:09:05.883
Ali: and

00:09:06.783 --> 00:09:07.303
Sam: exactly.

00:09:07.633 --> 00:09:08.643
Yes, that's true.

00:09:08.863 --> 00:09:11.833
Joe: Now, I have no, no, almost nothing.

00:09:12.333 --> 00:09:16.773
I've very few facts, which is why
I come across as self obsessed

00:09:16.773 --> 00:09:20.053
because I've worked out in the last
few years to just stick to what I

00:09:20.193 --> 00:09:22.093
do know, which is sort of just me.

00:09:22.323 --> 00:09:22.623
Yeah.

00:09:23.033 --> 00:09:25.343
And the rest is unknown.

00:09:25.423 --> 00:09:25.803
Yeah.

00:09:25.853 --> 00:09:26.568
Sam: But, but the confidence.

00:09:26.568 --> 00:09:29.033
Even the self is unknowable
to a degree, you know, so it's

00:09:29.033 --> 00:09:29.213
Joe: like.

00:09:29.253 --> 00:09:29.953
Oh yeah.

00:09:29.993 --> 00:09:33.693
But like the confidence is still there
because of my conditioning, right?

00:09:34.573 --> 00:09:35.093
That's true.

00:09:35.586 --> 00:09:38.846
But Sam, on the other
hand, knows a lot of stuff.

00:09:38.986 --> 00:09:39.326
Yes.

00:09:39.356 --> 00:09:43.296
Like a lot of information has gone in
at some point before he stopped reading

00:09:43.296 --> 00:09:45.876
books and it's in, and it's in there.

00:09:45.876 --> 00:09:46.016
And it very

00:09:46.226 --> 00:09:49.916
Ali: reliably comes out as, as facts
and, and actual statistics and numbers

00:09:49.986 --> 00:09:51.356
and the ability to recall that

00:09:51.586 --> 00:09:53.006
information in real time, in such

00:09:55.676 --> 00:09:58.226
Joe: detail that it's, yeah.

00:09:59.766 --> 00:10:01.886
Sam: So, only feelpinions? You're
doing yourself a disservice,

00:10:01.886 --> 00:10:04.286
because you've probably got an
excellent recall of cricket facts,

00:10:04.286 --> 00:10:04.656
Joe: for example.

00:10:04.666 --> 00:10:08.926
Yeah, like I'm, but feelpinions and
cricket go together really well,

00:10:08.946 --> 00:10:13.326
and what you create, hopefully,
is fun group chat content.

00:10:13.416 --> 00:10:13.766
That's true.

00:10:13.766 --> 00:10:14.206
Right?

00:10:14.206 --> 00:10:16.646
And that's what I'll do, and
that's what I'll be doing tonight.

00:10:16.806 --> 00:10:17.136
Yeah.

00:10:17.236 --> 00:10:17.616
Right?

00:10:17.626 --> 00:10:20.186
But, but...

00:10:20.486 --> 00:10:24.466
I think it's very problematic that so
much of, what am I really talking about?

00:10:24.466 --> 00:10:27.206
I guess I'm talking about
how shit Twitter is.

00:10:27.246 --> 00:10:27.466
Well, I

00:10:27.466 --> 00:10:31.346
Sam: feel opinion, my feel opinion is that
all Collingwood supporters have bad teeth.

00:10:31.416 --> 00:10:36.236
And then I guess you might hold that
opinion in either a joking manner or

00:10:36.246 --> 00:10:37.776
like my mother in law, God bless her.

00:10:37.896 --> 00:10:38.381
Love her.

00:10:39.301 --> 00:10:41.631
I'm like, yeah, it's just a
fun thing we say, isn't it?

00:10:41.641 --> 00:10:41.931
Granny.

00:10:41.931 --> 00:10:42.791
And she's like, no, they

00:10:42.791 --> 00:10:43.811
Ali: really are awful.

00:10:43.811 --> 00:10:45.661
I have a colleague that's a
supporter, by the way.

00:10:49.661 --> 00:10:54.831
Sam: But also like it's for some, for
some sport is often a great way to

00:10:54.831 --> 00:10:57.071
illuminate any kind of idea in sociology.

00:10:57.071 --> 00:10:58.371
Well, not any, but a lot.

00:10:58.861 --> 00:11:03.811
So for example, the idea that your team
is the best in the world, no matter what.

00:11:04.231 --> 00:11:05.521
But here's a more interesting one.

00:11:06.111 --> 00:11:08.651
Other people's teams
cheat, mine doesn't, right?

00:11:08.701 --> 00:11:12.111
So that's like, that's a classic feel
opinion of the sports supporter and

00:11:12.111 --> 00:11:13.581
it's putting the cart before the horse.

00:11:13.891 --> 00:11:17.411
It's justifying the feeling the
person wants to have.

00:11:17.751 --> 00:11:19.231
So that's what we really
were talking about here.

00:11:19.681 --> 00:11:24.901
So obviously, you know, kind of rational
platonic, uh, you know, cognition, you

00:11:24.901 --> 00:11:28.621
know, the ideal republic governed by
detached intellectuals and so forth.

00:11:28.801 --> 00:11:33.821
All of that is dreadfully, uh, um, um,
it's, it's taking us down the wrong path.

00:11:33.971 --> 00:11:37.291
So basically Joe's just a nice
old fashioned Platonist who wants

00:11:37.291 --> 00:11:42.161
to, he wants the Republic to be
ruled by, by slave owning men.

00:11:42.541 --> 00:11:43.911
That are just thorough rationalists.

00:11:43.911 --> 00:11:44.011
As long as

00:11:45.221 --> 00:11:46.391
Joe: it's completely rational.

00:11:46.411 --> 00:11:46.701
Yeah.

00:11:46.801 --> 00:11:48.641
In terms of actually running the world.

00:11:48.671 --> 00:11:51.591
Not in terms of like what
I do with my spare time.

00:11:52.004 --> 00:11:53.864
which is really, it's, yeah.

00:11:54.694 --> 00:11:59.014
Cause I'm happy to give 15 Australian
dollars to a guy called Matt Yglesias.

00:11:59.374 --> 00:12:00.854
Because he will essentially.

00:12:00.854 --> 00:12:00.864
Irrational.

00:12:00.864 --> 00:12:07.364
He will essentially just write
what he sees as logic in a

00:12:07.364 --> 00:12:09.714
blog over and over again.

00:12:09.734 --> 00:12:10.924
And he'll write logic.

00:12:12.419 --> 00:12:12.759
And

00:12:12.989 --> 00:12:14.109
Sam: sometimes he's even right.

00:12:14.759 --> 00:12:18.399
Joe: It's, I don't know,
but it soothes my brain.

00:12:20.189 --> 00:12:23.609
As opposed to like trying to expose
myself to Twitter or something

00:12:23.609 --> 00:12:25.229
where it's just like, what the fuck?

00:12:25.239 --> 00:12:26.209
Sam: But do you, do you feel

00:12:26.209 --> 00:12:29.739
Ali: like, I was going to say like
all decisions, like if you had people

00:12:29.749 --> 00:12:31.689
running the show who are making

00:12:32.144 --> 00:12:35.544
decisions purely based on fact
without any emotion at all.

00:12:36.084 --> 00:12:37.234
That's a really...

00:12:37.244 --> 00:12:38.824
Joe: For running things, yes.

00:12:38.834 --> 00:12:41.434
For like building, say,
an aircraft or whatever.

00:12:41.434 --> 00:12:41.664
Do we want

00:12:41.664 --> 00:12:43.934
Sam: detached ASD people
like me running everything?

00:12:44.014 --> 00:12:44.394
You really don't.

00:12:44.394 --> 00:12:45.544
You, you, you want...

00:12:45.554 --> 00:12:46.614
Joe: But you get...

00:12:46.674 --> 00:12:48.644
Imagine how good the
housing outcomes would be

00:12:48.914 --> 00:12:49.944
Ali: really good if Sam was running stuff.

00:12:49.944 --> 00:12:52.184
Actually, I should be in charge.

00:12:52.794 --> 00:12:53.274
But I think...

00:12:53.509 --> 00:12:57.120
Again, it comes back, you really need
to have, we are not without emotion,

00:12:57.120 --> 00:13:01.860
we are not without feelings, you
cannot, you know, it might be the,

00:13:02.430 --> 00:13:06.010
the, you know, based on facts, the
right decision, but there's going

00:13:06.010 --> 00:13:08.600
to be a lot of people that will feel
unhappy still about that outcome.

00:13:08.630 --> 00:13:11.430
And that needs to be considered
when making the decision.

00:13:11.660 --> 00:13:15.050
And like we were talking about the other
week, the Preston market, like you've got.

00:13:15.395 --> 00:13:18.545
The facts of the situation and how
the, you know, what the government

00:13:18.575 --> 00:13:20.465
owns, what this person owns.

00:13:20.485 --> 00:13:23.005
But then the feelings of the
populace who, you know, frequent

00:13:23.005 --> 00:13:24.335
the market and love the market.

00:13:24.605 --> 00:13:25.955
That actually really matters as well.

00:13:25.955 --> 00:13:28.515
Joe: People have different
feelings about wind turbines.

00:13:28.525 --> 00:13:31.865
Like they feel like they give them
strange headaches when they don't.

00:13:32.065 --> 00:13:33.985
Or they feel like they're
a blight on the landscape.

00:13:34.005 --> 00:13:34.965
Whereas I think they look...

00:13:35.080 --> 00:13:35.570
Cool.

00:13:36.090 --> 00:13:39.330
However, Joe, they should be completely
fucking irrelevant because we really

00:13:39.330 --> 00:13:40.900
need the fucking wind turbines.

00:13:40.900 --> 00:13:41.070
Right?

00:13:41.090 --> 00:13:45.360
So in my perfect logical world, we're
building the wind turbines regardless

00:13:45.360 --> 00:13:47.230
of what anyone feels about them.

00:13:47.240 --> 00:13:47.580
You know?

00:13:48.060 --> 00:13:48.220
Yeah.

00:13:48.220 --> 00:13:49.720
Sam: But it turns out
there's a rationalist...

00:13:49.890 --> 00:13:51.410
This is, this is really
going to blow your mind.

00:13:51.850 --> 00:13:56.170
Turns out there's a rationalist basis
to feeling funny about wind turbines.

00:13:56.500 --> 00:13:58.350
So send in the anthropologists.

00:13:58.400 --> 00:14:01.940
Someone is adjacent to a property
where wind turbines get built.

00:14:02.270 --> 00:14:06.440
And how it works is the
landowner is given money.

00:14:06.540 --> 00:14:07.690
There's some sort of contract.

00:14:07.839 --> 00:14:11.822
They're compensated for the installation
in the first place, and then they receive

00:14:11.852 --> 00:14:13.892
like an annual payment after that.

00:14:14.782 --> 00:14:18.712
The person next door does
not receive the benefit.

00:14:19.412 --> 00:14:22.542
And this is perceived as unfairness.

00:14:23.052 --> 00:14:25.192
Because it is imposing a cost on them.

00:14:25.192 --> 00:14:27.422
So there's an, there's an
externality which has gone to

00:14:27.422 --> 00:14:31.062
the neighbor or there might even
be many neighbors, not just one.

00:14:31.602 --> 00:14:37.142
And that person feels as though
something's been, this psychoanalysis

00:14:37.142 --> 00:14:38.552
is what the anthropologist concluded.

00:14:39.552 --> 00:14:41.942
That person feels that a cost has
been imposed on them, but they

00:14:41.972 --> 00:14:43.102
haven't received the benefit.

00:14:43.392 --> 00:14:46.302
And rather than connecting with
that rational thought in like a...

00:14:46.857 --> 00:14:50.077
proper way, like, maybe if the person
had done a whole lot of therapy, they

00:14:50.077 --> 00:14:51.427
might have been able to work all this out.

00:14:51.487 --> 00:14:56.117
But where they end up instead
is: these are making me sick.

00:14:56.627 --> 00:15:00.417
And in a sense they are, but
they're misattributing the

00:15:00.417 --> 00:15:03.657
cause and they've arrived at the
right place by the wrong means.

00:15:03.697 --> 00:15:05.227
And that's often what feelings do.

00:15:05.447 --> 00:15:09.367
And actually facts can take us
to the right place by, uh, by the

00:15:09.367 --> 00:15:10.897
wrong means and vice versa as well.

00:15:10.907 --> 00:15:14.737
Facts can be enormously misleading
in the, in the wrong hands and at the

00:15:14.737 --> 00:15:17.537
right time that facts can be, you know.

00:15:18.117 --> 00:15:21.657
So basically putting our faith in
facts is an error and putting our

00:15:21.657 --> 00:15:23.277
faith in feelings alone is an error,

00:15:23.367 --> 00:15:23.697
Joe: obviously.

00:15:23.697 --> 00:15:23.757
Yeah.

00:15:23.757 --> 00:15:27.417
So what I wanted to say about
feelpinions is that I don't feel like

00:15:27.417 --> 00:15:29.607
I have access to almost any facts.

00:15:29.937 --> 00:15:34.227
So I've paid these two bloggers quite
a bit of money sometimes, depending

00:15:34.227 --> 00:15:35.727
on how busy I am in the film industry.

00:15:35.847 --> 00:15:38.397
Like 30 bucks a month is a
lot, but, but their stock in

00:15:38.397 --> 00:15:40.853
Sam: trade, Joe, is: look at us,
look at how detached we are.

00:15:40.853 --> 00:15:43.433
Look at how unmotivated
by feelings we are.

00:15:43.463 --> 00:15:46.253
You can trust what we're saying
because we aren't having feelings.

00:15:46.253 --> 00:15:46.493
But

00:15:46.493 --> 00:15:47.963
Joe: also, we'll read a lot.

00:15:47.993 --> 00:15:50.873
This is a masculinist dialectic as well.

00:15:51.303 --> 00:15:56.433
I don't have feelings, but what they're
gonna say is, we'll read a lot, and we'll

00:15:56.443 --> 00:16:00.573
write blogs, but we will write them from
the point of view of a couple of things.

00:16:01.013 --> 00:16:05.573
We want problems to be solved, and
we think some problems can be solved.

00:16:05.653 --> 00:16:07.233
But we don't think people
should get upset about drilling.

00:16:07.233 --> 00:16:09.323
Whereas the Guardian will
tell me the world is ending.

00:16:09.383 --> 00:16:09.623
What?

00:16:09.873 --> 00:16:12.353
Sam: Well, yeah, but one of those
saying, no opinion, you know,

00:16:12.913 --> 00:16:14.173
don't get upset about drilling.

00:16:14.618 --> 00:16:15.938
Biden did the rational

00:16:15.958 --> 00:16:16.198
Joe: thing.

00:16:16.228 --> 00:16:20.418
They're both pro drilling, they're both
pro drilling, they're both pro drilling

00:16:20.428 --> 00:16:26.748
in Alaska, but then the way my mind works
is, I just think, I don't understand

00:16:27.398 --> 00:16:29.898
why exactly we need some more oil.

00:16:30.573 --> 00:16:34.593
But if they're both telling me we
need some more oil, just in the

00:16:34.593 --> 00:16:39.103
short term, even though they both
write a lot about climate change and

00:16:39.103 --> 00:16:40.683
they're hugely concerned about it.

00:16:40.693 --> 00:16:41.673
The siren call of

00:16:41.673 --> 00:16:42.403
Sam: the reasonable

00:16:42.403 --> 00:16:42.663
Joe: man.

00:16:43.343 --> 00:16:43.483
Right?

00:16:43.843 --> 00:16:46.293
If they say we need some
more oil to help make sure...

00:16:46.293 --> 00:16:47.323
Reasonable men will

00:16:47.323 --> 00:16:48.303
Sam: lead us to our deaths.

00:16:48.773 --> 00:16:49.323
And they've done it

00:16:49.333 --> 00:16:50.603
Joe: before and they'll do it again.

00:16:50.603 --> 00:16:54.573
And the lower petrol prices will make
the Democrats much more likely to get

00:16:54.573 --> 00:16:56.333
elected the next presidential election.

00:16:56.343 --> 00:16:57.013
Oh God,

00:16:57.013 --> 00:16:59.013
Sam: this is such
tortured logic, honestly.

00:16:59.243 --> 00:17:03.493
Joe: Like, then, and I've paid
my 30, then what happens to me?

00:17:03.503 --> 00:17:07.113
And I'm not joking, is their
opinions become my opinions.

00:17:07.143 --> 00:17:07.483
Of course.

00:17:07.483 --> 00:17:08.953
Sorry, my feel opinions.

00:17:09.003 --> 00:17:09.543
Exactly.

00:17:09.933 --> 00:17:10.613
But they're good.

00:17:10.693 --> 00:17:11.953
They're high quality Sam.

00:17:12.283 --> 00:17:12.543
Sam: I agree.

00:17:12.543 --> 00:17:13.943
They're better than other
people's feel opinions.

00:17:14.133 --> 00:17:14.573
Whereas

00:17:14.573 --> 00:17:17.293
Joe: I can ask you, and
I've done an experiment.

00:17:17.383 --> 00:17:19.273
Sorry, Ali, I'll jump to you in a sec.

00:17:19.543 --> 00:17:22.353
I've done an experiment with Sam
in the last couple of weeks where

00:17:22.353 --> 00:17:25.353
I've asked for like a thread
on something and you get like.

00:17:25.653 --> 00:17:31.153
Something that you, someone else would pay
15 for on, say, the war in Ukraine, right?

00:17:31.163 --> 00:17:33.353
Like, cause Sam can
work at that level too.

00:17:33.393 --> 00:17:34.023
I can't.

00:17:34.263 --> 00:17:35.783
I'm, I'm confused.

00:17:35.833 --> 00:17:36.473
I'm just trying to

00:17:36.473 --> 00:17:37.023
Sam: absorb.

00:17:37.193 --> 00:17:38.863
I'll set up a Substack next week, Joe.

00:17:38.933 --> 00:17:39.695
Joe: You can subscribe to it.

00:17:39.695 --> 00:17:42.723
You won't because you fucking lack
the executive function or whatever

00:17:43.403 --> 00:17:45.963
the thing is that gets you around
to getting, doing the thing.

00:17:45.973 --> 00:17:47.423
Plus you've got all the washing to fold.

00:17:47.453 --> 00:17:47.873
That's true.

00:17:48.713 --> 00:17:53.123
But, uh, but, but like ChatGPT or
whatever, I can just prompt you and

00:17:53.123 --> 00:17:57.663
you'll go, boom, because you've got all
these, all this information at hand.

00:17:59.433 --> 00:18:03.503
I don't know, something is highly
motivating me to pay these two people

00:18:03.503 --> 00:18:04.903
to tell me everything to think.

00:18:06.453 --> 00:18:08.093
Ali: Because you've, it's
a, it's an investment.

00:18:08.103 --> 00:18:10.513
It's that sort of, not a sunk
cost fallacy, but in the sense

00:18:10.533 --> 00:18:13.103
that you've invested, yeah, the
inverse of that kind of, yeah,

00:18:13.103 --> 00:18:14.943
you've, you've, you've spent money.

00:18:15.223 --> 00:18:17.813
On this person and you feel good
about it and you're like, okay, I feel

00:18:17.813 --> 00:18:20.413
good about giving this month, this
person, my money to, to break down

00:18:20.413 --> 00:18:22.983
all this information for me and give
it to me in a really digestible way.

00:18:23.513 --> 00:18:27.273
And so of course you, you want to
feel, okay, well I feel like I've

00:18:27.273 --> 00:18:29.743
got my value for my money also.

00:18:30.183 --> 00:18:33.803
Sam: And what it does, I'd be a sucker
if I paid for like worthless drivel.

00:18:33.813 --> 00:18:34.133
Ali: Yeah.

00:18:34.183 --> 00:18:34.703
And so, yeah.

00:18:34.703 --> 00:18:41.503
And so, and it's also your feelpinions,
that's what it is, and you, and you're looking

00:18:41.503 --> 00:18:44.793
closely at your values and you know,
as sort of, you know, Center left, you

00:18:44.793 --> 00:18:49.573
know, sort of man who values climate, you
know, believes climate change is real.

00:18:49.573 --> 00:18:50.853
These are the practical solutions.

00:18:51.093 --> 00:18:54.863
You're paying somebody who's
sort of reinforcing those values.

00:18:54.863 --> 00:18:55.073
Yeah.

00:18:55.193 --> 00:18:56.243
So that's why you're...

00:18:56.313 --> 00:18:56.523
And

00:18:56.523 --> 00:19:00.603
Joe: they're both policy nerds too,
so the articles will be quite long.

00:19:00.603 --> 00:19:01.583
They'll have a lot of graphs.

00:19:01.593 --> 00:19:02.303
They'll have a lot of numbers.

00:19:03.133 --> 00:19:04.733
Ali: You'll place more value and weight

00:19:05.293 --> 00:19:05.873
Joe: on the things they're saying.

00:19:06.633 --> 00:19:09.353
What I'm trying to do is move away from my
feelpinions and towards some facts.

00:19:10.083 --> 00:19:11.373
I've only got so much bandwidth.

00:19:11.773 --> 00:19:15.613
So what I used to read is the
entire Economist magazine.

00:19:15.613 --> 00:19:15.733
But

00:19:15.733 --> 00:19:16.883
Sam: Ali's got your number here.

00:19:17.158 --> 00:19:20.458
Like, the thing you're doing
is highly rational, but it's

00:19:20.478 --> 00:19:22.448
irrational, like at the same time.

00:19:23.148 --> 00:19:24.318
And, and like, Yeah,

00:19:24.318 --> 00:19:27.488
Joe: I sense that it's, it's,
I'm too much of a sponge.

00:19:27.548 --> 00:19:27.838
Yeah.

00:19:27.998 --> 00:19:28.358
Yeah.

00:19:28.418 --> 00:19:28.618
Sam: Yeah.

00:19:28.778 --> 00:19:30.178
It's got all the right forms

00:19:30.178 --> 00:19:30.898
Ali: of validation.

00:19:30.898 --> 00:19:31.748
But it's also, yeah.

00:19:31.748 --> 00:19:34.798
Being able to sort of take a step
back and cause, cause that's the

00:19:34.798 --> 00:19:37.028
thing, you know, like, you know, you
never want to meet your heroes or

00:19:37.078 --> 00:19:37.938
someone's going to disappoint you.

00:19:37.938 --> 00:19:41.478
You don't want to pay the money and
think, Oh, I don't agree with this.

00:19:41.478 --> 00:19:43.408
I feel like I've wasted
my time and my money.

00:19:43.958 --> 00:19:44.618
Investing in this.

00:19:44.658 --> 00:19:45.128
Oh yeah, see

00:19:45.128 --> 00:19:45.468
Joe: I don't need

00:19:45.508 --> 00:19:46.178
Sam: to agree.

00:19:46.468 --> 00:19:47.458
There is a sunk cost.

00:19:47.688 --> 00:19:47.838
Yeah.

00:19:47.838 --> 00:19:48.348
And there's a

00:19:48.908 --> 00:19:49.058
Joe: validation.

00:19:49.408 --> 00:19:51.628
Are you guys paying anyone for news?

00:19:51.978 --> 00:19:52.548
Mm.

00:19:52.700 --> 00:19:52.880
Sam: For

00:19:52.880 --> 00:19:53.510
Ali: news?

00:19:53.600 --> 00:19:54.380
Um,

00:19:54.920 --> 00:19:55.040
Sam: no.

00:19:55.040 --> 00:19:55.161
Well, do you

00:19:55.166 --> 00:19:57.080
Joe: pay for any of the
information that you get?

00:19:57.110 --> 00:19:57.980
Sam: Well, yeah.

00:19:57.980 --> 00:20:01.160
Paying by being subject to
ads, I guess, in that sense.

00:20:01.380 --> 00:20:01.960
No, you don't.

00:20:01.960 --> 00:20:02.200
You don't.

00:20:02.260 --> 00:20:02.600
I'm not having

00:20:02.700 --> 00:20:03.800
Ali: Tony's side.

00:20:03.900 --> 00:20:04.310
Oh no, I do.

00:20:04.310 --> 00:20:08.200
I subscribe to, um, uh, New York
Times and New York Opinion New Yorker.

00:20:08.450 --> 00:20:08.660
Yeah.

00:20:08.720 --> 00:20:08.960
I think

00:20:08.965 --> 00:20:09.710
Joe: that's really good.

00:20:09.830 --> 00:20:13.250
I think that's good to have paid for some
of the information that you're reading.

00:20:13.580 --> 00:20:13.730
Yes.

00:20:13.730 --> 00:20:15.230
With with, with your actual dollars.

00:20:15.440 --> 00:20:15.650
Sam: Yes.

00:20:15.830 --> 00:20:16.040
Oh, no, no.

00:20:16.045 --> 00:20:17.120
I tend to agree with that.

00:20:17.150 --> 00:20:24.000
Um, but I often absorb them from, I often,
my sources are, in a sense, the person was

00:20:24.010 --> 00:20:26.750
already paid to produce that, if you see

00:20:26.750 --> 00:20:26.880
Joe: what I mean.

00:20:26.900 --> 00:20:28.430
Yeah, but if you never pay...

00:20:29.420 --> 00:20:30.710
Then, then journalism.

00:20:30.740 --> 00:20:31.070
No, but

00:20:31.075 --> 00:20:32.120
Sam: I don't consume journalism, Joe.

00:20:32.120 --> 00:20:34.850
I consume primary sources
and secondary analysis,

00:20:34.850 --> 00:20:35.610
Ali: but I, I do appreciate

00:20:35.610 --> 00:20:35.850
Joe: that.

00:20:35.850 --> 00:20:37.130
It's, it's a boring sidetrack.

00:20:37.190 --> 00:20:37.970
But, but, but yeah.

00:20:38.000 --> 00:20:40.790
No, but like I decided for some
reason in my own head that I could,

00:20:40.880 --> 00:20:42.170
I was gonna trust information.

00:20:42.170 --> 00:20:44.540
I paid for more than the
information that was advertised

00:20:44.540 --> 00:20:44.810
Sam: to me.

00:20:44.900 --> 00:20:46.370
No, I think it's good to pay for it.

00:20:46.370 --> 00:20:47.510
I do, I do agree with

00:20:47.515 --> 00:20:47.660
Ali: that.

00:20:47.665 --> 00:20:50.630
It's, it's not even trusting it
more, it's like I enjoy it more.

00:20:50.630 --> 00:20:52.370
It's like a little Sunday morning ritual.

00:20:52.400 --> 00:20:52.460
Yeah.

00:20:52.460 --> 00:20:54.800
And I go to, you know,
like to the New Yorker.

00:20:55.315 --> 00:20:56.285
I get the opinion piece.

00:20:56.625 --> 00:20:57.755
It's a long read.

00:20:58.095 --> 00:20:59.195
I take pleasure in it.

00:20:59.195 --> 00:21:02.005
It's something I feel like
I've invested in for myself.

00:21:02.175 --> 00:21:03.705
So I'm more likely to read.

00:21:03.945 --> 00:21:06.205
Sam: Whereas, like, pirated
games don't feel as fun.

00:21:06.275 --> 00:21:06.955
It's true.

00:21:06.955 --> 00:21:08.565
Ali: It's like a, it's like the book.

00:21:08.975 --> 00:21:09.405
It's like Poojy

00:21:09.405 --> 00:21:09.855
Joe: News.

00:21:09.865 --> 00:21:09.885
Yeah.

00:21:10.565 --> 00:21:11.695
The New Yorker opinion.

00:21:11.745 --> 00:21:15.025
That's the feeling I get from one
of these long Substack articles.

00:21:15.105 --> 00:21:15.425
Yeah.

00:21:15.495 --> 00:21:19.435
They've banked up on my phone because
I was unwell and unable to read.

00:21:19.845 --> 00:21:21.455
So now 30 to get through.

00:21:21.695 --> 00:21:22.595
No, you don't need to get through them.

00:21:22.645 --> 00:21:23.315
No, I don't.

00:21:23.545 --> 00:21:27.875
I fucking read every single word
and I usually read them within about

00:21:27.875 --> 00:21:29.095
five minutes of them coming out.

00:21:29.285 --> 00:21:31.145
But that's just how I am.

00:21:31.155 --> 00:21:31.485
Wow.

00:21:32.005 --> 00:21:32.265
Sam: Yeah.

00:21:32.265 --> 00:21:33.005
You're the dream

00:21:33.005 --> 00:21:33.655
Ali: subscriber.

00:21:33.675 --> 00:21:37.955
Yeah, I let them accumulate for a few
days and then I have to set some time

00:21:38.715 --> 00:21:38.885
Joe: aside.

00:21:38.995 --> 00:21:41.815
Like if, if, if it drops
into my email inbox.

00:21:42.010 --> 00:21:45.000
And I go straight to the Substack,
uh, and I read it straight away.

00:21:45.200 --> 00:21:45.390
Do me a

00:21:45.390 --> 00:21:48.150
Sam: favor, send this episode
to Noahpinion and, and

00:21:48.170 --> 00:21:50.040
Matt Yglesias, like, yeah.

00:21:50.070 --> 00:21:50.820
Joe: Yeah, sure.

00:21:50.830 --> 00:21:54.250
I mean, to me, like they live
on another planet, right?

00:21:54.420 --> 00:21:56.080
Sam: You'd be surprised how vain they are.

00:21:56.080 --> 00:21:58.415
Just like tag them on social
media with the episode.

00:21:58.415 --> 00:22:00.030
I'm like, we talk about you in this.

00:22:02.180 --> 00:22:07.630
Joe: But so, so, so the, the, the
relevance, the relevance of paying them

00:22:07.630 --> 00:22:09.680
or paying The Economist was expensive.

00:22:09.680 --> 00:22:10.820
It was like 50 bucks a month.

00:22:10.940 --> 00:22:11.400
Oh yeah.

00:22:11.400 --> 00:22:11.920
That's dear.

00:22:12.140 --> 00:22:16.850
And the, the reason again, is I'm
trying to get away from poor quality.

00:22:17.270 --> 00:22:19.000
I'm trying to get away from feel opinions.

00:22:19.010 --> 00:22:19.240
No, no, no.

00:22:19.480 --> 00:22:20.570
I'm a hundred percent.

00:22:20.600 --> 00:22:22.810
That is very correct.

00:22:22.895 --> 00:22:26.595
The quality of The Age or
The Guardian is so bad now.

00:22:26.635 --> 00:22:26.845
Yeah,

00:22:27.905 --> 00:22:28.545
Sam: fantastic.

00:22:29.155 --> 00:22:31.075
Joe: It got so bad and
I was like, well, The

00:22:31.075 --> 00:22:34.295
Sam: Age is the channel nine
newspaper and The Guardian is, let's

00:22:34.295 --> 00:22:36.505
just say it's not left or right.

00:22:36.845 --> 00:22:37.365
It's its own beast.

00:22:38.445 --> 00:22:39.925
Joe: It's apocalyptic.

00:22:40.335 --> 00:22:40.585
Yeah.

00:22:40.625 --> 00:22:40.935
Yeah.

00:22:41.125 --> 00:22:41.915
Sam: It's like, yeah.

00:22:41.965 --> 00:22:44.195
Ali: It's, you just want to feel
bad about yourself for the day.

00:22:44.195 --> 00:22:44.765
It's just, yeah.

00:22:44.785 --> 00:22:45.265
It's having

00:22:45.875 --> 00:22:46.545
Sam: a panic attack about everything.

00:22:46.925 --> 00:22:48.045
And they're not a hundred percent wrong.

00:22:48.055 --> 00:22:48.075
Yeah.

00:22:49.825 --> 00:22:50.705
Terrible thing to say.

00:22:51.210 --> 00:22:52.150
Awful slur.

00:22:52.330 --> 00:22:52.770
I apologize.

00:22:53.000 --> 00:22:53.570
Sorry, say that again.

00:22:53.780 --> 00:22:56.890
MAGA people would call the Guardian
libtards and, and, but the truth

00:22:56.900 --> 00:22:59.970
is the Guardian is not quite
as unhinged as MAGA media, but

00:23:00.300 --> 00:23:02.120
Ali: like, it's not like, I mean,
like, and there's often like

00:23:02.120 --> 00:23:04.950
there'll be something really
good and worthwhile in there.

00:23:04.950 --> 00:23:07.300
It's not to say that like, I'm
not throwing the baby out with

00:23:07.310 --> 00:23:08.750
the bathwater, but there is

00:23:08.750 --> 00:23:10.640
Sam: a lot of, they're
victims of the clickbait.

00:23:10.660 --> 00:23:11.440
Yeah, the clickbait problem

00:23:13.970 --> 00:23:17.530
Joe: is why I was trying to, why I
was happy to spend 50 of my Australian

00:23:17.530 --> 00:23:23.190
dollars to give to The Economist, which
worked for a couple of years because

00:23:23.190 --> 00:23:25.740
at least then I was getting quality
journalism that wasn't clickbait.

00:23:26.910 --> 00:23:30.630
But then, you know, well, no,
they're on the internet too.

00:23:30.630 --> 00:23:34.820
So what they started doing was
putting like, uh, updates of like

00:23:34.820 --> 00:23:37.920
little bits of information, which
then those little bits of information

00:23:37.920 --> 00:23:40.010
stopped being thorough articles.

00:23:40.010 --> 00:23:43.880
They started to be rumors, which
is scary to someone with anxiety.

00:23:44.190 --> 00:23:46.930
Sam: Because the thing is these
paid outlets, the thing they

00:23:46.930 --> 00:23:51.740
really suffer from is that they're
constantly getting scooped by...

00:23:52.415 --> 00:23:55.475
People who call themselves
journalists and who don't bother

00:23:55.515 --> 00:23:58.375
with, who don't have any scruples
about fact checking or the rest of it.

00:23:58.525 --> 00:24:03.795
And they know they can only get scooped
so much before their paid subscribers

00:24:03.845 --> 00:24:08.485
decide Guys, I'm sick of you not doing
any rumour mongering, so I'm going to

00:24:08.485 --> 00:24:11.445
go and get my rumour mongering right.

00:24:11.615 --> 00:24:13.125
So they're caught between

00:24:13.125 --> 00:24:13.595
Joe: the dilemma of...

00:24:13.595 --> 00:24:16.295
So that happened to The Economist,
so then I'm like, I don't really

00:24:16.295 --> 00:24:18.115
want that as part of the product.

00:24:18.315 --> 00:24:18.395
See,

00:24:18.715 --> 00:24:23.195
Sam: stop consuming journalism, primary
sources and everything else can just

00:24:23.245 --> 00:24:23.495
Joe: fuck off.

00:24:23.495 --> 00:24:27.035
Then the Ukraine war started
and The Economist just went full

00:24:27.035 --> 00:24:28.825
propaganda mode and I was like...

00:24:29.135 --> 00:24:31.915
Oh, because it had always been a
center right publication, right?

00:24:32.085 --> 00:24:32.365
Of course.

00:24:32.375 --> 00:24:37.099
But without a war, it just seemed like a fairly
reasonable, problem-solving organization.

00:24:37.109 --> 00:24:40.869
The moment there was a war, it's like,
this pro Zelensky, like, I wouldn't trust

00:24:40.909 --> 00:24:43.109
Zelensky as far as I could throw him.

00:24:43.499 --> 00:24:47.709
And suddenly there's all this pro
Zelensky propaganda, and it's like, what

00:24:47.719 --> 00:24:49.189
the fuck's happened to The Economist?

00:24:49.189 --> 00:24:52.229
And so, they stopped getting
my money, I unsubscribed.

00:24:52.559 --> 00:24:55.039
So then I find there's
two bloggers, right?

00:24:55.484 --> 00:25:01.434
And Noah, like, Noah Smith's very pro
Ukraine too, but he's more interested

00:25:01.434 --> 00:25:05.154
in, like, how do we build more
cool drones and shit and whatever.

00:25:05.194 --> 00:25:11.714
I don't know, it's, it's, it almost
feels like an unsolvable problem to me,

00:25:11.744 --> 00:25:13.674
like, I don't know how to find any facts.

00:25:13.984 --> 00:25:17.114
If I wanted to follow the Ukraine
war, which I don't, if I did,

00:25:17.144 --> 00:25:18.624
I'd be on Telegram, right?

00:25:19.094 --> 00:25:22.084
Reading Russian and Ukrainian bloggers.

00:25:22.459 --> 00:25:22.999
Right.

00:25:22.999 --> 00:25:23.269
Yeah.

00:25:23.389 --> 00:25:24.319
That's what I'd be doing.

00:25:24.319 --> 00:25:27.349
And I, I've met people on film
sets, well, they consume it now

00:25:27.349 --> 00:25:28.459
throughout the shoot day.

00:25:28.459 --> 00:25:28.549
Yeah.

00:25:28.609 --> 00:25:31.609
Just when they get a moment, they're
checking out the Telegram, mm-hmm,

00:25:31.879 --> 00:25:32.689
to see the latest

00:25:32.689 --> 00:25:34.609
Sam: thing that's, I've stayed,
I've stayed off Telegram.

00:25:34.609 --> 00:25:37.759
I've been very tempted to download it a
hundred times, but I've decided not to.

00:25:37.969 --> 00:25:40.699
And it's, it would absolutely
rabbit hole me so fast.

00:25:40.699 --> 00:25:40.700
Mm-hmm.

00:25:40.779 --> 00:25:43.459
And what I'm gonna do instead
is what I've always done, I

00:25:43.459 --> 00:25:44.989
consume the secondary analysis.

00:25:45.354 --> 00:25:48.324
So, and primary, primary
sources and secondary analysis.

00:25:48.334 --> 00:25:51.732
So what I mean by that is, I
don't mean firsthand reportage.

00:25:52.082 --> 00:25:57.686
I mean, what academics have said prior
to the war and during it, because

00:25:57.686 --> 00:26:01.756
there are already papers coming
out and I don't read those papers.

00:26:02.006 --> 00:26:07.246
I listen to those people talk
about their paper on an academic

00:26:07.466 --> 00:26:09.096
show called the New Books Network.

00:26:09.096 --> 00:26:10.676
Well, actually they publish books.

00:26:11.031 --> 00:26:13.811
And, well, some books are
really long articles, etc.

00:26:13.821 --> 00:26:14.551
You get the idea.

00:26:15.251 --> 00:26:18.361
Basically, all that, and it's like, oh, but
that information's not up to the minute.

00:26:18.421 --> 00:26:18.801
No.

00:26:19.141 --> 00:26:21.891
But it's actually so much more
enlightening because it gets

00:26:21.891 --> 00:26:24.131
you up to like 12 months ago.

00:26:24.461 --> 00:26:27.401
And now, and then there's actually
a book that's just dropped about

00:26:27.401 --> 00:26:28.591
the first 12 months of the war.

00:26:28.591 --> 00:26:30.151
Looking forward to listening to that one.

00:26:31.421 --> 00:26:36.211
Just go straight to the people
who are paid by a university

00:26:36.321 --> 00:26:38.021
to know about something.

00:26:38.251 --> 00:26:38.551
Yeah.

00:26:38.561 --> 00:26:41.661
So this whole problem of paying
for journalism, in my mind...

00:26:42.266 --> 00:26:46.546
Academics need more exposure to the
public, and journalists don't have

00:26:46.546 --> 00:26:50.056
the time to sit down and get across
things properly, like, no offense, but

00:26:50.366 --> 00:26:54.076
there's just, and occasionally you get
these hyper specialized journalists,

00:26:54.316 --> 00:26:58.496
and they crush it, but they basically
are academics, and the rest, the

00:26:58.496 --> 00:27:00.266
rest, I'm sorry, they're useless.

00:27:00.586 --> 00:27:00.846
Ali: Yeah.

00:27:00.846 --> 00:27:01.676
The majority of them are useless.

00:27:01.676 --> 00:27:04.656
Like, unless it's like a long form,
like a, like a, an investigative

00:27:04.706 --> 00:27:06.616
piece that there's, you know,
it's been months in the works.

00:27:06.616 --> 00:27:07.726
It's their whole job for six months.

00:27:07.736 --> 00:27:07.946
Yeah.

00:27:07.946 --> 00:27:11.996
And I, I, I, I live for those articles
and like, they, they're brilliant.

00:27:12.006 --> 00:27:15.646
And that's the last sort
of, I suppose, like really.

00:27:15.726 --> 00:27:17.916
Good quality journalism
that's still out there.

00:27:17.916 --> 00:27:20.866
And this, I do believe it is, it's worth
paying for and it's definitely out there,

00:27:20.876 --> 00:27:25.206
but, but outside of that, and because
it's so fast moving and it's so fast

00:27:25.206 --> 00:27:28.206
paced and like you said, someone's going
to scoop it up and post it on Twitter or

00:27:28.486 --> 00:27:30.066
you know, like it's and get it for free.

00:27:30.076 --> 00:27:30.836
Like there's, there's no.

00:27:32.096 --> 00:27:34.876
There's no incentive to provide
that really good quality of

00:27:35.566 --> 00:27:35.776
Sam: journalism.

00:27:35.776 --> 00:27:37.386
And there's no disincentive
to pass on the rumour.

00:27:37.436 --> 00:27:37.756
Yeah.

00:27:38.296 --> 00:27:40.166
There is, but it's tomorrow's problem.

00:27:40.636 --> 00:27:41.856
Today's problem is pass on

00:27:41.856 --> 00:27:42.006
Ali: the rumour.

00:27:42.596 --> 00:27:45.646
Pass on the rumour and get the ad
sponsorship and like, you know, and

00:27:45.646 --> 00:27:48.406
it's just, and yeah, and that's a shame.

00:27:48.656 --> 00:27:49.006
Sam: It's a shame.

00:27:49.006 --> 00:27:51.616
Basically publicly funded
media is the only answer.

00:27:51.676 --> 00:27:52.846
And Well, that's what I've

00:27:52.846 --> 00:27:55.696
Joe: ended up with is
I've got the ABC app.

00:27:55.786 --> 00:27:56.446
Yeah, yeah.

00:27:56.506 --> 00:27:58.576
But then it's like facts
versus feelpinions.

00:27:58.606 --> 00:27:59.206
Okay.

00:27:59.296 --> 00:27:59.746
ABC's

00:27:59.746 --> 00:28:01.066
Sam: got loads of that in there too.

00:28:01.066 --> 00:28:01.516
Don't worry.

00:28:01.516 --> 00:28:03.376
But, but what I'm, what I'm
saying is the ABC's got a bit

00:28:03.376 --> 00:28:05.596
Joe: of crap, but it's the it's, but it's

00:28:05.596 --> 00:28:07.216
Ali: still, if you want
your day to day, 20%

00:28:07.216 --> 00:28:10.126
Joe: of the crap that The Age has,
like The Age is just about, what do

00:28:10.126 --> 00:28:11.596
you do with all your money in Europe?

00:28:11.596 --> 00:28:12.456
Yeah, yeah, yeah.

00:28:12.461 --> 00:28:14.446
Like where do you holiday in
Europe and what do you do with

00:28:14.446 --> 00:28:14.596
Sam: all it's,

00:28:15.256 --> 00:28:18.576
Ali: yeah, I was gonna say like, you know,
family scoops up property, like in, you

00:28:18.576 --> 00:28:20.336
know, in Thornbury for only 4 million.

00:28:20.966 --> 00:28:23.346
And then you're like, Oh my
God, this is just like a, like

00:28:23.346 --> 00:28:25.066
a, an ad piece for realestate.

00:28:25.316 --> 00:28:25.536
com

00:28:26.726 --> 00:28:27.726
Sam: and it's a funny

00:28:27.936 --> 00:28:28.106
Joe: article.

00:28:28.446 --> 00:28:29.726
The Age is absolute dog shit.

00:28:29.736 --> 00:28:32.706
And I finally had to just walk away.

00:28:33.006 --> 00:28:33.226
You've finally

00:28:33.226 --> 00:28:33.756
Sam: seen the light.

00:28:33.756 --> 00:28:34.926
But yeah, but

00:28:36.836 --> 00:28:40.663
Joe: the anxiety journey through
this show, this show has been

00:28:40.663 --> 00:28:41.786
going for about 12 months.

00:28:41.856 --> 00:28:42.046
Yeah.

00:28:42.416 --> 00:28:46.436
My anxiety levels are much lower
than they were around the news.

00:28:46.596 --> 00:28:46.946
Look, I

00:28:46.946 --> 00:28:49.786
Sam: don't mind these centrist guys,
because they're good for your feelings.

00:28:50.356 --> 00:28:52.086
Which brings us back to Feelpinions, which

00:28:52.086 --> 00:28:52.286
Joe: is like...

00:28:52.386 --> 00:28:55.146
Yeah, I like, they slow my brain down.

00:28:55.526 --> 00:28:56.356
Yeah, I like them for that.

00:28:56.356 --> 00:29:03.626
Matt Yglesias will have an opinion on
the attempted coup in Russia, and he'll

00:29:03.626 --> 00:29:07.866
write two paragraphs on it, and I'll
assume that Matt Yglesias knows a whole

00:29:07.866 --> 00:29:11.536
bunch of people who are fucking heaps
smarter and more well informed than me.

00:29:11.546 --> 00:29:12.986
Sam: Oh, I dare say he's
got some good sources.

00:29:12.996 --> 00:29:13.436
In the

00:29:13.436 --> 00:29:14.236
Joe: administration,

00:29:14.406 --> 00:29:14.796
Sam: probably.

00:29:15.186 --> 00:29:16.366
But Matt Taibbi as well.

00:29:16.586 --> 00:29:17.866
Look, there's a lot of these...

00:29:18.186 --> 00:29:21.656
Loose end journalists out there
that are sort of largely funded

00:29:21.656 --> 00:29:24.166
through subscriptions and that
they've got a degree of independence.

00:29:24.446 --> 00:29:27.556
But what you need to keep in
mind also is that they become

00:29:27.556 --> 00:29:29.006
reliant on their subscribers.

00:29:29.076 --> 00:29:32.406
Yeah, so the subscribers are
the proprietor in a sense and

00:29:32.406 --> 00:29:35.056
the proprietor is also going to
distort the editorial policy.

00:29:35.356 --> 00:29:36.086
So they're going to

00:29:36.096 --> 00:29:39.346
Ali: continue to write and investigate
or write about things that, that

00:29:39.346 --> 00:29:40.826
Sam: one got opened by everyone.

00:29:40.846 --> 00:29:42.006
This other one got ignored.

00:29:42.326 --> 00:29:42.446
Exactly.

00:29:42.706 --> 00:29:42.886
I'm going

00:29:42.886 --> 00:29:43.716
Ali: to make content.

00:29:43.746 --> 00:29:44.006
Sam: Yeah.

00:29:44.076 --> 00:29:44.576
Yeah.

00:29:44.576 --> 00:29:48.406
And what I've noticed, the market
beats us all out of shape in the

00:29:48.406 --> 00:29:48.586
Joe: end.

00:29:48.586 --> 00:29:53.026
With both of those bloggers is that
they hardly ever write about Ukraine.

00:29:53.026 --> 00:29:56.756
And I think Ellie said it the other day,
like she switched off from it largely.

00:29:56.776 --> 00:29:56.876
Well,

00:29:56.876 --> 00:29:57.036
Sam: yeah.

00:29:57.036 --> 00:29:57.606
You know why?

00:29:57.676 --> 00:29:59.496
Cause the audience aren't there anymore.

00:29:59.506 --> 00:29:59.826
Yeah.

00:29:59.856 --> 00:30:00.146
And

00:30:00.146 --> 00:30:02.406
Joe: that's what I want to do,
mental health wise.

00:30:02.686 --> 00:30:06.616
But I couldn't get that from, uh, the
economist because it was never going to

00:30:06.616 --> 00:30:07.146
Sam: switch off.

00:30:07.156 --> 00:30:07.746
So basically.

00:30:08.076 --> 00:30:09.376
I'm perverse, I guess.

00:30:09.526 --> 00:30:13.936
So I basically did not consume any of the
hot media about the war for 12 months,

00:30:13.986 --> 00:30:18.666
and now that it's old news, now I'm tuning
in because there's actually some proper

00:30:18.666 --> 00:30:20.456
analysis to be had because we're this

00:30:20.476 --> 00:30:20.886
Joe: far into it.

00:30:20.886 --> 00:30:21.916
And I think that's wisdom.

00:30:21.996 --> 00:30:22.156
Yeah,

00:30:22.396 --> 00:30:24.966
Sam: but it's also just more
interesting to me ultimately.

00:30:24.966 --> 00:30:29.166
Like, I try to read news and then I
get bored because it's not answering

00:30:29.166 --> 00:30:30.416
the questions I want answered.

00:30:30.546 --> 00:30:32.296
It's telling me about
something that happened today.

00:30:32.586 --> 00:30:34.946
But like, I guess I've just got
just enough historical training

00:30:34.946 --> 00:30:36.086
and the anthropology as well.

00:30:36.116 --> 00:30:37.406
I'm like, No, no, no.

00:30:37.446 --> 00:30:39.386
What happened 10 years ago?

00:30:39.896 --> 00:30:41.096
What happened in 2014?

00:30:41.166 --> 00:30:42.686
Why did that invasion happen?

00:30:42.706 --> 00:30:43.816
Why did Crimea happen?

00:30:43.816 --> 00:30:44.216
Why did...

00:30:44.616 --> 00:30:44.636
the

00:30:44.636 --> 00:30:45.816
Joe: questions just go back and back.

00:30:46.246 --> 00:30:49.006
The question for me is a
psychoanalytical question, which

00:30:49.176 --> 00:30:54.506
is why do I think Putin personally
wants to kill me with a nuclear bomb?

00:30:54.656 --> 00:30:55.696
Oh, that's a great one.

00:30:55.746 --> 00:30:56.156
Right?

00:30:56.306 --> 00:30:56.926
So, so...

00:30:57.156 --> 00:30:58.846
And I think what's happened in the last

00:31:03.396 --> 00:31:09.806
12 months is good psychoanalysis, good
addiction recovery work has gone in.

00:31:09.896 --> 00:31:10.136
Yeah.

00:31:10.186 --> 00:31:14.496
And that's why like, it's not,
I'm not saying I don't care

00:31:14.496 --> 00:31:16.086
about the fate of Ukraine.

00:31:16.086 --> 00:31:17.976
I'm not saying I don't care
about people dying now.

00:31:17.976 --> 00:31:18.486
Sam: I do.

00:31:18.676 --> 00:31:19.206
Of course you do.

00:31:19.266 --> 00:31:22.516
And that's why you can't pay too much
attention because it's not good for

00:31:22.516 --> 00:31:22.691
Joe: you.

00:31:22.691 --> 00:31:27.631
But it's different thinking that
Vladimir Putin knows who you are and

00:31:27.631 --> 00:31:29.331
wants to kill you with a nuclear bomb.

00:31:29.361 --> 00:31:29.761
Or

00:31:29.791 --> 00:31:33.891
Sam: that Vladimir Putin wants to
save you from globo homo gender

00:31:33.891 --> 00:31:36.321
fascists who want to sissify the boys.

00:31:36.351 --> 00:31:39.201
Because that's what the MAGA
people are getting high on, which

00:31:39.201 --> 00:31:40.531
is like a feelpinion as well.

00:31:40.541 --> 00:31:41.511
Just take a moment.

00:31:41.671 --> 00:31:45.621
Vladimir is going to give us a masculine,
fascist vision of how the world should

00:31:45.621 --> 00:31:47.001
be and that's why we're supporting him.

00:31:47.381 --> 00:31:48.951
Everyone's, everyone's high on their

00:31:48.951 --> 00:31:49.021
Joe: own supply.

00:31:50.571 --> 00:31:53.621
I want you to define this
for me, because I'm confused.

00:31:53.661 --> 00:31:57.371
This MAGA stuff, what is this?

00:31:57.601 --> 00:32:01.701
Say Trump doesn't get the
nomination, what's MAGA then?

00:32:02.981 --> 00:32:06.881
Like, does it, is it, do I have to
pay attention to MAGA even though I

00:32:06.881 --> 00:32:08.441
think Trump's probably unelectable?

00:32:08.451 --> 00:32:08.631
No,

00:32:08.641 --> 00:32:11.191
Sam: the words you need to
think about now, forget MAGA.

00:32:11.916 --> 00:32:12.446
Joe: Right, because you've said it

00:32:12.506 --> 00:32:13.386
Sam: numerous times in this episode.

00:32:13.386 --> 00:32:15.526
You need to think about, I've been
saying it for 12 months, you need to

00:32:15.576 --> 00:32:20.156
think about Christian Identity, capital
C, capital I, and you need to think

00:32:20.156 --> 00:32:23.526
about, just, good old fashioned white
supremacy, but it will have different

00:32:23.526 --> 00:32:27.599
names, it will have unfamiliar sounding
names, British Israelism, uh, there's a

00:32:27.599 --> 00:32:32.379
thousand, basically, flavors of fascism
that are, have always been alive and

00:32:32.379 --> 00:32:34.449
well in the subsoil of American life.

00:32:34.779 --> 00:32:38.959
They're always there, just waiting
to sprout when conditions are right.

00:32:38.989 --> 00:32:41.239
And unfortunately, conditions
are super right at the moment.

00:32:41.629 --> 00:32:43.989
But none of it really
worried me much at all.

00:32:45.089 --> 00:32:49.069
Because the numbers are just
not really on the racist side.

00:32:49.449 --> 00:32:53.449
Where it gets interesting is if you
manage to combine a bit of nationalism

00:32:53.449 --> 00:32:54.869
with some religious grievance.

00:32:55.349 --> 00:33:00.499
And now that they started talking
about MAGA communism, so they're

00:33:00.499 --> 00:33:03.219
basically saying, let's do socialism.

00:33:04.019 --> 00:33:06.479
For those that deserve it.

00:33:07.049 --> 00:33:10.899
And let's make everyone else
subservient to the good people.

00:33:11.042 --> 00:33:16.352
Joe: What I'm intrigued by from tonight's
episode is that you two seem a lot less

00:33:16.352 --> 00:33:20.872
worried than me about the idea that
it's very hard to establish any facts.

00:33:20.942 --> 00:33:21.262
Yeah.

00:33:21.672 --> 00:33:22.012
I

00:33:22.642 --> 00:33:22.952
Sam: think it's

00:33:22.952 --> 00:33:23.362
Joe: always been difficult.

00:33:23.362 --> 00:33:24.952
I think that's how we should finish up.

00:33:24.962 --> 00:33:26.392
Like, is it just, all right.

00:33:26.402 --> 00:33:30.082
So because what you described
is Christian identitarianism.

00:33:30.082 --> 00:33:30.302
Yeah.

00:33:31.137 --> 00:33:31.387
Yeah.

00:33:31.507 --> 00:33:32.487
Let's call it that.

00:33:32.497 --> 00:33:32.867
Right?

00:33:33.127 --> 00:33:37.897
So they, they're imagining that future,
but I'm the guy who's with the people

00:33:37.897 --> 00:33:42.457
in Montana who are expecting World War
III and the whole world gets blown up.

00:33:42.797 --> 00:33:46.367
So I'm actually much more
unwell than a Christian fascist

00:33:47.047 --> 00:33:48.447
Sam: when I'm reading the news.

00:33:48.967 --> 00:33:49.897
It's all the same degree of...

00:33:49.907 --> 00:33:49.927
No,

00:33:50.277 --> 00:33:52.577
Joe: but they imagine a future at least.

00:33:52.587 --> 00:33:53.577
I couldn't imagine if...

00:33:53.607 --> 00:33:57.457
I remember driving past some
infrastructure in the western

00:33:57.457 --> 00:33:58.557
suburbs of Melbourne and...

00:33:58.882 --> 00:34:02.542
And seeing that it was going to take years
to build these roads, and saying to my

00:34:02.542 --> 00:34:07.632
therapist, Wow, the government
in Victoria is planning to build new

00:34:07.632 --> 00:34:11.332
roads and stuff, whereas I imagine myself
to just not exist in a couple of years.

00:34:11.422 --> 00:34:12.312
Mmm.

00:34:12.452 --> 00:34:17.262
And it's like, so, so whatever that
blackness is, whatever that thing

00:34:17.282 --> 00:34:20.212
about whether it was because my
dad died when he was 50 years old.

00:34:20.262 --> 00:34:20.562
My mum

00:34:20.562 --> 00:34:22.952
Sam: died when I was, when
she was my age currently.

00:34:22.962 --> 00:34:23.452
Yeah,

00:34:23.452 --> 00:34:24.462
Joe: like something has...

00:34:25.022 --> 00:34:30.572
It's made me not, made it impossible for
me to imagine a future for the world.

00:34:30.672 --> 00:34:31.202
Yeah.

00:34:31.422 --> 00:34:31.972
Right?

00:34:32.122 --> 00:34:32.422
Yeah.

00:34:32.422 --> 00:34:34.132
So then my political opinions are...

00:34:34.942 --> 00:34:38.652
That's why I can't read the Guardian,
because the Guardian agrees with that.

00:34:38.662 --> 00:34:39.692
You should definitely stay

00:34:39.692 --> 00:34:40.292
Sam: away from the Guardian.

00:34:40.352 --> 00:34:40.412
I

00:34:40.412 --> 00:34:41.102
Joe: do stay away.

00:34:41.382 --> 00:34:41.812
It's fine.

00:34:42.032 --> 00:34:42.515
I can't agree more.

00:34:42.515 --> 00:34:48.592
But, but the, but the insight is that
there's something in me that is not well,

00:34:48.592 --> 00:34:52.072
but, but I feel like facts are important.

00:34:52.272 --> 00:34:53.132
No, you're not wrong.

00:34:53.522 --> 00:34:56.062
Facts are an important antidote to that.

00:34:56.312 --> 00:34:59.672
Yes, I see what you mean. Very
dangerous, they might make you

00:34:59.712 --> 00:35:01.502
swerve into an oncoming truck.

00:35:01.572 --> 00:35:02.312
Yeah.

00:35:02.382 --> 00:35:03.012
Right?

00:35:03.012 --> 00:35:03.322
Yes.

00:35:03.332 --> 00:35:06.422
But you two seem to not need the facts

00:35:06.422 --> 00:35:06.582
Sam: as much.

00:35:06.802 --> 00:35:07.562
I see the issue.

00:35:07.602 --> 00:35:11.732
No, I think the real issue here is, I
think Ali and I both, uh, love facts.

00:35:11.742 --> 00:35:12.332
Yeah, yeah.

00:35:12.332 --> 00:35:12.722
As much

00:35:12.752 --> 00:35:12.872
Ali: as you do.

00:35:13.252 --> 00:35:13.902
Very much.

00:35:13.902 --> 00:35:14.482
Like, I.

00:35:14.583 --> 00:35:14.903
Yeah.

00:35:14.903 --> 00:35:19.713
Like I, and I'm constantly questioning
my, my emotional state or those

00:35:19.853 --> 00:35:21.143
behaviors that are born out of that.

00:35:21.143 --> 00:35:24.603
I, it's very, I'm very much
like facts, but I think it's,

00:35:26.702 --> 00:35:28.832
Sam: I think what it is is
there's a distrust of feelings

00:35:28.832 --> 00:35:29.972
on your side of things, Joe.

00:35:30.022 --> 00:35:31.002
That's the real issue here.

00:35:31.002 --> 00:35:31.732
If you ask me.

00:35:31.802 --> 00:35:34.027
And so I think experience has taught you.

00:35:34.167 --> 00:35:38.807
Uh, and you know, therapy will bear
this out that you can't afford to trust

00:35:38.807 --> 00:35:42.657
your feelings, you know, and some of
them are misplaced feelings to be sure.

00:35:43.207 --> 00:35:47.217
And some of them though, unfortunately
are well placed and, and those

00:35:47.217 --> 00:35:49.017
are also ones to distrust, for

00:35:49.167 --> 00:35:52.787
the opposite reason, which is they're
actually, they actually might be right.

00:35:52.797 --> 00:35:54.997
And they might point me
to something quite grim.

00:35:55.287 --> 00:35:56.077
And so.

00:35:56.567 --> 00:36:00.057
Often the accurate feelings
we want to avoid and the inaccurate

00:36:00.057 --> 00:36:02.557
feelings we rationally should try to...

00:36:02.567 --> 00:36:07.177
Ali: Like, like the, the, the, the, as
I say, the, the probability of something

00:36:07.177 --> 00:36:10.707
really horrible happening because Putin
decides to, you know, let off the bomb.

00:36:10.707 --> 00:36:10.947
Right.

00:36:11.317 --> 00:36:18.067
And it's so small, but it's not completely,
it's not completely off the table.

00:36:18.067 --> 00:36:18.427
That's right.

00:36:18.457 --> 00:36:18.727
So.

00:36:19.287 --> 00:36:22.187
But the, the, the reality of
that would just be so horrifying.

00:36:22.437 --> 00:36:26.527
It's better not to get
bogged down with that small...

00:36:26.667 --> 00:36:29.927
So I mean, that's, that's perhaps where
Sam and I don't get bogged down in

00:36:30.117 --> 00:36:34.107
those really small probabilities and
perhaps the facts around things that

00:36:34.107 --> 00:36:35.927
are more probable we put more weight to.

00:36:36.367 --> 00:36:38.017
Sam: But wait. Which is an urge you have.

00:36:38.087 --> 00:36:38.567
But what is...

00:36:41.942 --> 00:36:44.732
Joe: Like, I had a really good
conversation with a friend, Cameron,

00:36:44.732 --> 00:36:50.032
who's got a history degree and is
following the Ukraine war very closely.

00:36:50.957 --> 00:36:53.747
And he explained to me why he
didn't think Putin would launch

00:36:53.747 --> 00:36:55.867
a nuclear attack, you know?

00:36:56.287 --> 00:36:58.677
And he just said, there was
10 years of war in Vietnam,

00:36:58.697 --> 00:36:59.797
the Americans didn't do it.

00:36:59.817 --> 00:37:02.957
Nixon woke up in the middle of the night
and told him... Even Stalin didn't use the

00:37:02.957 --> 00:37:05.637
bomb and, you know, Stalin was a monster.

00:37:05.727 --> 00:37:06.387
Yeah, well, Nixon, I've got

00:37:07.567 --> 00:37:08.817
Sam: a very comforting story for you.

00:37:08.837 --> 00:37:11.867
Nixon woke up, talking of Feelpinions,
Nixon woke up in the middle of the

00:37:11.867 --> 00:37:15.357
night more than once. He, you know,
pills, he was having a lot of pills

00:37:15.777 --> 00:37:21.167
at that point, pouring sweat, rings
up Kissinger, Henry, drop the bomb.

00:37:21.357 --> 00:37:24.307
And then, and then, you know,
Kissinger, well, he's evil

00:37:24.307 --> 00:37:26.087
incarnate, but he's rational.

00:37:26.137 --> 00:37:28.477
He's your rational, platonic kind of evil.

00:37:28.987 --> 00:37:31.447
And he says, he says,
consider it done, Mr.

00:37:31.447 --> 00:37:33.057
President, and puts down the phone.

00:37:33.057 --> 00:37:35.067
And then the next day, just
no one says anything about it.

00:37:35.067 --> 00:37:38.827
And, you know, so that
happened twice, apparently.

00:37:38.927 --> 00:37:40.497
Joe: What I was going to say is that.

00:37:41.337 --> 00:37:43.357
Actually, I need to talk some
of this shit through sometimes.

00:37:43.367 --> 00:37:44.147
Of course!

00:37:44.257 --> 00:37:48.087
And the best I can come up with
is to pay a couple of fairly, highly

00:37:48.087 --> 00:37:51.907
intelligent, rational seeming people
to tell me some information, and

00:37:51.907 --> 00:37:56.747
occasionally have some chats with
some people who are paying attention.

00:37:57.317 --> 00:37:59.137
Ali: But that's time and money well spent.

00:37:59.137 --> 00:37:59.687
But most

00:37:59.987 --> 00:38:00.797
Joe: people I notice...

00:38:01.063 --> 00:38:04.993
Just aren't that concerned that we could
all die in a nuclear apocalypse, even

00:38:04.993 --> 00:38:09.873
though, uh, the country with the most
nuclear weapons just started a massive

00:38:09.873 --> 00:38:10.303
Sam: war.

00:38:10.343 --> 00:38:10.913
No, but think about it.

00:38:10.923 --> 00:38:14.733
It's not rational to invest in
something you can't do much about.

00:38:14.743 --> 00:38:15.123
Yes.

00:38:15.143 --> 00:38:16.733
Joe: That's, and that's where I'm

00:38:16.733 --> 00:38:17.193
Sam: crazy.

00:38:17.363 --> 00:38:17.533
Yes.

00:38:18.383 --> 00:38:18.563
So the

00:38:18.563 --> 00:38:19.113
Ali: rational thing to do.

00:38:19.113 --> 00:38:20.763
Because, because Putin's
not going to listen to Joe.

00:38:23.218 --> 00:38:23.458
Yeah.

00:38:23.768 --> 00:38:27.278
So that, that, that's where I think
you fall down is that you, you, you

00:38:27.278 --> 00:38:31.108
feel like if you said enough or spoke
enough or had the right conversation

00:38:31.108 --> 00:38:34.128
with the right person, somehow it
would have an impact on the decision.

00:38:34.138 --> 00:38:35.448
It's a lack of surrender.

00:38:35.508 --> 00:38:35.798
Yeah.

00:38:35.798 --> 00:38:39.668
And whereas, whereas yes,
Sam and I have completely and

00:38:39.668 --> 00:38:43.578
Joe: everyone else, you
know, 99% of people just go,

00:38:43.628 --> 00:38:45.168
Ali: well, most of us are just
like, well, there's literally

00:38:45.168 --> 00:38:46.848
nothing we can do about it.

00:38:46.858 --> 00:38:47.538
So my time's up.

00:38:47.538 --> 00:38:48.068
Sam: My time's up.

00:38:48.118 --> 00:38:49.308
If it makes you feel better.

00:38:49.308 --> 00:38:49.628
Right.

00:38:50.018 --> 00:38:53.408
You, your blind spot is this
thing that most, even dodos

00:38:53.418 --> 00:38:54.828
step over that easy, right?

00:38:54.858 --> 00:38:57.328
And you've, you're smarter than
them and you've managed to just

00:38:57.328 --> 00:38:58.528
fall into this snare, right?

00:38:58.888 --> 00:39:02.908
But if it makes you feel any better,
everyone's got the, this thing, like

00:39:02.918 --> 00:39:04.728
everyone's got one of those at least.

00:39:04.788 --> 00:39:06.288
So all those dodos have some.

00:39:06.533 --> 00:39:10.083
Other, more asinine trap,
honestly, that they fall into,

00:39:10.767 --> 00:39:13.337
Joe: I've found the best balance
I've ever found so far, which is I

00:39:13.337 --> 00:39:16.002
check the... Douglas Murray said this.

00:39:16.522 --> 00:39:20.182
Douglas Murray, who is very right
wing, is probably the most right

00:39:20.182 --> 00:39:21.722
wing guy whose books I've ever read.

00:39:21.772 --> 00:39:22.692
Yep, he's a gateway

00:39:22.692 --> 00:39:24.512
Sam: to Christian identity
and a few other things.

00:39:25.442 --> 00:39:26.312
Watch out for that, Joe.

00:39:26.982 --> 00:39:28.252
By the way, you're in the demographic.

00:39:28.312 --> 00:39:29.612
Disempowered middle aged man.

00:39:30.122 --> 00:39:31.152
Joe: Sure, exactly.

00:39:31.812 --> 00:39:33.192
Fascism beckons, my friend.

00:39:33.272 --> 00:39:36.812
But Douglas Murray said, you know
what, I'll just check the news

00:39:36.812 --> 00:39:37.912
for five minutes in the morning.

00:39:37.922 --> 00:39:38.132
Look,

00:39:38.312 --> 00:39:40.852
Sam: Douglas Murray's not a fash, but he,

00:39:40.862 --> 00:39:41.232
Joe: yeah.

00:39:41.262 --> 00:39:42.192
But, but.

00:39:42.512 --> 00:39:43.032
He's.

00:39:43.247 --> 00:39:43.477
Yeah.

00:39:43.587 --> 00:39:44.277
One thing.

00:39:44.307 --> 00:39:45.547
Five minutes is what he said.

00:39:45.637 --> 00:39:49.057
Well, Alain de Botton said, if we
were rational, we would check the

00:39:49.057 --> 00:39:51.327
news on a Sunday and that would be it.

00:39:52.197 --> 00:39:52.557
Sam: Yeah.

00:39:52.587 --> 00:39:53.337
Yeah, for sure.

00:39:53.377 --> 00:39:56.797
No, I think that there is a lot of truth
in that, but like I said, I think the

00:39:56.797 --> 00:39:58.587
even more rational thing is, so if you're

00:39:58.587 --> 00:39:58.847
Joe: looking for...

00:39:59.327 --> 00:40:03.057
And Douglas Murray, who I look up to
because I read a couple of his books

00:40:03.057 --> 00:40:04.977
and found him to be highly intelligent.

00:40:04.997 --> 00:40:05.317
Yeah.

00:40:05.367 --> 00:40:09.217
Says, I check the news for five minutes
in the morning, then I get on with my day.

00:40:09.757 --> 00:40:10.137
Yes.

00:40:10.507 --> 00:40:12.087
So, so these people like.

00:40:12.422 --> 00:40:17.322
I put up on a pedestal and then I listen
to what they say and it slowly gets in.

00:40:17.322 --> 00:40:18.522
So I end up with what?

00:40:18.592 --> 00:40:23.062
Just the ABC news app on my phone and
at the moment I'm checking it about five

00:40:23.062 --> 00:40:24.762
minutes a day and it's fucking great.

00:40:25.182 --> 00:40:25.342
Yeah.

00:40:25.342 --> 00:40:25.622
Sam: No, it is.

00:40:25.822 --> 00:40:26.212
It's not bad.

00:40:26.212 --> 00:40:29.142
But podcasts, there's a great
podcast called Decoding the Gurus.

00:40:29.142 --> 00:40:30.502
They've got an episode
about Douglas Murray.

00:40:30.522 --> 00:40:31.252
Give that a listen.

00:40:31.262 --> 00:40:32.212
They've got an episode about...

00:40:32.507 --> 00:40:33.247
Yeah, I've listened

00:40:33.247 --> 00:40:33.507
Joe: to that.

00:40:33.507 --> 00:40:34.007
You told me to listen to it.

00:40:34.007 --> 00:40:35.127
Oh, yeah, yeah, yeah.

00:40:35.127 --> 00:40:35.387
Sam: Sweet.

00:40:35.387 --> 00:40:35.847
Yeah, yeah.

00:40:35.877 --> 00:40:36.357
Keep at it.

00:40:37.037 --> 00:40:40.427
Novara Media, you can support
them for five bucks a month.

00:40:40.507 --> 00:40:43.197
Joe: Noah Smith and Matt
Yglesias are not gurus.

00:40:43.237 --> 00:40:44.667
They're just nerds.

00:40:44.737 --> 00:40:45.147
No, no.

00:40:45.287 --> 00:40:45.427
I

00:40:45.447 --> 00:40:46.917
Sam: don't think they qualify as gurus.

00:40:46.917 --> 00:40:47.147
No.

00:40:47.487 --> 00:40:49.627
Joe: Douglas Murray is,
yeah, potentially dangerous.

00:40:49.657 --> 00:40:52.087
No, he's definitely a guru.

00:40:52.087 --> 00:40:53.107
Because he's incredibly

00:40:53.107 --> 00:40:53.607
Sam: intelligent.

00:40:53.607 --> 00:40:55.327
But give Novara Media five bucks.

00:40:55.327 --> 00:40:56.737
They do excellent analysis.

00:40:57.212 --> 00:40:59.662
Anyway, it just happens to
be left wing, but they're not

00:41:00.212 --> 00:41:00.722
scaremongers like the Guardian.

00:41:00.752 --> 00:41:02.002
Anyway, Ali, what were you going to say?

00:41:02.012 --> 00:41:02.702
No, no, no, no,

00:41:02.762 --> 00:41:05.562
Ali: no, no, no, no, no, no,
I've lost that train of thought.

00:41:05.892 --> 00:41:06.362
Sorry.

00:41:06.412 --> 00:41:06.842
Damn it.

00:41:07.742 --> 00:41:08.572
No, I'm sorry.

00:41:08.632 --> 00:41:08.782
No,

00:41:08.962 --> 00:41:09.282
Joe: that's okay.

00:41:09.282 --> 00:41:10.542
All right.

00:41:10.542 --> 00:41:11.472
Well, I think that's it.

00:41:11.482 --> 00:41:12.052
Yeah.

00:41:12.217 --> 00:41:16.717
Sam: yeah, look, I think, can I,
I really want to plug like some

00:41:16.717 --> 00:41:18.877
non centrist journalism that is.

00:41:18.897 --> 00:41:19.207
Joe: Yeah.

00:41:19.207 --> 00:41:19.597
Go for it.

00:41:19.597 --> 00:41:21.067
I'd usually cut out a lot of.

00:41:21.397 --> 00:41:24.227
When you get specific about other
podcasts, they often cut it out

00:41:24.227 --> 00:41:27.587
of the edit, because I just think
people might find it a little bit

00:41:28.277 --> 00:41:31.257
the same as when you say someone's
name, who they probably don't know.

00:41:31.327 --> 00:41:31.847
Uh, look,

00:41:31.877 --> 00:41:36.297
Sam: no, no, I, it's more, it's more for
you, that like, if, you know, basically.

00:41:36.862 --> 00:41:40.512
If you want some American perspective
that is heterodox, like some of

00:41:40.512 --> 00:41:43.372
your guys, um, you know, she'll
talk to Michael Schellenberger.

00:41:43.372 --> 00:41:46.552
She'll talk to like all kinds
of cranks, Bad Faith Podcast.

00:41:46.562 --> 00:41:47.172
That's great.

00:41:47.237 --> 00:41:48.645
She'll talk to centrists as well.

00:41:49.115 --> 00:41:52.635
And she'll talk to trade unionists and
academics on, you know, and all sorts.

00:41:52.685 --> 00:41:54.415
Yeah, you're big on the Bad Faith Podcast.

00:41:54.415 --> 00:41:55.075
Yeah, it's fantastic.

00:41:55.075 --> 00:41:56.545
I haven't listened for a
while, but it's really good.

00:41:56.655 --> 00:41:59.812
And, Novara Media, which
is like more UK based.

00:42:00.130 --> 00:42:00.420
It's just...

00:42:00.840 --> 00:42:05.570
The same sort of mix of, like, hard-nosed
analysis, triangulating sources, they're

00:42:05.570 --> 00:42:10.160
not, even though they're like avowedly
left wing, they're not stuck in a

00:42:10.190 --> 00:42:13.815
kind of orthodox echo chamber, they're
actually, like, it's remarkably

00:42:13.825 --> 00:42:17.835
balanced once you keep in mind that
they are a validly left wing, but it

00:42:17.845 --> 00:42:21.735
actually gives you, I think, I find it
relatively calming to listen to because

00:42:22.035 --> 00:42:25.445
occasionally there's a point of action,
but they're not trying to work you up.

00:42:25.465 --> 00:42:28.725
They're just, just trying to understand
the world with you and like, it's

00:42:28.865 --> 00:42:30.185
like a very rational project.

00:42:30.185 --> 00:42:36.824
I think you would find, whereas sometimes
with centrism it's, ah, if we take

00:42:37.184 --> 00:42:39.854
The left and the right and add them up.

00:42:39.884 --> 00:42:41.254
It cancels out to zero.

00:42:41.264 --> 00:42:43.914
That's where I should hang out You
know, like I think that there's a kind

00:42:43.914 --> 00:42:48.617
of rational impulse there. That, well,

00:42:48.617 --> 00:42:52.767
the cancer doctor who says
cigarettes will kill you and the guy

00:42:52.767 --> 00:42:53.767
from Philip Morris, you know what?

00:42:53.939 --> 00:42:55.789
The answer is probably
somewhere in the middle.

00:42:55.849 --> 00:42:57.359
That's the centrist lure, you

00:42:57.359 --> 00:42:57.739
Joe: know

00:42:59.399 --> 00:43:02.229
Sam: No, they probably are,
yeah. In fairness, I haven't

00:43:02.229 --> 00:43:03.039
read, uh, I haven't read

00:43:03.039 --> 00:43:03.669
Joe: enough of it.

00:43:03.669 --> 00:43:05.809
Yeah, you won't read it.
That's another thing

00:43:05.809 --> 00:43:07.169
I'll say about facts versus feels.

00:43:07.169 --> 00:43:07.469
I've sent

00:43:07.469 --> 00:43:08.709
Sam: you Noahpinion articles.

00:43:08.739 --> 00:43:09.919
Oh, yeah, one about glaciers.

00:43:10.209 --> 00:43:15.439
Joe: Yes, recently. Generally, what I've
noticed is people won't read articles that

00:43:15.449 --> 00:43:19.979
you share as well. And there's, there's a

00:43:19.979 --> 00:43:26.119
giving up on... there was a certain phase
of the internet where we all wanted

00:43:26.149 --> 00:43:30.209
to know the same stuff, or kidded that we
were on the same page, or even more

00:43:30.209 --> 00:43:35.009
simply, that we all just got The Age
delivered, or we all read it in our cafe.
That's all gone.

00:43:35.449 --> 00:43:39.749
I don't, I don't, I say it in an angry
way, I don't expect someone to read

00:43:39.839 --> 00:43:44.449
an article I send them anymore, I just
don't, like, they just won't, that'll

00:43:44.449 --> 00:43:48.904
be too long, didn't read. That's gonna
be, and they're only gonna be from Noah

00:43:49.004 --> 00:43:53.064
Smith or Matt Yglesias, let's face it,
but I'll get excited, you know, I'll

00:43:53.064 --> 00:43:56.949
be like, Oh my God, if more
people could just read this and see this

00:43:56.949 --> 00:43:59.159
way, then I could have this discussion.

00:43:59.159 --> 00:44:03.279
But I just have to accept you
guys are, I think I was autistic.

00:44:03.279 --> 00:44:06.109
Don't you have this same problem where
you get excited about it all the time.

00:44:06.439 --> 00:44:10.089
Ali: It's like, it's a constant, like,
yeah, that no one else is, you do.

00:44:10.119 --> 00:44:10.709
You really do.

00:44:10.739 --> 00:44:11.679
Like it's, it's disheartening.

00:44:11.969 --> 00:44:12.769
It can be disheartening.

00:44:12.929 --> 00:44:13.049
Sam: I give up.

00:44:14.024 --> 00:44:17.154
I, I put things in the phone and
go, Oh, I could, who would I send

00:44:17.154 --> 00:44:17.274
Ali: that?

00:44:17.304 --> 00:44:17.614
Yeah.

00:44:18.174 --> 00:44:20.294
I've got so many open tabs and
things and I'm like, I'm going to

00:44:20.294 --> 00:44:21.474
send that to some, and I never do.

00:44:21.504 --> 00:44:21.884
Sam: I don't know.

00:44:22.064 --> 00:44:26.244
But like, I think maybe the, you
know, I just content myself with

00:44:26.244 --> 00:44:29.374
the knowledge that like someone out
there right now feels exactly the

00:44:29.374 --> 00:44:31.184
same way I do about this exact thing.

00:44:31.184 --> 00:44:31.754
And yeah, I

00:44:32.034 --> 00:44:34.504
Joe: could jump in the comments
section, you know, which I never would.

00:44:34.854 --> 00:44:36.154
Joe, I'll tell you what,

00:44:36.274 --> 00:44:38.795
Sam: if you gave me like here.

00:44:39.410 --> 00:44:42.160
We'll get some new podcast hosting
next month, which by the way, I

00:44:42.160 --> 00:44:44.920
want to create, I want to create
an appeal to the listeners.

00:44:45.080 --> 00:44:48.240
I want to solicit a handful
of subscribers who can

00:44:48.320 --> 00:44:49.460
help us pay for the hosting.

00:44:49.460 --> 00:44:49.730
Right.

00:44:50.220 --> 00:44:53.360
But let's say once we get that hosting
in place, you should have your own

00:44:53.360 --> 00:44:55.750
feed, which is like Joe breaks down.

00:44:56.045 --> 00:44:59.725
Centrist articles from two bloggers.

00:44:59.755 --> 00:45:00.155
Trust me.

00:45:00.195 --> 00:45:01.125
I would listen to that.

00:45:01.335 --> 00:45:02.255
I would a hundred percent listen.

00:45:02.835 --> 00:45:06.495
He breaks, he breaks down Matt Yglesias
and Noahpinion and occasionally has

00:45:06.495 --> 00:45:10.525
a foray into Michael Schellenberger
and I would a hundred percent listen.

00:45:10.545 --> 00:45:12.915
And all you do is just read me bits
from the article and say, see, I

00:45:12.915 --> 00:45:16.655
like this bit because, and just
keep them, keep them under 20.

00:45:16.665 --> 00:45:17.235
It'll be great.

00:45:19.505 --> 00:45:21.015
Just summarize the article for me.

00:45:21.015 --> 00:45:21.345
Like you.

00:45:21.490 --> 00:45:22.920
Talk me through it for 20 minutes.

00:45:22.950 --> 00:45:24.310
I will 100% listen.

00:45:24.340 --> 00:45:27.220
Joe: I can remember reading
Michael Schellenberger at work to

00:45:27.220 --> 00:45:28.700
the point where I got in trouble.

00:45:28.840 --> 00:45:29.230
Oh yeah.

00:45:29.240 --> 00:45:34.250
Because for the first time in five years
I thought I might not die, and that

00:45:34.250 --> 00:45:36.450
all of us might not die from climate change.

00:45:36.570 --> 00:45:36.820
Yeah.

00:45:36.870 --> 00:45:39.720
No one had said that to me
until he wrote Apocalypse Never.

00:45:40.030 --> 00:45:40.280
Yeah.

00:45:40.280 --> 00:45:43.340
And I was like, Oh my God,
we might be sort of okay.

00:45:43.340 --> 00:45:43.800
Like we might.

00:45:44.440 --> 00:45:44.730
Yes.

00:45:44.740 --> 00:45:46.170
Humanity might survive.

00:45:46.230 --> 00:45:46.390
It's

00:45:46.390 --> 00:45:46.710
Ali: awesome.

00:45:47.010 --> 00:45:49.130
Where am I going to put all
this existential dread then?

00:45:49.210 --> 00:45:50.220
Because that was your outlet.

00:45:50.230 --> 00:45:50.430
You'll have

00:45:51.050 --> 00:45:51.250
Sam: to put it somewhere.

00:45:51.250 --> 00:45:51.750
It was

00:45:51.800 --> 00:45:55.750
Joe: like, I got in trouble at work,
whereas the average person will just think

00:45:55.750 --> 00:45:57.480
Michael Schellenberger is a complete nutter.

00:45:58.250 --> 00:46:01.450
He got pretty weird after that
book and got obsessed with

00:46:01.460 --> 00:46:02.860
San Francisco, but I mean,

00:46:03.640 --> 00:46:04.430
Sam: like nuclear power.

00:46:04.630 --> 00:46:05.650
Yeah, sure.

00:46:05.740 --> 00:46:10.510
Joe: But like, at least that was,
that was the, that was the moment

00:46:10.560 --> 00:46:13.000
I stopped being a standard lefty.

00:46:13.010 --> 00:46:13.340
Yeah.

00:46:13.680 --> 00:46:15.070
And became something else.

00:46:15.080 --> 00:46:15.470
Sure.

00:46:15.680 --> 00:46:21.120
With Schellenberger actually being
willing to put a book out called

00:46:21.120 --> 00:46:25.570
Apocalypse Never with a picture
of a polar bear feeding its cub.

00:46:25.990 --> 00:46:29.330
And he would write facts about how, you
know what, polar bear numbers are up.

00:46:29.630 --> 00:46:30.430
Didn't you read the Tim

00:46:30.430 --> 00:46:33.590
Sam: Flannery one years ago about,
let's go nukes, don't you remember?

00:46:33.590 --> 00:46:37.260
Joe: But, but, but, Apocalypse Never
is not about nuclear power, it's

00:46:37.270 --> 00:46:39.500
about the fact that climate change...

00:46:39.510 --> 00:46:40.890
It won't turn out the way you, yeah.

00:46:40.930 --> 00:46:43.090
It's not going to be the end of the world.

00:46:43.130 --> 00:46:43.770
Yeah, yeah, yeah.

00:46:43.800 --> 00:46:46.130
The literal end of the world.

00:46:46.130 --> 00:46:46.910
I never said it would be.

00:46:46.920 --> 00:46:49.180
But you never thought it would be.

00:46:49.330 --> 00:46:50.880
No, I never did, no.

00:46:50.880 --> 00:46:53.660
And most, I've got all these friends
that are wiser than me, or...

00:46:54.035 --> 00:46:57.305
That don't have the same fucking black
hole inside them, whatever, right?

00:46:57.565 --> 00:47:00.185
And they don't get worried about
shit in the same way that I do.

00:47:00.235 --> 00:47:02.515
But I need to read deeply.

00:47:03.055 --> 00:47:07.465
But I, but, but first I needed to get
out of my lefty apocalyptic bubble.

00:47:07.525 --> 00:47:08.535
Oh, that was hard.

00:47:08.535 --> 00:47:12.055
I don't even know how it, how
I chanced upon someone like

00:47:12.055 --> 00:47:13.265
Schellenberger, you know?

00:47:13.305 --> 00:47:13.865
Well, I

00:47:13.895 --> 00:47:16.325
Sam: think just the algorithm
throws it out there because

00:47:16.475 --> 00:47:17.875
there was a need for something.

00:47:18.065 --> 00:47:18.525
Basically.

00:47:19.155 --> 00:47:23.495
What we started with was like the
extremes are mutually constitutive, right?

00:47:23.525 --> 00:47:28.565
So, you know, um, well, the example
I gave was MAGA and, you know,

00:47:28.895 --> 00:47:35.135
extreme Libs or whatever constituting
one another, but like the, the, the

00:47:35.135 --> 00:47:41.635
amount of, the amount of concern about
climate was created by, by denial

00:47:41.645 --> 00:47:43.085
that had been existing before that.

00:47:43.555 --> 00:47:49.745
And then that, in turn, created an
industry of "everyone calm down now."

00:47:49.785 --> 00:47:50.775
Yeah, yeah, yeah.

00:47:51.065 --> 00:47:52.615
So these, these things just.

00:47:53.265 --> 00:47:56.495
Look, this is just Hegelian
dialectics, you know:

00:47:56.925 --> 00:47:58.525
thesis, antithesis, synthesis.

00:47:58.925 --> 00:48:02.235
And history is kind of this weird
crab walk down the beach, you

00:48:02.235 --> 00:48:05.865
know, sideways, and you don't know
necessarily whether it's going to get...

00:48:05.915 --> 00:48:06.510
And if you're,

00:48:06.510 --> 00:48:07.915
Joe: and if you, if you.

00:48:08.385 --> 00:48:13.455
If you just think Greta Thunberg is great,
then you'd be shocked to hear Joe Rogan

00:48:13.465 --> 00:48:19.155
take the piss out of her, and despise her,
but he becomes the most popular podcast

00:48:19.185 --> 00:48:22.735
host in the world, so it's like, where
even is the mainstream? Is the mainstream

00:48:23.475 --> 00:48:23.665
Yeah.

00:48:24.265 --> 00:48:26.065
Greta Thunberg or is the
mainstream Joe Rogan?

00:48:26.085 --> 00:48:26.585
It's both.

00:48:26.585 --> 00:48:27.755
It seems to be more Joe

00:48:27.755 --> 00:48:28.455
Sam: Rogan actually.

00:48:28.465 --> 00:48:31.615
But if he shitcans Greta Thunberg,
it really just confirms what a lot

00:48:31.615 --> 00:48:32.835
of people already thought about him.

00:48:32.885 --> 00:48:35.305
So, you know, which is another
feelpinion yet again.

00:48:35.735 --> 00:48:36.615
But yeah, but

00:48:37.005 --> 00:48:40.475
Joe: it's, but I lived in a
world for most of my adult life

00:48:40.475 --> 00:48:41.650
where I never would have heard.

00:48:42.160 --> 00:48:42.420
Yeah.

00:48:42.500 --> 00:48:42.850
Sam: That.

00:48:43.080 --> 00:48:43.930
No, that's true.

00:48:43.970 --> 00:48:44.230
Yeah.

00:48:44.230 --> 00:48:44.290
Yeah.

00:48:44.290 --> 00:48:44.890
But go on Ellie.

00:48:44.890 --> 00:48:45.130
Yeah.

00:48:45.180 --> 00:48:45.730
No, I was going to say

00:48:45.740 --> 00:48:48.320
Ali: like, it becomes like
your, those feelpinions are

00:48:48.320 --> 00:48:50.200
just reinforced by the same.

00:48:50.270 --> 00:48:50.470
Yeah.

00:48:50.470 --> 00:48:51.950
It's just a, an echo chamber.

00:48:52.100 --> 00:48:52.340
I think

00:48:52.340 --> 00:48:55.920
Sam: it's dreadfully droll to express
an opinion about climate change

00:48:56.000 --> 00:48:57.660
via the medium of Greta Thunberg.

00:48:57.750 --> 00:49:01.560
I mean, that is just Thunberg or whatever
that is just so droll to like boil it

00:49:01.560 --> 00:49:03.780
down to her, which is a hundred percent.

00:49:04.156 --> 00:49:07.806
let's say the forces of denial were like
keen on doing, and it wouldn't necessarily

00:49:07.806 --> 00:49:09.826
put Joe Rogan in the 100% denial camp.

00:49:09.826 --> 00:49:10.336
No, it's not.

00:49:10.336 --> 00:49:14.186
But it's just very droll to go, I'm
annoyed by this girl having opinions.

00:49:14.186 --> 00:49:14.936
Joe: My God.

00:49:15.016 --> 00:49:19.596
But the point was that I, all I really
ever thought was, she's great and thank

00:49:19.596 --> 00:49:21.526
God someone's telling it like it is.

00:49:21.536 --> 00:49:22.576
You were on her side.

00:49:22.606 --> 00:49:25.586
And then somehow I end up in this
other part of the internet where

00:49:25.586 --> 00:49:30.411
Joe Rogan is like, Taking the piss
out of her and, and, and yeah,

00:49:30.411 --> 00:49:32.611
seemingly like not liking her at all.

00:49:32.621 --> 00:49:33.321
But I would actually like

00:49:33.321 --> 00:49:35.351
Sam: to shut the fuck
up, but actually being a

00:49:35.351 --> 00:49:40.381
Joe: bully, and I'm not a Joe Rogan
fan, I don't listen to Joe Rogan, but

00:49:40.421 --> 00:49:40.601
Sam: yeah.

00:49:40.936 --> 00:49:45.046
I listened to him before he turned into
what he turned into back when he was just

00:49:45.046 --> 00:49:47.466
on the pods before his YouTube success.

00:49:47.836 --> 00:49:50.886
And because YouTube basically
said, come over here and be famous.

00:49:50.886 --> 00:49:53.456
And he did, but like before that
he was already doing really well.

00:49:53.476 --> 00:49:55.886
But let's say when he was back in
the days when he was only getting a

00:49:55.886 --> 00:50:01.296
million downloads, let's say I was in
there and like, he was very likable.

00:50:01.576 --> 00:50:06.446
Now, obviously, he was just basically
a happy-go-lucky bro who was, like,

00:50:06.456 --> 00:50:08.956
interested in a bit of science, and...

00:50:09.376 --> 00:50:13.336
He would, you know, interview people who
were grounded and people who weren't, but

00:50:13.336 --> 00:50:15.856
it was all pretty harmless, bro-y stuff.

00:50:16.406 --> 00:50:18.896
And mostly he talked to comedians
and just was shooting the shit.

00:50:19.466 --> 00:50:23.486
Like, just very likeable and blokey, and
there's a, there's a market for that.

00:50:23.756 --> 00:50:25.396
And there should be a market for that.

00:50:25.456 --> 00:50:26.186
There's nothing wrong with it.

00:50:26.476 --> 00:50:27.166
But when...

00:50:27.801 --> 00:50:32.511
Obviously, the algorithm beats
us all out of shape eventually, and so

00:50:32.721 --> 00:50:36.231
basically, and maybe this is
how we could conclude the episode: a lot

00:50:36.231 --> 00:50:40.401
of us are being led around by the
nose by our feelings and what we

00:50:40.411 --> 00:50:43.991
need to be true based on our feelings
and like, I need this feeling to go away.

00:50:43.991 --> 00:50:45.141
So I need this thing to be true.

00:50:45.141 --> 00:50:45.431
Right.

00:50:46.101 --> 00:50:49.861
But then in turn, the people that are
providing that service to us, they're

00:50:49.861 --> 00:50:53.011
getting led around by the nose by the
people that have that emotional need.

00:50:53.301 --> 00:50:57.601
So those people are slaves
to the audience, and the audience are

00:50:57.601 --> 00:50:59.991
slaves to what they're trotting out.

00:51:00.021 --> 00:51:01.161
It's, it's dismal.

00:51:02.861 --> 00:51:03.201
Ali: So just

00:51:03.201 --> 00:51:05.711
Sam: listen to academics and
just screw these other people.

00:51:05.711 --> 00:51:07.001
I just don't worry about journalism.

00:51:07.091 --> 00:51:07.771
That's my advice.

00:51:09.161 --> 00:51:10.211
Ali: That's pretty solid advice.

00:51:10.271 --> 00:51:13.251
Sam: And the best kind of academics
will acknowledge the feelings they

00:51:13.251 --> 00:51:16.371
have and that they have a subjectivity
and it's not about denying.

00:51:17.351 --> 00:51:20.531
The fact that you have a human
response to these things.

00:51:20.551 --> 00:51:21.931
It's just about triangulating all that

00:51:21.931 --> 00:51:22.551
Joe: stuff, isn't it?

00:51:22.591 --> 00:51:23.241
Yeah, I don't know.

00:51:23.241 --> 00:51:27.391
I mean, personally I think I'm
lucky that I have wise friends.

00:51:29.841 --> 00:51:29.881
You know?

00:51:29.901 --> 00:51:30.941
Most people probably

00:51:30.941 --> 00:51:31.246
Sam: don't.

00:51:31.246 --> 00:51:32.861
Well, I would count you
among my wise friends.

00:51:32.881 --> 00:51:33.221
Yeah, same.

00:51:33.331 --> 00:51:33.351
Yeah,

00:51:34.441 --> 00:51:36.081
Joe: but there's blind
spots or obsessions.

00:51:36.181 --> 00:51:36.471
But we all

00:51:36.471 --> 00:51:39.631
Ali: have blind spots and
vulnerabilities and, yeah.

00:51:41.016 --> 00:51:43.436
Sam: And also if you want to
remember more facts here, I'll

00:51:43.436 --> 00:51:44.646
leave, I'll leave you with this.

00:51:44.876 --> 00:51:47.666
If you want to get more facts into you,
you know, like more fiber into your

00:51:47.666 --> 00:51:50.286
diet or whatever, the best way is to...

00:51:50.841 --> 00:51:54.711
Absorb things through a framework
of what you care about and what

00:51:54.741 --> 00:51:56.891
you, what feels important to you.

00:51:56.891 --> 00:52:00.761
And feelings will help those facts
stick so much more than like a

00:52:01.091 --> 00:52:02.951
rational sense of this is important.

00:52:02.951 --> 00:52:03.841
Like that doesn't do it.

00:52:04.231 --> 00:52:04.871
It's the feelings.

00:52:04.881 --> 00:52:05.141
Yeah.

00:52:06.871 --> 00:52:07.111
Yeah.

00:52:07.171 --> 00:52:08.051
Ali: I think that's weird.

00:52:08.051 --> 00:52:11.181
Joe: Like what I care
about is my kid's future.

00:52:11.321 --> 00:52:11.571
Yeah.

00:52:11.901 --> 00:52:18.681
And, yeah, but the perceived
threats to my kid's future as...

00:52:19.366 --> 00:52:20.486
Impending.

00:52:20.616 --> 00:52:20.876
Yeah.

00:52:21.216 --> 00:52:22.816
Within the next five minutes.

00:52:22.826 --> 00:52:23.016
No.

00:52:23.536 --> 00:52:27.151
Makes you incapable of functioning.

00:52:27.181 --> 00:52:28.281
Yeah, that's exactly right.

00:52:28.401 --> 00:52:31.141
And part of my kid's future
is whether they have a roof

00:52:31.141 --> 00:52:32.691
over their heads at my place.

00:52:32.731 --> 00:52:34.431
And part of that is
being able to go to work.

00:52:34.431 --> 00:52:36.841
And part of being able to go to
work is not having panic attacks.

00:52:37.071 --> 00:52:39.981
And part of not having panic
attacks is never reading The

00:52:39.981 --> 00:52:40.381
Sam: Guardian.

00:52:40.651 --> 00:52:41.331
Actually, and you know what?

00:52:41.341 --> 00:52:43.961
These 15 Substacks are a good investment.

00:52:44.211 --> 00:52:44.381
Yeah.

00:52:44.451 --> 00:52:45.251
If they're doing that.

00:52:45.251 --> 00:52:45.561
Like I

00:52:45.561 --> 00:52:47.741
Ali: said, it's a good investment
of your time and your money.

00:52:47.741 --> 00:52:48.721
I think it's valuable.

00:52:48.721 --> 00:52:53.631
And if it gives you a sense of, you know,
peace and calm and understanding of the

00:52:53.631 --> 00:52:55.051
world as it is right now, it's worth it.

00:52:55.051 --> 00:52:55.881
You know, all it is

00:52:55.881 --> 00:52:59.821
Joe: is just a vague sense that
maybe we'll muddle through and solve

00:52:59.821 --> 00:53:01.621
a few of these fucking problems.

00:53:01.621 --> 00:53:04.071
You know, Glazius once said...

00:53:04.716 --> 00:53:06.436
There's always been big problems.

00:53:06.516 --> 00:53:06.826
Yeah.

00:53:06.836 --> 00:53:08.336
Do you know how soothing I found that?

00:53:08.536 --> 00:53:10.106
I think that's a good attitude to have.

00:53:10.276 --> 00:53:12.766
Imagine if you were in the Great
Depression, you'd think today was

00:53:12.766 --> 00:53:13.896
pretty fucking good, you know?

00:53:13.926 --> 00:53:14.676
You might, yeah.

00:53:14.786 --> 00:53:14.966
Yeah.

00:53:14.986 --> 00:53:15.636
Like, yeah.

00:53:16.056 --> 00:53:16.886
It's just that.

00:53:17.106 --> 00:53:20.476
I just, so yeah, maybe
Glazius is basically my dad.

00:53:20.756 --> 00:53:21.476
I think there's

00:53:21.496 --> 00:53:24.746
Sam: always been big problems, and
that's a good thing to remember.

00:53:24.766 --> 00:53:25.336
It's true.

00:53:25.436 --> 00:53:26.686
Yeah, it is very good.

00:53:27.006 --> 00:53:27.366
Yeah, yeah.

00:53:27.366 --> 00:53:28.136
But why is he like your

00:53:28.136 --> 00:53:28.526
Joe: dad?

00:53:28.826 --> 00:53:30.786
Well, cause he says that
and then I calm down.

00:53:30.986 --> 00:53:31.686
Oh.

00:53:31.976 --> 00:53:32.576
Oh, I mean?

00:53:33.326 --> 00:53:36.666
It's just like that one person
who is so smart, your dad's still

00:53:36.666 --> 00:53:39.726
around, you know, maybe he can say
that to you on the phone and you go.

00:53:41.226 --> 00:53:44.376
Ali: He's a calm, rational friend
who's giving you the information

00:53:44.376 --> 00:53:48.606
in a way that, yeah,
is centering and grounding for

00:53:48.606 --> 00:53:48.826
Sam: you.

00:53:49.636 --> 00:53:53.316
And that is
something we all need, and it's

00:53:53.316 --> 00:53:54.646
perfectly okay to need that.

00:53:54.716 --> 00:53:54.926
Yes.

00:53:54.936 --> 00:53:55.516
There you go.

00:53:55.636 --> 00:53:58.746
And maybe if there was more of that,
people would have less feel opinions.

00:53:58.986 --> 00:54:00.946
There you go.

00:54:01.046 --> 00:54:01.256
All right.

00:54:01.256 --> 00:54:01.926
Well, this has been fun.

00:54:01.926 --> 00:54:02.486
You guys.

00:54:02.486 --> 00:54:03.126
Joe: It's been good.

00:54:03.686 --> 00:54:04.226
Sam: See you next week.

00:54:04.246 --> 00:54:04.696
See ya.

00:54:04.736 --> 00:54:05.106
See ya.

00:54:05.106 --> 00:54:05.327
Bye.