Anchored in Chaos

Artificial Intelligence (AI) is rising in prominence across our world, in areas as personal as psychotherapy and our relationships. Today we dive into our perspectives and concerns around AI. How is AI altering human interactions, especially in psychotherapy, where human connection is core? We explore the interface between AI and therapy, discussing AI's potential and drawbacks in psychotherapy, data privacy, and emotional comprehension. Our discussion covers whether AI can replicate empathy, confidentiality issues with AI, and how therapy should be conducted. You'll also hear stories about people who have interacted with AI-based systems as we discuss the effects of AI usage on sleep patterns and cognitive ability, and the increasing likelihood of creating fantasy worlds and false memories.

03:21 Concerns about AI Replacing Human Interaction
03:37 Historical Perspective on AI
05:07 AI in Everyday Life
20:51 AI and Confidentiality Concerns
22:28 Personal Experience with AI
31:21 The Importance of Human Interaction in Therapy
33:50 Understanding the Role of AI in Psychotherapy
34:25 Limitations of AI in Detecting Human Emotions
36:43 Importance of Human Interaction in Therapy
37:18 The Inseparability of Emotions and the Body
37:59 Dangers of Over-Reliance on AI for Emotional Support
40:08 Role of AI in the Future of Psychotherapy
51:20 Impact of AI on Sleep and Mental Health
01:00:12 Potential Dangers of Living in a False Reality
01:08:15 The Importance of Human Connection and Love

Additional Resources:
Learn more about Anchored in Chaos, contact us, or join the Mind Meld at our website, www.anchoredinchaos.org.

The environment around us is a swirling vortex of chaos, but you can navigate it when you have an anchor that can keep you steady. Each episode, Liz Herl dives into data-driven strategies and real-world tactics with Dr. Tim Caldwell to help you become more grounded and centered in a world that is constantly shifting and changing. Learn to effectively navigate family strife and career challenges, and handle the anxiety of the unknown that the news is constantly bombarding us with. Liz is a Licensed Clinical Marriage and Family Therapist, and Dr. Caldwell is a retired primary care physician and personal trainer. You can lean on their decades of experience to find stability and peace without having to control circumstances or people around you. You can be anchored in chaos.


Ep05_AI
===

[00:00:00]

Tim Caldwell: Welcome back. Liz, how are you? I'm good. How are you?

Liz Herl: Good. Good. How was your Christmas?

Tim Caldwell: It was good. It was good. It was, pretty mellow, pretty mellow at my place. My son was home from the Navy and my other son's a, professor. We call him professor, but he's an instructor at a college. So he was home too.

It was good to see everybody. Nice. Yeah. Good. How about you?

Liz Herl: [00:01:00] It's great. It was wonderful. Yeah. I have, you know, a nine-year-old who's very excited. Yeah. It was very tricky getting Mrs. Claus to help Santa put out gifts without someone being up all night waiting for it. I see. Literally all night.

Tim Caldwell: Yeah. So, Santa magic is still very much alive in your house. So far so good. Okay, good.

Liz Herl: I know, I'm kind of... That's okay. I'm kind of sad when it goes.

Tim Caldwell: The innocence is sweet.

Liz Herl: I've only got one more. Yeah. The other two are like, all

Tim Caldwell: right, we know what's going on.

That's sweet. What, what are we talking about today?

Liz Herl: I attended a one-day seminar back in September. Mm-Hmm. And it was talking about AI in psychotherapy. Yeah. And I have a lot of, I don't want to say concerns, but yeah, I guess they are concerns about that. Yeah. Especially how I take this field so seriously, and the work that it involves. And it's interesting to think that technology can do a better job than humans.

Yeah. And I'm not saying that [00:02:00] that's what they're saying, but I kind of feel like that's what they're saying.

Tim Caldwell: Yeah. Our, our discussions are, we, we have some concerns about how people are so willing to turn themselves over to just a voice and an avatar.

Liz Herl: Sure. and I saw some really interesting, work that's being done that larger companies are wanting to utilize this kind of technology in maybe sidestepping a real life psychotherapist because the need is so great.

Tim Caldwell: I think the stats you shared with me, it's over 300 potential patients per clinician. Yeah, 300 to 1. It's better than 301. Sure. Yeah.

Liz Herl: And, to minimize, cost, I imagine. Yeah. That's probably part of it as well. Hopefully it's just the, the need for services.

Tim Caldwell: I think the intent is to have, at least to have, some service available to someone at some time.

Right. [00:03:00] Yeah. So, pick up your phone. And I think that's the idea is AI is readily available if you just pick up your phone, which

Liz Herl: It is right now, right? Not in a psychotherapy ideal way. I mean, you could probably use it for psychotherapy, which I do not recommend, but yeah, it's. It's very much so. That's what we're going to talk about.

Yeah. So I wanted to cover that in my overall idea of how this is going to impact psychotherapy relationships and then overall our whole life dynamics. Like how we engage with one another, and if that could set us back and I have some ideas around that.

Tim Caldwell: Yeah. Well, let's see. So historically, some of the things we talked about is, from a military, from a military background, I can tell you that AI has been on the cusp since the fifties, right? And one of the big things I've always put out, in just general conversation about artificial intelligence versus man-made stuff, is our drone technology. [00:04:00] And even back in the sixties, when we had the ability for a two-key system to turn and launch missiles, we were not ready to surrender that power over to AI.

Because we didn't trust it. Right? We, we just don't trust it. The ability to literally just go, AI, take care of that bad guy. And yet, a bit of a gamble. Well, not just a gamble, but there's no ethics or morality toward, correct. Yes. You want me to take out a bad guy with a drone? Sweet. But I took out a hundred people doing that, right?

One big giant bomb and I killed the bad guy. Yeah, but it wasn't, there was no discretionary actions there. It just followed its instructions. Can it pinpoint and do all its stuff? I suppose that that technology is available. But that's exactly why we don't surrender that control over just AI and no, no human component involved.

Liz Herl: Right. And so the question [00:05:00] is, so what might be lost if we utilize AI in every category of our life, right? We were listening, when we were in Genuine Effort gym this morning, we were listening to, the, just some ongoing podcast about AI. The broad spectrum, not anything really necessarily psychotherapy related.

But one thing I caught, that I posed a question to you about, was the AI ended its interaction and it said, it was my pleasure, and I said, how, how was it its pleasure? Yeah, pleasure. How? Tell me how that works. Yeah, yeah. I mean, it's a program, right? So it's like, well, that's what I'm programmed to say.

Tim Caldwell: Yeah. Well, you and I both know that in the makeup of somebody reaching out to AI and that interaction, they've only been programmed to tell you certain responses. But we listened to a demonstration of an AI robot. [00:06:00] And I thought that the robot was very snarky and condescending?

Condescending. And I, I'm thinking, that's how you're programmed? Why? It almost had this cynicism and sarcasm to every response that it made. And I thought, that's weird. Is that built in, or is it just programmed to do that? I think it's supposed to give the qualities, I'm trying to be funny. But I found, I found it a little off-putting.

Also, in that same conversation, we heard another commentator mention that, in taking this step into AI, which we took a long, long time ago, we are up to our eyeballs in AI, and people don't even know it. Right.

Liz Herl: There's no going back now. There's no going back. Right.

That's not what this is. This is about awareness. That's what, ideally, bringing it up, attending Dr. Frye's seminar, is about: awareness, what is happening, how to be involved. And don't be so, we are so [00:07:00] conditioned by all of these easy-access tools in our lives. Right. Just go with the flow. Well, what does that flow entail?

Tim Caldwell: Don't be naive to the fact that we're, you're, Alexa, turn on the house lights. Alexa, unlock my door. Alexa, what's the name of that song? You already have computers running your home at convenience. I don't, FYI. Me either. But the other, too, is phones. Yep. They're listening devices full time. I try to always tell people, you think you're seeing things; they're seeing you, right?

but it goes way past that. It's Siri. . find me this, find me a new path to work. Okay, Siri, what's my day look like? What's my day planner look like? So it knows where you work, how to get there. It's looking at your day planner. it probably has some contact information or day. So there's, now we're one person removed.

Call my sister in Big Bear. Well, now there's a sister in Big Bear. It knows that [00:08:00] phone number. It knows her information. It's, it's all expansive and the AGI, which has its own ability to, formulate, new algorithms for learning, it does it on its own, right? It just, all it takes is one more ripple and it turns into something else, right?

And that's all inclusive, so. Yeah, we, I have concerns. I know you have. We're, we're especially concerned as it applies towards its application to people in need who need help.

Liz Herl: And in the outline of the seminar, there were some questions that I'll pose yet again. Dr. Freiss. A Dr. Freiss seminar, that there was no, I would say irony, but the irony is lost.

I heard at the very same, I don't know what, was that a podcast we were listening to? It was just a series of different interactions.

Tim Caldwell: Yeah, yeah, I was, I was just playing it, you know, in background preparation.

Liz Herl: Yeah. and. It said the exact same thing [00:09:00] and the question is, what happens when machines strip people of their skills and what about prolonged unemployment?

Well, we kind of talked about, and I'll touch base just gently on this, in our last episode around masculinity, the struggle to get people to actually work in some aspects. Right. Well, what if we make, you know, software so efficient, because that's what it is, the software, so efficient that you don't need real people at the drive-thru window? You know, it could all be by card access, and there's some sort of apparatus that brings, belts down your food, or there's only cooks in the back.

I don't know. I'm just saying that there's

Tim Caldwell: We're seeing it totally automated. It already exists. Yes. And by the way, you don't have to go get it. Siri, call McFlugles and order me a giant something. Right. Right to your door.

Liz Herl: Now, a person delivers that, though.

Tim Caldwell: Do they? We have drone delivery. Now we [00:10:00] are getting to drone delivery.

But there is drone delivery. I mean packaged drone delivery. We already have that.

Liz Herl: Yeah, and that is trying to get expanded. That's in the process of going, actually, through the House to get a bigger, more expansive bill on that too. So that can go even more so,

Tim Caldwell: You know, so, let me just finish that thought.

Yes, please do. And the person who was making comment in the podcast, or in what we were listening to in the gym, he said that he had a very, let's just say, not utopian problem with the AI that exists. We have created either a savior or our new master, and for it to be your savior, you have to be willing to give up a lot.

Now, it'll be incremental, but the reason I say a lot, it would be, at some point, it's going to want to become your master. If it gives you everything, pretty soon it's going to [00:11:00] say, I do everything and your power of selection and choice will be limited. Now you can't go to the, you've given it up so much, now you don't even know how to go to the supermarket.

Liz Herl: Well, I was going to say, it's an intellectual decline. For an individual, you don't have to think. You don't have to operate. That's my

Tim Caldwell: point. That is exactly my point. Yeah.

Liz Herl: That's the other thing. I'll get to that. But it's who's becoming more machine and who's becoming more human, right? Because you're so lost to that.

So, what about, and this kind of again, goes right back to psychotherapy, but, what about the loss of human contact? Like, we are already at, you know, post COVID of not wanting to interact with people. Two

Tim Caldwell: years, almost sequestered

Liz Herl: and separated. Sure. And we're still struggling to, to re-engage with one another.

Still coming out of that. Yeah. And now there's this huge push for AI. Well, that helps me stay [00:12:00] more, you know, isolated. That's right. And gives me justification for it as well. That's right. Now what we know about, we need interaction from one another, like our brains need interaction with each other. Well to have any higher level of intellect anyhow.

Tim Caldwell: So here's something, you've probably heard me say this before but I'm going to say this probably for the first time. A professor in a college, Clark W. Lacey, I don't think his middle name was W but for some reason it sticks in my mind, Clark Lacey. He taught me, this, it was graffiti written on an outhouse wall in the thirties.

And the graffiti read, Pet, Pat, Play, and Provide Parlance for the Human Primate. It's a lot of Ps. It's a lot of Ps. But in that, he breaks down that this is almost the formula to remain human. Pet, pat, play, and provide. Pet, people need to have [00:13:00] contact and be stroked mentally. Physical touch. Physical touch.

Pat, we do need, we do need, we do need physical contact. Play, we have to be able to interact. Dialogue. And, and, well that's the parlance. Pet, pat, play, provide parlance. That's we have to have that means of communication. Remove any of, any of those elements and we are no longer human. Which I, that's stuck with me since the 80s.

That, that's a powerful thing. And

Liz Herl: yeah, you've mentioned that to me now that you say it a few times now.

Tim Caldwell: Yes. Pet, pat, play, and provide parlance for the human primate. So, post COVID, we know that we have, we now have these young people tragically cut off from the world. They're not exercising, they're not interacting, and their education is limited.

And their, what was terrible education, is now emojis. Parlance takes a nosedive. You know as well as I [00:14:00] do, you've talked to young, young kids today on the phone, and, are you still there? There's no communication, there's no sharing, there's no okay, right, and then click.

Liz Herl: Like, right, in an articulate interaction of language, of, yeah, language used.

And when you say that, I told you my pet peeve. My children, all three of my children know that, I do not text them and interact with them with acronyms. Yeah, me either. It bothers me, and I will not respond to them, and they'll be like, Mom, I'm like, and then, like, fine, and they know. I'm like, no, no, no, you know the English language, and you are educated grammatically, and you will utilize that.

And Mom, nobody does that. I'm like, you will. I do. And I'm like, that's what I tell them. They know, I know. But I'm like, old school, right? But no, it shouldn't be old school. It should be great. Anyhow, I, that's another thing that is being lost. Now, in my home, they're like, well, with my friends, you can do that with your friends.

I don't [00:15:00] recommend it. Right. But with me, if you want a response, you will write me a written response. Now, you say this and it was really interesting, my son, who is about to turn 19, he and I were having a conversation and when you were stating that, it just popped into my head and that, he said, you know, mom, I feel like I'm so out of sync with people my age.

And I said, well, what do you mean by that? He said, I would rather pick up the phone and talk to someone than text them. And I said, you are, but I love that. Like I always say, I'm like, you have an old soul. I love it. But he said, but then when I go to call someone, they're just kind of like, almost like, what are you doing calling me?

Like,

Tim Caldwell: oh yeah, that's true.

Liz Herl: That's crazy. Yeah, I know. It's like, didn't, didn't you know how to text? And he's like, yeah, I actually just wanted, and then we went into that. Let's talk about human interaction for a second, if I can talk. And that is, you can hear inflection, you can hear emotion, you can hear laughter, you can hear joy, you can hear anger, you can, right, you can hear all of these things.

You can [00:16:00] feel the conversation. You can

Tim Caldwell: hear a person smile on the phone. Right. You can hear a person sad on

Liz Herl: the phone. You can hear a person sad on the phone. That's right. And you miss all of that, of course, through text. Now, I'm not saying, circling back to AI, that they aren't making AI sound pretty darn good.

I know. Seductive. I know. They sound, we were, you were just sharing with me about these scams where they're recording people's voices and doing certain

Tim Caldwell: things. Yeah, , a person receives a phone call. Phone call goes, I'll play out the scenario. Grandma, it's me, Julie. I'm broke down. My car's a mess.

I've had to call for a tow truck. Now the problem is I can't get my bank card to go through. And he's going to leave me here stranded on a thing. Will you send this money to this account? And I'm going to give you the number. 5 5 5 Grandma does. Grandma does. Well, we know that that's a scam.

We know that AI is so, [00:17:00] so good at copying your voice, your inflection. It only takes a few words. In fact, they can build a big profile of your voice, its individual profile, just by you simply saying hello. They will build an entire thing, and this whole entire scenario.

So now they're instructing people when you're answering the phone, who's calling? That's how you're gonna, that's how you're gonna answer the phone, who's calling? We had a whole thing, but it still doesn't prevent them from building a profile. It's just that, in that you're trying to, you're, what you're trying to do is make sure that there are key words that they can't build off of, right?

So, that's probably a poor example. I don't remember exactly what the example was to deter that but the idea is to not say hello and then communicate freely. You need to go Don't say anything. Call the number. Call your daughter and say are you, [00:18:00] are you actually, no I'm not. Click. You're done. Try not to give up anything.

This is tricky now.

Liz Herl: That is really tricky, and I apologize to listeners and viewers, because I will be more diligent in the future. I am referring back to what we were listening to this morning, and I want to make sure I provide that information. So in the future I will try and look back and see what podcast we were listening to.

We can provide that. But yet again, he, the doctor stated that he was outraged that he, his voice was dictated over a book. He was so, you know what I'm talking about?

Tim Caldwell: That's right, yeah and he was so, we'll provide, we'll provide that.

Liz Herl: Yeah, and I thought, oh my goodness, this whole, you know, but I also, and again, I'm trying not to, I want to make sure we stay aligned in our agenda here, but to springboard off of that, I know that, you know, Dr. Peterson said that, I think that they had, he had AI do something regarding his book, and I can't, I don't want to [00:19:00] speak to that.

Tim Caldwell: Dr. Jordan Peterson actually went to ChatGPT and said, write me a paper. Over one of his books? Make it sound like me, and here's the title and here's the buzzwords. Ding, ding, ding, ding, ding. And that thing spit out a paper. That's what it is. Within a minute.

And he read it and he said, if I, if I didn't know, I didn't write this, I'd have no idea that it wasn't me. That's alarming. And he's one of the greatest minds. Intellects in the world. That's right. Oh yeah, that's an opinion.

Liz Herl: That's my opinion. I mean, I just want to say my opinion. So this kind of goes back to something you mentioned earlier.

What happens when a person loses their dignity as free individuals when every purchase or move they make is a decision that is monitored? Kind of like when you say your, your options get limited. Yeah. Because something, someone else is making more of that decision making process.

Tim Caldwell: We, we, we have to, the, one of the biggest lessons that is probably going to be overlooked, and it is [00:20:00] crucial that people understand.

Is that anytime we're on a device, anytime we're around any type of device, it's listening. It's watching, it's learning, it's profiling. Everyone's like, that's conspiracy theory. No, that's real. It's not. It's, that's very real. It's very real. When you go onto these chats, when you go, when you use artificial intelligence for locating this, I use, I use maps.

I like to see how far away I'm from, even though I typically use my own sense of direction. I still wanna know what, what's the time from here? And I'll pull up a map and there it is. It's still profiling travel. It's profiling everything you do. Remember when you go to the grocery store, there's a camera right up there.

That camera sees you. It sees what you buy. You now have a receipt. That receipt's in the computer. It knows what you buy, when you buy it. It knows how much you spend. What about your bank? Everything.

Liz Herl: I mean, it's just the expansion. I mean, those are minimal things. Everything. I learned about a couple of different ones, ChatGPT was one of them, at the seminar, as well as, it sounds really kind [00:21:00] of almost insulting, my opinion, yet again, but another mental health resource within AI is called Woebot, and I was able to watch them interact with this Woebot

AI, and ask certain questions.

Tim Caldwell: Wait, is this the artificial, they built a robot?

Liz Herl: No, this is just on, on chat. Okay. It's just a messaging bot. Okay, gotcha. And it's an app you put on your phone, and you can interact with it whenever you're feeling down. Right. Or however you're feeling. Another popular one is Wysa, and they're kind of centered, they're focused around the mental health field.

Yeah. And it's interesting because, how they're creating maybe the algorithms and ways to interact with individuals around suffering from any walks of life, whether it be a traumatic episode or, or a mental health illness that they've suffered for many years. When they're [00:22:00] interacting, they have like this.

Like, viewpoint of what the AI is taking in is all digitally monitoring their body language or their facial expressions and then it's being generated to calculate a response to that, right? It's reduced

Tim Caldwell: emotional response into an algorithmic calculation. Right. Yeah.

Liz Herl: And so if someone turns away or that looks like that disappointed you, I mean, that sounds really very, very comforting, right?

Yeah. Oh, you look sad.

Tim Caldwell: So you and I watched a video of a young girl, and she's telling her plight about going to college. She was so excited about going to college. She went to college. And in her very first exposure at college, she realized she had no friends. And she was miserable, remember? I recall this now.

So, she's so miserable, so what she does is she reaches out, by phone, she reaches out to an artificial intelligence, and she begins to ask questions, they begin to build a profile, and she has a day to day [00:23:00] A couple times a day. I don't remember what they suggested in the video, but she has now a friend on the phone.

Sure she does. An avatar. Absolutely. A voice that talks to her and she is totally willing to go along with that. So, my point is, is that a curative answer to her problems? No.

Liz Herl: Absolutely not. Absolutely not. And it just raises so many more,

Tim Caldwell: it deepens, it, it deepens her plight, truthfully, because not only is this device now enabling this communication, which is artificial, but it still diminishes her ability to communicate with people, flesh and blood, right around her. She would rather talk to an artificial than talk to... that's tragic. That's tragic.

Liz Herl: Well, and it's not even addressing the issue at hand. I mean, it's, it's completely [00:24:00] masking the issue of why this individual is struggling to interact with other people.

But, yes, one of the things, when you were talking about that, that again popped into my head. When I was playing around with these different ones, and there were quite a few different ones out there, not just again for psychotherapy, that's where I was kind of looking at it, I found that I was setting up all these profiles, which someone's going to think I'm in desperate need because I have, like, all these AI profiles for psychotherapy. And I shared this with you prior to us kind of going over our information: the little screen that popped up for a second. And then, does anyone read that information? Oh yeah,

Tim Caldwell: that's kind of the, let's say like a prescription drug, it gives you the little scroll

Liz Herl: at the bottom. Yeah, like the little bit.

Like, for ChatGPT, that's a big one that everyone goes to, it has three little boxes. Yeah, so pay attention to this. It does say, like, chat history may be reviewed or [00:25:00] used to improve our services. Learn more about your choices in our Help Center. It doesn't say what it's kind of gathering there.

Check your facts. It says ChatGPT may be inaccurate with information. It's not intended to give advice. It does say that. And, oh, but it does say, ask away, ChatGPT can answer questions, help you learn, write code, brainstorm together, and much, much more. So it's very, very interesting.

Tim Caldwell: So when I, when I hear stuff like that, it's the strategist in me, right? If I knew that I had this devious agenda, and I knew that this person is using this thing, and I can dip into that information and go, well, this person tells me she's this and she's this, she talks about her cousin, she's, once again, collecting information.

I just have, I just have the biggest cynicism to the fact that that thing is collecting information. It's not of any curative value, right? So they're just telling you right off the bat that we're not giving you advice. [00:26:00] But do you think the person speaking to that doesn't think that this is good advice?

That's exactly what they're thinking. Oh, that's great. Oh my goodness. That person, that person, the human who's using that service, it has the ability to use an avatar. You can build an avatar of a pretty girl about my age, or whatever, it's voice, an accent, it can build all of this stuff. We talk about your, is it your car navigation in your car?

Liz Herl: My Siri. Your Siri. That I call, I call it Chris. She thinks it's for Chris Hemsworth. Yes. Oh, okay. Whatever. Because he is the Australian. Well, it's that my, it's not, my Siri

Tim Caldwell: is, it's not even close, so whatever. But the,, so well, it, it, yeah, it sounds more like, teach, it sounds like, a Douglas Murray to me, but in, in leave Chris alone, in that it's, in that it's tragic that people would be so willing to give up stuff like that

and we live in an age where power comes from information. And when you're willing to just give up [00:27:00] information to somebody you don't know, somebody, I said somebody, to people, organizations,

Liz Herl: a computer. Somebody, this information is going to somebody of some sort, because that was the other thing about consent.

Yes. Going in line with this, which goes right back to what is lost, what is threatened with AI: confidentiality is huge. And the next one was even more problematic, because it states, I understand, you have to click these, by the way, but I didn't click them and it sent me right into it. So. Yeah, I feel like that's a little suspicious.

Takes you right down there, huh? I was still able to keep going. I understand my data protection rights and consent to my data being stored and shared. Yeah. Period. Yeah.

Tim Caldwell: With who? Yeah.

Liz Herl: That's the end of it, by the way. There's nothing else to be said. Oh, and then it says, well, then I'm saying I'm 18 years of age and older, and oh, I'm happy to receive offers and emails and all that jazz

Tim Caldwell: stuff.

So, not only are they using you as an advertisement tool, [00:28:00] right? They're collecting that information in the cloud or wherever it goes, but more to the point is 20 years from now you go to apply for a position, and they go I want to help rehab people from alcoholism or domestic abuse or whatever.

Oh, okay. They put in the information, they pull up on another AI thing, and they've compiled a big thing on you. It says here you've had some mental imbalance, it looks like you talked to an artificial intelligence for ten years and blah blah blah. Really? That's, we already see people who go in and apply for positions, and you don't have to be high powered positions.

I even see this now for football players, college coaches are going through their social medias for their players, and they will make, they will make a decision of what type of player they're gonna have by what they see on social media. They will use that

Liz Herl: against you. Well, yeah, that's, that's something actually, I, I told my children, cautiously be careful what you're putting in Snapchat or it's forever in any [00:29:00] of social media.

Not just, you know, pictures, but your opinion of self. It could be reviewed and that could really give someone a viewpoint of you that's not really true. As an employer, is what I said to them. Not, I don't care what your friends think at this point, but as you get older that sticks with you. I mean, we've seen, and again, I'm not going to venture off, we've seen how celebrities have really paid the price of past postings.

Absolutely. Politicians. Everyone, right? Absolutely. They're digging up, they're going back in the history books of Twitter and 20, 30 years. Yeah. Facebook, whatever. Absolutely. And anything, and they're just ripping it out. Well, now you'll have it all neat and tucked away in a pretty pretty quick little box of information, right?

And that really just ignites something in me. I take what I do so incredibly seriously, and with such passion, that when an individual sits across from me revealing information, [00:30:00] the confidentiality of that matters. It's not really the receipt, it's not the legal side; it's holding someone in a space for a minute, being able to give that to them, and knowing there honestly is no judgment. That's kind of what I've always said: you can say whatever you want to say in my office, I don't care. You can cuss, you can scream, you can do whatever you need to, within reason, to get to where you need to go in a healthier way.

Let's put it that way. This next section kind of goes into that: nervous system co-regulation, which is a clinician's job, helping people with regulation not just in session, but being able to regulate their system outside of session, in jobs or relationships, if they've had a traumatic experience. And I have utilized this ever since I've been trained in EMDR.[00:31:00] What is EMDR?

Eye Movement Desensitization and Reprocessing. It's another technique, an amazing tool in psychotherapy. We'll have another episode about that, but it utilizes something in your past that kind of gets in your way. Which is kind of funny, because I cannot stand

public speaking. I don't even like doing this, but here I am. And I've been asked to go and do some conversations, with some talking points, and I'm just like, absolutely not. Well, I've gotten much better at it, which is why I'm here now, but I wondered why this kind of stuck with me. Any time through my undergrad and my grad programs that I had to do any type of presentation, my nervous system would go haywire.

I would perspire really, really badly. My whole neck would become inflamed and I'd be like this red beacon. And I hated it, and I'd be telling myself, it's fine, you're fine. That was really helpful. But I wondered why I [00:32:00] would instantaneously respond this way, and so I dug back into the past, into what happened, and of course it was in childhood.

In third grade, in Mrs. Dahl's class, I'll never forget it, I had a social studies paragraph I had to stand up and read. This was all discovered through my training with EMDR, so this was, again, through psychotherapy, by the way. And I discovered why I tend to do this: when she had me stand up to read it, I just felt the eyes and the snickering, and in that moment, my body did the exact same thing.

It solidified, great, right into my brain. It was really, really wonderful, because then with any type of interaction that recreated that, my body would just go right back there every single time. Yeah. And I was like, oh, I really need to nip that in the bud. After 44 years, I'll get there.

Tim Caldwell: So, just to say that back to you, [00:33:00] it's incredibly important, and we talk about this all the time and I know that it is, that as a psychotherapist, as a behavioral specialist, when a person comes to you and reveals all of these things that are very important, very personal, very private, you don't get that with a machine.

You will not get that. No, right. But in that, it literally is all of the nuance that goes with a human-to-human interaction. Or maybe lack of interaction, right? Just tracking and watching and observing. But you get a chance to then say, look, I see that maybe there are some things we need to work on.

We need to have this discussion. But that is the golden element that would be removed if we brought in artificial

Liz Herl: intelligence. Right. And when it talked about the nervous system component, that is really, again, near and dear to me, because I can start [00:34:00] visiting with an individual, and when we get into some heavier items, there's a notable difference, because that's what happens in our nervous system.

And that's what I'm trained to do: help an individual identify that, first off, and then gain some coping mechanisms, awareness, and tools around it. I'm not saying AI isn't going to be able to do that, because I watched it do that, in that little example of it watching the individual. When it comes back to my mind, it fires me up, because this person looked like they were suffering, and this AI is just like, oh, well, it seems like this is difficult for you. So they're noting dysregulation.

Tim Caldwell: So, let me clear this up, I'm just going to set it up for a minute. We have a person, either an actor or an actual patient, and then we have an avatar on a screen. That is correct. Which happens to be female, I think it was. Yes, it was. [00:35:00] So, what happens is the avatar is tracking this guy, and this is a facial thing, they have facial patterns and dots and all this stuff.

They have it

Liz Herl: pointed to shoulders, and like the upper, but not the full

Tim Caldwell: body. Yeah, that thing is literally tracking you via the camera, and what happens is the patient makes this motion where he kind of shrinks back and turns, or something like that. And the AI picks up on it and goes, I see this is hard for you to talk about, or something to that effect.

Well, that's not how you would address that, is it? Absolutely

Liz Herl: not. That's right. It's like, are you kidding me? And that's what I'm saying, it fires me up, because that is not it. "Oh, I see that that was difficult for you." Good job, genius. Yeah. I mean, my response is, so what's going on for you right now?

Yeah, yeah. And then that is getting them to a point of their own recognition, being aware of their own body and movement. I mean, I don't want to get bogged down with that, but to think that a system can... [00:36:00] And I shared this with you earlier: we are energy beings, we can connect, we can feel each other's energy. It's kind of like when someone walks into a room that no one's a big fan of, and the temperature drops like 50 degrees and everybody's aware of it.

Well, AI can't do that. No, it can't. It can't be aware of that. I mean, of course, algorithmed out the wazoo, it might be able to say, well, it looks like you don't care for

Tim Caldwell: that person. Yeah, well, I'm going to push back a little bit. I bet AI has developed those types of things. However, it's still the human interaction.

It has come so far, and you know I'm right. It's so pervasive, so invasive, that it's everywhere. But in the realm of real treatment, you and I will agree that it's this human-to-human interaction. It's the only true way to show empathy, compassion. That's not going to come across on a [00:37:00] computer.

Liz Herl: Correct. Which is kind of my outline I have here: embodied resonance, the experience of having loneliness and shame undone through the shared experience of human embodiment, of love and loss, disease, touch, nonverbals, limitations, catastrophes, and all the things. I mean, emotions are inseparable from the body.

That's one of the biggest things about how our collaboration even came to be: the physical side and the psychological and emotional components of self shouldn't be separated out, because they're inseparable. When we feel emotion, it changes us. That's right. Again, when I go back to an AI saying "it was my pleasure": what? You don't even know what pleasure is.

Tim Caldwell: Yeah. So, in playing with this thing, you can pose questions to it and sample it. Oh, yes, yes. Now, [00:38:00] one of the interesting things was, in the questions that are posed, it'll answer, and it's always profiling you, profiling why you do what you do. But then if you get to a point that's beyond that, let's say the patient or the person participating in this goes, you know what, this isn't getting me anywhere, I think I'll just end it all.

The end game for that AI is, let me put you in contact with a human. Yes. Which I

Liz Herl: find amazing. Well, all of the ones that I encountered, the ones I wouldn't create an account for, when it came to that, they were going, well, let me connect you to a real live human.

Yeah,

Tim Caldwell: which I find to be amazingly [00:39:00] hypocritical, but proper, in that you've already surpassed my ability and now I'm going to give you back to a human. I think, in the understanding from the podcast that I'd listened to and from other websites, that AGI, artificial general intelligence,

is more intense.

Liz Herl: More intense? Yes. I shouldn't say worse, that sounds like... it's the more intense of the two.

Tim Caldwell: Exactly. AGI is the learning, right? It's actually the learning, and it has the ability to build and build and build. But I just find it to be amazing that, you know what?

Let me get you back to a human. You've obviously gone too far. Well, I've spilled my guts to you for months, and then you're going to take me back to a human. Who [00:40:00] knows nothing about me, by the way. Yeah, but we're starting

Liz Herl: over. Yeah. Which is, I know, their standpoint, or their speaking point, for AI psychotherapy.

I am very aware that for individuals, finding a clinician with whom you can create a therapeutic alliance is key; that therapeutic alliance relationship is core for anyone in their treatment process. And when you find that, having it is something people hold value in, as they should.

Now, the challenging part with big agencies, or with individuals maybe not in private practice, is the recycling of therapists: a therapist leaves a place, or moves, or does different things, and then you have to get a new therapist and start all over again. Yeah. Well, an AI therapist follows you everywhere.

In the palm of your hand, and you'll never have to tell your story all over again. And that in itself is traumatizing, when an individual has to [00:41:00] start over. Even when there's sharing of some medical files and what have you, you're still having to start over with another person.

And that's, you know, another point they can stand on: you don't have to start over. You'll always have this AI therapist with you. Yeah.

Tim Caldwell: Here's the question: how long do people go to therapy?

Liz Herl: Well, I think that really does depend on the line of care and what you're treating at the time.

Yeah. I think I've said this before, and you'll hear me say it a lot: I always share with the individuals that I serve that my job is to put myself out of a job. Right. Same. Same. And that's always kind of an interesting response that I get, because the idea is to really own oneself and be able to identify and heal and move forward in a healthy capacity.

Educate and give

Tim Caldwell: them the

Liz Herl: tools. Well, the ownership of oneself is crucial. Yes. It's actually really [00:42:00] important that a therapist and clinician is aware when there is a dependency. Yeah. It absolutely happens, and every clinician has to be able to identify when it's time to push them toward independence. Well, that's one thing AI would not do.

I mean, it'll literally be in your palm every single day. Any moment: I need reassurance, I need you to make me feel better about myself, I'm not ugly today, I'm not this today, I need this. It's really a false sense of reassurance; this algorithm is just literally placating you, making you feel good.

And I know that sounds really harsh, but I want people to know, within self: no matter what you've done and what you've persevered against, you did this on your own. You have created this, not the reliance on an AI or even on another individual. Yes, we all need validation and support, and we need each other.

I'm not going to say those aren't things we need, but to have that as your resource every day, as your check-in [00:43:00] to make sure you're getting through your day with some level of movement, is not healthy, and

Tim Caldwell: they essentially become a permanent crutch. Correct. Right? And it's one thing to have a touchstone, right?

It's another thing to have a crutch. But I will reflect that on my side of the shop, and my side of the shop is professional training. I've been doing this 38 years. I started as just this wide-eyed kid who people wanted help from, and that was great, because I was competitive and I probably knew more than the average bear.

But the whole point is, you must be asking yourself, at what point does this end, right? I've trained people who've gone on to be champs. I've had two women athletes who went on to become pros in bodybuilding and in competitive sports.

However, my job is to educate you. [00:44:00] The time that you spend with me, you're not paying for my time to count to ten and say good job. You're counting to ten, but why did we do that? What was the purpose? What did you take away from that? And the next time you come back, I'd better see that again, because over time, you should need me less. And there is a time... I've cut the cord with people. I go, man, you guys, you're doing great.

Go, you know, go. As my professor used to say, everybody needs to go hang the art, right? At some point, you can't just keep diddling with this thing forever. At some point you have to stop. Hang the art and be critical of it. If you don't like it, take it back down and do it again. But hang the art, and let's see how it works.

So now you take the tools that you've learned from me and you go do your thing. If you need me, I'm here. But I'm encouraging you to take what we've learned and do what you can, and from that, we'll see how your progress is. Now, do I have people who've stayed with me? I have clients who've been [00:45:00] with me for a long time.

Every time I travel back to Colorado, or every time I go to Wyoming, if those guys find out I'm in the state, they want to work out with me. That's awesome! Well, yeah. Show me how you've done, show me all this stuff. I love that aspect, but I'm not here for

Liz Herl: you forever. Well, I guess I want to clarify that, you know, I've seen people for long periods of time.

Sure. Actually, through almost the entirety of my practice, but not on a consistent basis. I call them booster sessions. Yeah. Yeah. When something else comes up, it's kind of like a booster shot. You come back in for a booster session, for a different lay of the land, kind of thing.

They've done

Tim Caldwell: well for months, years, maybe? Years, yeah. And then they go, Hey, Liz.

Liz Herl: Hey, this came out of left field. Uncertain. My boat got rocked. Yeah, absolutely. That's exactly what I am here for. Yeah, right. It's always, you know, my pleasure to do that, but I think this is just a different resource for [00:46:00] people to have, you know, one-on-one.

I was going to say, just real quickly, I haven't looked into this; you probably have. Are there... oh, I think there are. No, I was getting ready to ask a question. Go ahead. There are AI trainers, right? Personal trainers? Yeah. So what I think is interesting about that, yet again with person-to-person interaction, is you are so big on, just, you know, being a former client and now your partner

on movement, like when I'm doing the movement, am I sitting in the right position? Am I doing the right, like, that

Tim Caldwell: Perfect analogy. That's a perfect analogy. This is why I'm so critical of, online coaches. Which is

Liz Herl: your thing. I mean, you'll do it, but it's not something you love to do. If I,

Tim Caldwell: if I write you a program and I say, every Monday, Wednesday, Friday, you're going to do three sets of ten push-ups,

and then you'll do six sets of five chin-ups or something like that. And then you come rolling in, and I see you in person after you've been with this [00:47:00] trainer for years, and you do a push-up in front of me and I'm like, what the hell is that? That's not a push-up.

Right, right, right. No, the whole point about being a good trainer, now this will be something that will sting, but I mean this: I don't think you can be a good professional trainer remotely, and I have yet to see it done. Every single person I've trained in my life... and in those 38 years, I've heard people claim that they've trained thousands, tens of thousands.

They're in their 20s and 30s. Really? Well, I'm about to turn 60, and I probably only have maybe 300 people under my belt. Maybe. Short term, long term, whatever. But every single one of them was face to face. I see them every day in the gym, and I know what they trained, and that's how I think they've progressed better and better.

How can you do that? How can you possibly do that if you don't have somebody there face to face? You can do 19 out of 20 steps in a procedure correctly, but if you miss one, then it's not right. It's just not right. And that's the problem with AI. Can it hear stress in my voice?

Is it reading my eyes, my body? Yeah, sure. But maybe there are key words in there that are just being missed, because they're not part of its lexicon, but they are for a trained professional. Right. A human.

Liz Herl: A human. A human trained professional. Right. And I mean, we've really just touched on, honestly, from both sides, so many of the issues and concerns regarding AI.

And again, I mean, it's not like a broken record; this isn't changing the fact that AI is here. It's awareness: you really need to pay attention to these things and what you're, you know, following,

Tim Caldwell: I guess. In that, for your profession and for mine, [00:49:00] I don't want people to misunderstand: maybe it's not the best of trainers, but at least people are making a step toward betterment, right?

You know, I've seen people who are recovering from injury. If you've never seen a treadmill for morbidly obese people, it's called an anti-gravity treadmill; they actually put them in a harness. They're barely putting weight down, like 10 percent of their body weight, while moving.

Well, you'd ask yourself, what good does that do? It's better than the guy who's sitting on the sofa. You're doing more than he is. So, in that, in your realm, we ask ourselves: well, if there's artificial intelligence that goes into physical therapy, at least the guy is talking to something. Oh, you mean psychotherapy?

Psychotherapy.

Liz Herl: What did I say? Physical therapy. Physical, physical therapy. You're in your jam. Yeah.

Tim Caldwell: Yeah, yeah. Well, so, in that,

Liz Herl: I agree, there are levels to this, you know, there are levels. I am not poo-pooing this [00:50:00] whole idea. Right. I guess it sounds like I am, I know. I am cautiously aware of the slippery-slope mentality, that as with anything, things can go that way. Well, this is one of them. Well said.

And I think that people in dire straits, or wherever they're at, maybe just need to feel something, like something cares, even if it is a machine or a program or whatever. Oh, we just want to feel some sort of connection. Yeah. Then absolutely, grab onto something.

Don't completely lose yourself, right? Right. And I think these can be resources for that. Right. Again, I just think you have to be careful. The reason I bring this up is because there are levels this goes to. Again, in our society we're very much about quick gratification, and we're already talking about how we're on our devices entirely too much, how we're addicted to our devices, like you said.

No argument. I always love your analogy of the... what do you say? The [00:51:00] scratching-at-glass mentality. We've reduced

Tim Caldwell: ourselves to animals scratching at glass. Yeah,

Liz Herl: I really love that, because that's exactly what we're doing. And the areas where this can go... I mean, we've touched on confidentiality, on our nervous system, on the embodiment of having a body, and on our emotional connections.

But when we go further, into just the very tip of the iceberg of this, it goes back to sleep disturbance issues, where someone's thinking, well, I am getting help for my sleep issues because I'm talking to someone all night about my sleep. We're already talking about how much access we have. Even... I got a newer AI phone, and I can't get the screen to go completely black, you know, so I just put it in another room, because even the illumination gets to my brain, even with my eyes closed.

Oh, you mean in time of rest? In time of rest. Yeah. It's always illuminated, and I can see it, and my [00:52:00] brain's like, oh look, there's something over there. Look, something shiny. So,

Tim Caldwell: I'll ask you then, do you have Alexa?

Liz Herl: I don't. I mean, the only thing I have, and I have to be careful, because it's here.

Yeah, I know. It's Siri, I know. Yeah. Because it'll start saying, how can I help you, and Chris will start talking to us. See, so that. But what I'm saying is that we have so much access now that we're getting less and less sleep. That's one of the things the research found: we're getting less and less sleep, and we need sleep for our brains to operate, and for us to engage and be productive and motivated and all those things.

And then, you know, you can say, well, I'm talking about something that is bothering me. Well, does AI have the ability, and I'm not saying it won't, to say: you know, it's 3:30 in the morning, you really should get some sleep, and let's see, what are three [00:53:00] breathing exercises we can get you to do before you go to sleep?

Something like that. That's what I'm

Tim Caldwell: saying. So, I want to just say this. Boom. I don't do it now, because I was driving my family nuts, but I have a phone full of alarms. No, I'm aware. My alarms go off all the time. Yes. But what people don't realize is that I'm up early to prepare food for clients and myself every day, fresh food.

And then my first client's at 5 a.m., so, in that, I have an alarm to go to bed. Oh, I didn't know that. I have an 8:30 alarm to go to bed. Now, that seems rather bizarre, I know it sounds strange, but it's because I'm dependent on the sleep that I get, right? And that is part of my makeup, to make sure people understand it's time to go to bed.

But I also have an alarm an hour before that, one that seems totally irrelevant. And you know what that tells me? Get off any device. [00:54:00] I don't want any device in my head for at least an hour before I go to bed. It's not just that I'm amplifying all of the information flowing through my mind; I need to decompress.

But it's also all of the screen light, right? White light, blue light. You need to be shifting over to red light. You need to be slowing down the processes of your brain. And I know we're going a little bit off track here, but

Liz Herl: Those are the

Tim Caldwell: disciplines of being away from technology, so that our brains can get back to a regular circadian rhythm.

Right, that's what I'm

Liz Herl: saying. Is AI going to disengage this psychotherapy session in the middle of the night and say, you need to be sleeping?

Tim Caldwell: So, hypothetically: I train well, I eat well, but I'm not resting well. Right. Wash it out, because it did you no good. You have to have that reparative time, that downtime, for your brain to function.

That

Liz Herl: is absolutely right. That's the scientifically proven [00:55:00] research you're talking about: it's an impairment to your cognitive ability, with reduced gray matter in your brain, and that's actually linked to a heightened risk of diseases like Alzheimer's. You

Tim Caldwell: can go, I would venture to say, weeks without

Liz Herl: eating.

Yeah, we

Tim Caldwell: talk about that. You can go days without drinking, but without sleep, you'll go insane. You will go insane. And not only that, not only are all those factors involved, but so are the cortisol and hormonal responses that come from improper sleep and improper rest.

As the cortisol goes up, you begin to just literally scavenge the good from your muscle. You begin to shrink and waste away, cachexia sets in, and it's so horrible. We have wandered off track, but I do want people to understand that this is all tied together. No, it is all tied together. [00:56:00] And as we walk down this AI trail, which, by the way, we're all on, mm-hmm,

whether you want to admit it or not. In some way, shape, or form, you are a part of this, and you need to make sure that you're managing it as best you can. Sure.

Liz Herl: And going back to that, I've kind of shared that they're showing ADHD has the ability to increase. Well, I'd say that ADHD should be seen as on a spectrum.

And we've trained our brains to take on so much information, so rapidly, at all given times, that the task bars in our brains are open, like 50 different tasks open at once. That's in your mind. Yeah. Just in your head. And you're trying to analyze and solve each one and then go back, and, you know.

Yeah. Yeah. And they're saying it's just the rapid pace of that. And now you're connecting that to the [00:57:00] device; still, yeah, you're still interacting with the very thing that drives you. Yeah, right.

Tim Caldwell: Anyway, that's a wicked brew that you're playing with there. And if you don't separate yourself away and get yourself disciplined enough to say,

there's time for this and there's not time for this. There's time for what Jeff Bezos calls puttering around. You wake up in the morning, you need to be slow to rise. Get yourself going, do some simple tasks, and then slowly work your way into it. Then find your rhythm, go full blast, do whatever you need to do.

But it has to happen. There has to be regularity in it. There just has to be. And there has to be a time when there is none. And that's how we separate this. And I always want to make sure that I clarify. When we talk about this, and you already know how I feel about this, I'm not saying no to AI, especially for people who need help.

And who is that, Liz? That's everybody. That is everybody. We are all broken in some way, shape, or form. That is everybody. We may be high performing, [00:58:00] popular people, you know, good with a lot of things, but everybody has some skeletons. Everybody's carrying some baggage that they could easily drop if they had the tools.

So I'm not saying this isn't a good thing. I'm just asking, is it the proper thing? And I want people to understand, especially those who are affected by PTSD, my military brothers, those people who are first responders, fire: at some point in your life, sometime in your career, you're going to see some things, or be a part of some things, that will rip away your innocence.

And whether it happens at 6 years old or 60 years old, if it's upset the balance of the way the world works for you, and you have to work in it, you need to speak to somebody about it. And that's what this is all about: reaching out. That's why I think of gyms or churches; there's fellowship, there's camaraderie, you're being held accountable and responsible, we help and [00:59:00] encourage.

And most people in the gym business, if they're truly in it for the right reasons, are in it because they want to see people get better. And I don't care if you come to the gym every day. Come to the gym every day; I encourage that. But I want people to get better.

Yeah.

Liz Herl: The other, last point I want to make here, and this is what I started the whole conversation off with, is the actual increase in loneliness and disorientation of self. It's off the charts. We're shrinking so far back from one another that it's going to impair us.

Well, we're already impaired. So it's going to,

Tim Caldwell: I don't have the suicide or depression stats in front of me, but I know for a fact that suicide and self-harm are off the charts, at all ages. This separation during COVID and the pandemic, and all of these things that have separated us...[01:00:00]

We just see people suffering all around us, and we need to be able to come back together. And the only way I know how to do that is, you need to reach out and communicate with people. You need to find people you can speak to.

Liz Herl: Well, this is what I mean when I say the psychological aspect of what it can do to one's mind: this living in a fantasy world.

And I know a lot of people are into D&D, and that's great, so I don't want to be dismissive; that's awesome. I was not aware of the commitment to Dungeons & Dragons. I mean, it's a series of stories, you know, someone designs it, and there's the collaboration of all these individuals who come together. It's [01:01:00] actually really interesting, to be honest.

Now, I'm only utilizing that as a way of saying that if we reduce ourselves into a false fantasy of AI technology, not just for psychotherapy but across the wide span of it... they had stated that as we become more comfortable, our bubble increases, and the similarity between our reality and others' decreases.

This will further incentivize us to prefer our personalized worlds over shared objective reality, if you will. It would change our beliefs and our comfort; it would change us. It would change us in a way where we could create our own society within our home, and it's all false reality. I mean, I don't want to go Matrix on you, but

Tim Caldwell: Man, from what you're saying there, and I do understand what you are saying, I would say there is the potential for just this exponential thing from that, right?

If you're not, and I'll be as delicate as I can, if you're not the sportsman, if you're not built for athletics, if you have no interest in it, maybe you like books, maybe you're bookish, maybe you're a game player. I'm bookish. So if you're into D&D or you play video games all the time, that becomes a big part of your world.

Absolutely. Pretty soon it becomes a lot of your world, right? And as you know, an addiction is anything that interferes with what you should be doing normally. Correct. You're doing more and more stuff here. Well, when you were talking about the bubble growing, see, I think we see that in people who cosplay.

Mm hmm. I think cosplay has turned into a new aspect of itself. I won't say what it is, but I think in society we see people who are playing out these [01:03:00] things that they think they are, and in that, they believe that everybody should address them as that, right? And this cosplay has become real play, and real play is literally psychologically embedded in yourself now, where you can't tell truth from falsehood.

And I

Liz Herl: think that's really tragic. I sent you a reel the other day about how, when you create a false reality of self out of so many storylines and stories, you actually create false memories. You've made everything up to believe that that's exactly who you are. And you're so firm in it, you know, that this is who I am.

And I do see some really big concerns about how we reduce ourselves in this. And again, I want to make sure we kind of narrow this down and wrap this up. But when we think about individuals that are, and [01:04:00] I don't want to put it like this, staying in their basements and playing video games and not leaving their homes, and now, post-COVID, we have everything come to our door: groceries, food, entertainment, anything you want.

What's the term for that?

Tim Caldwell: Cocooning. It's actually a societal plan, right? You don't have to do anything. You don't have to go to work. You don't have to travel. We'll bring it all to you.

Liz Herl: You know what? I'm so glad you said that. And again, my son had just taken a train trip. He's such an amazing explorer.

I love his spirit, and he spoke with all these different individuals on the train, and they thought it was really interesting that a young man was speaking to them, by the way, because young men don't do that. They're usually just looking down, tapping on their phones.

But he had them sign the book. He had all these great signatures in the back of his notebook. It's incredibly awesome. And one of them said, and I might be messing it up, but he said, you know, it's so nice that [01:05:00] some young people still do this. That people are not made to be looked at, they're made to be spoken to, or something along those lines.

Because normally, when you take the train, people just look at you but don't talk to you. Seen, not heard. Yeah, but it was a really great statement. I'll have to clarify in the future what he actually said. But I told my son, I said, I love that idea, that people are not just to be looked at.

They're to be engaged with and spoken to. Everybody's got a story. Right. And it's not this competitive drive of, is my story better than your story? Is my life better than your life? And all that nonsense. It's just intrigue and interest. Yeah, it's

Tim Caldwell: neat when we go to lunch, or when we go out post-production, or whatever like that.

Oh, yeah. We'll sit and we'll watch people. Oh, yeah. You and I will go over and speak to them, and they think that's neat. Now, I will get a sharp elbow in the ribs from my family, because I've been in elevators with people in wheelchairs who only have one leg, and I will literally go, hey, man, how'd you lose your leg? [01:06:00]

And that guy will open up. And later my family's going, why do you do that? That's embarrassing. Because that guy's got a story and he's dying to tell it. And he loves the fact that I care. Do that. Reach out. Is it a hundred percent? No. Some people are very hurt and bitter and they may snap back, but I guarantee you they still want to talk about it.

Liz Herl: One really quick story, since you said that. Yeah, go ahead. I have to say this. Talking about engaging with people and our interactions: I have a beautiful, innocent nine-year-old little girl, and we were at Walmart checking out, and this checker was putting our items through, and she had a splint on her finger. And my nine-year-old, you know what she said?

What'd you do to your finger? Yeah. And oh my goodness, it was quite adorable, but she stopped, like, checking out. She goes, well, sweetheart, and she takes it off, and she goes, do you see this bump here? Do you know what gout is? That's funny. I'm like, okay, yeah, [01:07:00] that's great. And my daughter intently just looked at her with so much innocence, and she's like, no?

And she's like, well, my doctor messed up my blood pressure medicine, and she literally told my nine-year-old, by the way, this whole dramatic story. And my daughter's like, well, I hope your finger gets better soon. Well, it's better. I can almost move it some. And I just thought, this is exactly what I'm talking about.

It's that easy. It is that easy. And that's the innocence of children. They'll just ask you. Like, you know, if you have an eye patch on, they're like, what happened to your eye? You know? She had this splint on, and she was like, what happened to your finger? And we lose that. We're losing that.

You know, in a

Tim Caldwell: qualifying statement, I will say this too, just to tag onto both of our stories: I have never encountered a person who got mad at me. Never. And I've asked people who've lost their legs and their eyes, who've been severely burned. I've asked them. But you know what? It's not like, dude, what'd you do to your face?

No, I didn't do that. I just came up and I said, wow, what happened? And they [01:08:00] tell you.

Liz Herl: Well, there's something about intently engaging somebody with care, you know, authentic engagement: what happened there? And I'm asking because I'm curious, I worry about you, and I don't even know you. So AI won't do that?

No, AI won't do that at all. Thank you for bringing us back in. AI won't do that. And I've said my closing thoughts a few times, but saying this, I think, is really, really important. And that is, I told my son, love is in everything, and love is in any interaction that we do. We do not have to know someone intimately or personally to have an act of love in it.

Our words have love in them, and so do our actions. The act of love, of compassion and consideration and encouragement, is love. And I think that is how, when we say, you know, I love so [01:09:00] many different things, we express that. We were having a conversation about this: we can love our friends, we can love certain things, but it's understanding that there is love in everything.

Yeah. And we reduce that in a lot of different ways. But most importantly, my closing point here is that an algorithm, a very soft voice, I can even have AI mimic my voice right now and give you all these words and soft tones, but it will never be able to show the love that I actually have as a human being.

It doesn't have that capacity.

Tim Caldwell: Good point. Really good point. Mine is: you're not alone. Every one of us is broken. Find someone you can trust. Find the opportunity to share, sit down and have coffee. I like to buy breakfast for other people. And he's a big breakfast buyer. When I can buy breakfast for other people, maybe one day I have the opportunity to slide in next to them.

And I've [01:10:00] done that, right? I like to tease the old men gathered in their little parliament of owls. I'll slap down the newspaper and go, what's going on in the world? And then they just start. You can do that too. And it's not enough that others reach out; you reach out too. Right? Take care of yourself.

Because somebody depends on you. Yeah. Thanks.

Liz Herl: Be compassionate toward yourself. Most importantly, take all this information. We'll have the sources on our website, where we got some of this data from. But be aware. Be knowledgeable for yourself. Take control of yourself. Take care of yourself in a compassionate way.

And just be well. Yeah. Thanks, Liz. All right. Thanks, guys. See ya. Bye-bye.