WEBVTT

NOTE
This file was generated by Descript 

00:00:11.906 --> 00:00:17.966
Welcome to Transformative Principal, where I help you stop putting out fires and start leading.

00:00:18.476 --> 00:00:19.976
I'm your host, Jethro Jones.

00:00:19.976 --> 00:00:22.676
You can follow me on Twitter at Jethro Jones.

00:00:35.471 --> 00:00:35.981
Alright.

00:00:35.981 --> 00:00:38.441
Welcome everybody to Transformative Principal.

00:00:38.441 --> 00:00:43.781
I am very excited to have on the podcast today, Linda Burrick.

00:00:43.841 --> 00:00:51.911
She is a behavioral scientist, and she does all kinds of amazing, cool things.

00:00:52.271 --> 00:01:00.101
Uh, she runs a consultancy called Linda B Learning out of Seattle, and we started talking

00:01:00.861 --> 00:01:20.751
uh, last week, before we decided to do the interview, and I was like, we definitely gotta talk, because Linda's been in the learning space for a long time, has a ton of really fascinating perspectives and information and experience about, uh, artificial intelligence.

00:01:20.751 --> 00:01:23.781
Before it was the marketing word, artificial intelligence.

00:01:23.781 --> 00:01:26.871
So Linda, welcome to Transformative Principal.

00:01:26.871 --> 00:01:27.471
Great to have you.

00:01:27.626 --> 00:01:30.686
Thanks so much for inviting me and having the opportunity to chat.

00:01:31.871 --> 00:01:33.941
Yeah, I'm, I'm really excited to talk with you.

00:01:33.941 --> 00:01:37.871
Our conversation last week really inspired me and made me think about some things.

00:01:37.871 --> 00:01:46.691
And I wanna start with this idea of artificial intelligence and how, uh, you mentioned that it's just a marketing term and that you've been doing this for a long time.

00:01:46.871 --> 00:01:53.861
So can you just give like a brief idea of some of the things that you've worked on, uh, already in your career?

00:01:53.996 --> 00:01:54.686
Yeah.

00:01:54.716 --> 00:01:55.196
Yeah.

00:01:55.196 --> 00:02:03.626
So when I talk about artificial intelligence, I use the term machine learning, and then sometimes I refer to the specific kind of statistical learning.

00:02:04.556 --> 00:02:05.966
You know, category that it is.

00:02:05.966 --> 00:02:07.916
So that's typically how I talk about it.

00:02:07.916 --> 00:02:13.436
But because AI is such a buzzword, I, I will use those terms, but please know, that's what I'm talking about.

00:02:13.856 --> 00:02:23.411
And so from that perspective, really, when you're building anything with a computer that helps you memorize or learn things, you are building like a machine learning peer.

00:02:23.591 --> 00:02:27.041
And I don't know that people think about it that way, but that's how I think about it.

00:02:27.041 --> 00:02:31.571
And that was literally the first experience I had with machine learning when I was 14 years old.

00:02:31.871 --> 00:02:38.471
And then, you know, as I got into graduate school, um, I was studying behavior analysis, so that's mostly human and

00:02:39.206 --> 00:02:43.766
animal learning, but we also worked with plants, because plants actually can learn.

00:02:43.856 --> 00:02:46.796
And then for me, I'm like, well, machine learning, obviously, duh.

00:02:46.886 --> 00:02:58.226
So I mean, I worked on things like some of the very early, uh, voice recognition and voice synthesis, um, technologies, because people thought that would be a way better way of interacting or interfacing with a computer.

00:02:58.601 --> 00:03:04.211
Not realizing how difficult it actually is to match a voice when you're thinking about how that actually works.

00:03:04.811 --> 00:03:06.731
Um, and then just progressed on from there.

00:03:06.731 --> 00:03:12.191
You know, I, uh, assisted Paul Allen in building one of the very first online schools ever, before that was even a thing.

00:03:12.431 --> 00:03:15.191
Uh, built my own company and sold it when I was 28.

00:03:15.191 --> 00:03:21.761
And then continued to work, uh, with different companies in Silicon Valley, mostly Sun Microsystems and Oracle for a substantial part of my career.

00:03:21.761 --> 00:03:23.951
But lots of other, uh, places as well.

00:03:23.951 --> 00:03:24.401
And really.

00:03:24.936 --> 00:03:31.566
My role was to get closer to the technology because I wanted to not just be a learning person who develops training.

00:03:31.926 --> 00:03:34.416
Um, you know, and that was something I always used to get is like, well.

00:03:34.796 --> 00:03:37.406
For a training person, you really don't want to do training.

00:03:37.406 --> 00:03:38.546
I'm like, oh, I do training.

00:03:38.546 --> 00:03:39.776
I just put it into the software.

00:03:39.776 --> 00:03:41.846
I'm not about like, everything has to be in a classroom.

00:03:42.056 --> 00:03:42.416
We are.

00:03:43.166 --> 00:03:47.876
That ship has sailed, like, decades before, you know, even when I first started, when I was a teenager.

00:03:47.876 --> 00:03:53.516
So like where we are now with virtual reality and all these other kinds of immersive technologies, the Internet of Things, cloud computing.

00:03:54.056 --> 00:03:55.706
It's opened up so much possibility.

00:03:55.736 --> 00:04:03.806
But the challenge, I think, for people, especially in the K-12 space and higher ed space, is just, you know, sifting the hype from

00:04:04.391 --> 00:04:07.596
What can you actually do and how can it be beneficial for the populations that we're serving?

00:04:09.391 --> 00:04:24.341
Well, and what I, what I think is so interesting is you and I took a, uh, a similar path that we said, how can we make this happen more automatically so that we don't have to spend the human cycles doing the things that machines can do?

00:04:24.671 --> 00:04:26.141
Better than us.

00:04:26.351 --> 00:04:42.071
And so, talk about that idea a little bit, of outsourcing repetition, repeated learnings, repeated practice kinds of things to technology, and why that makes sense and has made sense for decades now.

00:04:42.851 --> 00:04:43.781
Yeah, well.

00:04:45.311 --> 00:04:48.371
Think about what we call these things that we're interfacing with.

00:04:48.371 --> 00:04:49.931
We call them computers.

00:04:49.931 --> 00:04:53.771
So that alone should suggest to you what they're really good at doing.

00:04:53.771 --> 00:04:57.401
That is the underpinning of what technology is really good at.

00:04:57.401 --> 00:04:59.471
And that is computing, right?

00:04:59.471 --> 00:05:00.911
Being able to put things together.

00:05:01.151 --> 00:05:08.471
So I remember back in the day, and you probably remember this too, when you were told that, you know, you can't rely on the calculator, you're not gonna have a calculator, and then, you know, you look at your

00:05:08.531 --> 00:05:10.121
smartphone and you're like, oh, really?

00:05:10.811 --> 00:05:11.201
Yeah.

00:05:11.921 --> 00:05:16.031
Um, so there are so many things that are just in your hand and just in your pocket that you can do.

00:05:16.451 --> 00:05:18.131
But the challenge is that

00:05:19.931 --> 00:05:34.121
We're kind of doing the wrong things with the devices. So yeah, computing, calculating, sorting, memorization, those kinds of repetitive skills where there's a specific kind of response that you're after, and it's a correct response.

00:05:34.211 --> 00:05:41.591
Like I just wrote a newsletter about psychomotor learning and the whole underpinning of psychomotor learning is this notion of there's good form.

00:05:41.616 --> 00:05:44.346
So, and that's what you coach to, is what is that good form?

00:05:44.346 --> 00:05:54.756
You know, that Olympic-level performance, or that surgeon who's now ready to operate, or that, um, potter who's working at the wheel shaping that pottery bowl.

00:05:54.786 --> 00:05:58.507
How do they get to the point where their motor skills are done in a way where

00:05:58.941 --> 00:06:00.111
They're exhibiting good form.

00:06:00.291 --> 00:06:07.221
So for psychomotor skills, it becomes really, really obvious when we're training up physical skills, and we even think about it when you watch children develop, right?

00:06:07.221 --> 00:06:16.431
And they get feedback from their natural environment on how to walk, but they also get feedback, as the repertoires get more complex, from a skilled observer.

00:06:16.761 --> 00:06:21.231
And sometimes the computer can be a better skilled observer depending on what it is that you're training up.

00:06:21.861 --> 00:06:25.071
But if we're trying to train up things that are creative in nature, like.

00:06:25.901 --> 00:06:38.831
Oh, we could have a whole conversation about generative AI and GPTs and that whole thing because I was, again, I was very early in the front of that as well and actually had the opportunity to work for one of the bigger vendors and after reviewing their model declined.

00:06:39.251 --> 00:06:46.871
But um, but yeah, there's definitely things that computers are very good at and can be excellent coaches for.

00:06:47.606 --> 00:06:50.906
It comes down to this notion of what's the required response?

00:06:50.906 --> 00:07:01.316
And we already get a bad rap in K-12 and higher ed around too much focus on memorizing bodies of curriculum as opposed to critically thinking about them.

00:07:01.316 --> 00:07:01.646
Right?

00:07:01.856 --> 00:07:10.526
And so it's not to say that computers can't be, that you can't design computer programs that can challenge and help augment the development of, um,

00:07:11.186 --> 00:07:19.106
creative behavior or, you know, problem solving and those kinds of more complex, uh, cognitive skills.

00:07:19.496 --> 00:07:23.276
It's just the way you get there isn't the way we're doing it right now.

00:07:23.276 --> 00:07:29.876
You don't apply this mechanistic, step-by-step procedural methodology to train a far transfer complex task.

00:07:30.356 --> 00:07:38.906
And, and quite honestly, sometimes it's like, well, do we even want to? Like, there's so many things where, if you consider the emotional underpinning of learning, which is present, we could have another

00:07:39.581 --> 00:07:44.441
whole conversation about what that means, but that's where the human element really comes in.

00:07:44.711 --> 00:07:50.351
And so there's definitely ways that feedback can be reinforcing just in terms of building skill development.

00:07:50.441 --> 00:07:53.111
And that's motivating, but there's also the human touch.

00:07:53.141 --> 00:07:59.141
And especially in the digital world that we're in, coming outta COVID, we're almost forgetting how to be human, you know?

00:07:59.171 --> 00:08:08.081
'cause we're not spending as much time, even now, in actual, in-person, you know, in-the-room kind of experiences.

00:08:08.231 --> 00:08:08.501
Yeah.

00:08:09.296 --> 00:08:10.286
Yeah, that's true.

00:08:10.346 --> 00:08:18.236
I mean, it's been five years since COVID, and yet it is still fresh on everybody's mind, so it's not like it's that far in the rearview mirror.

00:08:18.236 --> 00:08:24.386
But you, you started talking about these near transfer versus far transfer and organic and mechanistic skills.

00:08:24.386 --> 00:08:28.616
And so I want to talk about that a little bit, um, because that's something that I've been.

00:08:28.766 --> 00:08:39.956
thinking a lot about, that these mechanistic skills, which I'm defining as skills where you can easily say, this is how to do it.

00:08:39.956 --> 00:08:44.456
This is the right way, there's a wrong way, and I can transfer this knowledge to you.

00:08:44.456 --> 00:08:48.656
And by transferring the knowledge and you practicing, you can show that you've got it.

00:08:48.746 --> 00:08:50.936
And, and that is sufficient.

00:08:50.966 --> 00:08:54.866
And, you know, that is the memorization, the, uh,

00:08:55.331 --> 00:08:59.891
the process skills, the things that we know need to happen in schools.

00:09:00.101 --> 00:09:07.151
Um, also a lot of times considered, uh, biologically secondary knowledge, I think, is another term they use for it.

00:09:07.151 --> 00:09:14.861
But those are things where it's like, how do you live in this world and do the things that you need to do, where there's a yes and a no, a right

00:09:14.861 --> 00:09:20.021
and a wrong way, binary skills, whereas organic skills are skills that are

00:09:20.501 --> 00:09:29.381
Things that, uh, there's a right way, there's a better way, there's a really good way, there's an okay way and there's a really bad way.

00:09:29.441 --> 00:09:31.811
And there's this gradient of how to deal with it.

00:09:31.811 --> 00:09:41.471
Things like empathy, grit, perseverance, uh, patience, all kinds of these, what we typically call soft skills, that I'm now calling organic skills.

00:09:41.711 --> 00:09:46.091
And then you, uh, brought up near transfer and far transfer skills.

00:09:46.091 --> 00:09:47.231
And how do those relate?

00:09:47.231 --> 00:09:48.311
Let's talk about that a little bit.

00:09:48.371 --> 00:09:48.941
Yeah.

00:09:48.941 --> 00:09:49.241
Yeah.

00:09:49.241 --> 00:09:52.061
So I, I touched on it a little bit, but let's go a little deeper on that.

00:09:52.061 --> 00:09:55.331
So, a near transfer skill is procedural in nature.

00:09:55.331 --> 00:10:00.131
So to your point, there's a right way to do it and everything else, right?

00:10:00.191 --> 00:10:00.941
And so.

00:10:01.916 --> 00:10:03.836
Think about like making a Starbucks latte.

00:10:03.866 --> 00:10:07.676
There is a step one, step two, step three, step four on how you do it.

00:10:07.796 --> 00:10:13.346
And yes, there's variations on it, but for the most part, this is the way you build it.

00:10:13.346 --> 00:10:18.986
And that's not to say that that's representative of all lattes, that's representative of Starbucks lattes for standardization purposes.

00:10:18.986 --> 00:10:19.286
Right?

00:10:19.526 --> 00:10:24.086
So that's the idea of near transfer, uh, near transfer tasks and near transfer training

00:10:24.086 --> 00:10:29.636
is that the way that you teach it is the way that you perform it on the job

00:10:30.281 --> 00:10:32.321
In real life is how you train it up.

00:10:32.621 --> 00:10:35.141
So it's not, there's no variation to it.

00:10:35.141 --> 00:10:36.911
It's always done the same kind of way.

00:10:36.911 --> 00:10:41.291
So that's what near transfer usually refers to.

00:10:41.381 --> 00:10:47.141
And I should say too, that comes from Ruth Clark, who is very prominent in technical training and corporate training.

00:10:47.141 --> 00:10:52.781
So that comes from her, uh, nomenclature; that's where the terms near transfer and far transfer come from.

00:10:52.961 --> 00:10:57.641
And she also talks about 'em in terms of procedures being near transfer tasks, and then

00:10:57.891 --> 00:11:00.231
principle-based learning being far transfer tasks.

00:11:00.231 --> 00:11:05.421
So a near transfer task is always done the same way every single time, under every condition.

00:11:05.421 --> 00:11:06.891
Step one, step two, step three.

00:11:07.461 --> 00:11:09.951
Far transfer tasks are those tasks that are more

00:11:10.256 --> 00:11:11.336
strategic in nature.

00:11:11.336 --> 00:11:19.436
So you form a strategy about how you're gonna approach the task at hand, meaning that depending on the context, what you're gonna do varies.

00:11:19.526 --> 00:11:23.636
And so, like a classic example that people would often give is making a sales call.

00:11:23.696 --> 00:11:31.046
It's not just step one, step two, step three, because how the customer responds or how the potential prospect responds matters.

00:11:31.051 --> 00:11:33.956
And you're gonna, uh, start pivoting and do things differently.

00:11:34.256 --> 00:11:35.546
So far transfer tasks

00:11:36.056 --> 00:11:41.516
are also referred to oftentimes as complex cognitive tasks, whereas what you would call a procedural task

00:11:41.846 --> 00:11:44.576
is, um, sometimes referred to as a simple cognitive task.

00:11:44.576 --> 00:11:45.866
So these are different theorists.

00:11:45.866 --> 00:11:53.486
So that latter term comes from Tiemann and Markle; that's very top of mind, since the psychomotor work I was just writing about comes from them as well.

00:11:53.636 --> 00:11:54.056
Right.

00:11:54.056 --> 00:11:57.566
So, and the way that they propose it is that everything builds on everything.

00:11:57.566 --> 00:12:01.886
That there's a psychomotor response, a physical response using your musculature and your

00:12:02.156 --> 00:12:04.316
Um, anatomy and physiology to perform.

00:12:04.526 --> 00:12:12.326
That's a precursor for the simple cognitive tasks, which are precursors for, um, complex cognitive, and they are like hardcore programmed instruction behaviorists.

00:12:12.326 --> 00:12:14.426
So that's where that particular theory comes from.

00:12:14.726 --> 00:12:29.786
And, quite honestly, it's worth revisiting programmed instruction, considering the computer age we're in now, because so much of that was dependent on our understanding of how computers and information processing and memory and those pieces work.

00:12:30.761 --> 00:12:42.641
And now, if you look at it in the context of, like I said, cloud computing, Internet of Things, generative AI, you know, then it starts to take on, oh, so what can we actually program?

00:12:42.971 --> 00:12:46.121
And again, you can program things that have a definite right and a definite wrong.

00:12:46.331 --> 00:12:50.741
This really comes to bear, like, when we start thinking about subjects like history.

00:12:51.296 --> 00:12:52.886
Even geography, right?

00:12:52.886 --> 00:12:59.306
Some people have questioned, like, are maps really representative of what our actual world looks like?

00:12:59.606 --> 00:13:13.136
Or is that even a skewed response based on, you know, the ruling power's or the victor's presentation of what geography is or what history is, so,

00:13:13.196 --> 00:13:16.976
Yeah, so, so you talked about these, uh, standardizable

00:13:17.201 --> 00:13:17.621
mm-hmm.

00:13:17.666 --> 00:13:24.866
and, and much of what we do in schools is really standardized information, and that's what we're testing on.

00:13:24.866 --> 00:13:25.856
That's what we're assessing.

00:13:25.856 --> 00:13:28.916
That's why we use multiple choice tests and things like that.

00:13:29.246 --> 00:13:33.356
And so there are things where it makes sense to use

00:13:33.686 --> 00:13:38.066
these mechanistic approaches to learning things.

00:13:38.126 --> 00:13:48.206
And so, uh, for example, um, we know how to teach, uh, reading with phonemic segmentation, oral reading fluency, those kinds of things.

00:13:48.506 --> 00:13:55.826
And, and so that makes sense for that to be something that we use a computer to teach.

00:13:56.186 --> 00:14:01.796
Whereas other things, um, like how to understand what an author is saying in the book.

00:14:02.141 --> 00:14:08.501
is probably not going to be taught the same way that we teach someone how to read.

00:14:08.711 --> 00:14:13.241
How do you determine what kinds of things are worthwhile to use,

00:14:14.501 --> 00:14:21.821
specifically machine learning technology, for learning, and what not to use it for?

00:14:22.421 --> 00:14:26.021
I think, for me, it is definitely

00:14:26.996 --> 00:14:28.706
The context in which you're gonna use it.

00:14:28.706 --> 00:14:36.176
So I actually love using computers for peer practice partners on those tasks that you were describing.

00:14:36.206 --> 00:14:42.236
Oral reading fluency, math fact memorization, those things where there's definitely right, definitely wrong answers, because,

00:14:44.066 --> 00:14:45.746
One, the computer is always accurate.

00:14:45.806 --> 00:14:48.296
It always gives you correct feedback.

00:14:48.326 --> 00:14:52.526
So it's not like with a human; a human has human foibles, right?

00:14:52.526 --> 00:14:53.636
They will look away.

00:14:53.636 --> 00:15:02.726
They will not listen or they will nod and say, Ah-huh, even though, so they're not as reliable as a computer is in terms of giving feedback when there's a correct response.

00:15:03.326 --> 00:15:09.206
But when there could be variation, you run into challenges, right?

00:15:09.206 --> 00:15:09.746
And so.

00:15:09.991 --> 00:15:10.411
Mm-hmm.

00:15:10.586 --> 00:15:14.756
What I've seen happen with GPT models recently is it falls into a couple categories.

00:15:14.756 --> 00:15:27.056
So early GPT, when you would ask questions of it, it kind of made you feel bad about asking questions, because it came back with this very, like, you know, just a tone to it.

00:15:27.056 --> 00:15:31.228
Even though, you know, it's just, uh, you know, basically

00:15:31.586 --> 00:15:33.326
A word processor on steroids, right?

00:15:33.326 --> 00:15:35.546
That's putting together the probability of this word coming next.

00:15:35.546 --> 00:15:37.466
You know, that's how they're constructing these answers.

00:15:37.466 --> 00:15:43.406
And I mean, that's just a very, very simplified version of talking about how GPTs work, but that's essentially it.

00:15:43.406 --> 00:15:48.056
Like, the fact that it's able to produce the kinds of responses that it does and make you feel some kind of way about it.

00:15:48.956 --> 00:15:52.436
That's your own anthropomorphism that you're introducing into what you're seeing.

00:15:52.436 --> 00:15:53.366
But it's, it's true.

00:15:53.366 --> 00:16:02.576
Like people would comment on the, you know, for lack of a better term, the mansplaining, um, tone of the original versions of ChatGPT 3.5 and on.

00:16:03.056 --> 00:16:08.696
But now we're seeing in this emphasis, we were talking about empathy, you know, to make the computer

00:16:09.086 --> 00:16:22.706
interaction appear more empathetic, is this whole thing where it's almost pandering to you and telling you how wonderful you are and how special you are and how that was really insightful, even if it's, like, some garbage response.

00:16:22.706 --> 00:16:22.946
You know?

00:16:22.946 --> 00:16:27.116
You could have put in any other series of words, even ones that would make no sense.

00:16:27.116 --> 00:16:28.136
Here you are praising me on how

00:16:28.196 --> 00:16:28.616
well I did.

00:16:29.651 --> 00:16:32.531
And the challenge to that is that that is actually not

00:16:33.551 --> 00:16:36.041
helpful, to say that it's been good when it wasn't.

00:16:36.041 --> 00:16:42.461
It's like, oh, that was a great response, good catch, when, like, no. Don't patronize me.

00:16:42.461 --> 00:16:45.881
Don't, don't tell me that this was good when it actually wasn't.

00:16:46.211 --> 00:16:53.891
Because if I'm not familiar with what that is doing, then I might believe that and that's another big problem.

00:16:54.131 --> 00:16:54.611
So.

00:16:54.671 --> 00:17:01.031
So before we move into that arena, what you're saying really is that things like the

00:17:01.481 --> 00:17:05.501
the terrible phrase we used to use is drill and kill, for specific things.

00:17:05.741 --> 00:17:07.541
That's what the technology is really good

00:17:07.606 --> 00:17:08.651
It's excellent for that.

00:17:08.711 --> 00:17:09.071
Yep.

00:17:09.191 --> 00:17:20.171
that's what we should be using it for, because it can answer correctly on those things millions of times faster than we can do it.

00:17:20.171 --> 00:17:20.381
So

00:17:20.426 --> 00:17:21.806
And consistently.

00:17:21.896 --> 00:17:22.586
Oh, and that's,

00:17:22.631 --> 00:17:23.291
consistently

00:17:23.636 --> 00:17:29.126
that's super important too, because that's an important thing that we can come back to talking about GPTs, why they're also problematic.

00:17:29.126 --> 00:17:32.126
It's not just this pandering way of giving feedback.

00:17:32.306 --> 00:17:32.936
It's that.

00:17:33.201 --> 00:17:38.031
You can run a trial of exactly the same interaction and you're gonna get different feedback.

00:17:38.031 --> 00:17:51.561
So GPTs aren't consistent because you are not giving a consistent response, and so they're not gonna give you consistent feedback, because what they're giving you feedback on depends on your response, versus in the drill and kill.

00:17:52.166 --> 00:18:08.066
You're always gonna have correct feedback. Now, that's not to say that you can't train computers to give better feedback, like for more complex cognitive skills, and to better help the human come to their own conclusions.

00:18:08.336 --> 00:18:11.696
But it's not done with GPT technology, like it is absolutely not.

00:18:12.491 --> 00:18:14.051
But you can code it so it can be done.

00:18:14.051 --> 00:18:25.751
I've done small language models, so as compared to large language models, I've done small language models that actually work very well to shape behavior into a behavioral repertoire or behavioral class.

00:18:26.021 --> 00:18:31.181
So when we're talking about training up things like concepts or you know, more strategic learning.

00:18:32.036 --> 00:18:35.276
We have a frame of reference, we have to have some frame of reference, right?

00:18:35.276 --> 00:18:36.836
It's the same thing when you're teaching them in school.

00:18:36.836 --> 00:18:48.476
You have your rubric of what you're looking for, but the way that you apply that rubric is gonna vary based on, you know, the circumstances that come to bear and what's being presented to you in the case at hand.

00:18:49.736 --> 00:18:53.336
Yeah, well, and that hints at my big problem

00:18:53.636 --> 00:19:01.706
with any kind of grading in school, which is that it is all subjective, and we try to make it objective, but it is not.

00:19:01.736 --> 00:19:05.816
And what I constantly tell my own children all the time is grades are made up.

00:19:07.319 --> 00:19:12.329
Once you understand that grades are made up, then, one, they can stop having power over you.

00:19:12.539 --> 00:19:15.209
And two, you can understand that you are playing a game.

00:19:15.409 --> 00:19:23.299
And you need to figure out the rules that the teacher is, is going by as she or he grades your work because it is all made up.

00:19:23.299 --> 00:19:33.259
And anybody who argues that it's not and that it is actually objective, has no basis in reality, they're just absolutely wrong because every teacher makes up their own grading system.

00:19:33.289 --> 00:19:44.269
Even if they have a rubric, even if they have standards and guides from the, the district or the state or whatever it is, they still make that ultimate decision about whether or not it is.

00:19:44.684 --> 00:19:48.884
Uh, a certain thing, and that makes it subjective.

00:19:48.944 --> 00:19:58.394
And so understanding that and being okay with it is one thing and like saying, all right, I get what's going on here and I, and I can react in a different way.

00:19:58.664 --> 00:20:07.934
But the problem is that we pretend like they are actually objective and that that is part of a, a bigger problem that we're not gonna go all into here.

00:20:08.894 --> 00:20:11.414
You brought up this example of GPTs

00:20:11.879 --> 00:20:14.279
reacting differently to the same prompt.

00:20:14.279 --> 00:20:15.509
And I actually did this.

00:20:15.509 --> 00:20:24.089
I have, I'm in a doctoral program right now and I had an assignment where we had to basically write a paper on what we've learned this, uh, this semester in class.

00:20:24.389 --> 00:20:29.639
And to me, this assignment was just really tedious, and I did not

00:20:29.954 --> 00:20:42.254
want to do it, because I didn't see the value in it. And so I decided to make it meaningful to me, which is often how I approach, uh, my assignments in school: I figure out a way to make it meaningful.

00:20:42.254 --> 00:20:55.694
So what I did is I put it into two different GPTs to see how each would write it, uh, based on my notes, what my dissertation topic is, um, and what other things I could feed it, all my assignments and stuff.

00:20:55.694 --> 00:20:57.224
So I put all my assignments in there.

00:20:57.494 --> 00:21:04.934
I put my notes that were very like basic chicken scratch notes that I had just typed into my note taking system.

00:21:05.924 --> 00:21:09.824
And then a few different documents also to like give context to it.

00:21:10.124 --> 00:21:19.124
And then, with one that I had already trained to have my voice and sound like me as best it could, uh, and to know what next word it should predict for

00:21:19.219 --> 00:21:19.639
Mm-hmm.

00:21:19.994 --> 00:21:23.774
So I used that to write this

00:21:24.419 --> 00:21:29.939
paper. And then I used another one that I hadn't trained on all my information, and I just said, here's the stuff.

00:21:29.939 --> 00:21:32.849
Now make it how it should, like how it makes sense.

00:21:33.149 --> 00:21:34.229
The second one.

00:21:34.619 --> 00:21:36.929
Um, made it very academic.

00:21:37.259 --> 00:21:41.549
Included the citations for my other assignments and said, this is where I talked about this.

00:21:41.789 --> 00:21:44.429
And it sounded very much like an academic thing.

00:21:44.639 --> 00:21:46.589
The one that I had trained to sound like me.

00:21:46.799 --> 00:21:57.569
I had to do way more editing because, one, it didn't sound like me, and two, it, uh, got a bunch of things wrong, had more hallucinations. But that other one was very academic.

00:21:57.569 --> 00:22:00.839
I had to do very little editing because

00:22:01.169 --> 00:22:04.289
It already didn't sound like me because it wasn't trying to sound like

00:22:04.334 --> 00:22:04.694
Yeah.

00:22:04.709 --> 00:22:10.139
The one that was trying to sound like me made mistakes and hallucinated and put things in that weren't there.

00:22:10.379 --> 00:22:13.139
And it was so fascinating because I did the same exact prompt.

00:22:13.139 --> 00:22:23.549
I just copied it from one to the other and let it do its thing, and then, as I was giving it feedback, going paragraph by paragraph, which I also did,

00:22:24.509 --> 00:22:31.649
it would give different responses based on what I said, and then it would add stuff in that wasn't there, and things like that.

00:22:31.649 --> 00:22:36.419
And so you can't think that this is a perfect technology.

00:22:36.659 --> 00:22:43.709
However, it was pretty amazing that it could do what it could do and that it could generate this stuff from what I had said.

00:22:44.039 --> 00:22:46.619
And, and so that part was pretty cool.

00:22:46.619 --> 00:22:48.719
But if you don't understand what it's really doing.

00:22:49.079 --> 00:22:58.649
It's so easy to think that it's either magic or that it's always right, and those are the two big problems in my mind with it, that it's not always right.

00:22:58.649 --> 00:23:02.129
It's basically just guessing and doing a good job of it most of the time.

00:23:02.129 --> 00:23:05.159
But it's really just guessing and it doesn't actually know the right answer.

00:23:05.664 --> 00:23:06.084
Yeah.

00:23:06.144 --> 00:23:06.624
But it does.

00:23:06.654 --> 00:23:07.254
But yeah.

00:23:07.254 --> 00:23:08.154
But again.

00:23:08.969 --> 00:23:10.739
Don't look behind the curtain, you know?

00:23:10.739 --> 00:23:11.999
And it all seems like magic.

00:23:12.359 --> 00:23:19.019
Yeah, it's a, I mean, don't get me wrong, I've grown up with the evolution of this, so I think it's super cool that we can do this.

00:23:19.019 --> 00:23:21.389
And you were mentioning about training up your own voices.

00:23:21.389 --> 00:23:35.579
That was literally some of the first work that I did in this area. I left Oracle in 2019, so I'd taken on a few gigs working for some social media influencers, and they were just telling me their big problems: ah, I can't respond to everybody, and I'm losing followers because I'm not responding.

00:23:35.579 --> 00:23:36.119
I'm all like.

00:23:36.494 --> 00:23:37.334
Cool story, bro.

00:23:37.334 --> 00:23:39.134
I could totally make a bot that could respond

00:23:39.134 --> 00:23:40.724
that sounds just like how you respond.

00:23:40.724 --> 00:23:47.564
And I did. But again, the kind of responses were very much tailored to, you know, what they would've said.

00:23:47.564 --> 00:23:53.384
So we had tons and tons of data to build with, and it was just a, it was a small, uh, language model.

00:23:53.564 --> 00:23:55.604
I didn't need to scrape the entire internet.

00:23:55.784 --> 00:23:58.364
The reason that your paper sounded so good and academic was

00:23:58.394 --> 00:24:08.744
all of those academic models it had to draw from. You know, there's a time when you wanna narrow the field in terms of what data you wanna draw from, and then there's a time when you wanna widen the field.

00:24:08.804 --> 00:24:10.154
And so that's the difference.

00:24:10.154 --> 00:24:22.064
Like, like with search, you wanna widen the field, you know, you wanna be able to find the thing that you're looking for. It's sort of like there's a concept when you play sports about having wide focus and then narrowing your focus.

00:24:22.064 --> 00:24:26.774
So when you're out on the field, you're sort of taking everything in, and then there's something that you need to beeline on.

00:24:27.314 --> 00:24:28.574
That's how search should work.

00:24:29.234 --> 00:24:31.934
It doesn't, because it's, it's not built for that.

00:24:31.934 --> 00:24:33.674
It's not built to really search and be helpful.

00:24:33.674 --> 00:24:45.434
I mean, I think that was the intention originally, but we're so far from that where we've gone with what we can do with the compute capability that we have now and, and, and the connectivity that we have now.

00:24:45.434 --> 00:24:50.354
I mean, it just, it's frustrating and having spent like, you know, 15 years working in Silicon Valley and just.

00:24:50.924 --> 00:24:58.004
I know why we are where we are, but I just wanna shake all the tech bros I've worked with. I'm like, why though?

00:24:58.124 --> 00:24:59.624
You're not the only one who has ideas.

00:24:59.624 --> 00:25:00.074
And guess what?

00:25:00.074 --> 00:25:04.094
Lots of people think way more creatively than you because you think you're the default.

00:25:04.094 --> 00:25:07.724
And that, you know, like I run into it constantly and I'm like.

00:25:07.944 --> 00:25:08.724
Oh my God.

00:25:08.724 --> 00:25:11.364
Do you know how, how much hubris you have?

00:25:11.364 --> 00:25:14.874
Do you, do you recognize like how much you just center yourself?

00:25:14.874 --> 00:25:16.554
Like just by default.

00:25:16.554 --> 00:25:17.304
I'm like, oh my.

00:25:17.394 --> 00:25:18.444
Like really, dude?

00:25:18.444 --> 00:25:23.814
And now you've built this megaphone that just amplifies all this out and said, this represents the world.

00:25:24.024 --> 00:25:25.824
And I'm like, no.

00:25:25.824 --> 00:25:26.274
Oh, no.

00:25:26.634 --> 00:25:28.944
Talk about making people feel not seen, I mean,

00:25:29.229 --> 00:25:30.629
Yeah, exactly.

00:25:30.629 --> 00:25:36.149
So, so with what we have access to now, focusing on schools,

00:25:36.174 --> 00:25:36.474
mm-hmm.

00:25:36.509 --> 00:25:42.719
where should we be putting our time, energy, and our money in investing in using these tools?

00:25:42.719 --> 00:25:44.789
And what should that look like in your opinion?

00:25:45.509 --> 00:25:49.199
So, I loved your take on grades.

00:25:49.259 --> 00:25:56.069
The challenge with grades, of course, is that, yes, we can agree that they're subjective and that they're inconsistent.

00:25:56.189 --> 00:25:59.339
However, they have a massive impact on people's lives.

00:26:00.134 --> 00:26:10.544
And so, for that reason, that's why you can't just go, oh yeah, I don't care about whether I get the A. But I have noticed, being that I wasn't formally educated in the United States until I got to graduate school.

00:26:10.934 --> 00:26:12.494
Big cultural differences with that too.

00:26:12.794 --> 00:26:23.564
So I think when you're thinking about how to implement technology in a classroom, it's gonna be completely based on what is the culture that's already there and how does the technology support the culture.

00:26:24.554 --> 00:26:26.624
I promise I did not pay her to say that.

00:26:28.874 --> 00:26:31.424
I'm just saying that, yeah.

00:26:31.664 --> 00:26:35.534
For, for anybody to say, like, this is the answer to

00:26:36.104 --> 00:26:40.214
schools' problems with technology, and say they have that solution, they just don't.

00:26:40.244 --> 00:26:42.764
It has to be focused on what your culture is.

00:26:43.214 --> 00:27:04.364
And I, I don't have my book right here, but that is literally what I talk about in both the books that I've written, How to Be a Transformative Principal and SchoolX: you have to design your school for the people that are there, not for who you hope would be there, not for who was there 20 years ago, but who is actually there right now. Which means that it has to be a continuous, ongoing

00:27:04.664 --> 00:27:08.654
design process for you to make it work for those people.

00:27:08.744 --> 00:27:17.024
And it's going to change, because that is the nature of schools, and it has to. So, I, I like that you said that, but go deeper.

00:27:17.024 --> 00:27:20.894
What do you, what are the things that you have to be looking for and paying attention to?

00:27:20.894 --> 00:27:24.344
Maybe give a few examples of, of what that looks like in a school.

00:27:24.704 --> 00:27:25.454
So.

00:27:26.324 --> 00:27:31.214
There's already a predisposition in society to use technology as a babysitter.

00:27:31.244 --> 00:27:39.074
So recognizing what your learners are coming in with and what their experience is with technology versus what you can actually support.

00:27:39.104 --> 00:27:46.424
So for example, I moved to the Seattle area in 1994 to do my one-year doctoral internship at a school for children with learning differences.

00:27:46.634 --> 00:27:52.724
And a big part of the way that they did their generative instruction approach included drill and practice.

00:27:52.724 --> 00:27:53.054
So.

00:27:53.164 --> 00:28:07.894
For me, seeing all these reams of paper being wasted hurt my heart, you know, just from a conservationist perspective. But also, I thought it was cool that they did peer learning, they were coaching each other, and there was a substantial amount of their days where they functioned as peer coaches.

00:28:07.894 --> 00:28:08.434
Love that.

00:28:08.434 --> 00:28:09.364
For a lot of reasons.

00:28:10.204 --> 00:28:11.584
I didn't love that.

00:28:11.584 --> 00:28:14.404
They could have had that relationship done more effectively with

00:28:15.374 --> 00:28:20.114
a computer and using technology. Even in 1994, that was entirely possible.

00:28:20.144 --> 00:28:25.394
And then use the time, uh, where kids can be together more prosocially.

00:28:26.174 --> 00:28:29.924
'cause that was the other reason why some of the children were in there with, with learning differences.

00:28:30.104 --> 00:28:31.724
It wasn't just that they, it was.

00:28:32.924 --> 00:28:35.684
It's sort of a chicken and egg for me because I, I worked with the older kids.

00:28:35.684 --> 00:28:36.824
It's like, what happened first?

00:28:36.824 --> 00:28:38.684
Did they have the, the learning differences first?

00:28:38.684 --> 00:28:41.774
And that led to social issues or vice versa, right?

00:28:41.924 --> 00:28:45.464
And so, um, so that's, that was a lot of what I saw.

00:28:45.464 --> 00:28:51.914
And while I worked there for five years after I finished my, um, my internship, that was my contention all the time.

00:28:51.914 --> 00:28:56.234
And plus I wanted to do more complex things, uh, with the actual component-composite

00:28:56.254 --> 00:29:07.774
building that you can do with generative instruction, and how that really can make these huge, uh, huge leaps so that you can do some really fun work in person, together with humans.

00:29:07.834 --> 00:29:08.104
Right?

00:29:08.104 --> 00:29:18.064
So problem-solving groups, like, so much of what I loved about Morningside, I really loved. But the stuff that sat in my craw, I just designed a better version of, and that was the first company that I sold, to a German company.

00:29:18.244 --> 00:29:18.664
So,

00:29:18.779 --> 00:29:19.069
Yeah.

00:29:19.174 --> 00:29:19.354
yeah.

00:29:19.354 --> 00:29:21.364
So I think it's really, but learning your culture, like.

00:29:22.259 --> 00:29:26.579
I love that you said that, that, um, it, your culture is not a static thing.

00:29:26.579 --> 00:29:27.659
It's not this one and done.

00:29:27.659 --> 00:29:28.589
You design it, you're done.

00:29:28.769 --> 00:29:29.639
No, of course not.

00:29:29.909 --> 00:29:39.059
You're gonna see different things happening based on the demographics that come into your school, but then also world events and just you as an individual, human evolving as well.

00:29:39.059 --> 00:29:46.949
Like when teachers don't consider their own impact, like what they're bringing in and reflecting back, like that student in front of you is actually a reflection of you.

00:29:47.154 --> 00:29:52.884
And how much are you actually putting on that student instead of realizing, no, that's you putting it on them.

00:29:53.214 --> 00:29:53.514
You know?

00:29:53.844 --> 00:29:56.364
So being able to, to pivot as well.

00:29:56.634 --> 00:30:00.324
But I think really it's just, it's being really, and teachers know this stuff.

00:30:00.534 --> 00:30:08.934
I mean, a lot of times, I guess sometimes folks get jaded, but I have one amazing client who is, like, a 28-year veteran in the classroom.

00:30:09.204 --> 00:30:10.944
Uh, grades two through four.

00:30:11.334 --> 00:30:14.004
And she has written the most amazing

00:30:14.254 --> 00:30:18.094
technology, the most amazing curriculum that I just adore.

00:30:18.094 --> 00:30:24.874
That illustrates this use beautifully, how you can, and then also capitalizes on lots of other technology that we have available now.

00:30:24.964 --> 00:30:27.094
Like that's the other thing too, is like it's not just,

00:30:27.734 --> 00:30:29.834
what are some past really good uses?

00:30:29.924 --> 00:30:31.934
It's like imagining what else can come.

00:30:31.934 --> 00:30:40.304
Like I love also using technology to, uh, as a springboard, like getting kids interested in STEM from things they're already interested in.

00:30:40.304 --> 00:30:43.994
Like there's this, um, uh, you know, STEM to dance group, right?

00:30:43.994 --> 00:30:47.174
Which is focusing on dance, which a lot of children love.

00:30:47.714 --> 00:30:48.614
Organically.

00:30:48.794 --> 00:30:51.524
And then what are the technology pieces that you can attach to that?

00:30:51.524 --> 00:30:55.094
So that's really more of, like, this Dewey-esque progressive education perspective, right?

00:30:55.124 --> 00:30:56.744
Rather than the standardized curriculum.

00:30:57.014 --> 00:31:01.124
It's like, what are the things that really grab the children's interests?

00:31:01.124 --> 00:31:02.384
What do they love to do?

00:31:02.444 --> 00:31:07.424
And then how can we teach them those curriculum standards based on their interests, their unique passions?

00:31:07.964 --> 00:31:09.644
That's individualized instruction.

00:31:09.854 --> 00:31:09.884
Uh.

00:31:10.429 --> 00:31:14.479
Prescriptive diagnostic that you can absolutely do with computers too.

00:31:14.479 --> 00:31:18.889
So I mean, the potential to personalize instruction is amazing, but

00:31:19.124 --> 00:31:19.514
Yeah.

00:31:19.729 --> 00:31:25.219
a lot more. You have to really understand all the different layers of learning, including that emotional layer, to do that well.

00:31:25.519 --> 00:31:25.849
So,

00:31:25.979 --> 00:31:26.369
Yeah.

00:31:26.579 --> 00:31:26.879
Yep.

00:31:26.939 --> 00:31:28.529
And I didn't pay her to say that either.

00:31:28.559 --> 00:31:34.169
I mean, you're the reason why when we were talking last week and you started hinting at some of these things that you're now

00:31:34.474 --> 00:31:34.894
mm-hmm.

00:31:35.339 --> 00:31:37.799
I was like, yes, we are kindred spirits.

00:31:37.829 --> 00:31:38.999
We totally get this.

00:31:38.999 --> 00:31:44.399
And, and, and those are things that people who have listened to the show have heard me say a hundred times.

00:31:44.399 --> 00:31:45.929
And, and that's where.

00:31:46.409 --> 00:31:54.239
We should take advantage of the technology to give ourselves and students more opportunities to be human

00:31:54.614 --> 00:31:59.714
with each other, and be the thing that the technology can't be.

00:31:59.834 --> 00:32:00.254
And

00:32:00.384 --> 00:32:00.804
Mm-hmm.

00:32:00.824 --> 00:32:02.444
the things that the technology can do better than us,

00:32:02.444 --> 00:32:04.394
We should have it do as soon as possible.

00:32:04.664 --> 00:32:17.624
But there are so many things that the technology will never be able to do, because it is not human and there's not value there, but that we can do and that we should pursue and continue doing.

00:32:18.039 --> 00:32:21.999
It's what makes us unique and what makes us human, and not machine.

00:32:21.999 --> 00:32:26.229
So, uh, this, this was awesome, and I would love to interview

00:32:26.229 --> 00:32:26.259
uh,

00:32:27.164 --> 00:32:32.054
that teacher you were talking about, that created that, uh, curriculum for kids.

00:32:32.054 --> 00:32:33.704
That would be, that would be interesting.

00:32:33.704 --> 00:32:35.174
I'll talk to you more about that afterward.

00:32:35.414 --> 00:32:35.714
Yeah.

00:32:36.374 --> 00:32:37.904
um, Linda, this was awesome.

00:32:37.904 --> 00:32:51.224
I wanna make sure people know they can check your stuff out at, uh, lindablearning.com, for Linda Burrick. And, uh, any parting words or anything you'd like to say before we sign off?

00:32:51.959 --> 00:32:59.639
I just wanna encourage teachers, especially those who are maybe a little more reticent when it comes to technology.

00:32:59.819 --> 00:33:05.729
Don't be afraid of it, and let the kids lead too, because that can take you in some really, really interesting directions.

00:33:05.729 --> 00:33:13.704
So rather than thinking about fighting against the technology and that it's something that we have to overcome, or like the whole cheating aspect, nah, lean into it.

00:33:13.704 --> 00:33:15.749
Get a little more creative about what we can do instead.

00:33:16.499 --> 00:33:16.979
Yes.

00:33:17.039 --> 00:33:18.389
Oh, preach.

00:33:18.869 --> 00:33:22.079
Alright, Linda, this has been awesome having you on Transformative Principal.

00:33:22.079 --> 00:33:23.009
Thank you so much.

00:33:23.039 --> 00:33:27.959
Again, lindablearning.com for her newsletter, and it's been great chatting with you.

00:33:27.959 --> 00:33:28.739
Thanks so much for your

00:33:28.889 --> 00:33:29.279
You bet.

00:33:29.279 --> 00:33:29.699
Thanks.