WEBVTT

NOTE
This file was generated by Descript 

00:00:11.164 --> 00:00:17.224
Welcome to Transformative Principal, where I help you stop putting out fires and start leading.

00:00:17.734 --> 00:00:19.234
I'm your host, Jethro Jones.

00:00:19.234 --> 00:00:21.934
You can follow me on Twitter at Jethro Jones.

00:00:21.934 --> 00:00:21.994
Okay.

00:00:34.944 --> 00:00:37.284
Welcome to Transformative Principal.

00:00:37.794 --> 00:00:50.694
Last week I did my handoff episode with Mike, and then the next day I defended my dissertation, and then I recorded it, of course, because I'm a nerd and I thought it'd be great to put out here on my podcast.

00:00:50.694 --> 00:00:52.404
So I hope you enjoy this.

00:00:52.434 --> 00:00:55.584
Thank you so much for listening to Transformative Principal for all these years.

00:00:56.154 --> 00:01:01.014
What you're gonna hear first is Tom Hur, the dissertation committee chair.

00:01:01.164 --> 00:01:12.234
He's going to share a little bit about the process and what it's gonna look like, and then I'm gonna do the presentation, and then there's gonna be some question answers, some comments, and, uh, conversation at the end.

00:01:12.294 --> 00:01:16.554
So, uh, just wanna say thank you, uh, for listening.

00:01:16.584 --> 00:01:22.794
Thank you for being part of Transformative Principal and what a huge accomplishment to finally finish this dissertation.

00:01:23.364 --> 00:01:29.214
And I so appreciate everybody who's listened and been a guest on the show who's helped me get to this point.

00:01:29.484 --> 00:01:31.554
So I'll turn it over here to Tom.

00:01:31.554 --> 00:01:32.169
Thank you so much.

00:01:32.493 --> 00:01:38.943
So let me go ahead and officially welcome everybody and kind of give you the, the plan for what's gonna happen today.

00:01:39.023 --> 00:01:43.073
As you know, you're here for Jethro's dissertation defense on his EdD.

00:01:43.763 --> 00:01:46.493
And we're excited to have all of you here.

00:01:46.883 --> 00:01:59.213
What will happen is in a minute I will ask the committee, I'll ask Jethro to give you a paragraph about himself and then the committee members, and then he will go ahead and proceed with a PowerPoint he's prepared.

00:01:59.693 --> 00:02:00.953
That does two things.

00:02:00.953 --> 00:02:10.313
One is that it depicts his study, what he is researching, what he's learned, and it also shows scholastically from where he drew his data.

00:02:10.823 --> 00:02:12.413
So it's a good academic piece.

00:02:12.463 --> 00:02:13.753
After he finishes that,

00:02:15.283 --> 00:02:26.083
Then the committee will ask questions or make comments, and you should know right now we may not do a whole lot of that because we've done time with Jethro before.

00:02:26.083 --> 00:02:27.403
Sounds like a prison sentence.

00:02:27.643 --> 00:02:32.333
We've done time with Jethro before, we met with him last week for a practice run.

00:02:32.633 --> 00:02:33.803
So we're all comfortable.

00:02:33.803 --> 00:02:36.473
So it may be that we don't have anything new.

00:02:36.783 --> 00:02:44.253
In any case, after we are finished, then I'm simply gonna say, does anybody in the audience have any questions, have any comments?

00:02:44.253 --> 00:02:45.363
And we'll throw it open.

00:02:45.723 --> 00:02:53.713
Once that all happens, then Jethro will thank everybody who's a guest for attending, and you all will leave.

00:02:54.133 --> 00:03:00.613
And then the three of us, Dr. Burrick, Dr. Bier, and I, will meet in a little room and we'll talk about what happens next.

00:03:01.013 --> 00:03:02.423
There are a few options.

00:03:02.423 --> 00:03:04.553
One would be, who is this guy?

00:03:04.553 --> 00:03:05.423
This is terrible.

00:03:05.423 --> 00:03:06.143
What is this?

00:03:06.393 --> 00:03:06.963
doesn't work.

00:03:06.963 --> 00:03:07.953
That's not gonna happen.

00:03:08.418 --> 00:03:10.758
The other one could be, this is perfect.

00:03:10.788 --> 00:03:12.048
We'll see him in May.

00:03:12.048 --> 00:03:12.678
It's finished.

00:03:12.708 --> 00:03:13.968
That's not gonna happen either.

00:03:14.328 --> 00:03:17.898
What's likely to happen is, Jethro, this is really good work.

00:03:17.928 --> 00:03:22.998
We know that already, and here's some things we want you to play with and refine and move forward.

00:03:23.278 --> 00:03:24.958
But we're really excited about it.

00:03:25.408 --> 00:03:29.248
And then we will bring him back in, tell him that, and then we'll be finished.

00:03:29.248 --> 00:03:37.868
So, before I go any further and ask the committee to identify themselves, Belinda, Mindy, did I say that correctly?

00:03:37.868 --> 00:03:39.098
Anything you'd like to add?

00:03:40.313 --> 00:03:41.303
No, I think that was

00:03:43.373 --> 00:03:44.123
good intro.

00:03:44.198 --> 00:03:44.438
good?

00:03:46.418 --> 00:03:46.618
Yep.

00:03:47.483 --> 00:03:47.773
Okay.

00:03:48.668 --> 00:03:52.508
So let's go ahead. These folks don't know us, most of them.

00:03:52.748 --> 00:03:57.653
Linda, would you give us a paragraph or two about your background and why you're here?

00:03:58.763 --> 00:03:59.033
Sure.

00:03:59.343 --> 00:04:01.293
I'm Dr. Linda Burrick.

00:04:01.503 --> 00:04:06.843
I am a behavioral psychologist specializing in artificial intelligence and machine learning.

00:04:07.263 --> 00:04:07.953
I met Jethro, oh,

00:04:09.363 --> 00:04:10.773
a year and a half ago, a while ago.

00:04:10.773 --> 00:04:12.573
Anyway, we had a lot of common interests.

00:04:12.843 --> 00:04:16.833
He had mentioned to me that he was doing his dissertation and asked me to be on his committee as an outside member.

00:04:16.833 --> 00:04:18.123
So I'm here doing that.

00:04:18.553 --> 00:04:26.943
I got my PhD, I finished it in 94, defended it in 2000, on generative instruction, so a specific type of machine learning.

00:04:27.303 --> 00:04:31.053
And I've been working in the industry ever since, well, even before that, right?

00:04:31.053 --> 00:04:40.933
So I've got over 40 years' experience working in high tech, and I'm really excited to see Jethro defend, because he is doing really good work and he's adding a significant contribution to the field.

00:04:45.028 --> 00:04:45.358
Hi.

00:04:45.358 --> 00:04:46.718
My name's Mindy Bier.

00:04:48.248 --> 00:04:48.728
I'm sorry.

00:04:48.728 --> 00:04:49.988
Am I getting feedback?

00:04:51.098 --> 00:04:51.608
Okay.

00:04:52.418 --> 00:04:55.388
Is it, I don't know if it was on my end or someone else's.

00:04:55.868 --> 00:05:07.748
I'm the Theresa M. Fisher Endowed Chair for Citizenship Education at the University of Missouri-St. Louis, and I'm also the co-director of the Center for Character and Citizenship.

00:05:08.798 --> 00:05:16.378
I've been, on this journey with Jethro for three years and also very excited about it.

00:05:16.408 --> 00:05:27.038
I did my, I got my PhD in technology and education and computer science, way back when, in '97.

00:05:27.308 --> 00:05:35.058
And boy, it's come a long way, although some of the concepts that I've seen Jethro use are things I also used.

00:05:35.058 --> 00:05:48.138
But, this is just something that's been really exciting as we have gone through three years, from not hearing the words AI at all at the beginning of this journey to having AI everywhere in education.

00:05:48.348 --> 00:05:51.278
So I'm really excited about, about this.

00:05:51.278 --> 00:05:55.728
And Jethro is also the first of our cohort of 13 students

00:05:56.148 --> 00:05:57.438
to go through the defense.

00:05:58.098 --> 00:06:00.588
So, with that, I'll turn it over to Tom.

00:06:04.253 --> 00:06:07.008
Sorry, Tom, you were causing the feedback so I muted you briefly.

00:06:08.658 --> 00:06:09.588
I'm so glad it wasn't.

00:06:11.373 --> 00:06:11.663
Okay.

00:06:11.928 --> 00:06:12.678
All right, good.

00:06:12.928 --> 00:06:18.808
I'm Tom Hur and despite my youthful appearance, I received my PhD before many of you were born.

00:06:19.228 --> 00:06:21.478
I've known Jethro for quite a few years, since before

00:06:21.478 --> 00:06:27.088
he was a student at UMSL, and I've always been impressed by him, and I'm excited to be on this journey.

00:06:27.448 --> 00:06:30.628
I learned a great deal from him, and I think you probably will too.

00:06:31.078 --> 00:06:35.278
So, Jethro, give us a paragraph about you and then jump into your presentation please.

00:06:36.013 --> 00:06:36.733
Sure thing.

00:06:36.833 --> 00:06:40.523
First of all, I wanna thank everybody for taking time outta their day to be here.

00:06:40.573 --> 00:06:59.293
That means a lot to me, and I could not have gotten here without other people helping me out, because all this work takes a lot, and I have very truly stood on the shoulders of giants as I've been able to learn so much through my podcast and working with so many different people.

00:06:59.293 --> 00:07:01.343
And there's certainly not

00:07:01.793 --> 00:07:07.133
time enough to share all the things that I have learned and all the people, and thank them all.

00:07:07.133 --> 00:07:12.983
But I just want you to know, if you're here and you think that I've impacted you, you've impacted me as well.

00:07:13.503 --> 00:07:23.223
So I was a principal for a long time, and then about six years ago I started on my own doing full-time consulting work, training principals.

00:07:23.253 --> 00:07:38.673
And a lot of that created the seed for this dissertation today, which is helping people know how to use technology, how to learn, and how to apply their learnings to what they're doing.

00:07:38.823 --> 00:07:42.213
So, with that, I am going to get started.

00:07:42.213 --> 00:07:43.593
I'm going to mute everybody.

00:07:43.763 --> 00:07:46.293
So you all should be muted.

00:07:46.293 --> 00:07:51.333
And then Stacy, my wife, who is wonderful, she is going to mute anybody who comes off mute

00:07:51.708 --> 00:07:58.308
during the time that I'm presenting. At the end, I have room for questions, and you're welcome to come off mute and ask questions at that point.

00:07:59.178 --> 00:08:07.848
I'm gonna share today what I've learned about how school principals can use artificial intelligence, not just to save time, but to actually solve hard problems.

00:08:08.073 --> 00:08:24.258
And I want to start with a story: on Sunday, August 14th, 2022, I signed up for a new tool called DALL-E, which was an image generation tool from OpenAI.

00:08:24.873 --> 00:08:36.723
And the reason I did that is because I've never had much artistic skill, and it seemed like a way for me to be able to create something that I couldn't do on my own.

00:08:37.143 --> 00:08:45.963
And my early attempts at that were not very good and left a lot to be desired, but the tool enabled me to do something that I just hadn't been able to do before.

00:08:45.963 --> 00:08:54.243
And I thought that was really amazing, that I could unlock something where I could describe something and then make an image related to it.

00:08:54.843 --> 00:09:05.463
A few months later, OpenAI unveiled ChatGPT and made it so that large language model technology was accessible to anybody who had the internet.

00:09:06.093 --> 00:09:10.833
And immediately marketing slogans just started focusing on saving time.

00:09:10.863 --> 00:09:13.503
They would say save hours a week on menial tasks.

00:09:13.863 --> 00:09:17.943
And ed tech companies especially told teachers to teach smarter, not harder.

00:09:19.023 --> 00:09:21.123
And I thought, is that really the best we can do?

00:09:21.648 --> 00:09:34.398
That approach of just doing the same old things faster is not really going to change anything meaningful or address the bigger issues in our schools. Period.

00:09:34.458 --> 00:09:44.538
So I started this dissertation in practice with a simple question: is this AI tool destined to just make us faster at doing the same old things, or can it help us actually innovate?

00:09:45.138 --> 00:09:55.248
And when ChatGPT launched in 2022, it reached a hundred million users in just two months, which made it the fastest growing consumer application in history.

00:09:55.698 --> 00:10:06.058
But almost immediately, the messaging from ed tech companies was all about things like "co-teacher helps you run your classroom better in fewer hours" and "save time on lesson planning."

00:10:06.928 --> 00:10:18.988
And I think that saving time is really valuable and good, but it misses a more powerful aspect of these tools, which is to enable actions and make ideas come to life that have not been possible before.

00:10:19.588 --> 00:10:27.358
I've argued in my previous work that being a great principal is really about designing your school for the people that are right there in front of you to meet their needs.

00:10:27.778 --> 00:10:32.758
This requires adaptations that go beyond just instructional leadership or efficiency.

00:10:33.178 --> 00:10:39.898
It requires principals to be innovative, vision-focused, mission-oriented, and to operate from a moral purpose.

00:10:40.858 --> 00:10:46.408
So to me, that tension between efficiency and innovation became the heart of this study.

00:10:47.488 --> 00:10:51.928
So where I wanna start is with the literature review.

00:10:52.798 --> 00:10:58.318
I reviewed research on AI in education, principal effectiveness, and innovation frameworks.

00:10:58.888 --> 00:11:05.518
And what I found is a lot about AI for teachers and students, and almost nothing about AI for principals.

00:11:05.998 --> 00:11:17.578
And this matters because the peer reviewed literature simply hasn't kept pace with the technology. ChatGPT launched in late 2022, and academic publishing cycles mean that we're only now seeing rigorous research emerge.

00:11:18.113 --> 00:11:24.803
Mariano and Neuro Leaf conducted a literature review of 66 articles published between 2020 and 2024.

00:11:24.893 --> 00:11:38.093
Note that ChatGPT was released halfway through that period. They focused on AI's role in transforming learning environments, and none of the sources they used related directly to principal leadership.

00:11:38.453 --> 00:11:45.683
So much of the literature has focused on student and teacher use of AI, and very little has focused on principal use.

00:11:46.133 --> 00:11:57.983
Furthermore, there's a dearth of peer reviewed literature on principal leadership and AI, and it's especially lagging for leadership applications because everybody is focused on how teachers and students use it.

00:11:58.913 --> 00:12:02.633
That being said, there are some extrapolations we can make.

00:12:03.963 --> 00:12:08.833
Research consistently shows that principals matter. Grissom and colleagues conducted a comprehensive

00:12:09.138 --> 00:12:16.728
synthesis for the Wallace Foundation, reviewing two decades of research on principal effectiveness, and their finding is clear.

00:12:16.728 --> 00:12:21.228
Principals are second only to classroom instruction in their impact on student learning.

00:12:22.038 --> 00:12:37.578
They break down a large body of quantitative evidence into four mutually reinforcing domains of practice: instructionally focused interactions with teachers, building a productive school climate, facilitating collaboration and professional learning communities, and personnel and resource management.

00:12:38.628 --> 00:12:40.518
But here's a gap I kept running into.

00:12:40.818 --> 00:12:45.198
None of the AI literature addresses how principals can use AI across these domains.

00:12:45.198 --> 00:12:52.578
Specifically, as I wrote in my book, SchoolX, the role of the school principal may be one of the most unique positions in any organization.

00:12:52.968 --> 00:13:01.758
There aren't many other roles that require a leader to interface with so many stakeholders with such drastic and diverse expectations for success in different areas.

00:13:04.833 --> 00:13:10.983
Bixler and SBIs published a paper in 2025 trying to build a conceptual model for principals using AI.

00:13:11.373 --> 00:13:15.783
And it's one of the only peer reviewed pieces specifically about principals and AI.

00:13:16.233 --> 00:13:19.563
But even they had to rely mostly on teacher-focused studies.

00:13:19.863 --> 00:13:35.013
They argued that principals have the potential to lead AI to maintain and enhance instructional effectiveness in schools, but they're missing a key piece: how AI might be used to approach the role of the principal differently through innovation.

00:13:35.793 --> 00:13:40.113
And there's a second gap even when we do talk about AI in education.

00:13:40.653 --> 00:13:45.843
Most of the research out there is talking about efficiency, time savings, and automation.

00:13:46.023 --> 00:13:51.273
We rarely talk about innovation, about using AI to do things that weren't possible before.

00:13:51.633 --> 00:13:58.323
And that's why I wanted to explore what actually happened with participants in my study.

00:13:59.523 --> 00:14:01.083
So what is an innovation?

00:14:01.878 --> 00:14:10.068
The Clayton Christensen Institute identifies four types of innovation: sustaining innovation, disruptive innovation, hybrid innovation, and efficiency innovation.

00:14:10.728 --> 00:14:17.028
Pretty much whenever somebody's talking about innovation, they're talking about Clayton Christensen's definition of disruptive innovation.

00:14:17.658 --> 00:14:23.628
And these distinctions are typically found in business, but they do relate to education as well.

00:14:24.578 --> 00:14:33.008
Christensen's work on disruptive innovation shows that real disruption often starts at the low end, which is simple, cheap and accessible, and then moves up market.

00:14:33.308 --> 00:14:35.558
And that's exactly what ChatGPT did.

00:14:35.948 --> 00:14:47.648
Christensen himself applied disruptive innovation to higher education, asking, quote, if there is a novel technology or business model that allows entrants in higher education to follow a disruptive path.

00:14:48.308 --> 00:14:49.778
That answer seems to be yes.

00:14:49.778 --> 00:14:52.748
And the enabling innovation is online learning.

00:14:52.958 --> 00:14:53.678
End quote.

00:14:54.338 --> 00:14:55.508
Now, this was back

00:14:56.883 --> 00:14:59.433
many years ago when he first made that statement.

00:14:59.463 --> 00:15:13.923
And, you know, online learning has certainly changed the educational landscape, considering that this whole program that I did was all online, and Clayton Christensen didn't even live to see this day and what it has become.

00:15:13.923 --> 00:15:16.383
But there's definitely some innovation happening there.

00:15:17.193 --> 00:15:31.023
Arizona State University's Mary Lou Fulton Teachers College defines principled innovation as the ability to imagine new concepts, catalyze new ideas, and form new solutions, guided by principles that create positive change for humanity.

00:15:32.073 --> 00:15:41.733
And that framing is one that I really love because it helps us understand that innovation without ethics is just disruption for its own sake.

00:15:42.273 --> 00:15:45.513
And I'm not about disruption for its own sake.

00:15:45.543 --> 00:15:49.443
I'm about improving things that create positive change for humanity.

00:15:50.343 --> 00:15:57.213
James Bessen's research on automation shows that technology typically doesn't eliminate jobs, but it transforms them.

00:15:57.393 --> 00:16:00.903
And that's the kind of innovation that I'm interested in for principals.

00:16:01.383 --> 00:16:03.423
Lemme talk a little bit more about Bessen.

00:16:03.783 --> 00:16:08.643
This is one of my favorite examples from the research, and I think it is directly applicable to education.

00:16:09.333 --> 00:16:20.283
There's a lot of hand-wringing about AI taking over jobs, and I have even said that if a computer can teach students how to read, for example, better than a human can, why wouldn't we turn that over to it?

00:16:20.643 --> 00:16:24.303
Well, the obvious answer is that humans provide something that machines don't.

00:16:25.203 --> 00:16:28.593
Bessen talked about the banking industry and the innovation of ATMs.

00:16:28.593 --> 00:16:34.563
ATMs decreased the need for tellers per branch, but increased their reach as banks opened more branches.

00:16:35.163 --> 00:16:38.613
He says, quote, thanks to the ATM, the number of tellers required.

00:16:40.053 --> 00:16:48.393
to operate a branch office in the average urban market fell from 20 to 13 between 1988 and 2004, end quote.

00:16:48.783 --> 00:16:55.803
But that cost reduction allowed banks to open more branches, and the number of bank branches in urban areas increased by 43%.

00:16:56.223 --> 00:16:58.293
The nature of their work changed.

00:16:58.293 --> 00:17:06.513
Tellers went from being cash dispensers, a task the ATM could do, to being part of a relationship banking team who could sell financial services.

00:17:07.203 --> 00:17:17.853
We can and should expect the role of a teacher and principal to change in a similar way, just as a principal used to be viewed as a disciplinarian manager, but is now seen as an instructional leader.

00:17:18.123 --> 00:17:20.133
Their role will evolve again.

00:17:20.523 --> 00:17:27.183
The question is whether we're intentional about that change and whether or not it brings about positive change for humanity.

00:17:29.343 --> 00:17:31.833
Here's the core tension of this dissertation.

00:17:33.048 --> 00:17:53.358
In June 2025, researchers published a paper called Your Brain on ChatGPT: Accumulation of Cognitive Debt When Using an AI Assistant for Essay Writing Task. They studied 54 participants in three groups who authored essays: one group using just their brains, one using brains plus search engines, and one relying on an LLM.

00:17:54.168 --> 00:17:59.268
The group who relied on the LLM showed decreased cognitive load, as shown in their alpha band connectivity.

00:17:59.718 --> 00:18:10.728
The researchers define cognitive debt as a condition in which repeated reliance on external systems like LLMs replaces the effortful cognitive processes required for independent thinking.

00:18:11.688 --> 00:18:13.278
But I saw something different.

00:18:13.908 --> 00:18:20.988
I created a new term called cognitive equity, which is the opposite of cognitive debt, not the other kind of equity we often talk about.

00:18:21.363 --> 00:18:29.613
To clarify with a different perspective, it's much like how using assistive communication devices for a nonverbal person expands their ability to communicate.

00:18:30.003 --> 00:18:39.423
Cognitive equity is the situation where someone who is burdened by a cognitive load offloads that to an AI that will then help them perform tasks they couldn't do otherwise.

00:18:40.503 --> 00:18:41.853
Here's another way to think of it.

00:18:42.213 --> 00:18:46.173
If you are reliant upon AI when you don't need it, then it creates debt.

00:18:46.593 --> 00:18:50.253
But when you rely on it, when you do need it, then it can create equity.

00:18:50.613 --> 00:18:52.533
So what is this cognitive equity?

00:18:53.283 --> 00:19:03.243
In a paper I wrote last year, I defined it as using AI as a tool that expands a user's cognitive capacity to address complex problems and lead adaptive change.

00:19:04.083 --> 00:19:05.643
Let me illustrate with an example.

00:19:05.973 --> 00:19:11.163
If someone is capable of a task and they don't use the muscles required for the task, it can be damaging.

00:19:11.253 --> 00:19:12.303
That's cognitive debt.

00:19:12.888 --> 00:19:21.588
But if someone doesn't have the skills necessary for a task and they use the tools to bring their skills up to that level, then the use of that tool is meaningful and worthwhile.

00:19:21.828 --> 00:19:23.028
That's cognitive equity.

00:19:23.838 --> 00:19:34.968
AI can serve as what Liz Wiseman calls a multiplier, giving people skills they did not previously have, enabling them to create solutions to problems they never thought possible in her framework.

00:19:35.088 --> 00:19:42.618
Multiplier leaders, quote, liberate people from the oppressive forces, end quote, in bureaucratic settings and help people be their best.

00:19:43.488 --> 00:19:46.908
This connects to the framing of assistive versus replacement technology.

00:19:47.118 --> 00:19:52.368
When we use AI for efficiency, we're often replacing human cognition and letting the machine think for us.

00:19:52.368 --> 00:19:59.658
When we use AI for innovation, we are assisting human cognition, using the machine to extend what we can do.

00:20:00.468 --> 00:20:04.758
Cognitive equity is explicitly about assistance, not about replacement.

00:20:06.888 --> 00:20:10.368
So let's get on the same page about what AI actually is.

00:20:10.998 --> 00:20:22.308
Artificial intelligence, which right now is mostly a marketing term, is broadly defined as the development of computer systems that can perform tasks, which would typically require human intelligence.

00:20:22.608 --> 00:20:27.378
While it seems that AI is new, it actually originated in the sixties with the development of the computer.

00:20:27.888 --> 00:20:32.988
Machine learning is a subfield of AI focusing on pattern recognition in data.

00:20:33.618 --> 00:20:43.008
Deep learning is a subset of machine learning that utilizes artificial neural networks, loosely inspired by the way the human brain processes data.

00:20:43.008 --> 00:20:49.398
And large language models are a specific area within deep learning that focuses on text specifically.

00:20:50.808 --> 00:20:56.808
These are powerful machine learning models that use neural networks to model complex relationships at a massive scale.

00:20:57.558 --> 00:20:58.428
Here's the key.

00:20:58.518 --> 00:21:02.778
LLMs essentially predict the next best word in a text string.

00:21:03.493 --> 00:21:07.423
Think of this process as putting together a very large all-white puzzle.

00:21:07.753 --> 00:21:14.023
What the LLM does is predict what the next best piece is going to look like.

00:21:14.983 --> 00:21:21.823
Understanding that AI in its current form is a prediction machine is crucial to understanding what it can and cannot do.

00:21:22.333 --> 00:21:30.043
It may appear to be magical, but it is simply trained on a massive amount of data and can produce better results faster than many other tools.

00:21:30.403 --> 00:21:33.763
But it is still solidly within the box of computer programming.
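
To make that next-word prediction idea concrete, here is a minimal sketch (my own illustration, not anything from the study): a tiny bigram model that "predicts the next best word" purely by frequency, the same basic idea an LLM applies with a neural network at massive scale.

```python
from collections import Counter, defaultdict

# Toy illustration, NOT a real LLM: a bigram model that predicts the
# next word by counting which word most often followed the current one.
corpus = "the principal leads the school and the principal supports teachers".split()

# Count which word follows each word in the corpus.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent word seen after `word`, or None."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # "principal" follows "the" most often here
```

An LLM does the same kind of prediction, but over tokens rather than whole words, with billions of learned parameters instead of raw counts.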

00:21:34.303 --> 00:21:36.133
This diagram is adapted from Sto Bower, a data scientist who wrote a clear explanation of how these technologies relate to each other.

00:21:41.053 --> 00:21:55.073
I definitely suggest you check it out if you are interested in learning more about that; going into more detail is beyond the scope of this dissertation in practice, which I will probably say about a hundred more times throughout this conversation.

00:21:56.823 --> 00:22:01.333
In 2024, Michael Fullan and colleagues published a paper,

00:22:01.803 --> 00:22:07.293
which is, again, one of the few peer reviewed pieces specifically about AI and school leadership.

00:22:07.683 --> 00:22:09.573
Their assessment is very direct.

00:22:09.633 --> 00:22:13.293
AI has crossed a threshold, and it's not a novelty anymore.

00:22:13.323 --> 00:22:15.903
It's not something we can wait and see about.

00:22:16.593 --> 00:22:24.933
They write that AI has, quote, transitioned from a mere toy to a disruptive innovation, end quote. That language matters.

00:22:24.933 --> 00:22:29.493
A toy is something you play with; a disruptive innovation is something that changes industries.

00:22:30.243 --> 00:22:38.553
They also emphasize that, quote, school leaders need to create a long-term vision for integrating this technology into their schools in a careful but principled way.

00:22:39.213 --> 00:22:47.703
Despite the allure and promise of this brave new gen AI world, school leaders must always put the learning needs of children and young people first, end quote.

00:22:48.318 --> 00:22:58.278
And I quoted that because that is an important piece that we have to remember, that educating our students is the priority in our jobs.

00:22:58.908 --> 00:23:10.428
The question for principals is not whether to engage with AI, but how. Will we use that disruption for efficiency, doing old things faster, or for innovation, doing new things that matter?

00:23:12.378 --> 00:23:19.848
I would say that we should not just do the wrong things faster, the old things faster, just because we've always done them.

00:23:20.448 --> 00:23:23.628
Education has a pattern, and I don't think that it's a good one.

00:23:24.048 --> 00:23:25.788
We adopt new technology.

00:23:26.148 --> 00:23:30.468
We use it to do the old things faster, and then we wonder why nothing changes.

00:23:31.368 --> 00:23:37.428
AI startup companies quickly seized on the monetary potential involved in promising to make life easier for principals and teachers.

00:23:37.758 --> 00:23:47.448
While saving time is great and necessary, it misses the larger point of these tools, which should enable actions and enliven ideas that have never before been possible.

00:23:48.093 --> 00:23:53.793
It can be a disruptive technology in schools, but many school leaders are treating it solely as an efficiency innovation.

00:23:54.273 --> 00:24:00.243
This prevents leaders from embracing the change that can come when educators are able to create something that they never thought they could before.

00:24:00.903 --> 00:24:04.773
AI gives us a chance to break that pattern if we're intentional about it.

00:24:05.583 --> 00:24:07.323
The innovation imperative is this.

00:24:07.443 --> 00:24:09.903
Don't just ask, how can AI make us faster?

00:24:10.143 --> 00:24:14.343
Ask what can AI help us do that we couldn't do before?

00:24:17.763 --> 00:24:24.373
Everett Rogers created the diffusion of innovation theory, in the early seventies, if I remember my dates correctly.

00:24:24.373 --> 00:24:37.133
The modern understanding shows a bell curve like this one here, with different groups occupying different roles: innovators, early adopters, early majority, late majority, laggards, and non-adopters.

00:24:37.853 --> 00:24:44.423
There is a chasm between the early adopters and early majority, and a similar chasm between the late majority and the laggards.

00:24:44.888 --> 00:24:49.868
These chasms are particularly important because they constitute opportunities for disruptive innovation.
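
For reference, the adopter categories can be laid out in a few lines. This is my own sketch using Rogers' conventional percentages (the source of the 2.5% figure below); the non-adopters mentioned here fall outside the classic model.

```python
# Rogers' conventional adopter-category shares of a population (percent).
shares = {
    "innovators": 2.5,
    "early adopters": 13.5,
    "early majority": 34.0,
    "late majority": 34.0,
    "laggards": 16.0,
}

def cumulative_adoption(category):
    """Percent of the population that has adopted once `category` has."""
    total = 0.0
    for name, pct in shares.items():
        total += pct
        if name == category:
            return total

# The chasm sits right here: crossing from 16% to the early majority.
print(cumulative_adoption("early adopters"))  # 16.0
```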

00:24:50.498 --> 00:24:57.218
We can assume that about 2.5% of the school leader population was using AI before ChatGPT was released.

00:24:57.218 --> 00:25:01.928
These were the innovators, they were the people who jumped on the beta version as soon as it came out.

00:25:01.928 --> 00:25:11.048
And as an innovator myself, I saw enormous potential in ChatGPT and other AI tools, and had actually been using them before they were broadly available to the public.

00:25:11.348 --> 00:25:26.188
The first book I wrote, in 2012, was called Paperless Principal, about using these types of tools to automate the work that I was doing as a principal so I could have more time to focus on the needs of the students I was working with.

00:25:27.838 --> 00:25:39.128
This orange part of the diffusion of innovation curve is where I wanted to focus, and I added a design thinking layer for the innovators and early adopters.

00:25:39.518 --> 00:25:43.238
The goal of my intervention was to help principals move from early awareness of AI

00:25:44.318 --> 00:25:51.068
to actually using it to solve real problems in their schools, not just understanding it, but applying it.

00:25:51.338 --> 00:26:01.478
As the 11 principals attending my presentation were all innovators in Wyoming, I wanted them using AI for what I would call the right things: being innovative and not just saving time.

00:26:03.068 --> 00:26:07.568
These are the three questions that guided my study and they're intentionally sequenced.

00:26:07.598 --> 00:26:15.698
The first question asks, as a result of the AI for innovation training, do principals report understanding how AI works better than they did before?

00:26:16.478 --> 00:26:25.148
The second question asks, as a result of the training, do Principals report being able to apply AI strategies to use AI for innovation in their schools?

00:26:25.928 --> 00:26:34.568
And the third question asks, what supports are needed and what barriers arise when leaders employ AI as a change agent to create innovative solutions?

00:26:35.048 --> 00:26:38.528
The first two are really about learning, is the message hitting?

00:26:38.558 --> 00:26:40.298
And the third is about reality.

00:26:40.628 --> 00:26:43.058
What actually gets in the way and what actually helps?

00:26:43.808 --> 00:26:56.328
So let me describe the intervention that I did, which was a full-day workshop for principals in Wyoming, in connection with their principals' association conference that year.

00:26:56.988 --> 00:27:10.158
So I officially started teaching how to use automations and machine learning in a school setting in 2007, as a teacher, with what I called Techno Thursdays, for staff in my building who were there to learn how to use technology better.

00:27:10.578 --> 00:27:15.768
Then in 2012 I wrote my first book, Paperless Principal, and I learned something very important back then.

00:27:16.008 --> 00:27:20.808
Educators love to collect tools and shortcuts, thinking they'll use them someday.

00:27:21.048 --> 00:27:28.608
I once ran an experiment at a principals' conference and offered a session called 50 Tools in 50 Minutes for Busy Administrators.

00:27:28.938 --> 00:27:38.318
And it was my best-attended session that year, but it had the smallest amount of impact, because people just collected tools like they were badges and didn't do anything with them.

00:27:39.838 --> 00:27:49.018
In 2018 as a principal, I started a new strategy where I focused on giving people time to think deeply about a problem before giving them any tools to solve it.

00:27:49.738 --> 00:27:50.818
This helped tremendously.

00:27:50.818 --> 00:28:00.058
So I designed this AI for innovation presentation around asking attendees to first identify a problem before using AI to generate innovative solutions.

00:28:00.688 --> 00:28:04.768
This approach is key because when presentations are tool-focused, you use that tool.

00:28:05.098 --> 00:28:11.638
But when the presentation is problem- or solution-focused, you use whatever tool makes sense in the moment.

00:28:13.828 --> 00:28:17.788
So I created this visual to illustrate what that looks like.

00:28:18.598 --> 00:28:28.618
The key is finding the sweet spot: problems that are hard enough to matter, but tractable enough to make progress on in a day, or whatever the length of the training is.

00:28:29.278 --> 00:28:36.388
As complexity and meaning rises, a person becomes hungrier for a solution and is willing to do more to achieve it.

00:28:36.913 --> 00:28:41.053
But if the problem is too big, like how do we end homelessness for our students?

00:28:41.563 --> 00:28:48.043
People quickly become disheartened because the problem really is huge and multifaceted, and they feel like they can't solve it on their own.

00:28:49.003 --> 00:28:51.463
On the other hand, solutions should be simple.

00:28:51.703 --> 00:29:03.613
There's a sweet spot where people are willing to take a second and third stab at something, and this is where AI solutions can both be good enough and simple enough that principals could actually implement them for meaningful change.

00:29:04.513 --> 00:29:08.533
My guidance was that the chosen problem could not be solved in 30 seconds.

00:29:08.533 --> 00:29:17.143
With AI, that would indicate it wasn't a wicked enough problem. But it also couldn't require multiple stakeholders for any workable solution; that would be too wicked.

00:29:17.173 --> 00:29:23.053
The way to find the sweet spot is to think about it, talk about it, think some more, and talk some more.

00:29:23.923 --> 00:29:25.663
This diagram is my original work.

00:29:25.723 --> 00:29:35.293
It's based on years of facilitating this kind of design work. Ten principals and one teacher gave up a Saturday in Wyoming to learn about AI.

00:29:35.848 --> 00:29:50.518
The participants self-selected to attend this AI for Innovation training conducted in partnership with the Wyoming Association of Secondary School Principals at their annual event on November 1st, 2025, in Casper, Wyoming. That they gave up personal time matters.

00:29:50.518 --> 00:29:52.828
These were people already leaning into change.

00:29:52.888 --> 00:30:09.328
They're innovators in Rogers' framework, and their willingness to give up personal time and lean into a topic suggests they were more open to change and
experimentation than the average principal, which of course is a limitation on transferability, but also shows what's possible with motivated participants.

00:30:09.718 --> 00:30:22.678
They were also a perfect group of people to see AI as a disruptive innovation rather than just an efficiency innovation, and I really wanted to help them see how to use it in this new way that was not just about saving time.

00:30:25.498 --> 00:30:32.488
I used a retrospective pretest design to assess their growth. A traditional pretest is administered before an experience.

00:30:32.893 --> 00:30:35.053
People often overestimate their knowledge.

00:30:35.113 --> 00:30:36.763
They don't know what they don't know yet.

00:30:37.303 --> 00:30:41.083
Also, learning something new changes how they perceive their level of understanding.

00:30:41.083 --> 00:30:43.423
This is called a response shift bias.

00:30:43.723 --> 00:30:56.203
A retrospective pretest solves both of these problems by having participants rate both before and after at the same time, after the training, so they're using the same frame of reference for both ratings.

00:30:56.923 --> 00:31:03.133
The retrospective pretest has been shown as an effective way to measure self-reported learning in educational environments.

00:31:03.133 --> 00:31:09.943
Across many studies, however, there are risks: demand characteristics, where participants want to please the facilitator;

00:31:09.973 --> 00:31:14.953
implicit theories of change; and metamemory-related biases like hindsight bias.

00:31:15.403 --> 00:31:30.553
I mitigated these as best I could by keeping responses anonymous, establishing a culture of non-judgment, and keeping the time period limited to a single day, so that participants couldn't misremember how much they knew before, as can happen when the comparison point is days or weeks in the past.

00:31:31.528 --> 00:31:38.398
For qualitative coding, I used AI-assisted coding in ATLAS.ti as a support for my own analysis.

00:31:39.848 --> 00:31:46.658
And then I also engaged a peer researcher, a doctoral candidate with qualitative research experience, to check and refine the codes.

00:31:47.018 --> 00:31:55.338
She reviewed all the codes and categories and confirmed alignment or suggested refinements, which I ended up incorporating and will present to you here today.

00:31:56.538 --> 00:31:59.448
So I wanna go through some of the findings.

00:32:02.248 --> 00:32:12.178
In the retrospective pretest, I asked participants to indicate their understanding of how they could use AI generally in schools, for innovation, and for understanding how tools work.

00:32:12.238 --> 00:32:22.828
The average before rating was 3.2, which is about average on a five-point scale, showing most principals believed they were moderately proficient.

00:32:22.828 --> 00:32:24.538
After the training, it was 4.1, above average.

00:32:24.538 --> 00:32:26.068
That's a gain of 0.9 points.

00:32:26.818 --> 00:32:44.518
For the question of using AI effectively in school settings, they went from a 3.1 to a 3.9, a gain of 0.8 points, and for using AI to solve problems and innovate, which was the heart of this study, they went from a 2.9 to a 4.0, the largest gain, at 1.1 points.

00:32:45.238 --> 00:32:49.768
They started below average on innovation and ended above average.

00:32:49.798 --> 00:32:54.808
Even those who rated themselves highly initially still showed growth.

00:32:54.808 --> 00:33:01.978
In the end, there was not one participant in any of the responses for any of the questions that did not self-report some growth.

00:33:03.928 --> 00:33:07.168
The second finding was that they could apply it.

00:33:07.258 --> 00:33:09.718
They weren't just learning theory; they were building.

00:33:10.198 --> 00:33:12.778
My question was, can principals apply this learning?

00:33:12.808 --> 00:33:14.128
Yes, they could apply it.

00:33:14.548 --> 00:33:17.008
Let me share what the principals actually created.

00:33:17.518 --> 00:33:20.968
One principal wanted to help high school students reflect on their interests.

00:33:21.298 --> 00:33:23.398
They created a one-month curiosity tracker.

00:33:24.148 --> 00:33:31.798
Another, to address students' emotional and mental state, proposed an app that prompts students to complete check-ins about their mental health.

00:33:32.338 --> 00:33:35.368
Multiple participants wanted to improve digital citizenship.

00:33:35.398 --> 00:33:39.238
They designed games where students could progress through levels to learn these concepts.

00:33:39.718 --> 00:33:41.578
One participant captured it perfectly.

00:33:42.748 --> 00:33:47.908
You used to have to depend on whatever vendor could sell you an app, and you had to live with compromises.

00:33:48.268 --> 00:33:51.028
Well, if you know what you want, you can build it yourself.

00:33:51.598 --> 00:34:02.728
The tools that we used were varied, and because we were focused on a problem, they were able to find the tool that matched the solution that they were trying to build.

00:34:03.058 --> 00:34:12.568
Whereas if I went into this training and said, we're gonna learn how to use ChatGPT, for example, then they would learn how to use ChatGPT specifically, and then have to figure out the problems.

00:34:13.168 --> 00:34:19.018
This was an intentional design decision to help them use whatever tools would be necessary.

00:34:19.258 --> 00:34:31.858
So we used Replit, Zapier, Google Forms, Claude Code, Manus, and ChatGPT, and probably a couple others that I don't even know about, because they were able to find solutions on their own as well.

00:34:33.778 --> 00:34:36.418
The third finding was about barriers.

00:34:36.808 --> 00:34:43.558
When I asked what would get in the way of continued AI use, three themes dominated; I call them the three Ts.

00:34:43.708 --> 00:34:50.998
Time, training, and treasure. Time was the biggest barrier, which I'm sure is not a surprise to anybody who works with educators.

00:34:51.508 --> 00:34:55.738
Respondents noted lack of time for learning, implementing, and practicing with AI.

00:34:56.248 --> 00:35:05.278
In my experience, this is the number one reason people are excited about using AI for efficiency because they get to save such a scarce resource, which I totally understand.

00:35:06.148 --> 00:35:14.698
Educators are caught in a catch-22, though: their daily work consists of doing things they don't find purposeful, so they want AI to help reduce that time.

00:35:15.148 --> 00:35:21.808
One participant wrote that thinking more holistically about underlying issues rather than surface level problems was empowering.

00:35:22.468 --> 00:35:26.488
Others said that in their day jobs, they were just trying to put out fires every day.

00:35:27.538 --> 00:35:29.218
Training was second. One

00:35:29.218 --> 00:35:34.648
workshop isn't enough. And treasure, meaning resources, subscriptions, and infrastructure, was third.

00:35:35.278 --> 00:35:44.398
If principals are to truly use AI in innovative ways, they must start by identifying and abandoning unimportant tasks in favor of meaningful efforts.

00:35:46.468 --> 00:35:49.348
The fourth finding is about supports needed.

00:35:50.458 --> 00:35:55.018
The supports participants said they needed are almost the mirror image of their barriers.

00:35:55.288 --> 00:35:59.068
What they needed most was professional development, training, or hands on time.

00:35:59.698 --> 00:36:00.868
But here's what strikes me.

00:36:00.868 --> 00:36:03.778
There's a perceived need for someone to teach them.

00:36:04.378 --> 00:36:08.188
Schools have created an attitude of dependency on teachers to teach us things.

00:36:08.488 --> 00:36:11.998
One participant, Beth, said, I'm good creatively.

00:36:11.998 --> 00:36:16.288
I'm not good technologically, so let's sit together and we're better.

00:36:17.188 --> 00:36:18.718
I see a problem with this.

00:36:18.898 --> 00:36:22.738
Principals mistakenly believe the only way to learn is by an expert teaching them.

00:36:23.218 --> 00:36:29.428
I was brought in from Washington to Wyoming as the expert, but I didn't gain my knowledge from experts myself.

00:36:29.458 --> 00:36:32.488
I learned by experimenting and trying new things myself.

00:36:33.298 --> 00:36:37.978
AI has been trained on the content of the internet where there are many tutorials and all kinds of things.

00:36:38.218 --> 00:36:42.568
AI can eliminate the need for a teacher to be the fountain of knowledge as we currently see them.

00:36:43.108 --> 00:36:51.298
Instead, we should work towards systems that make experts a compass, constantly reminding learners of their North Star and helping them find their way back to it.

00:36:51.598 --> 00:36:53.038
This was my role in the training.

00:36:53.458 --> 00:37:00.638
Whenever they wanted to just be more efficient, I had to remind them that our purpose was to create cognitive equity for ourselves, which means doing things that were not

00:37:01.753 --> 00:37:02.503
possible previously.

00:37:02.503 --> 00:37:09.643
One principal initially wanted to solve student misuse of technology by composing an email about why it was problematic.

00:37:10.123 --> 00:37:14.323
Sure, AI can do that, but would that actually solve the problem?

00:37:14.833 --> 00:37:15.583
Of course not.

00:37:15.793 --> 00:37:18.103
Nobody even reads emails.

00:37:18.103 --> 00:37:22.903
With coaching, he ended up creating a game for students to learn about digital citizenship in an interesting way.

00:37:23.408 --> 00:37:33.133
In the old days, he wouldn't have had the time, expertise, money or knowledge to make any kind of app, but now he can do something that was not possible before.

00:37:35.413 --> 00:37:39.463
Here's something else that I observed that the data seems to have confirmed.

00:37:40.018 --> 00:37:45.478
Learning to use AI for complex tasks involves predictable fluctuations in demeanor.

00:37:45.628 --> 00:37:47.608
I call it the affective rollercoaster.

00:37:48.478 --> 00:37:54.088
AI enthusiasts began with guarded optimism as they encountered compelling demonstrations.

00:37:54.088 --> 00:38:02.248
Their demeanor rose; excitement and curiosity showed on their faces.

00:38:02.248 --> 00:38:07.228
During the ugly portion of my presentation, when limitations and ethical concerns came up, their demeanor dipped as well.

00:38:07.978 --> 00:38:11.938
Participants said things like, we should just ban it if this is what happens.

00:38:12.688 --> 00:38:15.868
AI skeptics started at a more negative place.

00:38:16.378 --> 00:38:21.478
Early demonstrations elicited concerns about job displacement and the black box nature of ai.

00:38:21.898 --> 00:38:25.798
Their line remained lower, but followed a similar oscillating pattern.

00:38:26.458 --> 00:38:28.588
Again, this was by design.

00:38:28.888 --> 00:38:36.928
The lowest point for both groups occurred around the first failed solution, and although that was a different time for each person,

00:38:37.903 --> 00:38:44.353
Pretty much everybody got there when participants recognized that their initial prompts didn't yield what they had hoped.

00:38:44.833 --> 00:38:48.583
This moment of productive struggle is central to the training design.

00:38:48.673 --> 00:38:58.813
It forced participants to refine their problem framing and engage more thoughtfully with the solutions as they iterated and eventually produced more useful output.

00:38:58.843 --> 00:39:01.348
The output was often still imperfect, but noticeably better.

00:39:01.678 --> 00:39:04.423
Their trajectory turned sharply upward.

00:39:04.993 --> 00:39:08.653
One participant joked about designing a beautiful black screen.

00:39:08.893 --> 00:39:10.453
Their prototype wasn't working.

00:39:10.903 --> 00:39:13.003
Later, they said It's incredible.

00:39:13.063 --> 00:39:14.113
It really is.

00:39:14.623 --> 00:39:19.093
Every participant hit a wall, and every participant pushed through.

00:39:22.813 --> 00:39:26.023
Another thing that I asked about was the culture impacts.

00:39:26.233 --> 00:39:31.888
And when I asked how innovation work would affect school culture, the most common responses were greater collaboration

00:39:32.668 --> 00:39:34.078
and buy-in from teachers.

00:39:34.438 --> 00:39:37.018
These are essential for healthy functioning schools.

00:39:37.498 --> 00:39:45.958
Grissom's research that I mentioned earlier identifies the importance of principals building a productive school climate and facilitating collaboration for student outcomes.

00:39:46.348 --> 00:40:02.938
When school professionals have time to collaborate, because they're not wasting time on superfluous things, they can foster collaboration and teacher buy-in through
regular professional development, collaborative planning, and peer observation. Participants also mentioned that innovation could help them improve; success breeds success.

00:40:03.298 --> 00:40:09.568
Innovation breeds innovation, and principals who model innovation teach others how to do the things they're asking them to do.

00:40:09.928 --> 00:40:14.248
Principals who model behaviors they expect get the same behaviors from their teachers.

00:40:14.698 --> 00:40:19.198
As one participant said, if you know what you want, you can build it yourself.

00:40:19.468 --> 00:40:24.808
Imagine hearing that from your principal and thinking, I can build it myself.

00:40:25.108 --> 00:40:27.958
That's an empowering message to share with your staff.

00:40:28.693 --> 00:40:32.413
So there are some implications from this research.

00:40:32.563 --> 00:40:37.153
The first implication is that we need to fundamentally reframe how we talk about AI in schools.

00:40:37.903 --> 00:40:43.573
There are two aspects to repositioning AI from a time saving tool to an innovation partner.

00:40:44.143 --> 00:40:49.453
First, reframing AI as a problem solving tool shifts the focus to address the problem first.

00:40:49.873 --> 00:40:59.233
This involves employing design thinking and recognizing that teaching a specific tool may not always be beneficial. When focusing on tools,

00:40:59.233 --> 00:41:02.953
individuals learn how to use the tools. When focusing on problems,

00:41:02.983 --> 00:41:04.963
they learn how to solve problems.

00:41:05.443 --> 00:41:21.433
Second, presenting AI through cognitive equity and design thinking allows principals to perceive alternative methods of operation and underscores using AI for human-centered purposes, enabling more time for individual interactions, relationship building, and providing feedback.

00:41:21.868 --> 00:41:28.888
AI should automate processes to allow more time for interpersonal engagements rather than merely increasing productivity.

00:41:29.608 --> 00:41:33.718
I love the term innovation partner because it captures this shift.

00:41:33.988 --> 00:41:39.988
A partner works alongside you, bringing capabilities you don't have, and helps you do the things you couldn't do alone.

00:41:41.218 --> 00:41:43.828
Implication two is about time.

00:41:45.028 --> 00:41:47.608
Principals need real time to learn and experiment.

00:41:47.698 --> 00:41:51.298
Not 45-minute sessions crammed in between other obligations.

00:41:51.628 --> 00:42:02.038
Cal Newport defines deep work as professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limit.

00:42:02.548 --> 00:42:07.558
These efforts create new value, improve your skill, and are hard to replicate.

00:42:07.678 --> 00:42:09.538
That's from his book Deep Work.

00:42:10.318 --> 00:42:13.078
When principals say they need time, this is what they really need.

00:42:13.228 --> 00:42:20.368
The benefit of this six hour workshop was that principals had the time to use the tools away from their campus and their everyday distractions.

00:42:21.418 --> 00:42:25.228
During the afternoon, they had most of the time just to work on their problem.

00:42:25.348 --> 00:42:31.858
They talked about it, took breaks, came back to it, worked with AI tools, took another break and came back to it again.

00:42:32.458 --> 00:42:36.778
This can be described as working on the school rather than merely working in it.

00:42:37.318 --> 00:42:46.828
Lecture only or tool focused workshops are unlikely to produce substantial changes because participants lack the time to practice during these sessions.

00:42:47.218 --> 00:42:52.318
The experience of deep work and time to do it was the most valuable part for principals in this workshop.

00:42:54.208 --> 00:42:58.888
The third implication comes directly from observing that affective roller coaster.

00:42:59.188 --> 00:43:00.928
We need to normalize this journey.

00:43:01.018 --> 00:43:02.518
Frustration isn't a failure.

00:43:02.518 --> 00:43:05.878
It's a sign that you're doing something that's hard enough to matter.

00:43:06.628 --> 00:43:11.728
The concept of productive struggle is often discussed in education, but seldom applied to adult learning.

00:43:12.268 --> 00:43:18.478
This workshop provided such a struggle resulting in participants feeling more adept at using AI for innovation.

00:43:19.423 --> 00:43:22.783
This has practical implications for how we design professional development.

00:43:22.993 --> 00:43:26.053
We should tell participants upfront, you will get frustrated.

00:43:26.353 --> 00:43:28.813
Your first prototype will probably break.

00:43:28.933 --> 00:43:29.593
That's normal.

00:43:29.593 --> 00:43:30.883
That's how this works.

00:43:31.723 --> 00:43:34.693
And then we should build in support for those low moments.

00:43:34.873 --> 00:43:44.323
Peer collaboration, facilitator check-ins, permission to take breaks and open-ended work time with a deadline for some known unfinished product.

00:43:45.343 --> 00:43:52.033
If we don't prepare people for the struggle, they'll interpret their frustration as evidence that they're not cut out for this.

00:43:52.333 --> 00:43:58.843
But if we frame it as an expected part of the process, they're much more likely to push through to the breakthrough they're seeking.

00:44:00.403 --> 00:44:04.723
Let me be honest about limitations, because the study has several of them.

00:44:04.993 --> 00:44:09.463
The sample was small, just 10 responding participants, 11 total.

00:44:09.793 --> 00:44:14.803
And they represent a self-selected group of innovators willing to attend a six hour workshop on a Saturday.

00:44:15.373 --> 00:44:18.493
Their willingness to give up personal time suggests they were open

00:44:19.033 --> 00:44:23.653
to change more than the average principal, which also limits transferability.

00:44:24.613 --> 00:44:29.803
I, Jethro Jones, conducted the full-day presentation, with my own biases embedded in the design.

00:44:30.223 --> 00:44:41.983
I have a clear bias toward innovation and just creating stuff instead of efficiency because I believe certain practices shouldn't even be done in schools, even if they've been done for years.

00:44:42.703 --> 00:44:51.073
I also emphasized what I view as the hypocrisy of AI for me, but not for thee, when teachers use AI but prohibit students from using it.

00:44:51.913 --> 00:44:53.293
There's no control group in this.

00:44:53.293 --> 00:44:58.033
The measures were self-reported and I didn't directly observe subsequent school level changes.

00:44:58.483 --> 00:45:00.043
All of that was beyond the scope.

00:45:00.523 --> 00:45:14.563
And finally, the study did not address the environmental, political, or societal costs of AI infrastructure or financial costs of ongoing subscriptions, as well as many other things, which are also important considerations, but are beyond the scope of this project.

00:45:16.033 --> 00:45:22.633
These findings are exploratory and not definitive, and to me, they're proof of a concept and not a final answer.

00:45:23.683 --> 00:45:26.443
Despite those limitations, though, I contend this work matters.

00:45:26.503 --> 00:45:40.963
The purpose was not to produce generalizable claims, but to closely examine how a small group of innovator principals experienced a deliberately designed professional learning environment that positioned AI as a resource for innovation rather than mere efficiency.

00:45:41.533 --> 00:45:56.443
The combination of quantitative shifts in self-reported understanding, qualitative themes about supports and barriers, and a rich description of the training offers a grounded, practice-oriented account of what it could look like when principals begin to use AI to tackle wicked problems.

00:45:57.073 --> 00:45:59.923
For a dissertation in practice, that situated,

00:46:00.478 --> 00:46:03.778
experience-near knowledge is precisely the contribution.

00:46:04.198 --> 00:46:17.758
It provides a concrete, replicable model, surfaces real constraints and possibilities, and points the way for future studies that can extend, refine, and test these ideas with larger samples over longer periods.

00:46:18.238 --> 00:46:30.208
As I've done this work for the past eight years in my own consulting, I've seen how much more valuable this type of approach is for any kind of problem based training that is happening in schools.

00:46:32.338 --> 00:46:34.888
Based on this, I do have five recommendations.

00:46:35.098 --> 00:46:41.828
First, we should design professional development that is sustained and not one-shot

00:46:41.888 --> 00:46:47.378
workshops. Single PD sessions are insufficient, and follow-up is crucial.

00:46:48.008 --> 00:46:52.418
A subsequent session with this group is scheduled for one year after the initial training.

00:46:53.258 --> 00:46:58.778
Second, focus professional development on authentic problems, not on AI tools in the abstract.

00:46:59.213 --> 00:47:02.633
When focusing on problems, principals learn how to solve their problems.

00:47:03.023 --> 00:47:09.803
This was evident in participants' projects, which included various tools, some of which were not initially suggested by me.

00:47:11.693 --> 00:47:20.813
That's a lot more scary for the presenter, by the way, when you go in without knowing the exact answer. I didn't include that in here, but that's a good point that I keep thinking of.

00:47:21.623 --> 00:47:25.493
Third, we should integrate AI leadership into principal preparation programs.

00:47:25.493 --> 00:47:29.453
Now, our university programs are probably behind.

00:47:29.513 --> 00:47:33.743
We are preparing principals for a world that's changing faster than our curriculum is.

00:47:34.793 --> 00:47:38.093
Fourth, provide protected time for experimentation.

00:47:38.183 --> 00:47:40.378
Deep work requires protected space.

00:47:41.278 --> 00:47:52.973
And fifth, again, this is typically out of the scope of professional development, but we should conduct cost-benefit analyses that include environmental and other implications.

00:47:53.453 --> 00:47:56.873
AI isn't free, not financially, and not in many other ways.

00:47:58.833 --> 00:48:03.128
Future research should also explore longer timeframes and diverse contexts.

00:48:03.128 --> 00:48:09.338
Conducting similar studies with larger and more diverse samples across various stages and types of schools would be advantageous.

00:48:09.788 --> 00:48:18.968
A longitudinal follow-up within six to 12 months would determine which practices persist and whether the workshop's impact endured beyond the day of training.

00:48:19.628 --> 00:48:25.298
Although this dissertation did not include such follow-up, I plan to engage with these principals at a future conference.

00:48:26.138 --> 00:48:35.323
We would also benefit from systematic measurement of affective responses toward AI, attitudes, anxiety, and confidence, before, during, and after PD sessions.

00:48:36.463 --> 00:48:42.978
And we need to compare different professional development designs to reveal which methods yield more durable changes.

00:48:43.398 --> 00:48:48.648
However, one of my goals for this training with innovators was to help them see how they could use AI differently.

00:48:49.308 --> 00:48:59.238
And they would hopefully take some of those lessons back to their schools and communities and suggest using AI for innovation instead of just efficiency when the conversation comes up.

00:49:01.188 --> 00:49:04.578
The purpose of this dissertation in practice is to change practice.

00:49:04.668 --> 00:49:08.748
For me, that practice is changing how professional development works for principals.

00:49:09.048 --> 00:49:15.348
There's a plethora of research that says PD should be job embedded and focused on the real challenges principals face.

00:49:15.858 --> 00:49:20.808
If I had to distill this entire dissertation in practice to two sentences, here's what I'd say.

00:49:21.468 --> 00:49:25.698
Focus on problems, not tools, and give time for deep work.

00:49:26.748 --> 00:49:27.918
The tool is secondary.

00:49:28.038 --> 00:49:29.298
The problem is primary.

00:49:29.358 --> 00:49:41.418
Start with what you're trying to solve and then find the tool that helps. Real learning, especially learning that involves creativity and problem solving, requires sustained attention, and we can't shortcut this.

00:49:41.448 --> 00:49:43.248
AI won't do that for us either.

00:49:44.028 --> 00:49:56.148
I believe two critical issues are often overlooked in professional development: focusing on solving problems rather than merely teaching tools, and providing sufficient time during workshops for human-centered approaches to real issues.

00:49:58.218 --> 00:50:01.998
AI will keep evolving faster than we can track it.

00:50:01.998 --> 00:50:15.558
Since I started this project, OpenAI has released ChatGPT 5, 5.1, 5.2, and after my practice presentation last week, they released ChatGPT 5.3.

00:50:16.098 --> 00:50:17.688
The pace of change is remarkable.

00:50:17.688 --> 00:50:19.908
Every day as I wrote this, there was more to add.

00:50:20.298 --> 00:50:27.408
By the time you read this dissertation, when it is published, it will be obsolete, and that is part of what we're dealing with.

00:50:27.888 --> 00:50:34.308
But the core principle holds: individuals are leveraging AI primarily for efficiency rather than innovation.

00:50:34.758 --> 00:50:39.078
Focusing on expediting tasks without questioning the necessity of such tasks.

00:50:39.618 --> 00:50:47.688
Simply asking principals to reflect on their practice and consider whether they should even take a certain action can catalyze innovation.

00:50:48.228 --> 00:50:54.198
Our commitment should remain focused on using AI to enhance human capacity rather than diminish it.

00:50:54.618 --> 00:50:58.758
Principals play a vital role in stewarding this objective within their schools.

00:50:59.208 --> 00:51:01.938
I'm betting on innovation, and I hope you will too.

00:51:03.648 --> 00:51:09.738
I wanna thank my committee, Tom, Mindy, and Linda for being excellent supporters and guiding me through this.

00:51:09.948 --> 00:51:13.608
Tom, you've been a mentor for years and I'm grateful you invited me into this program.

00:51:14.058 --> 00:51:24.768
Mindy, your methodological guidance was invaluable, and your gargantuan behind-the-scenes efforts that helped me and my cohort through this process have been amazing.

00:51:25.068 --> 00:51:30.978
And Linda, you pushed my thinking in ways that I didn't expect and helped me to see the technology with clearer eyes.

00:51:31.428 --> 00:51:35.208
I also want to thank the Wyoming principals who gave up their Saturday to participate in this study.

00:51:36.198 --> 00:51:38.658
And thank you to my wife Stacy, and.

00:51:40.803 --> 00:51:41.733
And my kids.

00:51:43.293 --> 00:51:47.313
I know this has been tough and I'm so grateful to you.

00:51:49.323 --> 00:51:55.333
Anything good that has happened in my life has come through my Savior, Jesus Christ, and I don't do anything without consulting him.

00:51:56.203 --> 00:52:03.523
And he was very clear this doctoral program would be worthwhile even when it didn't make a lot of sense.

00:52:05.053 --> 00:52:07.723
These are the key references I've cited in this presentation.

00:52:07.723 --> 00:52:12.493
The full reference list is in my dissertation document and on my website, drjethro.com.

00:52:13.543 --> 00:52:19.013
I've presented proof of concept for innovation-focused AI professional development for principals.

00:52:19.133 --> 00:52:22.433
I've shared what worked, what got in the way and what we might do next.

00:52:23.033 --> 00:52:30.243
Not a single person left my presentation with a fully completed solution, and none of them were disappointed by that.

00:52:31.213 --> 00:52:36.823
On the contrary, they were eager to continue working on it and try new things that they had not yet considered.

00:52:37.438 --> 00:52:43.618
During the showcase portion, each person said something similar to, I know what else I need to do to finish this.

00:52:45.088 --> 00:52:46.948
That's exactly where I want people to be.

00:52:47.158 --> 00:52:50.458
Now, I welcome your questions, challenges, and feedback.

00:52:51.208 --> 00:52:51.508
Thank you.

00:52:56.158 --> 00:52:58.818
Can you close your PowerPoint so we can see everybody?

00:52:59.013 --> 00:53:00.078
I sure can.

00:53:00.918 --> 00:53:01.998
And then let me start off.

00:53:02.028 --> 00:53:03.948
that was, that was wonderful.

00:53:04.028 --> 00:53:04.388
yeah.

00:53:04.478 --> 00:53:05.258
Really nice.

00:53:05.308 --> 00:53:10.458
so Linda, Mindy, do you have any questions or comments before I open it to the group?

00:53:11.508 --> 00:53:13.638
Sure, I'll go ahead and kick us off.

00:53:13.638 --> 00:53:15.258
Jethro, really well done.

00:53:15.318 --> 00:53:23.808
A significant improvement from our practice run, with a much more academically rigorous focus, which is what we asked of you, and I appreciate that.

00:53:24.118 --> 00:53:36.458
I will maintain what I said at the beginning: what you've produced here is a significant contribution to the field, because, yeah, innovation is definitely a way we could be using this technology, and more people need to think of it that way.

00:53:36.488 --> 00:53:43.458
We need to approach it from this perspective of what's the problem you're trying to solve, and for whom, as opposed to what can this tool do.

00:53:44.408 --> 00:53:48.998
Because the tech companies that I've worked for, they build the tech, right?

00:53:48.998 --> 00:53:51.368
And it's just like, because they can.

00:53:51.698 --> 00:53:57.728
But the real issue for humanity is how are we gonna use this tool for the betterment of humanity?

00:53:57.728 --> 00:53:57.998
Right.

00:53:57.998 --> 00:54:07.478
And I love that you included the distinction between replacement technology versus assistive technology, because that is totally my job.

00:54:07.568 --> 00:54:09.608
So I wanted to thank you for including that.

00:54:09.863 --> 00:54:12.023
Yes, you have inspired me greatly in that arena.

00:54:12.023 --> 00:54:12.328
Thank you.

00:54:14.243 --> 00:54:14.873
Thank you, Linda.

00:54:14.873 --> 00:54:15.413
Mindy,

00:54:17.723 --> 00:54:17.843
you're.

00:54:22.953 --> 00:54:25.908
I just thought it was a wonderful presentation.

00:54:27.063 --> 00:54:42.003
I, you know, wanna have a whole conversation with you about so many of these things, but the key, I think, is that what you discovered, and what you found, and the implications go beyond the technology.

00:54:42.223 --> 00:54:56.973
So I really do believe that, even though AI is progressing faster than any of us can follow, certainly faster than we could publish, your findings and your insights are really valuable to education.

00:54:57.273 --> 00:55:06.323
And the way that you have boiled it down to your two major insights, I think, really contributes to the field.

00:55:06.623 --> 00:55:20.183
And I'd like to know, just knowing how good you are at dissemination, how you are planning on beating that publication bias that we have that takes so long.

00:55:20.183 --> 00:55:21.923
What are you gonna do with this next?

00:55:22.418 --> 00:55:27.818
Well, this recording is gonna go out on my podcast, Transformative Principal, probably this Sunday.

00:55:28.178 --> 00:55:29.588
And so that will be out.

00:55:29.648 --> 00:55:47.258
And then my dissertation will be published on my website as soon as I get it back from my editor, for people to be able to see and access as soon as possible,
because, you know, that's what I've been doing the whole entire program: all of my assignments, I publish them on the web as I turn them in.

00:55:47.258 --> 00:55:52.368
So, I believe in that stuff, getting out there and not being stuck behind a long process.

00:55:52.368 --> 00:55:56.328
And, and so I'm gonna take a bias towards that as much as possible.

00:55:57.073 --> 00:56:04.278
Well, Jethro, I'd like to invite you to present to my class of 22, two weeks from now.

00:56:04.458 --> 00:56:05.028
So,

00:56:05.088 --> 00:56:05.538
got it.

00:56:06.018 --> 00:56:06.528
great.

00:56:07.698 --> 00:56:08.268
Go ahead Tom.

00:56:09.388 --> 00:56:10.533
I'm sorry, Mindy, go ahead.

00:56:10.668 --> 00:56:11.958
No, I was just saying go ahead, Tom.

00:56:12.948 --> 00:56:16.578
So let me make one comment before I open it to the crowd.

00:56:16.578 --> 00:56:18.598
And, I'm gonna take a different tack.

00:56:18.598 --> 00:56:20.878
I too am sitting here just delighted.

00:56:20.878 --> 00:56:22.048
Really, really pleased.

00:56:22.378 --> 00:56:24.958
But what I wanna talk about is Jethro's grit.

00:56:25.558 --> 00:56:34.388
I write a lot about social emotional learning, and I gotta tell you, I've been with him through this journey and there were lots of times when he could have said, Hey, it's not worth it.

00:56:34.388 --> 00:56:35.168
I don't need it.

00:56:35.528 --> 00:56:36.338
But he didn't do that.

00:56:36.338 --> 00:56:38.528
He plugged away, he worked.

00:56:38.528 --> 00:56:48.518
And so what you see today, which is a thoughtful, insightful, really well done presentation, it's not only due to his intellect, it's due to the fact that he had grit.

00:56:48.548 --> 00:56:50.108
He hung in and he pushed forward.

00:56:50.108 --> 00:56:51.918
So, good for you, Jethro.

00:56:52.038 --> 00:56:57.018
So, anybody else, feel free to, I guess, Jethro, raise their hand, or what?

00:56:57.018 --> 00:57:00.288
We, we would welcome comments or questions from anybody.

00:57:02.448 --> 00:57:03.258
Go ahead, Gina.

00:57:06.038 --> 00:57:08.348
First, thank you for having me.

00:57:08.348 --> 00:57:09.638
This was really a treat.

00:57:09.638 --> 00:57:12.008
This is the first UMSL dissertation that I've seen.

00:57:12.378 --> 00:57:15.078
And I, too, was very, very impressed.

00:57:15.498 --> 00:57:15.853
I don't know very much at

00:57:16.278 --> 00:57:19.428
all about AI, so you really have educated me on that.

00:57:19.928 --> 00:57:37.328
As a retired high school principal, I would've loved to have had the innovative bug to use the tools to solve problems, but what I really was impressed with in the dissertation was how you kind of uncovered the trap of professional development for principals.

00:57:38.108 --> 00:57:54.188
And I think my question, or what I'd like to have maybe a follow-up conversation with you on, is how do we change the mindset from productive struggle is great for kids, but it's not okay for me as the adult leader of the building.

00:57:54.608 --> 00:58:13.448
You know, how do we make that professional development, that growth opportunity that you described in your dissertation, and really change that paradigm from sit and get to that exploration mode, putting adult leaders

00:58:13.778 --> 00:58:16.208
back into that learning seat.

00:58:17.608 --> 00:58:18.118
yeah.

00:58:18.188 --> 00:58:27.788
My 700 episodes of Transformative Principal and the masterminds that I've run for the last eight years have been focused solely on that.

00:58:27.848 --> 00:58:35.948
And when we take ego out of the equation and give people opportunities to be vulnerable, then that is what happens.

00:58:35.948 --> 00:58:46.748
And the point that I mentioned in the dissertation is that when people are tasked with problems first, then they have the opportunity to do that.

00:58:46.748 --> 00:58:51.308
And that really is key that you have to be open about the problems that people are facing.

00:58:51.548 --> 00:58:59.408
And if you are, then people can drop those barriers and have that kind of a professional development experience.

00:59:02.083 --> 00:59:02.353
Great.

00:59:02.353 --> 00:59:02.593
Thank

00:59:02.668 --> 00:59:04.558
I agree with you a hundred percent.

00:59:04.558 --> 00:59:10.168
I just have never experienced a professional development like that, you know, in my career, which is unfortunate.

00:59:10.748 --> 00:59:12.158
Yeah, that is unfortunate.

00:59:12.158 --> 00:59:13.928
And that is what a lot of principals have.

00:59:14.878 --> 00:59:20.668
And that is why I created my mastermind approach to do just that and give people opportunities for that.

00:59:20.668 --> 00:59:26.358
So I've seen that all too clearly in the professional development provided to me.

00:59:26.748 --> 00:59:35.118
And the reason that I started my podcast in 2013 was because I was an assistant principal and didn't think that I was getting the kind of professional development I wanted.

00:59:35.478 --> 00:59:37.158
And so I decided to make my own.

00:59:37.308 --> 00:59:43.638
And I've heard too many principals say that they haven't had that kind of PD, and that really is a tragedy.

00:59:43.758 --> 00:59:47.808
So I, I definitely want to do my part to help make that happen.

00:59:49.068 --> 00:59:49.878
it is possible.

00:59:50.758 --> 00:59:51.188
Thank you.

00:59:51.423 --> 00:59:53.403
Are there other questions, comments?

00:59:57.678 --> 00:59:58.308
Go ahead Trish.

00:59:58.308 --> 00:59:58.788
Thank you.

01:00:00.763 --> 01:00:08.628
I just wanted to say, so, I've never been a principal, but I was a teacher for 23 years and I've taught all the

01:00:08.658 --> 01:00:11.808
grades from preschool all the way through middle school.

01:00:12.468 --> 01:00:14.898
And I've never had a principal.

01:00:14.898 --> 01:00:27.468
I've had some wonderful principals, but I will tell you in the same school when a principal changes and someone else comes in that is not as strong or that is stronger, I mean the whole school changes.

01:00:27.888 --> 01:00:39.078
I mean, I think people don't understand just how important the principal is on campus and sometimes it's this us, them mentality instead of, we're all on the same team here.

01:00:39.078 --> 01:00:44.088
This is our, this is our school, this is our community, this is our safe space for students.

01:00:44.508 --> 01:00:50.868
And I feel like this is what you're creating, kind of creating this vision for people that this is possible.

01:00:51.618 --> 01:00:53.358
And, and I really appreciate that.

01:00:53.358 --> 01:01:06.018
And a lot of these concepts that you've put into your training, and that I've listened to over a while, you know, and I've been a guest on your podcast, and I realized more and more as you were going through a lot of the things

01:01:06.568 --> 01:01:10.858
in your presentation, I was like, yeah, I'm doing this.

01:01:10.858 --> 01:01:19.258
Like in my presentation, I actually have them put in the prompt for helping kids define the purpose for their learning.

01:01:19.708 --> 01:01:34.168
I spent hours putting together a really solid prompt, and we usually use Gemini, sometimes ChatGPT, but we put in the prompt and then it comes up with 10 to 15 ways that a student can actually use whatever they're gonna be learning about in their lives.

01:01:34.168 --> 01:01:39.178
And they get to choose either something on the list or something they come up with themselves.

01:01:39.478 --> 01:01:46.078
And I thought, you know, I got these kinds of concepts, this idea to do this from you.

01:01:46.138 --> 01:01:50.518
'cause I was just having kids brainstorm and they were, you know, getting confused.

01:01:50.518 --> 01:01:51.478
They weren't really sure.

01:01:51.478 --> 01:01:53.788
They didn't have enough background information.

01:01:53.788 --> 01:01:54.148
They didn't.

01:01:54.448 --> 01:02:00.118
And I thought, you know, I've been worried about AI, like, writing papers for kids and that kind of thing.

01:02:00.118 --> 01:02:01.588
'cause that's what was in the news.

01:02:01.978 --> 01:02:05.578
But what I really got from you was, wait a minute, turn that on its head.

01:02:06.178 --> 01:02:09.748
How about if you have AI come up with, here's a whole bunch of ways.

01:02:09.748 --> 01:02:12.298
I mean, that was you.

01:02:13.378 --> 01:02:31.708
So I just wanted to acknowledge what you're doing, because I don't know if you've listened to his episodes of Transformative Principal, but they're really,
really helpful, and now I am consulting with principals and working with teachers on this project thing that I'm doing, and it's just been invaluable.

01:02:31.708 --> 01:02:33.373
So thank you for what you're doing.

01:02:34.238 --> 01:02:34.953
Thank you.

01:02:37.693 --> 01:02:38.173
Anybody else?

01:02:39.243 --> 01:02:39.843
Thank you, Dan.

01:02:39.843 --> 01:02:40.413
Go ahead.

01:02:40.503 --> 01:02:41.253
And then Aaron,

01:02:43.348 --> 01:02:44.008
Congratulations.

01:02:44.008 --> 01:02:44.548
Well done.

01:02:44.648 --> 01:02:48.518
as someone who's studying AI and school leadership, I just wanted to thank you.

01:02:49.208 --> 01:02:53.078
There is a huge gap in the literature around AI and how school principals are tackling it.

01:02:53.573 --> 01:02:57.743
There is a significant gap in the literature in regards to AI and school principals.

01:02:57.773 --> 01:02:59.543
we're looking at teachers, we're looking at students.

01:02:59.923 --> 01:03:04.423
And I've seen that in my research, and I appreciate you adding to the literature here, and wish you well.

01:03:04.423 --> 01:03:08.083
Congratulations, and you've been a great friend and mentor over the years, and, really proud of you.

01:03:08.383 --> 01:03:08.773
Good job.

01:03:09.283 --> 01:03:09.613
Thank you.

01:03:09.613 --> 01:03:10.273
Appreciate it.

01:03:10.303 --> 01:03:17.738
And I wanted to finish early enough that you'd be able to cite me, Dan. So, all right, go ahead, Eric.

01:03:20.088 --> 01:03:20.608
Thanks, Jethro.

01:03:20.678 --> 01:03:20.968
Yeah.

01:03:21.078 --> 01:03:22.598
I just wanna say very well done.

01:03:22.818 --> 01:03:25.623
I'm impressed at how this has come together.

01:03:25.623 --> 01:03:28.503
I've seen different parts of it throughout the stages, but this is a really

01:03:28.623 --> 01:03:34.653
wild experience, to see this version today, and I wanted to comment on the marketing part.

01:03:34.833 --> 01:03:42.693
We all know that's true, but I mean, there was literally a Super Bowl commercial yesterday that somebody paid millions of dollars for that said, use our AI tool, take the day off.

01:03:43.353 --> 01:03:53.403
And as someone who's been working with AI in education, that's always it: people are like, it could just write my emails, and it's turning what I already did into an automated thing.

01:03:53.403 --> 01:03:59.643
And I have met very few people who approach it like you, who say, but should you even be doing that?

01:03:59.643 --> 01:04:06.603
Like, you're applying an amazing potential thing to just do more of what you already did.

01:04:07.653 --> 01:04:10.803
And I know that can be hard to get people who want to hear that message.

01:04:10.803 --> 01:04:13.533
'cause they always say, you know, well, what would that even look like?

01:04:13.533 --> 01:04:15.243
How do I reimagine things?

01:04:15.723 --> 01:04:25.483
But to encourage you, as both a friend, someone who's learned a lot from you, and someone who's also in education: we need more people who have that framework of

01:04:26.488 --> 01:04:27.598
reimagining things.

01:04:27.598 --> 01:04:29.038
Don't just spin the hamster wheel

01:04:29.038 --> 01:04:30.028
you're already on, faster.

01:04:30.058 --> 01:04:34.443
'cause that's 99% of the messaging and the products and the application of it that I see.

01:04:36.368 --> 01:04:37.783
Yeah, I appreciate that.

01:04:37.783 --> 01:04:38.353
Thank you.

01:04:40.423 --> 01:04:41.173
Go ahead, Linda.

01:04:42.668 --> 01:04:44.263
I just wanna throw one more thing in there too.

01:04:44.683 --> 01:04:53.343
I mean this, the approach that you've taken was actually the same approach that I took, but with children like aged 14 to 17 back in 2019.

01:04:53.613 --> 01:04:56.433
So this whole focus on what is the problem you're trying to solve.

01:04:56.463 --> 01:05:08.103
What I love is that you're introducing it to principals because, again, as I believe Gina had said, the principal, you know, changes the whole dynamic and tone of the school.

01:05:08.103 --> 01:05:13.053
And if you start there, then it can encourage the trickle down.

01:05:13.173 --> 01:05:19.923
'cause kids are already, especially, you know, digital natives are already gonna be wanting to get in and use the tools, how they want to use them.

01:05:20.133 --> 01:05:25.203
It's the adults that have lost that, who feel that they need to be taught by someone.

01:05:25.383 --> 01:05:28.623
I mean, Jethro, you went out and just played around.

01:05:28.773 --> 01:05:32.853
I go out and just play around, but a lot of teachers, because of their profession

01:05:33.273 --> 01:05:37.223
and time and other things, are just, you know, tell me, show me.

01:05:37.523 --> 01:05:44.783
Whereas now you're opening up this whole new field of possibilities because the assistive aspect of technology is the thing.

01:05:44.843 --> 01:05:51.563
And if you think this is cool, wait till you see what's coming out in my newsletter tomorrow in terms of how you can expand your senses.

01:05:51.713 --> 01:05:53.063
'cause that's where I'm focusing now.

01:05:53.483 --> 01:06:02.363
People who can't see, people who can't hear, and even people who can see but don't see like other animals can see, because we don't even think about that, right?

01:06:02.363 --> 01:06:06.653
So, I love that Erin touched on it, and your emphasis is completely here.

01:06:06.803 --> 01:06:12.473
How do we extend the capabilities that we already have as opposed to just automate things that we can easily do?

01:06:12.533 --> 01:06:21.053
Because, especially if you consider the environmental, ramifications of using this technology, do you really wanna, you know, waste all that water just to write an email for you?

01:06:21.473 --> 01:06:26.003
Or can you justify the use because of what you're creating?

01:06:26.213 --> 01:06:29.483
And that's the question that we need to ask, again, beyond the scope of your dissertation.

01:06:29.483 --> 01:06:30.833
But we do need to ask that question.

01:06:30.863 --> 01:06:32.693
And that's the research that I'd like to see going forward.

01:06:32.693 --> 01:06:35.993
For those of you who are doing more research, I wanna see systematic replications.

01:06:36.023 --> 01:06:38.453
I wanna see the application and transfer of training.

01:06:38.453 --> 01:06:42.083
And you mentioned this already, Jethro, so I'm just repeating things that you'd already said.

01:06:42.383 --> 01:06:49.403
But I also want people to imagine how assistive technology extends the facilities and abilities of the user.

01:06:50.063 --> 01:06:54.143
Instead of just relying on, you know, this is the tech that the company pushed out to me.

01:06:54.323 --> 01:06:58.253
Now you are empowered to envision and build your own prototypes.

01:06:58.313 --> 01:07:13.163
And if you don't have the technical skills to take it from prototype to MVP, then work with people who can, because that's what I see as the great equity piece here: you know, I learned how to code when I was 14, but now anybody can do these things as long as you have the imagination for it.

01:07:13.283 --> 01:07:14.993
So that's the big piece.

01:07:15.053 --> 01:07:18.173
And I wanna challenge all of you to start thinking that way about tech.

01:07:19.233 --> 01:07:38.198
And there are two people I know who have expanded or completely created their businesses based on being able to tell an AI what they want something to be able to do, and then having it create it for them at a prototype stage, and then hiring an expert to take it to the next level.

01:07:38.198 --> 01:07:40.808
And that is, that is very powerful.

01:07:40.808 --> 01:07:44.948
And these two people could not have done this

01:07:45.698 --> 01:07:52.688
four years ago. In fact, they had the idea then but couldn't bring it to pass because it was too expensive, too prohibitive.

01:07:52.688 --> 01:08:04.808
And now they have created whole businesses around it that are flourishing because they're serving people in a way they knew would work, but they just didn't have the resources to make it happen.

01:08:04.838 --> 01:08:07.568
Now they do, and it's an amazing thing to watch.

01:08:07.598 --> 01:08:08.228
I love it.

01:08:11.588 --> 01:08:11.828
All right.

01:08:11.828 --> 01:08:12.998
Any other comments?

01:08:16.438 --> 01:08:26.373
Okay, Jethro, why don't you close everybody else out, and you can put Linda, Mindy, and me in a room, and we'll talk, and then we will leave the room when we're ready to talk with you.

01:08:27.233 --> 01:08:28.368
Okay, sounds good.

01:08:28.398 --> 01:08:29.508
I'll open your room.

01:08:30.438 --> 01:08:34.068
Thank you everybody who was here, I appreciate it.

01:08:34.458 --> 01:08:35.148
And

01:08:35.303 --> 01:08:36.143
inviting us.

01:08:36.648 --> 01:08:37.308
You're welcome.

01:08:37.308 --> 01:08:38.058
Thank you for being.