WEBVTT

NOTE
This file was generated by Descript 

00:00:11.537 --> 00:00:17.557
Jethro Jones: Welcome to Transformative Principal, where I help you stop putting out fires and start leading.

00:00:18.097 --> 00:00:19.557
I am your host, Jethro Jones.

00:00:19.557 --> 00:00:22.317
You can follow me on Twitter at Jethro Jones.

00:00:34.931 --> 00:00:37.121
Welcome to Transformative Principal.

00:00:37.121 --> 00:00:37.606
I am.

00:00:38.261 --> 00:00:40.091
Boy, I'm excited for this conversation today.

00:00:40.091 --> 00:00:47.291
I've got Brent Zirkel on and he is the elementary principal at Mary Welsh Elementary in Williamsburg, Iowa.

00:00:47.771 --> 00:00:59.501
He's got about 650 students, and then of course, like most of you principals, he does a bunch of other stuff on top of that. At the district level, he's the ELL director, migratory education program director,

00:00:59.881 --> 00:01:12.361
Preschool program director, and more. Obviously, um, we are gonna talk about how he's using AI, uh, to create curriculum for his students.

00:01:12.541 --> 00:01:15.121
And we'll get into a bunch of other stuff too, but we're gonna start there.

00:01:15.121 --> 00:01:17.401
So, Brent, welcome to Transformative Principal.

00:01:17.401 --> 00:01:18.271
Thanks for being here.

00:01:18.766 --> 00:01:19.846
Brent Zirkel: Thanks for having me on Jethro.

00:01:19.846 --> 00:01:20.806
It's great to be here with you.

00:01:21.461 --> 00:01:22.361
Jethro Jones: Man, I'm excited.

00:01:22.361 --> 00:01:31.601
We were on a, another Zoom talking about, um, AI, and you just nonchalantly said, oh yeah, I use AI to write curriculum.

00:01:31.601 --> 00:01:32.891
Tell, tell me about that.

00:01:34.046 --> 00:01:35.816
Brent Zirkel: So, um, in our building

00:01:35.816 --> 00:01:47.276
we have teacher teams and, um, you know, there are certain content areas that are more, um, controversial, I guess you could say, at times than others.

00:01:47.666 --> 00:01:48.476
And, um.

00:01:49.781 --> 00:02:00.671
We've had great growth in literacy since Covid, and we've done a lot of focus on literacy and math, but we felt as an elementary that we were kind of falling behind in science and social studies.

00:02:01.211 --> 00:02:14.081
And, um, social studies in particular seemed to be a hot-button issue, um, both politically and, um, just with concerns about, you know, people understanding what you're teaching, um, how you're teaching it.

00:02:14.531 --> 00:02:16.181
Um, at a state level.

00:02:16.181 --> 00:02:16.751
I don't know if

00:02:17.001 --> 00:02:18.741
you've experienced this, this too.

00:02:18.741 --> 00:02:24.531
But there were questions about, you know, are we teaching critical race theory and different things?

00:02:24.531 --> 00:02:31.131
Is social emotional learning the same thing as critical race theory? There were just a lot of questions about that sort of thing.

00:02:31.131 --> 00:02:45.201
So, um, we, as we were thinking about looking at a social studies curriculum, we, we talked about the importance of it being transparent so that when we, when we have this curriculum, it's not something that's in a book somewhere.

00:02:45.641 --> 00:02:49.391
That you take off of a shelf and that people are only able to view if they have questions about it.

00:02:49.391 --> 00:02:57.011
But we want it to be out in the public eye so that our parents have a full idea of what it is that we're teaching.

00:02:57.011 --> 00:03:02.291
They see the content and are able to, to work through that, able to have conversations with kids.

00:03:02.591 --> 00:03:06.731
Our school board, we wanted them to be able to see that our state officials, um.

00:03:07.406 --> 00:03:11.156
Just because we felt, um, I kind of had like a conviction actually.

00:03:11.171 --> 00:03:21.596
I, I took, I was with a group of, uh, students that went to Washington DC and while there, um, you know, my bachelor's degree was in social sciences.

00:03:21.986 --> 00:03:27.866
And as I'm walking around Washington, DC I'm thinking to myself, what a failure I've been as an, an individual and as a

00:03:27.866 --> 00:03:34.286
school leader in bringing, um, like civics to life and helping kids understand and be excited about.

00:03:34.796 --> 00:03:38.306
History and how it impacts them even today.

00:03:38.906 --> 00:03:43.016
And so when I came back from that trip, um, that's really when I started creating.

00:03:43.586 --> 00:03:47.306
And, um, I've been building websites for a while.

00:03:47.726 --> 00:04:00.766
Um, as a, a school teacher, I found that, um, yeah, I, I taught Spanish in high school, and I found that if I could put my Spanish content on a website, then I had greater, um, ability to reach kids.

00:04:01.006 --> 00:04:03.976
I had greater ability to help them practice at home.

00:04:04.516 --> 00:04:14.506
And, um, what I loved as a teacher was that it always put my best work out in front of me so that whenever I came back to it, I had another iteration where I could improve upon it.

00:04:14.506 --> 00:04:16.486
So I was always improving on my best work.

00:04:16.786 --> 00:04:20.716
And boy, you get two or three iterations as an educator and you've got some really good stuff.

00:04:20.716 --> 00:04:21.346
In my opinion.


00:04:21.956 --> 00:04:22.286
Jethro Jones: Yeah.

00:04:22.286 --> 00:04:28.136
Well, and Brent, I want to just, uh, touch on that for a minute too, because that was how I started teaching.

00:04:28.136 --> 00:04:32.846
Also that, uh, it's funny because I went at it from a little different way.

00:04:32.846 --> 00:04:36.416
I hated my students coming up and saying I was absent.

00:04:36.416 --> 00:04:37.136
What did I miss?

00:04:37.811 --> 00:04:39.821
Then me having to like put everything together.

00:04:40.121 --> 00:04:47.921
So I designed all of my lessons and materials so that students could do it all independently if they were absent.

00:04:48.161 --> 00:05:01.601
Which is so silly when I think back about it now because, um, because like, it's not like I had that many kids who were absent, but it was so annoying to me that I had to make a, a change because it just drove me nuts.

00:05:01.601 --> 00:05:04.091
But what was so good about it is exactly what you said.

00:05:04.391 --> 00:05:06.311
I had my best stuff out in front of me.

00:05:07.031 --> 00:05:18.551
I was intentionally designing my curriculum to be accessible to my students so that I didn't need to be there, that they could do it without me, that a parent could get on the website and check it out.

00:05:18.551 --> 00:05:28.736
And, um, and it just, it totally changed how I taught because I, I I was making sure that things were ready for them.

00:05:29.036 --> 00:05:31.856
My job planning wasn't done with a lesson plan.

00:05:32.246 --> 00:05:37.826
It wasn't done until it was published on the website for kids to be able to go and access themselves.

00:05:38.431 --> 00:05:39.361
Brent Zirkel: Absolutely.

00:05:39.451 --> 00:05:42.721
And that's, and it's amazing who else is accessing it.

00:05:42.721 --> 00:05:44.581
You know, I, I remember getting pings.

00:05:44.581 --> 00:05:46.471
I had a buddy and he was kind of doing the same thing.

00:05:46.591 --> 00:05:50.101
This kind of started as a, um, like a flipped classroom.

00:05:50.101 --> 00:05:51.721
Blended classroom type of thing.

00:05:52.141 --> 00:05:57.271
Um, and then we just kind of started using it as a tool that we used throughout the class and outside of the class.

00:05:57.271 --> 00:05:57.961
And it was just.

00:05:58.406 --> 00:05:59.426
Everything that we did.

00:05:59.996 --> 00:06:04.196
Um, but you get pings from Russia and from China and people get, you know, people just

00:06:04.196 --> 00:06:04.856
wanting to learn.

00:06:05.396 --> 00:06:15.056
Um, and I thought that was amazing and kind of also opened me up to, to see like if we are transparent and put our best work in front of us, that you will have a global audience.

00:06:15.971 --> 00:06:27.311
And that's powerful and also meaningful for kids, you know, if they can see that what you're putting out there, that you've got people in Russia that are so interested that they're wanting to, to learn about it or you know, from all different parts of the world, but.

00:06:27.851 --> 00:06:28.301
Jethro Jones: Yeah.

00:06:28.481 --> 00:06:51.881
And, and what that led to for me is, uh, collaborating with people during the 2008 presidential election in the United States with people from France, the Netherlands, Australia, Belgium,
and a couple other countries that I can't remember, but collaborating all across the world about this election that was happening in the United States that everybody was interested in.

00:06:52.181 --> 00:06:54.611
And I was an English teacher, so I wasn't even like.

00:06:55.001 --> 00:06:57.401
That wasn't even my quote unquote content area.

00:06:57.671 --> 00:07:02.471
But, uh, the good thing about being an English teacher is that it touches everything because it's about communication.

00:07:02.471 --> 00:07:05.621
So, so we could, we could write about anything in that year.

00:07:05.621 --> 00:07:13.661
We wrote about the election and did these collaborative, uh, writing assignments with kids in other countries, which was so, so cool.


00:07:14.561 --> 00:07:20.291
Brent Zirkel: Well, and as an educator, if you're willing to make yourself vulnerable by putting yourself out there in a, a public

00:07:20.551 --> 00:07:25.861
sense like that, it also motivates you to continue to create better and better things.

00:07:26.131 --> 00:07:37.651
And so like as I'm trying to populate my website with meaningful learning activities, then that got me into starting to create YouTube videos and making sure that it wasn't just some random person's face in front of my kids, but it was my face.

00:07:37.651 --> 00:07:41.671
And that they saw that, oh, Mr. Zirkel can make a, a video.

00:07:41.671 --> 00:07:42.631
Like he can be online.

00:07:42.631 --> 00:07:43.561
I could be online too.

00:07:43.561 --> 00:08:01.421
Like, it, it just creates, um, it switches the flip for kids, or it flips the switch for kids, in the sense that, um, so many of our kids are just consumers of technology and we need them to be producers and creators of technology.

00:08:01.421 --> 00:08:03.671
And that's also what excites me about AI.

00:08:05.020 --> 00:08:06.370
To go back to your original question,

00:08:06.940 --> 00:08:08.530
like how are we using it?

00:08:08.710 --> 00:08:09.940
We're, we're using it.

00:08:10.720 --> 00:08:11.980
We're creating a, a website.

00:08:11.980 --> 00:08:15.190
It's called, uh, RaiderSocialStudies.com.

00:08:15.910 --> 00:08:19.420
And on that website we are creating units.

00:08:19.420 --> 00:08:27.430
We've, we've worked as teams to identify like what are our priority standards, and then we've built proficiency scales, um, of how we'll assess those priority standards.

00:08:27.580 --> 00:08:33.310
And then we're working to create learning activities that we put, um, on the website for kids.

00:08:33.520 --> 00:08:37.690
And we use AI to create some of those learning activities.

00:08:38.200 --> 00:08:38.770
Um.

00:08:39.535 --> 00:08:42.385
And, you know, at different grade levels it looks very different.

00:08:42.385 --> 00:08:49.885
I work in a three-year-old preschool through sixth grade building, and in kindergarten, what they have on their website looks very different than in sixth grade.

00:08:50.725 --> 00:08:56.395
Um, but it's exciting, you know, like for me, I, I know a lot about social studies.

00:08:56.395 --> 00:08:57.745
It was my, my background.

00:08:58.165 --> 00:09:06.955
Um, but sometimes AI makes it easier to organize your thoughts and, and put it in a, um, consistent, um.

00:09:07.690 --> 00:09:15.160
Whether it's chronological or thematic or however you want to do it, um, AI can be assistive in that. We use ChatGPT for that at times.

00:09:15.400 --> 00:09:18.760
Sometimes it's just to be able to draw out details and then to be able to sift.

00:09:19.255 --> 00:09:31.375
You know, with AI, I like to think of that 80/20 rule where, you know, we'll put in our prompts and ask, you know, for, for information back, but then we're gonna take and make sure that 20% of what we're doing is our own work.

00:09:31.375 --> 00:09:33.205
We're gonna edit it, we're gonna pare it down.

00:09:33.655 --> 00:09:41.035
A lot of times what we do is we're, we're changing the Lexile level so it's accessible at different levels, um, for students.

00:09:41.275 --> 00:09:43.705
And again, you can use AI for, for that.

00:09:44.515 --> 00:09:45.115
Um.

00:09:45.790 --> 00:09:59.440
One of the, the things that I'm really proud of here recently is we created a Middle East unit and, um, geopolitics was one of my favorite things in, in college, and, um, just makes me understand the world in a different way.

00:09:59.440 --> 00:10:03.430
And we talked a lot about, um, we call 'em bad actors in geopolitics.

00:10:03.430 --> 00:10:05.170
So, um, you can take.

00:10:05.335 --> 00:10:17.695
Different people and look at the choices that they made and the situations they were in, and determine whether they're acting in good faith to help improve life for the people they were serving as leaders or whether they were doing things, you know, to benefit themselves.

00:10:18.205 --> 00:10:23.605
And so, um, we created an activity of claims and counterclaims using ai.

00:10:24.025 --> 00:10:30.655
And, you know, we used AI to generate a list of, of 14 different people in the Middle East over the last, you know, 25 years.

00:10:31.105 --> 00:10:34.615
And what were some of the things that could be considered, um.

00:10:35.665 --> 00:10:40.825
Claims that they were bad actors, and then what were things that would be counterclaims that would show good things that they did.

00:10:41.215 --> 00:10:54.175
So you're showing like a fuller aspect of people in history, um, so that the kids then are reading through it and they're making the choices of whether they believe someone was a bad actor or not, and then justifying their thoughts behind it.

00:10:54.175 --> 00:11:03.715
So, um, it's, it's, what I love about it is, you know, just earlier this week I went into the lunchroom and I was talking with some of our sixth graders.

00:11:04.075 --> 00:11:11.785
And one of our sixth grade boys, um, at the table called me over and wanted to talk to me about, um, Assad from Syria.

00:11:12.085 --> 00:11:21.835
And he wanted me to know that, um, because we made this a few months ago, that now it was outdated and that we, uh, we need to, um, you know.

00:11:22.585 --> 00:11:23.305
Fix that.

00:11:23.725 --> 00:11:36.385
And, but just to be able to have a conversation with a sixth grader about Syria and Assad and some of the decisions, um, that got him, you know, into power, what he did to maintain power, and then now why he's out of power.

00:11:36.775 --> 00:11:40.105
Like, talk about powerful conversations with kids.

00:11:41.040 --> 00:11:43.765
Jethro Jones: That the kids are not too young for either.

00:11:44.155 --> 00:11:44.485
That's.

00:11:44.590 --> 00:11:45.030
Brent Zirkel: Absolutely not.

00:11:45.670 --> 00:11:52.990
Jethro Jones: That's the amazing thing is that we, we think we need to put on kid gloves and not talk to them about what's really going on.

00:11:53.380 --> 00:12:00.400
But the reality is there are kids their age who are living through this and we have an opportunity to, to talk about it.

00:12:00.820 --> 00:12:09.610
And like, what I, what I love about this thing that you've created is that it's all accessible for parents to go in and see.

00:12:10.120 --> 00:12:11.680
And so they can go say.

00:12:11.940 --> 00:12:13.320
Well, I don't agree with this.

00:12:13.410 --> 00:12:15.720
And then they can say why they don't.

00:12:16.020 --> 00:12:19.020
But then you also have, like, here it all is out there.

00:12:19.050 --> 00:12:20.280
Like we're not hiding anything.

00:12:20.280 --> 00:12:24.990
So you can go see and you can, you can leave your, your comments and opinions and that's totally fine.

00:12:25.470 --> 00:12:33.120
Um, but at least it's out there, you know, and it's not hidden in a book on a shelf or in some curriculum document that's behind a password.

00:12:33.120 --> 00:12:39.570
Like, I can go on there and see all these things that you were just talking about and see where they're at and, and what it all looks like.

00:12:40.155 --> 00:12:45.975
Brent Zirkel: Well, and, and just even with the lesson itself, it's all about claims and counterclaims, so it's not like there's a right answer.

00:12:46.545 --> 00:12:48.255
It's the answer is complex.

00:12:48.315 --> 00:12:50.055
And so we're getting kids to think.

00:12:50.755 --> 00:12:54.145
At, at higher levels rather than memorize and regurgitate.

00:12:54.565 --> 00:12:56.185
And I think that that's what's powerful.

00:12:56.185 --> 00:13:02.185
And, and I love this Middle East unit just in that, you know, like our kids today, they weren't alive for 9/11.

00:13:02.610 --> 00:13:06.535
They, they, they're trying to understand the world that they're, that they're living in.

00:13:06.970 --> 00:13:16.030
And there are so many pieces, you know, we, we actually start the unit talking about, um, the three religions of the Middle East and just a general overview of what those religions are.

00:13:16.030 --> 00:13:17.650
There's similarities and differences.

00:13:18.160 --> 00:13:27.430
Um, then we go into like the creation of the state of Israel and how that came about, and then why there have been difficult feelings on different sides about that.

00:13:28.020 --> 00:13:31.590
Um, and then that brings us up into more of the modern day.

00:13:31.650 --> 00:13:33.660
Um, you know, the kids do learn about 9/11.

00:13:33.660 --> 00:13:37.470
We feel like as a district, um, that's a really important thing.

00:13:37.710 --> 00:13:49.710
And that was one of those things from that Washington, DC trip, where I just felt like there were certain things in history, like the Holocaust, 9/11, like I felt ashamed if our kids did not leave our elementary building.

00:13:49.950 --> 00:13:54.780
Having some sort of understanding about it, and if we could do it at a deeper level, even better.

00:13:55.860 --> 00:14:03.390
Um, so, so yeah, it's kind of a place, it's a, the website's become a storehouse of, of information that kids can then manipulate.

00:14:03.390 --> 00:14:04.770
You can have conversations.

00:14:04.980 --> 00:14:11.340
That whole idea of the shift in Common Core years ago was that we're not getting kids to memorize a certain piece of information.

00:14:11.340 --> 00:14:17.700
We're getting them to take a stance and then be able to justify through evidence and reasoning why that is a correct stance and.

00:14:18.355 --> 00:14:20.635
That's the world we live in.

00:14:20.635 --> 00:14:22.945
You know, like we're, we're doing that on a daily basis.

00:14:23.365 --> 00:14:26.065
So I just love how it mimics and, and creates, um.

00:14:28.060 --> 00:14:38.410
Those skills and like the communication skills that you were talking about too, the, because kids are, kids are not only studying that in social studies, but then they're writing about it in their, um, literacy courses.

00:14:38.770 --> 00:14:39.700
They're speaking it.

00:14:39.760 --> 00:14:42.190
Um, so they, they're having like some debates.

00:14:42.550 --> 00:14:51.520
Um, so they're getting up and publicly like taking a stance and, and hearing pros and cons against that and then responding like, these are all really valuable skills for kids.

00:14:53.215 --> 00:15:00.010
Jethro Jones: So the thing that I really like about this is that you're creating a, a hyper-localized curriculum, basically, that is specific.

00:15:00.010 --> 00:15:01.780
Specifically for your students.

00:15:02.230 --> 00:15:17.860
Now, in the past, this would be too big of a lift because the, the writing needed to be done by your teachers, and now with AI you have the ability to use sources that are out there.

00:15:17.860 --> 00:15:21.610
You have, you can put a source into the AI and do all that stuff.

00:15:21.970 --> 00:15:26.680
Talk to me about that philosophical approach of you creating this specifically for your.

00:15:27.115 --> 00:15:33.380
For your, for your community because anybody can come and like use this and have it work for theirs also.

00:15:34.255 --> 00:15:41.155
This is specifically with your kids in mind, your background, the things that you think are important with the teachers.

00:15:41.155 --> 00:15:58.495
So like you guys are really focusing on this is what we want, and you have Iowa state standards and you have the district, you know, things that you need to do, but this is really custom tailored for your specific school and not necessarily something that you could just take out and put in.

00:15:58.760 --> 00:16:04.520
You know, I'm in Spokane, Washington and just put in the Spokane Washington district and say, here's the same thing.

00:16:04.850 --> 00:16:08.180
Why is that important specifically to you guys?


00:16:08.965 --> 00:16:13.945
Brent Zirkel: Well, I think, you know, we live in a, um, traditional rural community.

00:16:14.245 --> 00:16:15.805
Um, it's a great place to be.

00:16:15.805 --> 00:16:17.935
It's a great place to, to raise a family.

00:16:18.475 --> 00:16:26.035
Um, and like American values are super important to our community, like patriotism is, is very important.

00:16:26.065 --> 00:16:30.745
Like Veterans Day, um, you know, there's flags all over in the, the square.

00:16:30.745 --> 00:16:37.795
There's things that kids, um, are doing, um, within the community to honor veterans and, um.

00:16:38.560 --> 00:16:41.710
There's just, there is that like American value.

00:16:41.710 --> 00:16:42.970
That's really important to us.

00:16:43.480 --> 00:16:54.910
And I think when we can arm our educators, um, with good content, um, then you have really interesting and deep discussions that they, they also want to have.

00:16:54.910 --> 00:17:01.600
That was one of the things I thought was really cool was, um, the feedback that I've got from the, the teachers.

00:17:01.750 --> 00:17:02.320
Um.

00:17:02.755 --> 00:17:24.055
So far has all been very positive in that they felt like they were learning along with the kids because, you know, as they're putting things into AI as prompts to get information back and
then coming back and, and having to filter that and, um, work through what is like most important to be able to present it, they're learning along the way and at, at an elementary level.

00:17:24.775 --> 00:17:30.955
You know, I came from the secondary world and you know, like we're all content masters in the secondary world and really deep into our content.

00:17:31.255 --> 00:17:38.845
At the elementary level, you've got teachers that are teaching four different content areas that are very different, and a thing like history is so expansive, right?

00:17:38.845 --> 00:17:42.895
Like, like you can't, like even knowing what to teach.

00:17:43.035 --> 00:17:43.875
Was really difficult.

00:17:43.875 --> 00:17:53.895
And so that's why going through that process of prioritizing first, um, what to teach and what's nice about our standards is our standards are all about skills, not about content.

00:17:54.165 --> 00:17:57.555
So it's not like a certain time of, of, of history or anything like that.

00:17:57.765 --> 00:18:00.315
So you could match any time of history to.

00:18:00.730 --> 00:18:03.160
The, the skills that you're trying to develop.

00:18:03.790 --> 00:18:12.880
Um, but it's, it's unique locally, um, in that there are, you know, kids are meeting other people in our community, right?

00:18:12.880 --> 00:18:20.590
Like other adults in our community, they're hearing from them, like, um, you know, like 9/11 is a great example of that as they're learning about the Middle East.

00:18:20.590 --> 00:18:23.470
Like what was their experience during 9/11?

00:18:24.400 --> 00:18:27.550
Um, how was that different from their parents' experiences?

00:18:27.550 --> 00:18:29.680
And so they're able to have conversations.

00:18:30.475 --> 00:18:38.425
With, with each other as students, with their parents, with the community at large, and then with, with other people that are within the community.

00:18:38.425 --> 00:18:43.825
And I think that that just creates, um, you know, it creates deeper thinking.

00:18:43.825 --> 00:18:49.855
It, it helps people think from different viewpoints and be able to value other people's viewpoints.

00:18:50.275 --> 00:18:54.025
And, um, just see that, um.

00:18:55.090 --> 00:18:57.610
There might not always be a right answer.

00:18:58.060 --> 00:19:02.830
That's what I love about the bad actors is there is not a right answer.

00:19:03.250 --> 00:19:09.370
The answer is what you can justify and feel that you can stand behind because you have evidence to support it.

00:19:10.030 --> 00:19:12.010
And sometimes that's the best we can do in this world.

00:19:12.040 --> 00:19:19.360
You know, I think that sometimes we try to teach everybody things are black and white and you know, I, I believe that we live in a pretty gray world

00:19:19.720 --> 00:19:20.260
in a lot of ways.

00:19:20.860 --> 00:19:21.970
Jethro Jones: Yeah, for sure.

00:19:22.000 --> 00:19:22.750
We really do.

00:19:23.260 --> 00:19:31.570
And you said something I think is really key there, that this essentially gives teachers an opportunity to have these light bulb moments with kids, you know?

00:19:31.570 --> 00:19:35.830
And in every interview I had with a teacher, like, why did you wanna become a teacher?

00:19:35.890 --> 00:19:37.930
Well, I'm looking for that light bulb moment with a kid.

00:19:38.320 --> 00:19:40.090
That is almost always what they say.

00:19:40.540 --> 00:19:40.930
And.

00:19:41.800 --> 00:19:44.230
Pretty much every teacher wants that to happen.

00:19:44.560 --> 00:19:49.990
And the challenge is that that is not always easy to do when you're teaching a standard curriculum.

00:19:50.320 --> 00:20:01.930
But when you do things a little bit differently, like what you're doing with social studies, uh, like what I did with student driven learning and personalized learning, those light bulb moments happen nearly every day.

00:20:02.320 --> 00:20:08.080
And that's powerful because you're not gonna get a light bulb moment from a scripted lesson.

00:20:08.410 --> 00:20:08.860
Okay.

00:20:08.950 --> 00:20:10.360
Maybe you will, but.

00:20:10.795 --> 00:20:21.835
It's gonna be pretty few and far between because there's not really room for the light bulb moments and because everything is so personal, all learning is personal to each individual student.

00:20:22.405 --> 00:20:34.225
A student is going to have a connection that you may not have even noticed, could be a possibility, but when you bring something in and you're like, here's this example of how they're a bad actor, they're like, oh.

00:20:34.660 --> 00:20:35.920
I know a story about this.

00:20:35.920 --> 00:20:43.900
Here's what my dad told me when I was three years old about this guy who lived down the street, and now I understand why that guy was a bad actor.

00:20:44.170 --> 00:20:56.980
And it's like, whoa, that's amazing because you're in fourth grade and you remembered this story from when you were three, and now you understand what a bad actor means in a way that is deep and personalized to you.

00:20:57.400 --> 00:21:02.200
That kind of stuff is really powerful and so much fun to do when you're a teacher and.

00:21:02.770 --> 00:21:05.830
You just enjoy life so much more, right?

00:21:06.095 --> 00:21:12.725
Brent Zirkel: Well, I think that, like you touched on something too that I've heard as feedback from our teachers so far is that there feels like there's a greater freedom.

00:21:13.265 --> 00:21:18.815
In the discussion point because it's a discussion, so there's, there's give and take and it's unique.

00:21:19.165 --> 00:21:25.405
Depending on which group you're in, um, it's unique, um, due to the background that different students are bringing in.

00:21:25.405 --> 00:21:31.195
I mean, you have some students that will bring in a lot of background knowledge, and then you have others that have very little.

00:21:31.465 --> 00:21:41.695
And so the questioning also becomes more authentic because, you know, you have peer to peer questioning that's happening rather than the teacher saying like, this is what happened and this is what you need to know for a test.

00:21:42.385 --> 00:21:44.215
So it, it really is.

00:21:44.545 --> 00:21:45.115
Um.

00:21:45.820 --> 00:21:52.240
Learning through dialogue, and that's, um, you know, all progress happens through dialogue.

00:21:53.614 --> 00:22:01.744
Jethro Jones: So let's get into the, the nitty gritty of how you're using, um, AI to help you with this because it.

00:22:01.809 --> 00:22:09.759
It is so difficult to write curriculum and people who love it, like they're just a special breed, right?

00:22:09.819 --> 00:22:16.089
And, and to, to be able to create all of this from scratch is, is a really big lift.

00:22:16.089 --> 00:22:23.589
So talk about the specific things that you have used AI for and that you currently use AI for.

00:22:23.589 --> 00:22:28.989
Like you used AI to create some of these things and then you use AI through the process.

00:22:29.364 --> 00:22:32.364
With teachers and kids like as they're teaching it.

00:22:32.364 --> 00:22:33.924
So talk about those two aspects.

00:22:33.924 --> 00:22:35.329
Anything else that I didn't ask?

00:22:35.754 --> 00:22:38.349
Brent Zirkel: Yeah, so, um, I call it layering.

00:22:38.349 --> 00:22:40.204
I don't know if that's the technical term or not.

00:22:40.719 --> 00:22:48.579
But, um, layering for me is when you're using multiple technologies to create something that wouldn't be possible with any one of them individually.

00:22:49.149 --> 00:23:01.059
And that's what we do with AI, is we might use multiple different AI platforms to create things or to bring things together, and then we use the website, um, as a way to store those pieces.

00:23:01.959 --> 00:23:02.259
Um.

00:23:03.549 --> 00:23:07.089
So ChatGPT is, is a big one that we've used.

00:23:07.179 --> 00:23:14.109
Um, I know that's one of the very simple ones, but the thing I love about ChatGPT is the images that it will produce are amazing.

00:23:14.139 --> 00:23:15.789
Like, I'm not an artist.

00:23:16.149 --> 00:23:20.859
Um, I, I do know my social studies content, but I cannot create art around social studies.

00:23:21.249 --> 00:23:26.319
So I can go into ChatGPT and, and ask it to create abstract.

00:23:26.694 --> 00:23:31.494
Representations of different concepts and what it comes back with.

00:23:31.524 --> 00:23:37.764
I, it is just stunning to me and I love that, that like there's no copyright to that.

00:23:38.349 --> 00:23:44.184
I, I can use that art and, and put it on the website, and, and that in itself

00:23:44.484 --> 00:23:50.094
creates a lot of dialogue with kids, you know, like it is observational dialogue of like, well, what do you see here?

00:23:50.304 --> 00:23:51.564
What do you think that means?

00:23:51.564 --> 00:23:55.434
Why, why do you, why is this image or this symbol in it?

00:23:56.154 --> 00:23:59.004
Um, so it, it allows you to, to draw a lot, um, that way.

00:23:59.004 --> 00:24:24.219
Um, uh, one of the AI platforms that I really love is, uh, Suno AI, and it allows you to create music, um, with, um, you know, we, like, we can use vocabulary terms, um, we can use, um, a concept and describe the concept, but, um, I love rap and so, um, I think a lot of our kids do too.

00:24:24.669 --> 00:24:26.859
And so we'll create raps

00:24:27.104 --> 00:24:48.374
about, um, you know, different concepts, and then that, that goes on the website and that becomes like a, a memory tool to help kids learn and maintain vocabulary, because as they learn the
rap, um, and you can do, you know, we do that not only for social studies, we do that sometimes with social stories and things like that, but, um, you know, the human brain is built still.

00:24:48.374 --> 00:24:53.414
Like we're living in a, uh, beyond-information-age tech, you know,

00:24:54.189 --> 00:24:57.459
era, but our brains are still built for tribal life.

00:24:57.849 --> 00:25:06.699
And so like having, having the chants and the, the rhythm and all that stuff, like that all gets the brain working and encoding in a way that's powerful for kids.

00:25:07.629 --> 00:25:12.549
Um, then we'll use, you know, I, I use things like MagicSchool.

00:25:12.639 --> 00:25:18.399
Um, at times I look at that as kind of like a gateway AI, um, platform for educators.

00:25:19.059 --> 00:25:25.089
Where we'll, um, maybe use that to develop hooks or like lesson starters, um, for kids.

00:25:25.959 --> 00:25:29.379
Um, we'll use it to get ideas at times.

00:25:29.379 --> 00:25:34.989
You know, like, um, they have a thing in, um, MagicSchool that's called, um, Make It Relevant.

00:25:35.559 --> 00:25:41.079
And so you can put in concepts and it'll give you different ways or different ideas to make it relevant.

00:25:41.079 --> 00:25:43.089
And I look at AI for educators as

00:25:43.739 --> 00:25:44.249
thought partners.

00:25:44.909 --> 00:25:46.799
So AI shouldn't be doing your work for you.

00:25:46.829 --> 00:25:51.779
It should be making your work more efficient in how you're doing it, saving you time and energy.

00:25:52.169 --> 00:26:05.219
Um, but it should be a thought partner so that, you know, you're putting, you're putting things in and then you're taking 'em out and you're adjusting them, um, knowing because you know the students that are in front of you, so you know what you need to do to adjust it, to make it most effective for the kids that are in front of you.

00:26:06.119 --> 00:26:11.909
Um, so, you know, I do have some people I, you know, that, that think like, oh, I just typed something and AI's gonna create all this for me.

00:26:12.269 --> 00:26:12.809
And I'm like,

00:26:13.009 --> 00:26:22.804
man, that, that robs you of the experience, because the, the beauty is that when you get to become the creator and you get to do something with it, the AI is just organizing all of it.

00:26:22.824 --> 00:26:27.049
So you got it in front of you, and then you can create from what you have in front of you that way.

00:26:27.854 --> 00:26:28.144
Jethro Jones: Yeah.

00:26:28.149 --> 00:26:28.389
Yeah.

00:26:28.389 --> 00:26:30.969
The way that I like to describe that is that,

00:26:31.659 --> 00:26:39.879
uh, the AI becomes the, uh, the assistant who does the menial work for you.

00:26:40.149 --> 00:26:44.409
You know, AI is really good at doing anything that you ask it to.

00:26:44.409 --> 00:26:50.694
It can do it, but it's really hard to get it to do the exact thing that you want it to do.

00:26:51.124 --> 00:27:00.394
And so I often use several different AI tools to like use one to generate a prompt and another one to implement that prompt.

00:27:00.634 --> 00:27:07.354
Um, and, and different things like that, because there are different ways to, uh, to make it happen that is beneficial,

00:27:07.354 --> 00:27:11.289
that one AI tool can do better than another.

00:27:11.469 --> 00:27:24.849
And, and that's the thing that I love, is I can, I can take something from over here, make it work in a way I want it to over on this side, and then I get the thing that I actually wanted out of the project.

00:27:25.089 --> 00:27:27.939
Brent Zirkel: And that's what I call the layering process, right?

00:27:28.029 --> 00:27:28.269
And

00:27:28.269 --> 00:27:41.169
so, you know, like we use the Google Suite in our school, and so a lot of times we'll take the pieces that we've, we've learned or gained, um, from AI and we'll drop those in a Google Slide just so that we have something sequential.

00:27:41.529 --> 00:27:43.449
Um, that's, that's, um,

00:27:43.974 --> 00:27:46.794
familiar to teachers, that they can feel comfortable with.

00:27:47.124 --> 00:27:54.954
But, but there were lots of different stages in between of pulling that together to put it, um, in the way that we have it presented.

00:27:54.954 --> 00:28:04.314
Like whether it's, it's with the activity, whether it's in the, um, the reading or the images that kids are seeing, but we're, we're taking it all and curating it within a Google Slide a lot

00:28:04.314 --> 00:28:04.734
of times.

00:28:05.439 --> 00:28:05.889
Jethro Jones: Yeah.

00:28:06.039 --> 00:28:07.869
Brent Zirkel: You can also use AI for assessments.

00:28:07.929 --> 00:28:09.669
Um, that's been very helpful too.

00:28:10.149 --> 00:28:12.729
Um, you know, we always try to start with the end in mind.

00:28:12.729 --> 00:28:20.499
So, you know, we have our priority standard, we have our proficiency scales of how we're gonna measure, and then we try to have assessments that accurately are gonna measure it.

00:28:21.099 --> 00:28:27.639
Um, so, you know, we, we use AI, or AI can be used, in all of those aspects.

00:28:28.149 --> 00:28:33.939
I think the important thing to remember with AI and technology in general is we also don't want kids on screens all day.

00:28:34.269 --> 00:28:34.689
Jethro Jones: Mm-hmm.

00:28:35.109 --> 00:28:36.249
Brent Zirkel: We, we wanna limit that.

00:28:36.249 --> 00:28:43.599
And so we try to use AI to curate the, the best stuff, get it all together in one spot so kids can go and access it there.

00:28:43.599 --> 00:28:51.609
And then they're gonna use that in, in conversation, in pairing, you know, partner activities, different things.

00:28:51.609 --> 00:28:53.229
So it's not like they're on the screen

00:28:53.799 --> 00:28:59.889
doing their learning; they're taking what they've seen on the screen, or what they've, they've gained in the video, or whatever it might be.

00:29:00.399 --> 00:29:03.519
And then they're gonna use that in communication with one another.

00:29:03.969 --> 00:29:09.009
And I think that's where sometimes educators are afraid, like, oh, AI is gonna take our job, or any, you know,

00:29:11.949 --> 00:29:15.129
are always gonna be the foundation of everything that we do in education.

00:29:15.999 --> 00:29:30.009
Teachers are only gonna become more and more important as we use AI because they're gonna be the ones that take it, curate it, to then allow kids to go back and have real world experiences with it, either within the classroom or outside of the classroom.

00:29:30.759 --> 00:29:30.909
Uh.

00:29:31.914 --> 00:29:37.344
Jethro Jones: So if, if, if education is just about dispensing knowledge,

00:29:37.854 --> 00:29:43.044
like, we don't really need teachers, because the AI and technology can do that already.

00:29:43.374 --> 00:29:46.824
But education is not just about dispensing knowledge.

00:29:47.154 --> 00:29:49.884
It's about teaching kids how to be adults.

00:29:50.214 --> 00:29:56.544
And so that's the key, where we really need to have good-hearted,

00:29:57.204 --> 00:30:02.994
responsible, appropriate adults helping kids learn how to be adults themselves.

00:30:03.354 --> 00:30:05.844
And it starts at a very young age.

00:30:06.174 --> 00:30:19.464
I've, I've been in some men's groups, uh, recently where I've had the opportunity to talk to young fathers about how to get their kids to turn out good when they're, you know, ready to leave the house.

00:30:19.824 --> 00:30:22.404
And that's not something that starts when

00:30:22.839 --> 00:30:27.969
they are 16 and you're like, okay, time to get serious about you leaving the house.

00:30:27.999 --> 00:30:32.019
No, that starts when they're born and it starts with the things you start doing.

00:30:32.499 --> 00:30:48.189
And sometimes you screw up and you teach 'em something bad, whether intentionally or unintentionally, and that sucks, but it happens and so, so you've got to constantly be thinking, how do we help them be the kind of adults that we want them to be?

00:30:48.219 --> 00:30:49.659
Who can make good decisions?

00:30:49.659 --> 00:30:51.489
Who can look at a situation and say,

00:30:52.269 --> 00:30:54.519
hey, for example, here's a bad actor.

00:30:54.579 --> 00:30:56.079
This person's not acting right.

00:30:56.349 --> 00:31:03.819
And I need to, to make sure that I understand that and see that, and know that I'm not gonna fall into that trap that they're laying for me.

00:31:04.034 --> 00:31:08.469
Brent Zirkel: Yeah, there's life lessons to be learned along the way with the content and skills.

00:31:08.679 --> 00:31:10.299
That you're developing, right?

00:31:10.599 --> 00:31:13.479
And that's, that's the most fun part of it.

00:31:13.809 --> 00:31:19.809
And then as teachers, you get to be, um, kind of like the guide along the way.

00:31:20.259 --> 00:31:33.219
You get to be the one that, that, you know, as they ask this question, well, you know, let me give you this and have you think about this and let's come back and have a conversation once you've got a little bit more information or a bigger picture of what you maybe originally thought this, this could be.

00:31:33.999 --> 00:31:35.469
Um, but kids, all along the way,

00:31:36.264 --> 00:31:38.424
can learn and can have those conversations.

00:31:38.424 --> 00:31:43.884
I, I kinda like what you said earlier about, um, you know, they're ready for these conversations.

00:31:43.884 --> 00:31:50.214
It's, it's us sometimes as adults that are not, um, ready or prepared, or don't think that they, they can handle it.

00:31:50.754 --> 00:31:52.074
Jethro Jones: Yeah, absolutely.

00:31:52.434 --> 00:31:55.914
So there's a lot more that we can talk about.

00:31:55.914 --> 00:32:03.594
We didn't even get to your test kitchen educational foundation that we should have a whole separate episode on because that is so cool.

00:32:03.744 --> 00:32:07.074
So I'm gonna have you come back on the program and, and talk about that.

00:32:07.464 --> 00:32:15.624
Um, but my last question is, what is one thing that a principal can do this week to be a Transformative Principal like you, Brent?

00:32:16.779 --> 00:32:18.009
Brent Zirkel: Well, that's quite a compliment.

00:32:18.129 --> 00:32:20.289
I, I try to be as good as I can be.

00:32:20.289 --> 00:32:24.459
I, I grew up with the phrase, good, better, best, never let it rest until your good

00:32:24.459 --> 00:32:25.964
is better and your better is your best.

00:32:26.944 --> 00:32:27.164
So.

00:32:27.684 --> 00:32:28.344
Jethro Jones: That's awesome.

00:32:29.034 --> 00:32:42.624
Brent Zirkel: Every day I think about what is the next step that I can take, and I think to be a Transformative Principal, that's what, that's a mindset that you need to have, is you might not be able to accomplish all of this in the next week, but you can take your first step

00:32:42.624 --> 00:32:44.214
towards accomplishing it.

00:32:44.634 --> 00:32:52.524
And the, the biggest, um, obstacle I think right now if we're talking about AI in general is, um, not with kids.

00:32:52.524 --> 00:32:55.464
It's with adults helping adults understand it.

00:32:55.494 --> 00:33:05.484
Helping adults not feel, um, a fear of it, but helping adults see that it's a tool that will help them be more effective eventually.

00:33:06.114 --> 00:33:08.964
And so, you know, it could be as simple as, um,

00:33:10.074 --> 00:33:14.364
having a, a kind of challenge with your, your faculty about usage of it.

00:33:14.364 --> 00:33:17.934
It could be as simple as modeling, um, and being vulnerable.

00:33:18.234 --> 00:33:24.264
Um, there, there have been times where we come into our collaborative meetings and I don't know the answer to something, and so I said, you know what?

00:33:24.894 --> 00:33:26.874
Let's just see what AI has to say about this.

00:33:27.279 --> 00:33:29.769
And then we jump in and, and, and we do it together.

00:33:29.769 --> 00:33:35.979
We practice different prompts and, and then we see what comes back and, and we move forward from that.

00:33:35.979 --> 00:33:54.939
So it's, it's an openness to it and it's, um, working to, you know, so much is coming out so quickly and you're not gonna be able to do everything but find where your passion is and, and what you believe could benefit kids and just take a step in that direction.

00:33:54.939 --> 00:33:56.109
And that would be my advice.

00:33:57.154 --> 00:33:58.269
Jethro Jones: Yeah, that's awesome.

00:33:58.329 --> 00:33:59.439
And I love that phrase.

00:33:59.439 --> 00:34:03.159
Good, better, best, never let it rest till your good is better and your better is your best.

00:34:03.339 --> 00:34:04.059
Did I get that right,

00:34:04.239 --> 00:34:05.019
Brent Zirkel: Pretty close.

00:34:05.199 --> 00:34:05.619
Jethro Jones: man?

00:34:05.709 --> 00:34:06.069
All right.

00:34:06.069 --> 00:34:07.119
I'll take it good enough.

00:34:07.239 --> 00:34:08.169
Brent Zirkel: That was impressive.

00:34:08.424 --> 00:34:08.814
Jethro Jones: Yeah.

00:34:09.204 --> 00:34:11.364
Um, all right, well, this was awesome.

00:34:11.364 --> 00:34:12.984
We're definitely gonna have you back.

00:34:13.314 --> 00:34:25.164
Just a reminder, if you're listening to this and you're interested in AI stuff, I do these AI leader office hours every month on the third Tuesday of the month at 1:00 PM Pacific.

00:34:25.524 --> 00:34:27.954
That was how Brent and I got connected.

00:34:27.954 --> 00:34:30.264
He showed up to one of those and we made it happen.

00:34:30.504 --> 00:34:32.109
So if you want more info about that.

00:34:32.694 --> 00:34:37.854
Uh, go to aileader.info and you can, you can register to be part of that.

00:34:38.214 --> 00:34:43.434
Um, and that's fun because we just go and nerd out about cool AI stuff that we're trying out.

00:34:43.434 --> 00:34:47.574
And, and after Brent showed up there, I was like, well, you gotta come on the podcast.

00:34:47.574 --> 00:34:48.804
We gotta talk about this more.

00:34:48.804 --> 00:34:51.444
So, um, so Brent, this was awesome.

00:34:51.444 --> 00:34:54.684
Is there a way you want people to connect with you or reach out to you?

00:34:55.629 --> 00:34:57.939
Brent Zirkel: You know, I, I can be reached.

00:34:57.939 --> 00:35:02.979
I don't do a lot of social media as an administrator, um, just to keep my own sanity.

00:35:04.139 --> 00:35:04.429
Jethro Jones: Yeah.

00:35:05.139 --> 00:35:07.239
Brent Zirkel: Um, but you can always reach me at my email.

00:35:07.239 --> 00:35:15.574
It is just my name, Brent, uh, B-R-E-N-T, dot Z, at gmail.com, and, um, that's my personal email and I'm, I'm happy to reach out.

00:35:15.574 --> 00:35:17.224
It's been a pleasure to be with you, Jethro.

00:35:17.224 --> 00:35:18.664
I find you fascinating.

00:35:18.664 --> 00:35:20.134
I learn every time I'm with you.

00:35:20.524 --> 00:35:28.924
Um, last, you know, when I came to the office hours, um, you shared, um, an AI platform with me that I've already used and have, have shared with other educators.

00:35:28.924 --> 00:35:31.414
And, um, so take that first step.

00:35:31.474 --> 00:35:33.454
You know, it's alright to be vulnerable.

00:35:33.664 --> 00:35:36.424
We're gonna do, we're gonna make mistakes, but we're gonna grow from it.

00:35:36.424 --> 00:35:39.334
And it's the growth. My, my motto is progress through struggle.

00:35:39.729 --> 00:35:42.159
If you, if you run from the struggle, you're not gonna get better.

00:35:42.159 --> 00:35:44.619
So embrace the struggle and let's move on.

00:35:45.249 --> 00:35:45.759
Jethro Jones: Man.

00:35:45.819 --> 00:35:46.479
Good stuff.

00:35:46.479 --> 00:35:47.469
Brent, this was awesome.

00:35:47.469 --> 00:35:49.929
Thanks so much for being part of Transformative Principal today.

00:35:50.109 --> 00:35:50.769
Brent Zirkel: My pleasure.