Margin of Thought with Priten

In this episode, Priten speaks with Lance Eaton, Senior Associate Director of AI in Teaching and Learning at Northeastern University, about navigating the integration of AI and educational technology in higher education. Lance shares his 15-year journey through instructional design—from community colleges to Ivy League institutions—and offers practical wisdom on how educators can thoughtfully adopt AI without losing sight of pedagogy. The conversation explores everything from reflection bots and embodied learning to the tension between commercial tech platforms and educational values, ultimately landing on a hopeful note: we've navigated dozens of technological shifts before, and we can figure this one out too.

Key Takeaways:
  • Start small and ground AI in learning goals. Like any instructional design challenge, begin with what you want students to demonstrate—then find where AI fits naturally.
  • Use AI to deepen reflection, not replace it. A "reflection bot" that asks follow-up questions can help students dig deeper than a one-time submission ever could.
  • Pick two or three tools and stick with them. The app explosion taught us this lesson—chasing every new AI tool leads to burnout, not better teaching.
  • AI literacy is discipline-specific. Every field will be impacted differently; the goal isn't generic AI skills but understanding what AI means for your particular context.
  • We've been here before. Higher ed has absorbed 80+ technologies since the 1970s. The playbooks exist—we just need to adapt them for this moment.
About Lance Eaton:
Dr. Lance Eaton is the Senior Associate Director of AI in Teaching and Learning at Northeastern University. His work engages with the possibility of digital tools for expanding teaching and learning communities while considering the profound issues and questions that educational technologies open up for students, faculty, and higher ed as a whole. He has engaged with scores of higher education institutions about navigating the complexities and possibilities that generative AI represents for us at this moment. His musings, reflections, and ramblings on AI and Education can be found on his blog: AI + Education = Simplified | Lance Eaton, Ph.D. | Substack

Creators and Guests

Host
Priten Soundar-Shah
ED of PedagogyFutures / Founder of Academy 4 Social Civics / CTO at ThinkerAnalytix
Guest
Lance Eaton
Senior Associate Director of AI in Teaching and Learning at Northeastern University

What is Margin of Thought with Priten?

Margin of Thought is a podcast about the questions we don’t always make time for but should.

Hosted by Priten Soundar-Shah, the show features wide-ranging conversations with educators, civic leaders, technologists, academics, and students.

Each season centers on a key tension in modern life that affects how we raise and educate our children.

Learn more about Priten and his upcoming book, Ethical Ed Tech: How Educators Can Lead on Digital Safety & AI in K-12, at priten.org and ethicaledtech.org.

Priten: Welcome to Margin of Thought, where we make space for the questions that matter.

I'm your host, Priten, and together we'll explore questions that help us preserve what matters while navigating what's coming.

Priten Shah: If you've been teaching for a while, you've seen a lot of education technology come and go: online learning, open educational resources, learning management systems, and now AI.

Today's guest has not only witnessed these shifts, he's been shaping how educators respond to them.

Lance Eaton is the Senior Associate Director of AI in Teaching and Learning at Northeastern University, and he's spent 15 years in instructional design, working everywhere from community colleges to Ivy League institutions.

I had the pleasure of meeting Lance at College Bound.

Lance thinks about technology differently.

For him, it's a spectrum from the pencil to virtual reality, and the question isn't whether to use it, but how to integrate it thoughtfully in service of learning.

Let's get started.

Lance Eaton: I'm Lance Eaton.

Basically, I started as a full-time adjunct. I started doing full-time adjuncting in the late 2000s.

I taught a whole bunch of courses, way more than any individual should in a semester, over several years, and found myself in instructional design.

Over the last 15 years in that field, I've been moving into this hybrid space where instructional design is still part of my core, but I've also leaped into doing faculty development.

And so much of my work is thinking about and sitting with faculty and educators to figure out the different ways we can demonstrate learning.

And how do we do that with different technologies?

And when I talk about technologies, I mean everything from a pencil up through AI and virtual reality.

I've moved around at different institutions in the Northeast, from community colleges to Ivy Leagues, and most recently landed at Northeastern University as the Senior Associate Director of AI in Teaching and Learning.

A lot of that has been in the last two or three years: a mixture of writing, workshops, and collaborations, and just doing a lot of public work and work in my classes that I've been sharing out to help in this rich conversation.

We're all trying to figure out what do we do.

Priten Shah: What do we do is definitely the question of the day.

I'd love to hear about the different transition points that you've noticed when it comes to education technology.

Especially when you think about the last 15 years, we've had lots of different turning points, with the pandemic being one of the obvious ones, but of course also the technology turning points.

So, just big picture: what is the general trend you're seeing across time?

Lance Eaton: in my experience of the last 15 years, I was coming in when I would say online learning was hitting its stride and feeling more legitimized than it had in the previous decade.

I took my first online class in, like, 2000.

And so I was coming into doing instructional design around 2011, and it was feeling more validated.

Still dismissible, but a lot more people were interested in it.

And we were starting to see more significant appreciation of it.

So I think that was a big one. That was also happening as we got a wider range of Web 2.0 tools, from social media to blogging to video, that we could actually access, that wasn't horrible in quality or took forever to download.

In the mid-2010s, there was the rise of open educational resources and a recognition of the power they had in teaching and learning.

We started to see richer, complex conversations around the role of the cell phone in classrooms.

And we started to see a lot more synchronous online learning, where we could be in a Zoom room or, you know, a Skype room, your Adobe room, and start to do classes in that space, which was great because the pandemic happened and we all moved into those spaces.

And I think one of the things I saw from that is, at that point, you could no longer, well, you can, but realistically there's an almost universal expectation that you need to learn and use the LMS, whereas prior to the pandemic there were still lots of pockets that held out and never did anything with their LMS courses.

And then just a richer mixture of looking at OER, a richer mixture of thinking about what other digital tools are out there.

So it did push a lot of people in directions they might not have been ready for, but they had to figure it out on the fly.

And then the latest has been AI. I think it's really interesting, because the things that have popped up that by and large haven't taken off, where it's really only little pockets, are augmented reality, virtual reality, the metaverse. It's so interesting; it felt like there was something there in the late 2010s.

Priten Shah: Do you think that most of that is accessibility of the tech or do you think there's something else at play?

Lance Eaton: I think it's the accessibility of the tech, you know, with generative AI.

All you need is a text box to interact with it.

And that is such a low barrier; that's something everybody has been using.

They use it on their phones: oh, I see a box.

I type into it.

So I think it's that.

With those, there's still not that; your two pathways are to create it or to buy a solution, whether that's from a publisher or some entity like that.

And you know, those, I think for a lot of faculty, rightfully so, usually don't hit the mark in the way that they want.

So you could have a simulation game like Civilization if you wanna play around with history.

Priten Shah: Right.

Lance Eaton: And, I love it as somebody who majored in history and has taught history, but it still doesn't quite do the things that I want and it also has its own layer level of complexity to get used to.

So I think there ends up being lots of possibility, but until there's an easier entry, both for the educator to figure it out and customize it in a way that makes sense to them, and for the students on the other side to be able to access it and have it be familiar enough for whatever the size of the class is.

Priten Shah: Right.

I'm curious. The AR conversation, or even the VR conversation, is slightly tangential to the larger issues at play right now.

But I'm also curious about the pedagogical value that you see in these tools if they were to be made more accessible.

I hear about virtual field trips and that's the go-to example of school implementation.

Simulations are obviously another one.

Do you think this is predominantly an engagement tool, or do you think there's separate pedagogical value to be gained from using that versus a different tool?

Lance Eaton: I think the simulation space is where it's so exciting, and we're seeing some of that being used with AI.

There's a value in the immersion, especially in courses that are asynchronous, or courses that meet once a week, courses where there's a lot of distance between people.

Anything you can enhance the senses with.

This is what I think a lot about, particularly as we went through the pandemic: the embodied learning.

What's happening in the body, like how that impacts how much you learn.

And I think about this in a very simple way: I've never been in a classroom where I'm comfortable, because the chairs are horrible. They're just not right for my body type; I'm just always uncomfortable.

And that's a distraction from learning.

And so the reverse of that is, and this is where I think with immersive learning, virtual reality, if we're enhancing the sensory experience, hopefully for the better, there's something there to dig into pedagogically.

To me, that's what I keep going back to nowadays.

Priten Shah: With that idea of embodied learning, do you think AI helps us do any of that differently than we could pre-AI?

Lance Eaton: I think there are ways we may start to get there as we have avatars, avatars for engagement, and going through simulations.

there's one way I'm thinking about it, particularly how I've often been using it in my own work.

I haven't really tried this out in classrooms, but one of the things I really like to do nowadays is to take out, in this case, ChatGPT, and turn on the voice on my phone.

I'll have my headphones in and I'll be going for a walk, and I will use it as an iteration space.

And to me there's the linking of: we know walking, we know movement, is good for the brain; it is good for thinking.

And so to be able to do that and have those thoughts recorded.

Because that's what's happening: the AI is asking me questions, I'm giving responses.

We're going back and forth.

So that when it's time for whatever the purpose is of that interaction, whether it's a talk or something I'm writing or something I'm planning, it's all digitized, and I feel it differently.

I experience it differently than just sitting at a computer.

I'm getting to be more in my body while I'm also trying to think about things. I'll do this while going for a walk in the park, and nowadays it doesn't look like I'm crazy for just randomly talking, because that's what we all do with our phones.

So I can have those conversations without people looking at me sideways.

Priten Shah: Right.

So do you have it just prod you with questions? Is that what you...

Lance Eaton: So I'll explain whatever it is that I'm working on.

I'll tell it to take a particular approach, goal, or way of pushing at my responses or digging deeper. It just becomes a conversation that takes a while to get into that rhythm.

But once you're into it, once you've done it a handful of times, it becomes a really useful way of thinking aloud.

Priten Shah: That's something I would wanna try.

My go-to trick has been, every time I need to write something, I have it generate an outline, but using questions, because I find it much easier to answer questions than to start from scratch.

This feels like even a further extension of that, and I like the idea of taking a walk and using it.

So, yeah, that's definitely a cool idea.

When you think about the kinds of suggestions that you've made to faculty, especially when you're thinking about the instructional design component, what ends up being your go-tos for a faculty member who is, for the first time, thinking about the role of AI in their work?

Lance Eaton: It's like everything else with instructional design: start small. This is hard with AI because, unlike other technology that we could see on the horizon and start to prepare for, it just kind of showed up everywhere.

And so the thing that's hard to advise is, within instructional design and faculty development, you're always asking: what's the goal?

Right?

That backwards design.

What do you hope they will be able to demonstrate by the end of the course? You do that, and you come up with the assessments, and then you come up with the learning activities. Here, the tool has just been made available everywhere.

We know that students, though not all, are using it.

Some are using it appropriately, some are using it in ways that are bypassing learning.

So it's: how do you weave this in? Because while it is shiny, it's also something that's just there, and to not be thinking about it in a classroom today is like not thinking about the internet or the whiteboard in a physical classroom.

It is part of that space, and so how do you use it?

So that's kind of the first piece: just the acknowledgement of where does it fit.

Then within that, I lean towards: let's have the conversation with the students.

What is their understanding of it?

How do you talk about it relating to the course, the discipline it sits in, and what their learning goals are, and collectively create some kind of policy or set of agreements to hold one another accountable, or recognize this is what we're trying to do, because we're all still learning with these tools?

So I use that. That's one way in for the completely new person just trying to figure out: where do I step first?

In terms of an activity, one of my go-tos is: if you do reflection in your course in any way, have students try it with AI.

One of the things I find really powerful is that reflection is amazing. It is a workhorse for learning.

We know this is validated in so much research, but it's a practice.

If you're earlier in your reflection practice, you're gonna have a harder time digging in, versus you could give me a question to reflect on and, yeah, I can go for days. That is just a muscle I've worked on a lot.

So to me, what I engage with faculty around is getting them to think about creating a reflection bot, for lack of a better word, or creating a prompt that helps the student do a reflection, so that when the student puts in their initial thoughts, the AI comes back with questions to deepen their learning.

So the student might be like, I didn't like that reading.

The AI can say, well, what didn't you like?

And keep poking and prodding.

Because what ends up happening right now is, you ask for a reflection, the student submits it on Friday, the instructor may get to it by Sunday or Monday, the student may, you know, see those results on Monday.

But when you're looking at it and you see that student say, I didn't like something, you're thinking, I wanna know why.

If something can prod that.

There's a really interesting win there of having the AI help in that process.

Help them understand: what does reflection look like?

Well, it looks like this pushing of questions and nudges as you're bringing up what's in your head.

So to me, for many folks, that feels like a really good spot to start out, because it's something many of us in education feel challenged with.

When we're doing reflections, I want more from the students, not because I just want them to do more, but because I wanna learn more about what the thing is behind the thing that they just said.
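[Editor's sketch: for readers who want to tinker with the reflection-bot idea Lance describes, here is a minimal sketch, assuming the OpenAI Python SDK (openai>=1.0) and an illustrative model name. The system-prompt wording and function names are hypothetical, not from the episode; the same idea works just as well as a shared prompt or custom GPT with no code at all.]

```python
# Minimal reflection-bot loop (illustrative, not the approach described verbatim in the episode).
# Assumes the OpenAI Python SDK and an API key in the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are a reflection coach for a college course. "
    "When the student shares an initial reflection, do not evaluate or grade it. "
    "Ask one short follow-up question at a time that pushes them to explain why, "
    "give a concrete example, or connect the idea to the course material."
)

def run_reflection_session() -> list[dict]:
    """Run a console back-and-forth; the returned message list is the chat log
    a student could submit instead of a polished, reformatted document."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}]
    print("Share your initial reflection (type 'done' to finish):")
    while True:
        student_turn = input("> ").strip()
        if student_turn.lower() == "done":
            return messages
        messages.append({"role": "user", "content": student_turn})
        reply = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption: any chat-capable model would work here
            messages=messages,
        )
        follow_up = reply.choices[0].message.content
        messages.append({"role": "assistant", "content": follow_up})
        print(follow_up)

if __name__ == "__main__":
    transcript = run_reflection_session()  # this log is what gets submitted
```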

Priten Shah: When you think about student responses to these reflection bots, do you notice, or have you heard about, a lack of student buy-in for it?

Or does it seem like students are generally excited to use something like this?

To be fair, I'm imagining a middle schooler interacting with this bot.

And in my head, the answers are oftentimes short and glib when they know it's AI on the other side versus a human.

Higher ed is hopefully a different space with higher maturity levels, but that's not a given.

Have you heard that it's generally successful with students?

Are they generally engaged with it, or does that take a little bit of convincing?

Lance Eaton: At the higher ed level, I've seen positive responses, and there's also the context of: is everybody using the same tool, and does everybody have access to the tool?

I think so long as it's being framed as: this is what you're doing and this is what you're submitting.

Right?

And so this is the other piece I would often recommend, which is that the submission becomes the chat log.

It's not: do all this and then move it into a document that is properly formatted.

No, it's the dialogue and being able to see that growth, that response back and forth. And for the middle schooler who might be glib, or anybody who might be glib and give just one-word responses, I think there's a way of having the AI recognize that and try to be more responsive, or find ways to connect with what the student is interested in.

Priten Shah: Right.

So when you think about the kinds of tools that are at faculty's disposal, but also the vast landscape of different general-use tools that are popping up.

And then, of course, all of the education-specific tools that are popping up.

One concern we often hear is: how do I keep up with all of it, but also, how do I choose?

I'm curious, when you're having these conversations, but also when you're making policy, how are you making the decisions, and how are you recommending other folks make decisions?

Lance Eaton: Yeah, I compare it to the app explosion we had in the late 2000s, early 2010s.

You pick two or three.

And you stick with those until you have reason not to.

And sometimes there's hurt with that.

Sometimes that means you have to move all of your notes from one tool to another, and that's unfortunate.

But I think it's really hard to pick winners in this.

And that being said, I would look to the tools that might already be there.

So if you know you're an institution that is a Google institution or a Microsoft institution, it is worth thinking about: if I already have all of my stuff in these environments, and if the AI tool that they have is equivalent, or reasonably equivalent, to what's out there, then I should probably just use the one here.

I know ChatGPT was the first out of the gate.

I think it is often trying to maintain that status by sometimes irresponsibly putting stuff out or quickly putting stuff out.

Priten Shah: Right.

Lance Eaton: I'm really interested in the stuff that Google is now doing and I think they have some really powerful stuff that people are missing.

I think really it's recognizing you can't know everything.

I think about this a lot and try to figure out: what are the few tools where I feel like I'm getting enough? What's the one general tool, and maybe two or three more specific tools?

And then, can I keep some stream of information to let me know if there are limitations or newer things on the horizon? But there's no chance to follow all of it.

Priten Shah: When you're mentioning Google, Microsoft, OpenAI, all with their own incentives for what they are building the technology for...

There are folks who raise concerns about allowing that much commercial influence within our schools.

AI is just the newest way that's happening.

This is in some ways, not a new problem.

It's a new path for that problem.

Do you share those concerns? How do you view the role of commercial entities within educational spaces?

Lance Eaton: I think we're stuck in this dynamic of we can choose not to use those premier tools that are just readily available and people are absolutely familiar with.

Or we can choose tools that are not as great but feel less intrusive from that commercial side.

There's a whole new learning curve for every new faculty member that comes in, every new student, every new staff member.

And there are a lot of costs to that.

Judging just from my own choices in the past at different organizations, if I'm not using a tool that feels pretty ubiquitous, you know, I'm still gonna end up using one that is. And I think a good example: this was in the mid-2010s.

I was at one institution where they were a Microsoft institution, and Microsoft's sharing abilities for their documents and slides and whatnot, I mean, they're still challenging, but they were even worse then.

And I was just like, no.

If I'm doing stuff with faculty and I want them to have access to it, I'm going to Google Docs. I think that's the challenge with trying to go away from that.

And I understand there's a lot to be concerned about and think about in terms of how many third-party entities, not just in technology but throughout the university, are there because the institution itself can't uphold those functions.

But we're also gonna trade off a lot of things in terms of time and frustration, having to learn more things on top of all the other things we have to learn, like the LMS, as faculty and students and staff.

There's the LMS, the student support system, the different communication systems that an institution might have within it, your Slacks, your Teams messages, all of that.

I'm not a fan of it, but I don't know that I've seen a working alternative that has the resources to do it for everybody.

Priten Shah: When you think about the purpose of using these tools, a lot of what you've said thus far ties it directly into pedagogical strategy, which makes sense.

I'm wondering if there's any framing of it that's about career preparedness for you.

Oftentimes from disciplines that are not traditionally very tech-forward.

It seems like some arguments in those spaces have been that we need to have the students at least practice using these tools, because these are the tools they'll use later in life.

Do you view the role of the technology that way in, for example, a history classroom? Or do you view it differently?

Lance Eaton: I think it's both. I think it's pedagogical, and it's a recognition that the way I'm seeing and thinking about AI is: we're 30 years since the formative start of the internet.

Priten Shah: Right.

Lance Eaton: Industry, business degrees, every program operates differently now because of the internet.

You don't do history the same way now.

In order to do history, you have to actually understand the internet and digital technologies that allow for that.

You have to understand archives and digital archives. To me, that is the same line with AI. It doesn't matter if the AI we have right now is mediocre; it's still going to have this layered impact across lots of different areas.

And so, therefore, there is a good reason within any discipline to be exploring. There are the pedagogical pieces of: where does this align with my course?

But then there are the professional pieces of: I am engaging somebody in this discipline.

This discipline is going to be impacted.

We don't have students in history classes going to the library to use card catalogs.

They're going to the digital resources. To me, that's the connection; both of those pieces are important around AI in teaching and learning.

Priten Shah: One thing we hear is this tangential use of the technology as a way to expose the students to the technology.

Like, let's chat with a historical character, which maybe has not well-documented pedagogical results, is my kind way of saying it.

But it's a popular starting place because it makes things feel more engaging.

The argument we hear is, oh, well, it gets them chatting with AI.

And that's one way to get them to think about, you know, when is AI making a mistake? Which is an AI literacy skill, but it might not be the best way to teach historical occurrences.

Do you think that's a responsibility of every instructor?

Lance Eaton: I do, but I think it can still be grounded in the discipline.

So I'll give you a good example of what I used to do when I taught history. I'd engage with the students and be like, okay, what kind of learners are you?

You know, what's your learning style?

And we know the myths about learning styles. Almost, you know, 90% of students would say, I'm a visual learner.

I'm like, great.

Let's watch this documentary.

And now I want you to take the insights from this documentary and tell me what's accurate here. We would do that, and then we'd have this larger conversation about the construction of history and documentary, because it's another form of capturing history.

So I think that's the move I would wanna see for that.

Let's talk to a historical character with AI.

If you're feeding it historical documents, all of a historical figure's writings that are known, and some books about this person, that's a good way to start and to say, okay, but where is this wrong?

That is part of what it means to practice history: to pull things apart and understand what are the underlying things that create our narrative.

Priten Shah: Right.

Lance Eaton: I think ultimately, with AI literacy, there are those big, broad goals, but it's always gonna be about: what does that mean in this context?

Because the emphasis will change; what it looks like will change from discipline to discipline.

Priten Shah: That's helpful framing.

I'll end with asking for your big-picture feelings. What are you worried about when we think about higher ed in particular over the next five years? And as the technology continues to develop, what keeps you up at night?

Lance Eaton: I think a worry that I have is this push to delegitimize higher education, and at the same time, here comes this technology that feeds, in some ways, into that narrative or desire to delegitimize.

I appreciate the deliberateness with which many in academia try to figure things out, and also: how long will it take for us to really work through this systematically?

Priten Shah: Right.

Lance Eaton: Because there's a way in which AI is as hard-hitting as the pandemic, but in this dragged-out way, and we don't have the collective response to figure it out.

There's an absence of accepting it and figuring out, then, what does that mean for teaching and learning.

And I understand resistance. I understand being really mad and frustrated at the AI or the companies and all the challenges we live in in late-stage capitalism.

I get worried about gaslighting our students by telling them this tool doesn't mean anything, when many of them find really important, useful ways to use it that aren't bypassing learning, but are just helping them figure out a world that's incredibly complex.

And so that's the tension I sit with: having to recognize it's here, and therefore we have to find new ways of thinking about teaching and learning, as we did with the internet, as we have with different educational technologies going back to writing.

Priten Shah: Yeah.

There is this sluggishness with which I think education moves in general. And again, like you said, oftentimes that means when we do get out the other side, it's after thoughtful consideration.

The scary part is that it seems like the rate of change is faster than we're keeping up with.

Now it feels like we're still trying to get caught up from November 2022 in some areas of the country.

And the technology is moving every month.

Lance Eaton: Yeah, let me share.

I have those feelings.

The hopeful piece for me, the thing that I try to counterbalance with: I've done this in different conversations, where I've looked at the range of technologies in higher education that have become ubiquitous since the 1970s.

There are somewhere upwards of 80 or more, things like email to projectors and, you know, LMSs and automatic grading. What gives me hope is I can see this large amount, dozens of technologies, that have become ubiquitous in higher education.

We've figured our way through all of them.

Whether we like 'em or not is another question, but they're part and parcel of the higher education system, and that means we have dozens and dozens of playbooks that can help us figure it out.

It isn't going to be perfect because AI is something different.

There's a lot we can learn to help build the new playbook for navigating AI.

Whenever I go down that rabbit hole, that's the thing that pulls me out.

It's like we've been here before with many different technologies.

The classroom that I teach in is fundamentally different from the classroom that I went to school in, in many ways regarding technology.

That's the piece that makes me hopeful: even though there's deliberation, for all the reasons there are, we know we've solved a lot and figured out a lot of these pieces already.

Priten Shah: Do you think that there is this feeling of heaviness that comes from some of these conversations when you're talking to some faculty members, about almost an existential crisis for their discipline, the value of their discipline for a student who's not gonna pursue that discipline?

And of course, then, their assessment methods.

Does that feel parallel to other instances, like other technological shifts?

Lance Eaton: I think the strongest is the pandemic, because everybody felt that.

But I think there's a reasonable parallel shift in thoughtful conversations I've had with faculty over the years as they've transitioned to online learning. I've heard the faculty be like:

what does it mean to not be in a classroom?

So there is this big shift, and I remember this happened time and again: you'd work with a faculty member to develop an online course, and two things would happen.

One is they'd realize, because they've gotta make everything explicit, how much they had left out in their regular classes.

Priten Shah: Right.

Lance Eaton: They were like, oh, I've taken a bunch of this stuff and now that's part of my face-to-face class.

And then also how it's made them a better teacher.

All of this grappling, if this is the best AI we get, is still pushing us to think about how to do this better: how to assess better, how to figure out what's a closer assessment than just the five-paragraph essay, what is a more approximate demonstration of learning.

And that'll be hard.

But there's rich reward there.

Priten Shah: I think having us reevaluate the fundamentals of our practice is definitely one of the biggest advantages that AI has already brought, because we're having conversations like this, and that's a result of the technology.

Any other last hopes that you have for the technology, even in terms of what you hope the technology can make possible for us?

Lance Eaton: My hope would be we can start to understand where this might give us new answers.

One of the things, and I don't know that we'll solve it in higher ed or even K through 12, is we structure education for convenience, right?

So it's most convenient or efficient to have people gather together in these large chunks of time to ideally interact and learn and be part of a community.

And sometimes, or in some spaces, it's just an information dump; it's that banking model.

I am curious, what happens when we do have this AI companion, tutor, bot, whatever.

Is there a way that we can learn what the other models actually are, and how do we support that?

Because people's lives are so intense, and this is why many people take asynchronous courses.

But I am curious to find out: are there other ways of doing learning better that we can capitalize on?

Because right now we're living with a legacy structure of education and it works for some people.

And then we also know it doesn't for lots of other people.

So what happens if we find out, oh, actually we can figure out people's chronotype for education?

Like when is it ideal for you to be learning and in what dosage?

You know, is it 30 minutes a day in the morning?

That's probably good for me.

If I'm gonna have to learn something after 8:00 PM, I am done.

Good luck.

Like I'm up at 4:00 AM.

So I think that, for me, is it: I'm curious what this can help us figure out about that challenge.

Priten Shah: Yeah, and that would be impossible in a world without AI.

The reality is, even if we think that a teacher teaching you for 30 minutes at 4:00 AM would be ideal, the second best thing might be AI that actually does it, versus this hope that we figure out how to get humans to provide that level of customization and match the students' needs.

Thank you.

I appreciate talking to you today.

There's oftentimes a lot of pessimism, and then there's this blind optimism. And I think folks need to hear more from people who are kind of managing this in the middle.

Let's not just embrace the technology for technology's sake.

Let's think it through. I think just hearing someone talk through all of that provides some framework for you to think through it. Yeah.

Lance Eaton: Thank you.

Priten Shah: Thank you.

Lance reminds us that each wave of technological change brings uncertainty, opportunity, and the temptation to either resist entirely or adopt uncritically.

His call for small, thoughtful steps in AI implementation, grounded in pedagogy rather than external pressure, offers a roadmap for educators who want to move forward without losing sight of what matters.

Keep listening this season as we explore how to make those choices with intention and integrity, and pre-order my book, Ethical Ed Tech: How Educators Can Lead on Digital Safety & AI in K-12, for more support on thoughtful AI integration by visiting ethicaledtech.org.

Priten: Thanks for listening to Margin of Thought.

If this episode gave you something to think about, subscribe, rate, and review us.

Also, share it with someone who might be asking similar questions.

You can find the show notes, transcripts, and my newsletter at priten.org.

Until next time, keep making space for the questions that matter.