Make It Mindful: Insights for Global Learning is a podcast for globally minded educators who want deep, long-form conversations about how teaching and learning are changing — and what to do about it.
Hosted by former classroom teacher and Banyan Global Learning founder Seth Fleischauer, the show explores how people, cultures, technologies, cognitive processes, and school systems shape what happens in classrooms around the world. Each long-form episode looks closely at the conditions that help students and educators thrive — from executive functioning and identity development to virtual learning, multilingual education, global competence, and the rise of AI.
Seth talks with teachers, researchers, psychologists, and school leaders who look closely at how students understand themselves, build relationships, and develop the capacities that underlie deep learning — skills like perspective-taking, communication, and global competence that are essential for navigating an interconnected world. These conversations surface the kinds of cross-cultural experiences and hard-to-measure abilities that shape real achievement. Together, they consider how to integrate new technologies in ways that strengthen—not replace—the human center of learning.
The result is a set of ideas, stories, and practical strategies educators can apply to help students succeed in a complex and fast-changing world.
Seth (00:01.048)
Christian, welcome to the podcast.
Christian Pinedo (00:04.443)
Thanks Seth, yeah, I'm really excited to have a conversation today. We've chatted a little bit before, but I think this is a great chance to dive in a little bit deeper.
Seth (00:11.51)
Yeah. Yeah. So we met, obviously, at World Savvy. This is an organization that I really believe in; I'm on their West Coast advisory board. They teach global competencies, the idea that the only constant in the future is going to be change, and that change is rapidly increasing. And so the whole purpose of education is to prepare people for the future. How do we prepare them for a future with change in it? Through these global competencies. And they organized this event around AI. Obviously, AI is a big thing that people are talking about, and I've talked a lot about it on the podcast. One of the things that I really appreciated about your perspective on the panel was how grounded you were in what's really happening in AI today. But let's back up just a little bit. How did you end up at AIEDU? How did you find yourself here? People who are experts are just a little bit ahead of everyone else, because we're all kind of figuring this out together. But obviously you've thought about this and worked on this more than most people. So just tell us a little story about how you ended up here.
Christian Pinedo (01:19.299)
Yeah. Expert is a word I would use very lightly in this case. I truly believe that no one is an expert on AI. I think there are people that just get the opportunity to think a lot about it, and I for sure had the opportunity to think a lot about it. But yeah, to back up a little bit: my roots are always in the classroom. I was a classroom teacher for a while, and upon leaving the classroom, I kind of saw myself in this crossroads moment.
I was actually very interested in this concept of human-centered design at the time, something that comes out of IDEO and Stanford back in the 90s, and something that I even implemented in my classroom. I taught visual arts, so I did web design, and so I did a lot of human-centered design process teaching in what I was doing. At Stanford, there was this new research organization called the Institute for Human-Centered AI.
And I was really interested in this idea of what human-centeredness meant alongside something like artificial intelligence. This is 2019, 2020, so this was a very novel concept to me at the time. I actually had the chance to hear Fei-Fei Li, one of the co-founders of HAI, speak, and hear a little bit about how she thought about AI, the future of information, the future of
work, the future of the way that we might harness this technology. And I was kind of enamored. So I went to go work at HAI, and so my introduction to AI was through a research perspective. Here I am, someone from San Diego, moving up to the Bay Area, working at Stanford in the middle of Silicon Valley, surrounded by techies. I felt quite out of place because I was an educator.
I was kind of techie; I taught web design, so there were some things I was pretty aware of, but nowhere near the PhD students that were around me. And so my perspective on AI was always colored by my experience as an educator, and by me myself being a learner in that space, all the time. And then ChatGPT came along, and this rise of large language models started to create this dichotomy of
Christian Pinedo (03:36.077)
AI being a concept that was researched and a little bit more hypothetical. It wasn't quite real until LLMs came along. And once LLMs came forward, it became a very real thing. The conversation wasn't something that was just in conference rooms or in research labs; this was something that Good Morning America was talking about at some point. This was a very prevalent thing that people were thinking and talking about, especially educators
coming out of COVID, coming out of this very disruptive point in time in education already. And now large language models were coming on and propagating that disruption. And that's the point at which I decided to leave Stanford and join AIEDU. I felt that my work was going to be most effective in a very practitioner-centered space. Being in rooms with educators, being in conversation with administrators, I thought was going to be
the real change that needed to happen: a very grassroots perspective of change, but again, a very educator-informed perspective of what change should look like.
Seth (04:44.408)
So that's an awesome story. I love it when people are rooted in that research base and then can take that and apply it to the real world. That lends a different type of tolerance, of risk tolerance, right? Because you're looking for solutions that have evidence, versus just "this feels exciting."
And I think that we need a lot of that in this conversation. I am curious about the human-centeredness of it all; that was the main core of it. What did that mean at HAI, and what have you brought with you into your work at AIEDU, and with schools and districts across the country, in terms of that human-centeredness?
Christian Pinedo (05:35.355)
Yeah, HAI itself was a really interesting organization. First of all, they started in 2019, so in the history of Stanford, they're a pretty new organization within the university. They were extremely cross-sectional, meaning that because AI is such a broad topic, they felt it could not be something housed in the computer science department of the university. This was an organization that sat outside of a specific department and instead worked
across departments. We worked with professors in CS, in healthcare, in education, in law. And it was really a space where AI was thought of and researched in a very holistic way that I felt was way more practical and way more necessary in a world where technology was often siloed in a very technologist mindset. But what does it look like when you bring humanitarians, when you bring historians, when you bring
legal folks into the conversation? I think the kind of conversation you have gets a lot deeper. And so that was something I really, really valued from Stanford, and I continue to bring it into conversations with school systems: this idea that AI is not something to be siloed within a system. It's going to have such holistic effects on society, it's going to be so ubiquitous in the way that we see it take part in our real world,
that I think school systems should be thinking about this as well. I worried at the time that AI was going to be thought of as a CS thing: we'll just take all of this AI stuff and throw it into our CS department, and therefore our school district will be ready to teach about AI. Coming from Stanford, I knew that that was the wrong approach, and that there was going to be a need for a little bit more through-line thinking, a little bit more
diverse voices in those conversations. I wanted to see people see themselves as a part of the conversation rather than as outsiders of the conversation. And I think that that's still something that I'm hoping to continue to push for. We're still seeing a little bit of folks that feel like they're sitting on the sidelines and don't really know how to engage with the conversation. I'm hoping that through conversations like this, through media, through conversations we have with educators, people see themselves as part of that work.
Seth (08:00.184)
Huh. I really liked the point that with the arrival of LLMs, that was the moment where AI became real for a lot of people. We didn't really have a consumer-facing thing that everybody could play with. And so obviously that arrival right after the pandemic, in that moment for schools, was particularly ripe. And every teacher, when you talk to them, the word that always comes up
is cheating, right? And yet you have all of these other people thinking about, what are the implications? What can you use it for as a tool? What are the district-level policies that we should be putting in place? And you said that school systems should be thinking about this because it's ubiquitous, and then you were talking about making sure that diverse voices and through-line thinking are part of that process. Is that the point, simply to have the conversation
and make sure that the conversation is had in a way that centers diverse human voices within that set of stakeholders? Or are there other things that we're trying to align on in terms of best practices? And then, if I can add this for your research mind: how do you know it's a best practice versus just something that we're trying right now?
Christian Pinedo (09:30.947)
Yeah, I will honestly say that starting the conversation and making sure that diverse voices were in that conversation was probably the focus a couple of years ago. At that point in history, where people were grappling with change, grappling with understanding, I think that being a key goal was totally valid and necessary. We're several years away from that disruption point now.
I'm hoping that the goalposts are pushed forward a little bit more. Now, having the conversation is a starting point to a deeper need for systemic change. You mentioned this in your intro, talking about the work that World Savvy does: this need to make learning and the K-12 system relevant for a person's future. How do you turn conversations into best practices? Honestly, it's really hard to do.
And frankly, it oftentimes isn't thought of in parallel with AI. I find that a lot of AI conversations get caught up on tools, these tool-implementation talking points: hey, I see a problem, let's slap an AI tool on top of that problem and solve it this way. And that's as deep as conversations tend to get in a lot of districts that are starting this thinking. I think that that's not the right goalpost.
I think the work that needs to be done is not tool-focused; it's really systemic-change-focused. Honestly, it's hard. We've been talking about this kind of systemic change in school systems for decades. The idea that critical thinking is important is not a novel one. It's something that AI continues to spark more and more, but it's not novel at all. And so when thinking about best practices,
honestly, I don't think of best practices as something that should be siloed within an AI conversation. I think best practices should be thought of in the context of school transformation. What are we doing with K-12 systems? How are we preparing our teachers for a changing world? How are we preparing our students for a future that we can't even imagine for ourselves? Those are the kinds of best practices and solutions I would want to pull out.
Christian Pinedo (11:49.271)
If AI is an entry point to that conversation, fantastic. Let's start this conversation with AI. Let's talk about the way that AI might be affecting school systems. If you want to talk about how it can be used as a tool in the school system, that's great. But we want to deepen that conversation a little bit more: how does this actually feed into your overall approach? You as a school district, maybe even you as a state, how are you moving the goalpost of what K-12 education is doing?
Seth (12:14.936)
So is that your personal opinion, or is that something that AIEDU is in the work of doing? And the question there: you talked about AI being the access point to talking about some of these larger systemic changes that need to be made within schools, within districts, within the entire system. World Savvy, who we mentioned at the top there, they're doing this work one school at a time, right?
Is AIEDU also doing that work? Are you using AI as the tip of a spear to get in there and incite or inspire systemic change?
Christian Pinedo (12:53.242)
Absolutely. I think that's always been the ultimate goal for us, at least. We've never ever wanted to do something like build a tool. A goal of ours is not to get more teachers and students in front of a screen. An education system in which everybody's using LLMs is not automatically a more effective one. It might be a more efficient one, but more effective is, I think, a different question. The goal has always been to create a
world in which students are being prepared by K-12 systems. I'll bring this down to the ground a little bit. I think a good example of this is this conversation about cheating. This is oftentimes a poignant starting point for teachers. How can we as an organization have a conversation that actually addresses this very real and very relevant anxiety that teachers are having about cheating, and the prevalence of cheating using AI tools, and then
move the needle forward a little bit and get the conversation into assessment design? How should teachers be thinking about assessment design in a very digital world, in a world in which human-centered skills are probably going to be very important? How should teachers be thinking about assessments right now? Yes, this can address your cheating concern, but it also creates a situation in which we're having a conversation with teachers that makes that learning practice more relevant for a kid.
And so that's a good example of how AI can be the quote-unquote tip of the spear, but we're actually getting at the core practice of teaching and learning and what it's doing for kids.
Seth (14:31.192)
Yeah, as I'm hearing you talk about it, it's kind of what I do on this podcast. People want to talk about AI, they want to hear about AI, and then we use that to dive into the deeper underlying issues beneath it. And I'm hoping to go even a little bit deeper here. You brought up the example of cheating and assessment design, and how we should be assessing and which skills we should be assessing.
Christian Pinedo (14:37.082)
Yeah.
Seth (14:58.242)
This systems change that you're talking about: I think the one thing that we can all kind of agree on, and I think I mean literally everyone, is that it's not working very well and it needs to change, right? But what exactly isn't working well, and how to change that, that's where you run into some different opinions. And so I'm wondering, what is the opinion of
AIEDU, or of you personally; if those are the same, you can express them that way. What is not working and what needs to change? And I know that's a two-hour conversation, but is there a general philosophy you're subscribing to?
Christian Pinedo (15:38.861)
Yeah.
Christian Pinedo (15:43.611)
Yeah, and honestly, if I had an easy fix or an easy answer to this question, I would not just be talking about it, I would be doing it. But we do have a philosophy, or a theory of change, both at AIEDU and myself personally. I'll start broad and then focus in a little bit more. The broad point here is that this isn't novel. The idea that education,
Seth (15:48.991)
Hahaha
Christian Pinedo (16:09.178)
in the way that it's designed, the way that we're measuring the effectiveness of education, is wrong. We've had problems with standardized testing for decades now. We've known about something like durable skills; that conversation happened in the early 90s, right? These things that we see as necessary to be a part of education, and that are maybe anchor points of the change that we want to see in these systems, that's not very new.
And so that's kind of the broader theme here. I think the call to action that AIEDU sets for itself is that AI is right now a very disruptive force, and we can use this disruptive force to approach change in a way that's going to be sustainable for school systems. We, like many other organizations, have published a framework for what we mean by this: what are the skills and competencies that students need in a world
where AI is ubiquitous? We've published that, but we know that that's not enough. Systems change requires more than just student competency frameworks. There are so many of those that exist. They're very, very common, and honestly, they're very similar to one another. Almost every competency framework you look at is going to emphasize something like critical thinking. It's going to emphasize human-centered skills: communication,
collaboration, creative thinking. When I was a teacher, I remember going to PD sessions that called this the 4C framework. But there's so many other, yeah, there's so many examples.
Seth (17:40.234)
21st century skills. We're like a quarter of the way through the 21st century and we haven't begun to approach how to teach those four things well.
Christian Pinedo (17:49.935)
Yeah, again, this isn't super novel. But what we've tried to do is make it as practical as possible. So yes, we've published a student competency framework. We've also published a teacher competency framework: what skills should teachers be building in this world? How should teaching and learning be addressed in a very AI-driven world? And how should PD be thought of as an effective mechanism of change there? Take it a step further:
how should school sites be thinking about this? If I'm a principal managing a series of teachers, how should I be thinking about the kinds of goals I should be setting for myself? How do I do something like policy enactment or community engagement with regard to all of this work? And then a level further: how should school districts be thinking about this holistically? What kind of
human capital should I have within my district system? How should I be setting policies? How should I be thinking about the way that I engage with my board? I think all of these things need to ladder up in a way that feels holistic and actionable. And so the goal here is not to just put out another competency framework. The goal here is to say: these are the things that we want students to have, these are the things that we want teachers to have, and here's
how we enact those changes within school systems. Here are a couple of those goalposts that we have in mind for you. And we've published this as a holistic AI readiness framework. The idea here is that readiness truly is the goal. We want school systems to be ready for the kind of change that we're going to see through technology, not just today, but in five or ten years. It's really impossible to predict what technology is going to look like in 30 years, but ideally,
you're going to be able to see the same kinds of themes pulled out. Human-centered thinking and design are going to be emphasized, and change management is going to be a necessary component of it all.
Seth (19:46.265)
Yeah, change management is huge and important, especially given the shifting winds in the world and the way that things seem to be disrupted all over right now, not just by AI. You used the words sustainable change, and change management obviously is a huge part of that, right? You can have a great framework, and you can come in and deliver it in the wrong way, and it can have zero impact, versus
really trying to consider: who are all the people who are impacted by this? What are the different ways in which we can communicate this information? And how are we going to know if we've been successful? You know, kind of approaching things the way that a classroom teacher might. They're trying to deliver information, they have a bunch of different kids who learn in a bunch of different ways, and there might be some different ways that they show that they've learned something, right? You just kind of expand that out to a larger scale.
But sustainable change: as an outside organization promoting sustainable change within an organization, what kind of wisdom are you imparting? Is it just about the frameworks? Is there hand-holding, an extended hand-holding that happens? Because sustainable change feels like something that happens internally, right? Or internally from
within the system, right? Even if it's on a broader scale, a scaled-up version of sustainable change, it still happens within the system, right? So given AIEDU's call to action being that disruptive approach to sustainable change, how do you support the sustainability of the change you're looking to inspire?
Christian Pinedo (21:32.387)
Yeah, I think the answer to that question is probably two sides of the same coin. It requires a ton of bottom-up work, meaning a lot of times we are partnering directly with school districts. We are doing a little bit of that direct engagement with districts. We want to hear from teachers, from students, from parents, from administrators, and help them establish and codify their thinking around all of this,
maybe even putting out a roadmap for how that district should approach change. How do things like teacher PD, policy, board engagement, and parent education all ladder up to the overall goal that we're setting here for ourselves? We do a lot of that work hand in hand with school districts every day, and that's probably more of a grassroots approach. Concurrently, you have to have a little bit of a grasstops approach as well: working with state agencies, working with
state governments, with governors' offices, looking at kind of the boring things, like the budgetary allotments that your school system has. How much is the state actually investing in teacher PD? How much is the state actually investing in revision processes for curriculum redesign? All of that needs just as much attention as the grassroots work. And so both have to happen concurrently.
We have different teams doing this work, of course. But what we know for sure is two things. One, you don't do grasstops change without grassroots change happening first. We want to make sure that if we are having conversations with a state government or a state agency, we're doing so because we work with so many school districts within that state. How can we have
partnerships with at least 10, 12, 15 school districts across California before we build our own context for how California as a system should change? So that's one thing: grassroots is step number one, grasstops is step number two. And then the other thing is depth, not breadth. We're a small nonprofit organization. We're not going to be able to do this across the country with every single state, with every single district.
Christian Pinedo (23:49.115)
We'd much rather invest in depth of impact than breadth of impact, and that's how you make sure that changes are sustainable. Our district partnerships are three, four, or five years old, and there's continued work. When we do teacher PD, we're not doing a one-hour virtual workshop with educators. We're doing a 10-week fellowship with teachers, in which they are paid a stipend to take part, in which they're having deep learning experiences
Seth (24:14.126)
Hmm.
Christian Pinedo (24:18.178)
in a very community-centered way. That takes a lot more investment of time, energy, and resources, but it's how you have sustainable change. So that theory applies just as much to teacher PD as to district strategy work, just as much as it does to state-level work as well.
Seth (24:36.526)
Yeah. And it speaks to the necessity of both of them, the grassroots and the grasstops. You can't have a 10-week PD engagement where teachers are paid a stipend without getting support from whoever's creating the funding, right? You mentioned the deep learning experiences along the way in that 10-week engagement. Can you give us an example of what some of those look like?
Christian Pinedo (25:00.89)
Yeah. We just wrapped up several of our cohorts. We call them Trailblazers, like teacher trailblazers. We started this program in 2023, I believe, so it's been a couple of years in the process. I think in 2025, we've had over 200 teachers take part in this, and we've experimented a lot with the format. In some cases we've had
Seth (25:07.342)
Hey, I'm up in Portland. I'm down with the Blazers.
Christian Pinedo (25:28.346)
a three-month program. In some cases, it was a 10-week program. In some cases, it was fully virtual. In some, it was a combination of a couple of in-person sessions and a couple of virtual sessions. But the overall goals remain the same. I mean, research shows that a one-hour PD workshop for teachers is extremely ineffective at actually having an impact on teaching and learning practices. What does have an effect is
deep, scaffolded learning, something that you design from the ground up as a scaffolded experience, and making sure that teacher-to-teacher relationships are a key component of that learning. Making sure that teachers feel comfortable enough to experiment a little bit, to try things, and to share the feedback they have with fellow teachers. Teachers are going to be in different places on their learning journey.
So you want them to communicate with one another. How can a teacher that's further along the learning journey think of themselves as a coach to other teachers that are maybe earlier in that journey? If you're an educator, this might sound very familiar in terms of how you manage classrooms and students. It's the same kind of thing. We know what good pedagogy is. We know what learning should look like. Why not apply that to teachers as well? And so that's what a lot of that fellowship is designed around.
Seth (26:44.236)
And then what kind of problems are they tackling? Is it the same things, like, I think my students are cheating, I don't know if they're cheating, do I approach this with a carrot or a stick? Is it the real-world application that you're grappling with?
Christian Pinedo (27:00.802)
It's a lot deeper than that. It's extremely informed by the teachers themselves. Some teachers are working in a community in which all their students are from Indigenous communities, and so they're really anchored on this theory of data sovereignty, and how AI might infringe on that sovereignty, which
Indigenous people feel is of very high value to their community. Teachers are designing learning experiences in their classrooms that address this for their students, to help students build their own understanding of the modern world of AI, big data, and social media, and what this means for your data sovereignty as a young person in this world. Teachers are designing that in these cohorts, and we're just coming in and helping them design it. We're giving them a platform to do it, but they're designing solutions that they want to see for their communities.
And there are other teachers in Colorado, and this was in 2023 and 2024, who saw that rural communities in Colorado had very low voter turnout rates. A civics teacher wanted to have students use AI to build a website designed to help people get registered to vote, especially in rural communities.
And so they designed this whole project-based learning lesson plan in which students were actually engaged in this conversation, where they were forming project teams. You had a designer, you had a coder, you had a researcher, and students went through that human-centered design process. They put together this project with the explicit goal of building a platform that would drive higher voter registration rates in their counties, because that was a problem that the teacher saw in their community.
And so there isn't a one-size-fits-all thing that we want all teachers to do. The goal here is to give them the foundational understanding. So yes, there is going to be a little bit of what is AI, what is bias, how does data tie into all of these things. But we give them that foundational understanding and allow the teachers to apply that learning to their real world. Again, good pedagogy: you want to go a little bit higher on Bloom's taxonomy. You really want...
Seth (29:12.024)
We know what it looks like.
Christian Pinedo (29:16.12)
teachers to not just understand, but to apply that understanding to what they're doing every day.
Seth (29:20.546)
Yeah, it sounds amazing. I want to take you through a quick little lightning round, if that's okay. So, you know, not short, but shorter answers. Can you recommend a piece of media? Anything that you've been reading or watching or listening to, whatever.
Christian Pinedo (29:27.29)
Okay.
Christian Pinedo (29:38.875)
Okay, books. I'm going to start with books. I'm a relatively big reader. In 2025, I was trying to push myself to be a better reader, and I think I did an okay job at that. Two really struck me. One is an older sci-fi book called Neuromancer, written by William Gibson. It's probably one of the earlier references to AI; I think this is when AI was truly a sci-fi thing.
Seth (29:55.916)
Yeah, I read that in college.
Seth (30:06.434)
Yeah.
Christian Pinedo (30:06.852)
There's a quote in there that I use oftentimes, that I think is really relevant today still. The quote is: the future is already here, it just isn't evenly distributed yet. I think it's so relevant for AI, this idea of access and equity; this is not a novel problem that society has. So Neuromancer is a book I think is really, really interesting. Another book that's maybe a little bit more immediately relevant to this conversation is Nexus.
Seth (30:18.126)
That's awesome.
Christian Pinedo (30:37.114)
I think there's a longer title to it, but the main title is called Nexus. It generally talks about the history of how civilization has kind of been stacked on informational systems. And only recently has there been a non-human actor engaging on those information systems and how that might affect society more broadly. So it's a little bit more of a nerdy book I'd recommend, but still fantastic. And I'm going to also...
use your podcast as an opportunity to plug our podcast. We just finished a ten-episode podcast series focused on raising kids in the age of AI. So it's focused on parents and the considerations and questions that parents have right now about what it looks like to raise a kid in an age where we're beyond the social media issue; the AI issue is only compounding those existing problems. I think it's an interesting dialogue that honestly is necessary for a lot of parents. So I'll plug that one as well.
Seth (31:33.386)
And it features a child psychologist, in addition to some Silicon Valley types, in addition to educators. I've listened to a few episodes and really enjoyed what you guys are doing there, so thank you. What is something you are currently rethinking?
Christian Pinedo (31:44.6)
I appreciate that.
Christian Pinedo (31:51.353)
Hmm.
Honestly, I'm maybe a week into rethinking this. So forgive me if my thinking is a little bit juvenile, but this is fresh. I've honestly had to face my own AI use and how I'm using it. This seems silly, because I've talked about it so much and it's such a relevant conversation that we have for students and teachers, but I just never connected it to myself. I've...
Seth (31:58.287)
Perfect.
Christian Pinedo (32:22.104)
been really bad about cognitive offloading by using AI. Meaning, historically I've used AI a lot to learn. I'm a very curious person by nature, and so I will often ask AI, hey, what's the history of this building that I'm looking at right here? And it'll give me a ton of information, and I'll ask questions about it. And I realized that that's a very easy way to learn, and it triggers all those reward systems right away, because I get the information that I want right away.
But my retention of the information is so bad. I don't remember a lot of the things that AI teaches me. And so I've been trying to get better at not leaning on AI too much as just a learning platform. I do still use AI a lot for applying my learning. Like, I'm trying to build a website for a wedding that I'm planning right now, and I'm using AI to do that. There are still a lot of really interesting ways to apply my skills using AI, but the
very, very bottom of that Bloom's taxonomy triangle, understanding and remembering information, is maybe not the best place for AI. The top half of that triangle I'm still trying to use AI for. But I'm rethinking the instances in which I use it.
Seth (33:36.428)
Interesting. Do you have any inkling of why? Is it that it's harder to remember, or is it the setting you're in? Is it that you're trying to learn out in the real world? Why do you think you're not retaining this information?
Christian Pinedo (33:51.951)
AI is really good at giving you a lot of information at once, and it's really good at synthesizing information into, like, five concise bullet points. The reality is that the learning and the interests I have live in the nooks and crannies that aren't those five bullet points, right? Especially when it comes to history. I love learning all the context of the actions that have happened. I love learning about all of the interesting things.
Seth (34:10.51)
Yeah.
Christian Pinedo (34:21.932)
And so I think, for me at least, AI is not really good at aligning learning with the applicable resources that I need, and the maybe-interesting things. And maybe it's good to wonder. You don't need to know everything, and sitting in that discomfort of wondering something is something I need to get better at.
Seth (34:34.734)
Yeah.
Seth (34:42.926)
Yeah. Yeah. I had a guest named Carly Delo from Michigan Virtual. They're doing some awesome work around AI with that state. Oh, awesome. And she talked about desirable struggle. I think that was the term. I might not be quite getting it right, but the idea that learning requires a certain amount of work.
Christian Pinedo (34:49.593)
I love Carly, yeah.
Seth (35:03.65)
And if you're not doing that work, you're not going to learn. That was one of the things I was thinking about as you were talking: it's just so easy. You're just like, hey, tell me about this building I'm sitting in front of. The fact that it is so accessible, which is obviously one of its appeals, might also be the reason why we're having a hard time holding onto it, right?
Christian Pinedo (35:24.28)
Yeah, I think you need to challenge what the difference is between information and knowledge. If information is super accessible, then maybe you need to think about how you value knowledge and the way you want to approach knowledge. Throughout my entire life, I've loved information. I've loved knowing things right away. Even as a kid, my parents had an encyclopedia set, and I used that encyclopedia all the time because I wanted to know what Mount Vesuvius was and why it's such a commonly used reference in conversation. So I used the encyclopedia to learn that. Information was accessible to me. Yes, it took a little more work than it does now, but it was still really accessible to me. And it's only become more accessible and more frictionless as time has gone by. Honestly, that friction is fun. Beauty lies in that friction. So maybe let yourself sit in that friction a little bit more, and let your curiosity fester a little bit before you give it a solution.
Seth (36:29.944)
Well said. My last question is: do you have a question for me? We don't know each other very well, so it might be for me, or it might just be for the everyman. What is a thing that you're curious about, that you're pursuing right now, that you are looking for perspectives on?
Christian Pinedo (36:49.272)
Not necessarily something I'm pursuing, but something I always feel like I'm not as plugged into: I'm very plugged into the state of AI and education and the work that needs to happen in the US. I'm less plugged into international instances of this kind of innovation. Are there any really cool or fun examples that you've had conversations around, or that you've engaged in through your work, from other countries that you think maybe we can learn from?
Seth (37:20.27)
Yeah, that's a great question. I would say a lot of it sounds the same as the dialogue that's happening in the United States. A lot of it's around cheating, cognitive bypass, following guidelines that are maybe not specific enough to really know how to follow. A lot of the school-based conversations are similar to what is happening in the US. The one insight I would add is that there is a special consideration for people who are using this in what is not their home language, and what is often the target language they are trying to learn, which is English. A lot of the same principles that apply to students in an English class in America might apply, but there is even more work that AI can do for students, work that prevents an English learner from actually being able to do what they need to do in order to acquire the language. And so where we ended up, in advising international schools on their English programs, is an even more restrictive use of AI than a lot of other schools might conclude is tolerable for them if they're not trying to learn the language.
Christian Pinedo (38:48.472)
Yeah, no, I think that's so relevant. When we talk about making learning relevant for a kid's future, I think cultural relevancy is equally as important as intellectual relevancy, right? I come across these conversations in a lot of Indigenous classrooms, and even in work that we've done in the past with Puerto Rico, in schools that are technically supported by our US education system but are living within such different contexts: cultural contexts, language contexts. The way they're thinking about AI is so different from how state governments are thinking about AI. But yeah, I would be so curious to dive into that in international contexts.
Seth (39:30.498)
Yeah. You bring up your Indigenous students and the story of data privacy, or data management. I think there is less of an assumption of privacy in a lot of other countries; that is a fairly American thing. And, you know, even in a place like China, they have digital surveillance everywhere, right? So the idea that you would need to be extra careful about the type of thing that you're putting into AI doesn't have the same kind of weight as the conversation does in the US, I think, or in Western countries, I should say.
Christian Pinedo (40:11.106)
Yeah. Yeah, that makes a lot of sense. Because I feel that value a lot in US and EU regulatory conversations, but not so much in places like South America. My family's from South America, and so I've recently had a lot of conversations with folks in Peru about this. That's not necessarily one of the values they're bringing forth in policy conversations at all. A lot of it is more focused on access and equity and resources. Like, hey, you can take whatever data you want, as long as I have access to prosperity in the same way that you have access to prosperity. That's a much higher value, and I think we take it for granted here a little bit.
Seth (40:45.78)
Mm, yeah.
Seth (40:53.026)
Yeah. Yeah. Another thing I'm thinking of is the idea that, in some ways, AI is collective knowledge. And so in a collectivist society, that might be seen as carrying more truth than in an individualist society, where we're more obsessed with individual points of view. Yeah.
Christian Pinedo (41:01.828)
Yeah.
Christian Pinedo (41:10.874)
Yeah.
Christian Pinedo (41:16.962)
Yeah. Yeah, that makes a lot of sense.
Seth (41:22.23)
I'm hoping you can leave us with, like, one thing you could say to the different stakeholders within all this. You've got students, you've got teachers, you've got district people, you've got policymakers. Maybe you can do all four if you want to, or you can just choose one. What is the most important thing from all of your conversations, the thing that is really burning inside you, that you want to communicate to people about this?
Christian Pinedo (41:53.892)
I'm gonna try and go for the gold here and say this, and I think this applies to all stakeholders: push yourself to realize that these conversations are not about the technology. These conversations are not about AI and how AI as a tool or as a technology can help you. These conversations are about using AI as a bandaid to fill a hole that we have. Ask yourself, why does that hole exist to begin with? And what are some other ways that we can fill that hole?
Seth (42:24.014)
I love that. Well said. Well, Christian, thank you so much for being here. My very last question is: where can our listeners find you on the internet, if you indeed want to be found? And of course, you mentioned several things here, including the aiEDU podcast, which I'll put in the show notes.
Christian Pinedo (42:24.932)
Thank you.
Christian Pinedo (42:36.666)
Cheers.
Christian Pinedo (42:43.226)
Yeah, we have a podcast in which we bring in a diversity of voices. I think in the next couple of weeks, you're going to hear from a researcher, you're going to hear from high school students on a debate team and how they might use AI for debate, and you're going to hear from policymakers. So just diverse perspectives on the AI and education field, so to speak. But more broadly, I think aiEDU and our work is not done in a silo. We work a lot with partners like World Savvy, and a ton of organizations with whom I think we share a unified vision of making sure that education is relevant for folks and their futures. So please, please, please feel free to reach out to us at aiedu.org if you feel like this is the kind of conversation, or the kind of work, that you want to engage in.
Seth (43:33.839)
Awesome. Well, thank you so much again.