EdTech Startup Showcase

In this exciting and idea-packed episode, we catch up with Priten Shah and Aatash Parikh, two of our edtech founders who are also innovative thinkers on artificial intelligence and its potential for schools. 

Hear about:
  • How we understand what technologies are truly “AI”
  • What we’re still waiting to learn about what AI technologies can and will become
  • What it means to become “AI literate”
  • Helping to solve persistent problems in education
  • Appropriately embracing new technologies in schools
  • How students are responding to new AI tools in their classrooms
  • The most important advances schools should make with their use of AI in the next year

About today’s guests
Aatash Parikh, founder of Inkwire, is an educator and an engineer. Aatash has worked as a software engineer at companies like Google & Khan Academy and as a middle school teacher in Oakland Unified School District. Connect on Twitter or LinkedIn

Priten Shah is the CEO of Pedagogy.Cloud, which provides innovative technology solutions to help educators navigate global challenges in a rapidly evolving world. He is also the author of Wiley’s Jossey-Bass publication, AI & The Future of Education: Teaching in the Age of Artificial Intelligence. He and his team are currently focused on helping educational organizations adapt to the ever-growing capabilities of AI. Connect with Priten on Twitter or LinkedIn

About today’s host
Ross Romano is a co-founder of the BE Podcast Network and CEO of September Strategies, a coaching and consulting firm that helps organizations and high-performing leaders in the K-12 education industry communicate their vision and make strategic decisions that lead to long-term success. Connect on Twitter @RossBRomano or https://www.linkedin.com/in/rossromano 

Listen to his other shows:
  • The Authority — deep-dive interviews with the leading minds in leadership, management, education, and personal development.
  • Sideline Sessions — conversations with successful coaches and performance experts from professional, collegiate, and youth sports. 

Creators & Guests

Host
Ross Romano
Co-founder & podcaster @BePodcastNet | CEO & edtech advisor @SeptemberStrat
Guest
Aatash Parikh
Founder of @inkwireco, where students can build their body of work and showcase their strengths. Building tech to empower K-12 students, teachers, and schools.
Guest
Inkwire
Inkwire empowers students to build and share their body of work
Guest
Pedagogy.Cloud | AI in Education
We’re educators navigating the next era of teaching & learning. 🎤 Webinars 📚 PD Options 🧰 Class Tools 💌 Follow us for tips & tricks for your classroom!
Guest
Priten Shah
CEO/Founder @united4sc & CEO @pedagogycloud | M.Ed. @hgse & B.A. @harvard

What is EdTech Startup Showcase?

This series shines a spotlight on the innovative edtech companies working to make a difference for students and educators.

Through conversations with the founders and CEOs, partner organizations, and the educators who are using their products in schools today, listeners hear about solutions relevant to their chronic challenges and opportunity-expanding ideas that go in uncharted directions.

Ross Romano: [00:00:00] Welcome in, everybody, to the EdTech Startup Showcase here on the BE Podcast Network. Thanks for being with us for another episode and for a run of episodes in which we're going to be exploring a variety of the critical topics in education, revisiting with our founders and kind of tapping into the things that sort of combine and integrate our efforts here in education.

So I'm going to be the host for this series of episodes. I'm Ross Romano, co-founder of the BE Podcast Network, and I host the shows The Authority and Sideline Sessions. And today I'm joined by Priten Shah from Pedagogy.Cloud and Aatash Parikh from Inkwire. Guys, welcome to the show.

Priten Shah: Thanks for having me.

Aatash Parikh: Yeah, thanks for having us.[00:01:00]

Ross Romano: So, the topic today is AI, right? And we're going to look at it from a variety of angles, but it's relevant to the work both of you are doing with your companies, what you're doing to support educators as they're getting more familiar and hopefully comfortable and effective with these technologies.

Priten, I wanted to start with you, and kind of get from you the definition of what we really mean when we're talking AI. It's such a big topic, it's been discussed so much, but I think a lot of people who aren't intimately involved with it, a lot of individual teachers and educators, right, are still trying to wrap their heads around what exactly that means.

And it's not getting any easier as it's become, I think, incentivized, right, for a lot of startups in the technology space, regardless of what their tech does, to say that it incorporates AI, and it may in some form or fashion, but not necessarily in an innovative way. So I'd just love to hear from you: what's the simple definition of this?

What do we mean as a baseline [00:02:00] for when we're talking about AI and AI bringing something new to what we can do in education?

Priten Shah: Yeah, so we'll start with the broad-level definition. The way we like to explain it is that AI is the field of trying to get computers to do human-like cognitive things. And that's a wide variety of things. And so that can mean, like, mimic human vision, mimic human auditory skills, mimic human language skills.

And also just mimic, like, the human ability to classify things, right? There's all kinds of human cognitive abilities that we have, and various subsets of AI have kind of focused on doing a variety of those over the last few decades. Recently, and this is what we'll probably talk about the most, what everybody's kind of hearing about the most in the media and definitely what's at the forefront of a lot of the talk in education, is generative AI.

And generative AI is a particular subset of AI that's actually trained to produce new things, and that's the most important thing for educators to grasp, is that this AI doesn't just take what you have and find the most relevant thing [00:03:00] or classify it or recommend something else from the data set; it actually produces things that look like its current data set.

And so, if it has a bunch of pictures of cats, it's not just finding you the best picture of a cat, it's making a brand new image of a cat based on the patterns it's understood about what a cat looks like from that data set. And that's where the power of it comes from, but also a lot of the fears of it come from.

Ross Romano: Is it important to, currently, and we don't, you know, it's fast developing, so even by the time people are listening to this in a week or two, it might be a little bit different, but there's certain limitations that currently exist. How important is it, I guess, to focus on and understand what the limitations are, in your perspective?

Versus focusing more exclusively on what we can do, what we're trying to do. Not necessarily limitations or risks from a, you know, safety and [00:04:00] security standpoint, but just the limitations of, okay, generative AI can do this, this, and this; it can't really do this and this right now. But I'm curious, you know, your thoughts on how much brain space we should spend on those limitations as a user, not as a creator.

Priten Shah: Yeah, I think those limitations are really important for us, especially when we're thinking about the field of education. I think thinking about what the difference is between something mimicking human thinking and doing human thinking is a really important thing that folks just need to grasp in order to kind of know how much trust to place in these systems, how much fact-checking to do.

How much we rely on it, how much we directly expose students to it, all of that kind of does require us to fundamentally understand what is happening when we see these outputs that look really human and look like they might be true and look like they might come from a product of thought and cognition. And that's the first limitation I think we should acknowledge: they're not. There's no actual thinking taking place.

Current technology [00:05:00] is not at the point where there is actual active thinking happening. It is a language production model, and that says a lot about human language, that we can mimic human thinking so well using just a word engine, but it doesn't say much about the computer's ability to think yet.

So that's definitely a limitation that we want to keep in mind. The other one is the bias in the data set, right? So, no matter what folks are trying to do to safeguard and come up with, like, post-training intervention methods to reduce the bias in the outputs, that's still a significant limitation.

The data itself is human data for the most part. Even the, like, algorithms that are trained on AI data are still, like, pre-trained on human data first. And so all those biases that exist in all our generative outputs, whether it be images or text or audio, do all get replicated in some fashion or another.

And so, there's great work being done as to how we can minimize it, how we can reduce it, but that's a limitation that I think folks need to understand, that it's coming from the fact that it is being trained on human data, and that's inherently flawed right now.

Ross Romano: [00:06:00] Yeah, yeah. Aatash, when it comes to student work, right, assignments, evaluations, the skills we're trying to develop, I think thinking about where the limitations are of the AI technologies is an interesting way to look at that, kind of being the lower bound at which we would think about what we want to be assigning our students to do, in a sense.

In other words, there's been a lot of emphasis in the media and, you know, from some tech companies on being able to detect work that was generated by AI, right? Okay, we have this plagiarism detection, and we can tell if a student handed in an essay that was written by ChatGPT. And if it's work that could be completely performed by a technology tool, then it's probably not meaningful work and meaningful toward the skills of the future, [00:07:00] right?

And so students can understand that and they can look at it and say, well, why do I need to know how to do this? I can use this tool for it. So it's a good opportunity to rethink what should those assignments look like when we're looking at a student's portfolio of skills and abilities and capabilities.

Are we asking them to do things that are uniquely human, right? And that are creative? But how do you look at that, and how are you sort of working with schools and investigating that, even in their just redefinition of what are meaningful skills and meaningful ways of demonstrating knowledge and skill development?

Aatash Parikh: Yeah, absolutely. It's interesting because one of the things that you mentioned was creativity, and it's pretty incredible how creative some of these generative AI tools are. And so, you know, that's something that we've typically associated with a human, you know, as a human-only kind of skill set: you know, [00:08:00] creating new art, creating poetry. And it's pretty incredible what ChatGPT and what generative AI can do on that front.

So I think there's a lot of learning that we as educators need to do to actually understand what it's capable of, but I a hundred percent agree, it raises the bar for, you know, for what kind of teaching and learning I think is going to be needed. Not just from an originality perspective, right?

I think that's where a lot of folks' concerns are: around the originality of student work, the plagiarism detection, cheating, like you mentioned. So, A, it's like, what is the type of work that's not that easy to just outsource to ChatGPT, and how do we raise the bar from that perspective? But secondly, you know, thinking about the way the workplace is changing.

Working professionals are leveraging these tools as part of their day-to-day tasks. So how are we preparing [00:09:00] students for a world in which these tools exist, and these tools are going to automate, you know, certain types of jobs and certain types of tasks that are currently present in the workforce?

So, how are we preparing students for a future in which that exists, and what are those skills that we need to be teaching? I think there's a lot of kind of important discussion that needs to happen around that. It's not straightforward, you know.

Ross Romano: Yeah. What comes to your mind when we think about what we do not yet know, I think, about what these technologies can and will become? Are there questions that you still would like to have answered about what will become possible in five years, in ten years?

Aatash Parikh: I do, I wonder what will be possible in the next 5, 10 years, and it's hard for me to pinpoint what specifically those questions are and what [00:10:00] to even look out for, just the way that it's been changing so quickly, right? You know, I think all the new developments in the last year or two have kind of exploded everyone's idea of what's possible.

And I imagine it's going to keep doing that for the next several years. So I think the biggest thing is really just staying open-minded and knowing that it's going to change rapidly. And kind of understanding, like Priten said, what are the limitations that we know are, you know, inherent in the way this technology is developed, with the large language models and the language prediction.

But, you know, there are a lot of other limitations I think will be kind of surpassed pretty quickly. And so just being ready for what that could look like.

Ross Romano: Is there anything on your radar as far as a problem you're targeting, you know, something you would like to see solved, become a thing of the past? Maybe something that's already kind of being addressed by new technologies, but they're not at [00:11:00] scale yet or, you know, fully capable yet of eliminating it, or something new that's out there.

As far as, okay, this would be meaningful innovation. This is what's potentially out there.

Priten Shah: Yeah, I think I'm in a similar boat where I have no doubt that things are going to keep developing over the next five to 10 years, but to pinpoint what I'm even looking for right now feels difficult. And that's interesting. I'm having trouble figuring out, like, what the things are. There's things I'm afraid of.

And those things are just, like, what kinds of tools are built, who has access to them, what kinds of, like, control do we keep over those systems? The things I'm, you know, the problem with the optimism part is I think a lot of that optimism is about existing technology. So I don't know how much more.

Like, we have so much more we can do with what is already out there in terms of the actual models that exist in the education world that I don't know if I need, like, the next, you know, GPT-8 in order to fully dream up the next pictures. And so, I'm hoping that the next few years we spend time [00:12:00] figuring out what the full capabilities of the technology are that we've already achieved and find innovative ways to implement those.

And, like, one thing that I've been, you know, I'm hoping some EdTech person out there who is listening and wants to take this on as their project, like, takes this on, is that I really want, like, immersion language learning headphones. Like, I want someone out there to build those where, like, while you're listening to, like, your non-native language, it kind of filters in and out your native language and your target language based on, like, what words you've already been exposed to, what grammar structures you do know and don't know.

And that's all doable. Like, we have the technology for that now, right? Like, it exists in, like, the lab. It's a matter of, like, building that particular tool out and making it accessible. And I think that's gonna be a key theme: like, a lot of this stuff can be done somewhere at some point right now.

It's about how much of this can we do at scale in our schools.

Ross Romano: And I imagine a lot of that is reliant upon the, you know, the creativity, the critical thinking, the ability of the user [00:13:00] to target the right challenges, right? And to not necessarily just say, what's the baseline usage of this tech, and I'm just going to do that and kind of stick with that. But to be thinking in ways that, I mean, because one, yes, companies

may identify certain things that can be done with the tools, and they may explain that, provide some training on it, etc. But you still have to have a, you know, a willing partner in that, right, who's actually going to follow through with that. But also there's a variety of different perspectives and challenges that each individual may notice, if it's, let's say, a teacher or a student or anybody in any other line of work, that might be thinking, you know what, this would be really interesting.

You know, one of the things that came up in some conversations I had with some founders who were working on some, [00:14:00] you know, generative tools, some large language models, was, you know, it would be really interesting to find a way to use this to take a general curriculum and be able to adapt it to be culturally responsive in a bunch of different ways, and not have to necessarily have, you know, an entire team dedicated to adapting that, and they can only do one thing at a time, right?

So we're taking this and we're making it responsive to, you know, English learners, and then we're adapting it to another demographic, right? But to say, okay, if we know what the curriculum is supposed to be in math, as far as the math skills and all of that, how do we adapt that in more real time to say, okay, we have a diverse school here, a diverse classroom? Now, that would take development, but it [00:15:00] requires somebody first

to kind of say, hmm, you know what, what are the challenges I'm having with the student body here? And if we have two kids that are from one place, you know, do we actually, there's no way right now that they're going to be able to allocate the resources to adjust to that very small percentage of the student population, and yet technology could change that, right?

Are there things, Aatash, that you're looking at just in your discussions with educators, right, about the things that they're most interested in? You know, either the aspects of what, again, they're already using some technologies to adapt to that are becoming more and more prevalent, or the questions they're still asking about, hmm, can we push it a little further here?

Or you know what? Oh, now that I see how it can do this, it's making me think, what if we could do [00:16:00] that?

You're muted.

Aatash Parikh: One of the partnerships that we have with educators is with the High Tech High Graduate School of Education, and specifically their professional learning team. They work with educators across the world to design more personal, culturally relevant, project-based curriculum, right, and it kind of does speak to some of the curriculum pieces that you were just referring to.

And they invest a lot of time in really coaching teachers through this process and supporting them with that curriculum design, and they've built some really good frameworks around how to do this. And so, we were excited to partner with them. Specifically, there's a couple folks there, Kaylee Frederick and Nuvia Rulin, who developed what's called the Kaleidoscope for Deeper Learning, and it's kind of a framework to really help

educators think through what makes an engaging learning [00:17:00] experience. It's a pretty intensive process, with a lot of mindset shifts, a lot of kind of inputs that need to come into that design process for teachers. And so what we're working with them on is leveraging AI to help the teacher actually brainstorm different elements of that framework.

What they're really excited about is how much more quickly they can kind of go through that process from a professional learning perspective. And then from the teacher's side, you know, leveraging the AI to actually develop unit plans, scope and sequence, and really think through the calendar of how they're going to do this work with students.

You know, it's a pretty intensive, time-consuming process, and so they're excited about how much time can be saved to allow them to then, you know, actually do some of the pieces that only they could do, around connecting with local partners, thinking about needs of specific students, and [00:18:00] then actually facilitating that learning.

Ross Romano: You know, given the rapid evolution of technologies, what does it mean to you for an educator to be AI literate? Does that mean committing to continuous learning? Does that mean there's certain baseline literacies that they should have?

Aatash Parikh: I think there are probably baseline literacies, and I think there are folks that are developing great kind of programming around this. I'm seeing a lot of great professional learning out there around AI literacy. Of course, that needs to be continually adapted as things progress and things change.

A big part of it is educators playing with the tools and using them in their practice and building that comfort, right? And getting over the fear that often comes with trying things like this. And often the best way is to figure out, you know, what tools are most relevant [00:19:00] to their work, right? Not just for the sake of building literacy, but really thinking about what's practical and what's going to help them in their day-to-day tasks.

And, you know, that's going to naturally build their literacy. And thankfully there are tools that are accessible. And so finding out, you know, from colleagues, from their networks, which are the most accessible tools for them, to help them, you know, get over those fears and find easy ways to jump into it.

Ross Romano: Do you have thoughts on, let's say at a school level, because I do think it's really critical at the school and district level to create a culture of, you know, more of an embrace of all tools, not exclusively AI, but any tools that are going to innovate and make an impact on student learning, versus requiring every individual teacher to do it, and particularly teachers to do it inside a building where [00:20:00] they don't really think

that's supported, right? But what does it look like for those schools to appropriately embrace new technologies? And, you know, I mentioned earlier, right, some of this plagiarism detection stuff and things like that. And I had, you know, recently developed this perspective that there's a lot of thought that AI tools are the biggest cheating tools in education.

And I said, yes, they are the biggest cheating tools, because if your school is not teaching students how to use them, you're cheating your learners, right, out of their future, out of what's possible for them. Because the students who have access, who have more resources, more privilege, et cetera, are going to get access to them.

They're going to learn how to use them. They're going to develop those skills, whether or not they're using it directly for school assignments now, or they're just developing the skills for the future; they're [00:21:00] going to get it. And the students who are not going to access it elsewhere or not get, you know, appropriate information and guidance and training on how to use them really well and meaningfully, they're going to fall behind.

I mean, it's no secret how quickly things are changing in this space, and it's going to, you know, enhance and augment whatever it touches. So it can maybe close some persistent equity gaps, and it can widen them to the point where, okay, this thing that already was kind of a challenge, now in a few years it's really a big problem.

But of course, yes, there's issues around appropriate usage, what works, what doesn't work, data privacy, and, you know, all those kinds of things, right? So it's not like, just make it a free-for-all. But what do you look at, and how would you have that conversation with the school leader [00:22:00] who maybe hasn't been super immersed in it yet, to kind of talk them through, look, here's kind of, you know, where you need to be getting to?

Priten Shah: Yeah, yeah. And you know, these are the conversations that I think are happening across the country right now. And we like to talk about it in three buckets. And so there's just a policy and procedures angle here. And so, it is figuring out, like, what tools you're going to allow your students and teachers to use.

There's also what is actually enforceable in that world. So, like, these blanket bans on, like, okay, let me just block ChatGPT or any URL with AI in it, all these are, like, short-term band-aid measures that aren't really going to help anybody. I think we talked in our, like, very first episode about just, like, the equity gap that it creates very rapidly when students with, like, other devices, other network access are going to go use these tools anyways.

And so we try to encourage schools to at least, like, figure out which tools they're going to encourage teachers and students to use and focus on that, rather than focus so heavily on, like, what folks [00:23:00] can't use, and then navigate the rest of the questions based on that. So figure out what are the productive uses you want your teachers to explore.

Make sure you're providing them that PD so that they can, like, have structured time to learn these tools, kind of play around with them themselves, become comfortable, and then stay up to date. I think one of the other things we're seeing in a lot of the data coming out is that schools are doing, like, one-off beginning-of-the-school-year AI PDs, and then it kind of just, like, fizzles away for the rest of the year.

And that's leading to less uptake by teachers, because they kind of want to make sure that they can go check it and see, okay, like, I used it for this month to, like, make lesson plans, but these were all really bad because maybe I didn't prompt it perfectly or didn't know what to ask for, and then they just kind of walk away from it.

And so I think that continuous learning process right now, especially as folks build up their initial AI literacy, is really important. And then figuring out what ways you want your students to use it. Like, this can be both things, right? Like, how do your assessments need to change, you know, so that students can use this more productively? What are you focusing on as a school, what kinds of assignments are you encouraging teachers to give, but also, like, what kind of [00:24:00] assignments should you give so they can use AI productively, right? And so, there's the, like, let's AI-proof things, but also, let's, like, get students ready for an AI world aspect, and I think those are both important conversations to start having. And I think the schools that are embracing this the most are starting with, like, getting as many teachers on board as possible.

And I think that that is the right way to do this. I think getting teachers to become fluent with the tools themselves, become, like, well versed in what biases might exist and what limitations might exist, all means that they can have better conversations individually with their students as the students start using it.

But we have to start there. And this is, like, we already have so much catch-up to do, and this is why it's so hard to think about, okay, like, what happens if, like, in two years the technology is in X, Y, Z place? It's like, we still haven't caught up to the technology that existed, like, in November of 2022, let alone the technology that exists today.

Or will exist in November of 2024. So, like, thinking about November of 2025 right now just seems like a really daunting task, even to me, who thinks about this quite a bit. Because I think we just have to, like, do all of that fundamental work [00:25:00] still, because when we go around to schools, we're still talking to folks who are encouraging those plagiarism detectors as their main, like, offensive strategy against AI.

Like, that is their, the school's stance is: we are gonna, like, beef up all of our honesty policies, make sure they get penalized if they're caught using AI, and then here's all these, like, tools that are snake oil that, like, we've been sold on that will, like, do the detection. And so, it's just, like, working backwards from there means that, like, we spend a good amount of time just telling folks, let's at least get away from that policy, all right, the banning and the plagiarism detection.

So, unfortunate and not as exciting as the things we could be talking about. Like, I mean, right, like, portfolio-based assignments would be, like, I'd way rather be talking about stuff like that, encouraging them to use things like Inkwire. But, like, I'm trying to get folks to the place where they start thinking, oh my gosh, we can actually do cool things with this and not just be afraid of it.

Ross Romano: Yeah. Yeah. And I think, you know, part of what you kind of touched on is that student voice piece, the student perspective, that [00:26:00] schools need to be tapping into and hearing: what do students think about this? I would much rather have my teachers having an honest dialogue with their students about an assignment where the students are telling them, look, I can basically complete this by using this software, and I could plug it in and that's all it really takes.

So, you know, like, what's, and kind of dialoguing about, okay, well, what's the real point of this? And, you know, what are we trying to learn here? And how do we maybe do that differently in a way that either is helping us develop the skills to best leverage tools, right, prompting and things like that, that are going to be skills that can be useful, or changing the nature of the assignments to be learning something different? Versus saying, okay, let me assign it to you.

And then once you hand it in, let's see if you cheated. And we're not really having a conversation [00:27:00] about what it's all about, what's the purpose of our learning. And it's not, you know, it's not that much different from the types of student conferences, right, that would be effective in having students understand and have ownership of their learning in general.

Now it's just, you know, focused on some new areas, but the exact same thing of, okay, if I know what the learning goals are here, and I know you make a compelling case for why this is going to be useful to me, and all of that, now I understand why I want to learn this versus just get the assignment done.

But if that dialogue's not happening, then what is anybody really learning? I mean, maybe the school is, in some way, learning, you know, most of our students are able to complete these things, we think, without actually spending time on them. But at the same time, if we're not actually adapting to that, and it's just punitive, that's not very good.

So, [00:28:00] I'm interested to hear from both of you about what you've seen, what you've heard from and about students. You know, how are students responding? What are students looking for? How are they starting to develop a perspective on, okay, you know, I'm interested in doing these types of things in the future.

And so I know that I'm going to want to know how to use these tools, or I wish my school would. Or, now that we're using this, it's opening me up to some new ideas here. Because that's, I mean, that's who it's going to impact, right? That's who it's going to affect. Most of the decision makers that right now are in position to have that impact there,

you know, they're only going to be in those roles so much longer. And ultimately they're not going to feel it 30 years down the road [00:29:00] of who's the kid that really became super skilled and prepared for the economy of the future and who's the one that was totally left behind. But, you know, Aatash, maybe you want to jump in here first.

Yeah. What do students think, you know? What are you hearing from them? Or what are teachers telling you about how their students are responding?

Aatash Parikh: Yeah, so, first of all, there's definitely, you know, gaps; the equity gaps in terms of access are very much still present. You know, there are students that know and actively use the technology, and others that, you know, I'm still surprised when I hear students say, you know, I've never used ChatGPT or I've never used an AI tool. And it's still, yeah, that's still

a common thing for many students, too. But I've heard from students that do use the technology that they're using it quite a bit, [00:30:00] both in school and out of school. I don't, I haven't talked to too many students yet about some of the more social, like, companionship tools and some of the virtual avatars.

I hear those are popular. I haven't yet talked to students about that, but I've heard from students using it in school, and at the college level it's very common. I recently talked to a high school student who actually was afraid to use it because they hadn't had that dialogue that we were talking about around the plagiarism.

And so, you know, there was an assignment they very much could have used ChatGPT for, but chose not to out of fear of being caught or penalized for it. So I think there are students that are afraid in that way because of the lack of dialogue. One of the things that I'm excited to work on with some of the tools we're building around, you know, project design and helping teachers design these projects is, you know, how can we share those with students?

And we've gotten a chance to test [00:31:00] some of those things out with students, and it's seeming like building for students might be something that we want to look into more, just because of the kind of exciting potential for helping students actually design their own learning experiences. I know that's kind of a little bit of a different angle, but that's, you know, some of our more recent conversations with students have been actually using AI with them and seeing what they can create and seeing how excited they are about it, especially when it's in that creative and, you know, generative way.

In that type of conversation, it seems like maybe they're not getting as much of that with the AI. So, you know, as much as we can do more of that is what we're thinking about.

Ross Romano: Yeah. How about you, Priten?

Priten Shah: I think there's, like, a mixed bag of emotions from a lot of students. I think some of those fears exist where they're afraid of admitting they used it sometimes, even though they are using it a lot, because even though they think they might be using it in [00:32:00] ways that are not cheating, there's not clear guidance about what is and isn't cheating.

And so I think there's this fear about even talking about it productively. I think students are also afraid of just, like, what this means for their careers. And I think especially the older the student gets, there's a lot more open questions about, like, will the jobs that I'm currently aiming for, the skill sets I'm building, how relevant will they be in the economy that I graduate into? What do I need to be learning in school differently in order for me to be able to, like, succeed in that world? And that kind of leads to, right, like, it's creating this sense of apathy. And I think, like, we already have a massive absenteeism crisis, a massive engagement crisis, and I don't think any of this is really helping, where there's some students who are like, well, look, AI does all this anyways, and so I'm not even going to try to do these assignments because AI can do it, so I'm going to go home, figure out how to, like, use AI to do it, and then, like, so be it, because this is useless. And then there are folks who are just like, why is my school not changing, right? Like, why are we still being told to do the same things? And there might be good reasons [00:33:00] to still do the same things, right? I'm not, I don't think that the entire structure of our, like, you know, curriculum needs to be revamped

just because AI has now entered the picture. But I think justifying how we teach it, what we're teaching about it, and, like, how we assess it is probably something we want to spend a bit more time on, so the students can put it into perspective, right? Like, it's maybe easy for, like, a career English teacher to quickly articulate why writing, like, an essay is still important, but I think that ninth grade student who's writing their first, like, real paper and can go home and have AI do it, I think they're having a much harder time figuring out, why am I bothering with this when, like, AI is taking over the world and I want to learn things that will, like, prevent AI from taking my job?

I think that's kind of where we're hearing a lot of the frustrations. The older students who are, like, in the higher ed space, the graduate school space, I think there's a lot more excitement. I think in those places where, like, you have longer projects that you're working on, where the end product is often something you have ownership over, you're presenting it, you know, at a conference, all those kinds of things mean that a little bit of efficiency in, like, the most annoying parts of a project [00:34:00] actually goes a long way to allowing you to focus on things you really care about.

So, like, if you're building a cool app for a school project, not having to sit there debugging for 12 hours might mean that you can put in five or six new features before your final project showcase. And that's kind of fun for folks, right? Like, they get to kind of focus on the fun things, really start to, like, think more creatively and kind of imagine having a little bit of help in ways that aren't hindering their learning. But there's a whole spectrum here, and I think just, like, doing a little bit of catching up, depending on the context of, like, age and subject, to get students to see, like, what exactly, what role do these tools play in this exact stage in your life or in this exact class, I think is going to be important so that they don't feel that ambiguous ball of emotions.

Ross Romano: Yeah. Yeah. In so many ways, it's just yet another opportunity for schools to think about why do we teach what we teach and how we teach it? Is it still relevant? Is it still pertinent? Another opportunity to think about, okay, should we try some flipped learning models? Should we try having [00:35:00] students represent knowledge through multiple modalities, et cetera?

All these things that could say, okay, what it's really about is the acquisition and retention of the knowledge that they really need to have. And what are some ways that we can work on that, versus worrying about things that are really just fear-based, right? We're afraid that we're going to be overrun by these technologies, and ultimately they are the tools that kids are going to need to know how to use.

So as we're wrapping up here, it's May 2024, so we're getting to the end of the school year. So let's look ahead one year, May 2025. What's one thing, you know, one thing top of mind, that schools need to do by the end of next school year to make sure that they're advantaging, and at least not disadvantaging, their students?

What do schools need to have done, from either an [00:36:00] implementation perspective, from a PD perspective, whatever comes to mind for you, to say, look, this thing is moving fast, right? So have a plan over the next year to at least get this far and, you know, you'll kind of be on track versus falling behind.

Priten, I'll let you start with this one.

Priten Shah: Yeah, I think that they need to remove the AI bans and ban the AI detectors. Like, if that is all they do in the next year, will it be, like, 100 percent ideal? No, but will they be light years ahead of any other alternative? Absolutely. So, just doing that will force so many conversations and so much experimentation and so many requests from folks about how to navigate individual questions that I think, if you go into the school year knowing that you're going to lift your AI ban and you're going to ban AI detectors,

that's going to force all the relevant conversations, I think, that folks need to have. So I hope that's the first thing folks do when they're, you know, sitting poolside this summer and figuring out what's next for the next school year.

Ross Romano: Excellent. Aatash? [00:37:00]

Aatash Parikh: Yeah, I agree with that. And one thing I would add is they should do at least one professional development, you know, sequence that helps teachers, you know, use AI to make their life easier in some way. So, you know, forget about the students for that PD. Just focus on the teacher and how AI can help them with a specific task that they need to do in their jobs, and experience the power that this technology has.

And I think that itself can, you know, like we've talked about, spark the dialogue of what that means for school and for students.

Ross Romano: Excellent. Well, listeners, you can learn more about both of these companies, Pedagogy.Cloud and their Pedagog.ai product and Inkwire, by listening to this series. There'll be more episodes coming out throughout the course of the year where you'll hear both from our founders, as well as educators who are using their tools in classrooms and other [00:38:00] collaborators that are helping to continue to develop them. And we'll also put the links below to their websites and social media.

So check those out there and see how they might help you find some solutions to the things that are challenging you in your schools, or be able to help students achieve new heights. And please do, just in general, subscribe to the EdTech Startup Showcase here to hear those episodes and more from our other startup companies talking about the different ways that they are working to help you solve the challenges you have and support your students.

So please do that. Priten and Aatash, thanks for being here.

Aatash Parikh: Thank you, Ross.