CRAFTED. | The Tech Podcast for Founders, Makers, and Innovators

Dr. Kristen DiCerbo is Khan Academy’s Chief Learning Officer, where she sets the company's product, teaching, and learning strategy — with AI at the forefront. 

Khanmigo is amazing. It’s the GenAI-powered product from Khan Academy that helps students learn and teachers teach — and could revolutionize education. 

Khan Academy is used by 12 million students a month! And they’ve been an early adopter of GenAI. 

Khanmigo coaches students in math and English, helps teachers prepare lesson plans, enables parents to get help as they help their children with their homework, and much more…
 
On this episode of CRAFTED., Kristen shares how they build with Generative AI  — and how you can learn from their experiences to build your own AI experiences. 

We dig into how Khan Academy:
  • Builds with GenAI’s unpredictability in mind
  • Helps GenAI get good at math (and also built a UX that masks the ways it is bad at it)
  • Gave Khanmigo an “inner monologue” that helps it slow down and better tutor students
  • Built a “prompt playground” where they can evaluate various prompts
  • Built a “prompt library” where they can keep track of prompts
  • Employ fine-tuning, red teaming, hackathons, and more…

Sal Khan’s new book is entitled “Brave New Words: How AI Will Revolutionize Education (and Why That's a Good Thing)” – On this episode we explore how and why!

***
  • (00:00) - Introduction
  • (02:04) - Education + technology: Kristen's early experiences
  • (05:10) - Khan Academy primer: 12M students a month!
  • (07:36) - Getting early access to GPT-4
  • (09:02) - GenAI: early experiments with tutoring
  • (11:40) - GenAI is bad at math! How Khan Academy grappled with this
  • (17:02) - Building an AI-powered writing tutor
  • (21:09) - How AI can free teachers from grading homework and help students learn more quickly
  • (22:40) - Prompt chaining: why you need to break up your prompts to get good results
  • (24:08) - Preventing AI-powered plagiarism
  • (27:50) - The role of the teacher in an AI world
  • (31:46) - Outro

***

CRAFTED. is brought to you in partnership with Docker, which helps developers build, share, run and verify applications anywhere – without environment configuration or management. More than 20 million developers worldwide use Docker's suite of development tools, services and automations to accelerate the delivery of secure applications. 

CRAFTED. is produced by Modern Product Minds, where CRAFTED. host Dan Blumberg and team can help you take a new product from zero to one... and beyond. We specialize in early stage product discovery, growth, and experimentation. Learn more at modernproductminds.com 

Subscribe to CRAFTED., follow the show, and sign up for the newsletter 👉 crafted.fm

What is CRAFTED. | The Tech Podcast for Founders, Makers, and Innovators?

CRAFTED. is a show about great products and the people who make them. Top technologists reveal how they build game-changing products — and how you can, too. Honored twice by The Webby Awards as a top tech podcast, CRAFTED. is hosted by Dan Blumberg, an entrepreneur, product leader, and former public radio host. Listen to CRAFTED. to find out what it really takes to build great products and companies.

[00:00:00] Kristen Dicerbo: Sal and I are like on the phone going, look at this. Look. Oh my God, look at this. Look at this.
[00:00:06] Dan Blumberg: That's Khan Academy's Chief Learning Officer sharing her reaction when she and founder Sal Khan got an early preview of GPT-4. It was obvious to them that generative AI could have a huge impact on student outcomes, and so they pivoted.
[00:00:20] Kristen Dicerbo: We basically looked at our roadmap for the next six months. And threw it away.
[00:00:26] Dan Blumberg: On this episode of CRAFTED., we go behind the scenes with Kristen DiCerbo on how Khan Academy built Khanmigo.
[00:00:32] Kristen Dicerbo: We said to the folks at OpenAI this is great at answering questions, but we are interested in tutoring, so we don't want it to give the answer to the questions. We want it to help the student get to that answer themselves.
[00:00:44] Dan Blumberg: Khanmigo helps teachers create lesson plans, gives students instant feedback on their essays, and coaches them on their math. We'll learn how Khan Academy built these tools, including why they gave their AI an inner monologue.
[00:00:57] Kristen Dicerbo: We had this idea of AI thoughts, so it's kind of its thoughts behind the scenes, what it's not saying.
[00:01:06] Dan Blumberg: Welcome to CRAFTED., a show about great products and the people who make them. I'm Dan Blumberg. I'm a product and growth leader, and on CRAFTED. I'm here to bring you stories of founders, makers, and innovators that reveal how they build game-changing products and how you can too.
[00:01:22] CRAFTED. is brought to you in partnership with Docker, which helps developers build, share, run and verify applications anywhere without environment configuration or management. More than 20 million developers worldwide use Docker's suite of development tools, services, and automations to accelerate the delivery of secure applications. Learn more at Docker.com.
[00:01:43] And CRAFTED. is produced by Modern Product Minds where my team and I can help you take a new product from zero to one and beyond. We specialize in early stage product discovery, growth, and experimentation. Learn more and sign up for the CRAFTED. newsletter at modernproductminds.com.
[00:02:04] Kristen Dicerbo: I lead our content team. So on Khan Academy, all those questions you answer, all of the articles. Now, the videos, Sal still makes most of those videos himself. He considers himself an honorary member of the content team. So I lead the content team, the product management team, the design team, and our community support team.
[00:02:22] But the idea of a chief learning officer in a company that is an education company is that I am setting the learning strategy. What are our points of view about how people learn and how teachers teach? And one of the things that I do is bring a lot of the research about what we know about those topics to the design of what we're offering.
[00:02:42] But I've always been interested in how people learn, and so was thinking about career paths through that, and ended up getting, uh, a PhD in educational psychology, which is the branch of psychology that studies how people learn, not the branch that does a lot of counseling and that work. And I worked in the schools as a school psychologist for a few years in Arizona, where I'm based, uh, but then started to think about how education technology could help
[00:03:07] really scale what we know about good teaching and how to help more kids learn more. And so I did a left turn into education technology almost 20 years ago now, and then I started doing that actually with Cisco Systems. They have a program called the Cisco Networking Academy, where they design curriculum and assessments that they give away to high schools and community colleges to teach computer networking skills.
[00:03:34] Then I went to Pearson, which is one of the big educational publishers, one of the biggest globally, and experimented a lot with how to bridge product groups and research groups, as well as learned a lot about managing people. And then, uh, about four years ago, I joined Khan Academy as their first ever chief learning officer.
[00:03:56] Dan Blumberg: Was there an aha moment, you know, years ago that opened your eyes to what technology plus education, you know, could do?
[00:04:02] Kristen Dicerbo: There's been a couple. One of the first aha moments was at Cisco when we were designing this simulation tool, it's called Packet Tracer, where it allows folks that are configuring networks to also be able to see the packets of data that move in between that network, which is not something you actually can visualize.
[00:04:21] But when you're learning how to configure networks to be able to visualize the packets moving through the network and how they do that is a real aha for kids who are learning that. And so it started to make me think, oh, there's something that technology can do that builds on what is even possible in the classroom.
[00:04:41] It does something that we couldn't think of before. It's not that it's, you know, doing something better or helping do something we already did before. It's doing something new. And so that was really an interesting aha moment for me. Um, the second one was: why are we giving them multiple choice tests to see if they can code when we can just ask them to code and set up the network, and use that as the assessment of whether they've gathered these skills or not?
[00:05:10] Dan Blumberg: I wanna spend the bulk of our time talking about, um, you know, how you've built, tested, and shipped Khanmigo and other AI features at Khan Academy. Before we get there, I'd love if you could start by orienting us on Khan Academy's mission and its user base. I imagine a lot of people, you know, listening to and watching this may be familiar with Khan Academy videos and Sal, but they may not appreciate the full system that you've built for students, teachers, school districts, et cetera.
[00:05:35] Kristen Dicerbo: The mission of Khan Academy is to provide a free, world-class education to anyone, anywhere. Which is a big mission, but we're excited to work towards it.
[00:05:44] Initially, Khan Academy very much worked with what we call independent learners. So that's the person who's trying to learn something new and is struggling with it. They can, you know, watch a video of Sal, figure that out, and go on their way. Quickly, Sal realized as he was building this that it's not really just the videos.
[00:06:03] If you're learning new skills, you need to practice those skills. And so he started building the practice platform. So one of the key things is it's not just videos. It's the videos with the practice that is really important to learning. And then we also found in the, you know, 2017, 2018 timeframe
[00:06:22] that hitting all of those independent learners is great, but those tend to be the most self-motivated learners. The learners who maybe don't need us the most, they're gonna find their answer however it is. Yeah. And if we're really gonna reach the kids that need us the most, we need to be in classrooms. And so that decision led us to what is now our districts offering.
[00:06:44] And in districts, uh, we are in about 500 communities in the US. So we offer to districts not only the content that's free to everyone, but also professional learning for teachers, automatic rostering so teachers don't have to manually enter all of their students' names, and better reporting. So reporting for the district about both use and learning progress, and also for teachers, and all of that.
[00:07:12] Overall, we have about 12 million students a month who do something on Khan Academy. We have about 1.2 million a month who meet a threshold of two hours a month, which is about 30 minutes a week. We measure that because our efficacy studies of "does this work?" show us that when you hit that two-hour threshold, you see greater than expected gains on assessment scores.
[00:07:36] Dan Blumberg: So Khan Academy and Sal Khan have been very out in front on generative AI, and Sal gave a TED Talk titled "How AI Could Save (Not Destroy) Education."
[00:07:50] Ted Talk: If you were to give personal one-to-one tutoring for students, then you can actually get a two standard deviation improvement. Just to put that in plain language, that could take your average student and turn them into an exceptional student. It can take your below average student and turn them into an above average student.
[00:08:08] And we think it's, this is just the very tip of the iceberg of where this, this can actually go.
[00:08:19] Dan Blumberg: And he has a book out on sort of the same theme right now. What was your initial reaction to generative AI and how did that evolve over time?
[00:08:26] Kristen Dicerbo: Sal and I got a sneak preview of what we now know as GPT-4 in September of 2022. So you'll remember that's before ChatGPT came out. So OpenAI set Sal and me up in a Slack channel
[00:08:43] where he and I could talk to the GPT-4 model. But when we got access, I actually happened to be in the Phoenix airport. And of course I can't say anything in the airport about what we're doing, but Sal and I are like on the phone going, look at this, look. Oh my God, look at this, look at this.
[00:09:00] And I was blown away.
[00:09:02] Dan Blumberg: I'd love to learn more about how you started experimenting with it, what possibilities you immediately saw, where it fell down. I'd love to understand the product development process as you started playing with it at such early stages.
[00:09:14] Kristen Dicerbo: Yes. So within the first day, we said to the folks at OpenAI, this is great at answering questions, but we are interested in tutoring and thinking about how it could act like a human tutor.
[00:09:24] So we don't want it to give the answer to the questions. We want it to help the student get to that answer themselves. And they gave us our first lesson in prompt engineering. They basically said, hey, if you just say "help me find it myself" and then ask your question, it'll act more like a tutor. And it did.
[00:09:42] I was like, oh my gosh. Okay. Now, Sal and I saw this under a strict NDA. We convinced OpenAI to let us have about 30 more people in the organization have access. We happened to have a hackathon scheduled about two weeks after we saw it. So at Khan Academy, we have a history of hackathons: once or twice a year, the whole organization, not just the engineers, takes a break from their everyday work and works on anything they want
[00:10:10] that improves and moves us towards the mission of Khan Academy. But we brought these 30 folks in and said, okay, for this hackathon we're gonna see what we could do with this technology. And so we did, you know, kind of the "let a thousand flowers bloom": lots of potential different ideas for students and teachers about what this could be and how it works.
[00:10:31] Giving feedback on writing, you know, all kinds of things. We ended up doing things like "chat with a literary figure" and "debate me" and all of these kinds of different things. So we did a whole bunch of pieces, and at the end of that, we basically looked at our roadmap for the next six months and threw it away, and said what we're going to do is launch when GPT-4 launches in March. They knew there was gonna be a March launch date for this.
[00:10:55] And they basically asked us if we would launch something on Khan Academy at that time. So we agreed that we would and made a huge pivot. OpenAI was actually a really good partner and paired us with one of their product managers, and we ended up doing daily standups. We did a December trip where Sal put about eight of us in his minivan.
[00:11:20] We drove from Mountain View up to the OpenAI offices and, uh, visited them and, you know, did some red teaming. And then we worked through towards March to launch the tutor and an assistant for teachers that had, at the time, three or four different things that could help teachers kind of do their jobs as well.
[00:11:40] Dan Blumberg: Generative AI is famously bad at some things, like things we take for granted that computers are amazing at, like math. I'd love to learn more about the techniques that you use to get math to work properly, which seems almost like a ridiculous thing to ask of a computer program. But I know it was an effort.
[00:11:55] Kristen Dicerbo: It is. And I wanna be clear that it is not solved. It can sometimes still be wrong, and it's important for kids to know that and for all of us to keep that in mind. But we have done a lot of work, and I could probably, you know, write a book about just how we've been working on math, and people keep finding new ways to do this.
[00:12:14] But the first thing that came out, something that we found was helpful and other people were finding too, was just telling it to think step by step. Like literally in the prompt, if you say "think through this step by step," it improves it a little bit. Then we said, okay, we had this idea of AI thoughts. So it's kind of its thoughts behind the scenes, what it's not saying.
[00:12:37] And so we have it work out the math problems. We have the problem, and we feed it the answer to the problem on Khan Academy. Um, but we have the AI work out all the possible solutions it can to that problem. And so it has those in the background, so when a student's working through steps, it can try to compare those to the steps that it's working through.
[00:12:58] Dan Blumberg: You gave it an inner monologue, is that true?
[00:13:00] Kristen Dicerbo: It's kind of how to think about it, yeah. It's got its own thoughts over here. But we also found that it still wasn't getting to the levels that we would've liked it to be. So we took the advice of a number of experts who said, you know what? These large language models aren't calculators.
[00:13:17] They're not meant to be. And we built in a workaround where, when it detects that math is being done, it actually sends that out to a Python calculator. Now, of course, that increases the amount of time that someone's waiting for the response. So we found that if we just put in a message that says "doing math"
[00:13:36] and a little Khanmigo is doing a little head bob, um, that people are okay with, you know, having a five, six second latency. Uh, and I was sitting next to a kid who was doing this, and I said, does that seem too long? And she was like, no, it's doing math. Like I take a while to do math sometimes.
[00:13:55] It's okay. Yeah. Um, so there's an interesting lesson there: it's not always the latency, it's just letting people know there's something happening.
[00:14:02] Dan Blumberg: yeah.
[00:14:03] Kristen Dicerbo: That I identify with, uh, which I think is good. So we've done that. Then the other piece is having a good measure of how good or bad it is at math.
[00:14:12] And so we have been assembling a test data set that we can then use, and are going to be launching shortly as, you know, open source, so anyone who wants to can use it as an evaluation data set to be able to test new models as they come out. Not just for math accuracy, because it isn't just the, you know, does it get two plus two
[00:14:34] right? It's also the tutoring piece and being able to evaluate student work, because one of the things we find is that the AI can have a positivity bias, and it'll say "great job" when it's not a great job, it's actually the wrong answer. So, um,
[00:14:48] Dan Blumberg: yeah,
[00:14:49] Kristen Dicerbo: we've included those kinds of evaluation scenarios in this dataset, because we do find the models are improving every time one comes out.
[00:14:57] And so if we can help folks focus on this as a goal to actually improve math tutoring, we think that would be good for everyone. Yeah.
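The offloading pattern Kristen describes, detecting that a message is arithmetic and routing it to a Python calculator instead of the language model, can be sketched very roughly like this. The detection regex, function names, and the "doing math" status string are illustrative assumptions, not Khan Academy's actual implementation:

```python
import ast
import operator
import re

# Safe arithmetic evaluator standing in for the "Python calculator"
# the math gets routed to. Walking the AST avoids eval() on raw input.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr: str):
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval"))

# Crude heuristic: the message is pure arithmetic if it contains only
# digits, whitespace, and arithmetic symbols.
MATH_PATTERN = re.compile(r"^[\d\s\.\+\-\*/\(\)]+$")

def handle_student_message(text: str) -> str:
    expr = text.strip()
    if MATH_PATTERN.match(expr):
        # In a real UI this is where the "doing math" indicator would
        # show while the result comes back, masking the extra latency.
        return f"doing math... {safe_eval(expr)}"
    return "route to the language model"
```

The design point from the transcript is less the calculator itself than the UX around it: a visible "doing math" state makes a five or six second round trip acceptable to students.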
[00:15:06] Dan Blumberg: The other thing, not quite a bias, that I think is built into the training of the models, that I always find interesting, is GPT uses passive voice a lot. Mm-Hmm. And it's actually on purpose.
[00:15:15] It's like, you know, "it might be better if you included X," 'cause it doesn't wanna insult you. Right. It kinda wants to be a friendly suggestion. But I always find, 'cause as a writer and editor, I'm always like, no passive voice, get rid of that, make it declarative. But it's funny, as a computer is speaking to you, it's almost softer and better to use that
milquetoast voice.
[00:15:36] Kristen Dicerbo: It does. It does. I, I agree. I've seen that same phenomenon happening.
[00:15:42] Dan Blumberg: I'd love to move from math to English. I know one of the tools that you've built recently is an essay evaluator, and I wonder if you could share your screen and we could go through it, and you could help us understand what's happening behind the scenes as Khanmigo coaches someone on an essay that they're trying to finish.
[00:15:58] Kristen Dicerbo: Yeah, that would be great. And I'm going to do a little bit of a baking show thing where I've got, like, part of an essay written that we'll use for this example. Okay, so this is "give me feedback on an academic essay." So we're going to go ahead and tell Khanmigo a little bit about an essay that we're writing, and then paste it in, and we'll get some feedback, and I'll walk through where that is.
[00:16:26] So we're going to do a 10th grade essay here. It's gonna be a persuasive, argumentative essay. And these are the instructions that my, uh, hypothetical teacher gave me, which is: write an argumentative essay about whether leisure time is better scheduled or unscheduled. Your argumentative essay must be based on this prompt and topic, and it must incorporate ideas and evidence found in the sources, et cetera, et cetera, et cetera.
[00:16:50] Dan Blumberg: Did, did you write this draft? Do you have a strong position here on leisure time?
[00:16:54] Kristen Dicerbo: I must admit this is the, uh, point of view of our product manager. Okay, so here is my draft. Now, we are working on a full essay coach where students will actually draft their essay in the interface, and we'll then be able to tell the teacher a lot more about how much the student actually wrote and where they did.
[00:17:15] But this is the feedback portion of the essay coach for now. So I'm gonna submit this for review. It's going to give me feedback on the introduction, evidence, structure, conclusion, and style. Now, here's what's happening under the hood. For each of those five elements, we have a separate prompt, because when we started writing one prompt to evaluate all of these areas, that gets to be too long.
[00:17:40] And if your prompts get to be too long, the model starts to ignore parts at random. Yep. So sometimes it'll ignore some, sometimes it'll ignore others. So we broke that apart. So we're doing prompt chaining. So first it's doing that introduction piece and saying, okay, here's all the things about the introduction.
[00:18:01] We have on our staff former English language arts teachers that say, hey, here's the things that we should look for in an introduction. So they, you know, write the prompt for, hey, how are we going to look at the introduction? What are the things we're gonna look for? So what happens is we send the essay
[00:18:16] with that prompt about the introduction, it gives the feedback on the introduction, and so then we have that available, and then it passes that basically over and says, okay, next prompt: evidence. Next prompt: structure, et cetera, as it's kind of going through there.
[00:18:35] It also, as part of that, passes through a moderation API, which flags any instances of hate, violence, or self-harm and tags them to the teacher or the parent. So that's a big piece of making sure that kids are safe and using this safely as well. All right, let's take a look at what it said about our introduction.
[00:18:55] Uh, your introduction clearly states your thesis and points you'll be discussing in your essay. You've done a good job of setting the context for your argument,
[00:19:02] Dan Blumberg: Not passive voice there, by the way. It's like, if it gives you good news, it says it straight up.
[00:19:07] Kristen Dicerbo: right, right there. Yes, exactly.
[00:19:10] To make your essay even stronger, consider engaging the reader more in your introduction. Perhaps you could start with a thought-provoking question or a compelling statement related to your topic. Now, I can chat with Khanmigo about this if I'd like, so I can ask it to give me an example. Now I can go in here and change something, and then I can ask Khanmigo if that helps.
[00:19:34] So it's now going to interact with me about the pieces in the essay and my changes to it: yes, your changes have made the introduction more engaging. The new opening sentence grabs the reader's attention and sets the stage for your argument. Good job. All right.
[00:19:49] Dan Blumberg: Can you zoom out for a bit on just how this helps the student, helps the teacher who, you know, has got 25 students or something, all of whom are writing an essay like this. Like, what's the big picture of how this really changes things?
[00:20:03] Kristen Dicerbo: Yeah. So one of the things we know about learning is that getting immediate feedback, especially when you're learning a new skill, is really important.
[00:20:11] Now think back to when you submitted essays when you were a student in school. You hand the essay in, the poor teacher has a stack this big of essays to work through, and two weeks later you get your essay back. You hardly remember even what you wrote, what you were thinking, what process you went through.
[00:20:32] Most of us looked at the grade and stuffed it in our backpacks, and that was it. Um, and so that is not a good learning experience. And so the big piece here is that before you even turn it in, you're thinking about drafting and thinking about how to improve what you're doing, with immediate feedback: hey, let's try this.
[00:20:52] Is this working? Here's another suggestion. And getting to that before the teacher even sees it, which from a learning perspective we think is really important. And one of the things that we've heard anecdotally is that teachers don't assign a lot of writing because they don't have the time to grade it.
[00:21:09] And it's hard to get better at something without actually practicing doing it. So this opens the door up for students to do more writing and to get more feedback on it, which we think is really important.
[00:21:20] Dan Blumberg: You've mentioned a few of them already, but what's another sort of tricky obstacle that you either have figured out or are in the process of figuring out?
[00:21:26] Kristen Dicerbo: Yeah, so we started from, you know, ground zero. So one of the things we had to do was figure out all this prompting. So there's all these prompts we're writing. Changing one sentence in a prompt, for example, can change the entire behavior. And because the models respond probabilistically, you can't just test it one time and say, oh, great, it does what we want.
[00:21:46] You need to test it like 10 times, 20 times to see what that range of responses looks like. So our engineers built what we call the prompt playground, which is basically behind the scenes, where we can test out prompts, send them to the model, get those, you know, 20 responses and be able to look through them, make a quick edit, see how that goes.
[00:22:08] And then once we get it to how we like it, then we can publish it out to the activity or the site. So there was just a challenge of, how do you work with all these prompts? So the other piece is you almost need a librarian, like a prompt librarian, to organize all the prompts and store them and keep track of things.
[00:22:29] So that was certainly another thing that we never would've thought, you know, 18 months ago that we were gonna be building: this kind of infrastructure and what that looks like.
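The core of the prompt playground idea, running the same prompt many times and looking at the spread of behaviors before publishing it, can be sketched like this. Here `call_model` is a stand-in that fakes a nondeterministic model; the behavior labels are invented for illustration:

```python
import random

def call_model(prompt: str, rng: random.Random) -> str:
    # Stand-in for a real LLM call: pretend the model sometimes tutors
    # and sometimes just hands over the answer.
    return rng.choice([
        "gives a hint",
        "asks a guiding question",
        "gives the answer away",
    ])

def try_prompt(prompt: str, n: int = 20, seed: int = 0) -> dict:
    """Run one prompt n times and tally the behaviors observed,
    so a prompt author can review the range before publishing."""
    rng = random.Random(seed)
    counts: dict = {}
    for _ in range(n):
        response = call_model(prompt, rng)
        counts[response] = counts.get(response, 0) + 1
    return counts
```

The point is the sampling loop: a single good response proves nothing about a probabilistic model, but a tally over 20 runs shows how often a prompt edit actually changes behavior.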
[00:22:38] Dan Blumberg: I love that. I love also that both of those analogies, the prompt playground and the prompt library, are school analogies, and so I can almost picture them.
[00:22:45] Kristen Dicerbo: Yes.
[00:22:48] Dan Blumberg: There's a lot of fear about AI. Even back to that TED Talk, it was "How AI Could Save (Not Destroy) Education." There's a fear that it will destroy education. So I'd love if you could just address that specifically.
[00:22:58] Kristen Dicerbo: Yeah, so there's, as you say, a lot of fear of things. One of the biggest fears, of course, came out pretty early, with all the thoughts about plagiarism and thinking it's just gonna do the whole assignment, and where those things sit.
[00:23:12] And so we did a lot of things to address that fear specifically. So I'm showing one of those pieces here. So for example, we have a chat history, so a student can always go back and review their old chat. But teachers and parents can see them too if they're linked to a student account. So that was one of the things where we said, you know, if the student's just coming in here and trying to get it to do their homework, if I had gone into the chat and said "tell me the answer to eight x plus three equals nine," that's visible to the teacher and parents, and the students know it's visible.
[00:23:48] We say all the time, this is being recorded and it's visible to your parents and your teacher. So, putting in some guardrails so that we're specifically addressing the fears of having this do people's work. And something we talk about a lot is the difference between educational applications of generative AI versus
[00:24:10] raw generative AI, you know, ChatGPT in the wild, and the differences for students between using those two kinds of things. And that's an important piece we try to get across to teachers and districts and parents. So that was kind of the first piece. And then we also did a lot of work, though, 'cause students, being clever as they are,
[00:24:32] would test all of those boundaries. They would go, for instance, to chat with a historical figure, and they would find, uh, Pythagoras was on here, and I don't know if he still is or not, but try to get Pythagoras to, you know, explain the Pythagorean theorem, and figure out where those were.
[00:24:50] So we had to put in a lot of constraints around not doing kids' homework for them. But we still have a ways to go. And one of the interesting things we've learned: we can also analyze the chat transcripts of students' conversations, and some of them are fantastic and amazing, and you think, oh, this is what we built this for.
[00:25:14] But there's a lot of them where kids are just saying, not even "I don't know," but "IDK, IDK," and they're just not putting forth the effort. Mm-Hmm. And there's kind of two explanations there. One is they're not really great at asking questions, and they don't really know how to reflect on what they know and don't know, and they could use some help doing that better.
[00:25:35] And the other is, well, they're just kind of taking the easy way out and need some motivation to put in effort. So now, after doing the, you know, data analysis and understanding this, we're thinking about: what are ways that we can make Khanmigo act to, one, help students develop skills of questioning and being able to reflect on their own learning?
[00:25:56] And two, how do we motivate them to actually do the cognitive effort of learning? Because learning is hard and, yeah, it takes effort and focus, and kids don't always wanna do that. And so it's one of those things that we've been trying to crack even before generative AI. But we think some of these tools may help us, you know,
[00:26:17] pull kids out and engage them a little bit more in some of these things. Like, it's more fun to talk to a literary character than it is to, you know, read an article about them. So we'll see if we can help crack some of those motivational issues.
[00:26:30] Dan Blumberg: What is the role of the teacher going forward?
[00:26:33] Kristen DiCerbo: Yeah, so there's a couple things that we know are really important for student outcomes.
[00:26:38] One is we know that if students feel like there's someone in the school who cares about them and their outcomes, those kids have higher graduation rates and higher post-secondary school attendance. That's an important role of the teacher. The teacher also knows a lot about the peers, the classroom, and how all of that interacts together, and that's really important.
[00:27:02] What we see some of this technology taking on is the role of the coach and the tutor right now. But certainly in the future you can imagine it being even more of a content expert. And so the question is, if some of that content expertise doesn't rely on the teacher, what are the roles of the teacher?
[00:27:24] And I think most teachers would say, you know, we need to continue to motivate students. There's not a picture of the world where kids are happy to sit at a computer all day with their headphones on, working on this tutor. Yeah. So I think we still need, you know, teachers who are in classrooms.
[00:27:43] That's a different skillset than being the one who's delivering all of the content. And so I think it's worth thinking about what that role of the teacher might look like.
[00:27:51] Dan Blumberg: Yeah. You're actually reminding me of work I've done. I've consulted for two different big banks in the realm of wealth management, and, you know, picking stocks has kind of been commoditized, right?
[00:28:03] Yeah. That's not really what you need an advisor for. You may not need a human advisor at all, but they've sort of become coaches. It's become much more of an emotional job than I think it ever was, like, you know, 5, 10, 25 years ago. And I think I'm hearing a similar potential direction here.
[00:28:17] Kristen DiCerbo: Yes. Like, if the stock market's going down, that does not mean you should sell everything you have. Yeah.
[00:28:21] Dan Blumberg: Right, right. Or it's talking about your goals and motivation; you talked about vision and what you wanna do. Um, I know you said you didn't wanna make predictions, but I'm gonna ask you a future question, and take it where you'd like.
Uh, but if you look into the sort of relatively near future, let's say, you know, 10, 20 years from now, what do you think childhood education would, or should, look like?
[00:28:44] Kristen DiCerbo: I will talk about what I would hope things look like, and we'll see if we get there or not. I think we will still have kids in schools, working together with teachers.
It would be, I think, more ideal if kids weren't forced to always be in a set cohort based on age, and if there was more flexibility in terms of, hey, this is really a passion of mine and I'm really good at it, I can move ahead a little bit further, rather than always having to be aligned with this fourth grade group that's doing this.
[00:29:20] Um, and there's a number of schools that are experimenting with things like that. Anyway, I also would hope that grading changes, so that grading isn't just "I get an A in math," but is more focused on the skills that you've acquired, and being able to say, yes, you've mastered this skill, you're still working on this skill.
[00:29:39] Um, and just be a lot more informative about what you know and what you can do. And I hope there's a mix of the kinds of independent practice that we know you need to be successful in retaining and mastering these skills, and then, you know, small groups or large groups working on projects where everyone has a role and a responsibility, and they come together to create something bigger than the individuals themselves.
[00:30:08] Uh, I think that's the combination of things that ideally would happen in schools. And so some of that's technology based, but not all of it. And we certainly need to be, I think, continuing to think about how we as humans interact with each other and building on those skills.
[00:30:24] Dan Blumberg: Yeah. Kristen, thank you so much.
[00:30:26] This is fascinating stuff.
[00:30:27] Kristen DiCerbo: Absolutely. My pleasure. Thanks for having me.
[00:30:31] Dan Blumberg: That's Kristen DiCerbo. I'm Dan Blumberg, and this is CRAFTED.
[00:30:36] CRAFTED. is brought to you in partnership with Docker, which helps developers build, share, run, and verify applications anywhere, without environment configuration or management. More than 20 million developers worldwide use Docker's suite of development tools, services, and automations to accelerate the delivery of secure applications. Learn more at Docker.com.
[00:31:00] Special thanks to Artium, where I launched CRAFTED. Artium is a next-generation software development consultancy that combines elite human craftsmanship and artificial intelligence. See how Artium can help you build your future at Artium.ai.
[00:31:16] And CRAFTED. is produced by Modern Product Minds, where my team and I can help you take a new product from zero to one and beyond. We specialize in early-stage product discovery, growth, and experimentation. Learn more and sign up for the CRAFTED. newsletter at modernproductminds.com.
[00:31:33] Please share CRAFTED. with a friend. But also, don't be Captain Obvious.
[00:31:37] Kristen DiCerbo: You are a Socratic tutor. I am a student. Don't give me the answer.