25 Years of Ed Tech

Let's get behind the learning analytics and do better: “Because what we measure sends a message about what matters.” ~ @ammienoot
#truth

Show Notes

In this Between the Chapters episode for Chapter #21 (2014), Laura is joined by Anne-Marie Scott and Dragan Gašević to talk about learning analytics (LA). This conversation outlines a definition of LA in terms of higher education -- for practice, within ed tech products, for online learning/teaching, and for evidence-based research. There are so many interpretations of what LA is, and we hope this episode unpacks the myths and misconceptions about metrics and data within learning and measurement science. For LA, it is important to talk about use, ethics in data collection, and learner agency when examining learning, teaching, and the design of both. Get your notebooks out for this master class in all things learning analytics!
Questions to reflect on from this chapter:
  • What does learner success look like? What’s the measure of learner success? 
  • What are students here for? And what are our higher ed institutions for?
  • How can data be used as a form of feedback to help learners develop towards their own goals?
  • What are the core values you want learning analytics to promote? 
Connect to the episode guests:
How are you connecting your teaching, learning, and design to effective data measurement? Are you thinking differently about learning analytics after listening to this conversation? Let us know -- send us a message or tweet. Podcast episode art: X-Ray Specs by @visualthinkery is licensed under CC-BY-SA. Remix by Terry Elliot.

What is 25 Years of Ed Tech?

25 Years of Ed Tech is a serialized audio version of the book 25 Years of Ed Tech, written by Martin Weller of the Open University and published by AU Press. The audio version of the book is a collaborative project with a global community of volunteers contributing their voices to narrate a chapter of the book. Bonus episodes are a series of conversations called "Between the Chapters" to chat about these topics and more!

"In this lively and approachable volume based on his popular blog series, Martin Weller demonstrates a rich history of innovation and effective implementation of ed tech across higher education. From Bulletin Board Systems to blockchain, Weller follows the trajectory of education by focusing each chapter on a technology, theory, or concept that has influenced each year since 1994. Calling for both caution and enthusiasm, Weller advocates for a critical and research-based approach to new technologies, particularly in light of disinformation, the impact of social media on politics, and data surveillance trends. A concise and necessary retrospective, this book will be valuable to educators, ed tech practitioners, and higher education administrators, as well as students."

Credits:
Text in quotes from the book website published by Athabasca University Press CC-BY-NC-ND
BG music Abstract Corporate by Gribsound released under a CC-BY license. Track was edited for time.
Artwork X-Ray Specs by @visualthinkery is licenced under CC-BY-SA.
Audio book chapters produced by Clint Lalonde.
Between the Chapters bonus podcast episodes produced by Laura Pasquini.

0:03
Between the Chapters, a weekly podcast discussion focusing on a chapter of the book 25 Years of Ed Tech, written by Martin Weller. Here's your host, Laura Pasquini.

0:16
Well, welcome! Chapter 21: 2014 is the year, and we're diving into learning analytics. I'm thrilled to be joined by Anne-Marie Scott and Dragan Gašević. Welcome, my friends, to the chapter and the book club, Between the Chapters. Thank you.

0:31
Nice to be here, Laura. Yeah, thanks for having us here.

0:35
Well, I do my best to bring on smarter people than I to this podcast. And you two are in the world of learning analytics a lot, with your research, your work, your teaching, and your practice. So tell the listeners a little bit about the world of LA, or learning analytics. How did you fall into this space, and what brings you here to the table?

0:55
Yeah, I mean, I started doing something that we today call learning analytics about 16 years ago, something like 2005, when I started my postdoc at Simon Fraser University in Canada. And then around 2010, one day George Siemens asked me, "Dragan, why don't we run a conference on learning analytics? You're doing learning analytics, I'm doing learning analytics, why don't we run the conference?" And I said, oh, can we do it in two months? No, we needed a few more months. And that's how we started doing learning analytics as a field: we hosted the first learning analytics conference, LAK, in Banff in 2011, with George Siemens, who was still at Athabasca University, where Anne-Marie is now. Ever since, I've basically participated in different activities related to the Society for Learning Analytics Research. I also held the Chair in Learning Analytics at the University of Edinburgh, and now I'm with Monash University in Australia, where we've also created a new Centre for Learning Analytics at Monash. So I'm running a relatively young and nice, large group of people that is all focused on learning analytics.

2:14
And one of them is SoLAR, the first one. Do you have an acronym for your new centre? Because I love new acronyms.

2:19
Oh, yeah, it is. It's called the Centre for Learning Analytics at Monash, and that "at Monash" was added so that we have an acronym that is meaningful: it's called CoLAM. And an interesting thing about "kolam", when it's spelled with a K instead of a C, is that it's associated with a form of Sri Lankan art, and it's also very mathematical. So we always thought, oh, this is a really very meaningful acronym for our centre.

2:48
I love it. Talk nerdy to us, Dragan, anytime. Anne-Marie, how about yourself? What looped you into the LA world of learning analytics?

2:56
A very, very different route. I'm very much on the practitioner side of this, or yeah, practitioner even with a small p in the last little while. Around about 2012 -- I was at the University of Edinburgh before Athabasca -- Edinburgh jumped into some MOOCs with Coursera. That was our first toe in the water. And the end result of those MOOCs was a big lump of data, and people were interested in analyzing it. I'd been doing some reading around learning analytics -- the Course Signals stuff coming out of Purdue -- and some of that was in the kind of press I was reading, I suppose, educator publications of all sorts. So I had a kind of concept of this thing called learning analytics that was appearing on the horizon. And then, yeah, I was asked to participate with a group of people on what to do with this lump of data that came out of those first Coursera courses. That was the sort of gateway drug into going further down that rabbit hole. And I would say that rabbit hole got a lot bigger when Dragan joined us at Edinburgh. We did, I think you would call it, a kind of stress-test exercise: a partnership with Civitas Learning to see whether their tools could be useful to us, and that was the first time we tried to join some of those datasets together. That was revealing in terms of institutional capacity, but also how much insight was actually in the data. I also do work for an open source software foundation, Apereo, which isn't an acronym, but don't ask me what it means. And a project, actually another one from Australia, OnTask, which is analytics-driven feedback, I got very interested in from an implementation point of view, but also from the perspective of the software foundation I'm a part of; it's actually now one of the projects under our umbrella there. So it's quite a mixed bag of projects and interesting people that I've intersected with, and work that we've done to position learning analytics. But all of that has taken me to a place where, when we talk about learning analytics, what are we talking about? Because I think it decomposes into two different things. I mean, there are definitions of learning analytics, but there's learning analytics as a set of products that exist in the marketplace, and there's learning analytics as a research field, and the two are very different, and not always that well connected. And I think that's where a lot of the challenge and a lot of the discussion around this comes in. And there are a lot of definitions of learning analytics which, when you poke them, are about predictive analytics based on LMS data. And that's the very narrow conception that a lot of people have.

6:01
And the worst, I'm sorry, the worst definition? Yes, I'm glad you brought that up. This comes after we've already talked about massive open online courses, right? MOOCs. We've talked a little bit about personalized learning, what that means, and the LMS. So this does come later. There's a really simple diagram, and I don't know how Martin packs this topic into one chapter, because it's so vast. You've hinted at it, between the practitioners and researchers: the diagram of the learning analytics cycle -- learners, data, metrics, analysis, and intervention. I wish it was that simple, and you two know otherwise. How else do you tell folks what learning analytics is? How are you defining it, Dragan?
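
[Editor's aside: for readers who find it easier to see that cycle as code than as a diagram, here is a minimal, purely illustrative Python sketch of the learners, data, metrics, analysis, intervention loop Laura mentions. Every function name, field, and threshold below is hypothetical and not taken from the book or any real product.]

```python
# A toy sketch of the learning analytics cycle:
# learners -> data -> metrics -> analysis -> intervention.
# All names and thresholds are hypothetical, for illustration only.

from dataclasses import dataclass
from typing import List

@dataclass
class LearnerRecord:
    learner_id: str
    logins_last_week: int
    forum_posts: int
    quiz_average: float  # 0-100

def collect_data(learners: List[LearnerRecord]) -> List[LearnerRecord]:
    # In practice this would pull events from an LMS or other sources,
    # ideally with consent and clear purposes. Here it is a pass-through.
    return learners

def compute_metric(record: LearnerRecord) -> float:
    # A crude "engagement" proxy -- exactly the kind of incomplete model
    # of reality the guests warn about later in the conversation.
    return 0.5 * record.quiz_average + 2.0 * record.logins_last_week + 3.0 * record.forum_posts

def analyse(learners: List[LearnerRecord], threshold: float = 40.0):
    # Flag anyone whose metric falls below an arbitrary threshold.
    return [(l, compute_metric(l)) for l in learners if compute_metric(l) < threshold]

def intervene(flagged):
    for record, score in flagged:
        # The "intervention" step: in a thoughtful design this is a human
        # decision informed by context, not an automatic verdict.
        print(f"Check in with {record.learner_id} (metric={score:.1f})")

learners = [LearnerRecord("s1", 1, 0, 55.0), LearnerRecord("s2", 6, 4, 82.0)]
intervene(analyse(collect_data(learners)))
```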

6:43
Yeah, that's a really good question. Obviously, we have our very formal, very detailed definition that is adopted by SoLAR, which we created when we were preparing the call for papers for the first learning analytics conference. I think most of the contours of that definition were created by the late Erik Duval, inspired by web analytics. What it says is that learning analytics is the collection, collation, measurement, analysis, and reporting of data about learners and their contexts. And I think that last part is the thing we often forget: very often we are not thinking about the context in which learning occurs. And all of it is for the purposes of understanding and optimizing learning and the environments in which learning occurs. So that's the very formal definition. Informally speaking, we basically say: all these creepy clicks that we are collecting about you, that's what we are trying to analyze, while trying to be good citizens and offer some benefit for the learners and for the teaching staff involved in that process. There's also a really terrific resource: Yi-Shan Tsai, who used to be a research fellow at the University of Edinburgh and is now a lecturer with us at Monash University, created a terrific video introducing learning analytics. It's just a three-minute, cartoon-style video, which tells what kind of data we are collecting and how we are trying to close that loop: thinking about learning, what kind of data we are collecting for our stakeholders, how we are enrolling our stakeholders, and what principles we are following. That's really the part of my work related to learning analytics where I had a chance to enjoy collaborating with Anne-Marie, where we wrote, at the University of Edinburgh, the principles for the use of learning analytics. Those entailed some of the ethical practices and transparency, but also some really well-intended ways to promote student agency and ensure things are done in that way, involving the range of stakeholders that are relevant. And before I pass it on to Anne-Marie, there is one critical thing I really like to emphasize: we always emphasize that learning analytics is a very socio-technical discipline, where the role of people and of social relationships and structures is as important, if not more important, than the parts that are technical and coming from data science. We observe that if you want to have real-life impact, if you want to have impact on policy, if you want to have impact on students' lives, it's really essential to think about and engage with policy. From the early days George Siemens emphasized the socio-technical perspective of learning analytics, and there was a really beautiful paper that George Siemens and Ryan Baker wrote, presented at the second learning analytics conference in Vancouver, about the relationships, similarities, and differences between educational data mining and learning analytics; for many people it's still not clear what each of these two fields is emphasizing. And then, going forward, some thinkers in the field, such as Leah Macfadyen, also later emphasized the importance of policy and how learning analytics gets implemented in bigger, complex systems. And later on, Shane Dawson put even more emphasis on complexity leadership and the extent to which we actually need to grapple with notions of leadership if we want to make meaningful change with the introduction of learning analytics.

10:44
I think that policy piece is so important. You know, I'm not a computer scientist; I'm one of these useless arts graduates who just read books for a degree -- stories, in fact, not even books about real stuff. But the policy piece for me has become super important for two reasons, maybe more than two. Putting that ethical frame around what we're doing is important, and so is thinking about the opportunities: how do you create a permissive culture in which you can do this kind of work in an operational setting? So there's the research, and then there's what you might do in universities to use some of the outputs and insights of that research. How do you do that in a way that's still research-informed? And then how do you do that in a way that builds a bigger awareness and set of digital literacies for our students as digital citizens as well? Because what we do with learning analytics is being done out in the world on a much larger scale, to an extent that any of us as individuals or data subjects barely know. And for me, that's the difficulty in this policy space: how do you create a permissive environment that lets well-grounded, well-thought-out, ethical stuff through, but holds back some of the worst excesses of things that have been productized based on little to no knowledge of educational theory? Dragan, some of your PhD students at Edinburgh wrote a really excellent paper analyzing the claims made for self-regulated learning in student-facing dashboards across a range of learning analytics products. That's a good example of where you could argue that those are supporting student agency, but actually they were so poorly informed by learning theory that they really didn't; it was a kind of smokescreen across a series of products. And that's where I think the tricky bit in policy often is. And the comment you made about leadership there, again: how do you make those decisions? Who do you need to have around the table to make informed decisions when you're looking at really quite fine-grained distinctions like that? You need to have quite a depth of knowledge to make those decisions as well. I think it's a really complex area for policy.

13:26
It's something Martin said in the chapter: this is very complex. You both have said the words ethics and ethical, and we're dealing with humanity, right? And so if we're not alongside some of the developers and designers of these systems or platforms, I wonder, beyond the policy, when it goes into practice, what kind of choice do any of the, I'll just call them end users -- we say learners, or instructors and faculty -- have if they've never had input into how those are designed and created? And I love that you bring that up, Anne-Marie, specifically around: what are they based on? Is there evidence? And Dragan, are the researchers talking with some of the developers at these stages, and what does that mean in practice? So this is a very complicated topic. I like the recommendations for defining it, but what do you say to others who say, "I really want to get into learning analytics, because it's this data or this evidence or this prediction modeling"? What are some myths that you often have to debunk with researchers, or with practitioners or educators?

14:37
Yeah, I think, with respect to myths, we can discuss some of those as well, but there are really three critical dimensions. Many people come into learning analytics -- and maybe that's one of the myths, especially for those who are not necessarily coming into learning analytics but observing it from the outside -- thinking that learning analytics is about data science, or primarily about data analytics. We argued in an earlier piece that learning analytics is really not just about data science; it is equally about what we know about learning and teaching -- we said theory -- and also design. And that design has several dimensions. One dimension is obviously learning slash educational slash instructional design, depending on the tradition and the ideology that people may hold with respect to those design-related traditions. Then there are two other critical dimensions of design. The second is really how we design systems. I think what we are seeing inside of learning analytics is a big push into human-centred design, and people like Simon Buckingham Shum and Roberto Martinez-Maldonado have been really emphasizing and promoting that. There's also a big push into co-design and the ways stakeholders can be involved. We are seeing far more advanced types of things happening, at least in the research space, as well as in some of the projects happening in-house at different higher education institutions -- but sadly not so much inside the types of products available from different vendors. And we can talk about that dichotomy, almost, between what is happening inside of learning analytics as a field of research, and even the emerging, excellent field of practice within higher education, versus what is available in terms of learning analytics as a product in the ed tech space. So those are really critical things. And the third dimension of that design is really how we design studies. What is the evidence? What is the accuracy? That's also what Martin is talking about: correlation versus causation. If you don't have a study design that allows for causal claims, then you can crunch your data in whatever way you want, but you're not going to get genuine causal inferences, even if you're using the types of models that allow for making some level of causal inference. I think that's really essential to think about. So, in a nutshell: learning analytics is not just about data science; it is equally about what we know about learning, teaching, education, and design.
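
[Editor's aside: a toy illustration of the correlation-versus-causation point Dragan and Martin's chapter both make. The data and the "motivation" confounder below are simulated and invented for illustration; nothing here comes from a real study.]

```python
# Toy illustration: a confounder (here, prior motivation) can make
# "time in the LMS" correlate with grades even though, by construction,
# logging in has zero causal effect on the grade. All data is simulated.

import random

random.seed(0)
students = []
for _ in range(500):
    motivation = random.gauss(0, 1)                       # unobserved confounder
    lms_hours = 5 + 2 * motivation + random.gauss(0, 1)   # driven by motivation
    grade = 60 + 8 * motivation + random.gauss(0, 5)      # ignores lms_hours entirely
    students.append((lms_hours, grade))

def pearson(pairs):
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    sx = (sum((x - mx) ** 2 for x, _ in pairs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for _, y in pairs) / n) ** 0.5
    return cov / (sx * sy)

# The correlation is strong, yet extra LMS hours cause nothing here;
# only an appropriate causal design could tell the two stories apart.
print(f"correlation(lms_hours, grade) = {pearson(students):.2f}")
```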

17:31
Yeah, I've met vendors at conferences who explained learning styles to me and then tried to sell me a data analytics product off the back of it.

17:43
Oh, that just gets me. Anne-Marie is, like, hiding behind her hands -- a physical reaction. Yes.

17:50
I think I did actually just use the word bullshit once we opened up that conversation. But that, for me, is the thing: the richness of your description there, Dragan, is not what comes across when we look at how some of the earliest systems were productized. Course Signals came out of a university, spun out of a set of research, and got commercialized, but a lot of other stuff has jumped on a bandwagon on the back of, I would say, very early research. And every field, especially in its inception years, produces -- I don't want to say bad research, but failure is a thing. Some stuff happens and doesn't have the output that we want, or isn't replicable. And that's the point of research, right? That's all learning; that's not a bad thing. But I think there's been a kind of fast pipeline to market between research and product, which I worry about, because I think it is ultimately starting to tarnish the research field in a way that's not useful. I think there's a little bit of a pushback around concepts of learning analytics based on what people are trying to sell as product, not the serious scholarly research that's being done into learning and into how we can improve learning. I mean, Martin's chapter references some of the work that Bart Rienties did at the Open University. And that, for me, is interesting, because it's data that informs quality assurance. So it's not about intervening with students, it's not about making predictions about students; it's about looking at how you designed a thing: did it operate in the way you thought it would, according to your design, and what does it tell you? And I loved the myth-busting piece in the book -- the description that reinforced that actually the design was pretty flexible, but students are not alike. There's a lot more we can do with this data than just some traffic-light pictures.

20:10
Yeah, that's good. And maybe a few more points here: Anne-Marie reminded me of a few other related issues. Anne-Marie is really right about Course Signals -- it was a really early system. Often you come across the study about Course Signals that was published around 2012, and then the critique by Caulfield about basically mixing up causation and correlation. But there was, in my view, a far more insightful study published by the same or a similar group of authors in 2011, which looked at what actions Course Signals actually triggered and the quality of those actions by the course instructors. It demonstrated that they just increased the frequency of some of the feedback, which would say, oh, you need to work more or harder, rather than really implementing any good practices that come from education. That is the critical thing that Anne-Marie, I think, also emphasized: you can't just go with a product without thinking about what those educational practices are. That's one thing. The second is that learning analytics is never going to be a simple technological fix. Many people, including people coming from critical studies of educational technologies, are emphasizing that, and they're spot on. If you're just expecting that all you need to do is buy a learning analytics product and fix your system, you're not going to do anything; you're just going to tick a box, and you might even just do harm instead. You really need to think about a holistic approach. As others have emphasized before, learning analytics means that you also need to think about the organizational and other types of changes that analytics may entail as part of the process. There was a really nice article in Politico, I think in 2019, that described what was done at Georgia State University: yes, they engaged with analytics, predictive models, and so on, but they actually completely restructured their existing processes, organizational units, etc., to make sure that things happened. And then maybe my last point about some of the myths: often people say the more you study, the better performance you're getting. Actually, we often find in our results and analyses that those who study the most are not the best performers. I think that was one of the points Martin also made in his chapter: it's not that the more you study, the better you're going to be; it's the quality of the learning that you are exhibiting. And therefore we need to inform our understanding of that quality of learning with a relevant theoretical lens. My research builds a lot on different theoretical models of self-regulated learning, because they promote agency, they respect students' judgment, how they're making decisions, and consequently what poor or good choices they may make, based on which you can potentially also offer something that is a meaningful and pedagogically justified intervention.

23:16
This reminds me of -- and I always thought it was earlier, but it was 2011, 2012 -- I was an academic advisor at the time, so some of the learning analytics was spinning off into academic success prediction modeling. And it was on the path of vendors coming to say, we'd like to test this in the marketplace, not being informed the other way around. So your call-out to say BS on the vendor is because we didn't have a lot of partnerships until probably the mid-2010s. 2014, 2015 is when vendors started approaching -- Civitas Learning, I know, partnered with NACADA -- to talk about academic advising and how we can actually scale this and test it before we say, let's launch and prove it. Has there been more of that since then? Because I think, as we talked about policy and evidence and putting it into practice, there was never really any beta testing before saying, this is the thing we should use to measure the learners, or measure student success, I guess.

24:19
If I can come in: there are really many brilliant studies in learning analytics that address many of these things. Unfortunately, I don't see many of them coming from vendors. I mean, there were some attempts, and some of the organizations had some really good learning analytics people, like John Whitmer, but I think many of them have completely disengaged or moved on with their careers, if you wish, and their careers brought them to some other areas. I think there was some level of appetite, but to a certain extent that kind of complexity of research is required. You know, we are really inventing a new, on some level, measurement science. And measurement science requires a significant level of rigour in how you theorize learning, what the evidence for that theory of learning is, and then you have to go and test that. That is not going to happen overnight; I think it requires at least a decade, if not more, to actually develop some robust measures. There are smaller parts that can be done, and we are seeing some evidence of that. But what I would really like to see is more understanding on both sides that we have these imperfect measures, and these imperfect measures will always be imperfect. We just need to also measure the level of the error bounds, if you wish: what is the error in our measurement, and what is the level of incompleteness? That's one thing. The other critical thing is that people should always consider that data is just a representation of reality, just a small viewpoint on that reality. We know, even from research, that triangulation of different kinds of research methods is really important to give you a more complete picture. That's exactly what we are seeing in some learning analytics: these multi-channel types of data are really useful and can help us get a more complete picture. But they will always be incomplete, because they are not reality; they are just models of reality. And I think that's really one of the critical challenges we need to overcome: how we help the relevant decision makers, on the education side as well as the vendor side, understand the limitations of the models and embrace the extent to which there are useful things that can be done. So what I'm trying to say is: data can be very useful if you consider it in a thoughtful way and are sensible about its limitations. But if you don't want to engage with that type of conversation, then you're not going to have really useful use of data and analytics in your decision making, or in your regular learning and teaching processes.
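
[Editor's aside: Dragan's point about reporting error bounds rather than bare numbers can be made concrete with a tiny bootstrap sketch. The "active minutes" data below is invented; the only point is that any metric should travel with an uncertainty estimate.]

```python
# Minimal sketch: attach an error bound (a bootstrap confidence interval)
# to a simple metric instead of reporting a bare number. Data is made up.

import random

random.seed(1)
# Hypothetical weekly "active minutes" for 30 students in one course.
minutes = [random.gauss(120, 40) for _ in range(30)]

def mean(xs):
    return sum(xs) / len(xs)

def bootstrap_ci(xs, reps=2000, alpha=0.05):
    # Resample the observed data with replacement and take the spread
    # of the resampled means as a rough interval for the true mean.
    means = sorted(mean(random.choices(xs, k=len(xs))) for _ in range(reps))
    lo = means[int(reps * alpha / 2)]
    hi = means[int(reps * (1 - alpha / 2)) - 1]
    return lo, hi

lo, hi = bootstrap_ci(minutes)
print(f"mean active minutes = {mean(minutes):.0f} (95% CI roughly {lo:.0f}-{hi:.0f})")
# Reporting the interval, not just the mean, is one small way of being
# honest about how imperfect and incomplete the measure is.
```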

27:04
And that whole time and thoughtfulness, you know, doesn't fit within a commercial product development cycle, unless a company is prepared to really seriously invest in long-term, blue-skies research. And I think that's maybe where some of the tensions have been: a desire to get product into the marketplace and a return on investment, versus the complexity of what we're dealing with. Learning is messy, because humans are messy, and yeah, it's a time-consuming process. But your point, Dragan, about it being an incomplete picture reminds me: something else I feel a little uncomfortable with is the extent to which the desire to gather data to fuel more analysis has the potential to change behaviours as well. I'm struck -- there's a paper Lesley Gourlay wrote; you're so much better than me at remembering dates of papers, my brain is not that good, but I think it's something like the tyranny of student participation. It's a really good analysis of, and a bit of a push back on, the kind of active learning where students are constantly engaged in group work, because a lot of what learning is about is quiet time: it's about reading, it's about writing, it's about thinking, it's about reflection. These are solo activities, by and large, and these are activities which defy surveillance and defy measurement in many ways -- the time you just spend sitting quietly thinking about something. But I do see things like ebook platforms now touting as a feature that you can see how much your students have read. It tells you nothing about what they've learned; it just tells you which pages they clicked on. But I see that measurement being sold as a feature now as well, and starting to intrude into these slightly quieter spaces. And I worry about the ways in which that might change student behaviour, so that spending time deeply thinking about and reflecting on something becomes more of a performative exercise, or students don't spend as much time on that because they're over here doing the thing that's being measured -- because what we measure sends a message about what matters. So I don't know what to do; I just worry about it. I worry that ed tech products are driving something now in pursuit of more data, and it's not necessarily healthy, and it's not, again, really well informed by the very thoughtful, scholarly approach that you have outlined.

30:02
Yeah, like students become little task rabbits, and they just have to do the things to get the carrot, which is the point, which is the click, which is the whatever. And it's the idea that just because they're in there, moving around in the space, doesn't mean the click log matters, or that this quota of three post responses is the thing, because that's not what learning is. But now we're almost programming our learners to say, that's what you need to do, when not all learning can be measured. I like that idea you shared around quiet reflection time, like it defies surveillance -- that's just my jam, Anne-Marie. But I do wonder: are we letting these systems and these processes or practices drive us this way? Or are we just not even questioning it anymore?

30:50
Who's we?

30:52
Well, I guess the "we" -- that's a great question. Those who are teaching in an institution are who I'm thinking about: faculty and instructors, who don't have much say in the products they use. The students who sign up -- undergrad, grad, continuing ed, certificates -- who have to sign in, sign on, and sign away a lot of these rights, with all their bits and pieces going into the system, the machine. And I guess the other collective "we" is those of us who want to make the changes: is our voice strong enough? Are we doing the actions in the right way to subvert some of this? That's always been my thought. And how's that looking? The "we" is small, yeah.

31:38
We had one study published -- I mean, it got accepted two years ago, but was just finally published in Learning and Instruction -- which was led by our recently graduated PhD student, Lisa Lim, together with Shane Dawson. And Lisa basically showed in that study that OnTask personalized feedback helped students improve their performance, while the data demonstrated that students engaged online less after the intervention. Some reviewers questioned why this was the case. And best of all, the advice in the intervention was for students to engage less online and read the book, right? And the students followed the advice. To me, that's perfectly legitimate, and we should embrace it. There's nothing wrong with that; it's a perfectly legitimate, pedagogically justified decision. But that brings me to the second point, and that second point is that we actually need to mature and develop our data literacy skills -- not just we academics and teachers, but also decision makers. I mean, how can we expect people who are in roles similar to Anne-Marie's, and her counterparts in many different universities, to all have similar data literacy skills? Many of them don't. I remember speaking with one of the leading learning analytics organizations several years ago, and they were basically saying: yeah, there's huge interest, we can close many business deals, but we don't want to close just those business deals, because often we can really create damage in different organizations. And so instead, they were asking how we can create safe spaces for different stakeholder groups, from those who are making decisions to those who are at the lower-power end of the power structures, if you wish, so that they all have a comfortable, safe way to learn about some of these things. We somehow assume that everybody knows these things, but they don't. So how do we create these opportunities? I think we need to somehow embed them into our daily discussions, into our academic and professional development, in many different places. We can't just assume that a policymaker or a decision-making institution is making these decisions because they are evil or whatever; often it's because they are uninformed. And how do we make sure that we get the important people into that process?

34:13
I think you're spot on there. It's that culture of digital leadership. And I think the last year and COVID have kind of highlighted who's invested there and who hasn't, in ways that are a little bit scary at times. But it's a challenge that every institution needs to grapple with, because it's not just learning analytics that can be problematic in this space. And I'm not going to go off on my proctoring rant right now.

34:41
You're welcome to -- this is what the podcast is for. Okay.

34:44
We can do a whole episode on that if you've got a lot to say on that one. But yeah, I'm glad you mentioned OnTask there, Dragan, because I think -- so OnTask Learning would be the website -- it's a project of Abelardo Pardo, again a research project that became an open source system that is now deployed in a number of different institutions. It is a really excellent example for bucking that narrow idea of what analytics is. OnTask is data-driven feedback at scale for students. It allows you to write personalized feedback for students, for class sizes where you otherwise just would not have the time to write everybody a little personalized email with actionable advice on how to improve whatever it is they're doing. It's feedback. But what I love about it, when you talk about the algorithms in OnTask: what are the algorithms used to deliver the feedback? It's clear -- whatever the teacher codes into the system, and when I say code, it's very simple. It's very simple Boolean logic: if you got five on the test, you get this; if you watched the video, you get this; if you didn't do this, you get this. It's whatever is appropriate in that pedagogical context. And the feedback is also written by the teacher. So it's a perfect example of a learning analytics system that does something very tangible, very real, very grounded in learning theory -- I mean, good feedback has an impact, we know. And it allows you to do good targeted feedback at scale, and therefore to have a bigger impact at scale, but it robs no teacher of agency; it absolutely embeds teacher agency. And that's some of the pushback I hear on some of the other sorts of products that are out there: they obscure information from teachers -- you know, what's driving the red, amber, green? I don't understand it. In this case, teacher agency is absolutely inscribed into the way the system works. Now, the trouble with it is it doesn't scale easily, at least if you're thinking like a university administrator: I can't write some feedback that applies for every class, I can't write some rules that apply for every class. Why would you do that anyway? That's a kind of mad idea. Feedback should be relevant to the students and the course and the context. And right back to that point you made at the start, Dragan: we think about the data, but we forget about the context, and OnTask is a perfect example of those two things together.
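
[Editor's aside: Anne-Marie's description of "very simple Boolean logic" can be sketched in a few lines. This is not OnTask's actual code or API; it is a hypothetical illustration of teacher-written rules driving personalized feedback at scale.]

```python
# Hypothetical sketch in the spirit of OnTask: simple Boolean conditions
# over course data select which teacher-written message each student gets.
# Not OnTask's real implementation; names and rules are invented.

students = [
    {"name": "Ana",  "quiz_score": 5, "watched_video": False},
    {"name": "Ben",  "quiz_score": 2, "watched_video": True},
    {"name": "Caro", "quiz_score": 9, "watched_video": True},
]

# Each rule is (condition, feedback text), both authored by the teacher.
rules = [
    (lambda s: s["quiz_score"] >= 8,
     "Great work on the quiz -- try the extension problems next."),
    (lambda s: s["quiz_score"] < 4 and s["watched_video"],
     "You watched the video but the quiz was tough; come to the drop-in session."),
    (lambda s: not s["watched_video"],
     "Start with this week's video before attempting the quiz again."),
]

for student in students:
    # First matching rule wins; the teacher stays in control of both the
    # logic and the wording, which is the point about teacher agency.
    for condition, message in rules:
        if condition(student):
            print(f"To {student['name']}: {message}")
            break
```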

37:36
Absolutely, I agree with you, Anne-Marie. You know, I've always emphasized, and I even emphasized earlier today, that kind of learning design, the context in which certain instruction or certain learning happens, right? And that's so fundamental, to the point that if you think about some of these predictive models, there's the question of the extent to which you can really generalize any predictive model across many courses, because you simply have a difficulty there. More recent studies have shown that, yes, even if you have courses with a very similar overarching pedagogical philosophy, still the learners themselves, their agency, and what they know explain far more variance in your data than many of these external conditions. External conditions can help, and they need to be nicely tailored to match some of these things and to help the learners, but at the same time it's the learners who really matter. You've hopefully seen that paper which basically says that learning analytics is always about the learners. I mean, if it's not about learners, then what is it about?

38:50
This made me think of -- I'm going through some domains, and as people master things, if you give them a rubric, or you have a competency matrix, and you say this is the standard we think about for learning, are they ever going to surpass it? No, because it's complex, it's complicated, and, as you said earlier, Anne-Marie, it's messy. And why would we want our learners to just hit a certain point, or code to say you did these three things, check, check, check, like a rubric? This brings me to ungrading, which I won't get into that rant. But I really think about it like: you're just encouraging them to click the thing and do this, versus actually, what is learning? And how does learning transfer to that individual? It's going to be very individualized; it's not going to be the same, and it's going to be contextualized and relevant to that person. We won't be able to solve this in this one podcast episode or this chapter. I am grateful that Martin has brought up some of these ideas and issues. Are there things that you think might be missing that we can point to, or things that we should call out to say: this is even bigger, and we should dig into it further, beyond what Martin's written about?

40:02
I kind of want to pick up on -- it's an answer to that question, but it's also a follow-on from the conversation we just had. What does learner success look like, if this is all about the learner? What's the measure of learner success, even? That's not generalizable. And that's something -- you know, I now work in an open university, where there's credit and non-credit learning, you can do one course, you can do a whole program, you can do anything in between, you can stop, start, you can change your mind at any point all the way through as well. So what are students here for? That's a piece of data we don't have as institutions. So even our idea of what learner success looks like, what outcome we might be aiming for, is highly variable. How does an open university like mine define retention? Like, what is retention in this institution? It's somebody who's here for as long as they want to be here, and they're reasonably happy when they go, and it was the right time to go. So I think that bit about what a university is for, and all the different reasons why people might be at universities, is kind of missing. A lot of the discussions about learning analytics, in what I read anyway, still carry, in some ways, a kind of background sense that most people are at university for the same sort of thing: they're there to get some kind of qualification. And that's sort of true, but is it the same qualification? And is that generalizable across universities? So I think that what we're even trying to measure in terms of learner success is not that well defined, and learners are so absent from those measurements -- I think maybe because those measurements to date have been driven by external forces, or by money inside institutions. I mean, retention costs money, so of course institutions focus on it. But retention is also one of these measures that gets used in league tables, or gets used in funding mechanisms, so it drives a sense that everybody is there for the same thing, and I don't think we've picked that apart enough. And this is why the work Dragan mentioned on self-regulated learning -- how can we use data as a form of feedback to students, and how can they develop towards their goals? -- is, I think, where the value is. And it's kind of under-explored, because there are all these other pressures that swirl around the use of data in institutions and skew these measurements.

42:53
Absolutely. To me, that's really what I'm starting to speak about: thinking about what the core values are that we want learning analytics to promote, and then tailoring our analytics towards that purpose. Learning analytics on its own should not have a purpose; learning analytics should have the purpose of achieving certain individual goals, or supporting people pursuing those individual goals, and/or certain societal values. I mean, we live in times with lots of different debates where people are discussing what the purpose of education is. Is the purpose of education just to train people for the workplace, or is it to educate people to become better citizens, to become better versions of themselves, if you wish? I really enjoyed being in Scotland for a time, because it had a very clear purpose of education: basically education for the sake of education, for the sake of educating people, for the sake of creating a better society. Whereas I'm now in a country where there's a much bigger push, from at least the current government, for more workforce training. Whether we agree with these things will, I think, always be debated; I don't expect it to be clear cut. But I think we as educators, as the institutions we work in, school systems and so on, have certain virtues and values we want to promote, certain types of skills we want to promote. And it's inevitable -- Anne-Marie mentioned it as well -- digital skills are really essential for all of us, and learning analytics is perfectly suited to help us develop those digital skills. Like, you know, we are talking about fake news: well, lots of machine learning is out there that is also able to detect fake news and detect many of these things; how can we help people navigate some of that? For me, I'm most passionate about two types of things, self-regulated learning and collaborative learning, and how we can use some of these analytics. We always need to think: is it that always-on little camera behind us, or is learning analytics there where I have a certain particular designed task that is helping me to improve and do certain things? I think that's the more realistic type of thing we want to see, and it is also far less intrusive, and in a way safer. But then we need to really think about how we design these environments in such a way that everybody feels included, or as many as possible are included. I don't think we will reach, in the foreseeable future, a point where everybody is included, but let's say that's the ultimate utopia we want to be in. And the other thing is how we are creating products in that space, and how we are evolving these products. Because we also often react when one small, bad thing happens -- we are seeing it with vaccines in Europe, stopping deliveries of vaccines based on, I think, relatively low risk tolerance, without really seeing what is behind it -- and we need to avoid over-attributing these types of things, and instead think carefully that analytics is here to serve formative purposes rather than summative ones, and not to replace decision making on our behalf.

46:17
I want to go to the university that the two of you run next. As a student, as faculty, or as a staff researcher, I would just want to say thank you. These are some deeper questions that, you're right, we haven't addressed and that need to be addressed before we even get to data and learning analytics. So I'm going to write them all down and put them out to the community of listeners. I just want to thank you both for sharing what wasn't really in the chapter and expanding on it a little bit more. We really appreciate that, and love to Martin for this conversation. So thank you both for coming to talk with me on the episode today. I appreciate it.

46:57
Thank you. Thank you for having us.

46:59
Thanks so much, Laura. Really, always love to chat with you.

47:04
So much to think about. Thank you so much for this masterclass on learning analytics. I appreciate the conversation.

47:11
You've been listening to Between the Chapters with your host, Laura Pasquini. For more information, or to subscribe to Between the Chapters and 25 Years of Ed Tech, visit 25years.opened.ca