To kick off the first episode of our revamped podcast series, host John Kleeman goes beyond the score with Melissa Loble, Chief Academic Officer at Instructure, who shares the three pillars guiding Instructure’s AI strategy; opportunities for making AI an enabler of lifelong learning; and how AI is being used to empower people, not replace them.
Full Transcript
John Kleeman:
Hello and welcome to the Learnosity Podcast, Beyond the Score. I’m your host, John Kleeman, an executive at Learnosity and an assessment industry pioneer. The slightly relaunched 2025 premise of the podcast is that for a century or so, assessment has been slow to evolve, perhaps even stagnant. And now, with the availability of advanced technologies like AI, it is finally ready to make a quantum leap into the space age, giving us new opportunities to reshape how we think about assessment and how we use it. In this podcast series I’m talking to experts about emerging technologies, ideas and enduring responsibilities that will recast assessment now and into the future. And I’m really excited to have as our launch guest a really, really exciting person, Melissa Loble, who is Instructure’s Chief Academic Officer, and she’s also Vice Chair of the board of 1EdTech, which used to be IMS. Welcome, Melissa.
Melissa Loble:
Thank you so much for having me, and I’m excited to be part of this first relaunch episode.
John Kleeman:
Great, great. Can you tell us a little bit about yourself?
Melissa Loble:
Sure, I’d be happy to. So as you already mentioned, I’m the Chief Academic Officer at Instructure. For those of you curious about Instructure, we’re the makers of Canvas, and I know I’ll probably share a little bit more about us in a few minutes, but I’ve been at Instructure for 11 and a half years now, which is crazy to think about how time flies. And I’ve been in ed tech my entire career. So I’ve lived on the solution provider side, like Instructure, but I started in the K-12 classroom, teaching public high school in New York City, and I’ve also spent a number of years leading educational technology teams at various universities.
And probably the thing I’m most proud of, from a professional perspective, is that I’ve now been teaching online for 26 years. I’ve only ever taught online, which is an interesting experience to have had, but I’m really passionate about how we think about the work we’re doing in education to directly impact the learners we’re looking to help, ensuring they achieve their goals, their outcomes, and the futures that they want. So that’s a little bit about me professionally. Personally, I live in Utah in the United States. I came here for Instructure, although now we’re a global company, so I could really pretty much live anywhere. And I live here with my husband and two cats, and I’m an avid scuba diver. So there you go. A little bit about me.
John Kleeman:
Cool. It’s a long way to scuba dive from Utah.
Melissa Loble:
Long way.
John Kleeman:
So I imagine most people listening to this will know what Canvas is, but just in case there’s anybody who doesn’t, can you just tell us what Canvas is?
Melissa Loble:
Absolutely. So Canvas is our flagship product, and we launched it in 2010. It is a learning management system. For those of you not as familiar with educational technology outside of the assessment realm, learning management systems typically deliver content, create opportunities for communication in the classroom and outside of the classroom online, and give students opportunities to engage in homework, group activities, and discussions. It’s all about trying to amplify what happens in a traditional classroom, or in some cases replace parts of what happens, or all of what happens, in a traditional classroom in an online space. We serve both K-12 and higher ed. For those of you listening to this, depending on where you’re coming from, we serve customers, institutions, school districts, primary and secondary schools, you name it, across the education spectrum.
John Kleeman:
Thank you. So we met at the EDSAFE Industry Alliance event at Bett in London in January, where we were both talking about responsible AI in ed tech. So let’s start off talking about AI, because everybody is, and everybody wants to. Tell me a bit about what Instructure are doing with AI.
Melissa Loble:
Yeah, AI is the hot topic, that is for sure. So there are two fundamental pieces to our approach to AI at Instructure, and specifically with Canvas (and we also provide other products into the educational space). The first is a framework for how we think about AI as an organization, and this is where we have three fundamental principles: we believe it needs to be safe, equitable, and transparent. That means that as we choose to leverage AI tools, partner with AI tools, develop our own AI-based tools, or bring resources out to the education community, we want to make sure they are safe for learners, teachers, institutions and organizations. So that means abiding by everything from privacy laws, which is where the connection to EDSAFE comes in for us, to ensuring that, at the end of the day, they’re protecting learners from harm.
From an equitable perspective, we want to make sure that one of the great opportunities AI brings is realized: it can help us manage what’s a pretty significant digital divide in learning today. So as we think about AI tooling, we want to make sure it’s not privileging one part of society, and that it’s actually helping to bring learning more ubiquitously to everyone. And then finally, transparency. We believe that as a user of educational technologies that leverage AI, you need to be able to understand, simply and clearly, how AI has been used, everything from the models and the technology behind it to how learner data and learner information are being protected, because we all need to be making good decisions in an era of rapid change. So that’s half of our approach to AI, this philosophy. The other half I’ll be very quick about, but from a development perspective we are similarly taking a three-pronged approach.
The first prong is we’re building some tooling, leveraging either existing models or existing tools or even experimenting with some of our own. And that tooling is focused on instructor or teacher efficiency as well as learning efficacy. So that’s where we want to hone our tools. The second one is we want to make sure that third-party tools… We believe ecosystems are incredibly valuable. No single technology solution is going to solve all of the learning needs for any organization. So because of that, we want to make sure that we have the right ways to partner with the AI tools or tools using AI embedded in them, so that these ecosystems can be meaningful, robust, pass data appropriately, do all the things ecosystems need to do to thrive. And then the last one is we know there’s going to be development out in the community by educational organizations. Lots of experiments are happening, lots of research. So we want to make sure we are an open platform in which that development can happen.
For example, as you mentioned earlier, I’m part of 1EdTech. We’re big advocates of standards, and in particular LTI. And so we’ve thought very carefully about all of the LTI placements in our technology, the hooks where you can connect other tools in, to make sure they’re meaningful not only for today’s ed tech tools but for what AI tools might need to do. We also have an open API, and we’ve added some specific extensions to it that enable AI tools to get at data and content, with an instructor’s permission and all within a safe environment, so that the individual tools that teachers, students, institutions and organizations build can be more robust. So those two things run in parallel: how we develop, and our framework, or philosophy, around where AI fits in our organization.
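As a purely illustrative aside, here is a minimal sketch of what the open-API access described above can look like from an external tool’s side: reading course data from Canvas’s REST API with an instructor-authorized token. The domain, token, and course ID are placeholders, and the AI-specific API extensions Melissa mentions are not shown here.

```python
# Minimal sketch: an external tool reading course data through Canvas's open REST API.
# The domain, access token, and course ID are placeholders; in practice the token would
# be granted with the instructor's permission, as described above.
import requests

CANVAS_DOMAIN = "https://your-institution.instructure.com"  # placeholder
ACCESS_TOKEN = "YOUR_API_TOKEN"                             # placeholder
COURSE_ID = 12345                                           # placeholder

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# List the assignments in a course, which an AI-assisted tool might use as
# context for generating feedback or practice items.
response = requests.get(
    f"{CANVAS_DOMAIN}/api/v1/courses/{COURSE_ID}/assignments",
    headers=headers,
    params={"per_page": 50},
)
response.raise_for_status()

for assignment in response.json():
    print(assignment["id"], assignment["name"])
```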
John Kleeman:
So that makes sense. And just coming at it from the Learnosity angle, we’ve always been a company that believes that education is a human right, and we really have a long history… Gavin and Mark can be very proud of the company they built, which has driven accessibility into the space. So a lot of organizations delivering assessments use Learnosity and get very high-quality accessibility. And we’re now moving on into AI and providing AI tools that I hope one day Instructure may use. But we’re very much about aiding people, not replacing people. Our key principle and responsibility is that we’re aiding people; the human makes the decision. So we’ve got a variety of different ways of using AI: to create questions, to score essays, short answers and maths, to review items for bias, and so on. But all of them have this key human in the loop. So I think equitable is key for us too. And the main way we’re producing safety, I think, is by making sure that a human is really, genuinely in the loop, if that makes sense.
Melissa Loble:
It absolutely does. We’re thinking the same way, and I love how you’re underscoring that importance, particularly in the assessment space. When we hear people talk about the opportunities for AI, it’s content creation, auto-grading, all of these different aspects, but in order to ensure you have valid content, you need humans in there testing and researching it. In order to have meaningful grading activities, you need humans interacting after perhaps a starter grade is presented, to ensure that it’s grounded in the full context of learning and not just, very simply, what’s being seen as a submission or as a question. So I love your focus. I admire that, and I hope all of the technology companies that are getting into AI do what you’re talking about and really think about the combination of the human and the technology, and not the replacement of one by the other.
John Kleeman:
I agree. And for those technology companies that might be listening to this podcast, the EDSAFE Industry Alliance is worth checking out, because the EDSAFE framework is a pretty simple framework that describes how you should do this, and it really is important that we use AI for the learner’s benefit. Talking of which, you said earlier you work in higher education and K-12, but do you also work with people in the wider sense, such as parents? How does AI change or impact that?
Melissa Loble:
Yeah, we definitely do. So one of our big focuses that we’ve been marching towards over the last couple of years is the full spectrum of the lifelong learner. And so it’s trying to make sure we’re thinking about somebody very early on in their learning journey. They’re four, they’re five, they’re three years old. How are we thinking about how they learn and how they progress? And where can AI help all the way through to that adult that is looking to develop continued skills, to build opportunities in their workplace? Or how are we supporting educational organizations that are partnering with industry to better deliver more meaningful workforce-aligned activities? So in that full spectrum, of course we interact with students and teachers, that would make sense, but we also see parents lean in, in really valuable places. And so for example, one of the things that we did from an AI tooling perspective is, embedded in our technology, we have a way to have interactions with parents. It’s called an observer role.
That doesn’t really mean anything other than that a teacher can have a conversation with a parent within our platform. The cool thing about that is we uncovered that AI, and I think we’re all seeing this, is really good at translation. So could we enable some of those exchanges to happen in languages the two sides don’t equally speak, so that they can still communicate with each other? Think about a parent who only speaks Spanish: how is that parent interacting with a teacher who doesn’t speak Spanish, directly in the context of their student’s learning? So those are some of the ways we’re interacting with these broader audiences and thinking about how AI may influence the way technology operates in the learning context, to support all of those people rallying around the student and the teacher at the center.
John Kleeman:
I think that’s so valuable, and I think so often we tend to assume that the whole world speaks English, when it doesn’t. So that’s certainly been a bit of a focus for us too. In the higher-stakes assessment space, we’ve brought out this instant translate capability, which allows people to instantly translate questions into different languages. And we do translation as well. In terms of how people learn, do you think we’ve got a better understanding of that? And is one of the reasons Canvas is successful because it helps people learn better, or is it more just about organizing?
Melissa Loble:
Yeah, it’s such a good question. I have two perspectives on that. One, I do think we understand more about how people learn today than we did 10 or 15 years ago. Some of that is the natural evolution of science, its understanding of and interest in how the brain works, and some of it is societal change. I think over the last even five years, particularly as an output of the pandemic or during the pandemic, mental health became a real, huge focus, and so did how mental health impacts learning. And we saw a lot more research and science around the true impacts of stressful situations or lonely situations on learning for an individual learner. So I do think we’ve learned more about how people learn and how the brain functions, and we can fold that in. And I know you all have done some really great research and have some really great understanding around this.
I think the second piece to this is evidence and efficacy. As we adopt technologies (and honestly as we adopt any practices, but with technology it’s a little easier to do this), we need to study whether that technology is actually doing what it says it’s going to do. Is it actually improving outcomes? Is it actually giving a student an opportunity to practice something in a meaningful way to lock it into long-term memory, for example? So I think we need to be looking at both pieces. It’s not as simple as saying Canvas helps people learn better; there are so many variables in that, in how something’s being used. It’s more about whether we can assess our programs, our practices, our approaches to teaching and learning to make sure they are directly aligned with the impact that we want to see, while we are also very closely watching how we learn, how we are changing and evolving as humans, and how we fold that in.
John Kleeman:
Can I just pick up one of the things you said about people being lonely and how they learn? Is there any science you’ve seen on that about how people can learn better when they’re isolated or whatever?
Melissa Loble:
Yeah, so it’s really interesting. I just read a couple of research articles, and I’ll make sure to share them, about mental health in general. Loneliness was one of the dimensions they studied; it wasn’t the only one, but it was one of them. And they found that in those situations, loneliness, depression, high stress, those kinds of variables, our brains can’t function at the highest level. It actually takes more work to take a task or a learning activity and have it translate into long-term memory. It takes more repetitions, in other words.
It takes applying it in more different contexts to actually build that, because the brain is busy managing the loneliness, managing the stress. And so it’s not firing on all cylinders, if you want to think about it that way, or it reduces our neural plasticity. So I found that really interesting, because we’re in a world of further isolation in some ways, this world of remote work, which is so great and we embrace it, but there are also some negative consequences that have come from the shift in where we exist, how we exist and where we learn.
John Kleeman:
That’s really interesting. There’s a bit of an analogy there with the fight-or-flight response we probably had as prehistoric humans: if you want to have a good conversation with people, you need to make them feel safe so that they’re prepared to take things in. I haven’t personally done research in this area, but one area where I’ve read quite a lot of research is this whole idea of assessment as retrieval practice. Basically, if you answer a question on something, retrieving that answer strengthens the pathway through your brain, making it more likely that you’ll be able to retrieve it again in the future. And so answering questions actually works better than rote learning or just reading material. Actually answering a question, where I ask you x, y, or z and you have to go into your memory, find it and bring it back, makes it more likely you’ll be able to retrieve it later. I was working in the testing field for 15 years before I heard about that, and it was completely eye-opening, and it’s really interesting.
Melissa Loble:
Yeah, I agree. And I think we’re more aware of it in the last couple of years than we had been previously, because sometimes it would get disguised as rote practice, and it’s not that. It’s actually something very different, and an exciting way to apply, like you said, how we function and how we’re wired to how we can best continue to develop ourselves through practice.
John Kleeman:
And I think also, if you want to retain something for the long term, you need to do something different than just cramming for an exam; that’s the other aspect. Let’s move on to assessments, and maybe the central issue of this podcast, which is that assessments are often the same as they were 50 years ago. So we’re using all the fancy technology that Learnosity, Questionmark, Instructure and Canvas are providing, but you might be asking the same question that our parents would’ve answered in an exam. Do you see that changing with AI and new technology, or are we still going to be doing multiple choice and essays and little short answer questions forever?
Melissa Loble:
I hope we are. I’ll start there. I think we’re definitely seeing some changes with technology, and AI has produced some of those changes. It’s also simply caused the conversation to happen. AI has been around for a long time, and lots of tools have leveraged it for a very long time, but when it really came into the mainstream in education a couple of years ago, the first question everybody had was about cheating: suddenly students are going to be able to look up the answer. Well, that cheating question has now turned into: are bots just going to be both taking the tests and grading the tests, and is there going to be any learning that happens whatsoever? And I think that is because there are fundamental questions around the construction of assessment and the role that assessment plays in a learning process.
On the construction side, I’m a big advocate for the idea that if you create or set an assessment, whether it’s an open-ended assessment, a project, or more of a quiz, test or exam kind of assessment with questions, you can craft it in such a way that it would be nearly impossible, at least today, for AI to answer and answer well. So I think how we craft our assessments, and the kinds of assessments we choose, will help us navigate this much-needed change in assessment that hopefully is coming. I also think that, as we look at how assessments are being used, we often think, certainly in my generation, of assessment as a summative activity: how do you assess who you are and where you are in order to progress to something next? Well, that’s only one aspect of assessment. You already talked about this, John, and I think it’s really, really valuable: assessment is also, in a way, part of the learning practice itself. It’s not just an assessment of what was learned, it is a way to learn.
And that’s that practice, the opportunity to self-assess. And I think that’s where AI actually has some really great promise. If I want to test how well I’ve learned something, can I ask a tool to give me questions on these topics, based on the content I was reviewing, so I can practice as part of my actual learning activity? How do we embed those in the flow of building skills and content and knowledge? So I think there are two lenses to it. We need to think differently about what we produce and what assessments look like, but we also need to remember that assessments aren’t just summative. They are formative, they are practice, they are opportunities to actually understand how we are teaching a particular concept and improve our teaching. There’s so much more to it than that, and that’s where there’s a whole lot of promise, I think, for AI.
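As another purely illustrative aside, the sketch below shows the kind of self-assessment tooling described above: asking a large language model to turn study content into a few practice questions. The client library, model name, and prompt are assumptions for illustration, not a description of any Instructure or Learnosity feature, and any generated items would still need human review.

```python
# Illustrative sketch only: generating formative practice questions from study content
# with a large language model. The model name and prompt are assumptions; generated
# items would need human review before use, as discussed above.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY is set in the environment

study_content = (
    "Retrieval practice: actively recalling information strengthens the memory "
    "pathway, making it easier to retrieve later than simply rereading material."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "You write short formative practice questions for learners."},
        {"role": "user",
         "content": "Based on the following content, write three practice questions "
                    "a learner could use to self-assess:\n" + study_content},
    ],
)

print(response.choices[0].message.content)  # questions to be reviewed by a human
```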
John Kleeman:
I fully agree with all you’ve just said, particularly on the formative assessment; formative assessment can route people through different learning. In terms of the future of assessment, I’m wondering whether we’re going to see a lot more practical tasks. Because the world is going to change and you can get AI to do things, what people powered by AI can do is more interesting. So I think we’re going to see assessments where maybe you have a video; for example, if you’re trying to assess a salesperson, you get them to do a presentation and the AI will grade that with a human review. And I’m sure there are educational analogies as well, where you do a project or teamwork, which we’ve always been very bad at assessing.
And I think that’s partly because people can use AI to take these simple questions, partly because the kinds of skills we want are different, and also because the face validity of just doing an objective test is diminishing, because people want things to be more valid. So I think we’re going to see a lot of unstructured, subjective types of questions and answers, people doing something that is just too expensive for a teacher or a human grader to grade, but with AI grading it under human review. Does that make sense?
Melissa Loble:
Oh, yes. I could not agree more, and I’m excited for that vision. I think there are some models of more modern pedagogy, and not that older pedagogy is a bad thing. It’s not. It’s just some of the fundamentals. For all the nerds out there, I’m still a constructivist teacher to this day, and that’s how I teach. But experiential learning, apprenticeship-style learning, problem-based, task-based, producing and replicating, how am I going to handle a situation? Those kinds of learning experiences. I start to nerd out on simulations, even. All of that is so expensive to both build and grade. It’s really hard with just human labor, but all of those are really great opportunities for AI, with the human in the loop like we talked about earlier, to enable new, different, more applied types of learning that seem to be becoming more and more interesting to teachers and educators, and seem to be more applicable in the world of work today.
I’m always fascinated by the skills gaps that get reported: industry says it needs students to have these skills, but the incoming employees don’t have them, and they center around things like professionalism and technology skills. That one always amazes me, but it is true. You see reports that talk about critical thinking skills. Those aren’t things you can really demonstrate by answering a multiple-choice question. Those are things you need people to show, present, do, apply. And if we can film people doing that work, or capture the methodology behind that work, and then use AI to assess it, that’s super powerful, with the teacher then coming in to wrap around that and being able to truly see the skill adoption and sophistication that I think industry in many places is looking for.
John Kleeman:
I think that could well be the future. So look, I shouldn’t let you go from this podcast without talking a bit about 1EdTech, because you were the chair of… So for those of you in the audience who might not know 1EdTech, they used to be called IMS, and they’ve been the foundation of a lot of the infrastructure within education. They came up with QTI, which I was involved with a long time ago. They came up with LTI, and they came up with a lot of other very useful things. So what are the key things that IMS, now 1EdTech, is doing at the moment, and why is it interesting to the rest of the community?
Melissa Loble:
I think there are a couple of things that IMS, or 1EdTech, is doing now. One of those is being very thoughtful about the evaluation of technology. They have a whole program, their Trusted Apps Marketplace, that does app vetting work, provides rubric-based instruction and guidance, and surfaces technologies that have good evidence of producing the desired outcomes. The reason I say that’s really important work is that it feels a little different from a traditional standard like QTI or LTI, which are so important as well. What it does, in a world of thousands of technology tools, is help educators, teachers, institutions and organizations make better decisions about the technologies they’re adopting. Because the other thing that makes me nervous about AI is how fast people are producing new tools and promising that they’re going to deliver everything your heart desires, without any sort of assessment or understanding behind that.
So I think that’s a really important body of work that 1EdTech continues to do, and you’ll see more and more around this Trusted Apps Marketplace and the ability to get underneath how to pick the right tools. The second area I would point to is how we think about how things interconnect. That applies to AI, but it also applies to any other new field of technology that will surface as we build our learning ecosystems. And how do they connect, not just from a plumbing or an authentication perspective, but from an outcomes perspective? How are we thinking about everything from how I am learning this to how I am demonstrating it? How do I have standards around the credentials I’m offering out into the world? How do I understand how those standards map to workforce needs and skill matrices? So I think this full ecosystem piece, and how it all weaves together, is also a really important body of work for 1EdTech.
John Kleeman:
That’s really interesting. I’d love to learn more about that. That just sounds really, really valuable. I also can’t let you go without sharing this quote that one of my colleagues found by listening to one of your YouTube interviews. “I’m a little bored by generative AI,” you said. So most of us are very excited by generative AI. So tell me if that was taken out of context or what your thoughts are there.
Melissa Loble:
No, that quote is hilarious, and I did mean it, in the sense that I get bored by technologies being positioned as the savior or the end result. I want to hear about new practice. I want to hear how we are sharing, again, how we teach and learn and how we reflect on ourselves. Where are we incorporating, and I’ll throw this one out there, where are we incorporating in our curriculum ways to make learners, at very early ages, aware of themselves as learners? I was just chatting with two colleagues and asking them, because I always ask people, what their favorite learning moment in their life was. And a lot of times I get learning moments from college, where they’ll say, “I failed this thing in college, and it finally made me aware that I didn’t know how to study,” or, “I had this instructor who explained chemistry in a way I’d never understood it before, and it put me on the path to being a doctor.” But it’s college.
Why don’t we hear people say, “In second grade, I remember that I became very aware of myself as how I learn, because I watched the kid next to me try to read a book and he did it differently than I did. And my teacher pointed that out and said, ‘You’re both right,’ and showed that to the class.” Where do we have that moment? We don’t. So we don’t learn about ourselves as learners until later in life. So rather than go, “AI is the new changemaker in education,” I’d like to say let’s use AI to help us ask the questions of what are we missing? How can we be better in how we’re thinking about education? How can we build people contributing to their societies in more meaningful ways? Where does AI help us do that? But how do we have those conversations, and how do we do that earlier and earlier on in our lives?
John Kleeman:
I love that metacognition, that you understand that about yourself. So I’m not sure if you did just share it, but what is your biggest learning moment in life?
Melissa Loble:
So I would say my biggest learning moment, and mine didn’t come young either, which is one of those crazy things. I was a junior in high school in the US, and I was in a physics class, and I just couldn’t get it. It just didn’t make sense. I had an incredible teacher who spent time with me and explained it in all of these different ways, but every way he explained things to me, he’d either write things on the whiteboard, or it was formulaic, or he’d have me read something, or he’d give me anecdotes like, “Well, this is how it applies to the real world.” But it was all very written, very reading-intensive, or very “follow my oral explanation.” And then finally he was like, “Okay, let me try this.” And he brought a ball into class, and he’d made this track. We always see these domino tracks on YouTube these days and stuff like that. It was kind of like that.
And I remember this: he started to run it, and as he was doing it, I’m like, “Oh, so you mean this hits here, and that hits here, and this hits here?” And he goes, “Yes! Okay, now you’re getting it.” And it finally clicked that I actually needed to see it physically. I think even if I’d watched a video of that, I wouldn’t have got it. I needed to interact with it and physically see it. And at that point, because I had gotten away with doing well in school up until then, until I got into this situation, I realized, “Oh, I need to touch things.”
So now I think about even my personal life. I mentioned I’m a scuba diver. As I’m learning more, I’ll always tell my instructors, “Okay, we can sit in a classroom and we can talk about this stuff. I know you have to check this box, but you’re going to spend more time with me in the water than anything else, and you’re going to have to touch me, move my leg to kick this way, and I’m going to need to touch you so I can see and feel and be interactive in it,” all because I can remember that moment in high school. But I think about all the kids who don’t have those opportunities to have that click, that “this is how I’m going to figure out things in my life.”
John Kleeman:
Interesting. I think for me, I’m not sure about a single moment, but the concept that the computer just executes instructions in order, that a processor just does one thing after another after another, has helped me understand how software works, helped me write software, and helped me understand everything, albeit with the new AI it maybe doesn’t work quite so deterministically. But I think that’s been a huge learning benefit to me in my life. And I would encourage people in the audience: think about what your biggest learning moment was, because it does sound like a very interesting thing to do. And if it can help you with your metacognition on how to improve your learning, that sounds good. So look, that’s one piece of advice for listeners, but I’d like to close the podcast, Melissa, by asking you: you’ve got a huge amount of experience in this field. What advice would you give to people who are starting off in the ed tech, learning, or assessment field?
Melissa Loble:
Thank you. I love that question. So I think my advice would be: try to always walk in the shoes of those you are serving. And I say this because I watch this actually even… We have to catch ourselves in my own organization, even my own team. Think about who you are serving with your… So if you’re listening to this and you’re an ed tech builder, think about your teachers and students and administrators, or whoever those roles are, and walk in their shoes. What are they facing? Actually get into the technology and use it from their perspective. If you’re a teacher, again, get into your lessons. Think about doing that lesson yourself and how it would feel for you. And the same if you are someone who is building or supporting assessment. We have a team of researchers and a psychometrician, and they’ll say, “I have to step back and go take assessments,” or, “I have to step back and go look at the content, and then go attempt an assessment in the same way that a student would, to understand the alignment of the two.”
And I know it sounds simple, because we’re all like, oh yeah, we need to know our personas and know who we’re serving, but we don’t do it enough. I don’t do it enough. My team doesn’t do it enough. And then we catch ourselves off guard: but wait, why didn’t they understand this? We can do this in presentations too. Why didn’t they understand this? Well, I know it too well, I’m too close to it, so I skipped steps, or I didn’t think about an alternative workflow that might happen. So walk in the shoes, always, of who you’re serving, and do it regularly, so that you can make sure you’re not missing anything, you’re not too close to what you’re building or designing, and you can pull in the evolution of that learner or that student or that teacher or that administrator, because we’re all evolving in our roles. Pull that evolution into the work that you’re doing.
John Kleeman:
Thank you. I love that. Thank you. This is really good. And thank you everybody for listening to this episode of Learnosity’s podcast, Beyond the Score, with me, John Kleeman and my guest, Melissa Loble from Instructure. We appreciate your support. And don’t forget, if you’ve enjoyed this podcast, why not follow the podcast through your favorite listening platform, and check out our back catalogue. And please reach out to me directly at john@learnosity.com for any questions, comments, or if you’d like to keep the conversation going. And Melissa and I are also both on LinkedIn. Thanks again, and please tune in for our next podcast.