EdTech Empowerment: Innovating Education Together

AI & EdTech: Transforming Classrooms with Smarter Tools

Juan Rodriguez Season 2 Episode 3

In this episode of EdTech Empowerment, host Juan Rodriguez speaks with Lindy Hockenbary, an experienced K-12 EdTech advisor, about the integration of technology and AI in education. They discuss Lindy's journey in EdTech, the key elements of successful technology integration, common mistakes educators make, the importance of digital citizenship, and the role of cell phones in the classroom. Lindy examines AI's transformative impact on education, emphasizing the need for AI literacy among educators and students, addressing misconceptions such as the fear of cheating, and weighing the ethical considerations of using AI in schools alongside questions of equity, access, and professional development for teachers. She also advocates for collaboration between students and AI and for a shift from Acceptable Use Policies (AUPs) to Responsible Use Policies (RUPs), explores the emergence of agent AI, and offers practical advice for educators looking to strengthen their tech integration strategies.


EdTech Empowerment: Innovating Education Together is hosted by Juan Rodriguez, founder of NextGen Classrooms. Our mission? To empower every student with access to technology-rich education. Tune in each episode to hear from thought leaders, educators, and tech experts on transformative strategies in education, from digital literacy and AI ethics to building inclusive classrooms.

Let’s bridge the digital divide, together!

Visit our website at NextGen Classrooms to learn more about our mission and programs.

Don’t forget to subscribe, share, and join our growing community of educators shaping the future of learning!

Speaker 1:

Welcome back to another episode of EdTech Empowerment: Innovating Education Together, the podcast where we dive deep into the intersection of technology and education to bring you real-world insights from industry leaders and educators. I'm your host, Juan Rodriguez, and today we're tackling one of the most exciting and evolving topics in education: technology integration and the role of AI in shaping the future of learning. Joining us today is Lindy Hockenbary, an experienced K-12 EdTech advisor and the founder of Integrated PD. With over a decade of experience in instructional technology, Lindy has worked with educators to create transformative learning experiences using AI, digital tools, and innovative teaching strategies. We're going to explore best practices for integrating technology in the classroom, dive into the power of AI in education, and address the biggest challenges schools face when adopting new tech. Whether you're an educator looking for practical tips or a tech enthusiast curious about the future of learning, this episode's for you. Let's jump in. Hey, Lindy, how are you doing?

Speaker 2:

I am well, how about you?

Speaker 1:

I'm doing well. Thank you for being a guest on our podcast today. Tell us where you're joining us from.

Speaker 2:

I'm in southwest Montana, in Bozeman, and I'll tell you that we've been under basically a winter weather advisory for a solid month and a half. My phone alerts every day; I just ignore it now. Every day it's winter weather advisory. It started snowing late yesterday afternoon and it hasn't stopped. I think we've gotten about a foot that I have to go out and shovel pretty soon, but it's still coming down, so there's no reason to shovel until it quits, right? Because then I'd just have to do it again.

Speaker 1:

Oh, we feel lucky here in Rhode Island. We haven't received too much snow, but it is freezing out there, and the cold temperatures are freezing all the snow and rain. It just looks ugly out there. But I'm glad that through all this bad weather we can sit down and chat about EdTech, so let's jump right into it. Can you tell us about your journey into EdTech and how you became a K-12 EdTech advisor?

Speaker 2:

Yeah, so I've spent my career in education, about 20 years now. I started out teaching middle school and high school. I taught CTE, career and technical education: business education and family and consumer sciences. When I taught business, my classroom was a computer lab, and this was in the mid-to-late 2000s, when laptop carts basically didn't exist. Maybe they did, but they were super, super rare. Chromebooks didn't exist, Google apps didn't even exist; none of that stuff existed. So, if you can imagine, my little tiny classroom in rural central Montana was filled with old-school tower desktop computers with really deep monitors that took up almost the entire room. And I taught a lot of technology topics.

Speaker 2:

Because when you teach business education, your topics are either tech heavy, business heavy, or a mix of both, and my curriculum was very tech heavy on the business education side. So, long story short, I always had a one-to-one environment, and using technology as a learning tool was always really natural to me because I always had that access. That led me into working as a technology integration specialist focused on the instructional, pedagogical side, and, long story short, that's how I ended up here.

Speaker 1:

I love that. I also started as a CTE teacher, and that's kind of what got me excited about EdTech as well.

Speaker 2:

Okay, which CTE topics? Because when you say CTE, that's such a huge range.

Speaker 1:

Yeah, there are different clusters. I was in audiovisual communications, teaching digital arts, so my classroom was very similar, just more modern. We had MacBook Pros in our classroom because we were teaching digital arts.

Speaker 2:

And see, in Montana we're so rural and so small that that would actually fall under the umbrella of business educators. Like the yearbook; yearbook is what I did. So I got just a teeny bit into that digital arts realm.

Speaker 1:

But what inspired you to focus on instructional technology and teacher support?

Speaker 2:

Well, number one, technology is a skill that everyone has to know to function in society. When I started teaching, my first year, I had just turned 23, because I have an August birthday, so just barely 23. And if you can imagine this, I taught middle school and high school.

Speaker 2:

I had high schoolers; my seniors were only four or five years younger than me. So I'm an elder millennial. A Gen Z colleague told me that several years ago, and at first I was like, what? I hate that term. But since then I've embraced it. I am an elder millennial, and we're pretty awesome, because we've lived through so many huge technological changes in our lives, more than any generation, especially when we were kids and in college. But anyway, I'm an elder millennial.

Speaker 2:

So I was teaching millennials; the younger millennial group were my students. I think that was a good thing, because I was able to take my experience as a high school student, as a college student, and as a professional, and see that technology is literally their life, that they have to know this skill to function in pretty much any job in society. So that was number one, and that's another reason why I was teaching technology. But I was also exploring using technology as an instructional tool, right? And the second thing is, the more I did that, the more I saw that technology is a learning tool that enhances learning. It allows my quiet students to have a voice, so everybody in the class can have a voice, not just the extroverted ones who raise their hands. That's me.

Speaker 2:

I'm that person, right? So it allowed those more introverted students to have a voice. It provided choice to my students, because I was able to give them more choice in the products they were making and in how I was assessing learning. I was able to provide multiple means of representation, a UDL term, right? And give them multiple means of expression, that assessment piece of it. Like, you don't have to write a paper; I'm going to give you choices. At the time we had Microsoft Publisher, which, I don't even know, does that still exist? But I'd give them the option: hey, go create a poster in Microsoft Publisher of what you learned in this unit or this learning outcome. And the more I explored that whole idea of giving more choice and giving students voice, I was like, wow, this is allowing me to personalize learning for my students, and I couldn't do it without the technology.

Speaker 1:

Yes, right on. And you're showing your age by mentioning that one, right?

Speaker 2:

I know, right? I always tell people, and by the way, listeners can't see me, but everybody always says, you're not old enough to have been teaching for 20 years. And I'm like, yes, I am; I'm older than I look. It's an advantage of being short.

Speaker 1:

When you're short, you look younger. Oh man, clip art, that's super old school. But tell us, what are the key elements that make technology integration in education successful?

Speaker 2:

I love that question. Actually, a few weeks ago I got to teach some first-year teachers about using technology as a learning tool, and it was really an eye-opener for me. I used to do that all the time, but for a while there during the pandemic my work was all online virtual learning, and then we got AI, right? So I was able to go back and revisit what I call foundations. I actually have a book, A Teacher's Guide to Online Learning, and foundations are so important that the whole second section of the book is literally called Part Two: Foundations. The whole idea is that if you're going to be successful in using technology as a learning tool, you have to create a culture of technology use, and there are foundational elements you need to work through. To give some ideas: number one, always, is access. If your students don't have access, it's not going to work.

Speaker 2:

I always tell the story of the first iPad, which came out around 2009 or 2010, somewhere around there. That was right when I started working as a tech integration specialist, and schools were buying up iPad carts and didn't know what to do with them. They were like, we just want to play around with these new tools; come do a training for us. Okay, that's what I did; I was a tech integration specialist, so I'd go in and do these trainings, and of course we would find ways to use the iPad carts that were shared among rooms. But it was limited, because the teachers were like, I have to plan ahead and I have to check this cart out. That doesn't work. I used to say this a lot: technology in a classroom should be like air.

Speaker 2:

It should be no different from the sticky notes or highlighters that you have; it's another learning tool. So you have to figure out that access, so that at any point in time a student can grab a device and use it as a learning tool. That's foundation number one; you've got to figure that out. Number two, which builds on that, is accessibility. If you have all Chromebooks, and I'm not picking on Chromebooks, let's just go with them because I mentioned them earlier, but you have a student who can't physically interact with the Chromebook, that's the end of learning for that student right there. Access is everything, the foundation of the pyramid, I always say. Then you've got to tackle student data privacy compliance, and if you're not getting support from your school on that, you have to advocate for it as a teacher.

Speaker 2:

Then you have what I call the digital course home base. The technical term we talk about a lot in education is the LMS, the learning management system. That's the place; I call it the digital hub. That digital course home base is your home base, your hub; everything for your class flows in and out of it. And what I've learned from working with a lot of teachers is that because it's called the digital course home base and it's a digital tool, they think they're only going to post digital assignments to it. No, I want you to post everything. It's the home base, right? It doesn't matter if the assignment, the learning task, is digital or non-digital. If your students are drawing a picture, I want that in the home base. That also leads into parent communication, and if students are absent, it's that communication hub. So that's a big foundation. Next: consistency and clarity.

Speaker 2:

There's a ton of research behind that, and I think we've really overlooked it in education. It's the idea, as I always say to elementary teachers, that you guys are the masters of routine. Think about all the routine, in middle and high school too, but especially elementary: this is how we line up for lunch, this is how we walk down the hall, this is how we use our inside voices. Routine, routine, routine. But where we've gone wrong in EdTech is that we haven't transferred that routine into the digital space. So you have to think about things like organization, naming conventions, and formatting, making them as clear and consistent as possible.

Speaker 2:

Adult support is another huge foundation, and we're really feeling that right now. There's a huge conversation in K-12 EdTech around too much technology, with some schools going in the opposite direction and getting rid of technology completely.

Speaker 2:

That's not the answer, right? But the conversation in society can easily take us there if we're not educating those adults and our community members: no, it's much more than that, all the things I've said; it's a skill students need to learn for life. On that note, digital citizenship curriculum: every school needs to have it and teach it, whether it's embedded in the core content areas, which you can do, or it's an extra thing, whatever works for you. But you need to have it. And then, finally, my last foundational element is device-agnostic instructional tools. Notice that that's last, because you want to pick the tools last. But you do have to have the tools; they're a means to the end. So we have to talk about them and we have to have them, because if you don't have school-safe, school-friendly tools, you can't do anything past that.

Speaker 1:

That's right on, and we want to make sure students are advancing when they're using these technologies. Which leads us to our next question: how can educators ensure that technology enhances, rather than replaces, strong pedagogy?

Speaker 2:

Your standards. Your standards tell you what you need to teach and what your students need to have when they walk out of your classroom. They then create your learning outcomes. So I always tell teachers: start with the standards, build the learning outcomes from there, then choose the tool that's right for the job. That might be a tech tool; it might be a set of Post-it notes (I keep referencing those because they're right in front of my face at my desk here). It could be index cards, math manipulatives, or a digital tool like Padlet or Book Creator, just to throw a couple out there. I actually have a physical box in the closet behind me, a little toolbox labeled Lindy's Teacher Toolkit or Lindy's Teacher Toolbox or something like that, and inside it I have highlighters, index cards, Post-it notes, manipulatives. But then I have a note that says Book Creator, Padlet, Canva, because those are tools that help you get to the learning outcome.

Speaker 1:

Awesome, and you did mention that you have a book. Can you just plug that?

Speaker 2:

Yeah, so it's called A Teacher's Guide to Online Learning, and I know when a lot of people hear online learning they think, well, I don't teach online, this doesn't have anything to do with me. But I promise you it does. I wrote it for online learning teachers, to be everything a teacher needs to start teaching online. But I always tell teachers, and I said this so many times during the pandemic: if you have a course that's set up for online, you can move it face-to-face very easily, and in fact it becomes more personalized and self-paced for the students. It's way harder to go the other direction, which is what almost every face-to-face teacher experienced during the pandemic. It's way harder to take a face-to-face course and make it fit online. You can go one way; you can't easily go the other.

Speaker 2:

So, as I told you, part two is all foundations. All those things I just talked about are outlined in detail in part two. If you have those foundational elements set, you're going to be good to go whether you're teaching face-to-face, fully online virtual, blended, hybrid, HyFlex, whatever other terminology we throw out there to define the learning environment. And the strategies I give you for teaching virtually are ones you can easily use face-to-face. I talk about self-paced lessons a lot, for example; you can do self-paced lessons anywhere along that continuum from face-to-face to virtual.

Speaker 1:

Awesome. Say the title of the book one more time?

Speaker 2:

It's called A Teacher's Guide to Online Learning.

Speaker 1:

A Teacher's Guide to Online Learning. It sounds like it has the blueprint to tech integration.

Speaker 2:

It is, and it's big: 340-some pages, just to give you a little context. So it literally is a full end-to-end guide to what you need to build that culture of technology.

Speaker 1:

Nice, awesome. And I'm pretty sure that book talks about some common mistakes, right? What are some common mistakes you see teachers making when integrating technology, and how can they avoid them?

Speaker 2:

Not creating that culture of technology, not having those foundations. If you don't have the things I just listed, and those are just the big ones, you need to back up and set those foundations. That doesn't mean you have to halt everything, but you yourself need to back up and revisit when you find yourself saying, oh, my students are always off task on their iPads. Especially now: I told you the iPad came out in about 2010, so what is it, 15 years? So let's say every kid in K-12 has had some sort of tablet in their hands, probably since before they can remember. Think about that; it's natural to them. But what comes naturally to them is using it for fun, for games. What doesn't come naturally is using technology to learn, to keep yourself organized, to be a productive member of society. Those are things we have to teach. So: culture, foundations.

Speaker 2:

Another common mistake is thinking you have to have 50 different instructional tech tools. Let me define what I mean by an instructional tech tool: it's a tech tool that allows you to do lots of different things. I've named a few of them. Padlet is a big one for me; I use it in almost every teaching session I do. Book Creator is huge. Canva, huge. School AI Spaces is another one that's becoming part of my top five; I use it in almost every single session. Those are tools that let me do lots of different things, and no matter what learning outcome I have, I can use them in some way or another. They're very different from curriculum tech tools, the tools that teach, say, math skills specifically, if that makes sense. So you don't have to have 50 instructional tech tools. Five to ten, literally, five to ten.

Speaker 2:

And I just named what would probably be my top five instructional tech tools. Slowly build on those, especially depending on where you sit on the diffusion of innovation curve: you have innovators, early adopters, early majority, late majority, and then the laggards. If you're on the late majority and laggard side, which is 50% of people, by the way, then start with even fewer than five. Start with one, then add another: okay, this is going great, I'm going to add another tool that lets me do a little bit more. The trick is picking tools that give you a lot of bang for your buck, and a lot of the ones I just named do. Now, this is going to be controversial, but I think a mistake being made right now, and it's been made for a while, especially in the conversation in our society, is thinking that technology is the problem, or the villain.

Speaker 1:

I'm big on that, right? Especially in classrooms where they're telling students to put their cell phones away. It sounds like you're saying cell phones shouldn't just be put away, but that they should be used as a tool to help students advance their learning. But I'm pretty sure a lot of teachers listening are going to say, what is he talking about? It's a distraction; students are not going to use cell phones to learn, they'll just go on TikTok or social media. So what can we do to encourage educators, or give them a guide, so that they can teach their students how to use this as an effective tool in their classrooms?

Speaker 2:

Yeah, so it goes back to that digital citizenship curriculum being a critical foundation that every school needs to have. Figure out where it's going to be taught: is it part of the core curriculum? If so, teachers need to be trained. Or are you going to have a teacher who teaches a more standalone digital citizenship course? There's no right or wrong answer there, but the right answer is that it has to happen. Right now, and this is what's happening with AI too, by the way, which is just another tech tool, we can get there, the conversation blames the technology as the problem, or the villain, or the enemy, because it's right in front of our faces. Kids have their faces in their phones at all times. So we're blaming the technology when, in reality, we need to back up and look at the big picture, the foundations. We need to teach them how to have healthy and responsible relationships with technology.

Speaker 2:

I talk about this all the time, and I've asked hundreds, maybe even thousands of teachers this: if I were to say, right now, get rid of your smartphone, come hand it to me and you're never going to get it back, who would do that? I don't think I've ever had a single hand raised. I say, okay, so Pandora's box is open; it's never going to be shut again. You just proved it right now. We can't act like it doesn't exist. We can't create schools that are bubbles operating outside of societal change.

Speaker 2:

So, to me, the problem is that we have so many load-bearing walls in formal education that make it hard, not impossible, but hard, for change to happen at the classroom level. For example, maybe you're listening to this and thinking, my school doesn't have a digital citizenship curriculum; I don't even know what that is. You can take it into your own hands as a teacher, and I know a lot of teachers who do this, and say, nope, I see the importance, and even though I'm not getting support from my school or my district, I'm still going to teach these things in my class.

Speaker 2:

I'm going to have the conversations: that AI is just a predicting machine, that algorithms are built to be addictive. I'm going to have my students analyze the terms of use of the social media tools they use every single day, especially if I have middle and high school students. I babbled a lot there, but hopefully I came back around and made my point: we've got to back up, we've got to look at the huge load-bearing walls that are the real problem, and technology is never going to go away, so we have to teach kids to have healthy, responsible relationships with it. There's the summary right there.

Speaker 1:

And I also want to emphasize the importance of digital citizenship and of helping students understand it by providing those courses. One place you can find some of those courses is websites like commonsense.org, and there are plenty of other resources out there. But you mentioned AI as a key topic. How do you see AI shaping the future of education?

Speaker 2:

Oh boy, where do I start? So we're recording this in the middle of February 2025, just so you know, and it's right in the middle of what I call EdTech conference season, because there's a whole bunch of EdTech conferences within about six weeks of each other. I've just come off five different EdTech conferences, and I have another one next week in Seattle that I'm attending. At one of those conferences I led seven sessions in three days, almost every one of them around AI. So I've had a whole lot of conversations with educators.

Speaker 2:

I have so many thoughts in my brain right now about AI and education. But to start: in education, and in society in general, not just education, we are not understanding that AI is changing the way we learn, think, and do. My daily work looks completely different than it did two years ago, because I have outsourced a lot of my doing to AI. And I know that freaks people out; a lot of you are going, what is she talking about? Because if you've never tried it, this sounds like a foreign language. A friend of mine posted a blog post, I think earlier this week or maybe last week, that said we need new terminology to talk about AI because it's so different; the words we currently have don't fit. And I agree 100%. Because I've outsourced my doing to AI, I now have more time to learn, to think, to go deeper, to extend my reach and share more with educators than I ever have before. It's an amazing feeling. But, as I always say, you tell your own story, and if you don't tell your own story, somebody will tell it for you. And the story in education right now is that AI is only used for cheating. AI is just cheating; that's the only story you hear. We haven't stopped to think about the rest, and yes, there are a lot of bad things.

Speaker 2:

With every new technology, and technology is usually what causes shifts in society, in the last 30 or 40 years it's usually been a technology that does this, there are always goods and there are always bads. I call it the black and white. Nothing in life is black and white; everything is gray. So you have to take the good with the bad and find a happy medium in the middle.

Speaker 2:

Is AI just a tool? I told you technology is just another tool for learning, and it is. There's a lot of conversation about that in education right now, like, is AI just a tool? I mean, it is, and I think that conversation is healthy. But by saying it's just a tool, or that it's overhyped, and we could go on and on about that, we take away from the conversation that it's a completely different type of tool. We have never before in the history of humans had a type of tool that allows you to outsource your doing. Think about that. And here's the other reason this is a key different type of technology: we're moving from computers to machines. Computers operate on rules-based computing; we're now moving into machine learning. So I like to use an example.

Speaker 2:

Think about software in, I want to say, the late 80s and early 90s, when Microsoft started putting out its word processing and spreadsheet software. Let's think about spreadsheets. The old Lindy spent a ton of time creating spreadsheets. The software helped a lot, and it still saved me time compared with not using spreadsheet software at all, but because I had to build the spreadsheet, I wasn't able to spend as much time analyzing it. If I've spent all this time building a spreadsheet of my students' testing data, for example, adding all this data validation and conditional formatting, that's time I could have spent analyzing. The building helps me analyze, but I had to do all that to get to that point.

Speaker 2:

The new Lindy, I told you I've changed my workflow, has ChatGPT create her spreadsheets for her. I specifically mention ChatGPT because, of any AI tool I've tried, it's the best at actually giving you a downloadable Excel file. I'm not doing this for anything that has student data, by the way, in case people are seeing red flags there. I gave the example of a spreadsheet for student testing data; I would never put that into ChatGPT. We can talk about that more later.

Speaker 2:

But for other things that don't involve student data privacy or sensitive, confidential information, I have ChatGPT create the spreadsheet for me, and then guess what? All the time I used to spend creating the spreadsheet, I now use to analyze the data, to think, to use my human brain to dig deeper. So, moral of the story, and again I'm babbling here, but I think it was an important babble: AI is changing everything, but in education we're still trying to fit a square peg into a round hole. The square peg is AI and the round hole is formal education, with all those load-bearing walls. We're trying to shove this very different, machine-learning type of technology that outsources doing for humans into the box that is formal education.

Speaker 1:

That's right on, and I think folks need to go back to early in this conversation, right, when you mentioned that they have to build a culture and set standards so that students, and the whole school in general, are using AI in the right way. But I think what educators are afraid of is what you had mentioned: that students are using this as a way to cheat. Can you share some examples of how AI has been effectively used to improve learning outcomes instead?

Speaker 2:

Yeah, so I told you that the only story we've heard is that it's used for cheating, that students are over-reliant on AI. I don't doubt that that's sometimes the case, don't get me wrong. But a couple of things I want to mention here. For one, there's some new research out, it came out mid-to-end of January, so in the last few weeks, on what they call the magical thinking trap. When AI literacy is lower, so someone doesn't have much AI literacy, their use of AI increases. So stop and think about that for a second. And let's do the opposite too. As AI literacy increases, and to define that, understanding, using, and evaluating AI is a good explanation of AI literacy. So when somebody understands how AI works, that it's just a predicting machine; when they know how to use it, know what to outsource to it and what not to, what's uniquely human versus what AI can help them with; when they can evaluate the outputs, their AI usage actually goes down. So think about that. What we have going on in education right now is that we have, not just students, but the majority of society, with very little AI literacy. They don't understand it. They don't understand that it's a machine, right, and so they think it's magical. So, back to the magical thinking trap. The magical thinking trap is this idea that when you don't understand how something works, you think it's magical, and you think it's going to solve all of your problems. But as soon as you pull away that curtain of magic and you know that it's just a predicting machine, you now understand that, oh, this is going to help me do certain things, but it's not going to help me with other things, and how to evaluate those two. Does that make sense? The magical thinking trap?
Now, this is another thing. It isn't necessarily relevant to the question, but I promise I'll get back to the question, because this is so important.

Speaker 2:

This research study, you would think, was an education study. It was not. It was a study done by business professors, specifically around marketing strategies. I have the quote somewhere in a slide, but basically you can go to the study; I'll give you the name of it and we can put it in the show notes too, if you want to dig into it. It says that, basically, businesses should shift their marketing efforts to those with low AI literacy.

Speaker 2:

And when you realize that... I've already been noticing this for a while, by the way. There was an ad in last year's Super Bowl. We just had the 2025 Super Bowl; in the 2024 Super Bowl, there was an ad for Amazon Q that had Jelly Roll in it, and Jelly Roll has face tattoos, right? The whole ad was that, oh, Amazon Q is magical, it's so magical. Jelly Roll got another face tattoo that said AWS across his forehead.

Speaker 2:

And guess what, this is happening a lot in education marketing and tech tools, by the way: this magical idea that says it's going to solve all your problems. Right, as a teacher, it's just going to solve all your problems. Magical thinking trap. Okay, so that's number one. The reason I say that is because I want you to understand that if you teach your students AI literacy, when they understand that AI is not magical, that over-reliance and that idea of cheating is going to go down.

Speaker 2:

Okay, the second part of how I want to answer this question. You asked for a specific example, so I'm going to give you a specific example. The reality is, and I don't think educators have had enough time to pause and think about this, that your students have been copying and pasting from Wikipedia and other websites forever. I had battles with my students over Wikipedia, and the validity of that source has gotten way better since then, by the way; I won't go into that. But the key difference is that when they were copying and pasting from Wikipedia and other websites, you could actually detect it. It was detectable, right, which made it not copy-and-pasteable, because you could run it through a program that would say, oh no, this came directly from Wikipedia.com.

Speaker 2:

The reason the cheating conversation has come to the forefront is that AI-generated text is customizable. Okay, but we've only looked at that as a bad thing in education, and this is what drives me crazy. I'm like, again, let's change the story. Get the idea that your students are using it to cheat out of your mind. Just get rid of it for a second and think about what I just said: text is customizable. Sit on that for a second, the instructional implications of that. Okay, pause this podcast if you need to and just sit on that. What that means is that Wikipedia you can only access if you read at that reading level and in that language, or if you have a tool that translates it for you.

Speaker 2:

I can now take text from a website if I want to. I can copy and paste it into an AI tool and I can tell it: explain this to me like a seventh grader, explain this to me like a kindergartner. I do that almost every day, by the way, and that's very subjective. I can get even more specific as a teacher, and I can say, translate or don't translate, convert this to a 700 Lexile score. Oh, you have students that read at a 400 Lexile score? No problem, convert this to a 400 Lexile score, right.
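As a rough illustration of the reading-level move Lindy describes, here is a minimal Python sketch that assembles the kind of instruction you would paste into a chatbot along with the source text. The function name, prompt wording, and Lexile targets are illustrative assumptions, not a specific tool's API.

```python
# Hypothetical helper: wrap any passage in a reading-level rewrite prompt.
# You would paste the returned string into ChatGPT, Claude, etc.

def build_rewrite_prompt(passage: str, lexile: int, language: str = "English") -> str:
    """Return a prompt asking an AI chatbot to rewrite `passage`
    at roughly the given Lexile level, in the given language."""
    return (
        f"Rewrite the following text at approximately a {lexile} Lexile "
        f"reading level, in {language}. Keep the facts the same and do not "
        f"add new information.\n\n---\n{passage}"
    )

# One source text, two reading levels (and a translation):
source = "Photosynthesis converts light energy into chemical energy stored in glucose."
print(build_rewrite_prompt(source, 700))
print(build_rewrite_prompt(source, 400, language="Spanish"))
```

The same template works for "explain it like a seventh grader"; only the instruction sentence changes.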

Speaker 2:

And then, something that's also been way underutilized in education, and it's an example of how to use AI to improve learning outcomes: we have a relevance problem in formal education. That's another load-bearing wall that we could talk about a lot. We have Gen Z and Gen Alpha learners that have absolutely no tie or relevance to our standards and our curriculum in so many cases, right. But what AI can do is take literally any content, topic, or subject matter and tie it to any interest your students have. And I love to ask teachers this question, what are your students into?, because I get some of the most random answers.

Speaker 2:

Sometimes I get really generic ones, like video games, Minecraft, sports, right. But then sometimes I get really bizarre, niche ones. Like one time I got Pennywise. I'm like, middle schoolers are into Pennywise? The other day I got some gorilla VR game that I'd never heard of. But here's the great thing: it doesn't matter how niche your students' interests are. You can take any subject, content, topic, et cetera, and tell an AI: relate this to video games, relate this to Minecraft, relate this to sports, relate this to football, relate this to that gorilla VR game. Maybe don't do Pennywise; that might be the line you draw with your human brain as a teacher, right? So, moral of the story, hopefully that gives you a good example of the reality of where we are, and have been for a while, by the way, with AI. And it gets better every day. I'm not exaggerating when I say we get new AI models every single week that make that better and better.

Speaker 1:

Oh, for sure, for sure. And I love those examples that you gave, because you can either enhance or simplify a lesson or curriculum; it's up to you how you want to use that tool. We did talk about some misconceptions of AI, but let's talk about some more. What other misconceptions do teachers have about AI in the classroom, aside from cheating?

Speaker 2:

Aside from cheating? I mean, I already said it: that tech is the enemy, or tech is the problem. It's the same thing with AI. It's not the problem. Dig deeper. Dig deeper. And this is just human advice as a member of society: if you ever see something at surface level and you're like, oh, that's the problem, smartphones, AI, technology, whatever it is, dig deeper, right. That's what we need to do in AI and education right now. The problem is not AI. We need to dig deeper into all of these deep, deep issues, the load-bearing walls in education like I talked about, the issues with the tech companies that drive our capitalistic society and that are using you and your students' data. That's how they make money a lot of the time, by selling your data. And data is the new gold in the age of AI. There are a couple of new golds, but that's the big one, right, because AI is trained off of data made by humans. Data is the new gold. If you don't have data, you don't have AI models, and especially not new AI models that are better than older AI models, and then you're not making money in a capitalistic society, right? It goes round and round and round. The other problem I have, and I talk about this all the time, if you follow me you're like, okay, Lindy, you're a broken record, but I will say it again: classroom teachers do not have anywhere close to enough non-instructional time to keep up. I told you I've been in education for 20 years. About 15 of those years I've been specifically in ed tech, and up until the last two years or so, about the time that ChatGPT was released, November of 2022, I didn't have a problem keeping up with all the changes. It's my job, I love it. I'm a bit of a nerd like that. I love to learn new technologies and all these new features in these tech tools.
They're making me better and they're always getting better. It's great. AI is now moving so much faster. Its development, its increase in, what's the word I'm thinking of, in how good it is, the easy way to say it, I guess, is moving so fast I can't even keep up. Let's just look at the last month. Honestly, not even the last month, the last two to three weeks. We got DeepSeek, which forced OpenAI, and they're lying if they don't tell you this, to rapidly push out their new o3 reasoning model, because DeepSeek's is a reasoning model and it's open source, right. Then DeepSeek said, oh, we have this new AI assistant. Agentic AI, right, we haven't talked about that yet. So then OpenAI said, and this was all within a week of each other, people: oh, by the way, we have agentic AI too. We have both deep research and this new Operator feature; that's agentic AI for us. Boom, boom, boom, boom, boom. That happened, I'm not even kidding you.

Speaker 2:

I was at the TCA conference, and I was actively updating, even like an hour before my sessions. AI assistants, all that terminology, it was the release of agentic AI, getting to true agentic AI at the consumer level. It's been around for a while, but it was more at the enterprise level, and it was out of reach for the average person or, honestly, anyone but very large businesses. It's now available at the consumer level. It's there, full agentic AI, right? So, moral of the story: I can't keep up.

Speaker 2:

I am not in the classroom full-time anymore, so I don't have kids in my face like six hours a day. Think about that.

Speaker 2:

Maybe even seven hours a day for some teachers, depending upon how many preps you have and how much non-instructional time you have.

Speaker 2:

Most teachers... and I ask teachers this a lot, and the most I've ever heard is from a handful of teachers who told me they have two preps a day, so maybe a little more than an hour and a half. Most teachers only have about 50 minutes of prep time a day, non-instructional time if you're not familiar with that terminology, right, meaning time when they don't have students they're actively teaching or monitoring in some way. That's barely enough time to answer email, let alone figure out all these technological changes. And I always say that teachers have a double duty, because we have to figure out the technological change and understand it, but then we have to take it to the next level and determine how that change is going to affect our curriculum and then our instruction. Teachers don't have time to do that, and we don't give them enough PD time in the majority of cases to figure out how to do it.

Speaker 1:

Yes, right on. I like that you're an advocate for more time. And let's go back to what you had mentioned, right, and we're bouncing back and forth, I apologize for that, but you had talked about...

Speaker 2:

No, I'm bouncing back and forth.

Speaker 1:

I am too with these questions. But, like you had mentioned, developing AI literacy, right: the more someone increases their AI literacy, the less dependent they are on it. How can teachers develop AI literacy and use it in ways that empower rather than overwhelm students?

Speaker 2:

Okay, so I want to start with this: students are not overwhelmed by AI and technology. I told you, we have this myth of digital natives, and it's a myth in the sense that they don't know how to use technology to learn and do productive things. They know how to play Angry Birds, or whatever gorilla VR game is out there right now, right. Students aren't the ones who are overwhelmed. I've asked this to so many students. They're not overwhelmed.

Speaker 2:

The adults are, and a lot of the reason educators are overwhelmed is that they don't have the time, right. We could link it back to that. It's also just not native to most adults, so it does make you feel overwhelmed, and it is moving so fast that it creates an overwhelming feeling for a lot of people, right? So that's number one. Number two, how do teachers develop AI literacy? Professional development. And let me tell you, I hate that that term has a negative connotation in education. Professional development, professional learning, I've noticed a shift toward that terminology, which I do like.

Speaker 2:

Professional learning is just that, it's learning. And when did we go from teachers, whose whole job is literally to transfer learning to another human being, to having negative learning experiences? Think about that for a second. When you hear, oh, we have a PD day, you groan. That should not be the case. So I always tell teachers: you are advocates. And I know it's not natural for a lot of people to voice their opinions. It's very natural to me; if anybody knows me, I'm very opinionated. I will tell you my opinion all day long, for the most part. But I know that's not natural to a lot of human beings. But you have to be. You have to advocate to your leadership to get professional learning opportunities as a teacher, and the time to go do those professional learning opportunities and follow up on them, right. You have to have that. We've been in that situation for so long with technological change, and I've just explained why we're now at a whole other level.

Speaker 2:

So that's number one: you've got to have professional learning. Advocate. If your school isn't offering it, advocate for it. If they're still not offering it, go find it yourself. Really, there's a lot out there that you can do, a lot of free stuff too. LinkedIn is huge. I was just telling someone yesterday that the social media world is so weird right now. I don't know what social media tool to go to anymore, so I pretty much only do LinkedIn and TikTok, because that's where I learn by far the most. And I know a lot of people are like, TikTok? I'm telling you, micro-learning videos. I learn a ton off of there if you follow the right people and get the right hashtags and the right algorithm and all that stuff.

Speaker 2:

And that does take a bit of time to develop, but once you do, you can learn a lot there. So find the learning yourself, right. And I know that's sometimes hard as a teacher, where you're like, I'm only contracted eight to three, or three-thirty, or whatever it is. But we're professionals, and learning never stops. That's just what it means to be a productive member of society: you can never stop learning, right. So that's number one. Number two, and I know this is tricky, but you just do it. Literally the Nike, the Nike theme, it's not theme, whatever they call it. Their slogan, thank you. The Nike slogan: just do it.

Speaker 2:

I just had this conversation with someone the other day. I told them to do some research on something, and I said, go check out ChatGPT and turn the search feature on. It's a really great way to help speed up your research process, right, because it will pull out the research that might have taken you forever to find in a Google search; it pulls that out and summarizes and customizes it for you. Well, they came back and said, well, I did it, but it gave me a website that doesn't seem to exist. And I'm like, okay. And she's like, well, I don't understand why I used it then. I said, learning how to use AI, and again, learning how to use it to outsource your doing, is a skill that has to be developed, and to develop a skill, we know as teachers, takes time, and trying, and failure as part of it. It's a combination of learning how to leverage the AI, learning what it's good and bad at, learning what different AI models are good and bad at and what features they have. And that's a very complicated situation right now. I just saw posts from OpenAI saying they're changing their product to hopefully fix that. Thank goodness, because I have eight different AI models in my ChatGPT Pro account right now. That's a lot to choose from, and I'm constantly bouncing between them, because this one's good at this, that one's good at that, this one has search, that one doesn't, right.

Speaker 2:

The only way I learned that was to do it and develop that skill, and then I take that and combine it with my human brain, right? I sometimes combine it with a Google search, but I will say that less and less every day do I find myself doing, I call it a Google search, but an internet search, versus more of a generative AI type of search with a conversational AI chatbot. So just do it. You have to have time to develop that skill, and it takes time. And that goes back to my point that teachers don't have very much non-instructional time. So what I tell teachers is: do it with your students, and of course, make it age appropriate.

Speaker 2:

But some of the best teachers I know that are embracing this change and embracing AI step back, no longer look at themselves as the expert in the room, and open up the conversation to students: hey, this is all new to all of us. I'm still trying to learn it. Let's learn together. And of course, you have to do a little bit of research as a teacher to make sure you're using tools that are school approved, school safe, data-privacy checked, and developmentally appropriate for the age of your students. But once you figure that out, don't feel like you have to spend hours figuring out the tool. Do it with your students and learn with them. You'll be amazed at what they can contribute to the conversation and what you can all learn from each other.

Speaker 1:

Yeah, I mean, as you said earlier, right, these students are born into these technologies, using them from infancy, and by the time they're in middle school or high school, they're pros at it; they can give us tips on how to use it a lot better than we currently do. And I love that you mentioned we need to encourage more educators to be intentional about the PDs they're having. Advocate for what you want and what you think you need at your school. So I love that you mentioned that. But let's talk about AI ethics; it's a growing concern in schools and school districts. What guidelines should schools and educators follow when implementing these AI tools?

Speaker 2:

So I was getting this question so much, and I was hearing from educators, okay, I want to try AI, but I don't want to do anything I'm not supposed to do, anything that's illegal, or put private student data into something I shouldn't, or whatever. So I created an infographic. I'm going to give it to Juan, and we'll post it with the show notes for you, so you have it. It's a good summary. It's called Navigating AI: Educators' Rules of Thumb, and there are four sections. Number one is what you should never upload into public AI models: confidential, proprietary, or sensitive information.

Speaker 2:

Those are like my three checkboxes: nothing confidential, proprietary, or sensitive. That's not a complete definition, but for the sake of this conversation we'll say the big one, specific to educators, is PII, personally identifiable information. That terminology comes straight from FERPA, which hopefully, if you're an educator, you know. You should have had FERPA training; if you haven't, please go find some FERPA training. Super important. It's a federal law that protects student data, basically student data privacy. It's one of several student data privacy laws, and it's kind of the big one. It says PII is anything that traces back to an individual: name, birthdate, address, phone number, social security number, those types of things. Never put PII into a public AI model.

Speaker 2:

Now we are getting better, in that schools are starting to adopt private AI models of different kinds, or private AI solutions that have what's called de-identification, that's the technical term. You could put, say, test scores in there, and it will de-identify any PII, how's that for a mouthful, so that you are FERPA compliant but can still utilize the AI, for example, to analyze testing data. So that's number one. Number two, and I'm only going to do one and two, I'll leave a little carrot for you to go look at the infographic for three and four, is to be aware of four points, and they're not comprehensive. Remember, this is just your rules of thumb, just to get you started with things to be aware of as an educator when it comes to AI.
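To make the de-identification idea concrete, here is a deliberately minimal Python sketch, assuming a known class roster and a few regex patterns for common PII. Everything here is illustrative; a real FERPA-compliant product uses far more robust methods than a handful of regexes.

```python
# Toy de-identification: strip obvious PII from text before it
# ever reaches a public AI model. Illustrative only, not compliance tooling.
import re

def deidentify(text: str, student_names: list[str]) -> str:
    """Replace roster names and common PII patterns with placeholder tags."""
    for name in student_names:
        # Known roster names -> [STUDENT]
        text = re.sub(re.escape(name), "[STUDENT]", text, flags=re.IGNORECASE)
    text = re.sub(r"\b\d{3}-\d{2}-\d{4}\b", "[SSN]", text)          # social security numbers
    text = re.sub(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b", "[PHONE]", text)  # US-style phone numbers
    text = re.sub(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b", "[EMAIL]", text)  # email addresses
    return text

row = "Maria Lopez (maria.lopez@school.org, 555-123-4567) scored 82% on the unit test."
print(deidentify(row, ["Maria Lopez"]))
# -> [STUDENT] ([EMAIL], [PHONE]) scored 82% on the unit test.
```

The test score itself survives, so the AI can still analyze the data, while the identifying details never leave the building.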

Speaker 2:

One is knowledge cutoff. It's becoming less and less of an issue every day as AI models get connected to the internet; ChatGPT, for instance, now has the search function. You can't use it with every model in every mode, though, so you have to be aware of that. What knowledge cutoff means is this: a large language model is the technology that drives the ChatGPTs of the world, the conversational AI chatbots, basically. Large language models have what's called a knowledge cutoff. Because data is the new gold, you get all this data, and imagine it going into a pot. They mix up that pot, that's the analogy I give teachers; in reality, it's all these different steps they go through to train the model. But after that, no more data goes into the model. Okay, that's called the knowledge cutoff. So Claude, for example, is not connected to the internet. I love Claude, by the way, but that is one disadvantage: it's not connected to the internet. We haven't gotten a new Claude model in a few months. I can't remember exactly when the last Claude model came out, but depending on when it did, if its knowledge cutoff was before November of 2024, it's not going to know that Donald Trump is the president right now. Does that make sense? That's knowledge cutoff. Adding true web search helps with the knowledge cutoff, but it's still super important to know about.

Speaker 2:

Hallucination is the term used when AI kind of makes stuff up. The better these AI models get, the less we see hallucinations, but I still think it's relevant to know and understand. Bias is huge. I told you, data is the new gold. Data comes from humans; humans create these AI models, and humans build the data. Humans are innately biased, so the data is going to be innately biased, and we're getting better and better about that as well.

Speaker 2:

As we get more data and learn, and these new models get trained, they become less and less biased, but you can never totally get rid of bias. You can get close to unbiased, but that's a huge conversation to have, with kids especially, and teachers. And then anthropomorphism is a big one. That's the idea of giving AI human characteristics or human emotions. This is a huge discussion. If you've never heard that term, look it up, and have these conversations with your kids, because we're starting to hear really terrible stories about kids developing unhealthy relationships with AIs. They don't understand that it's just an AI, an artificial intelligence, not a human being, or how to have a healthy relationship with an AI rather than an unhealthy one. Okay, that's that. There are two more parts to that infographic, but this gives you a starting point, a good foundation to build the AI ethics discussion on.

Speaker 1:

So what's the name of that infographic? One more time.

Speaker 2:

So it's called Navigating AI: Educators' Rules of Thumb.

Speaker 1:

Nice. Navigating AI: Educators' Rules of Thumb. You'll see that when we share the podcast; you can find it on our website or in the notes of this podcast. But let's jump into something you mentioned earlier, equity and access, which are the foundation of tech integration. Right, you mentioned equity and access as the foundational challenges. How can we address these issues to ensure that all students benefit from ed tech?

Speaker 2:

Oh man, asking the tough questions today. If I had the answer to that, I would probably not be here. To be quite honest, I'd be sipping Mai Tais in Hawaii, or something I don't know.

Speaker 2:

That's a big, big question. Those are both load-bearing walls, equity and access, and I told you that access is like the number one foundational element, the base of the pyramid. If you don't have that base of the pyramid, that technology culture, then when you try to build, your pyramid will fall. It will crumble, right.

Speaker 2:

So my answer is that we have to break down those load-bearing walls. My husband and I remodeled our house. We bought it during the pandemic and moved in in May of 2020. It was stuck in the 1970s and 80s, so we completely remodeled it over the last four, four and a half years. We knocked down walls and broke down drywall, and it was so satisfying to me to take a hammer, bang into the drywall, and start chipping away, pulling the drywall off these old, nasty walls that just needed to go away.

Speaker 2:

So I use that analogy with educators. You can't pull down a wall in one swipe unless you're a bulldozer, or one of those machines with the big wrecking ball that knocks whole buildings down, right.

Speaker 2:

You as a human being, unless you're a piece of machinery, can't knock down an entire wall in one swing. You have to chip away at it. I had to take that hammer and slowly chip away at pulling off the drywall. Then, once the drywall was down, we had to figure out: is this a load-bearing wall? If it was, we had to add another level of support, then take down the part of the wall that was no longer load-bearing. That analogy is exactly how it is as an educator. Be an advocate and start slowly chipping away. Do the things you can do in your classroom to make changes, to make your learning more equitable, inclusive, and accessible, right. But then be an advocate for the bigger load-bearing wall and what needs to be done to break down those equity and access challenges.

Speaker 1:

Nice. And if you don't mind, Lindy, I'm going to plug my organization. And not just my organization; I'm pretty sure there are other organizations. Tap into your nonprofits, tap into the organizations within your community. There are organizations out there providing access to technology and making sure education is equitable for all students, so I think that's another way to do it as well. You also talked about PDs. What role does professional development play in overcoming barriers to tech integration?

Speaker 2:

Gosh, I already said it: PD is learning. PD is everything, it's everything for a teacher. Teachers are people who, like I said, give knowledge to other human beings. We can't stop gaining knowledge, especially in a world that's changing and moving as fast as it is. So to me, it's everything. Another big piece, and by the way, this is what I do for a living, so I might be a little biased on this matter; I do a lot of professional development for teachers. But I'm big on the idea that we teach how we were taught.

Speaker 2:

The way that you learned as a learner is typically the way you then transfer your teaching, right, because it's all you know, it's all you've experienced. So I'm a huge advocate of modeling new learning experiences for teachers. I put them almost in the student's shoes, in a way, and it's really hard for teachers. I always tell them: take that teacher hat off for 10 minutes for me. And it's so hard for teachers to take the teacher hat off. No matter what I do, they're constantly going back to their teacher experience, which they should do, right. But I'm like, just try. Put yourself in the student's shoes, just be a learner here for a minute. Let me take you through a model lesson. Then, once we get through that model lesson, I'm like, okay, grab that teacher hat and put it back on. Whether it's a baseball hat or a cowboy hat, I don't care, put that teacher hat back on. Now let's debrief: what was the experience of learning that way like? How could you transfer this into your classroom? How could you adapt it to work for your particular learning environment, which is different in every single classroom, for every single teacher, for whatever age of learners you teach, whatever content you teach, whatever standards you teach, whatever curriculum you're tied to from your district, right? I've been doing this for years, this idea of modeling for teachers and having them be in the student's shoes first and then putting their teacher hat back on.

Speaker 2:

I was so excited that the ISTE conference this year came out with a new session type that they're calling Model Lesson, and mine got accepted. So in June, ISTE is in San Antonio, I think, this year, I will actually have a full session dedicated to putting teachers in the student's shoes and modeling a lesson for them. And you can probably guess what mine is focused on, personally: learning using AI. So they're actually going to get to experience it. That's another thing: this stuff is new to all of us. As human beings, we had never learned alongside AI or collaborated with AI until the last few years. Well, we had lots of predictive AI out there, but that's a very different type of AI than generative AI or agentic AI, right?

Speaker 2:

So I'm like, hey, let's just see: what is this like if I put you in the student's shoes and you actually learn alongside AI and have a chance to collaborate with AI, and we fact-check, and we collaborate with humans too? So it's not just collaborating with AI; we collaborate with humans as part of it. And I take them through this whole model lesson process.

Speaker 1:

That sounds exciting. Can you let the listeners know one more time where that's going to be located?

Speaker 2:

So that's the...

Speaker 1:

ISTE conference.

Speaker 2:

If you're not familiar with ISTE, it's the International Society for Technology in Education. They have a huge conference; I think it's the largest EdTech conference, maybe in the world. Bett might be close, I don't know. Bett and ISTE are the two largest. Bett's in London every January. ISTE is always in the US, and this year it's in, I believe, San Antonio. It's always like the last week of June, and it's huge.

Speaker 1:

Yeah, it's in San Antonio, June 24th.

Speaker 2:

Okay, good, you're checking as I'm talking. Good, good. I know it's always the last week of June, but it moves around. Last year it was in Denver, which is great for me in the West. This year it's in San Antonio. It's a huge conference. If this idea of digital citizenship, and how do I teach kids digital literacy and AI literacy, is new language for you, ISTE has standards, and actually several different sets of standards, I believe: standards for students, standards for teachers, standards for instructional coaches, like more instructional-support-type positions, and standards for school leaders. The student standards outline basically the competencies that human beings need to know in our current age; that's what the student lens is focused on. And then the teacher standards really outline what skills you need to know as a teacher to be able to teach those student standards to your students. So that's a really good starting point.

Speaker 1:

Nice, and all those standards can be found right on the ISTE website, and they're free?

Speaker 2:

Yep, iste.org. Here, let's check that as we're talking. I think it's... yep, iste.org is the website.

Speaker 1:

Yeah, and let's talk about some of these schools that are pushing away AI and creating these restrictive policies. How can schools create policies that support, rather than restrict, meaningful technology use?

Speaker 2:

Yeah, I love that. I've been an advocate for a while, way before this generative AI thing came to the consumer level, that we should be changing the terminology. We have AUPs, right? Acceptable use policies should be shifted to responsible use policies, RUPs. Just sit on that for a second and think about that shift in language from acceptable to responsible, and how that shifts things. When I was teaching, I had a really good mentor who told me, hey, your classroom rules or classroom policies shouldn't start with no. And this is the idea behind it.

Speaker 2:

If you're familiar with PBIS, right, the positive behavior, I can't think of what PBIS stands for at the moment, but it's the idea of positive reinforcement, positive behavior management. PBIS says you have maybe, at the most, three, well, we'll say three to five things that outline responsible actions as a human being, and when you outline them correctly, everything that is not that is a no. That list of no's, which could be a mile long, is included in there. Because if you have in your responsible use policy that you treat other human beings with respect, and so on, and you have a kid that's cyberbullying another kid, you go: does this fit that? No? Okay, well then you're not following our responsible use policy, right? Think about it, and look into PBIS if you haven't. So, anyway, shift from AUP to RUP, responsible use policy. Shifting that language, and then shifting the language in that responsible use policy to be more positively focused and not just a long list of things they can't do, is huge.

Speaker 2:

You need to have an AI policy as part of your responsible use policy. I hear about AI policies out there a lot, and I am more of the pedagogy, instructional person; of course, policy is the foundation of that, but I wouldn't call myself the policy person. I still get dragged into it a lot, though. So, to talk about it at a basic level: your AI policy, you have to have one. If you don't, you've got to get one, and it's amazing how many teachers I talk to who tell me their school still doesn't have an AI policy. It should be part of your responsible use policy; it should not be a separate thing. By the way, on that note, even in a year or two, we will probably no longer be hearing "AI." We'll still hear it a little bit, but we'll just be hearing "technology," because AI is just another technology.

Speaker 2:

I did this at TCA. I asked one of my groups, it was a pretty big group, I said, okay, who was into Web 2.0 tools, around the 2010 era? And I want to say three people raised their hands in this big room. So then I started to ask, well, did you use this tool, this tool, this tool? And all of a sudden: oh yeah, and this tool and this tool and this tool. And then I said, okay, wait, stop. Did you use Web 2.0 tools? And almost every hand in the room went up. And I said, there, you just proved my point. In the early 2010s, I was doing sessions like, hey, how do you use Web 2.0 tools as learning tools in your classroom? But then slowly those just became digital tools.

Speaker 2:

They're just technology tools; we don't say Web 2.0 tools anymore. It's going to be the same thing for AI, right? So your AI policy shouldn't be standalone, is the moral of that. That's one thing I see going wrong with AI policy. The other thing I see going wrong is that it's way too restrictive and it will not age well. I'm guessing that schools that built AI policies even two years, a year and a half, even a year ago are probably looking at their AI policies right now and going, oh, we've already outgrown this and we have to rethink it. Policies need to focus on the big picture, and I know it's hard, because it's hard to imagine things like agentic AI. Our teachers are always like, well, give me an example.

Speaker 2:

I don't understand, right? It's hard to imagine because it's so Mission: Impossible or I, Robot out there that you're like, I don't get it. I know that's hard, but try to put yourself in those shoes when you're developing those policies and make sure they're going to age well. And I'll just say it one more time: focus on the positive uses. If you focus on the positive, it encompasses the negative.

Speaker 1:

Can you repeat that? The responsible use...

Speaker 2:

Yeah. So we typically, for a long time, have had AUPs, acceptable use policies. That's really common. Even if you're in an airport using airport Wi-Fi, you know how you have to accept their terms? That's the airport's, or the coffee shop's, or the hotel's, wherever your public location is, acceptable use policy. It's outlining the acceptable things, and if you've never read one of those, by the way, don't read it now.

Speaker 2:

Copy it. I'm dead serious, and have your students do this. Have them do this with the terms of use of the different digital tools and social media tools they use. Go copy the terms of use; it should be on the company's website. If it's not, you shouldn't be using that tool. If they're not putting their terms of use out there transparently, then don't use that tool. So go find the terms of use, copy it, paste it into an AI, and tell it to summarize the main points and anything that seems to be a red flag or something you should look out for.

Speaker 2:

I do this with contracts all the time now, right? So do it with an acceptable use policy from an airport or a coffee shop or something, and then do it with your school's. School leaders and tech directors are going to hate me for saying this, but do it with your school's, please, and do it with your students. Those are called acceptable use policies. I'm saying we need to shift the language to call them responsible use policies instead and, as part of that shift in language, shift the terminology to focus on the positive uses. Once you focus on the positive, healthy, responsible uses of technology, that encompasses the negative, just like classroom management.
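The exercise Lindy describes, pasting a terms-of-use document into an AI and asking for a summary plus red flags, really comes down to writing one careful prompt. A minimal sketch in Python (the `build_tou_prompt` helper and the commented-out `ask_ai` call are hypothetical; swap in whichever chat tool your school has approved):

```python
def build_tou_prompt(terms_text: str, max_chars: int = 12000) -> str:
    """Build a prompt asking an AI to summarize a terms-of-use document
    and flag anything concerning. Long documents are truncated so the
    prompt stays within a typical context window."""
    clipped = terms_text[:max_chars]
    return (
        "Summarize the main points of the following terms-of-use document "
        "in plain language a student could understand. Then list anything "
        "that looks like a red flag: data sharing, content ownership, "
        "age restrictions, or things users should look out for.\n\n"
        "--- DOCUMENT ---\n" + clipped
    )

# The AI call itself is a placeholder -- use your school-approved tool:
# reply = ask_ai(build_tou_prompt(copied_terms))
prompt = build_tou_prompt("By using this service you grant us a license to ...")
```

Students can run the same prompt against the terms of each tool they use and compare what gets flagged.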

Speaker 1:

It does, it does, and I agree. Would you agree that if you shift the language, you shift the mindset?

Speaker 2:

That's exactly it. Thank you, you said it way better than I could: shift in language equals shift in mindset. And it's amazing how many different things in life that applies to, and this is one of them. Classroom management, too: your syllabus, the classroom rules that you have. Don't call them rules; call them responsible use or something else, and don't start them with no. You'll be amazed at how just that shift in language, that shift in discussion and focus, will start to build the culture that you want, a positive culture.

Speaker 1:

I agree, because at one of the schools where I taught, we didn't call them classroom rules, we called them classroom norms. What's the normal behavior that students should have? And they hold themselves accountable when they see it as normal behavior, instead of seeing it as some rules they have to abide by.

Speaker 2:

Definitely, that does work. And actually one more note, as you say that: I know a lot of teachers who have students help develop their classroom norms. Have your students be part of the conversation of developing your responsible use policy, and the AI policy as part of that. It's amazing how, when somebody feels they have ownership and a voice, they have buy-in, versus just being told what to do. Students should be a part of your tech committees. If you have AI committees, you should have students; your student body president or whoever it is needs to be there, and not just one student. You need to have at least two or three students on those committees with you, being part of the conversation.

Speaker 1:

Oh, for sure. And I think it would also be empowering if those two or three students gathered about 20 students and got their input and feedback, so that they can report back to the adults.

Speaker 2:

Have student-only committees, yeah.

Speaker 1:

Yeah, right on. And I love that you mentioned AI not even being a key term, that it's not going to be predominant in about five years. But what emerging EdTech trends do you think will have the biggest impact in the next five years?

Speaker 2:

Oh boy, such a hard question to answer. I'm actually on a committee with CoSN where we develop their, I can't think of the name of the exact report right now. "AI and CoSN," hold on, hold on, I'm Googling it, but that will find what you need. I don't know what they're calling the actual report, but basically, as a committee, people from all over the world who work in education work together to come up with exactly that: where is technology going, what changes need to happen in education, what are going to be the accelerators, and what are going to be the hurdles, right? We spend months figuring this out and putting this report together. So, moral of the story: look at that. It will help guide you on this.

Speaker 2:

My specific answer is going to be that 2025, the next year in particular, is the year of agentic AI. We're already seeing it. Even just a few weeks ago, agentic AI was not a term that I was seeing and hearing out there, and in the last few weeks, with the release of ChatGPT's Operator and Deep Research, I'm starting to hear the chatter and see more on LinkedIn about agents, AI agents, and agentic AI. And, like I said, a lot of teachers are like, oh, I don't understand, give me an example.

Speaker 2:

Here's an example of agentic AI in education. I'm going to use the example of a Google Form or a Microsoft Form. You might be using those as a teacher; you should be using them to formatively assess where your students are at, right? So you send it out and, hopefully, you're using the auto-grade feature as well, so it actually grades for you. If you are manually grading paper worksheets, please don't spend your time as a teacher doing that, and I'll just stop there. So use the auto-grade part of those form features. But you still have to go to the form and go to the responses, and they're really good about creating little pie charts and bar charts for us, and they do a little bit of summarizing the results. But you still have to go there as a teacher and find the time to do that, and you have to analyze those results and then figure out how they're going to affect your instruction the next day or the following day. Here's what agentic AI is going to do, and I'm just using forms as an example; I have no idea if Google and Microsoft are going this route, I'm just using that as an example.

Speaker 2:

To frame this: now say you give your students a Google Form or Microsoft Form, a little formative assessment, and you get an email that says, hey, Mrs. Hockenbary, I see that your students didn't do so well; they didn't seem to be grasping this topic. I think tomorrow in class you should maybe start with one of these discussion questions, and maybe it gives you three to five discussion question options. And then from there, I think you should maybe do this lesson with your students, or give them this learning task or this activity. Does that make sense? That's agentic AI. And I always tell people, when I say this: take a deep breath.

Speaker 2:

So everyone listening right now, just breathe in and out. Agentic AI acts autonomously, and people do not like to hear the word autonomous with AI, but think autonomous with human oversight. You will have trained that agent to know what to look for, how to summarize your Google Form results, and what kind of information you want it to send back to you. That's the human oversight part, but it does it autonomously. You no longer have to find the time to go to the form. And I don't know if it'll be an email or a notification of some kind, but it's going to notify you in some way: hey, your results are ready, and these are the recommendations I'm giving you. You don't have to take those recommendations; you have a human brain, you evaluate the outputs, right? But it saves you time and takes that step for you automatically, doing what you've trained it to do.
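Lindy's form example can be sketched in plain code. The sketch below is hypothetical (the mastery threshold, topic names, and notification step are stand-ins, and a real agent would likely draft its suggestions with an LLM rather than string templates), but it shows the shape of the loop: the agent summarizes auto-graded results on its own and drafts a recommendation, and a human reviews the draft before acting:

```python
from statistics import mean

MASTERY_THRESHOLD = 0.7  # hypothetical cutoff the teacher configures

def summarize_results(responses: dict[str, list[float]]) -> dict[str, float]:
    """Average auto-graded score per topic (scores normalized 0.0 to 1.0)."""
    return {topic: mean(scores) for topic, scores in responses.items()}

def draft_recommendation(responses: dict[str, list[float]]) -> str:
    """The autonomous step: flag weak topics and draft a next-day suggestion.
    Human oversight: the teacher reads this draft and decides what to do;
    nothing is acted on automatically."""
    weak = [topic for topic, avg in summarize_results(responses).items()
            if avg < MASTERY_THRESHOLD]
    if not weak:
        return "All topics at or above mastery -- proceed as planned."
    return (f"Students struggled with: {', '.join(weak)}. "
            f"Consider opening tomorrow with a discussion question on {weak[0]}.")

# Simulated form results; a real agent would be triggered when responses land
results = {"fractions": [0.9, 0.85, 0.8], "decimals": [0.5, 0.4, 0.6]}
print(draft_recommendation(results))  # flags "decimals", not "fractions"
```

The notification, whether email or push, would just wrap this draft; the teacher keeps the final call on what actually happens in class.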

Speaker 2:

So that's going to be the next year: 2025, the year of agentic AI. Past that, we're going to be seeing general AI at some point. Some people say very, very soon; some people say several years; some people say many years. Who knows? Only the people who are working on it under strict NDAs really know how close we are to getting to true general AI, and then AGI from there.

Speaker 2:

So it's hard to gauge in the next five years exactly where the tech is going.

Speaker 2:

I lean on the side of looking how fast it's moving and leaning towards sooner rather than later.

Speaker 2:

But what I say is, instead of trying to imagine exactly where that tech is going to be in the next five years, instead I want to look at the outcomes.

Speaker 2:

No matter where the tech is, what do we want a classroom and a learning environment to look like? For me, I want to move to true personalized learning. I know that's a term that's thrown out there a lot, and a lot of people are like, whoa, back up, personalized learning. But to me, personalized learning means that as a teacher, I am no longer the sage on the stage. I'm no longer doing direct instruction all day, every day. Instead, my students are able to work at the pace they need to work at and to match what they're learning to their interests, and it opens up my time as a teacher to go have those one-on-one conversations and build relationships with my students, right? Focusing on the human elements of being human, and helping them develop skills that are uniquely human, is kind of the outcome I want to see in the next five years out of formal education.

Speaker 1:

Nice. That's awesome and it is scary to try to think about what technologies are coming in the next five years.

Speaker 2:

You had said it, it's a freight train.

Speaker 1:

It's a freight train. It's a freight train.

Speaker 2:

I am trying to explain this to educators, and to my friends and members of society in general: there is a freight train moving towards us very, very quickly, about to knock out everything that you possibly know and think is true.

Speaker 1:

And that's not to scare the teachers.

Speaker 2:

Yeah, I know, I know. And so I bounce between the two. I'm always the optimist; I'm just an optimistic person in general, and I try to really be the optimist, especially the cheerleader of: you can do this, people, you have to figure out how to teach your students these skills, and all those things. But at the same time, that doesn't mean I don't see the bad side. I have a whole TikTok video that explains this. It doesn't mean I don't see the freight train. And I'm really struggling personally right now to, like you said, not scare educators, but to get people, not just educators, my friends too, to wake up and realize the freight train is there. We can see the freight train and it's coming at us, but most people aren't looking at it. Look at the freight train. It's there, and it's coming very, very quickly, and you can't avoid it. Pandora's box is open.

Speaker 1:

And they're trying to avoid it. So, our last question: if you could give one piece of advice to educators looking to improve their tech integration strategies, what would that be?

Speaker 2:

Oh, one piece of advice. I mean, I feel like I've already given lots of advice. The one thing that maybe I haven't said directly is that teachers, in the next, again going back to what are the next five years, honestly, I would love to see this in the next six months, but I know education moves slowly.

Speaker 2:

So in the next one to five years, and I think the technology is going to force this, you have to shift from being a bearer of facts to a curator of skills, skills that are uniquely human in the age of AI. You will never out-fact AI. You haven't been able to out-fact AI for a very, very long time. It processes in nanoseconds, and we're getting close to picoseconds, in how fast AI models process and spit out outputs, right? They know the entirety of the internet. There's no way that a human brain, even a savant's, holds the entirety of the internet, but AI can do that. So instead of looking at it as a bad thing, look at it as a good thing.

Speaker 2:

Put on the positive lens. We're coming back to the positive lens, right? How can you leverage the fact that AI can truly customize information for your students? And I only talked about text in my example, by the way, because that's where we're at, and it's available to any human being on the planet for free who has access to a device and the internet. That's the catch: you have to have a device and internet. But if you do, any human being on the planet right now can customize text for free. We're starting to get a lot more multimodal, right? So it's not just text; it's also audio, it's also visuals. More and more every day we're getting there, and more and more every day it's getting to a freemium, consumer-level version. So look through that positive lens and pull that in, instead of just looking at the negative, that you can't out-fact AI.

Speaker 2:

I think I already said it, but I'll say it one more time: focus on the human skills, and teach your students how to collaborate with AI, and yourself how to collaborate with AI, because that is not the future; that is the now. We're living in a world where we will never not collaborate with AI, in some form or another, for the majority of human tasks. Of course there are going to be things where we don't, but I mean, it already is that way. I'll tell teachers my last thing here, then I really will be quiet.

Speaker 2:

A lot of schools are using that red light, green light, stoplight analogy to outline AI use. If you're using that and you say red, no AI, that is totally outdated. It was outdated before it was created, because AI is pervasive in everything. If you're saying no AI use, that means the student can literally not use a single digital tool, because even if you don't know it and don't see it, AI is embedded in the background of every single digital tool you're using as a teacher. So by saying no AI use, you're really saying no tech use, which, and I won't really be quiet now, I don't think is the route that we want to, or should, go in education.

Speaker 1:

Not in this day and age. We're in a tech-driven world where, I personally believe, technology should be embraced, and the more technology we have in the class, the better we're preparing students for the future.

Speaker 2:

That doesn't mean you can't have times when, like I said, your students are doing a physical drawing; that's perfectly fine. But again, it can't be black and white. We can't make it all tech, and we can't make it no tech. You have to find the gray in the middle. The gray in the middle is always the happy point. Think about that in relation to most things in your life. There's my parting thought.

Speaker 1:

No, you should drop the mic right there.

Speaker 1:

Thank you for being a guest on our podcast. Do you want to plug some stuff before you go? The name of your book, the name of the infographic, where folks can find you, the name of your TikTok so that folks can view and follow you?

Speaker 2:

So I actually have exciting news; by the time this podcast posts, it will be public that I'm rebranding. I have been InTECHgrated PD for many, many years now, and I'm in the process of rebranding my current website to a more personal brand of Lindy Hock. My name is Lindy Hockenbary, and Hockenbary is a lot of last name. I was laughing because my husband's work just rebranded, and he got a new email signature, and he couldn't fit his name into the template. His first name, Chad, is only four letters, but he couldn't fit Hockenbary into the email template. That shows you how long it is.

Speaker 2:

So, moral of the story: I am shortening it to Lindy Hock. Think of the Lindy Hop, the dance; I am the Lindy Hock. So my new website, well, it's not a new website, it's my current website, but it will be redirected to lindyhoc.com, so L-I-N-D-Y-H-O-C dot com, and have my new logo on there. If you go there, you can find links to all my socials, my TikTok. You can also find me by searching @lindyhock. I'm in the process of shifting my @lindyhockenbary handles to shorten them. And, by the way, it's also spelled weird: it's spelled B-A-R-Y and not B-E-R-R-Y, so that trips people up. So @lindyhock will be all of my socials. You can also find my book, A Teacher's Guide to Online Learning, on my website; there's a link to Amazon from there. So the website is the portal to it all, is the moral of the story.

Speaker 1:

Awesome. To our listeners, one more time: lindyhock.com, L-I-N-D-Y-H-O-C-K dot com.

Speaker 2:

Yep, and you can do H-O-C-K or just H-O-C, lindyhoc.com. Either way, it will take you there.

Speaker 1:

Oh, it's HOC, I'm sorry.

Speaker 2:

Yeah, either way will get you there: HOC or HOCK dot com. Yeah.

Speaker 1:

Awesome, awesome. Lindy, thank you for being a guest on our podcast. It was a pleasure; it was one of my favorite episodes. You shared tons of resources and tons of information. And that brings us to the end of this episode of EdTech Empowerment: Innovating Education Together. A huge thank you to our guest, Lindy Hockenbary, for sharing her invaluable insights on technology integration, AI in education, and the future of EdTech. If you want to learn more about Lindy's work, be sure to follow her on LinkedIn and check out her resources at InTECHgrated PD. For more episodes like this, and to stay connected with the latest in EdTech, visit us at nextgenclassrooms.org and follow us on social media. Don't forget to subscribe to the podcast and share this episode with your fellow educators and tech enthusiasts. Until next time, keep innovating and empowering education with technology.

People on this episode