A Data-Driven Approach to Managing Ed Tech in K–12 Schools

How does a data-driven approach to ed tech work in practice? How does an institution gauge whether its deployments, strategies, or initiatives are effective? Statistics around the usage of technology tools are only part of it.

According to Louis Tullo, chief technology officer at Ravenscroft School in Raleigh, NC, it also involves developing a specific kind of empathy, built through interviews and feedback and framed, in part, by benchmarking standards like those found in ISTE's K–12 Technology Skills Scope and Sequence.

At his own institution, Tullo oversees all facets of technology on campus, including library services. He has held IT leadership positions in a variety of settings and is now focused on "creating a robust and organized technology infrastructure with innovative curricular and instructional design strategies." He is a Boston College graduate and an active member of the Association of Technology Leaders in Independent Schools (ATLIS). He is currently enrolled in the M.Ed. program in Curriculum and Instruction at the University of North Carolina at Charlotte, concentrating on School Leadership as part of a cohort of independent school educators.

Tullo will be leading a session called "Creating a Culture of Ed Tech Intentionality" at the upcoming Tech Tactics in Education conference, being held Nov. 7–9 in Orlando, FL.

Data-Driven Ed Tech Management

THE Journal had a chance to sit down with Tullo in advance of the conference and discuss his strategies for data-driven ed tech management, which he will be presenting in person at Tech Tactics.

THE Journal: Let's talk about your data-driven approach to technology in schools. You're looking beyond just usage stats. What does your approach involve?

Louis Tullo: One of the major components of that approach involves empathy interviews with faculty: understanding what their problems are on a day-to-day basis in terms of tech integration, and seeing how initiatives that a tech department puts out play out in reality in a classroom. That's something you can't get just from the number of times they use a tool or what activities they do in it. To complement that piece of getting to know what teachers do, being the person beta testing or piloting these kinds of lessons in the classroom, so that you're getting feedback from students on how they're engaging with both the systems and the lessons that speak to tech integration strategies, is another data point. It's sort of a soft data point, in that it relies on taking good notes afterward on what landed well, the kinds of questions students asked about a specific activity, and whether they were able to complete what you were trying to have them do in the course of the lesson….

THE Journal: Can you talk a little bit more about what the empathy interview is, what it involves, and how you're able to get the faculty on board with it? Because I'm assuming it's extra work and might sound intimidating.

Tullo: It's really a strategy that we learned about at ATLIS. They have a program to help develop tech leaders who are in independent schools, and as part of that program, you do an interview with someone who's at a more senior level. But the strategy of that interview process sounds more complicated than it actually is. It's just a matter of asking a faculty member questions: What are the skills you wish students had in your classroom? If you're going to be delivering a lesson about something that you yourself may be uncomfortable with, what is it that you're uncomfortable about? And what are the things that would make you more comfortable delivering content about a technology topic that seems a little bit obtuse from your perspective? So rather than making it about the technology, it's making it about the person who's delivering the lesson, and that's where the empathy piece comes in.

THE Journal: You're getting insights about student learning? What kind of insights are you getting? You mentioned that in your session you go beyond the vendor-supplied usage statistics. What are you getting from these interviews?

Tullo: What it would look like to be able to extend the kind of learning that's happening with these tools beyond just a lesson that's delivered one time. For instance, if there's a specific lesson about an ISTE standard related to folder and file management, you know, you can do a 15-minute activity, walking students through how they do that in Google Drive. But the data that you're really looking for is not necessarily just how they did in that specific 15-minute activity. But in their classes later that year, if they're trying to find an assignment they worked on for a teacher, can they easily find it because they've employed those strategies to organize their files? Or the next year? So seeing how the skills that you're teaching in a finite time transfer over into other activities, classes, and settings.

THE Journal: How do you get everybody on board with this — administration, faculty — for this type of approach?

Tullo: You have to have a lot of patience working with various groups, and be comfortable with the fact that small wins are just as important as large ones in this area. If you can develop some lessons — it doesn't have to be a lot to start out with — and successfully get grade levels to do one or two of them a year, that's a huge success, because you're adding one or two skills for students that they wouldn't have had before. That's much more palatable than saying our measure of success is a full-fledged program to teach tech skills, which is not attainable in a single school year. It takes a lot of time to develop that. And [if] it becomes sort of a pro forma activity, then the likelihood that the work will extend beyond just the time of the lesson is minimal.

THE Journal: How are you benchmarking the results?

Tullo: That's an iterative process, trying to determine how you measure success. Initially, it's that the lessons are there, that you've delivered them, and that students have participated. That's your entry-level benchmark of success. Eventually, once there's more content, the goal might involve testing students or giving them an activity in which they have to leverage skills that have been taught to them over several grade periods, more of a project- or practicum-based task than a multiple-choice kind of [test], so you can see those skills in context. But again, because we're at the very beginning stages of this work — we completed it initially last year, and we're continuing to develop more this year — we're not at the point where we have a full-fledged benchmarking strategy.

THE Journal: You're using ISTE's K–12 Technology Skills Scope and Sequence in your benchmarking? How have you seen this translate into changes in the way teachers are using technology in the classroom or instructing their students in it?

Tullo: Getting to see how vertical alignment happens when you're using a set of standards is the biggest change. Teachers love to be autonomous in their own classrooms, and in some regards that's a really great and important thing for building a sense of community. But when you think about skills, it's important to work more collaboratively, within a specific grade level, for instance. If students in each of those classes are learning how to do the same thing in a different way, and never kind of operating off of … a standardized method of learning a specific skill, it's really hard for them in the next grade level, and for the teachers they'll be interacting with there, to draw from the knowledge they learned in the past. So creating alignment vertically is really important, as well as horizontally.

THE Journal: Are you working with other schools on these strategies as well?

Tullo: Within the independent school community, those of us who oversee the ed tech side of the house will have discussions. For instance, we'll look at a standard and talk about it, either formally or informally, and bounce around ideas: What activities landed well with your students? If you've done an activity, how did it land with your faculty? Did you get any feedback about how to improve the lesson, or an example that might be more applicable? So within the independent school world's ed tech community, we're often having discussions about these kinds of things, because rather than seeing one another as competitors, we're trying to up everyone's game.

THE Journal: How do you see AI affecting your approach to this in the future?

Tullo: As a school, we've tried to frame the way that students think about AI the same way we frame the discussion around citation and research: really looking at AI tools as a gateway to information, and understanding that it's important to represent ideas that are 100% your own as your own, and ideas that are generated through the use of AI tools as ones in which you had some support or assistance. And I think that by thinking about it that way, explaining to students that that's the way we think about it as educators and the way they should think about it as students producing work for different classes, it has really helped ground the way we see tools like that being used. We're still in the very initial stages of utilizing tools like that.

I would say I personally err slightly on the side of "we shouldn't look at the use of AI through a punitive lens," and that it shouldn't be about catching the student who's "cheating" by using these tools. It's about giving them opportunities to use those tools appropriately. So not just defining inappropriate use, but defining what appropriate use looks like, and also encouraging students to be critical of the kinds of outputs that AI generates. For example, if a kid wants to get an initial idea for what the introductory paragraph of a paper should look like, and they ask an AI tool like ChatGPT to generate it based on a set of criteria, they can see what the output is and then decide for themselves: Does this say what I want it to say? Is the voice that's used by AI true to how I would say things? Is it leaving out things that I think are really important? How can I improve what AI outputs? Then it becomes a tool for learning, not just a shortcut to get things done.

THE Journal: What else do IT leaders need to consider about approaching ed tech in schools?

Tullo: I would say the biggest thing is, generally, when you're in tech, there's a desire to have something that looks perfect. When you're working in schools with students and teachers, being okay with learning from failure is really, really important. Having the confidence and comfort to be able to say, "Oh, this didn't work, but I learned this because it didn't work," [is] what's really important. So when you're talking about work like this, and the things we'll be covering in my session, it's equally important to reflect on what did work as on what didn't.
