Teaching and Learning, Cheating, and Assessment in the Age of AI

The evolving capabilities of artificial intelligence require new approaches to instruction. Here, two leaders from the innovative Ulster BOCES share their expertise on AI's potential role in the classrooms of today and tomorrow.

The digital revolution profoundly changed how information is organized and accessed. But today's AI is the first technology that can actually process information in more complex ways. It responds fairly accurately to natural language, a longtime holy grail of research, and it can now identify objects in images, another ability researchers have pursued for decades. These new abilities have practical and exciting ramifications for our classrooms.

Take, for example, plagiarism detection. Until now, plagiarism software has needed something to cross-reference in order to identify potential plagiarism. It scoured the internet for similar texts and could offer definitive proof by providing the original text for comparison with the student's writing.

Now, ChatGPT can create an entirely original composition. There are detectors that claim to identify AI-generated text, but in the end, the only proof you're left with is your faith in the software's accuracy. Short of blocking certain websites, banning phones in schools, using scramblers to block cell phone service, or taking other extreme measures, we will not be able to stop students from using AI tools, much as we are unable to stop them from using search engines.

Our best bet is to try new ways of adapting instruction to leverage these new tools, because the genie isn't going back into the bottle.

AI for Teachers, AI for Students

When it comes to AI in schools, there are two issues: AI for teachers and AI for students. AI for teachers should be adopted and widely used as soon as possible. There are incredibly powerful AI tools that make our jobs easier and more effective in countless ways that we are collectively just beginning to explore.

On the other side of the equation, teachers see workers in other professions at risk of losing their jobs to these tools. It's perfectly normal to worry. In the private sector, however, the consensus seems to be that AI won't take anyone's job, but someone using AI might. What does that mean in the classroom? If we want to prepare the students of today for the jobs of tomorrow, it is imperative that we give them the skills and knowledge to use AI tools safely, ethically, and efficiently.

AI for students is a little thornier because AI is best used for jobs the user already knows how to do. Take a mundane example that will resonate with almost everyone: writing a cover letter. Asking an AI tool to write one for you is incredibly helpful if you already know enough about cover letters to give the AI the salient points you want to make, use it for language and formatting, and critically assess its output. A high school student who uses AI to write their first cover letter, however, will likely end up with a dry, generic letter and learn nothing in the process.

There are better approaches. Using the AI to create different versions of a cover letter for a given position (one strong, one weak, one from an underqualified candidate, one from an overqualified candidate) quickly yields material that helps students understand what constitutes a strong letter. Even better: with clever prompting, the AI can be turned into a cover-letter-writing coach that first seeks to understand the student's strengths and interests, learns more about the position being applied for, and so on, producing a highly personalized cover letter while teaching the student how it's done. Such multilayered prompting, and the interactions it produces, also shows the student that this technology can be used in varied and specific ways, expanding the range of tasks AI can help them accomplish.
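The two exercises above can be sketched as prompts. This is only an illustrative sketch: the message format follows the common chat-API convention of role/content pairs, and the exact wording of the prompts is one possible phrasing among many, not a prescribed script.

```python
# Sketch of the two cover-letter exercises described above, expressed as
# chat-style prompts. The messages use the common chat-API convention of
# {"role": ..., "content": ...} dictionaries; send them with whichever
# chat model or interface your school has approved.

def variant_prompt(position: str) -> list[dict]:
    """Ask for contrasting cover-letter versions to compare in class."""
    request = (
        f"Write four short cover letters for this position: {position}. "
        "Make one strong, one weak, one from an underqualified applicant, "
        "and one from an overqualified applicant. Label each version."
    )
    return [{"role": "user", "content": request}]

def coach_messages() -> list[dict]:
    """Set up the AI as a cover-letter coach rather than a ghostwriter."""
    system = (
        "You are a cover-letter-writing coach for a high school student. "
        "Before drafting anything, ask one question at a time to learn the "
        "student's strengths and interests and the position they are "
        "applying for. Explain each choice you make so the student learns "
        "how a strong letter is built."
    )
    return [{"role": "system", "content": system}]

messages = variant_prompt("library page at the local public library")
print(messages[0]["role"])  # user
```

The point of the second function is the system message: it reframes the tool's job from producing a finished letter to interviewing and teaching the student first.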

A Changing Approach to Cheating and Assessment

AI also creates a gray area when it comes to cheating. Is it cheating for a student to use Grammarly to clean up a paper? Is it cheating to ask ChatGPT to suggest improvements to a paper? There aren't necessarily right or wrong answers to these questions. The answers will vary depending on school culture, how old students are, what the learning objectives for a particular assignment are, and more. There are very few policies or curricular expectations in place, so it's important for teachers and administrators to begin thinking through these issues. The Department of Education and the New York State School Board Association (NYSSBA), for instance, have both published guidance documents for schools, but don't expect to find advice there on how best to use AI tools in education. Here's an example of the language you'll find: "The Board respects the professional capacity of the instructional staff to assign work that is less susceptible to student use of GenAI to circumvent learning." In plain English, this means "teachers will figure it out."

In addition to offering clear guidelines for using AI, schools may have to stop relying on student papers to check for learning. Or, if teachers do assign papers, they might have to approach assessing them differently. One option is to make assignments difficult to do without using AI, but then also raise the standard at which you assess the work. Educators have already done this with spelling and grammar checks. They don't punish students for using widely available tools and, in fact, expect them to do so. The trade-off is that they accept no excuse for poor spelling and grammar in a finished paper.

Another way to change teachers' method of assessing papers is to ask students to defend their work. This means having a conversation with them about their paper, either in person or when they upload it to the learning management system. Why did they make this argument or include that fact? What is the crux of their position? If they can't answer these kinds of questions, maybe they shouldn't receive a good grade on the paper — even if it is an excellent piece of work on its own. This is echoed in the NYSSBA's policy update: "Instructional staff must be clear about their expectations for student use of GenAI in assignments. Staff who suspect a student has not done an assignment on their own can request that the student demonstrate their knowledge of the material in other ways, to the same extent they already do."

It's unlikely that schools will be able to stop students from using AI, and AI detectors are never going to be a definitive solution for plagiarism. At the same time, the innovation cycle of AI is so fast that high-quality studies of AI use in education will always reflect the effective use of already obsolete tools rather than those that students and teachers are currently using. Everyone involved is going to have to keep discussing these issues and figuring them out together.

AI and the Way Forward for Education

While there are challenges for teachers associated with AI, there are also opportunities that were previously unavailable. If two people brainstorm together, they might come up with 10 ideas in an hour. If they ask ChatGPT, they can have 200 ideas within seconds. Many of them won't be great, but they can jumpstart thinking and conversation.

Long before AI, we were already convinced that project-based learning was the way forward for education. Now AI has put the final nails in the coffin of the industrial model of education we've had in place for so long. It has already begun to make the status quo almost unworkable, and at the same time enables students to create much more ambitious projects and to learn fairly complex skills along the way. There are far fewer barriers to what students can learn and create when they use these tools with the right support and guidance.
