AI in the Classroom: TCAPS Looks to the Future of Learning

Artificial intelligence (AI) is poised to reshape many aspects of modern life, including education. Traverse City Area Public Schools (TCAPS) leaders discussed the emerging technology this week, including how teachers and students are using AI now and might use it going forward, privacy safeguards and other concerns, and the risk-reward balance of a tool supporters believe will transform education but critics worry could undermine a vital human element in learning.

TCAPS Executive Director of Technology Evan OBranovic called AI a “rapidly evolving” technology, one that is daily “getting bigger, better, faster.” He explained the difference between generative AI – technology that uses input data to create new content like text, images, audio, and videos – and predictive AI, which analyzes data to predict future outcomes. Students often use generative AI tools like ChatGPT to brainstorm ideas or conduct quick research. However, they can also use those tools to plagiarize essays or cheat on assignments. Predictive AI, meanwhile, can be used by educators to predict student outcomes – such as graduation probability – but is “not definitive and may contain bias,” according to TCAPS presentation materials.

TCAPS has a handful of AI tools it promotes now for school use, including Goblin Tools and Google Gemini. Gemini has a “wall” for TCAPS data that ensures anything entered through a TCAPS account is not harvested or saved for external use, OBranovic said. Because of privacy concerns, he said the district looks for “a certain set of assurances that the data is being protected and separated” from the outside world when selecting AI tools. Gemini is also only available to students over the age of 13, with chatbot use by younger students deemed inappropriate, OBranovic said.

Both OBranovic and Jessie Houghton, chief academic officer for secondary schools, said a key part of TCAPS’ approach is teaching students how to use AI responsibly. “It's not something that we are shunning the discussion about,” OBranovic said. “It's something that we're embracing and trying to envelop where it makes sense within our curriculum and in our environment.” AI can be a “powerful” tool for both students and teachers – from jump-starting student brainstorming sessions to helping teachers organize lesson plans and identify classroom trends – but is only “part of the process and not the start and end of the process,” OBranovic said.

Houghton agreed, saying “there’s a lot of power in teaching our kids how to use AI to enhance their learning.” The goal isn’t for AI to replace learning or critical thinking, Houghton said, adding that students shouldn’t be having AI give them all their homework answers or write essays for them. But it can help break down difficult concepts with examples or visuals, assist at home with topics like math when parents are stuck, or provide prompts for students struggling to put pen to paper. Since most AI tools are free, they can help make “learning more accessible” for all students, Houghton said. In the future, AI could also help schools cut costs considerably on printed curriculum materials, she said.

As students navigate the AI landscape, teachers must also learn how to use the technology effectively, according to OBranovic and Houghton. Educators can’t rely on AI detection tools to spot cheating, they said, noting the lackluster success rate of such tools and the risk of falsely accusing a student of cheating. The human relationship between teachers and students remains vital, since a teacher who knows their students is best positioned to notice when schoolwork suddenly seems off, OBranovic said. Teachers are also learning to make their assignments more AI-proof or “durable,” according to Houghton. “We're growing their knowledge in that, because I'll tell you what, kids sure figure it out pretty fast and then share it with each other,” she said.

OBranovic and Houghton spoke about the “80/20 rule” in AI education, which refers to using AI to automate 80 percent of repetitive tasks – like grading simple questions, generating basic learning materials, or providing initial feedback – so teachers can focus on the “most impactful” 20 percent of tasks. Those include things like “personalized instruction, addressing complex student needs, and fostering deeper learning interactions,” according to the presentation. Future uses of AI could include increased student tutoring, virtual field trips, enhanced learning customization and progress tracking, and curriculum creation. In addition to working on district guidelines for AI use, TCAPS has launched an optional AI learning series for all employees – not just teachers – with the technology increasingly expected to be integrated into professional development standards going forward.

Though most staff and board members have acknowledged the inevitability of AI becoming a dominant force in education, not all are enthusiastic about the prospect. David Richardson, a teacher at TC West High School, told board members during public comment that he didn’t see AI as “super useful” to his job, adding that promised benefits like improved workflow “haven’t materialized yet.” Describing teaching as “primarily a human connections job,” Richardson said he didn’t think AI is “good at doing that now and I kind of hope it never is.” Technology tools typically haven’t made his profession easier or more efficient, Richardson said, but either reformatted “the same problems in a new way” or “added a layer of hindrance.” He criticized the idea of using AI to tutor students or give them feedback instead of teachers as “dystopian.”

Despite the unknowns and its disruptive potential – good and bad – TCAPS Superintendent Dr. John VanWagoner indicated the district can’t ignore AI or its likely impact in the years ahead. “One of our goals for every graduate we have go across that stage is to be career and college ready,” he said. “I do believe (AI) is now a component to truly be career and college ready you're going to have to be able to navigate and use.”