The conversation about AI in education is everywhere, drawing increased media attention and sparking broader community debate as society continues to try to figure out exactly where it belongs. I started writing this spin a few weeks ago while reflecting on a spring break conversation with my daughters about their day‑to‑day use of AI. Their thinking felt fluid, accepting AI as part of their learning experiences when teachers say it is okay to use, and a solid no when teachers draw clear lines around original, unassisted thinking. I was relieved by their interpretations and applications, and comforted by the guidance they are receiving in school.
Beyond my own experience, I notice different perspectives emerging, which seem to ultimately settle into two positions: those who welcome AI and those who view it with deep concern. Framed this way, AI is portrayed as either a thought partner or a mind robber. I find this debate unhelpful. The reality is that AI is here; arguing about whether it should exist is dated, just as attempting to solve the issue by banning it is futile. AI is growing exponentially, and progressive school districts are finding ways to harness its potential. The real question for educators is not whether to engage with AI, but how to safely harness its potential without surrendering human creativity, originality, and the uniquely human connections at the heart of learning.
This spin is intended to provoke reflection, encouraging us to consider the impacts of AI and how it is being used around us so that we can better maximize its benefits, to enhance our thinking rather than simply bang out efficiencies at the expense of human touch. As I think about my own use of AI as a tool for light‑touch editing or as a quick reference point, I’ve noticed that people, myself included, are working differently. Looking more closely at how AI is showing up in people’s thinking, a few patterns are beginning to emerge.
There seem to be three camps. The first camp runs everything through AI first. At the start of a task, their instinct is, “Let’s see what AI thinks,” or “Let’s see if AI matches where we think we might go.” This camp is keen on efficiency, moving fast to create, populate, or verify. These are the keyboard warriors, using AI to save time or slay tasks.
The second camp are the folks who bring AI in once they have a shell in place and want feedback. Something has already been created or built out, and upon, or near, completion, AI is prompted and the material is fed into the platform for review. This is usually for revisions, flow, or soft edits, with the goal of keeping the creator’s voice intact. At times, this camp will also drop an early idea into an AI platform to ask whether there are other considerations, starting a back‑and‑forth that helps expand their thinking. There is a clear level of consciousness at play here, an awareness of ownership and authorship, paired with an intentional use of AI for enhancement rather than originality.
The third camp includes those who see AI as a threat to humanity and believe it is best ignored entirely. The danger of this kind of deliberate ignorance is akin to the proverbial ostrich burying its head in the sand in the hope that the problem will go away. The analogy is admittedly flawed, but the lesson stands: ignoring a threat doesn’t make it disappear; it only ensures you won’t see it coming. While caution and safety measures are absolutely essential in the AI landscape, believing that AI will simply go away, or be replaced by something “better,” may be even more dangerous than misuse itself.

Recently I was out for dinner with friends. I was the only educator at the table, and I was amazed to hear the range of perspectives on AI usage. One friend, a director, shared how they use it in editing to sift through footage efficiently, experiment with pace and rhythm, and flag technical issues. Another, a lawyer, shared that their use is highly selective, ensuring that AI is never used as a creator but as a junior assistant for flagging or catching inconsistent language. It was fascinating to learn how differently we are all working with AI platforms in our professions. I share this as I work through my own positioning of AI as a critical consideration in modern educational design.
With that said, I want to be clear about the very real dangers associated with AI, which demand the establishment of clear boundaries and guardrails to protect youth. In my school district this has been an ongoing focus of direct instruction, intentional teaching and careful supervision as we teach our students about safe, appropriate AI usage. We have curated AI resources and helpful links to guide our community, and we have established and implemented AI literacy lessons to ensure consistency and accountability, so that students understand when to use AI and when not to. While scripted lessons help ensure a shared message, the most impactful part of this work is our teachers. Their human connection and ability to relate to students are essential. The bottom line is that AI should never replace human connection, and schools must protect the depth of slow, relational, human‑to‑human learning.
So which AI camp am I in? I am realizing that I live in the second camp described above. As I work through my own cognitive schemas, I write, get drafts in place, and if I feel the need for a “soft edit,” I’ll turn to AI to support revisions. I’m also noticing that, much like using Google to explore concepts, I am increasingly using Copilot or ChatGPT in the same way. I still move between platforms when searching for background information, distinguishing peer‑reviewed scholarly research from general web‑based information, but the lines are blurring around which tool I turn to first.
I am very aware of the line between AI‑fabricated content and my own thinking, and I try to stay true to ensuring that any soft edits do not drift far from my original input. My micro‑prompts are intentionally framed with requests to support readability, flow, and grammar, with clear direction not to change my voice or tone. While this approach is usually successful, it does require careful review to ensure the feedback and revisions stay aligned with my original intentions.
I’m reminded that this kind of support is not entirely new. In high school, I often asked my dad to review my essays before submission; his feedback wasn’t all that different from AI’s, minus the warmth of human encouragement. Early in my first principalship, I recall asking my administrative assistants to keep an eye on my writing to ensure clarity and grammatical accuracy. Last fall, Chris Kennedy wrote about modelling AI use in authentic writing. His comment, “The standard should be integrity and evidence of learning, not tool abstinence,” has stayed with me as I’ve continued working with AI as a thought partner to revise and edit my writing before publishing.
Recently my daughters noticed that I was using AI as a revision tool, and I was surprised to see how concerned they were. My youngest immediately told me that I needed to be careful, as an AI checker could scan my writing and claim it as its own. She shared that she is paranoid about using it for anything at school, as other students had been caught leaning on it and were publicly called out in class. We talked about how tricky using AI can be when you are revising without a deep knowledge of the content you are writing about. The lines between original ideas and generated content can quickly become blurred when you are in the iterative phases of developing ideas. Humans helping humans navigate AI is essential. AI apps should never replace teachers or the human learning and consultative conversations that sit at the heart of education.
Educators are often known for borrowing and adapting existing content when designing lessons. Over time, many classroom teachers naturally evolve into curriculum designers, thoughtfully reworking materials to ensure required content is covered in engaging ways and aligned with their students’ interests and needs. I recall, as a primary teacher, the countless hours spent putting together unit plans, creating activities, and prepping materials for my students. My friends would get pulled into weekend cutting marathons as I prepped for the week ahead. Yes, I am dating myself to the 1990s, the era of clip art, paper consumables, lamination, and the brightest, biggest bulletin boards imaginable.
"Beg and borrow, low cost, no cost" have always been teacher‑resource slogans in public education. This is where AI is proving to be incredibly helpful. I watch classroom teachers design some of the most expansive and inclusive lessons as they prompt AI to curate details that, in previous years, would have literally taken hours and hours to build out. The devil, as always, is in the details, and in educators’ in‑depth knowledge of content and curriculum. You have to know where you’re going first; understanding the curriculum is essential in order to backward design effectively with AI‑supported educational tools.
Some argue that AI is taking the rigour out of education and that schools are lowering the expectation bar, bending to the easy efficiencies of AI‑generated feedback. Again, this debate feels circular and overlapping, as AI use is now embedded in all aspects of modern life. From attempts to contact passport offices to accessing online medical care, chatbots are often the first line of contact. Right or wrong, it is what it is. Stepping into this conversation, then, requires human action, and an honest understanding of our own AI use, so that we can guide our youth carefully into this new era of technological learning.
I’ll wrap up this spin with a few recommendations as we continue to journey into new frontiers of AI and education. Rather than prescriptive rules, these are guiding principles that reflective educators can humanize as they embed AI into learning designs.
- Model intentional use
Be clear about when and why you are using AI. Share this thinking transparently with students so they can learn from your decision‑making and application in real time.
- Value process over polish
Be explicit about messy learning as an essential part of creative thought. Let mistakes happen, then unpack how they happened and how to move forward. Share this with students so they understand that iteration, revision, and struggle are integral to originality and creativity.
- Protect unassisted thinking
Be intentional about ensuring there is always space in learning for spontaneous and creative ideas. Build time into lesson design for collaboration and imagination so they remain established parts of cognitive development. Be clear with students about when AI use is appropriate and when it is not, and explain the reasoning behind those boundaries.
- Keep human connection non‑negotiable
Be open, receptive, and human in your classrooms. Intentionally build space and time for relationships and social‑emotional connection. Strong foundations of well‑being set the stage for deeper thinking and academic rigour.
In the end, AI isn’t the villain or the hero in our classrooms; humans are the heart of education. The depth and struggle of authentic learning come with human emotion and the desire to achieve. Our students need to feel that drive for knowledge from the people who surround them, something that cannot be replicated by AI. Building awareness of the lines for safe and appropriate AI use remains a human responsibility, as we ensure that communities of learning stay at the heart of public education in Canada.



Which AI camp are you in?
Author's Note: This spin is a brief reflection on my learning and observations related to AI in education. I recognize that many educators and researchers are engaged in deeper study and are offering in‑depth commentary on this topic. My intention here is to share real‑time reflections from my own experiences, offered with respect for the depth of ongoing research and without oversimplifying a complex societal and technological evolution.
In the spirit of author integrity and transparency, I note that my Copilot AI thought partner supported this piece through light editing and grammatical revisions to assist with publication.
