This is the third in a four-part series by Dr. Dave Mulder titled “A Better Imagination for AI.” In “Rethinking Our Tech Stories,” Dave challenged the narratives we have crafted around AI. In “Boring Robots,” he suggested a framework for stewarding the tool of AI. This third post proposes principles for ethical AI implementation.

Why did you become an educator?
If we asked a room full of teachers, I bet that we’d hear familiar refrains:
- Because I love kids.
- Because I love my subject and want others to love it too.
- Because I love teaching and helping people grow.
But somewhere between lesson plans, parent emails, data reports, policy mandates, and the relentless pull of technology in students’ hands, those beautiful reasons can fade. The calling that once felt joyful and sacred can feel heavy. The pressures we face can tempt us to trade purpose for efficiency. This temptation is all the more true in the age of AI, where tools promise easy ways to “take care of the scut work.”
But as Christian educators, we need to pause and ask, “What is the cost of using AI?”
If students learn more quickly … but we form them less deeply …
if work gets “done” … but wisdom is not cultivated …
if teaching becomes efficient … but opportunities for discipleship erode …
maybe we’ve traded something precious for something convenient.
AI is here. It is powerful. It will reshape schooling. But it will not—and must not!—reshape our view of what it means to be human or what it means to teach in a Christ-centered way.
That conviction brings us to the foundational truth that must anchor our imagination about AI in education. …
We are not machines (and neither are our students)
In the rush to explain AI, we sometimes borrow language that subtly shapes our thinking:
- “The brain is like a computer. …”
- “Students process information. …”
- “Learning is basically about inputs that lead to outputs. …”

The problem with this kind of language is that the metaphor obscures the truth. When we talk like humans are machines, we begin to treat them like machines, and that is disastrous for education.
Students are not processors of content. Not algorithms. Not productivity units. Not data points. Students are image-bearers, uniquely created, and called to wonder, to wrestle, to imagine, to create, to grow in wisdom.
Yes, AI can generate text. But only humans can generate love, meaning, and virtue, not to mention worship. We can use AI to assist thinking—but we cannot delegate thinking to the machines without losing out on the formative aspects of teaching and learning. If a tool does all the thinking for us, we may complete tasks faster, but we do not become wiser. And I believe that education is fundamentally about becoming, not just doing.
What does faithful, ethical AI integration look like?
If we want AI to serve formation, not just productivity, we must handle it ethically — as people entrusted with shaping souls, not just managing learning systems. Here are five principles to anchor a Christian imagination for ethical AI implementation:
1. Privacy: Protect what is personal.
Ethical AI use means being deeply thoughtful about what information we put into machines, how we handle student data, and whether students truly have agency over their digital footprint.
Not all platforms are equally trustworthy. When in doubt, let’s err on the side of protecting privacy—not convenience.
2. Relationships: Put people before platforms.
At the center of Christian teaching is presence, the ministry of showing up. AI can help us prepare materials and even differentiate instruction, but it must never replace relational connection.
Technology should amplify human interaction, not take its place. Use AI to create margin for mentoring, not to mechanize the very heart of teaching. The moment we allow a chatbot to become the primary voice in a student’s learning journey, we have forgotten who we are.
3. Integrity: Form hearts, not just skill sets.
Students do not simply need guardrails; they need formation. Ethical AI use requires us to teach discernment, honesty, and effort—not just “How to prompt an AI well.”
Christian education has always been about more than avoiding cheating. It is about loving what is good, working with diligence, and developing virtue.
AI is a powerful tool, but power without moral formation leads to shortcuts, not growth. True learning requires effort, struggle, and perseverance—things no algorithm can substitute.
4. Transparency: Make the invisible visible.
AI should never be a hidden force shaping learning without students knowing. When we use AI—whether to design materials, brainstorm ideas, or scaffold vocabulary—we should be perfectly comfortable saying so. (Honestly, if you are embarrassed to admit you are using AI for this kind of work … that might be a sign you shouldn’t be using it for that work!) Invite students into the process so they can learn to steward technology wisely. And expect the same transparency from them!
Transparency guards trust. When students learn to openly and responsibly use AI, they develop not only academic honesty, but wisdom about technology — a critical virtue for Christian discipleship today.
5. Humility: Remember that we are not omniscient.
AI can feel powerful (maybe even magical?), but its limitations are real. It can hallucinate, amplify bias, and confidently deliver wrong information with a smile.
In practice, humility means testing and verifying outputs, holding technology loosely enough to set it aside when it does not serve our purposes, and being willing to say “I don’t know — let’s investigate together.” The idea here is that AI should serve our teaching, not define it. This kind of humility keeps us grounded in the truth that only God is all-knowing. AI may simulate wisdom, but it cannot produce it.
Where the rubber hits the road
Ethical implementation of AI is not just a tech policy issue; it is a discipleship issue. Because we are thinking about the formation of students, it is also a human dignity issue! Our imagination matters. Our practices matter. The way we model ethical behaviors really matters.
Because the truth is this: at the end of the day, our students do not simply need efficiency, speed, or automation. They need wisdom. They need meaningful work. They need community, belonging, and purpose. A full-orbed, distinctively Christian education opens these possibilities to students … but doing this kind of formative work in students’ lives might mean being less efficient to be more effective.
Where the rubber hits the road is not in the tech itself … but in the heart and imagination of the one who picks it up.
Thanks be to God—we are not machines. Neither are our students. We are image-bearers, called to form image-bearers. May we teach—and innovate—like we believe this truth!