Leading in an AI World: Engaging Ethically

Paul Matthews | The CACE Roundtable

This is the second in a four-part series that looks at the theological foundations for a faithful response to AI in Christian schools. This post is an excerpt from Paul Matthews’ book A Time To Lead.

In the first installment of this four-part series, we looked at how intentionally living under the lordship of Christ allows us to navigate AI wisely. Not only does this priority free us from the dual errors of fear and idolatry, but it also allows us to use this tool wisely.

However, in building a framework for understanding AI in Christian education, we must push beyond what we can do with the tech and understand what the tech does to us. We must understand its formative power.

Technology as a formative force

Our world is full of formative influences. These influences shape who we are, what we love, and how we act. Ideas, people, environments, and technologies have a formative effect on us. It is with this impact in mind that the Apostle Paul wrote, “Do not be conformed to this world, but be transformed by the renewal of your mind” (Romans 12:2).

Derek Schuurman, professor of Computer Science at Calvin University, argues that in the DNA of every new technology, including AI, there is a story being told about the world. Similarly, Neil Postman stated, “Embedded in every technology there is a powerful idea, sometimes two or three powerful ideas.”

Our ethical beliefs and moral actions are shaped by the stories and ideas embedded within the technology we use. If we accept these stories and imbibe these ideas without discernment, they can shape us in dramatic ways. This sentiment is captured in the idea that “we shape our tools, and then our tools shape us.”

The stories within the technology may have been deliberately constructed by the corporations who built them, or they may be a dimension of which even the creator is unaware. Either way, these ideas are compelling and formative. Let us briefly examine some of the stories within these ubiquitous AI tools.


Case Study 1: Snapchat’s My AI

In April 2023, Snapchat launched an AI chatbot within its popular app. With no warning, training, or prior consideration, 750 million users had access to an AI chatbot at their fingertips. Early criticism of the feature centred on inappropriate responses given by the chatbot. Screenshots were being shared of the chatbot telling underage partygoers how to mask the smell of alcohol and marijuana (Fowler) or engage in underage sex and lie about it to their parents (Smith).

While these reports are concerning, little was said about the deeper story within this AI chatbot used by hundreds of millions of teens. If I had to guess, the following ideas are contained within tools like this one:

  • Conversations should be about the topics I want to talk about, and they should last as long as I want them to.
  • A good conversation is one where I have my emotional and intellectual needs met in a way that feels right to me.
  • A bad conversation is one where my conversational partner doesn’t tell me what I want to hear or engage in a way that makes me feel good.

While accessing age-inappropriate information is concerning, more concerning still is that this technology could shape a student’s entire understanding of interpersonal relationships. The ideas within this technology could transform (or, indeed, deform) every relationship the student has.

Case Study 2: OpenAI’s ChatGPT

In November 2022, OpenAI launched ChatGPT. For many, it was their first encounter with a large language model (LLM). It left users in awe as it produced novel, engaging, and seemingly intelligent content. As with Snapchat’s My AI, much of the critique revolved around incorrect output (Bordoloi). But on a deeper level than the accuracy of output, what are the stories and ideas embedded within a technology like this one?

Here are a few of my best guesses:

  • It would be a waste of my time learning to do any task that AI can complete to a higher standard in less time.
  • I don’t need to wrestle with deep questions because I can ask an AI that will give me an instant answer to any question I have.
  • I don’t need to stare at a blank page waiting for inspiration. I simply get ChatGPT to write a full first draft of my work, and then I curate the output.

While the accuracy of information is important, more important is a student’s understanding of learning, creativity, and the value of knowledge, all of which are impacted by the “story” within AI tools like ChatGPT.

Ethical engagement

How do we engage with these tools ethically? How can we be sure that we aren’t being unwittingly formed by the powerful stories and ideas embedded within these technologies? Let me make two suggestions.

Discern and critique

Firstly, we must discern and critique the stories. The simple activity of uncovering the stories or ideas within the new technology is a confident step in the right direction.

Bringing these stories to light removes one of the most formative aspects of technology: our ignorance of its shaping power. Once brought into the open, we can have discussions in community (more on this later) about how this formation aligns with God’s vision for our lives and what steps we can take to ensure we are being transformed by the renewing of our minds, not conformed to the pattern of this world.

Navigate new technologies with ancient morality

More than ever, we need faith, integrity, wisdom, self-control, and love. We must encourage ourselves and our students to turn away from evil and do good (Psalm 34:14). While this kind of exhortation has always been a part of Christian education, we must proclaim it with renewed vigour. The temptations of our age are not new, but the frequency and intensity of the temptations are. Put simply, it’s never been this easy to sin.

Take cheating on homework, for example. Students have been cheating on their homework for as long as there has been homework, but it is now easier than ever before. In the past, cheating may have involved asking a parent, friend, or tutor to complete the work, or finding a relevant book and copying a passage or stealing an idea. Cheating was possible, but it wasn’t necessarily easy. Now, every student with internet access can simply copy and paste a question into ChatGPT and have an answer in seconds. It’s the same temptation, but far more accessible. Because there has never been an easier time to cheat, there has never been a more important time for integrity.

As we live under the Lordship of Christ and engage with AI ethically, we must then consider how we can adapt appropriately. That adaptation will be the subject of Part III in this series.


References

Bordoloi, Satyen K. “The Hilarious and Horrifying Hallucinations of AI.” February 2, 2023.
Fowler, Geoffrey A. “Snapchat’s My AI: A New Era in Personalized Filters.” Technology Section, March 14, 2023.
Smith, P. “AI Glitter Fails to Decorate Snapchat’s Slapdash Sentience.” April 27, 2023.

Author

  • Paul is an Australian teacher, consultant, and TEDx speaker dedicated to partnering with schools to navigate the challenges and opportunities presented by artificial intelligence. Paul emphasizes the need for clear theology, principles of wise AI use, and evidence-based practices. He has addressed these themes in his two books, A Time to Lead and Artificial Intelligence, Real Literacy.

