The Age of AI with Jason Thacker
On Episode 11 of the Ministry at Scale Podcast, we talked with Jason Thacker, Chair of Research in Technology Ethics at the Ethics & Religious Liberty Commission (ERLC). Jason brought valuable insight into the relationship between ministries and big tech companies, especially by helping us remember the human beings behind the computer screen.
Who is Jason Thacker?
Jason Thacker is the Chair of Research in Technology Ethics at the ERLC, a fairly unique role in which he works with technology companies, researches pressing social issues in the digital space, and produces resources that help churches approach technology from a Christian ethical perspective. Jason is also the author of The Age of AI, an insightful book about how Christians should think about and engage with technology.
The Age of AI
In his book, Jason argues that asking “Is technology good or bad?” is the wrong way to approach technology as a Christian. Technology is amoral: it is not inherently good or bad, and it faces no moral accountability before God the way humans do. But technology is not neutral either, because it shapes us, and the dangerous effects of how it can be used are easy to miss.
For our readers who don’t fully understand what Artificial Intelligence (AI) is, Jason explains it as “non-biological intelligence - the ability of a computer to think about or do things that humans can do, like making decisions.” AI is built from complicated algorithms, or calculations, that learn over time. Often, we don’t even realize how many things AI is built into – from our phones, websites, and home devices to entire jobs, and even our national security.
Since we’re using AI every day without even realizing it, Jason says it’s important, especially for Christians, to try to understand how these tools work and how they’re shaping us – for good or for bad.
Ministries & Big Tech
We recorded this podcast on February 9, 2021, at a time when the role and influence of big tech companies had become increasingly evident, especially for Christian organizations. At Five Q, we have had plenty of partners asking, “What happens if Google decides to no longer show our website in search results?” We asked Jason for his advice on how ministries should approach big tech, and his response was to remember the key commandments: love God and love neighbor.
When it comes to censorship, especially on social media, it’s easy to forget the big picture:
Social media companies face some of the most pressing questions in our society, and have to manage those issues on their platforms.
Content moderation policies, even bad ones, are written by people just like you and me.
When a post gets taken down, it’s easy to view the situation in individualistic terms (“Facebook wants to censor me”) instead of realizing that the AI that flagged your post has been sifting through millions of other posts that same day. And the human moderator who handles your appeal reviews thousands of posts a day, often seeing the darkest and most vile content people try to put on the internet, so they might make a mistake, too.
Jason’s advice to ministries who are trying to navigate the content moderation policies of big tech companies is three-fold:
Take time to read and understand the content moderation policies. The goal is not to skirt the policies, but to help you write content that works within them.
Take a step back and remember that these are policies. We as the public have the right to have conversations about them, and to advocate for better ones.
Don’t panic. This world is not our home, and as Christians we know the end of the story. Our enemies are not political parties, tech companies, or governments; our enemies are the principalities and powers of evil.
Jason recommends two books for our audience to check out:
To learn more about Jason Thacker, listen to his podcast, purchase his book, or sign up for his weekly newsletter, visit www.jasonthacker.com.