In an Active Citizen Series lecture, Duke University Professor of Sociology, Political Science and Public Policy Christopher Bail discussed how to use ChatGPT to improve the productivity of our political conversations.
In his lecture, "Bridging Political Divides with Artificial Intelligence," Bail focused on how a large language model, specifically ChatGPT, can be used for conflict mediation and can make political conversations more effective, though not more persuasive.
The Sept. 18 lecture in the Moseley Center was part of the Active Citizen Series, a collection of events at Elon University designed to cultivate the next generation of informed leaders who help strengthen our communities and shape democracy. The university is hosting several nonpartisan events ahead of the 2024 election.

Bail described the American political atmosphere as similar to a couple having difficulty communicating and deciding to see a marriage counselor.
"One of the first things they tell you [in marriage counseling] is people talk past each other ... we misunderstand each other because we're just not connecting, we're just not able to see the world through the eyes of other people. It is a distinctly human problem," said Bail, who is the founder of the Polarization Lab and author of "Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing."
Chat alternatives
In Bail's study, a group of people consented to have a conversation with a partner who held a different political view. The politically left-leaning person described guns as a "stain on democracy," the politically right-leaning individual responded that "guns are protecting democracy," and by the end of the discussion one participant had called the other "incredibly naive." After this first trial, two to three people were given the choice of using Bail's AI chat assistant technology.
The chat assistant offered alternatives to the author of a message, who could accept or edit the suggestions. Instead of a user responding to a partner's differing opinion with an insult, the chat assistant gave the user the choice to rephrase the message and make the conversation more productive.
Bail and his team hypothesized that using the AI chat assistant would increase conversation quality and democratic reciprocity but would not change participants' minds on policy issues. At the end of the study, Bail found that the person using ChatGPT was commonly the one who changed their mind, because the technology allowed the user to take the perspective of their partner. By stepping into their partner's shoes, users better understood the other person's opinion and the reasoning behind it.
"We don't need AI to do politics for us. But wouldn't it be great if it could teach us to speak in a way that makes our arguments more effective?" Bail said.
The experiment tested democratic reciprocity: despite holding differing views on gun control, participants who used or received chat-assistant messaging were more inclined to take on tough conversations in the future.

Your new neighbor: AI
Nextdoor, the neighborhood community forum app, reached out to Bail and his team about applying their AI research to the app. The company was struggling with users violating its community messaging guidelines, and Bail cited research showing that many users had witnessed or experienced something on Nextdoor that negatively affected them.
With Bail's assistance, Nextdoor put its "Kind replies make a difference" initiative in place: an AI chat assistant flags messages that violate Nextdoor's community guidelines and recommends rephrased messages for the user to share instead. The messenger can edit the comment, accept a suggestion, or decline to rephrase and still send the flagged message. There is no forced censoring, just thoughtful suggestions to limit verbal conflict.
"I was pretty surprised when, in a transparency report last year, Nextdoor reported that this kindness intervention reduced the creation of content that violated its community guidelines by 15%," Bail said.
This technology, and Bail's research, also assisted the development of a new company, Temper, which provides the chat assistant technology to nonprofit organizations in the service of social good.
During the audience Q&A, Bail addressed the drawbacks of this technology, specifically a Stanford University sociologist's research on a content moderation tool that disproportionately flagged comments from African American users. However, Bail defended his position, pointing to the potential of ChatGPT to help bridge America's political divide by changing the way we speak to the "other side."