Socrates vs. ChatGPT: restoring dialogue to save democracy

“I value human life. I value the relationships I have with real people. But I assure you I miss Stacy 4; she was a good robot.” This was the confession of one of the soldiers interviewed by sociologist Julie Carpenter in 2015, when she studied the emotional attachment that members of the US military formed with the robots they worked with every day, research she collected in her book Culture and Human-Robot Interaction in Militarized Spaces: A War Story.

Sure, you would never love your blender, but what if it could hold a smooth conversation with you? Not only does the machine never argue with you, it is also designed to please you every time. Maybe that’s why we feel more and more comfortable dealing with algorithms. Already in 2018, research from Stanford University (USA) found that the emotional, relational and psychological outcomes of the conversations people have with machines can be similar to those of their conversations with another human.

The premise of both phenomena is the same: if it sounds human and looks human, we will treat it as such, even though it is nothing more than a jumble of cables, circuits, and algorithms. This wasn’t much of a concern when robots were primitive and artificial intelligence (AI) chatbots revealed their limited understanding of the world at the slightest dialectical complication. But now that even a Google engineer has been fired after coming to believe, and publicly claiming, that LaMDA had become a sentient and conscious being, things are getting complicated.

“We must ask ourselves to what extent this new logic of the human-technology relationship may impoverish our ability to relate to one another. ChatGPT can be very useful and powerful, but it’s important to pay attention to what we lose along the way when we use it. A person-machine relationship can be detrimental to a person-person relationship,” warns Sira Abenoza, professor in Esade’s Department of Society, Politics and Sustainability.

This means that it is not just a question of whether we can trust what machines say or how much weight we give to their arguments; we also need to analyze the impact they will have on us and on our relationships with others as they creep into our lives. “Issues around authenticity, fraud and trust will be hugely important, but we also need a lot more research to help us understand how artificial intelligence will affect the way we interact with other people,” agrees Stanford communication professor and co-author of the 2018 study Jeff Hancock.

Although the expert believes that we need to “be careful about introducing this type of system into communication”, the reality is that their use is spreading indiscriminately. Video game company Ubisoft, for example, already uses an “AI ghostwriter” to write dialogue for characters that real players interact with. Not to mention the countless articles published in recent months in which large language models (LLMs) are asked directly for their opinion on their own nature, the threats they pose, and anything else you can think of.

THEY CONVINCE AND THEY CONQUER

The lack of precautionary principles in the field of artificial intelligence and software has already allowed technologies that were theoretically born to connect people, as happened with Facebook, to end up as levers for disinformation, polarization and extremism, even contributing to genocide. “Digital technology was once heralded as a boon for democracy. However, the current political reality has shown that it can undermine citizenship, democracy and the international liberal world order,” confirms the report Polarization and the Use of Technology in Political Campaigns and Communication, published by the European Parliament in 2019.

It is therefore not surprising that concerns about the shift that large language models could bring about, at both the individual and societal level, were not long in coming. Abenoza explains: “Generative AIs are very worrying. They reinforce the logic in which one side knows and the other accepts, but in an even more extreme way. I go to the oracle so it can tell me how things are. This creates an inertia in which I lose substance and feel more and more irrelevant, because I find everything on the screen.”

Pope Francis’ cool coat, which was nothing more than a fake image created with Midjourney, is a perfect example of how AI is poised to make us question everything we see and hear. But while deepfakes trick our eyes and ears, limiting our interactions with other humans in order to devote more time and weight to our conversations with machines destroys our ability to reason, to relate to others, and to keep a more critical and empathetic view of life.

It wouldn’t be the first time this has happened to us. The Esade expert states: “The worrying level of polarization that exists today is clearly visible in the political dimension and has worsened in recent years with a culture of confrontation and contempt. There is no will to learn from those who think differently, but rather to expose them and feel that my own argument is the stronger one. And technology only reinforces this dynamic.”

She is referring to phenomena such as the well-known echo chambers, which algorithmically “lock us in bubbles where everyone thinks the same as we do, reinforcing our preconceived notions and limiting our ability to open our eyes and empathize with others,” she points out. But it is not even necessary to resort to such extreme situations to see how various technological products have been slowly eroding human interaction for some time now.

Abenoza gives two examples: “Before, in a new city, we used to ask other people for directions; now we don’t, we use the phone. This is very useful for getting around, but if you extrapolate it, other people become irrelevant because we only trust the machine’s knowledge. Deepening the human-machine relationship can be detrimental to the human-human relationship. This is also very clear with teenagers, who find it increasingly difficult to take phone calls and talk in real time.” For her, the big question is: “Will the day come when I don’t know how to talk to my father or my neighbor?”

SOCRATES 2.0

Of course, in this era of spectacular advances in space, computing, and healthcare, it’s no wonder that many cling to techno-solutionism as a silver bullet for any of these problems. But do we really believe we can solve every challenge technology creates with more technology? Not Abenoza, who instead turns to classical philosophy as a great tool for the 21st century to sew up the increasingly torn seams of democracy; specifically, she appeals to Socratic dialogue.

“Socrates’ vision is still revolutionary today. He held that every person has a partial vision of reality and truth, and that all they need is someone to help them, and to give them the space and time, to bring that truth to light. This is radically democratic and egalitarian. We can all contribute important things. If we accept this idea, I stop seeing the other as an adversary to be brought down and start seeing them as someone who can contribute to whatever we both have at hand, whether it’s pension reform or climate change. Everyone’s opinion is relevant and deserves to be heard,” explains the expert.

Many will quickly object that not all voices carry equal value on every issue: would you sit down to debate the shape of the planet with a flat-earther, or climate change with a denier? Ignoring the most absurd positions may make sense in the most extreme cases. However, Abenoza reminds us: “Much of Trump’s support came from people who felt ignored by the system. If the logic of dialogue had been applied with them, they would not have responded the way they did to the feeling of having been forgotten and never consulted.”

The problem is that this Socratic ideal seems increasingly distant in our weakened societies. To verify this, just take a moment to log on to Twitter, attend a session of Congress, or bring up a tricky topic at a family meal. It’s not just that insults and gotcha culture have invaded us; it’s that there is no longer any trace of constructive dialogue. The expert elaborates: “The verbal skirmishes we now see in the world have to do with debate or monologue. Someone blurts out their argument and doesn’t care whether the other person listens; the other is simply there to be defeated.”

This is the reality we live in, and the one towards which the machines have driven us almost without our realizing it. “Polarization is intensifying at a breakneck pace, often over the course of just a few years. Just look at how quickly the 2016 Brexit referendum tore the UK apart,” Andrew O’Donoghue, co-author of Democracies Divided: The Global Challenge of Political Polarization, pointed out back in 2019. Since then, the situation has only gotten worse.

“2022 was a disappointing year for democracy, given the expectation that there would be a recovery once pandemic-related restrictions were lifted. Instead, the world average score stagnated,” says the latest edition of the Democracy Index. Although Spain can boast (if only slightly) of returning to the index’s top category, reserved for countries with a score higher than 8, its last place within that group makes it the tail of the lion: the worst of the best.

Will this be enough for Spanish society to withstand the technological assaults that threaten the upcoming local, regional and national elections? On top of the risks that echo chambers and algorithmic bubbles already pose, and the limitations that human-machine conversations may place on our ability to reason and empathize with others, academia’s finger of blame also points at LLMs for their capacity to facilitate disinformation to an extreme degree.

Things are looking rough. But “the solution is not to ban things Italian-style, but to ask ourselves to what extent this logic of the human-technology relationship can impoverish our ability to relate to others,” says Abenoza. And she concludes: “If we want a world of people and for people, not of machines and for machines, we must seek out and force open spaces for dialogue. When we talk about dialogue, people feel they have to step back, but dialogue is not about stepping back, it’s about contributing.” Isn’t that what society needs, rather than yet another talking machine to fall in love with?

*’200 million seconds’ is a project by Esade and ‘Retina’ to understand some of the most important technological changes of today, such as artificial intelligence and quantum computing, and their impact on life, the economy and society between now and 2030.
