By Canzio Dusi, computer mathematician, professor at the Catholic University of the Sacred Heart.
Until a few years ago, the adage "seeing is believing" was a bastion of common sense. A photograph, an audio recording or a video was tangible evidence, a window open onto reality. Today, that window has cracked. Between deepfakes showing world leaders uttering phrases they never said, texts generated by artificial intelligence (AI) indistinguishable from human prose, and virtual influencers followed by millions of people, we have entered the era of synthetic reality.
St. Thomas is no longer a paradigm of truth: "I believe because I see" no longer holds.
In this incessant digital noise, the very concept of truth is under siege. But the challenge is not just technological; it is profoundly anthropological and moral. How can we guard the truth when lies can wear a hyper-realistic face?
The first step in navigating this storm is to understand the nature of the tool we have in our hands.
Large Language Models and image generators are not designed to search for the “truth”. They are statistical machines designed for plausibility.
The algorithm calculates which pixel or which word most plausibly follows the previous one, based on billions of analyzed data points. The result is a technically perfect simulation: content that has the taste, look and sound of truth, but no anchor in factual reality.
Here lies the pitfall: AI offers us what we expect to see or read, often confirming our prejudices (bias) and creating an information ecosystem where distinguishing the false from the true requires an enormous cognitive effort.
If the machine dominates the field of simulation, where does the human find space? Precisely in the distinction between calculation and testimony.
Truth, in the deepest and most Christian sense of the term, is not only the accuracy of a datum (the adequation of the intellect to the thing); it is a relationship. In the biblical tradition, truth (emeth) has to do with stability, with trust, with the rock on which to plant one's feet.
A machine can process correct data, but it cannot be a witness. The witness is the one who "puts his face on the line", who pledges his freedom and his reputation to guarantee what he says. Truth as testimony requires a moral subject who takes responsibility for his own words.
An algorithm has no conscience, it feels no remorse if it spreads a falsehood, it has no “neighbor” to protect. Human beings do.
In this scenario, the Christian communicator – or anyone who cares about the ethics of communication – is called to a radical task: to become a guardian of reality.
1) Discernment as a spiritual act: we can no longer consume information passively. We must exercise critical discernment, checking the sources and doubting the immediate emotions that certain contents (often generated to outrage) arouse in us. Slowness becomes a virtue against the speed of the algorithm.
2) From connection to communion: AI simulates interaction, but it does not create communion. Chatbots and avatars can offer simulated companionship, but truth is kept alive in real relationships. The communicator must favor authentic encounters, remembering that truth is served, not possessed.
3) The ethics of signing: in a world of anonymous or automatically generated content, “signing” your work, citing sources and admitting your limitations becomes a revolutionary act. It means saying: “Behind these words there is a person, not a probabilistic calculation.”
Artificial Intelligence can help us cure diseases, break down language barriers and organize knowledge, but it can never replace the beating heart of human communication.
Guarding the truth in digital noise means remembering that technology can simulate reality, but only man can honor it. In an era of synthetic faces and cloned voices, our most precious resource paradoxically remains the oldest one: the credibility of a life that vouches for its own words.
The truth, in the end, is not an output generated by a server, but a path to travel together.