“AI: the algorithm, censorship of the 21st century” is the theme of a meeting taking place on 17 December in Rome, at the Salone Pietro da Cortona of the National Galleries of Ancient Art – Palazzo Barberini, promoted by Lucia Borgonzoni, Undersecretary of Culture.
Lucia Borgonzoni holds the ministerial portfolios for cinema and audiovisual media, copyright, and cultural and creative enterprises. She is also responsible for promoting youth-led businesses in the culture sector.
In recent months Borgonzoni has fought to defend the audiovisual sector from possible abuses of artificial intelligence, with their attendant risks to copyright and to the professions involved, such as voice acting. Already last summer, the Undersecretary had expressed her “dismay” at the “censorship carried out in the digital sphere to the detriment of an internationally renowned talent and of works that are the fruit of his creativity. An episode which is yet another demonstration of the limits of platforms which, incapable of distinguishing pornography from art, end up harming our artists and our cultural heritage”. Borgonzoni was referring in particular to artistic nudes, such as those of the artist Jago, blocked by social platforms such as Meta (Facebook/Instagram).
“There is also the case of a photographer”, Borgonzoni explains, “who could no longer find her photos of pregnant women online. Evidently, for social platforms, the sacredness of the naked belly of a pregnant woman, depicted many times in art, is pornography. The algorithms do not understand that a naked body depicted in a painting or sculpture is art, without sexual or pornographic implications. And this risks doing enormous damage to a country endowed with an extraordinary artistic heritage such as Italy”.
This happens, according to Borgonzoni, because the large digital giants subcontract content moderation to countries where conceptions of democracy and of women are very different from ours. The implications concern not only art but also religion: a naked Baby Jesus in the manger, or in the arms of the Madonna, runs the risk of being deemed pornographic on social platforms.
“One day the algorithm”, Borgonzoni explains, “could decide not merely to censor a work of art, as has already happened with Canova’s statues, because the algorithm does not recognize art, only nudity and non-nudity. We also risk these algorithms one day deciding that a religion or an ethnic group is no longer acceptable, is inappropriate, talks about the wrong things. The algorithm could then decide that they should be erased, introducing forms of denigration and marginalization”.
The objective of the conference, and of the Undersecretary’s forthcoming initiatives, is to open a serious discussion of these issues, because otherwise “we risk our civilization, as we know it, being slowly swept away by this standardization”. “The danger”, Borgonzoni concludes, “is that the algorithms decide what is acceptable or unacceptable, what the population accessing the platforms likes or dislikes. At a certain point, the true and the false will no longer exist, only what is true for the majority of people”.