Friday, June 14, 2024

AI and the Pope

A few months back I wrote a Note on the question of "what is AI?" It appears that everyone knows but no one agrees. Now the Pope and his band of followers are lecturing world leaders.

The Guardian notes:

Global governance is now over AI like a rash. The EU, never slow to regulate in the digital field, has passed an act that seeks to regulate AI in the EU to ensure it is “aligned with human rights, democratic integrity, and the rule of law”. Canada is broadly following suit. The UK and the US are being less prescriptive. So how does the pope fit into this patchwork tapestry?

It is to Meloni’s credit that she is attempting to build on Japan’s work rather than set off in an entirely new direction. Indeed she has described AI as “the main challenge we face, anthropologically, economically, productively and socially”. But she has attached herself to the pope, partly because the pope himself is leaning on the thinking of a Franciscan friar Paolo Benanti – who has in turn become central to her own thinking, turning up as her adviser to meetings with titans such as Bill Gates.

Under-shaved, brown-robed and jovial, Benanti is adept at explaining how technology can change the world, “with humans ceding the power of choice to an algorithm that knows us too well. Some people treat AIs like idols, like oracles, like demigods. The risk is that they delegate critical thinking and decisional power to these machines.” AI is about choices. He points out: “Already a few tens of thousands of years ago, the club could have been a very useful tool or a weapon to destroy others …” The Italians, not pioneers in the technology, warn that AI prefigures a world in which progress does not optimise human capabilities, but replaces them.

If one looks closer, one sees again the lack of any specific definition. Banning opioids is comparatively easy; we know the chemical structure. Banning or regulating AI is a lost cause. First, everything is now called AI, yet no one can define it. I guess it is like pornography: you know it when you see it? Hardly.

The risk is that controlling individuals will take it upon themselves to posit definitions at will.