Wednesday, July 5, 2023

I Don't Want an AI ChatBOT

Garbage in, garbage out, or GIGO. That is the risk of these AI ChatBots, or AICs. You see, they merely take in lots of data, process it through a weighted array, and then use a text formatting system to provide an answer. Thus if I ask "What is the cause of prostate cancer?" the AIC sees words like prostate, cancer, cause, what, etc., passes all of its stored information through the neural net to come up with an answer from the input, and then renders it back as a sentence. The input may be a variety of things, from professional journals to newspapers to blogs. Each of these has varying reliability, and some critical literature may have been missed altogether.
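To make the mechanics concrete, here is a toy sketch in Python. It is my own simplification, not any vendor's actual code: count which words tend to follow which in the harvested text, store those counts as weights, then sample a "sentence" from them. Whatever went in, reliable or not, is what comes out.

    import random
    from collections import defaultdict

    # Toy "training data" of wildly varying reliability: a journal, a blog, a review.
    corpus = (
        "prostate cancer risk rises with age . "
        "prostate cancer is caused by diet alone . "   # a blog's claim, counted the same
        "the cause of prostate cancer is not fully known . "
    ).split()

    # The "weighted array": how often each word follows each other word.
    weights = defaultdict(lambda: defaultdict(int))
    for prev, nxt in zip(corpus, corpus[1:]):
        weights[prev][nxt] += 1

    def generate(start, length=10):
        """Sample a 'sentence' by following the learned weights (GIGO in miniature)."""
        word, out = start, [start]
        for _ in range(length):
            followers = weights.get(word)
            if not followers:
                break
            word = random.choices(list(followers), list(followers.values()))[0]
            out.append(word)
        return " ".join(out)

    print(generate("prostate"))

A real system uses billions of learned parameters rather than word counts, but the principle is the same: the output is a weighted recombination of whatever text it was fed.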

Then, if this question is asked again, the system iterates and reinforces its original opinion. If, however, the user says it is wrong, it may adjust, depending on the algorithm.
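Here is a hypothetical sketch of that feedback loop; the scores and adjustment factors are purely my own assumptions, not any documented algorithm. Each candidate answer carries a score, an unchallenged repetition nudges it up, and a user's "that is wrong" nudges it down.

    # Hypothetical feedback loop: the scores and factors are illustrative assumptions.
    answer_scores = {
        "age and family history": 1.0,
        "diet alone": 1.0,
    }

    def ask():
        """Return the currently highest-scoring answer."""
        return max(answer_scores, key=answer_scores.get)

    def feedback(answer, seemed_fine):
        """Reinforce an unchallenged answer; penalize one the user flags as wrong."""
        answer_scores[answer] *= 1.1 if seemed_fine else 0.5

    # Asking again without pushback reinforces whatever came out first...
    feedback(ask(), seemed_fine=True)
    # ...unless the user objects.
    feedback("diet alone", seemed_fine=False)
    print(ask(), answer_scores)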

The results are correlative, not dispositive. The AIC just takes lots of others' statements, often without any weighting for reliability, and produces a text output.
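Picture the aggregation step as nothing more than a tally. The statements and counts below are invented, but they show how a claim repeated across many weak sources can outvote a single careful study when nothing is weighted for quality:

    from collections import Counter

    # Invented statements; all that matters to the tally is how often each appears.
    harvested = (
        ["prostate cancer is caused by diet"] * 7        # echoed across many blogs
        + ["the cause of prostate cancer is unknown"]    # one careful review article
    )

    # With no weighting for source quality, the most frequent claim wins.
    answer, count = Counter(harvested).most_common(1)[0]
    print(f"'Answer': {answer} (seen {count} times)")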

The concern is that one may rely on the AIC's result as being factual. It is not. At best it is a correlative collection of disparate opinions.