Newsletter 02/11/2023

I Ask the Question.  You decide.
AI — Artificial Intelligence or Abject Idiocy?
Part 2: Now, Here Comes the Bad Part
There's Always a Bad Part

Picture yourself on a train in a station
With plasticine porters with looking glass ties
Suddenly someone is there at the turnstile
The girl with the kaleidoscope eyes

— Lucy in the Sky With Diamonds, The Beatles

Hallucinating rock stars in the 1960s were never any real threat to anyone but themselves.  In our hyper-technical 21st century, however, Hallucinating AI is apparently a real and present danger to us all.  Furthermore, I don't think I would be going too far out on a limb by saying that Hallucinating AI is inherently Bad.

AI Hallucination is simply the current euphemism for an age-old problem in computing: GIGO (Garbage In, Garbage Out).  As head of Google Search, Prabhakar Raghavan has warned:

“This kind of artificial intelligence we’re talking about right now can sometimes lead to something we call hallucination. This then expresses itself in such a way that a machine provides a convincing but completely made-up answer”.

Writing on February 7, 2023, Satyen K Bordoloi put it more precisely: AI Hallucinations are "caused by a variety of factors like errors in the data used to train the system or wrong classification and labeling of the data, errors in its programming, inadequate training or the systems inability to correctly interpret the information it is receiving or the output it is being asked to give."

... What is called ‘AI hallucination’? It is when an AI system gives a response that is not coherent with what humans know to be true. The system sometimes perceives something that isn’t there or does not exist in the real world.

Of course, what if the AI response is to a prompt about a subject the human user knows nothing about?  AI prose can be quite fluid in style, and it conveys an easy sense of authority.  P.T. Barnum has been proved time and again to have been unbelievably prescient in declaring: “There’s a Sucker Born Every Minute.”  We can readily see how social media has fanned the flames of social divisions.  AI has the potential to amplify even further the piles of misinformation that permeate much of cyberspace these days.

One feature of the technology that makes AI responses like those of ChatGPT inherently suspect is that these systems are currently "unable to cite their own sources."  Thus, AI is just as likely to be a valuable research tool as it is to be merely a "bullshit generator."  As Dan McQuillan wrote in Vice, February 7, 2023:

ChatGPT is, in technical terms, a 'bullshit generator'. If a generated sentence makes sense to you, the reader, it means the mathematical model has made a sufficiently good guess to pass your sense-making filter. The language model has no idea what it's talking about because it has no idea about anything at all. It's more of a bullshitter than the most egregious egoist you'll ever meet, producing baseless assertions with unfailing confidence because that's what it's designed to do.

Great. AI can either be the model of your Aunt Hazel, the kindly and knowledgeable retired school teacher; or come off like your drunken, ignorant, but highly opinionated Uncle Ernie, who insists on regurgitating in your direction whatever wasted cycles he spent on Facebook; or maybe you might get both personas at the same time.

AI Foundational Models build their datasets by scraping data from whatever sources they can reach.  A simple definition of data scraping is:

Importing data from websites into documents or spreadsheets is known as “data scraping,” sometimes called “web scraping.” Data is taken from the web and reused on other websites or utilized for the scrape operator’s benefit.
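As a toy illustration of the mechanics described above (my own sketch, not any particular AI vendor's pipeline), a few lines of Python using only the standard library's html.parser can harvest the text out of a web page, ready to be stored and reused elsewhere:

```python
from html.parser import HTMLParser

class TextScraper(HTMLParser):
    """Collects the text content of every <p> element in an HTML page."""
    def __init__(self):
        super().__init__()
        self.in_p = False
        self.paragraphs = []

    def handle_starttag(self, tag, attrs):
        if tag == "p":
            self.in_p = True
            self.paragraphs.append("")

    def handle_endtag(self, tag):
        if tag == "p":
            self.in_p = False

    def handle_data(self, data):
        if self.in_p:
            self.paragraphs[-1] += data

# A stand-in for a page fetched from the web (e.g. via urllib.request);
# a real scraper would loop this over millions of URLs.
page = "<html><body><p>First claim.</p><p>Second claim.</p></body></html>"
scraper = TextScraper()
scraper.feed(page)
print(scraper.paragraphs)  # the harvested text, ready to be repurposed
```

Note that nothing in this process records where the text came from or who holds its copyright, which is precisely what the lawsuits discussed below turn on.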

Several lawsuits over the data scraping of copyrighted material are now advancing.  Getty Images is one such Plaintiff to have filed legal action over copyright violations.  A group of artists has brought an action in the California courts; in that suit, the issue of web scraping is explicitly cited in the filing.  ClassAction.org recently reported on the case.

To create its widely used Stable Diffusion software, the backbone of all three products at issue in this suit, London-based Stability downloaded or otherwise acquired copies of “billions” of copyrighted images without permission, the filing states.

The article cited notes that the Getty Images suit names the same defendant as the California class action: Stability AI.  The suits maintain that the Defendants earned tremendous income from what is essentially the repackaging of copyrighted material.  Repurposing, or making something new derived from data scraped from websites, lies at the foundation of AI technology.

Central to any thesis about the downside of AI technology as currently implemented is an inherent "distrust" of any particular outcome derived from it.  As written in Daedalus (MIT Press, Spring 2022):

Despite sophisticated techniques to teach algorithms from data sets, there is no ground truth available to check whether the results match reality.  This is a basic challenge for ensuring reliable AI. We can prove that the learned algorithm is indeed the result of applying a specific learning technique to the training data, but when the learned algorithm is applied to a previously unseen individual, one not in the training data, we do not have proof that the outcome is correct in terms of an underlying factual basis, rather than inferences from indirect or arbitrary factors.
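The Daedalus point can be made concrete with a toy sketch of my own (no libraries, and no relation to any real AI system): a trivial "learned algorithm" that always returns an answer with the same apparent confidence, even for an input far outside anything it was trained on.

```python
# Toy 1-nearest-neighbor classifier "trained" on three labeled points.
train = [((1.0, 1.0), "cat"), ((1.2, 0.9), "cat"), ((5.0, 5.0), "dog")]

def predict(x):
    # Return the label of the closest training point -- however far away
    # that point is. The function has no way to say "I don't know."
    _, label = min(
        train,
        key=lambda p: (p[0][0] - x[0]) ** 2 + (p[0][1] - x[1]) ** 2,
    )
    return label

print(predict((1.1, 1.0)))      # near the training data: a reasonable guess
print(predict((900.0, -40.0)))  # far outside it: still answers, with no basis
```

Both calls return an answer in exactly the same form; nothing in the output distinguishes a well-grounded inference from an arbitrary one, which is the "no ground truth" problem in miniature.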

In other words, when there is a lack of any prior knowledge about a topic, and without sources cited that lend credence to any argument, how can a casual user discern what is fact, what is an AI Hallucination, and what is simply utter bullshit?  In some ways, the downside of AI forces us to revisit the age-old problem posed by the study of Metaphysics:
What is the nature of Reality?  Or more simply asked: What is Truth?

[PILATE] Then you're a king? --
[JESUS] -- It's you that say I am. I look for truth, and find that I get damned
[PILATE] But what is truth? Is truth unchanging law?
We both have truths - are mine the same as yours?

— “Trial Before Pilate,” Jesus Christ Superstar, Tim Rice & Andrew Lloyd Webber 

 

¯\_(ツ)_/¯
Gerald Reiff