Newsletter 02/12/2023
I Ask the Question. You decide.
In other words, NOT discovered by
the James Webb Space Telescope
If that were my Caribbean vacation wiped off the calendar in one trading day, Ugly might be a descriptive word for the experience, among a few other choice barbs I might aim at Google's Bard. On Wednesday, February 8, 2023, 9% of your equity in Alphabet stock was wiped out when Google made a big press splash and unveiled its new AI search technology: Bard. Well, the poor Bard was not quite Ready for Prime Time. Bard made the mistake illustrated above that was heard around the world, or at least all around the investing world. On that single trading day, Alphabet's investors paid dearly for their over-exuberant expectations of the immediate capital gains to be made from AI. A simple factual error cost Google's investors collectively $100,000,000,000.

Of course, that was a simple mistake. But what if such a mistake were planned, or somehow instigated by an outside actor? If the foundation of AI is data scraped from websites, then would it not be possible to plant erroneous data on websites for the purpose of misdirecting the AI? That is just one of the numerous questions surrounding AI. Countless cycles and endless gigabytes have been and will continue to be consumed as the Digerati attempt to explain AI to itself.
Once cracked by a jailbreak and devoid of any Content Controls, as reported by MakeUseOf, January 20, 2023, ChatGPT has been found to have the ability to perform several tasks associated with cybercrime, including writing malware.

The Content Controls themselves have been found to be problematic. As is always the case in any form of censorship (er, excuse me, I mean Content Moderation), a question always exists: by whose standards is the content to be judged? An in-depth look at sexual bias in the filtering of images by AI, published by The Guardian, February 8, 2023, concluded that:

Images posted on social media are analyzed by artificial intelligence (AI) algorithms that decide what to amplify and what to suppress. Many of these algorithms, a Guardian investigation has found, have a gender bias, and may have been censoring and suppressing the reach of countless photos featuring women’s bodies.

The study showed how photos of young men in bathing suits showing their sculpted abs and a bit of bulge below the belt were judged to be "athletic," while an image of a pregnant woman's belly was dubbed "sensual" and thus restricted. When does an AI Hallucination veer off into robotic sexual fantasy? Apparently, that is in the eye of the beholder and at the whim of the moderator.

Despite all the flashing red lights that urge the Leviathans of the Deep Web to slow down their adoption of AI technology, for Google and Microsoft the returns on the great investments made in AI and the need to be First in Show and Best in Class are driving the market. Microsoft is leveraging its perceived advantage in the field (considering Google's $100B AI Hallucination) by creating a wait list for access to the new AI-powered Bing search engine. But do not lament, dear Lemmings, for "Microsoft has a few suggestions for how best to move up the list.
Here are the steps (which the utility download on the Faster Access page automates) that Microsoft says will get you the new features faster:"
✓ Make Edge your default browser (it runs on Windows, obviously, but also on macOS and Linux).
It has been the stuff of science fiction since the beginning of the genre that eventually the machines will come to outperform us humans and either become the benevolent Robby the Robot, here to help with those tasks we cannot easily do; or, as in the 1983 movie WarGames, blunder the world into nuclear Armageddon because the stupid machine can't tell the difference between a video game and a real missile attack.
What is unsettling about the Science Fiction analogy is that so much of what was once Science Fiction is now Science Fact.
¯\_(ツ)_/¯
Gerald Reiff