Newsletter 09/19/2023

Time Has Come Today:
Deep Fakes On the Rise

I'll be your mirror
Reflect what you are, in case you don't know
— I'll Be Your Mirror, Lou Reed, The Velvet Underground

On August 29, 2023, software vendor Retool reported that on August 27, 2023, it had notified "27 cloud customers there had been unauthorized access to their accounts."  Retool explained that a number of employees received similar text messages falsely claiming to be from the company's IT department.  One employee clicked the link in the spam text, which led to a fake company login site.  The attacker then took a further step.

After logging into the fake portal – which included a MFA form – the attacker called the employee... The caller claimed to be one of the members of the IT team, and deepfaked our employee’s actual voice.

On September 16, 2023, decrypt.co reported that several well-known celebrities have had their likenesses, voices, and other recognizable personal attributes digitized and stored by AI startup Metaphysic.ai.  As Jason Nelson of decrypt.co stated in his report, the celebrities are doing this as a means to protect what is now considered their intellectual property.

Demonstrating the need for celebrities to copyright their personal attributes, at the recent CogX Festival, a UK-based, Coachella-like celebration of all things AI, actor Stephen Fry presented a recording of "an AI system mimicking his voice to narrate a historical documentary."  Fry had narrated all 7 of the Harry Potter audiobooks, as reported by Deadline, September 17, 2023.  Fry laid out well the essential problem deepfakes pose, not just to Hollywood celebrities but, as the Retool incident shows, to anyone; this is an online landmine that could victimize us all.

The ever-growing ubiquity and unprecedented capabilities of AI technology have caught the attention of Uncle Sam and its cyber warriors.  On September 12, 2023, the NSA and its partner agencies released the joint Cybersecurity Information Sheet (CSI) "Contextualizing Deepfake Threats to Organizations" to help organizations identify, defend against, and respond to deepfake threats.  The press release, synopsis, and summary can be found here.  The complete 18-page PDF Cybersecurity Information Sheet, "Contextualizing Deepfake Threats to Organizations," can be had here.

In the synopsis, the authors offer a technically precise definition and explanation of what deepfakes actually are.

The term “deepfake” refers to multimedia that has either been synthetically created or manipulated using some form of machine or deep learning (artificial intelligence) technology.

Individual Consumers are unlikely to take the defensive measures the authors recommend, and many may consider reports such as this article and its sources irrelevant to their personal experience.  But as was certainly the case with all the cyber threats that have come before, the fallout from the threats "synthetic media" pose will eventually impact many of these same Consumers.  When Consumers' cellphones are vulnerable to SIM swapping, and 2FA identification techniques can be thwarted in various ways, it would be quite naive for anyone to assume that the increasing prevalence of synthetic media cannot impact them and their well-being.  Indeed, as the NSA report states, "Threats from synthetic media, such as deepfakes, present a growing challenge for all users of modern technology and communications." [pg. 1]

On March 6, 2023, the IEEE Spectrum published "Detection Stays One Step Ahead of Deepfakes—for Now," with the subtitle "The spread of AI-generated content is keeping the tech designed to spot it on its toes."  The IEEE offers a survey of the new technologies either currently available for immediate adoption or under development for future adoption.  IEEE summarized the state of the art thusly:  "Recent research shows progress in making detection more robust, sometimes by looking beyond subtle signatures of particular generation tools and instead utilizing underlying physical and biological signals that are hard for AI to imitate."  Examining all network traffic for "subtle signatures" is how malware detection is intended to work.  On the other hand, "physical and biological signals" signal the coming imperative of biometrics that Consumers will eventually be compelled by circumstances to adopt.  The most successful of these emerging technologies will incorporate both the technologies of detection and the biometrics of authentication.  The authentication of an actual living, breathing human being is the impetus behind the Worldcoin Orb, no matter how questionable its methodologies may seem.

One promising technology that may have Consumer appeal is an application developed by Intel.  Launched on November 22, 2022, FakeCatcher is described by Intel as:

...A technology that can detect fake videos with a 96% accuracy rate. Intel’s deepfake detection platform is the world’s first real-time deepfake detector that returns results in milliseconds.

FakeCatcher utilizes the science of "a process called photoplethysmography" (PPG).  PPG measures and analyzes "biological signals hidden in portrait videos [that] can be used as an implicit descriptor of authenticity, because they are neither spatially nor temporally preserved in fake content," according to the IEEE summary of the technology.  The National Library of Medicine, part of the National Institutes of Health, defines photoplethysmography (PPG) as:

... a simple and low-cost optical technique that can be used to detect blood volume changes in the microvascular bed of tissue. It is often used non-invasively to make measurements at the skin surface.
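To make the PPG idea concrete, here is a minimal sketch of the general principle, not Intel's actual FakeCatcher implementation, which is proprietary.  The premise: a living subject's heartbeat causes tiny, rhythmic color changes in facial skin, so averaging the green channel of a face region across video frames yields a faint "pulse" signal whose dominant frequency should fall in the human heart-rate band.  Synthetic faces tend not to preserve this signal.  The frames below are synthetic NumPy arrays standing in for real video; the function names and the 0.7–4 Hz band cutoffs are illustrative choices, not anything from Intel's product.

```python
import numpy as np

def estimate_pulse_hz(frames, fps):
    """Estimate the dominant pulse frequency (Hz) from a stack of
    face-region frames using a naive PPG-style analysis: average the
    green channel of each frame, remove the DC offset, then pick the
    strongest spectral peak inside the plausible human heart-rate
    band (0.7-4 Hz, i.e. roughly 42-240 bpm)."""
    # Mean green-channel intensity per frame -> a 1-D "pulse" signal.
    signal = np.array([f[..., 1].mean() for f in frames])
    signal = signal - signal.mean()            # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)     # heart-rate band only
    if not band.any():
        return None                            # clip too short to analyze
    return float(freqs[band][np.argmax(spectrum[band])])

# Synthetic demo: 10 s of 30 fps "video" whose green channel
# pulses gently at 1.2 Hz, mimicking a 72 bpm heartbeat.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
frames = [np.full((8, 8, 3), 128.0) + 2.0 * np.sin(2 * np.pi * 1.2 * ti)
          for ti in t]
hz = estimate_pulse_hz(frames, fps)
print(f"Estimated pulse: {hz:.2f} Hz ({hz * 60:.0f} bpm)")
```

A real detector would first locate and track the face, compensate for lighting and motion, and check whether the recovered signal is spatially and temporally consistent across the face, which is precisely what the IEEE summary says fake content fails to preserve.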

The infographic below, published by Intel, summarizes how the technology works to authenticate a biological entity, and thus identify a fake.

Currently, applications of advanced biometric technologies are either still in development or available only to large entities and institutions.  Nevertheless, the historical march of the distribution and adoption of IT products is one of ever-increasing adoption and use, driving production up and, ultimately, retail costs down.  So it is safe to assume that Consumers will soon be consumers of some deepfake detection and authentication technology.

If you think, like I and so many of my clients do, that hacking, malware, viruses, and all the measures that must be employed to defend against such email-borne threats have taken all the fun out of email and texting, think about what a bummer YouTube will soon become when we are required to scan and verify each incoming snippet of audio and video.

Stay Tuned.  More to come, I am sure.

Now the time has come (Time)
There's no place to run (Time)
I might get burned up by the sun (Time)
But I had my fun (Time)
— Time Has Come Today, The Chambers Brothers

¯\_(ツ)_/¯
Gerald Reiff