Criminals are applying for remote work using deepfake and stolen identities, says FBI

The FBI has warned businesses of an uptick in reports of criminals applying for remote work using deepfakes and stolen PII (personally identifiable information).

A deepfake is synthetic or manipulated media (image, video, or audio), usually created with the help of artificial intelligence (AI) and machine learning (ML). Deepfakes are designed to look and sound as authentic as possible, which makes them difficult to spot unless you know what to look for.

Years of data breaches have made millions of Americans’ identities available for anyone with ill intent to gather and use for personal gain. Now, criminals seem confident they can pull off a scheme that aims to sabotage or steal from the companies that hire them while keeping their true identities hidden.

Armed with compelling synthetic images and video backed by legitimate PII, criminals could plausibly get the job before pulling the wool over their employer’s eyes.

Most open positions identified in the report were in the technology field, such as IT (information technology), computer programming, database management, and software development. The FBI also noted that some of the positions criminals are trying to fill would grant them access to PII, financial data, corporate databases, and proprietary information.

Fortunately for organizations, there is a glaring flaw in an otherwise masterful execution of deceit: the deepfakes the criminals use suffer from sync issues.

“Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants. In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

~ FBI, PSA number I-062822-PSA
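
The mismatch the FBI describes can be framed as a timing problem: the audio track and the visible mouth movement drift out of step. Below is a minimal, illustrative Python sketch of that idea, not anything the FBI prescribes. It cross-correlates a per-frame “mouth openness” signal with an audio loudness envelope to estimate the lag between them; the function name `estimate_av_lag`, the synthetic signals, and the 150 ms threshold are all assumptions for the demo (a real pipeline would extract these signals with facial-landmark and audio tooling).

```python
import numpy as np

FPS = 30  # video frame rate; both signals are assumed resampled to this rate

def estimate_av_lag(mouth_openness: np.ndarray, audio_envelope: np.ndarray) -> float:
    """Estimate audio-video lag in seconds via normalized cross-correlation.

    Both inputs are 1-D signals sampled at FPS: mouth_openness from facial
    landmarks, audio_envelope from e.g. per-frame RMS loudness.
    """
    a = (mouth_openness - mouth_openness.mean()) / (mouth_openness.std() + 1e-9)
    b = (audio_envelope - audio_envelope.mean()) / (audio_envelope.std() + 1e-9)
    corr = np.correlate(a, b, mode="full")
    lag_frames = np.argmax(corr) - (len(b) - 1)  # positive: video trails audio
    return lag_frames / FPS

# Synthetic demo: speech-like bursts, with the "video" delayed by 6 frames (~200 ms)
rng = np.random.default_rng(0)
t = np.arange(10 * FPS)
audio = np.clip(np.sin(2 * np.pi * t / 45), 0, None) + 0.05 * rng.standard_normal(t.size)
video = np.roll(audio, 6)  # lips trail the voice, as in a badly synced deepfake

lag = estimate_av_lag(video, audio)
print(f"estimated lag: {lag * 1000:.0f} ms")  # ~200 ms for this demo

# ASSUMPTION: 150 ms is an illustrative threshold, not an FBI-specified value
if abs(lag) > 0.150:
    print("audio and lip movement are noticeably out of sync")
```

A real detector would derive the mouth-openness signal from landmark tracking on the interview video and the envelope from its audio track; the cross-correlation step would be the same.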

Misuse of stolen PII can also be caught by a pre-employment background check. So even if an interviewer puts the desyncs in a deepfake video down to a dodgy connection, criminals won’t be able to escape the findings of a standard background check.

TechCrunch said the businesses most at risk from criminals entering the job market this way are startups and SaaS (software as a service) companies, because they potentially hold lots of data, or access to it, “but comparatively little security infrastructure compared with the enterprises they serve or are attempting to displace.”

If you’re worried that your stolen data might be used by criminals to land a job this way, there’s not much you can do apart from remaining alert and keeping an eye out for strange emails or phone calls.

Stay safe!
