Threat Library

The attacks LiveLock is designed to stop.

AI impersonation fraud is no longer theoretical. Every scenario below has a documented real-world analogue — a company that lost money, data, or access because they trusted a voice, face, or email without a second channel of verification.

Financial

"Send this wire now" — urgent email from the CFO

Attackers compromise or spoof executive email accounts and create urgency to bypass normal approval processes.

Email Compromise

Vendor calls to update their banking details

A fraudster impersonates a known vendor over phone or email, redirecting future payments to a controlled account.

Impersonation

Real estate attorney sends new closing instructions

Email accounts of attorneys or title companies are compromised, and last-minute wiring instructions are changed.

Wire Fraud

AI-cloned CFO voice authorizes an emergency transfer

Using just minutes of public audio, AI tools can clone an executive's voice to authorize transactions over the phone.

Voice Clone

Deepfake CFO on video call approves a $25M wire

The Arup/Hong Kong incident: a finance worker was deceived by a full video call featuring deepfake recreations of colleagues.

Deepfake Video

System Access

IT manager requests admin credentials over Slack

Attackers compromise Slack or Teams accounts and use them to request privileged credentials from IT staff.

Account Takeover

"Your CEO" approves emergency system access on a video call

Real-time deepfake video tools allow attackers to impersonate executives during live video calls to authorize access.

Deepfake Video

Colleague asks you to share the client database — via text

Compromised phone numbers or messaging accounts are used to request sensitive data exports from trusted colleagues.

Data Exfiltration

Fake IT support requests remote desktop control

Attackers clone the voice of a known IT contact and call employees to request remote access under the guise of support.

Voice Clone

North Korean IT worker hired via deepfake job interview

A documented scheme the FBI has warned about: state-sponsored actors use AI face-swapping during live video interviews to gain insider access.

Identity Fraud

Fake employee gains system access for months undetected

Once inside, fraudulent employees exfiltrate data, install backdoors, or sabotage systems over extended periods.

Insider Threat

HR & Legal

Executive voice memo authorizes a new hire or termination

AI-cloned audio of an executive can be used to issue HR directives — hiring, firing, or salary changes — without their knowledge.

Voice Clone

Attorney instructs you to sign and return a contract — by email

Compromised legal counsel email is used to send fraudulent contracts or redirect signed documents to attackers.

Impersonation

Manager approves sensitive personnel data release over phone

An attacker using a cloned manager's voice calls HR to authorize the release of employee records, salary data, or personal information.

Deepfake Audio

Fake board member approves a policy change via video

Governance attacks target board-level decisions — using deepfake video to impersonate directors during remote meetings.

Deepfake Video

Physical & Identity

Deepfake face bypasses facial recognition at building entry

AI-generated faces or video loops are used to defeat facial recognition systems at physical access points.

Biometric Fraud

Synthetic identity used to open a business bank account

AI generates fully synthetic identities with realistic documents, photos, and credit histories to open fraudulent accounts.

Identity Synthesis

Cloned voice defeats bank voice authentication

Financial institutions using voice biometrics for authentication are increasingly vulnerable to AI voice cloning attacks.

Voice Clone

AI Video Generation — The Emerging Frontier

EMERGING THREAT

Real-time AI tools can now generate a live video feed of a fake co-worker using only a few photos. A fraudster can appear on a video call as your CFO, CEO, or trusted colleague — giving verbal and visual approval for a wire transfer, policy change, or system access grant — then disappear without a trace.

In 2024, a finance worker at engineering firm Arup paid out $25 million after a video call featuring deepfake recreations of his CFO and other colleagues. The scam was only discovered weeks later. In 2025, Singapore saw a $499K deepfake CEO video scam. These are no longer edge cases.

→ LiveLock's out-of-band challenge cannot be intercepted by a video feed. The one-time word only appears on the real person's registered, device-bound app — not on any screen an attacker can see.
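The core idea of an out-of-band challenge can be sketched in a few lines. This is a simplified illustration under assumed names, not LiveLock's actual implementation: a verifier issues a random one-time word that is delivered only to the real person's registered device, so nothing an attacker shows or says on the call channel can contain it.

```python
import secrets

# Small illustrative wordlist; a real system would use a larger one.
WORDLIST = ["amber", "birch", "cobalt", "dune", "ember", "fjord", "garnet", "harbor"]

class OutOfBandChallenge:
    """Hypothetical sketch of a one-time-word challenge.

    issue() simulates pushing a random word to the requester's
    registered, device-bound app. verify() checks the word the
    person reads back, and each word is usable exactly once.
    """

    def __init__(self):
        self._pending = {}  # user_id -> expected one-time word

    def issue(self, user_id: str) -> str:
        word = secrets.choice(WORDLIST)
        self._pending[user_id] = word
        # In practice: deliver via push to the registered device only,
        # never over the call or video channel being verified.
        return word

    def verify(self, user_id: str, spoken_word: str) -> bool:
        # pop() enforces single use: a replayed word always fails.
        expected = self._pending.pop(user_id, None)
        return expected is not None and secrets.compare_digest(
            expected, spoken_word.strip().lower()
        )
```

Because the word reaches only the genuine device and expires after one use, a deepfake video feed has no way to learn or replay it.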

See how LiveLock stops these attacks in real time.