Creative Genius

Voice Cloning in 2025: The Ethics and the Law

ElevenLabs, Resemble, and others have made voice cloning trivial. Here's what's legal.

By Creative Genius · 7 min read

Cloning a voice now takes 30 seconds and a credit card. The law has not kept pace. Here's the actual legal landscape as of 2025 and the operational discipline that keeps you out of trouble.

Federal law: a gap that is closing

There is no federal statute specific to voice cloning as of today, but the FTC has signaled enforcement under its existing fraud and deception authority. The proposed NO FAKES Act would create a federal right of publicity covering voice and likeness; watch for movement in the next Congress.

State laws with real teeth

  • Tennessee — ELVIS Act. Criminalizes unauthorized AI voice cloning of artists. Active since 2024, first enforcement actions underway.
  • New York — AB 5605. Requires consent for AI replicas of performers, including audio.
  • California — multiple bills. The state with the most active legislative work on AI replicas. Track in your compliance calendar.
  • Illinois — BIPA precedent. Voice prints are biometric data. Class action exposure is non-trivial.

What about consent on the platform side?

ElevenLabs requires consent verification for "Professional Voice Cloning" but not for "Instant Voice Cloning" with short samples. Resemble has stricter verification. Vendors are not your compliance officer — the legal obligation is on the user of the technology.

The operational discipline

  1. Explicit written consent for every voice clone, signed before any synthesis.
  2. Log every synthesis with timestamp, requester, content category, and use case.
  3. Store consent records for at least 7 years. Treat voice clones like signed contracts.
  4. Watermarking where the platform supports it. Provenance signals reduce downstream fraud exposure.
  5. Right to revoke in your terms — the subject can withdraw consent and require deletion.
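The checklist above is a process, but the consent and logging pieces are concrete enough to sketch in code. The following is a minimal illustration, not a compliance tool: the record fields mirror step 2 (timestamp, requester, content category, use case), the retention constant mirrors step 3, and the revocation check mirrors step 5. All class and field names here are hypothetical; adapt them to your own audit store and legal review.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

RETENTION_YEARS = 7  # minimum consent-record retention (step 3)

@dataclass
class ConsentRecord:
    """Signed, written consent for one voice clone (step 1)."""
    subject: str                      # person whose voice is cloned
    signed_at: datetime
    revoked_at: Optional[datetime] = None  # set when consent is withdrawn (step 5)

    @property
    def active(self) -> bool:
        return self.revoked_at is None

    def retention_expires(self) -> datetime:
        # Approximate 7-year retention window from the signing date.
        return self.signed_at + timedelta(days=365 * RETENTION_YEARS)

@dataclass
class SynthesisLogEntry:
    """One audit-log row per synthesis request (step 2)."""
    timestamp: str
    requester: str
    content_category: str
    use_case: str
    consent_subject: str

def log_synthesis(consent: ConsentRecord, requester: str,
                  content_category: str, use_case: str) -> SynthesisLogEntry:
    # Refuse synthesis outright once consent has been revoked (step 5).
    if not consent.active:
        raise PermissionError(f"consent revoked for {consent.subject}")
    return SynthesisLogEntry(
        timestamp=datetime.now(timezone.utc).isoformat(),
        requester=requester,
        content_category=content_category,
        use_case=use_case,
        consent_subject=consent.subject,
    )
```

In practice each `SynthesisLogEntry` would be appended to an immutable audit store rather than returned in memory, and revocation would also trigger deletion of the voice model itself, which no log schema can do for you.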

Use cases that are clearly fine

Audiobook narration with the author's voice (and consent). Multilingual versions of training videos with the original speaker's voice. IVR systems with employee voices. Internal accessibility (text-to-speech of meeting notes in a chosen voice).

Use cases that will get you sued

Cloning a public figure without consent. "Deepfake" calls of executives requesting wire transfers (already prosecuted as wire fraud). Marketing testimonials in a celebrity's voice. Cloning a deceased person without estate authorization.

Bottom line

The technology is trivial; the legal exposure is real and growing. Treat voice clones as IP-encumbered assets that require contracts, retention, and revocation. Build the process before the regulator builds it for you.

Want this kind of AI clarity for your team?

Creative Genius builds custom AI agents, automation, and data pipelines for ambitious businesses.

Get Started