Your Name, Your Face, Your Voice: The Creator's Complete Guide to Right of Publicity in 2026

TL;DR

The right of publicity protects you, the person, not your work. Your name, face, voice, and identity. State law patchwork (California, Tennessee, New York) plus federal landscape (TAKE IT DOWN Act passed; NO FAKES Act pending). What every creator should have in place before the next AI voice clone or deepfake surfaces.

Aghil Ebrahimi, Esq.
Licensed in California · Ontario · Quebec · ~19 min read


Most creators think about copyright when they think about protecting their work. Copyright protects the things you make: the song, the post, the video, the album. The right of publicity protects something different. It protects you, the person behind the work: your name, your face, your voice, your image, and the persona you have spent years building.

Those are two separate legal tools. You need both.

I am a practicing attorney licensed in California, Ontario, and Quebec. I am also a touring artist with over 500,000 followers who has signed performance contracts, management agreements, and brand deals from both sides of the table. The right of publicity is not an abstract concept in my practice. It is the question my clients ask me the week after something goes wrong: a voice clone surfaced on a streaming platform, a deepfake ran in an ad, a brand used footage from a fan video without consent. This guide explains what the right of publicity covers, where the law stands in 2026, and what you can do before the incident happens rather than after.


What the right of publicity actually protects

The right of publicity gives you the legal authority to control the commercial use of your identity. The specific elements protected vary slightly by state, but the core list is consistent: your name, your voice, your signature, your photograph, and your likeness.

California's statutory version, Civil Code Section 3344, names them explicitly: "name, voice, signature, photograph, or likeness." New York's Civil Rights Law Section 50 uses "name, portrait, picture, likeness, or voice." Tennessee's Personal Rights Protection Act, which was updated in 2024 to specifically address AI voice cloning, covers name, photograph, and likeness.

The right of publicity is a state law right, not a federal one. That distinction matters enormously, and we will get to it.

The right does not protect your creative output, which is copyright's job. It protects your identity: the thing that is distinctly and recognizably you, regardless of what you have created or whether you have registered anything with the government.


Three tools. Three different jobs.

Copyright protects original creative works: songs, videos, photographs, written content. It arises automatically when you fix an original work in a tangible medium. It gives you the exclusive right to reproduce, distribute, and display that work. It does not protect your voice as a sound, your face as a physical fact, or your name as an identifier.

Trademark protects brand identifiers used in commerce: your artist name, your logo, a signature phrase, even a distinctive sound mark. It requires either use in commerce (for common law protection) or registration with the USPTO (for statutory protection). A federal trademark registration gives you access to federal courts under the Lanham Act. It does not protect the underlying person behind the brand.

Right of publicity protects the person. No registration required. No creative output required. The protection exists because you exist, and because your identity has commercial value.

A complete protection stack uses all three. Copyright covers the work. Trademark covers the brand. The right of publicity covers you.

In the AI context, these tools overlap but do not duplicate each other. If someone creates an AI clone of your voice to sell a product, that implicates your right of publicity. If they train an AI model on your recordings without permission, that implicates copyright. If the AI output is used in a way that falsely implies your endorsement, that can also implicate the Lanham Act's false endorsement doctrine under 15 U.S.C. § 1125(a). The three claims often coexist.


California: the anchor state

California has the most developed and most litigated right of publicity law in the country, for reasons that are not complicated: the entertainment industry is here.

The statutory right: Civil Code Section 3344. Any person who knowingly uses another's name, voice, signature, photograph, or likeness for advertising, selling, or soliciting without prior consent is liable. The damages floor is the greater of $750 or actual damages, plus any profits attributable to the unauthorized use, plus attorney fees and costs. Punitive damages are available. Injunctive relief is available and, once granted, requires removal within two business days. The statute was last amended by SB 683, effective January 1, 2026.
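The recovery structure of Section 3344 can be sketched as simple arithmetic. The snippet below is illustrative only: the function name and inputs are my own, and it deliberately ignores punitive damages and judicial discretion, which no formula captures.

```python
def section_3344_recovery(actual_damages: float,
                          attributable_profits: float,
                          fees_and_costs: float) -> float:
    """Back-of-the-envelope sketch of Cal. Civ. Code § 3344 recovery:
    the greater of the $750 statutory floor or actual damages, plus
    profits attributable to the unauthorized use (to the extent not
    already counted in actual damages), plus attorney fees and costs.
    Punitive damages are available separately and are not modeled here."""
    STATUTORY_FLOOR = 750.00
    base = max(STATUTORY_FLOOR, actual_damages)
    return base + attributable_profits + fees_and_costs

# Example: no provable actual damages, $4,000 in infringer profits,
# $2,500 in fees and costs → 750 + 4,000 + 2,500 = 7,250
print(section_3344_recovery(0, 4000, 2500))
```

The point of the floor is practical: even when actual damages are hard to prove, which is common for unauthorized uses caught early, the statute guarantees a minimum recovery plus fees, which is what makes smaller claims worth pursuing.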

The common law right. California also recognizes a broader common law right of publicity alongside the statute, established by Eastwood v. Superior Court, 149 Cal. App. 3d 409 (1983). The common law version does not require a "knowing" use and may reach circumstances the statute does not. A claim can be brought under both theories simultaneously.

Posthumous rights: Civil Code Section 3344.1. California protects a deceased personality's right of publicity for 70 years after death. These rights are property rights that can be transferred by contract, trust, or will. They belong to heirs and estates and must be registered with the California Secretary of State before damages can be recovered. Section 3344.1 was amended by AB 1836, effective January 1, 2025, to add specific digital replica protections for deceased personalities. Under the updated statute, unauthorized digital replicas of a deceased personality carry a minimum damages floor of $10,000, higher than the $750 floor for living persons under Section 3344.

What it does not cover. The right of publicity does not cover uses in connection with news, public affairs, sports reporting, or political campaigns. It does not cover uses that are so tangentially commercial that the connection to advertising or selling is not direct. And it must be balanced against First Amendment interests. Commentary, criticism, satire, and parody on matters of public concern have been recognized as protected expression even when they involve an identifiable person.


The state law patchwork

The right of publicity is a creature of state law. No federal right exists. Roughly half of US states recognize it in some form, either by statute, common law, or both. The other half provide limited or no protection.

Tennessee has the strongest purpose-built protection for recording artists and performers. Tennessee's Personal Rights Protection Act (Title 47, Chapter 25, Part 11 of the Tennessee Code) protects individuals' property rights in the commercial use of their name, photograph, and likeness. The Act was updated in 2024 by the ELVIS Act (Ensuring Likeness, Voice, and Image Security Act), which explicitly names AI voice cloning as a prohibited use. Tennessee also provides indefinite posthumous protection as long as the rights are actively being exploited, making it the strongest post-mortem framework currently available in any US state.

New York recognizes the right of publicity under Civil Rights Law Section 50 and Section 51. Section 50 makes commercial use of a living person's name, portrait, picture, likeness, or voice without consent a misdemeanor. Section 51 provides the civil cause of action for injunctive relief and damages. New York's statutory protection applies only to living persons. New York does not have a robust posthumous right of publicity, a gap that has grown more consequential as AI tools make unauthorized use of deceased performers' likenesses a genuine commercial concern for estates.

Texas, Indiana, and Florida all have statutory right of publicity laws with varying scopes and terms. Most states with strong statutory frameworks focus primarily on living persons and commercial use.

States without strong protection leave creators relying on common law misappropriation claims, which are narrower, less predictable, and harder to enforce than statutory rights. A creator harmed by an AI voice clone in a state without a strong right of publicity statute is in a materially worse legal position than one harmed in California or Tennessee.

The practical implication: where you are based, where the harm occurs, and which state's law governs your contracts all matter. These are not academic distinctions.


Where Ontario and Quebec stand

StarGuard Law is licensed in three jurisdictions. The right of publicity does not map cleanly onto Canadian law, but creators and artists working in Ontario and Quebec are not without recourse.

Ontario does not have a standalone statutory right of publicity. Protection comes through common law. Ontario courts have recognized the tort of appropriation of personality, established in cases including Krouse v. Chrysler Canada Ltd. (1973, Ontario Court of Appeal) and further developed in Athans v. Canadian Adventure Camps (1977). The tort allows a claim when a person's identity is appropriated for commercial purposes without consent. It is narrower than California's statutory right and lacks the same damages floor and attorney fees provisions, but it exists and has been applied.

Quebec is a civil law jurisdiction. The Civil Code of Québec expressly protects personality rights. Article 3 establishes that every person holds personality rights, including the right to respect of one's name, reputation, and privacy. Article 35 recognizes the right to privacy. Article 36 provides a non-exhaustive list of specific privacy violations, including using a person's name, image, likeness, or voice for purposes other than the legitimate information of the public. Quebec's Charter of Human Rights and Freedoms, Article 5, also protects privacy. A Quebec creator whose voice or likeness is used commercially without consent has a cognizable civil law claim even in the absence of a California-style right of publicity statute.

The Canadian dimension matters in practice. Cross-border brand deals, streaming releases, and talent agreements that involve parties in Ontario or Quebec may benefit from Canadian law protections in addition to US state law claims, depending on the governing law clause and where the harm occurs.


The federal landscape in 2026

At the federal level, two relevant legislative developments mark the current moment. One became law. One is still pending.

The TAKE IT DOWN Act (Public Law 119-12). Signed into law May 19, 2025. This statute prohibits the non-consensual online publication of intimate visual depictions, including AI-generated deepfakes, and requires covered platforms to remove reported content within 48 hours of notification. Criminal penalties and mandatory restitution apply to violators. This is meaningful protection for victims of non-consensual intimate imagery. It is not protection against commercial AI voice cloning, brand impersonation, or the unauthorized use of a creator's identity to sell a product. The TAKE IT DOWN Act addresses a specific category of harm. It does not address the broader commercial identity exploitation that the right of publicity covers.

The NO FAKES Act (S.1367). Introduced April 9, 2025 by Senator Coons (D-DE). Currently referred to the Senate Judiciary Committee. As of May 2026, this bill has not passed either chamber. It would create a federal property right in an individual's voice and visual likeness, making the production and distribution of unauthorized AI replicas a federal cause of action available to every person regardless of their celebrity or commercial prominence. The bill would not preempt stronger state laws. If it passes, it would close the geographic gap that currently leaves creators in states without strong right of publicity laws with fewer remedies than creators in California or Tennessee. It has not passed.

The US Copyright Office's position. In July 2024, the Copyright Office released Part 1 of its AI report, focused specifically on digital replicas. The Office explicitly concluded that "a new law is needed" and that "the speed, precision, and scale of AI-created digital replicas calls for prompt federal action." The report cited the "Fake Drake" incident as an early example: an AI-generated song using cloned vocals of two prominent artists attracted over 15 million social media views before either artist was aware it existed. The Copyright Office recommended a federal statute covering all individuals, not just celebrities, focused on the distribution and making-available of unauthorized digital replicas rather than their creation alone.

That gap, between the Copyright Office's July 2024 recommendation and a bill still unpassed as of May 2026, is the current state of federal law on this question.


What "commercial use" means, and the First Amendment limit

The right of publicity does not grant a blanket veto over every use of your identity. It applies to commercial uses: advertising, selling, soliciting purchases, merchandise. A brand using your likeness to sell a product without consent is a textbook violation. An AI company using your voice to generate a song and selling access to the output is in disputed but legally actionable territory.

The First Amendment limits the right's reach on the other end. Commentary, criticism, satire, and parody are constitutionally protected forms of expression, even when they involve an identifiable person's name or likeness. A political cartoon, a satirical sketch, a news story, a documentary: these generally fall outside the right of publicity's scope. Courts balance the commercial purpose of the use against the expressive value of the work.

The line can be genuinely difficult to draw in individual cases. A brand running an ad that uses a creator's AI-cloned voice to sell a product is clearly commercial. An AI-generated comedy video that uses a recognizable voice to parody a public figure occupies more contested ground. Where the use sits on that spectrum is a legal analysis, not a bright line.


The transactional layer: your strongest defense is the contract

The right of publicity is a legal tool you use after harm has already occurred. The contract is the legal tool you use to prevent the harm from occurring in the first place, or at least to ensure that if it does, you have clear grounds to act.

Every agreement that gives another party access to your voice, your likeness, or your image should include explicit terms addressing AI. The categories to address:

Scope of use. The license should specify exactly what the grantee can do with your voice, image, or likeness. Platform, medium, territory, duration. Anything outside those defined parameters requires your written consent.

AI training data prohibition. The agreement should explicitly prohibit using any recordings, images, or footage of you to train any AI model, generate any synthetic output, or create any digital replica, without a separate written consent for that specific use.

Cloning and digital replica prohibition. The agreement should state that no AI-generated replica, voice clone, deepfake, or synthetic likeness may be created from materials provided under the agreement without express written consent.

Post-term reversion. Licenses should expire. If the agreement grants any ongoing rights to your voice or likeness, define when those rights terminate. Perpetual, irrevocable licenses are the version to negotiate against.

Remedies language. Include a clause specifying that unauthorized use of your identity or likeness constitutes a material breach and entitles you to specific performance (removal of the content) as well as damages.

This kind of contract architecture does not eliminate the risk of unauthorized use. It documents that the use was unauthorized, which simplifies both the demand letter and any subsequent legal action. It also creates a contractual remedy that is often faster and more certain than pursuing a statutory right of publicity claim.

The agreements where this language matters most: talent releases, brand deal agreements, management agreements, touring contracts (which increasingly include livestream and recording provisions), NIL deals, and any agreement that involves an AI company's access to your recordings or footage.


What you can do right now

You do not need to wait for a harm to start building your protection.

Audit your existing agreements. Review any management, brand deal, or talent agreement you have signed in the past three years. Look for AI carve-out language. If it is absent, the absence is information you can use when the agreement comes up for renewal or renegotiation.

Include AI language in every new agreement. Every agreement that involves your voice, face, image, or recordings should include an explicit AI prohibition or a clearly scoped AI consent clause. This is not complex. It is a few sentences. Your attorney can add it as a standard addendum.

Document your identity assets. If your voice, likeness, or name has commercial value, document it. This includes maintaining records of how you use these assets commercially, registering trademarks for distinctive marks where warranted, and keeping copies of all agreements that involve your identity.

Register trademarks for commercially valuable identity elements. If you have a signature phrase, a distinctive sound mark, or a name you use commercially, federal trademark registration adds a layer of protection. It gives you Lanham Act access to federal courts and creates a presumption of validity that a state right of publicity claim alone does not. This is not a substitute for the right of publicity. It is a complement to it.

Know which jurisdiction's law applies to you. If you are based in California, you have strong statutory protection. If you are in a state without a robust right of publicity statute, your protection depends more heavily on contract and common law. The geographic variation matters for how you structure agreements and how quickly you need to move if something goes wrong.


When to contact an attorney

Some situations call for immediate legal intervention:

  • An AI-generated clone of your voice is circulating on streaming platforms or in advertisements
  • A deepfake video of you is being used commercially
  • A brand or company has used your likeness in marketing without your consent
  • An agreement you are about to sign includes an AI use provision you do not fully understand
  • A talent agreement, management contract, or brand deal lacks AI carve-out language and involves significant monetary value

For situations involving active misuse, time matters. AI-generated content spreads across platforms quickly, and evidence disappears. If you discover unauthorized use, preserve the evidence before contacting the source of the infringement directly. Screenshots, URLs, timestamps, and any identifying information about the underlying account all have evidentiary value.
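For creators comfortable running a script, the preservation step can be as simple as an append-only evidence log that timestamps each capture and hashes the screenshot file, so the file can later be shown to be unaltered. This is a minimal sketch, not a forensic standard; the function name, field names, and log filename are all my own choices.

```python
import hashlib
import json
from datetime import datetime, timezone

def log_evidence(url: str, account_handle: str, screenshot_path: str,
                 notes: str = "") -> dict:
    """Append one evidence record to a local log with a UTC timestamp
    and a SHA-256 hash of the screenshot file. Illustrative only:
    it does not replace platform reports or attorney-directed preservation."""
    with open(screenshot_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    entry = {
        "captured_at_utc": datetime.now(timezone.utc).isoformat(),
        "url": url,
        "account": account_handle,
        "screenshot_sha256": digest,
        "notes": notes,
    }
    with open("evidence_log.jsonl", "a") as log:
        log.write(json.dumps(entry) + "\n")
    return entry
```

The hash matters because content that is later deleted or edited cannot be re-captured; a timestamped hash of the original screenshot is the cheapest way to document that what you saved is what you saw.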

For situations involving contracts about to be signed, the review should happen before you sign. A contract that has already been executed can be renegotiated or challenged, but the leverage is lower and the path is longer.


Frequently asked questions

Is AI voice cloning illegal?

AI voice cloning is illegal when used commercially without consent under the right of publicity laws in California, Tennessee, New York, and other states. California Civil Code § 3344 names voice explicitly as a protected element. Tennessee's ELVIS Act, effective July 1, 2024, specifically targets unauthorized AI voice cloning. There is no general federal ban yet. Non-commercial parody, news, and commentary are typically protected by the First Amendment. The federal TAKE IT DOWN Act covers AI-generated intimate imagery only, not general commercial voice cloning.

Does California protect me from AI voice cloning?

Yes. California Civil Code § 3344 protects your name, voice, signature, photograph, and likeness from unauthorized commercial use, with damages starting at $750 plus profits. The statute also awards attorney fees and allows punitive damages. California separately recognizes a parallel common-law right under Eastwood v. Superior Court (1983). Posthumous protection runs 70 years after death under section 3344.1, which was amended by AB 1836 effective January 1, 2025 to add specific protection against unauthorized digital replicas of deceased persons.

What should I do if someone made a deepfake of me?

Preserve the evidence first, then file a takedown with the platform hosting the content while you assess your legal options. Capture screenshots, URLs, timestamps, and the source account before reporting. Most major platforms (YouTube, TikTok, Meta, X) have AI-impersonation reporting tools. If the content is intimate, the federal TAKE IT DOWN Act (Public Law 119-12, signed May 19, 2025) requires covered platforms to remove it within 48 hours of notice. For commercial misuse, a demand letter citing your state's right of publicity law is usually the next step.

Can I sue someone for making an AI video of me?

Yes. State right of publicity laws give you a civil claim when your identity is used commercially without consent. California, Tennessee, New York, and roughly 20 other states have statutory rights; the rest rely on common-law misappropriation. Most cases start with a demand letter and platform takedown rather than a lawsuit, because evidence preservation and removal speed matter more than court timelines. If federal claims also apply (Lanham Act false endorsement under 15 U.S.C. § 1125(a)), a federal action becomes available. StarGuard Law handles the demand-letter and contract-architecture side; litigation is referred to outside counsel.

Does the NO FAKES Act protect me yet?

No. The NO FAKES Act (S.1367) was introduced April 9, 2025 and remains in the Senate Judiciary Committee as of May 2026. If passed, it would create a federal property right in voice and visual likeness, available to every person regardless of celebrity status, and would not preempt stronger state laws. Until it passes, your protection comes from state right of publicity laws, the Lanham Act for false endorsement, and the federal TAKE IT DOWN Act for intimate imagery only. Contracts with explicit AI carve-outs remain the strongest preventive layer.


StarGuard Law handles right of publicity matters on the advisory and transactional side: demand letters, multi-statute takedown campaigns, and the consent architecture in your management, brand deal, talent, and AI-use agreements. If you have an active incident or an agreement on the table with identity-rights implications, the strategy call is where the analysis starts: Right of Publicity & AI Likeness Protection.


Aghil is a lawyer licensed in California, Ontario, and Quebec. He is also a touring artist with 500,000+ followers. The content in this article is for informational purposes and does not constitute legal advice. For advice about your specific situation, consult a licensed attorney.



About the author

Aghil Ebrahimi, Esq.

Founder of StarGuard Law. Trilingual IP and technology attorney licensed in California, Ontario, and Quebec. Former touring artist and tech founder who now represents creators, founders, and agencies at the intersection of law, technology, and culture.

Work With Me

Think this applies to your situation?

Book a free discovery call