AI licensing consent standard adds June registry for machine-readable consent


Hollywood is putting fresh weight behind an AI licensing consent standard designed to answer one of generative AI’s biggest questions: who gets to say yes, no, or pay up when a person’s face, voice, characters, or creative work are used by machines. The new Human Consent Standard arrives with support from major names including George Clooney, Tom Hanks, and Meryl Streep, giving the effort immediate visibility far beyond the usual tech-policy crowd.

That star power matters because the fight over AI use of human-created material is no longer abstract. It now reaches into likeness rights, creative ownership, and the growing tension between AI scraping and permission. In practice, the Human Consent Standard is being pitched as a practical way for rights holders to set terms before their identity or work gets absorbed into AI systems.

Behind it is RSL Media, a nonprofit cofounded by Cate Blanchett. The group is overseeing the standard and tying it to a broader push to make consent readable not just by lawyers and platforms, but by AI systems themselves.

What the Human Consent Standard does

At its core, the Human Consent Standard lets people define the terms for how AI can use their likeness, creative work, characters, and designs.

Those terms can range from full permission to conditional access or outright restriction. In other words, the system is meant to create a machine-readable way for creators and public figures to express whether AI use is allowed, prohibited, or requires permission.

That is the central promise of this AI licensing consent standard: turning a messy rights question into something clearer and easier for systems to recognize.
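To make that idea concrete, a machine-readable declaration might look something like the sketch below. The article does not publish the standard's actual schema, so every field name and value here is hypothetical:

```json
{
  "subject": "person:example-performer",
  "verified": true,
  "permissions": {
    "likeness": "deny",
    "voice": "requires-license",
    "published-works": "allow"
  }
}
```

The point is less the exact format than the three-way distinction the standard describes: allowed, prohibited, or permission required.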

A registry launching in June is expected to play a key role. People will be able to verify their identity there and set permissions for the use of their likeness and creative works, and AI systems will then check a Human Consent Standard declaration against that registry.

RSL Media says it will translate those permissions into signals AI systems can read.

Who is backing the Human Consent Standard

The Human Consent Standard is entering the market with unusually high-profile support.

Named backers include:

  • George Clooney
  • Viola Davis
  • Tom Hanks
  • Kristen Stewart
  • Steven Soderbergh
  • Meryl Streep

The initiative also has support from organizations including Creative Artists Agency and the Music Artists Coalition.

This is one reason the launch stands out. AI rights tools often arrive as technical or legal infrastructure with little public attention. Here, Hollywood talent is helping frame the issue as a mainstream rights problem, not just a niche compliance exercise.

RSL Media, the nonprofit overseeing the standard, was cofounded by Cate Blanchett and Eckart Walther. The organization is also connected to the broader licensing framework behind the project.

How the RSL Standard framework works

The Human Consent Standard builds on the Really Simple Licensing Standard, also known as the RSL Standard, which launched last year. That earlier framework was designed to let websites signal how AI systems may use their content.

The new layer pushes that concept beyond a single web page or URL.

According to Walther, the Human Consent Standard can be discovered through a website’s robots.txt file, the file commonly used to tell web and AI crawlers whether they may scrape content. But unlike the original RSL approach, which usually applies to content at a specific URL, this newer model is meant to apply to the underlying work, identity, character, or mark itself, wherever it appears.

That shift is significant. It means the system is aiming to follow the asset or person, not just the page where the material happens to live.
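Based on that description, discovery could look something like a license pointer inside robots.txt. The directive name and URL below are illustrative, not the standard’s official syntax:

```text
# robots.txt for example-studio.com (illustrative sketch)
User-agent: *
Allow: /

# Hypothetical pointer to a machine-readable consent/licensing
# document, in the spirit of the RSL Standard's robots.txt signaling.
License: https://example-studio.com/.well-known/consent.xml
```

A crawler that honors the scheme would fetch the referenced document and apply its permissions before ingesting anything from the site.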

How AI systems are expected to read the declarations

The mechanics described so far are straightforward on paper:

  • a declaration can be surfaced through robots.txt
  • AI systems can check that declaration against a registry launching in June
  • the registry will allow identity verification and permission-setting
  • RSL Media will translate those permissions into signals AI systems can read
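The flow those bullets describe can be sketched in a few lines of code. This is a minimal illustration, not the published protocol: the subject identifiers, permission values, and in-memory "registry" are all stand-ins for the real registry launching in June.

```python
# Hypothetical sketch of how an AI system might honor a consent
# declaration before using a work. All identifiers and field names
# are illustrative, not the standard's actual schema.

ALLOWED = "allow"            # AI use permitted
CONDITIONAL = "conditional"  # AI use requires permission/license
DENIED = "deny"              # AI use prohibited

# Stand-in for the registry: maps a verified identity to the
# permissions that identity has set for each asset type.
registry = {
    "person:example-performer": {
        "likeness": DENIED,
        "voice": CONDITIONAL,
        "published-works": ALLOWED,
    }
}

def check_consent(subject: str, asset_type: str) -> str:
    """Return the declared permission for an asset, defaulting to
    'conditional' (ask first) when no declaration is found."""
    declaration = registry.get(subject)
    if declaration is None:
        return CONDITIONAL
    return declaration.get(asset_type, CONDITIONAL)

print(check_consent("person:example-performer", "likeness"))  # deny
print(check_consent("person:example-performer", "voice"))     # conditional
```

Note the conservative default: when no declaration exists, the sketch treats use as requiring permission rather than assuming consent, which matches the standard's framing of consent as something to be granted rather than presumed.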

Why the AI licensing consent standard matters for creators

For creators, performers, and rights holders, the appeal is obvious: more direct control over AI likeness rights and use of original work.

The Human Consent Standard is trying to create a common permissions language at a moment when many artists feel AI systems have moved faster than the rules around consent. That helps explain why actors, filmmakers, and industry groups are rallying around it now.

It also matters because the standard is not framed only for celebrities. Blanchett said RSL Media is intended as a practical solution where people everywhere, not just public figures, can assert control over how their work is used by AI. If that promise gains traction, the framework could widen the rights conversation beyond marquee names and into the broader creator economy.

There is also a strategic point here. By building on the Really Simple Licensing Standard, RSL Media is not starting from scratch. It is extending an existing structure that already tried to make AI-use permissions legible on the web. That could make adoption easier for sites and rights holders already familiar with robots.txt-based signaling.

The broader fight over likeness rights and permission

The launch lands as public figures are already looking for ways to guard against unauthorized AI use.

Some artists and actors have already taken separate protective steps. Matthew McConaughey trademarked clips of himself, while Taylor Swift applied for a trademark covering a photo of herself and two soundbites.

That context helps explain why this new AI licensing consent standard is drawing attention. The industry is looking for a system that does not require every dispute to become a custom legal battle. A standardized permissions layer, if widely used, could give AI companies a clearer way to identify what is open, what is restricted, and what requires a license.

That does not settle the larger debate over AI and ownership. But it does push the conversation toward infrastructure instead of reaction. And with Human Consent Standard declarations set to be checked against a registry in June, the next real test will be whether AI systems and rights holders start treating those signals as part of the basic rules of using creative work online.
