Content Signals

A proposed standard from Cloudflare that would let publishers express preferences about how AI systems use their content: whether to train on it, cite it, summarize it, or display it in AI-generated answers. It is more granular than robots.txt but different in scope from llms.txt: where llms.txt says "here's what's on my site," Content Signals would say "here's what you're allowed to do with it."
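In the current draft, the signals ride along inside robots.txt as a `Content-Signal` line. A minimal sketch is below; the signal names (`search`, `ai-input`, `ai-train`) follow the published draft as I understand it, but the proposal is still evolving, so check the spec before relying on exact names or syntax:

```
# Hypothetical robots.txt carrying Content Signals (sketch, not a canonical example).
# "yes"/"no" values express the publisher's preference for each use.
User-Agent: *
Content-Signal: search=yes, ai-input=yes, ai-train=no
Allow: /
```

Note that, like robots.txt itself, these signals express preferences rather than enforce them; compliance depends on the crawler honoring the line.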

Why it matters for writers: Content Signals represents one possible future for how publishers control AI's use of their work. It competes with (and partially overlaps) CC Signals and the IETF's aipref effort: multiple standards trying to solve related problems in incompatible ways. Standards fragmentation is the technical term; "too many cooks" is the accessible one.

Related terms: llms.txt · CC Signals · IETF aipref · robots.txt