Controlled Natural Language
A subset of a natural language (like English) that restricts vocabulary, grammar, or both to eliminate ambiguity and make text easier for machines to process. Think of it as English with guardrails: you can still read it, but the rules prevent the kind of vagueness that trips up automated systems. Examples range from Simplified Technical English (used in aerospace maintenance manuals since the 1980s) to modern systems that compress verbose prose into dense, machine-optimized strings.
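The vocabulary-restriction idea can be sketched in a few lines. The approved-word list below is a hypothetical stand-in; a real standard like Simplified Technical English defines roughly 900 approved words plus grammar rules, not the handful shown here.

```python
import re

# Hypothetical controlled vocabulary; real CNL standards are far larger
# and also constrain grammar, not just word choice.
APPROVED = {
    "open", "close", "the", "valve", "before", "you", "start",
    "engine", "do", "not", "turn",
}

def unapproved_words(sentence: str) -> list[str]:
    """Return words in the sentence that fall outside the controlled vocabulary."""
    words = re.findall(r"[a-z]+", sentence.lower())
    return [w for w in words if w not in APPROVED]

print(unapproved_words("Open the valve before you start the engine."))  # []
print(unapproved_words("Commence aperture of the valve."))
```

A checker like this is what makes a CNL machine-friendly: the rules are explicit, so a program can flag every sentence that breaks them.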
The tradeoff is expressiveness for precision. Natural language is rich, ambiguous, and context-dependent—qualities that make it wonderful for novels and terrible for machine processing. A controlled natural language trades some of that richness for deterministic interpretability. The text says exactly what it means, and a parser can prove it.
Why it matters for writers: Controlled natural languages show up in AI pipelines as compression tools: transform wordy content into token-efficient representations that preserve meaning while reducing cost. Haiku Protocol is one example, compressing document content into dense strings that fit more information into smaller context windows. For writers, understanding CNL means understanding that there's a spectrum between "writing for humans" and "writing for machines"—and that the interesting work happens in the middle, where both can read the output.
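The compression step can be illustrated with a toy sketch. The phrase map and output below are illustrative inventions, not the Haiku Protocol's actual scheme; the point is only that mechanical rewriting toward a denser, controlled form shrinks token count while preserving meaning.

```python
# Hypothetical verbose-to-dense phrase map for illustration only.
PHRASE_MAP = {
    "in order to": "to",
    "at this point in time": "now",
    "due to the fact that": "because",
    "is able to": "can",
}

def compress(text: str) -> str:
    """Replace verbose phrases with shorter equivalents and collapse whitespace."""
    out = text.lower()
    for verbose, dense in PHRASE_MAP.items():
        out = out.replace(verbose, dense)
    return " ".join(out.split())

before = "In order to proceed, confirm that the system is able to restart."
after = compress(before)
print(len(before.split()), "->", len(after.split()))  # 12 -> 8
```

Real pipeline compressors are more sophisticated, but the economics are the same: fewer tokens for the same meaning means lower cost and more room in the context window.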
Related terms: Token · Context Window · Prompt Engineering