Obey The Rules or Die
Writing rules isn't the hard part. Following them is. How iterative compliance review turns team standards into enforced guarantees.
Your team spent weeks building a comprehensive set of Cursor rules. Communication standards, API patterns, component conventions, testing requirements. The rules are thorough. They cover edge cases. Everyone agrees they're valuable.
Then you review the AI's output and find hardcoded strings where constants should be, abbreviated variable names your communication standards explicitly forbid, and import patterns from a framework you deprecated three months ago. The rules exist. They just weren't followed.
This is the gap between having standards and enforcing them. We call the enforcement pattern "Obey The Rules or Die."
Why Rules Get Missed
AI agents process rules at the start of a session, then get absorbed in the task. The deeper the agent gets into implementation—solving type errors, chasing edge cases, debugging tests—the further it drifts from the rules it loaded at the beginning.
Human developers do the same thing. A senior engineer who authored your coding standards will still violate them when they're deep in a debugging session at 4pm. The standards live in a document. The implementation lives in the moment.
The Core Insight
Enforcement can't depend on the agent remembering the rules mid-task. It has to happen as a deliberate, exhaustive review at the exact moment the agent believes the work is finished.
The Exhaustive Compliance Review
The pattern is a Cursor rule itself—one that triggers at a specific moment: when the agent is about to declare work complete. Before saying "done," the agent must discover every rule that applies to what it did, read each one, and verify compliance. The cycle repeats until a full pass finds zero violations.
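The cycle above can be sketched as a loop. This is a minimal illustration, not the actual rule text; the toy rule, its checker lambdas, and the fixer are all hypothetical:

```python
# Minimal sketch of the compliance review loop. Each toy "rule" bundles an
# applicability check (semantic: does this rule have an opinion about what
# was done?), a violation check, and a fix. The loop repeats until one full
# pass over every rule finds zero violations.
def compliance_review(diff, rules):
    passes = 0
    while True:
        passes += 1
        clean = True
        for applies, violates, fix in rules:
            if applies(diff) and violates(diff):
                diff = fix(diff)
                clean = False
        if clean:
            return diff, passes  # only now may the agent say "done"

# Toy rule targeting the recurring offender: a hardcoded MIME string.
rules = [(
    lambda d: "file.type" in d,                       # did I touch file types?
    lambda d: "'application/pdf'" in d,               # hardcoded string present?
    lambda d: d.replace("'application/pdf'", "PDF_MIME_TYPE"),
)]

fixed, passes = compliance_review("if (file.type === 'application/pdf')", rules)
# fixed == "if (file.type === PDF_MIME_TYPE)", clean on the second pass
```

The fixing pass dirties the state, so the loop always ends with one provably clean pass over every rule.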
The Review Process
The critical distinction: the agent decides which rules apply based on what it did, not where the files live. A database optimization rule applies when you write a query—regardless of whether the file is in models/ or handlers/ or utils/. A Pydantic rule applies when you write a Pydantic schema, even in a file that no glob pattern anticipated.
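As a sketch, an applicability test keyed to content rather than location might look like this; the marker strings and code snippets are invented for illustration:

```python
# Semantic applicability: does the change contain a query, wherever it lives?
# The file path never enters the decision.
def db_rule_applies(source: str) -> bool:
    return "SELECT" in source or "session.query" in source

# Applies in a handler...
assert db_rule_applies("rows = conn.execute('SELECT id FROM users')")
# ...and equally in a utility file no glob pattern anticipated.
assert db_rule_applies("stale = session.query(User).filter_by(active=False)")
```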
Why the Loop Matters
A single pass isn't enough. Fixing one violation can introduce another, and a fix made to satisfy one rule can conflict with a rule not yet checked. Only a complete pass with zero findings proves the work is actually clean.
The Recurring Offender: Hardcoded Strings
Across both frontend and backend codebases, the most frequently missed violation is hardcoded strings that should be constants, enums, or type unions. It's the violation we call out specifically in the rule because AI agents miss it so consistently.
Before:

```javascript
// Frontend
if (file.type === 'application/pdf')
```

```python
# Backend
if media.resource_type == "image":
if channel.source == "asi_one":
```
After:

```javascript
// Frontend
if (file.type === PDF_MIME_TYPE)
```

```python
# Backend
if media.resource_type == Media.IMAGE:
if channel.source == Channel.ASI_ONE:
```
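The fixed versions assume each value is defined once and imported everywhere it is compared. A minimal sketch of those backend definitions; the names mirror the snippets above, but the class shapes are assumptions:

```python
# Hypothetical single-source-of-truth definitions behind the "after" snippets.
PDF_MIME_TYPE = "application/pdf"

class Media:
    IMAGE = "image"  # a real codebase might prefer enum.Enum or a Literal union

class Channel:
    ASI_ONE = "asi_one"
```

Changing a value now happens in one place, and a typo becomes an attribute error instead of a comparison that is silently always false.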
Why does this violation persist? Because the hardcoded string works. The code compiles. Tests pass. The behavior is correct. The violation is about maintainability and consistency—qualities that don't surface as errors in the moment but compound as the codebase grows.
Rules Are Accumulated Team Wisdom
Every rule in a well-maintained rule set exists because someone learned something the hard way. A communication standard about spelling out acronyms exists because a cross-functional team spent time confused by abbreviations that meant different things to different disciplines. A framework pattern exists because a developer introduced a subtle bug by using the wrong import path.
Rules Prevent Repeat Mistakes
Each rule encodes a lesson. Skipping a rule means repeating the mistake that prompted its creation. The compliance review ensures the lesson sticks.
Rules Transfer Context
AI agents start each session with zero project history. Rules are the mechanism for transferring accumulated team knowledge into a fresh context window.
Rules Scale People
A team of three engineers can enforce the standards of a team of thirty when those standards are codified and automatically reviewed.
The "or die" framing is deliberate. It communicates that compliance isn't optional—it's the minimum bar for declaring work complete. A pull request that violates known rules isn't a pull request that needs "minor fixes." It's a pull request that isn't finished yet.
Why Not a Fixed Checklist?
Our first version of this rule was a static table: seven numbered passes, each checking a specific rule file. It worked initially. Then new rules got added. Folder structures changed. The table didn't. We had 40+ rules in the project, and the review was checking seven of them.
We tried replacing the hardcoded list with metadata: use the globs field in each rule's frontmatter to match against changed file paths. Better, but still brittle. A database optimization rule with a glob of app/**/handlers/*.py won't catch a query you wrote in a utility file. Globs encode where rules usually apply, not where they actually apply.
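The miss is easy to reproduce. Here is a rough illustration using `fnmatch` as a stand-in for a glob matcher (`fnmatch` lets `*` cross path separators, unlike most glob engines, but the failure mode is the same); the file paths are invented:

```python
from fnmatch import fnmatch

glob = "app/**/handlers/*.py"          # where the rule *usually* applies
changed = [
    "app/api/handlers/users.py",       # matched: review triggers
    "app/utils/reporting.py",          # a query lives here too: silently skipped
]

in_scope = [path for path in changed if fnmatch(path, glob)]
# in_scope == ["app/api/handlers/users.py"]; the utility file escapes review
```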
The Breakthrough
The breakthrough was to stop encoding where rules apply at all. Instead, the agent reads every rule file during the review and judges for itself whether each rule applies to what it actually did in the session. This approach is self-maintaining. When someone adds a new rule, it automatically enters scope for future reviews: no index to update, no checklist to expand. The rules themselves are the source of truth, and the AI reads them directly.
Implementing the Rule
The compliance review rule works because of three properties: it's always applied, it triggers at a specific moment, and it demands semantic judgment.
Set It to Always Apply
The rule needs alwaysApply: true in its frontmatter. It can't be a rule the agent loads on demand—by the time the agent thinks to load it, the violations are already committed.
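In Cursor, that frontmatter sits at the top of the rule's `.mdc` file. A condensed sketch is below; the description wording and the body steps are illustrative, not our exact rule text:

```markdown
---
description: Exhaustive compliance review before any work is declared complete
alwaysApply: true
---

Before saying "done", "complete", or "all changes made":

1. List every rule file in the project.
2. For each rule, ask: did I do anything this rule has an opinion about?
3. Read each applicable rule in full and verify compliance.
4. Fix every violation, then restart the review. Repeat until a pass is clean.
```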
Instruct the Agent to Discover, Not Follow a List
Tell the agent to scan all rule files and assess applicability based on what it did—not based on file paths, globs, or a hardcoded checklist. The instruction is: "For each rule, ask: did I do anything that this rule has an opinion about?" This scales automatically as your rule library grows.
Call Out Your Recurring Offenders
Every team has them: the violations that show up more often than any others. Give them their own section with before/after examples. For our team, it's hardcoded strings and wrong database projection strategies. Yours might be missing error handling, inconsistent naming, or skipped tests.
Define the Trigger
The rule activates when the agent is about to say "done," "complete," or "all changes made." This timing is the entire mechanism. The agent must run the review before it can declare completion.
The Compound Effect
Each enforcement cycle makes the next one cheaper. As the agent internalizes patterns across sessions—and as your rule library grows more precise—the number of violations per cycle decreases. The first time you deploy this pattern, the agent might run three full cycles. After a month of consistent use, most reviews complete in a single pass.
The compound effect also applies to the rules themselves. When you catch a violation that no existing rule covers, you write a new rule. The next review cycle checks for that violation automatically. The rule library grows from lived experience, not theoretical best practices.
The Self-Improvement Loop
Catch a violation no rule covers, write the rule, and the very next review enforces it. Review output feeds the rule library, and the rule library sharpens the next review.
Beyond Code Quality
"Obey The Rules or Die" sounds like a code quality initiative. It's a culture initiative. The rule embodies a set of deeper principles:
Radical Transparency
Rules are written down, version-controlled, and visible to everyone. There's no tribal knowledge about 'how we do things.' The rules are the documentation of how we do things.
Kaizen (Continuous Improvement)
The iterative review loop is Kaizen applied to code review. Each cycle improves the output. Each new rule improves the next cycle. The process never stops getting better.
Accountability
Saying 'done' means something. It means every relevant rule was checked, every violation was fixed, and the review loop completed cleanly. No exceptions.
When a team operates this way, code reviews become faster. Reviewers spend less time catching basic violations because the compliance review already caught them. Review conversations shift from "you forgot to use the constant here" to "is this the right architectural approach?"—the conversations that matter.
Writing rules is an investment in your team's future. But rules without enforcement are suggestions, and suggestions erode under deadline pressure. The exhaustive compliance review transforms suggestions into guarantees: a structured, repeatable process that ensures every piece of work respects the standards your team agreed on.
Obey the rules. Every one of them. Every time. Or watch the standards you invested in become the documentation nobody reads.
Want to see the rule in action?
Our exhaustive compliance review rule is open and available. If you're building a Cursor rules library for your team, I'm happy to share the full set of rules that power our development workflow.