User flows and related documentation tell developers how users move through an application. Traditionally, this means Figma mockups, annotated screenshots, or whiteboard sketches. These artefacts work reasonably well for human developers who can infer intent from visual cues. AI coding assistants cannot see your whiteboard. They cannot extract meaning from a screenshot without descriptive captions.
Here is how user flow documentation differs when written for humans versus written for AI agents.
## The "Standard" Way (High Specification Debt)
Most teams document user flows visually. A designer creates mockups in Figma, adds arrows between screens, and shares a link in the project brief. This approach feels efficient because it leverages visual processing that humans excel at. But it creates a translation bottleneck when working with AI agents.
```text
See attached Figma file: checkout-flow-v3.fig

The user goes from the cart to the shipping page, then
to payment, then to confirmation. There's a back button
on each step. The design team has added notes in the
Figma comments for edge cases.
```
A large share of real-world software issues are rejected from AI benchmarking datasets because they are underspecified. Visual-only documentation without text equivalents is a primary culprit.
### Why This Fails for AI
Let us be honest about what happens when you ask an AI agent to implement a feature based on a Figma link. The AI cannot access your Figma account. Even if it could, it would see pixel data, not semantic structure. It has no way to know that "the blue button on the right" means "proceed to payment" versus "apply discount code."
The phrase "the design team has added notes in the Figma comments" is particularly problematic. This is tribal knowledge that the AI cannot access. You might as well tell the AI to ask your colleague Susan what she meant by the third arrow from the left.
The result? The AI guesses. It implements a checkout flow based on its training data, which may or may not match your actual requirements. You review the code, find mismatches, and iterate. This cycle repeats until you have manually transcribed every visual detail into text that the AI can process.
## The 4ge Way (AI-Ready)
A proper user flow specification replaces visual ambiguity with explicit navigation states and transition rules. It describes the flow in terms the AI can parse and verify.
```markdown
## User Flow: Checkout Process

### Entry Points
* User initiates from shopping cart (/cart)
* Deep link from promotional email (/checkout?promo=CODE)

### Flow States

**State 1: Shipping Information**
* Route: /checkout/shipping
* Required fields: full_name, address_line_1, city, postcode, country
* Validation: UK postcode format, country from allowed list
* Transitions:
  - "Continue to Payment" → State 2 (validate all fields first)
  - "Back" → /cart (preserve entered data in session)

**State 2: Payment Method**
* Route: /checkout/payment
* Display: Order summary (items, quantities, subtotal, shipping, total)
* Payment options: card, PayPal, Apple Pay
* Card form: Stripe Elements with SCA handling
* Transitions:
  - "Pay Now" → State 3 (process payment, create order)
  - "Back" → State 1 (preserve payment selection)

**State 3: Order Confirmation**
* Route: /checkout/confirmation?order_id={id}
* Display: Order number, estimated delivery, receipt download
* Actions: "Continue Shopping" → /products
* No back navigation (completed order)

### Error States
* Payment declined: Display error message, remain in State 2
* Session expired: Redirect to /cart with toast notification
* Inventory changed: Redirect to /cart with updated quantities

### Navigation Rules
* Browser back button: Respects state transitions above
* Direct URL access: State 2 and 3 require valid session, else /cart
* Refresh: Preserves current state, does not resubmit payment
```
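A specification this explicit can be encoded directly as data and enforced at runtime. The sketch below is illustrative only — the dictionary layout, state names, and `next_state` helper are assumptions for demonstration, not part of any real codebase:

```python
# Illustrative sketch: the checkout flow's states and transitions as data,
# so navigation requests can be validated against the specification.
# All names here are hypothetical.
CHECKOUT_FLOW = {
    "shipping": {
        "route": "/checkout/shipping",
        "transitions": {"continue": "payment", "back": "cart"},
    },
    "payment": {
        "route": "/checkout/payment",
        "transitions": {"pay": "confirmation", "back": "shipping"},
    },
    "confirmation": {
        "route": "/checkout/confirmation",
        # No "back" entry: completed orders cannot be re-entered.
        "transitions": {"continue_shopping": "products"},
    },
}

def next_state(current: str, action: str) -> str:
    """Return the state an action leads to, or raise if the transition is invalid."""
    transitions = CHECKOUT_FLOW[current]["transitions"]
    if action not in transitions:
        raise ValueError(f"Invalid transition {action!r} from state {current!r}")
    return transitions[action]
```

Because invalid transitions raise rather than silently succeed, a rule like "no back navigation from confirmation" becomes testable instead of implicit.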
### Why This Works Better for AI
The structured specification opens with entry points, immediately grounding the AI in how users reach this flow. No more guessing whether the checkout can be accessed from the product page or only from the cart.
Each state is explicitly defined with route, fields, validation rules, and transitions. The AI knows exactly what "Continue to Payment" means in concrete terms: validate all shipping fields, then route to /checkout/payment. No visual interpretation required.
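A rule like "UK postcode format" is equally concrete once written as a check. The pattern below is a deliberately simplified sketch — the full UK postcode grammar has special cases this regex ignores:

```python
import re

# Simplified UK postcode pattern: outward code (area letters + district),
# optional space, inward code (sector digit + unit letters).
# Real postcodes include edge cases this sketch does not cover.
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")

def is_valid_uk_postcode(value: str) -> bool:
    """Check a postcode against the simplified pattern, ignoring case and padding."""
    return bool(UK_POSTCODE.match(value.strip().upper()))
```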
The error states section is crucial. Visual mockups often show the "happy path" and leave edge cases implicit. An AI agent will implement the happy path and ignore error handling unless you explicitly specify it. By documenting payment decline, session expiry, and inventory changes, you prevent production bugs that would otherwise emerge during user testing.
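The error states can likewise be encoded as data rather than left implicit. This is a hypothetical sketch; the mapping shape and names are assumptions for illustration:

```python
# Hypothetical mapping of checkout error states to recovery behaviour,
# mirroring the Error States section of the specification.
# redirect=None means "remain in the current state".
ERROR_STATES = {
    "payment_declined": {"redirect": None, "message": "Payment was declined"},
    "session_expired": {"redirect": "/cart", "message": "Your session expired"},
    "inventory_changed": {"redirect": "/cart", "message": "Quantities were updated"},
}

def handle_checkout_error(error: str) -> dict:
    """Look up the recovery action for a known checkout error."""
    if error not in ERROR_STATES:
        raise KeyError(f"Unknown checkout error: {error}")
    return ERROR_STATES[error]
```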
## The Mermaid.js Alternative
For flows with complex branching, Mermaid.js diagrams provide a code-based visual representation that AI agents can parse. The diagram becomes both a human-readable flowchart and a machine-readable specification.
```mermaid
stateDiagram-v2
[*] --> Shipping
Shipping --> Payment: Valid shipping
Shipping --> Shipping: Validation error
Payment --> Confirmation: Payment success
Payment --> Payment: Card declined
Payment --> Shipping: Back
Confirmation --> [*]
```

The AI can read this diagram directly, understanding state transitions without ambiguity. You can generate Mermaid from your 4ge specification, or write it manually for complex flows. Either way, you have a single source of truth that serves both human developers and AI agents.
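Generating the Mermaid source from a transition table is straightforward. This is a minimal sketch with illustrative names, not a prescribed tool:

```python
def to_mermaid(transitions: list[tuple[str, str, str]]) -> str:
    """Render (source, target, label) transitions as a Mermaid state diagram."""
    lines = ["stateDiagram-v2"]
    for source, target, label in transitions:
        lines.append(f"    {source} --> {target}: {label}")
    return "\n".join(lines)

# Hypothetical transition table for the checkout flow's happy path and back links.
checkout = [
    ("Shipping", "Payment", "Valid shipping"),
    ("Payment", "Confirmation", "Payment success"),
    ("Payment", "Shipping", "Back"),
]
```

Keeping the transition table as the canonical source means the diagram can never drift out of sync with the specification it renders.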
## Related Examples
- PRD Examples for product-level requirements
- Acceptance Criteria Examples for feature-level test cases
- Development Brief Examples for token-optimised task breakdowns