## When to Use
Use this checklist when:
- Approving an AI-related capital investment
- Reviewing requests for additional funding
- Evaluating stalled or ambiguous initiatives
- Comparing competing AI proposals for the same business objective
If multiple questions cannot be answered clearly, pausing is a valid decision.
## Part I: Before Capital Commitment (Go / No-Go)
### A. Problem Definition & Business Intent
If the problem is unclear, the solution is irrelevant.
| Question | Director's Lens |
|---|---|
| 1. What specific business outcome will improve if this works? | Avoids funding technical ambition |
| 2. What happens if this AI component does not work at all? | Reveals downside containment |
| 3. What is the baseline performance today, without AI? | Prevents improvement illusion |
| 4. How will success be judged in business terms, not technical metrics? | Aligns incentives |
| 5. What outcome would justify stopping the initiative early? | Defines walk-away threshold |
**Director signal:** If answers rely on optimism, narratives, or future clarity → No-Go or defer.
### B. Evidence & Learnability
Not all problems are learnable—many only appear so.
| Question | Director's Lens |
|---|---|
| 6. What evidence shows this problem is learnable from available data? | Prevents assumption-based investment |
| 7. How stable is the relationship between inputs and outcomes? | Reveals non-stationarity risk |
| 8. How will evaluation distinguish true learning from overfitting? | Tests analytical rigor |
| 9. What specifically differs between the training environment and production? | Exposes simulation gap |
| 10. How was the data validated for quality and relevance? | Prevents foundation failures |
**Director signal:** Vague data plans are a capital risk.
### C. Governance & Control
Ambiguity in governance is a leading indicator of loss.
| Question | Director's Lens |
|---|---|
| 11. Who has explicit authority to pause or stop deployment? | Prevents escalation bias |
| 12. What specific indicators trigger intervention or review? | Removes ambiguity |
| 13. What is the maximum acceptable loss before reassessment? | Protects capital |
| 14. How frequently will independent review occur? | Counters internal bias |
| 15. How will unintended behavior be detected and contained? | Addresses incentive drift |
**Director signal:** If governance is "to be decided later," risk already exists.
## Part II: When Progress Stalls (Continue / Stop)
### Early Warning Signals
Two or more of the following warrant a formal Continue / Stop discussion:
- Success criteria keep shifting
- Updates focus on effort, not outcomes
- New data or dependencies repeatedly emerge
- Comparisons avoid baseline or alternatives
- "Almost there" persists across review cycles
### Continue Only If ALL Are True
- Root cause of stall is clearly identified
- Original investment thesis remains valid
- Remediation plan is time-bound and testable
- Opportunity cost is explicitly acceptable
- Team capability matches revised challenge
### Stop If ANY Are True
- No credible path to business value exists
- The problem has materially changed
- Continuation is driven by sunk cost
- Better risk-adjusted alternatives exist
- Evidence consistently lags confidence
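The Part II gates are mechanical enough to express as a small decision helper: two or more early warning signals trigger a formal discussion, continuation requires every continue condition to hold, and any single stop condition forces a stop. The sketch below is illustrative only; the function names and the "escalate" fallback for mixed evidence are assumptions, not part of the checklist.

```python
# Hypothetical sketch of the Part II decision rules; names are illustrative.

def stall_review_needed(warning_signals: list[bool]) -> bool:
    """Two or more early warning signals warrant a formal Continue/Stop discussion."""
    return sum(warning_signals) >= 2

def continue_or_stop(continue_conditions: list[bool],
                     stop_conditions: list[bool]) -> str:
    """Continue only if ALL continue conditions hold; stop if ANY stop condition holds."""
    if any(stop_conditions):
        return "Stop"
    if all(continue_conditions):
        return "Continue"
    # Neither gate is cleanly satisfied: treat as unresolved rather than default to Continue.
    return "Escalate for review"
```

Note the ordering: stop conditions are checked first, so a valid-looking remediation plan cannot override sunk-cost continuation or a vanished path to business value.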
## Part III: Board-Level Questions
### For Initial Approval
- What is the single assumption that would make this investment fail?
- What must we learn in 90 days to justify continuation?
- How does this compare to non-AI alternatives?
- Who independently validates the evidence presented?
### For Ongoing Reviews
- What do we know now that we didn't at approval?
- Has our confidence increased or decreased, and why?
- What would immediately convince us to stop?
- Are we measuring progress or activity?
## Part IV: Risk Prioritization (Director View)
### Highest Capital Risk
- Incentive misalignment
- Simulation–reality gap
- Evaluation blind spots
- Governance ambiguity
### Lower Priority (Often Over-Discussed)
- Model accuracy improvements
- Feature additions
- Infrastructure upgrades
- Team expansion requests
### Interpreting Signals
- 0–2 concerns: Proceed with standard oversight
- 3–5 concerns: Tighten review cycles
- 6+ concerns: Independent assessment recommended
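The three bands above map directly to a concern-count lookup. A minimal sketch, assuming concerns are simply tallied across the checklist (the function name and error handling are hypothetical, not from the checklist):

```python
# Hypothetical mapping from a tallied concern count to the oversight bands above.

def oversight_level(concerns: int) -> str:
    """Return the recommended oversight response for a given concern count."""
    if concerns < 0:
        raise ValueError("concern count cannot be negative")
    if concerns <= 2:
        return "Proceed with standard oversight"
    if concerns <= 5:
        return "Tighten review cycles"
    return "Independent assessment recommended"
```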
Perfect answers are suspicious. Transparency beats confidence.
Derived from documented failure patterns in *The 98% Win Rate That Failed*.
Version: Director Edition v1.0 | December 2025