Sector-specific leakage norms, EBITDA impact ranges, and time-to-value data pulled from our actual engagements. Anonymized, structured, published on a quarterly cadence.
Why the index exists
Most of what gets published about AI outcomes is either vendor marketing or survey data, and neither is instrumented. Vendors publish case studies that skip the base rate. Surveys publish sentiment that does not touch the income statement.
The Operational Leakage Index is built the other way around. Every data point comes from a real engagement, with a dollarized leakage estimate, a measured intervention, and a time-to-value outcome we tracked ourselves. We anonymize at the client level, aggregate at the sector level, and publish a structured quarterly issue.
What the Q3 2026 issue will contain
- Sector benchmarks. Leakage magnitude and category mix for the core sectors we serve: professional services (legal, accounting, consulting), multi-location operators, healthcare services, distribution and specialty retail, and founder-led commercial operations.
- EBITDA impact ranges. Ranges, not point estimates, with sample size disclosed. We will publish the distribution, not just the median.
- Time-to-value distributions. How long it actually took from kickoff to measurable P&L impact across engagement types. This is the number everyone wants and nobody publishes honestly.
- Failure mode analysis. Where engagements stalled, what caused the stall, and how we fixed it. The thesis requires us to be honest about our own misses.
- Year-over-year deltas. Starting with issue two. What moved. What did not.
How we structure each data point
Every engagement we run is instrumented the same way, which is what makes aggregation honest. The schema behind every number in the index:
- Leakage category. Revenue, labor, process, risk, or decision. (See the Operational Leakage Map.)
- Baseline. The dollarized cost of the current state, measured before any intervention.
- Intervention. What we built, operated, or removed.
- Outcome. The measured change on the income statement, with the measurement window and attribution method disclosed.
- Time-to-value. Days from engagement kickoff to the first measurable P&L signal, and days to stable impact.
- Confidence. How clean the attribution was. We report everything, weak attributions included.
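The schema above can be sketched as a record type. This is a minimal illustration, not our published schema: the field names, enum values, and example figures below are assumptions chosen to mirror the bullets above.

```python
from dataclasses import dataclass
from enum import Enum


class LeakageCategory(Enum):
    """The five categories from the Operational Leakage Map."""
    REVENUE = "revenue"
    LABOR = "labor"
    PROCESS = "process"
    RISK = "risk"
    DECISION = "decision"


@dataclass(frozen=True)
class EngagementRecord:
    """One anonymized data point in the index (hypothetical field names)."""
    sector: str                   # e.g. "professional-services"; no client name
    revenue_band: str             # e.g. "$10M-$50M"; the band, not the figure
    category: LeakageCategory
    baseline_annual_usd: float    # dollarized cost of the current state, pre-intervention
    intervention: str             # what was built, operated, or removed
    outcome_annual_usd: float     # measured change on the income statement
    measurement_window_days: int  # disclosed alongside the attribution method
    attribution_method: str       # how the outcome was attributed to the intervention
    days_to_first_signal: int     # kickoff to first measurable P&L signal
    days_to_stable_impact: int    # kickoff to stable impact
    attribution_confidence: str   # "high" | "medium" | "low"


# Illustrative record with made-up values.
record = EngagementRecord(
    sector="professional-services",
    revenue_band="$10M-$50M",
    category=LeakageCategory.LABOR,
    baseline_annual_usd=250_000.0,
    intervention="automated intake triage",
    outcome_annual_usd=-90_000.0,   # negative = cost reduction
    measurement_window_days=120,
    attribution_method="before/after with a held-out comparison period",
    days_to_first_signal=21,
    days_to_stable_impact=90,
    attribution_confidence="medium",
)
```

Keeping the record frozen and the category an enum is one way to make aggregation mechanical: every engagement produces the same shape, so sector-level rollups are a matter of grouping, not reconciliation.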
What it will not contain
A few things we will deliberately exclude, because including them would make the index worse:
- Vendor comparisons. We do not rank tools. We are tool-agnostic by design.
- Named clients. Every data point is anonymized at the client level. The sector and revenue band appear. The name does not.
- Predictions. We publish what happened, not what we think will happen. The thesis lives in the writing, not in the benchmark.
- Survey data. The index is not a survey. If a number is in the index, it came from work we did.
Sample bias, named
We will also publish what the index cannot do. Our sample is the engagements that reach us, and it skews toward:
- Operating businesses, not venture-funded companies.
- Mid-market, not enterprise or micro.
- The southeastern U.S., especially the Atlanta metro and Georgia more broadly, though we work nationally.
- Founder-led firms and professional-services firms, because those are the two groups that self-select into our work.
These biases will be disclosed at the front of every issue. If you want to use the index to calibrate a decision outside our sample, read the disclosure first.
How to receive it
No form. No gate. Email hello@peachstateai.com with "Leakage Index" in the subject line and we will add you to the distribution. The Q3 2026 issue will arrive in your inbox on launch day. No other mail.
If you want the quarterly issue but do not want to be on any other list, that is fine. The distribution is single-purpose.
