AI workflows • failure taxonomy • risk scoring • ghost data
A structured failure taxonomy for AI workflow pipelines, focused on the data integration layer existing frameworks miss. Introducing the ghost data failure class — when your tools return valid responses but the data was never actually there.
6-dimension risk scorer • ghost data class
AI agents • MCP • Supabase • pgvector
Why building platform-agnostic data layers beats compensating for AI platform gaps. Introducing fds-recall: a portable memory system for AI agents, and the CEO Agent morning briefing that runs on top of it.
platform-agnostic • fds-recall
Snowflake • Terraform • AI-native workflow
A config-driven RBAC framework where an AI agent runs the intake interview, you review the output, and Terraform handles the rest. Validated against a live Snowflake account.
52 unit tests passing • Terraform validated
Python • Airflow • Cloud Run • HubSpot
Building custom Airflow pipelines vs. using native integrations. Full technical breakdown with architecture diagrams, ROI analysis, and code.
96% latency reduction • 90% cost reduction
More posts coming soon.
Pick the right tool for the team you have today. Most startups sign commitments with big-name vendors too early. Why? Because it feels like progress. But complicated tools rarely fix all your problems; they usually create new ones.
Labor costs are usually what drive a buy decision: keeping custom solutions running demands ongoing maintenance time. Is it time to reassess?
My current thesis is that AI tooling does not eliminate tech debt, but makes it easier for smaller teams to manage: faster MVPs, better testing, and automatic PRs for bugs.
A recent client had 52k invalid emails in their CRM (9.4% of the list). That's not just a deliverability problem. It means marketing budget wasted on fake contacts, storage overage fees, and sales time spent on bad leads.
Email validation costs ~$0.001 per check. If it prevents even one wasted marketing campaign, it pays for itself 100x over.
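To make that arithmetic concrete, here's a rough back-of-envelope in Python. The list size is inferred from the 52k / 9.4% figures above; the per-send campaign cost is an illustrative placeholder, not client data:

```python
# Back-of-envelope ROI for bulk email validation.
invalid_emails = 52_000  # invalid addresses found in the CRM
invalid_rate = 0.094     # 9.4% of the list

# Implied total list size (~553k contacts).
list_size = round(invalid_emails / invalid_rate)

# One-time cost to validate the entire list at ~$0.001 per check.
cost_per_check = 0.001
validation_cost = list_size * cost_per_check

# Illustrative only: assume $0.01 per marketing send. Every campaign
# blasted to the full list burns this much on fake contacts.
cost_per_send = 0.01
wasted_spend_per_campaign = invalid_emails * cost_per_send

print(f"Implied list size:      ~{list_size:,}")
print(f"One-time validation:    ~${validation_cost:,.0f}")
print(f"Waste per campaign:     ~${wasted_spend_per_campaign:,.0f}")
```

Even under these placeholder numbers, a single campaign's wasted spend is on the same order as validating the entire list once — and that's before counting storage fees and sales time.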
Active research project of mine: paying $2k/mo for Snowflake might be overkill for companies under $10M ARR or internal reporting use cases. DuckDB + MotherDuck gives you enterprise-grade SQL for pennies. You can run the same queries, use the same dbt models, and pay 1/100th the cost.
I'm building a "BI-in-a-Box" reference architecture (dlt + dbt + DuckDB + Evidence) that gives you Snowflake-class analytics without the Snowflake price tag. Coming soon.
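The "same dbt models" claim rests on dbt's profile abstraction: the models don't change, only the connection profile does. A minimal sketch of what that swap might look like in `profiles.yml`, assuming the dbt-duckdb adapter — project name, paths, and credentials here are illustrative placeholders:

```yaml
# profiles.yml — illustrative sketch; all names and credentials are placeholders
analytics:
  target: duckdb          # flip to `snowflake` to run the same models there
  outputs:
    duckdb:
      type: duckdb        # dbt-duckdb adapter
      path: analytics.duckdb
    snowflake:
      type: snowflake
      account: my_account
      user: my_user
      database: ANALYTICS
      warehouse: TRANSFORMING
      schema: PUBLIC
```

Swapping warehouses becomes a one-line `target` change rather than a migration project, which is what makes the cost comparison an experiment you can actually run.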