The AI Boom Has a Graveyard No One Talks About
A Software Engineer's Perspective: The Hidden Dependency Behind Most AI Products, and Why Many Will Fail in the Long Run
Disclaimer: This publication and its authors are not licensed investment professionals. Nothing posted on this blog should be construed as investment advice. Do your own research.
I’ve been rereading The Psychology of Money by Morgan Housel recently, and it’s been hard not to see the parallels with how people are investing in AI right now. Housel’s core point is that most mistakes aren’t about missing information. They’re about how we interpret success after the fact.
That hit a little closer to home because I spend my days working as a software engineer, building and integrating AI products. I’m inside these systems constantly. I see where the power lives, where the costs show up, and how much of what looks like “AI innovation” is actually dependency dressed up as product.
AI investing feels like a textbook case of psychology getting ahead of system reality.
The Part of the AI Story That Feels Too Obvious
Most AI conversations today sound strangely settled. People point to a few winners, talk about adoption curves, and speak as if the market structure were inevitable. Of course these companies won. Of course this is where the value landed.
That sense of inevitability is exactly what Housel warns about. When outcomes feel obvious in hindsight, we stop asking what had to go right and what risks are being quietly ignored.
Most AI Products Don’t Own the Intelligence
This becomes very clear once you actually work with AI systems every day.
Most AI products do not own the intelligence that makes them work. They sit on top of it.
A large share of AI startups are thin layers on top of a small number of foundation-model providers, most commonly OpenAI. They add a workflow, a UI, some prompt logic, maybe light fine-tuning, and wrap it into something usable. That can still be valuable. But it’s very different from owning the core system.
When people say “AI company,” what they often mean is “company that calls an API and makes it feel nice.”
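To make that concrete, here’s roughly what the core of such a product can look like. This is a deliberately minimal sketch using the OpenAI Python SDK; the product idea, function name, and prompt are hypothetical, but the shape is representative:

```python
# A minimal "wrapper" product: a prompt, an API call, and presumably a
# nice UI somewhere above this. The contract-review framing is made up.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_contract(contract_text: str) -> str:
    """The entire core of the 'product' is this one upstream call."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a contract-review assistant."},
            {"role": "user", "content": f"Summarize the key risks in:\n\n{contract_text}"},
        ],
    )
    return response.choices[0].message.content
```

Everything that makes this valuable, the model’s quality, its latency, its price per token, lives on the other side of that one API call. The wrapper controls none of it.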
Dependency Is Not a Phase You Grow Out Of
From a software perspective, dependency always has a cost. In AI, it defines the business.
If you rely on an upstream model provider, you don’t control pricing, inference efficiency, performance improvements, or roadmap direction. Your margins and differentiation are downstream of decisions you don’t make.
That’s not a scaling issue you solve later. That’s the structure you’re building on.
One thing Housel emphasizes is how easy it is to underestimate risks that don’t show up immediately. Dependency feels fine early. It only becomes painful once you’re successful enough for it to matter.
Why So Many AI Companies Stall Instead of Fail
This is why most AI companies don’t fail dramatically.
Early on, costs are falling, credits are generous, and competition is light. Usage grows and everything looks validated. Then scale arrives. Inference costs become very real. Customers start questioning pricing. Features you thought were differentiators turn into baseline expectations as models improve.
From the inside, nothing breaks. From the outside, growth just stops being impressive.
That’s survivorship bias at work. We remember the few that broke through and forget the many that quietly stalled.
The AI Graveyard Is Quiet
Working with AI products daily makes this especially obvious. Most projects don’t collapse. They just hit economic ceilings.
Margins never show up. The product remains useful but not defensible. Eventually the company becomes strategically irrelevant or gets absorbed. These outcomes don’t make headlines, which is why the graveyard is easy to miss.
But it’s full.
The Wrapper Problem Isn’t an Insult, It’s a Constraint
Calling something an AI “wrapper” isn’t an insult. It’s a description of where power sits.
If the underlying model improves, your product gets commoditized. If pricing goes up, your margins compress. If the provider launches something adjacent, your differentiation disappears. Switching costs are lower than they look once customers understand what’s underneath.
From a software standpoint, you’re competing on UX, speed, and distribution, not on system control. That can work for a while. It rarely compounds.
AI Feels Like Software, but Behaves Like Infrastructure
This is where a lot of expectations break.
AI feels like software because you can ship fast and iterate. But once usage grows, it behaves like infrastructure. Hardware efficiency, energy costs, networking, and inference economics dominate outcomes. You can’t refactor your way out of those constraints.
If you don’t sit close to the cost curve, scale amplifies the problem instead of solving it.
Why Infrastructure Owners Sit in a Different Position
This is why companies closer to the metal operate in a different reality.
NVIDIA and large cloud platforms like Amazon Web Services, Microsoft Azure, and Google Cloud sit where costs are set, not just passed through. That doesn’t mean they’re guaranteed great returns. It does mean they’re not structurally dependent in the same way most AI products are.
From an engineering perspective, owning infrastructure is painful. From an investing perspective, not owning it caps upside.
A Simple Test I Keep Coming Back To
One simple mental model I keep using, both as an engineer and as an investor, is this:
What happens to this business if its core AI provider changes pricing or launches a competing feature?
If the answer is “that would really hurt,” then you’re not looking at a moat. You’re looking at a dependency that only feels manageable while conditions are unusually friendly.
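A quick back-of-envelope version of that test makes the point. Every number below is a made-up assumption, not data from any real company:

```python
# Gross-margin sensitivity to an upstream pricing change.
# All figures are hypothetical unit economics for one user-month.
price_per_user = 20.00           # monthly subscription price
inference_cost_per_user = 6.00   # upstream API spend per user per month
other_costs_per_user = 4.00      # hosting, support, payment fees, etc.

def gross_margin(inference_cost: float) -> float:
    """Gross margin as a fraction of revenue."""
    return (price_per_user - inference_cost - other_costs_per_user) / price_per_user

print(f"Today:               {gross_margin(inference_cost_per_user):.0%}")        # 50%
print(f"Provider raises 50%: {gross_margin(inference_cost_per_user * 1.5):.0%}")  # 35%
print(f"Provider doubles:    {gross_margin(inference_cost_per_user * 2.0):.0%}")  # 20%
```

In this toy scenario, one upstream pricing decision takes gross margin from 50% to 20% without the product changing at all. That is what it means for your margins to be downstream of decisions you don’t make.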
Closing Thoughts
AI is real. The technology is powerful. Working with these systems daily makes that very clear.
What The Psychology of Money helped reinforce for me is how much of today’s AI market is shaped by survivorship bias and hidden dependencies. A small group of companies owns critical layers of the stack. A much larger group depends on them and hopes the economics work out.
Survivorship bias makes it feel like everyone has a shot.
In reality, most of the graveyard is made up of companies that never failed loudly. They just never escaped the stack they were built on.
The hardest part of AI isn’t building something impressive.
It’s building something that still works when the psychology shifts and the easy phase is over.