

And thereby lock it away from the underserved communities that need it most? Naw. Open source publishing is the way forward for a truly egalitarian system, which is what I’m aiming for


Yup. Already working on a suite of local pipeline apps and an orchestration platform for this. Happy to share if interested!


The pricing question assumes the current model (cloud inference, centralized compute, hyperscaler margins) is the only model.
Local inference flips that math entirely. If the model runs on your hardware, the marginal cost to the provider is close to zero. The pricing problem is a distribution problem, not a compute problem.
What I think actually happens: cloud AI settles at $20-50/month for power users who need the latest frontier models and don’t want to manage hardware. That’s sustainable. The “free tier” disappears or gets severely throttled.
But for a large chunk of use cases (summarization, classification, drafting, local assistants) models small enough to run on a consumer GPU are already good enough. That market doesn’t need to pay $50/month to Anthropic. It needs a good local runner and a one-time hardware investment.
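The "one-time hardware investment vs. monthly subscription" trade-off above can be sketched with some back-of-envelope numbers. Every figure here is an illustrative assumption (GPU price, lifetime, power draw, electricity rate), not a quoted price:

```python
# Rough cost comparison: cloud AI subscription vs. amortized local inference.
# All constants below are illustrative assumptions, not real quotes.

CLOUD_MONTHLY = 30.0          # assumed mid-range subscription, $/month
GPU_COST = 600.0              # assumed used consumer GPU, one-time
GPU_LIFETIME_MONTHS = 36      # assumed 3-year useful life
POWER_WATTS = 200             # assumed draw under inference load
HOURS_PER_DAY = 2.0           # assumed daily inference usage
KWH_PRICE = 0.15              # assumed electricity price, $/kWh

def local_monthly_cost() -> float:
    """Amortized hardware cost plus electricity, per month."""
    hardware = GPU_COST / GPU_LIFETIME_MONTHS
    energy = (POWER_WATTS / 1000) * HOURS_PER_DAY * 30 * KWH_PRICE
    return hardware + energy

print(f"cloud: ${CLOUD_MONTHLY:.2f}/month")
print(f"local: ${local_monthly_cost():.2f}/month")
```

Under these assumptions the local setup comes out well under the cloud subscription, and the gap widens for heavier usage, since the hardware cost is fixed while a per-seat subscription is not.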
The companies that will survive the pricing correction are the ones who either have genuinely differentiated frontier capability, or who make local deployment easy enough that users own their own stack.
Ah well, the trouble is software patents can cost upwards of five figures, so yeah, if I start making money I might do that. It's definitely not within my capacity for now, hence publishing publicly to establish copyright.