Ask HN: Is token-based pricing making AI harder to use in production?

2•Barathkanna•10h ago
Hi HN,

I’ve noticed a recurring theme in many threads here: AI is powerful, but once you move past demos, token-based pricing becomes expensive and hard to reason about.

We ran into this problem ourselves while building AI-powered systems. Predicting costs, budgeting usage, and experimenting safely all got harder as workloads grew. So we built a small AI API platform for inference, aimed at early-stage developers and small teams who want to integrate AI without constantly calculating token usage. The focus is on lower, more predictable costs rather than chasing the newest model.
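
For concreteness, this is the kind of back-of-the-envelope math we kept redoing. A minimal sketch; the rates and traffic numbers are made up for illustration, not any provider's actual pricing:

    # Illustrative only: rates and traffic numbers below are hypothetical.
    INPUT_RATE_PER_M = 3.00    # USD per 1M input tokens (assumed)
    OUTPUT_RATE_PER_M = 15.00  # USD per 1M output tokens (assumed)

    def monthly_cost(requests_per_day, avg_in_tokens, avg_out_tokens, days=30):
        """Estimate a monthly bill from average traffic and token counts."""
        tokens_in = requests_per_day * avg_in_tokens * days
        tokens_out = requests_per_day * avg_out_tokens * days
        return (tokens_in / 1e6) * INPUT_RATE_PER_M \
             + (tokens_out / 1e6) * OUTPUT_RATE_PER_M

    # 5k requests/day at ~1,200 input and ~400 output tokens per request:
    print(f"${monthly_cost(5_000, 1_200, 400):,.2f}")  # -> $1,440.00

Every input to that estimate is a moving target (prompt length, context growth, retries, traffic), which is why the number rarely matches the actual bill.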

This is still early, and I’m mainly posting to learn from others here. For people running AI in production, what’s been the hardest part to manage so far? Cost, predictability, performance, or something else?

I’d really appreciate any insights or experiences.

Comments

iamrobertismo•9h ago
Not clear what you are pitching. If you don't control the infrastructure or have a major contract, how exactly are you lowering or stabilizing costs? Especially if you are not chasing the newest model; at this point token economics is essentially a commodity. Commodity pricing is not an engineering problem, it is a financing problem.
Barathkanna•9h ago
That’s fair, and I probably didn’t explain it clearly. We’re building an AI API-as-a-service platform aimed at early-stage developers and small teams who want to integrate AI without constantly thinking about tokens at all.

I agree that token economics are basically a commodity today. The problem we’re trying to address isn’t beating the market on raw token prices, but removing the mental and financial overhead of having to model usage, estimate burn, and worry about runaway costs while experimenting or shipping early features. In that sense it’s absolutely an engineering and finance problem combined, and we’re intentionally tackling it at the pricing and API layer rather than pretending the underlying models are unique.
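
To make "runaway costs" concrete: a hypothetical guard that teams often end up hand-rolling today looks something like the sketch below. This is not our API, just an illustration of the overhead we want to remove; all names and behavior here are assumptions.

    # Hypothetical hand-rolled spend guard; names and behavior are assumptions.
    class SpendGuard:
        def __init__(self, monthly_budget_usd: float):
            self.budget = monthly_budget_usd
            self.spent = 0.0

        def charge(self, cost_usd: float) -> None:
            """Record a call's estimated cost; block once the budget is hit."""
            if self.spent + cost_usd > self.budget:
                raise RuntimeError("Monthly AI budget exceeded; blocking call")
            self.spent += cost_usd

    guard = SpendGuard(monthly_budget_usd=200.0)
    guard.charge(0.37)  # estimated cost of one model call, in USD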

iamrobertismo•5h ago
Would you just be... subsidizing low-volume users? I am saying this isn't a new problem in the grand scheme of things. Hopefully I am not being too negative. Do you have a site or something to learn more? It's not clear how you can get token economics good enough to give me or someone else better token economics, rather than just burning more money lol.
Barathkanna•4h ago
Totally fair question, and you’re not being negative.

We’re not claiming better token economics in the sense of magically cheaper tokens, and we’re not just burning money to subsidize usage indefinitely. You’re right that this isn’t a new problem.

What we’re building is an AI API platform aimed at early-stage developers and small teams who want to integrate AI without constantly reasoning about token math while they’re still experimenting or shipping early features. The value we’re trying to provide is predictability and simplicity, not beating the market on raw token prices. Some amount of cross-subsidy at low volumes is intentional and bounded, because lowering that early friction is the point.

If you want to see what we mean, the site is https://oxlo.ai. Happy to answer questions or go deeper on how we’re thinking about this.

iamrobertismo•4h ago
Oh, you're arbing! I see now. Makes sense; seems like it could be useful if you have a rock-solid DX.