It could be $98/hour, but you're splitting that cost among multiple users. You don't run the instance for a full hour; you run it for a few seconds, 20-50 times in the hour. If you had Claude spitting out tokens for an hour straight, you'd run up a crazy bill.
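A back-of-envelope sketch of the amortization argument above. All the numbers besides the $98/hour figure are hypothetical assumptions, just to show how per-request cost falls out:

```python
# Amortized cost of a shared GPU instance.
# $98/hour comes from the comment above; the per-request GPU time
# and request count are assumed for illustration.

INSTANCE_COST_PER_HOUR = 98.0   # on-demand price from the comment
SECONDS_PER_REQUEST = 5.0       # assumed GPU-seconds per completion
REQUESTS_PER_HOUR = 40          # assumed, middle of the 20-50 range

cost_per_gpu_second = INSTANCE_COST_PER_HOUR / 3600
cost_per_request = cost_per_gpu_second * SECONDS_PER_REQUEST
cost_per_user_hour = cost_per_request * REQUESTS_PER_HOUR

print(f"per request:   ${cost_per_request:.3f}")   # ~ $0.136
print(f"per user-hour: ${cost_per_user_hour:.2f}") # ~ $5.44
```

So a user burning 40 five-second requests an hour consumes roughly $5 of GPU time, not $98 — the rest of the hour is sold to other users sharing the instance.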
joegibbs•5h ago
It would be uneconomical to run Llama 3 70B on a bunch of A100s unless you're actually going to use all that throughput. You can run Llama 3 8B locally with no problem at all on regular consumer hardware, at good speeds.