Tbh, models in the pipeline are cheaper if local is comparable. The Cursor case is a bit different: Cursor cannot be profitable while competing with its own providers, and it is not yet clear whether it will survive at all, or whether the Kimi model will prove to be good competition.
operatingthetan•53m ago
I'm using minimax m2.7 and it's good enough. What I'd like to understand, though, is how these models are so cheap. Surely the compute costs them just as much? Do US-based AI companies really have that much overhead?
gostsamo•55m ago