But for most internet applications (as opposed to "math" stuff) I would think Python is still a better language choice.
However, even this advantage is eaten away somewhat because the models themselves are decent at solving hard integrals.
computation-augmented generation, or CAG.
The key idea of CAG is to inject in real time capabilities from our foundation tool into the stream of content that LLMs generate. In traditional retrieval-augmented generation, or RAG, one is injecting content that has been retrieved from existing documents. CAG is like an infinite extension of RAG, in which an infinite amount of content can be generated on the fly—using computation—to feed to an LLM.
We welcome CAG to the list of LLM-related technologies!
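To make this concrete, here's a minimal sketch of one CAG round trip in Python. Everything here is illustrative, not Wolfram's actual API: run_llm and evaluate are hypothetical hooks, and the [[compute: ...]] marker is an assumed convention. The draft is scanned for compute requests, each one is evaluated by an external engine, and the result is spliced back into the generated stream.

    import re

    # Hypothetical hooks: any chat-completion call and any computation engine
    # (Wolfram Language, SymPy, a calculator service) could sit behind these.
    def run_llm(prompt: str) -> str:
        raise NotImplementedError("call your LLM here")

    def evaluate(expression: str) -> str:
        raise NotImplementedError("call your computation engine here")

    def cag_generate(prompt: str) -> str:
        """One CAG round trip: generate, compute, splice results into the stream."""
        draft = run_llm(prompt)
        # Assume the model marks computations as [[compute: <expr>]];
        # replace each marker with the engine's computed result.
        return re.sub(
            r"\[\[compute:\s*(.+?)\]\]",
            lambda m: evaluate(m.group(1)),
            draft,
        )

Where RAG would fill those markers with snippets retrieved from existing documents, CAG computes the content on demand, which is what makes the "infinite extension" framing apt.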
petcat•1h ago
As an aside, I hate the fact that I read posts like these and just subconsciously start counting the em-dashes and the "it's not just [thing], it's [other thing]" phrasing. It makes me think it's just more AI.
gnatman•1h ago
e.g. https://writings.stephenwolfram.com/2014/07/launching-mathem...
arjie•1h ago
Somehow I don't think "trying to make my writing look professional" is very high on the priority list.
keybored•1h ago
> LLMs don’t—and can’t—do everything. What they do is very impressive—and useful. It’s broad. And in many ways it’s human-like. But it’s not precise. And in the end it’s not about deep computation.
This is a mess. What is the flow here? Two abrupt em-dash interruptions ("and can't", "and useful") followed by stubby sentences. Yucky.