I personally don’t see what advantages Python as a language (not an ecosystem) would have here.
But antononcube has thicker skin than me so..
For me, it is a mystery how programmers "decide" as a group that a new tool or language is better than the established and familiar ones. But I would love to see more folks open to trying new tools like Raku that could make their lives easier and more fun.
Another point, which I could have mentioned in my previous response -- Raku has a more elegant and easy-to-use asynchronous computation framework.
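To illustrate, the core of that framework is just "start" blocks and awaitable Promises. The sketch below (with sleep standing in for a slow operation such as an LLM call) runs three tasks concurrently and collects their results:

    # Kick off three independent computations; each "start" block
    # returns a Promise immediately and runs on the thread pool.
    my @tasks = (1..3).map: -> $n {
        start {
            sleep 1;      # stand-in for a slow operation, e.g. an LLM call
            $n * 10;
        }
    };

    # "await" blocks until all Promises are kept and returns their results.
    say await @tasks;     # (10 20 30), after ~1 second instead of ~3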
IMO, Python's introspection matches Raku's.
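For concreteness, the kind of introspection I have in mind on the Raku side is shown below; Python's "inspect" module (e.g. inspect.signature) exposes essentially the same information about callables and objects:

    # A stubbed routine; we only inspect its interface, we never call it.
    sub summarize(Str $text, UInt :$max-words = 120) { ... }

    # Signature introspection: parameter names, types, named-ness.
    for &summarize.signature.params -> $p {
        say $p.name, ' : ', $p.type.^name, ($p.named ?? ' (named)' !! '');
    }

    # Object introspection: what something is and what it can do.
    say 42.^name;                    # Int
    say 'hi'.^methods.elems > 0;     # True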
Some argue that Python's LLM packages are more numerous and better than Raku's. I agree on the "more" part. I am not sure about the "better" part:
- Generally speaking, different people prefer decomposing computations in different ways.
- A few years ago, when I re-implemented Raku's LLM packages in Python, Python did not have equally convenient packages.
WL's LLMGraph is more developed and productized, but Raku's "LLM::Graph" is catching up.
I would like to say that "LLM::Graph" was relatively easy to program because of Raku's introspection, wrappers, asynchronous features, and pre-existing LLM functionality packages. As a consequence, the code of "LLM::Graph" is short.
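As an illustration of the "wrappers" and "pre-existing LLM functionality packages" points, creating a reusable LLM function with "LLM::Functions" looks roughly like this (schematic; configuration options omitted, and the exact call form is documented in the package itself):

    use LLM::Functions;

    # llm-function turns a prompt into an ordinary Raku callable;
    # LLM configuration and access are handled inside the package.
    my &summarize = llm-function('Summarize the following text in one paragraph:');

    my $text = 'Raku is a gradually typed, multi-paradigm programming language ...';

    # The result is a plain routine, so it can be introspected, wrapped,
    # or run inside a "start" block like any other code.
    say summarize($text);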
Wolfram Language does not have that level of introspection, but otherwise it is likely a better choice, mostly because of its far greater scope of functionality (mathematics, graphics, computable data, etc.).
In principle, a corresponding Python "LLMGraph" package could be developed for comparison purposes; then the "better choice" question could be answered in a more informed manner. (The Raku packages "LLM::Functions" and "LLM::Prompts" already have corresponding Python packages.)
antononcube•4mo ago
"LLM::Graph" uses a graph structure to manage dependencies between tasks, where each node represents a computation and edges dictate the flow. Asynchronous behavior is a default feature, with specific options available for control.