Good job teaching the sloppy semantics of a scripting language for numerics I guess.
30 years. Numeric came out in 1995, then evolved into NumPy in 2005. https://en.m.wikipedia.org/wiki/NumPy
Almost every AI researcher and AI lab does most of their research work in Python.
It is also possible that your own fitness-for-purpose coefficients are tuned differently from the majority of the field, and that they've made a sensible choice for their situation.
I'd wager on the latter.
Python is simply the orchestration layer. The heavy lifting is done by low-level libraries used in the repo, written in C++, CUDA, and Rust (e.g., PyTorch’s C++ backend, Flash Attention’s CUDA kernels, FAISS’s SIMD optimizations, or Hugging Face’s Rust-powered tokenizers).
Python’s role is to glue these high-performance components together with minimal overhead, while providing accessibility. Claiming it’s "unsuitable" ignores the entire stack beneath the syntax.
That critique is like blaming the steering wheel for a car’s engine performance.
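A minimal sketch of the orchestration point, assuming NumPy is installed: the Python below is only glue, while the million-element arithmetic runs inside NumPy's compiled C core. The pure-Python generator expression is the "slow path" that delegating to the compiled layer avoids.

```python
import numpy as np

n = 10_000
a = np.arange(n, dtype=np.int64)

# One Python-level call; the actual loop over n elements
# executes in NumPy's compiled C code, not in the interpreter.
dot = int(a @ a)

# Pure-Python reference computation: n interpreter iterations
# doing the same work, just to check the result matches.
ref = sum(x * x for x in range(n))
assert dot == ref
```

The same pattern scales up the stack: a PyTorch call in Python dispatches to C++/CUDA kernels, so the interpreter's per-call overhead is amortized over large tensor operations.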
Official code repo for the O'Reilly Book - "Hands-On Large Language Models"
No text of the book in there.
Prerequisites

This book assumes that you have some experience programming in Python and are familiar with the fundamentals of machine learning. The focus will be on building a strong intuition rather than deriving mathematical equations; as such, illustrations combined with hands-on examples will drive the learning throughout this book.

This book assumes no prior knowledge of popular deep learning frameworks such as PyTorch or TensorFlow, nor any prior knowledge of generative modeling.

If you are not familiar with Python, a great place to start is Learn Python, where you will find many tutorials on the basics of the language. To further ease the learning process, we made all the code available on Google Colab, a platform where you can run all of the code without the need to install anything locally.