I built it because I needed evolutionary optimization across multiple projects (stock prediction, NHL game modeling) and found existing libraries either too heavy on dependencies or too magical. The design goals: zero dependency bloat (NumPy is optional and needed only for CMA-ES), full reproducibility via seeding, and explicit control over every knob.
Some things that might be interesting to HN:
- Built-in diagnostics — every generation reports diversity metrics, stagnation detection, and parameter adjustment recommendations

- Agent-steerable — on_generation callbacks can return parameter overrides mid-run, designed for LLM agents to tune hyperparameters on the fly

- Landscape analysis — samples your fitness function and recommends which optimizer to use

- Flexible worker control — workers=4, workers=-1 (all cores minus one), workers=0 (all cores)

- 425 tests, Python 3.10–3.13, MIT licensed
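To make the "agent-steerable" idea concrete, here is a toy, self-contained sketch of the pattern: a per-generation callback that can hand back parameter overrides, which the loop applies before the next generation. This is a conceptual illustration only, not evogine's actual API; `evolve`, `anneal`, and the parameter names are hypothetical.

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=30,
           mutation_scale=0.5, on_generation=None, seed=0):
    """Minimal elitist loop minimizing `fitness` over a 1-D interval.

    on_generation(gen, best, params) may return a dict of overrides
    (here just 'mutation_scale') that take effect next generation.
    """
    rng = random.Random(seed)  # seeded RNG for reproducibility
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    params = {"mutation_scale": mutation_scale}
    for gen in range(generations):
        pop.sort(key=fitness)
        if on_generation:
            overrides = on_generation(gen, pop[0], dict(params))
            if overrides:
                params.update(overrides)
        # Keep the top half, rebuild the rest by mutating the survivors.
        survivors = pop[: pop_size // 2]
        children = [min(hi, max(lo, p + rng.gauss(0, params["mutation_scale"])))
                    for p in survivors]
        pop = survivors + children
    return min(pop, key=fitness)

# Steering callback: anneal the mutation scale as the run progresses.
def anneal(gen, best, params):
    return {"mutation_scale": params["mutation_scale"] * 0.9}

best = evolve(lambda x: (x - 3.0) ** 2, bounds=(-10, 10), on_generation=anneal)
```

The same hook shape works for an LLM agent: instead of a fixed annealing rule, the callback can inspect the diagnostics and decide what to override.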
pip install evogine
Happy to answer questions about the design, evolutionary algorithms, or use cases.