
Show HN: GravOptAdaptive – Drop-In PyTorch Optimizer, 25% Faster Training

https://drereg.gumroad.com/l/joehz
1•DREDREG•4mo ago
Introducing GravOptAdaptive, a new version of GravOpt: https://lnkd.in/dw78avM6 (7-day trial). It is a drop-in replacement for Adam/SGD: there is no need to change the model, data, loss function, or pipeline. Your code keeps working as before, but training is faster (in the paid version) or subject to a trial limitation (in the free version).
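
As a rough sketch of what the drop-in claim implies, the snippet below swaps only the optimizer line in an ordinary PyTorch training loop. The import path and the GravOptAdaptive constructor signature are assumptions for illustration; the post does not document the actual API.

    import torch
    import torch.nn as nn

    # Hypothetical import: the package/module name is not given in the post.
    # from gravopt import GravOptAdaptive

    model = nn.Linear(10, 1)
    criterion = nn.MSELoss()

    # Before: a standard Adam setup.
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # After: only this one line would change; model, data, loss, and loop stay the same.
    # optimizer = GravOptAdaptive(model.parameters(), lr=1e-3)

    x, y = torch.randn(32, 10), torch.randn(32, 1)
    for step in range(100):
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()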

- Analyzes the behavior of each parameter separately
- Gives a larger update only when necessary
- Keeps the stability of parameters that are already moving well
- Improves accuracy without increasing the number of iterations
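
The list above describes a per-parameter adaptive step size. Purely as a generic illustration of that idea, and not the actual GravOpt/GravOptAdaptive algorithm (which the post does not disclose), the sketch below grows each parameter's effective learning rate while its gradient direction stays consistent and shrinks it when the direction flips; the class name, hyperparameters, and update rule are all assumptions.

    import torch
    from torch.optim import Optimizer

    class PerParamAdaptiveSGD(Optimizer):
        # Illustrative only: not GravOptAdaptive. Each parameter element gets its
        # own step-size multiplier that grows while the gradient sign stays
        # consistent ("moving well") and shrinks when the sign flips, so larger
        # updates are applied only where they appear warranted.
        def __init__(self, params, lr=1e-3, up=1.1, down=0.5):
            super().__init__(params, dict(lr=lr, up=up, down=down))

        @torch.no_grad()
        def step(self, closure=None):
            loss = closure() if closure is not None else None
            for group in self.param_groups:
                for p in group["params"]:
                    if p.grad is None:
                        continue
                    state = self.state[p]
                    if "prev_sign" not in state:
                        state["prev_sign"] = torch.zeros_like(p)
                        state["scale"] = torch.ones_like(p)
                    sign = torch.sign(p.grad)
                    # +1 where direction repeats, -1 where it flips, 0 on first step
                    agree = sign * state["prev_sign"]
                    scale = torch.where(agree > 0, state["scale"] * group["up"],
                            torch.where(agree < 0, state["scale"] * group["down"],
                                        state["scale"]))
                    state["scale"] = scale.clamp_(0.1, 10.0)
                    p.add_(p.grad * state["scale"], alpha=-group["lr"])
                    state["prev_sign"] = sign
            return loss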