If you can find a way (and this JEP is one such way) to get the bulk of the Java standard API AOT compiled, then Java programs will be faster (much faster).
Also, the JVM is already a marvel of engineering (Java JIT code is fast as hell), but this will make Java programs much nimbler.
Saying "java programs will be faster" is perhaps a bit misleading to those who don't know how java works. This will speed up only the first moments of a JVM execution, nothing more. Or, I misread the JEP, in which case I'd owe you one if you can explain what I missed.
As a Java developer this will be mildly convenient when developing. We go through JVM warmup a lot more than your average user ever does. Personally I think I'm on the low end (I like debuggers, and I don't do TDD-style development where what I work on is dictated by a unit test run and I rerun the tests constantly). But it still excites me somewhat, so your average Java dev should be quite excited by this.
I'm not all that experienced with it, but I gather that lambda-style Java deployments (self-contained simple apps that run on demand, which could in theory mean 'let's boot up a JVM to run this tiny job that won't last more than half a second') moved on looong ago from actually booting a JVM for every job, for example by using Graal, an existing AOT tool. But if you weren't using those, hoo boy. As far as I can tell, this gives every Java app 'Graal-level bootup' effectively for free (a smidge of disk space to store the profile).
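For concreteness, and assuming the JEP in question is part of the Project Leyden AOT-cache work (the flags below come from JEP 483's training-run workflow; App, app.jar, app.aotconf and app.aot are made-up names), here's a minimal sketch of what "store the profile on disk" looks like in practice:

    // Trivial placeholder app; the interesting part is the command lines in the
    // comments, which show the AOT cache being recorded, created, and reused.
    //
    //   Step 1, training run: record which classes get loaded and linked
    //     java -XX:AOTMode=record -XX:AOTConfiguration=app.aotconf -cp app.jar App
    //   Step 2, create the AOT cache from that recording
    //     java -XX:AOTMode=create -XX:AOTConfiguration=app.aotconf -XX:AOTCache=app.aot -cp app.jar
    //   Step 3, production runs reuse the cache and skip much of the startup work
    //     java -XX:AOTCache=app.aot -cp app.jar App
    public class App {
        public static void main(String[] args) {
            // Print a timestamp so you can eyeball startup time across the three runs.
            System.out.println("started at " + java.time.Instant.now());
        }
    }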
For the kinds of Java deployments I'm more familiar with (a server that boots as the box boots and stays running until a reboot is needed to update deps or the app itself), this probably won't cause a noticeable performance boost.
EDIT: they describe this in their "Alternative" section as future work
nmstoker•14h ago