Sounds like reflow problems. And like OP is slowly discovering the 200% Problem.
I'm curious about actual metrics for Chrome's and Safari's garbage collector overhead. There still aren't a lot of "objects" in the video; when V8 is used server side it handles significantly more objects (i.e., assuming each sword is 1-5 objects).
Are these engines canvas based, or are they generating HTML? Assuming they are generating HTML, are the elements removed from the screen when you are done with them?
In a lot of garbage collected environments, you still need to call some kind of close / remove method when you are done with some kinds of objects. (In C#, it's "Dispose.")
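A minimal sketch of what that cleanup looks like in a browser game. The engine object and its createText/destroy methods are hypothetical (the real engine's cleanup call might be destroy(), dispose(), or remove()); the DOM part uses standard APIs:

```javascript
// Hypothetical engine object; check the actual engine's docs for its cleanup method.
const damageLabel = engine.createText("120!", { x: 40, y: 12 });

// Later, when the label is no longer needed:
damageLabel.destroy();   // let the engine release whatever it holds internally

// If the engine generates HTML rather than drawing to a canvas, the DOM node
// has to be removed too, or layout/reflow work keeps accumulating:
const el = document.getElementById("damage-label");
if (el) el.remove();     // standard DOM API; actually detaches the node
```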
Only if you don't want to get your hands dirty with the layers you're using.
The trade-off with an abstraction layer is that every layer introduces its own bugs and design decisions, which may be a concern. But you can always (if it's open source) tweak the layer(s) below.
Moreover, why not finish the game and make your next project in something new?
The main downside is poor performance if you’re not targeting Safari and Chrome.
Here's Chrome showing 100k objects in 3D, with very decent fps: https://m.youtube.com/watch?v=dKg5H1OtDK8 That should give some idea of what's achievable.
Sure, but that's kind of my point: modern software reflects a mindset that most things are "fast enough", so we can reinvent the wheel, repeat old mistakes, and more or less get away with it, instead of using something better suited and more purpose-built.
If creating text, sprite objects, etc. results in a performance bottleneck with fewer than 20 objects on screen, then why offer those features in the first place? I mean, so little is happening in that game that you couldn't even come up with enough game objects to tank the performance.
There's what? The location title, the enemy HP indicator, the level number, and the player stats. Even if you committed the worst performance sins, it should just work anyway.
As a senior game/graphics programmer looking at the screen caps, I see a game that should never be using more than 1% of the CPU and a few percent of the GPU.
It all depends on how things are organized. If each object needs to run logic every frame, you can start to run into severe performance issues well before you get to 10k. 60fps means you have roughly 16 milliseconds to check everything in the scene. That can be a lot of time, but you only have to make one small mistake to lose the advantage.
We have to reach for data-oriented techniques like ECS if we want to keep the CPU ~idle while managing 10k+ entities in the scene. That machinery is a big pain in the ass to use if you don't actually need it.
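To make the contrast concrete, here's a rough sketch of the two approaches. It isn't tied to any particular engine or ECS library, and the names are made up:

```javascript
// Object-per-entity: every entity carries its own update() and the loop calls it
// each frame. Fine for dozens of entities; at 10k+ the per-object dispatch,
// scattered memory, and GC pressure from temporaries start eating the frame budget.
for (const entity of entities) {
  entity.update(dt);
}

// Data-oriented: the hot data lives in flat typed arrays and one tight loop walks
// them. This is the core idea behind ECS-style layouts.
const N = 10_000;
const posX = new Float32Array(N), posY = new Float32Array(N);
const velX = new Float32Array(N), velY = new Float32Array(N);

function integrate(dt) {
  for (let i = 0; i < N; i++) {
    posX[i] += velX[i] * dt;
    posY[i] += velY[i] * dt;
  }
}
```

Either way, all of the per-frame work (logic, layout, rendering, and any GC pauses) has to fit inside that ~16 ms budget.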
- "When I set a FPS cap it seems to slow my game down to a halt, and I don't really know why, so I'm just going to shrug and disable it and it seems to fix it, so I'm not going to think about it anymore"
- "My game has worse performance on Mac but a library developer says they're fixing it, so I'm just going to shrug and hope it's eventually fixed"
- "The game performs better on Firefox than Chrome/Safari, I don't know why, who knows, moving on .."
- "I have horrible performance rendering text as an entity, so instead of fixing my entity system to be performant I'm going to just modify my game loop for text rendering to be its own thing"
- "I'm going to use pooling to re-use entities and it increases my performance by ~30%" (even though they have only a dozen entities on screen at once, only allocating a max of 1 or 2 objects per frame, and adding pooling for this is somehow a massive performance win?)
What value was I supposed to take from this article?
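For reference, this is roughly what that kind of entity pool looks like. A minimal sketch with hypothetical names, not the article's actual code:

```javascript
// Minimal object pool: reuse entity objects instead of allocating fresh ones,
// trading a little bookkeeping for less garbage-collector work.
class EntityPool {
  constructor(create) {
    this.create = create;   // factory for a brand-new entity
    this.free = [];         // entities waiting to be reused
  }
  acquire() {
    return this.free.pop() ?? this.create();
  }
  release(entity) {
    entity.reset?.();       // hypothetical hook: clear per-use state if supported
    this.free.push(entity);
  }
}

// Usage: reuse sword entities instead of allocating 1-2 new objects per frame.
const swords = new EntityPool(() => ({ x: 0, y: 0, active: false }));
const s = swords.acquire();
// ... use s for a while ...
swords.release(s);
```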