That does force you to duplicate some assets a lot. It's also more important the slower your seeks are. This technique is perfect for disc media, since it has a fixed physical size (so wasting space on it is irrelevant) and slow seeks.
I'd love to see it analysed. Specifically, the average number of non-sequential jumps vs the overall size of the level. I'm sure you could avoid jumps within megabytes. But if someone ever got close to filling up the disk in the past, the chances of contiguous gigabytes are much lower. This paper effectively says that if you have long files, there are almost guaranteed to be gaps https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... so at that point, you may be better off preallocating the individual files and eating the cost of switching between them.
Nowadays? No. Even those with hard disks will have lots more RAM and thus disk cache. And you are even guaranteed SSDs on consoles. I think in general no one tries this technique anymore.
But it also depends on how the assets are organized: you can probably group the level-specific assets into a sequential section, and maybe shared assets could be somewhat grouped so related assets are sequential.
By default, Windows automatically defragments filesystems weekly if necessary. It can be configured in the "defragment and optimize drives" dialog.
https://web.archive.org/web/20100529025623/http://blogs.tech...
old article on the process
Someone installing a 150GB game surely has 150GB+ of free space, and there would be a lot of contiguous free space.
If you break it up into smaller files, those are likely to be allocated all over the disk; plus you'll have delays on reading because Windows Defender makes opening files slow. If you have a single large file that contains all resources, even if that file is mostly sequential, there will be sections that you don't need, and read-ahead caching may work against you, as it will tend to read things you don't need.
Which makes me think: Has there been any advances in disk scheduling in the last decade?
https://www.arrowheadgamestudios.com/2025/10/helldivers-2-te...
But for a mechanical drive, you'll get much better throughput on sequential reads than random reads, even with command queuing. I think earlier discussion showed it wasn't very effective in this case, and taking 6x the space for a marginal benefit for the small % of users with mechanical drives isn't worthwhile...
This does not work if you're doing tons of small IO and you want something fast.
Let's say we're on an HDD with 200 IOPS and we need to read 3000 small files scattered randomly across the drive.
Well, at minimum this is going to take 15 seconds, plus any additional seek time.
Now, let's say we zip up those files into a solid archive. You'll read it in half a second. The problem comes in when different levels each need a different 3000 files. Then you end up duplicating a bunch of stuff.
Now, where this typically falls apart for modern game assets is that they are getting very large, which tends to negate seek times by a lot.
For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
While it may not have been obvious, I have taken archiving or bundling of assets into a bigger file for granted. The obvious benefit is that the filesystem will tend to store the bundled game data contiguously on the HDD. This has nothing to do with file duplication though and is a somewhat separate topic, because it costs nothing and only has benefits.
The asynchronous file IO case for bundled files is even better, since you can just hand over the internal file offsets to the async file IO operations and get all the relevant data in parallel so your only constraint is deciding on an optimal lower bound for the block size, which is high for HDDs and low for SSDs.
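A minimal sketch of that pattern in Python, assuming a hypothetical bundle file plus an in-memory table of (offset, size) entries; os.pread takes an explicit offset (POSIX only), so a thread pool can issue the per-asset reads concurrently against a single file descriptor:

    import os
    from concurrent.futures import ThreadPoolExecutor

    # Hypothetical bundle layout: one big file plus a table of (offset, size)
    # entries per asset. Names and values here are made up for illustration.
    ASSET_TABLE = {
        "grass_albedo.dds": (0, 4 * 1024 * 1024),
        "snow_albedo.dds":  (4 * 1024 * 1024, 4 * 1024 * 1024),
        "footstep_01.wav":  (8 * 1024 * 1024, 256 * 1024),
    }

    def read_assets(bundle_path, names, workers=8):
        """Read several assets from one bundle concurrently.

        os.pread reads at an explicit offset, so multiple threads can share
        the same file descriptor without stepping on each other's seeks.
        """
        fd = os.open(bundle_path, os.O_RDONLY)
        try:
            def read_one(name):
                offset, size = ASSET_TABLE[name]
                return name, os.pread(fd, size, offset)

            with ThreadPoolExecutor(max_workers=workers) as pool:
                return dict(pool.map(read_one, names))
        finally:
            os.close(fd)

    # assets = read_assets("level_01.bundle", ["grass_albedo.dds", "footstep_01.wav"])

On an SSD the block size per request can be small; on an HDD you would keep the requests large and sorted by offset so the head still moves mostly in one direction.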
>For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.
Here's a random blog post that has benchmarks for a 2015 HDD:
https://davemateer.com/2020/04/19/Disk-performance-CrystalDi...
It shows 1.5MB/s for random 4K performance with a high queue depth, which works out to just under 400 IOPS. At queue depth 1 (so effectively synchronous), performance is around a third of that.
As the other user stated, just look up CrystalDiskMark results for both HDDs and SSDs and you'll see hard drives do about 1/3rd of a MB/s on random file IO, while the same hard drive will do 400 MB/s on a contiguous read. For things like this, reading a zip and decompressing in memory is "typically" (again, you have to test this) orders of magnitude faster.
It's a well known technique but happened to not be useful for their use case.
If the game was ~20GB instead of ~150GB almost no player with the required CPU+GPU+RAM combination would be forced to put it on a HDD instead of a SSD.
Hard drives are much, much faster than optical media - on the order of 80 seeks per second and 300 MB/s sequential versus, like, 4 seeks per second and 60 MB/s sequential (for DVD-ROM).
You still want to load sequential blocks as much as possible, but you can afford a few seeks. (Assuming a traditional engine design, no megatextures etc) you probably don't want to load each texture from a separate file, but you can certainly afford to load a block of grass textures, a block of snow textures, etc. Also, throughput is 1000x higher than a PS1 (300 kB/s), so you can presumably afford to skip parts of your sequential runs.
I still don't know, but instead found an interesting reddit post where users found and analyzed this "waste of space" three months ago.
https://www.reddit.com/r/Helldivers/comments/1mw3qcx/why_the...
PS: just found it. According to this Steam discussion it does not download the duplicate data and back then it only blew up to ~70 GB.
https://steamcommunity.com/app/553850/discussions/0/43725019...
[0] https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
I'm not an arrowhead employee, but my guess is at some point in the past, they benchmarked it, got a result, and went with it. And that's about all there is to it.
>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.
they did absolutely zero benchmarking beforehand, just went with industry hearsay, and decided to double it just in case.
the "industry hearsay" from two replies above mine is about deliberate data duplication to account for the spinning platters in HDD (which isn't entirely correct, as the team on Helldivers 2 have realized)
This has nothing to do with consoles, and only affects PC builds of the game
So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.
Maybe you're saying the hearsay was Sony exaggerating how bad hard drives are? But they didn't really do that, and the devs would already have experience with hard drives.
If the Helldivers devs were influenced by what Sony said, they must have misinterpreted it and taken away an extremely exaggerated impression of how much on-disk duplication was being used for pre-SSD game development. But Sony did actually say quite a bit of directly relevant stuff on this particular matter when introducing the PS5.
But uh if the devs didn't realize that, I blame them. It's their job to know basics like that.
And on top of any potential confusion between normal SSD and fancy SSD, a mailbox is a super tiny asset and the issue in the spiderman game is very rapidly cycling city blocks in and out of memory. That's so different from helldivers level loading.
Everything else about the PS5 SSD and storage subsystem was mere icing on the cake and/or snake oil.
Do you benchmark every single decision you make on every system on every project you work on? Do you check that redis operation is actually O(1) or do you rely on hearsay. Do you benchmark every single SQL query, every DTO, the overhead of the DI Framework, connection pooler, json serializer, log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you’re human and we have to make some baseline assumptions, and sometimes they’re wrong.
https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence
It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs. HDD/SSD IOPS are still way slower than contiguous reads when you're dealing with a massive amount of files.
At the end of the day it requires testing, which requires time, at a point when you don't have a lot of time.
>wait for the whole list to finish rather than blocking on every tiny file.
And this is the point. I can make a test that shows exactly what's going on here. Make a random file generator that generates 100,000 4k files. Now, write them to a hard drive with other data and things going on at the same time. Now, in another run of the program, have it generate 100,000 4k files and put them in a zip.
Now, read the set of 100k files from disk and at the same time read the 100k files in a zip....
One finishes in less than a second and one takes anywhere from a few seconds to a few minutes depending on your disk speeds.
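A rough sketch of that test in Python (the counts and 4 KiB size follow the description above; it is not a rigorous benchmark, and you would want to drop the OS page cache, or reboot, between writing and reading so you measure the disk rather than RAM):

    import os
    import time
    import zipfile

    N, SIZE = 100_000, 4096  # 100k files of 4 KiB each

    def generate(dir_path, zip_path):
        # Write the same random data both as loose files and into one archive.
        os.makedirs(dir_path, exist_ok=True)
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_STORED) as zf:
            for i in range(N):
                data = os.urandom(SIZE)
                with open(os.path.join(dir_path, f"{i}.bin"), "wb") as f:
                    f.write(data)
                zf.writestr(f"{i}.bin", data)

    def read_loose(dir_path):
        start = time.perf_counter()
        for i in range(N):
            with open(os.path.join(dir_path, f"{i}.bin"), "rb") as f:
                f.read()
        return time.perf_counter() - start

    def read_zip(zip_path):
        start = time.perf_counter()
        with zipfile.ZipFile(zip_path) as zf:
            for name in zf.namelist():
                zf.read(name)
        return time.perf_counter() - start

    if __name__ == "__main__":
        generate("loose_files", "bundle.zip")
        # Flush/drop OS caches (or reboot) here for a fair cold-cache comparison.
        print("loose files:", read_loose("loose_files"), "s")
        print("zip archive:", read_zip("bundle.zip"), "s")

On a cold HDD the loose-file pass is dominated by per-file opens and seeks, while the archive is one mostly sequential read of the same data.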
The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.
In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.
You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumptions that they were within the parameters of said industry practice.
I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.
You will be surprised what some people are playing games on, e.g. I know people that still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.
old stuff is common, and doubly so for a lot of the world, which ain't rich and ain't rockin new hardware
Pretending that this is an outrageous decision ignores that the data and the commonly assumed wisdom at the time were that there were still a lot of people using HDDs.
They've since rectified this particular issue, and yet there seems to be more criticism of the company after fixing it.
To be fair, the massive install size was probably the least of the problems with the game; its performance has been atrocious, and when they released for Xbox, the update that came with it broke the game entirely for me and it was unplayable for a few weeks until they released another update.
In their defense, they seem to have been listening to players and have been slowly but steadily improving things.
Playing Helldivers 2 is a social thing for me where I get together online with some close friends and family a few times a month and we play some Helldivers and have a chat. Aside from that period where I couldn't play because it was broken, it's been a pretty good experience playing it on Linux; even better since I switched from Nvidia to AMD just over a week ago.
I'm glad they reduced the install size and saved me ~130GB, and I only had to download about another 20GB to do it.
> multi minute load times
23 GB / 100 MB/s / 60 s/min ≈ 3.9 minutes
So in the worst case, when everything is loaded at once (how, on a system with < 32 GB RAM?), it takes about 4 minutes.
Considering GTA (whatever version) could sit for 15 minutes at the loading screen because nobody bothered to check why, you could really say the industry doesn't bother.
Instead they blindly did extra work and 6x'ed the storage requirement.
>we looked at industry standard values and decided to double them just in case.
it had no serious or glaring impact on their bottom line.
thus it was the right call, and if they didn't bother to fix it they'd still be rolling in $$$$
It will make them a lot of money and is thus the right call. Who cares about customers am I right? They'd still be rolling in $$$$.
They basically just made the numbers up. Wild.
As an aside, I do enjoy the modding community naming over multiple iterations of mods - "better loading" -> "better better loading" -> "best loading" -> "simplified loading" -> "x's simplified loading" -> "y's simplified loading" -> "z's better simplified loading". Where 'better' is often some undisclosed metric based on some untested assumptions.
The wife cuts the end off of the ham before putting it in the oven. The husband, unwise in the ways of cooking, asks her why she does this.
"I don't know", says the wife, "I did it because my mom did it."
So they call the mom. It turns out that her mother did it, so she did too.
The three of them call the grandma and ask "Why did you cut the end off of the ham before cooking it?"
The grandma laughs and says "I cut it off because my pan was too small!"
> The pop-culture cargo cult description, however, takes features of some cargo cults (the occasional runway) and combines this with movie scenes to yield an inaccurate and fictionalized description. It may be hard to believe that the description of cargo cults that you see on the internet is mostly wrong, but in the remainder of this article, I will explain this in detail.
FWIW, I meant it strictly in the generic vernacular sense in which I've encountered it: doing something because it has the outward form of something useful or meaningful, without understanding whether or how it works.
Given the problematic history you shared, it seems a new term is needed for this... maybe "Chesterton's Folly"? It's related to Chesterton's Fence (the principle that it's unwise to remove a fence if you don't know why it was erected). If you leave in place all "fences" you don't understand, and never take the time to determine their purpose, fences which serve no good purpose will accumulate.
For their newer instalment, Fatshark went with a large rework of the engine's bundle system, and players on HDDs are, expectedly, complaining about long loading times. That game is still large at ~80GB, but not from duplication.
[1]: https://www.reddit.com/r/Vermintide/comments/hxkh0x/comment/...
Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware. That will quickly teach them the value of fixing stuff like this and optimising the game in general.
That’s how we wound up with this game where your friends are as much of a liability as your enemies.
Pay $2000 for indie games so studios could grow up without being beholden to shareholders and we could perhaps get that "perfect" QA, etc.
It's a fucking market economy and people aren't making pong level games that can be simply tuned, you really get what you pay for.
In my last project, the gameplay team played every single day.
> Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware
How would playing on middle of the road hardware have caught this? The fix to this was to benchmark the load time on the absolute bottom end of hardware, with and without the duplicated logic. Which you'd only do once you have a suspicion that it's going to be faster if you change it...
Data sizes have continued to grow and HDD seek times haven't gotten better due to physics (even if streaming throughput probably has kept up), so the assumption isn't too bad considering history.
It's good that they actually revisited it _when they had time_, because launching a game, especially a multiplayer one, will run into a lot of breaking bugs, and this (while a big one, pun intended) is still by most classifications a lower priority issue.
I don’t know about the Xbox, but on PS4 the hard drive was definitely not fast at all
It was a fundamentally sound default that they revisited. Then they blogged about the relatively surprising difference it happened to make in their particular game. As it turns out the loading is CPU bound anyway, so while the setting is doing its job, in the context of the final game, it happens to not be the bottleneck.
There's also the movement away from HDD and disc drives in the player base to make that the case as well.
Was it a bad game? Or janky? What parts of Helldivers are "making sense" now?
You cast spells in a similar way as calling in stratagems in HD2.
The spell system was super neat. There's several different elements (fire, air, water, earth, electricity, ice, and maybe something else. It's been a while since I played). Each element can be used on its own or is combinable. Different combinations would cast different spells. Fire+water makes steam for instance. Ice + air is a focused blizzard, etc.
there’s hundreds to learn and that’s your main weapon in the game. There’s even a spell you can cast that will randomly kick someone you’re playing with out of the game.
It’s great fun with friends, but can be annoying to play sometimes. If you try it, go with kb/m. It supports controller, but is way more difficult to build the spells.
Water, Life, Arcane, Shield, Lightning, Cold, Fire, and Earth. [0] It's worth noting that, though you can combine most of the elements to form new spells (and with compounding effects, for example wetting or steaming an enemy enhances lightning damage), you cannot typically combine opposites like lightning/ground, which will instead cancel out. Killed myself many times trying to cast lightning spells while sopping wet.
In my experience, though, nobody used the element names—my friends and I just referred to them by their keybinds. QFASA, anyone?
But the sense of humor/tone between the two games is very visibly the same thing.
I just shared it because it's sort of like learning Slack started life as the internal communication tooling for the online game glitch, or that a lot of the "weird Twitter" folks started life as FYAD posters - once you know that, you can draw the lines between the two points.
I do credit their sense of humor about it though.
When an orbital precision strike reflects off the hull of a factory strider and kills your friend, or eagle one splatters a gunship, or you get ragdolled for like 150m down a huge hill and then a devastator kills you with an impassionate stomp.
Those moments elevate the game and make it so memorable and replayable. It feels like something whacky and new is around every corner. Playing on PS5 I’ve been blessed with hardly any game-breaking bugs or performance issues, but my PC friends have definitely been frustrated at times
In fact, the whole point of their games is that they are co-op games where it is easy to accidentally kill your allies in hilarious manners. It is the reason, for example, why you use complex key sequences to cast stratagems; it is intentional so that you can make a mistake and cast the wrong thing.
The dialing adds friction to tense situations, which is okay as a mechanic.
Game Freak could not finish the project, so they had to be bailed out by Nintendo with an easy-to-program game so the company could get some much needed cash (the Yoshi puzzle game on NES). Then years later, with no end to the game in sight, Game Freak had to stoop to contracting Creatures Inc. to finish the game. Since they had no cash, Creatures Inc. was paid with a portion of the Pokémon franchise.
Pokémon was a shit show of epic proportions. If it had been an SNES game it would have been canceled and Game Freak would have closed. The low development cost of Game Boy and the long life of the console made Pokémon possible.
As an example of overly realistic physics, projectile damage is affected by projectile velocity, which is affected by the velocity of the weapon (i.e. your own movement). IIRC, at some point whether you were able to destroy some target in two shots of a Quasar Cannon or three depended on whether you were walking backwards while firing, or not.
That sounds like a bug, not an intentional game design choice about the game logic, and definitely unrelated to realism vs not realism. Having either of those as goals would lead to "yeah, bullet velocity goes up when you go backwards" being an intentional mechanic.
That you do less damage if you do a certain movement sounds like fun, emergent gameplay? That's not how I understand either of those terms, but of course, every player likes different things.
Surf maps in CS are actually a good example of an engine bug leading to game designers intentionally using it to design new experiences, with the keyword being "intentional", since those map makers actually use that bug intentionally. For me that feels very different from engine bugs that don't add any mechanic and instead just make the normal game harder.
The game has semi-regular patches where they seem to fix some things and break others.
The game has a lot of hidden mechanics that aren't obvious from the tutorial, e.g. many weapons have different fire modes and fire rates, and stealth is an option in the game. The game has a decent community and people are friendly for the most part. It also has the "feature" of being able to be played for about 20-40 minutes, and then you can just put it down again for a bit and come back.
i want to play the game, like now, and i'll read the forums after i figure out that i'm missing something important
The missing information also encourages positive interactions among the community - newer players are expected to be missing lots of key information, so teaching them is a natural and encouraged element of gameplay.
I stopped playing the game awhile ago, but the tutorial always struck me as really clever.
People make much more smooth and complex experiences in old engines.
You need to know your engine as a dev and not cross its limits at the cost of user experience and then blame your tools...
The whole story about more data making load times better is utter rubbish. It's a sign of piss-poor resource management and usage. For the game they have, they should have realized a 130GB install is unacceptable. It's not like they have very elaborate environments. A lot of similar textures and structures everywhere... it's not like it's some huge unique world like The Witcher or such games...
There is an astronomical amount of information available for free on how to optimise game engines, loads of books, articles, courses.
How much money do you think they have made so far?
"Arrowhead Game Studios' revenue saw a massive surge due to Helldivers 2, reporting around $100 million in turnover and $76 million in profit for the year leading up to mid-2025, significantly increasing its valuation and attracting a 15.75% investment from Tencent"
75 million in profit but can't figure out how to optimise a game engine. get out.
If anything, it's a testament to how good a job they've done making the game.
Is that supposed to be praise?
I played it a bit after release and have 230 hours. I liked the game and it was worth my money.
I have almost 270 hours in Helldivers 2 myself. Like any multiplayer game, it can expand to fill whatever amount of time you want to dedicate to it, and there's always something new to learn or try.
> Like any multiplayer game, it can expand to fill whatever amount of time you want to dedicate to it, and there's always something new to learn or try.
Generally at this point I normally do runs where I go full gas, stun, or full fire builds.
For some reason, though, I tend to compare everything to movie theater tickets. In my head (though it's not true anymore), a movie ticket costs $8 and gives me 1 hour of entertainment. Thus anything that gives me more than 1 hour per $8 is a good deal.
$40 / 4 => $10/hr
$40 / 8 => $5/hr
Thus, I think Helldivers is a good deal for entertainment even if you only play it for under 10 hours.
It's the kind of game where some people spend thousands of hours in, well worth the $40 to them.
A fun game is a fun game.
The fact it is un-optimised can be forgiven because the game has plenty of other positives, so people like myself are willing to look past them.
I've got a few hundred hours in the game (I play for maybe an hour in the evening) and for £35 it was well worth the money.
More or less nothing is optimized these days, and game prices and budgets have gone through the roof. Compared to the other games available these days (combined with how fun the game is) I definitely give HD2 a big pass on a lot of stuff. I'm honestly skeptical of Sony's involvement being a benefit, but that's mostly due to my experience regarding their attempts to stuff a PSN ID requirement into HD2 as well as their general handling of their IPs. (Horizon Zero Dawn is not only terrible, but they seem to try to force interest with a new remake on a monthly basis.)
Not true, lots of games are optimized, but it's one of those tasks that almost no one notices when you do it great, but everyone notices when it's missing, so it's really hard to tell by just consuming ("playing") games.
> I'm honestly skeptical of Sony's involvement being a benefit
I'm not, SIE have amazing engineers, probably the best in the industry, and if you have access to those kind of resources, you use it. Meanwhile, I agree that executives at Sony sometimes have no clue, but that doesn't mean SIE helping you with development suddenly has a negative impact on you.
I don't mean this as a counter-argument -- I'm really interested. What are some good examples of very recent optimized games?
Thanks for the list!
Yeah, Unreal Engine (5 almost specifically) is another example of things that are unoptimized by default, very easy to notice, but once you work on it, it becomes invisible and it's not that people suddenly cheer, you just don't hear complaints about it.
It's also one of those platforms where there is a ton of help available from Epic if you really want it, so you can tune the defaults BEFORE you launch your game, but hardly anyone seemingly does that, and then both developers and users blame the engine, instead of blaming the people releasing the game. It's a weird affair all around :)
I'm not sure having the support of Sony is that gold-standard imprint that people think it is.
You never assume something is an optimization or needed and never do hypothetical optimizations
I can see why it would happen in this case though, gamedev is chaotic and you're often really pressed for time
Then I realized you said build systems and eh, whatever. It's not good for build systems to be bloated, but it matters a lot less than the end product being bloated.
And you seem to be complaining about the people that are dealing with these build systems themselves, not inflicting them on other people? Why don't they get to complain?
But that’s all beside the point. What I was really doing was criticizing the <waves hands wildly> HN commenters. HN posters are mostly webdevs because most modern programmers are webdevs. And while I won’t say the file bloat here wasn’t silly, I won’t stand for game dev slander from devs that commit faaaaaaaaaaaaaar greater sins.
> Download bloat is net less impactful than build time bloat imho.
Download bloat is a bad problem for people on slow connections, and there's a lot of people on slow connections. I really dislike when people don't take that into account.
And even worse if they're paying by the gigabyte in a country with bad wireless prices, that's so much money flushed down the drain.
For consoles total disk space is an even bigger constraint than download size. But file size is a factor. Call of Duty is known to be egregious. It’s a much more complex problem than most people realize. Although hopefully not as trivial a fix as Helldivers!
In any case HN has a broadly dismissive attitude towards gamedevs. It is irksome. And deeply misplaced. YMMV.
I'm pretty sure that is in fact what he meant, and that "have build systems" is a typo of "have built systems".
Wow! It looks like I do indeed know better.
I’ve got to say. I do find it somewhat unusual that despite the fact that every HN engineer has John Carmack level focus on craftsmanship, about 1/100k here produce that kind of outcome.
I don’t get it. All of you guys are good at pointing out how to do good engineering. Why don’t you make good things?
This seems great to me. Am I crazy? This feels like it should be Hacker News's bread and butter, articles about "we moved away from Kubernetes/microservices/node.js/serverless/React because we did our own investigation and found that the upsides aren't worth the downsides" tend to do really well here. How is this received so differently?
Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.
The game programming was actually just as research focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.
I mean, I agree with you that it is trendy, and seemingly easy, to shit on other people's work, and at this point it seems to be a challenge people take up upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.
Maybe 6-7 years ago I stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless, it's just noise at this point.
also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me"
Many would consider this a bare minimum rather than something worthy of praise.
The Electron debate isn't about details purism, the Electron debate is about the foundation being a pile of steaming dung.
Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.
But it should always be a prototype and eventually be shifted to native applications, because in the end, unlike Internet Explorer in its heyday, which you could trivially embed as ActiveX without it gobbling resources, if you now have ten apps consuming 1GB of RAM each just for the Electron base to run, the user runs out of memory, because it's like PHP: nothing is shared.
PWAs have the problem that for every interaction with the "real world" they need browser approval. While that is for a good reason, it also messes with the expectations of the user, and some stuff such as unrestricted access to the file system isn't available to web apps at all.
But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?
It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.
The negativity is frustration boiling over from years of a bad technical state for the game.
I do appreciate them making the right choice now though, of course.
Have you never worked in an organization that made software?
Damn near everything can be 10x as fast and using 1/10th the resources if someone bothered to take the time to find the optimizations. RARE is it that something is even in the same order of magnitude as its optimum implementation.
Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.
But what is worse, just trying to optimize software is not the same as successfully optimizing it. So time and money spent on optimization might yield no results, because there might not be any more efficiency to be gained, the person doing the work lacks the technical skill, the gains are part of a tradeoff that cannot be justified, or the person doing the work can't make the change (i.e., a 3rd party library is the problem).
The lack of technical skill is a big one, IMO. I'm personally terrible at optimizing code, but I'm pretty good at building functional software in a short amount of time. We have a person on our team who is really good at it and sometimes he'll come in after me to optimize work that I've done. But he'll spend several multiples of the time I took making it work and hammering out edge cases. Sometimes the savings is worth it.
God why can’t it just be longer development time. I’m sick of the premature fetuses of games.
The reason games are typically released as "fetuses" is because it reduces the financial risk. Much like any product, you want to get it to market as soon as is sensible in order to see if it's worth continuing to spend time and money on it.
Where do you stop? What do the 5 tech designers do while the 2 engine programmers optimise every last byte of network traffic?
> I’m sick of the premature fetuses of games.
Come on, keep this sort of crap off here. Games being janky isn't new - look at old console games and they're basically duct taped together. Go back to Half-life 1 in 1998 - the Xen world is complete and utter trash. Go back farther and you have stuff that's literally unplayable [0], or things that were so bad they literally destroyed an entire industry [1], or rendered the game uncompleteable [2].
[0] https://en.wikipedia.org/wiki/Dr._Jekyll_and_Mr._Hyde_(video... [1] https://www.theguardian.com/film/2015/jan/30/a-golden-shinin... [2] https://www.reddit.com/r/gamecollecting/comments/hv63ad/comm...
One of the highest rated games ever released without devs turning on the "make it faster" button which would have required approximately zero effort and had zero downsides.
This kind of stuff happens because the end result A vs. B doesn't make that much of a difference.
And it's very hard to have a culture of quality that doesn't get overrun by zealots who will bankrupt you while they squeeze the last 0.001% of performance out of your product before releasing. It is very hard to have a culture of quality that does the important things and doesn't do the unimportant ones.
The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.
A very fine ability for evaluating quality mixed with pragmatic choice for what and when to spend time on it is rare.
I think this is a little harsh and I’d rephrase the second half to “the people who obsess with releasing make games”.
Reverting it now though, when the game is out there on a million systems, requires significant investigation to ensure they're not making things significantly worse for anyone, plus a lot of testing to make sure it doesn't outright break stuff.
It's great they did all the work to fix it after the fact, but that doesn't justify why it was worth throwing rocks through the window in the first place (which is different than not doing optimizations).
I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.
That is an extremely disingenuous way to frame the issue.
Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.
Then they just didn't reconsider the choice until, well, now.
In fact, I would seriously reconsider even buying a game that big if I knew beforehand. When a 500GB SSD is $120 Aussie bucks, that's $37 of storage.
It took Sony's intervention to actually pull back the game into playable state once - resulting in the so called 60 day patch.
Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people pointing it out. After a long time they finally fixed it, and tried to bury it in the patch notes too.
They also have been caught straight up lying about changes, most recent one was: "Apparently we didn't touch the Coyote", where they simply buffed enemies resistance to fire, effectively nerfing the gun.
- To have their PC not reboot and BSOD (this was an issue a few months ago)
- Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
- Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
- Continue to run, even when anybody else from the team was stimming (yes, any person in the team stimming caused others to slow down)
- Actually be able to hear one of the biggest enemies in the game
- To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
- Be able to use chat, when in the vehicle (this would result in using your primary weapon)
- Be able to finish drill type mission (this bugs out a lot still)
- Not be attacked by enemies that phase through buildings
- Not be attacked by bullets passing through terrain, despite the player bullets being stopped there
are just vocal players' complaints? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to have a full OP without an issue (even mission hardlocks still happen).
About Sony - I specifically referred to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.
A lot of issues are to do with the fact that the game seems to corrupt itself. If I have issues (usually performance related), I do a steam integrity check and I have zero issues afterwards. BTW, I've had to do this on several games now, so this isn't something that is unique to HellDivers. My hardware is good BTW, I check in various utils and the drives are "ok" as far as I can tell.
> - To their PC not reboot and BSOD (was a case few months ago)
This was hyped up by a few big YouTubers. The BSODs were because their PCs were broken. One literally had a burn mark on their processor (a known issue with some board/processor combos) and the BSODs went away when they replaced their processor. This tells me that there was something wrong with their PC and any game would have caused a BSOD.
So I am extremely sceptical of any claims of BSODs because of a game. What almost is always the case is that the OS or the hardware is at issue and playing a game will trigger the issue.
If you are experiencing BSODs I would make sure your hardware and OS are actually good, because they are probably not. BTW I haven't had a BSOD in Windows for about a decade because I don't buy crap hardware.
> - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)
False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
> Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)
This happened for like about a week for some people and I personally didn't experience this.
> To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)
I've not experienced this. I've not heard anyone complain about this, and I am in like 4 different HellDivers-focused Discord servers.
> Not be attacked by enemies that faze through buildings
This can be annoying, but it happens like once in a while. It isn't the end of the world.
Generally speaking, I am too. That is unless there is kernel-level anticheat. In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle
I am sorry but that is asinine and unscientific. You should blame BSODs on what is causing them. I don't like kernel anti-cheat but I will blame the actual cause of the issues, not assign blame on things which I don't approve of.
I am a long time Linux user and many of the people complaining about BSODs on Windows had broken the OS in one way or another. Some were running weird stuff like 3rd party shell extensions that modify core DLLs, or they had installed every piece of POS shovelware/shareware crap. It isn't Microsoft's fault if you start running an unsupported configuration of the OS.
Similarly, the YouTubers that were most vocal about HellDivers problems did basically no proper investigation other than saying "look, it crashed", when it was quite clearly their broken hardware that was the issue. As previously stated, their CPU had a burn mark on one of the pins; some AM5 boards had faults that caused this IIRC. So everything indicated hardware failure being the cause of the BSOD. They still blamed the game, probably because it got them more watch time.
During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them and playing on the same difficulty and sometimes recording it via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard, which doesn't have these problems.
Well, yes. I did say something to that effect. Blaming BSODs on invasive anti-cheat out of principle is a political position, not a scientific one.
> During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them and playing on the same difficulty and sometimes recording it via OBS (just like they were). What I didn't have was a AM5 motherboard, I have and older AM4 motherboard which doesn't have these problems.
I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.
Regardless, I don't doubt that this issue with Helldivers 2 was caused by (or at least surfaced by) certain hardware, but that does not change that given such an issue, I would presume the culprit is kernel anticheat until presented strong evidence to the contrary.
When there are actual valid concerns about the anti-cheat, these will be ignored because of people that assigned blame to it when unwarranted. This is why making statements based on your ideology can be problematic.
> I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.
People were claiming it was causing hardware damage, which is extremely unlikely since Intel, AMD, and most hardware manufacturers have mechanisms which prevent this. This isn't some sort of opaque race condition.
> I would presume the culprit is kernel anticheat until presented strong evidence to the contrary.
You should know that if you make assumptions without evidence, that will often lead you astray.
I don't like kernel anti-cheat and would prefer for it not to exist, but making stupid statements based on ideology instead of evidence just makes you look silly.
I was just about to replace my GPU (a 4090 at that!); I had them 3 times a session. I sank a lot of hours into debugging that (replaced cables, switched PSUs between desktops) and just gave up. After a few weeks, lo and behold, a patch comes out and it all disappears.
A lot of people just repeat hearsay about the game
> False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.
I specifically mean the exact time, right after the pelican starts to fly. I keep seeing "<player> left" or "disconnected". Some come back and I have a habit of asking: "Crash?", they respond with "yeah"
The answer to every such claim is just: no. But it's click bait gold to the brain damage outrage YouTuber brigade.
Accidentally using a ton of resources might reveal weaknesses, but it is absolutely not any software vendor's problem that 100% load might reveal your thermal paste application sucked or that Nvidia is skimping on cable load balancing.
These guys are running an intensive game on the highest difficulty, while streaming and they probably have a bunch of browser windows and other software running background. Any weakness in the system is going to be revealed.
I had performance issues during that time and I had to restart game every 5 matches. But it takes like a minute to restart the game.
The issue is 1) actually exaggerated in the community, but not without actual substance, 2) getting disregarded exactly because of the exaggerations. It was a very real thing.
I also happen to have a multi gpu workstation that works flawlessly too
I don't know what you mean. The game literally got 84,000 negative reviews within 24 hours after Sony tried to force PSN on everyone. No bug or missing feature ever came anywhere close to this kind of negative sentiment toward the game.
If it had been more widely known beforehand that this was the cause of the game's bloat, this probably would have been better received. Still, Arrowhead deserves more credit, both for testing and breaking the norm and for making it a popular topic.
154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.
Is there a name for the solution to a problem (make size big to help when installed on HDD) in fact being the cause of the problem (game installed on HDD because big) in the first place?
I thought they all required SSDs now for "normal" gameplay.
At 200 MB/s the way hard drives usually measure it, you're able to read up to 390,625 512-byte blocks in 1 second, or to put it another way, a block that's immediately available under the head can be read in 2.56 microseconds. On the other hand, at 7200 RPM, it takes up to 8.33 milliseconds to wait for the platter to spin around and reach a random block on the same track. Even if these were the only constraints, sequentially arranging data you know you'll need to have available at the same time cuts latency by a factor of about 3000.
It's much harder to find precise information about the speed of the head arm, but it also usually takes several milliseconds to move from the innermost track to the outermost track or vice versa. In the worst case, this would double the random seek time, since the platter has to spin around again because the head wasn't in position yet. Also, since hard drives are so large nowadays, the file system allocators actually tend to avoid fragmentation upfront, leading to generally having few fragments for large files (YMMV).
So, the latency on a hard drive can be tolerable when optimized for.
You did the math for 7200 rotations per second, not 7200 rotations per minute = 120 rotations per second.
In gaming terms, you get at most one or two disk reads per frame, which effectively means everything has to be carefully prefetched well in advance of being needed. Whereas on a decade-old SATA SSD you get at least dozens of random reads per frame.
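For concreteness, here is the back-of-the-envelope behind those figures as a small Python sketch (the 200 MB/s and 7200 RPM values are the ones quoted above; real drives add head-seek time on top of the rotational latency):

    # Figures quoted in the comments above, for a hypothetical 7200 RPM drive.
    seq_bandwidth = 200e6   # bytes per second, sequential
    block = 512             # bytes
    rpm = 7200

    block_read_time = block / seq_bandwidth   # ~2.56 microseconds
    rotation_time = 60.0 / rpm                # ~8.33 ms per full revolution
    frame_60fps = 1.0 / 60                    # ~16.7 ms per frame

    print(f"{block_read_time * 1e6:.2f} us to read a block already under the head")
    print(f"{rotation_time * 1e3:.2f} ms worst-case rotational latency")
    print(f"{rotation_time / block_read_time:.0f}x penalty for a random block on the same track")
    print(f"{frame_60fps / rotation_time:.1f} worst-case random reads per 60 fps frame")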
I think War Thunder did it the best:
* Minimal client 23 GB
* Full client 64 GB
* Ultra HQ ground models 113 GB
* Ultra HQ aircraft 92 GB
* Full Ultra HQ 131 GB
For example, I will never need anything more than the full client, whereas if I want to play on a laptop, I won't really need more than the minimal client (limited textures and no interiors for planes). The fact that this isn't commonplace in every engine and game out there is crazy. There's no reason why the same approach couldn't also work for DLCs and such. And there's no reason why this couldn't be made easy in every game engine out there (e.g. LOD level 0 goes to the HQ content bundle, the lower ones go into the main package). Same for custom packages for things like HDDs and such.
https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
The whole industry could benefit from this.
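A toy sketch of that packaging idea in Python, assuming hypothetical asset records tagged with a category and an LOD level; LOD 0 goes into an optional high-quality package per category, everything else into the base package:

    from collections import defaultdict

    # Hypothetical asset list: (path, category, lod_level); LOD 0 = full quality.
    ASSETS = [
        ("tanks/t34/hull.tex",           "ground",   0),
        ("tanks/t34/hull_half.tex",      "ground",   1),
        ("planes/spitfire/cockpit.tex",  "aircraft", 0),
        ("planes/spitfire/exterior.tex", "aircraft", 2),
        ("ui/hud.tex",                   "common",   0),
    ]

    def split_into_packages(assets):
        """Base package gets the lower LODs; LOD 0 goes to optional HQ packages."""
        packages = defaultdict(list)
        for path, category, lod in assets:
            if lod == 0 and category != "common":
                packages[f"ultra_hq_{category}"].append(path)
            else:
                packages["base"].append(path)
        return dict(packages)

    for name, paths in sorted(split_into_packages(ASSETS).items()):
        print(name, paths)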
This would defeat the purpose. The goal of the duplication is to place the related data physically close, on the disk. Hard links, removing then replacing, etc, wouldn't preserve the physical spacing of the data, meaning the terrible slow read head has to physically sweep around more.
I think the sane approach would be to have an HDD/SSD switch for the file lookups, with all the references pointing to the same file for SSDs.
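A minimal sketch of such a switch, assuming a hypothetical manifest where each logical asset has one canonical location plus optional per-level duplicates that only HDD installs would ship and use:

    # Hypothetical manifest: each logical asset has one canonical location plus
    # optional per-level duplicate locations, used only on HDD installs.
    MANIFEST = {
        "rock_big_01": {
            "canonical": ("shared.bundle", 0x1000, 0x8000),
            "per_level": {
                "level_03": ("level_03.bundle", 0x200000, 0x8000),
            },
        },
    }

    def resolve(asset_id, level, on_ssd):
        """Return (bundle, offset, size) for an asset.

        On SSD every reference points at the single canonical copy; on HDD we
        prefer the duplicate stored next to the rest of the level's data.
        """
        entry = MANIFEST[asset_id]
        if not on_ssd:
            dup = entry["per_level"].get(level)
            if dup is not None:
                return dup
        return entry["canonical"]

    print(resolve("rock_big_01", "level_03", on_ssd=True))
    print(resolve("rock_big_01", "level_03", on_ssd=False))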
Also it's largely because devs/publishers honestly just don't really think about it; they've been doing it as long as optical media has been prevalent (early/mid 90s), and for the last few years devs have actually been taking a look and realizing it doesn't make as much sense as it used to. Especially if, like in this case, the majority of the time is spent on runtime generation of levels, or if they require a 2080 as minimum spec, what's the point of optimizing for one low-end component if most people running it are on high-end systems?
Hitman recently (4 years ago) did a similar massive file shrink and mentioned many of the same things.
When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of the capacity of your precious high-speed SSD for a feature that added nothing to the game.
People probably feel the same about this, why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.
Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.
It seems no one takes pride in their piracy anymore.
This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.
All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.
Even when Titanfall 2 was released in 2016, I don't think that was meaningfully the case. Audio compression formats have been tuned heavily for efficient playback.
Games also can stack many sounds, so even if the decoding cost is negligible when playing a single sound, it'll be greater if you have 32 sounds playing at once.
I'm not sure what you mean by this. Encoding latency is only relevant when you're dealing with live audio streams - there's no delay inherent to playing back a recorded sound.
> Sounds like gunshots or footsteps are typically short files anyway, so the increased memory usage isn't that painful.
Not all sound effects are short (consider e.g. loops for ambient noise!), and the aggregate file size for uncompressed audio can be substantial across an entire game.
There absolutely is. You can decompress compressed audio files when loading so they play immediately, but if you want to keep your mp3 compressed, you get a delay. Games keep the sound effects in memory uncompressed.
> Not all sound effects are short
Long ambient background noises often aren't latency sensitive and can be streamed. For most games textures are the biggest usage of space and audio isn't that significant, but every game is different. I'm just telling you why we use uncompressed audio. If there is a particular game you know of that's wasting a lot of space on large audio files, you should notify the devs.
There is a reason both Unity and Unreal use uncompressed audio or ADPCM for sound effects.
If that really bothers you then write your own on-disk compression format.
> why we use uncompressed audio
> ADPCM
... which is a compressed and lossy format.
Why? What are you trying to solve here? You're going to have a hard time making a new format that serves you better than any of the existing formats.
The most common solution for instant playback is just to store the sound uncompressed in memory. It's not a problem that needs solving for most games.
ADPCM and PCM are both pretty common. ADPCM for audio is kinda like DXT compression for textures: a very simple compression that produces files many times larger than mp3, and doesn't have good sound quality, but has the advantage that playback and seek costs virtually nothing over regular PCM. The file sizes of ADPCM are closer to PCM than mp3. I should have been clearer in my first comment that the delay is only for mp3/Vorbis and not for PCM/ADPCM.
There isn't a clean distinction between compressed and uncompressed and lossy/lossless in an absolute sense. Compression is implicitly (or explicitly) against some arbitrary choice of baseline. We normally call 16-bit PCM uncompressed and lossless but if your baseline is 32-bit floats, then it's lossy and compressed from that baseline.
Storage space. But this is the way for the same guys who duplicate 20GB seven times 'to serve better by the industry standard'.
More sane people would just pack that ADPCM/PCM in a .pk3^W sorry, in a .zip file (or any other packaging format with an LZ/7z/whatever compatible compression method) with the fastest profile and would have the best of both worlds: sane storage requirements, uncompressed in memory. As a bonus it would be loaded faster from an HDD, because a data chunk which is 10 times smaller than the uncompressed one would be loaded, surprise, 10 times faster.
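Roughly what that looks like with Python's zipfile (a sketch: deflate at its fastest level for the on-disk package, everything decompressed into memory once at load so per-playback cost stays at plain PCM; the filenames are made up):

    import zipfile

    # Pack PCM/ADPCM sound effects with the fastest deflate profile.
    def pack_sounds(archive_path, wav_paths):
        with zipfile.ZipFile(archive_path, "w",
                             compression=zipfile.ZIP_DEFLATED,
                             compresslevel=1) as zf:
            for path in wav_paths:
                zf.write(path)

    # At load time, decompress everything into memory once; from then on the
    # engine plays the raw buffers with no per-playback decode cost.
    def load_sounds(archive_path):
        with zipfile.ZipFile(archive_path) as zf:
            return {name: zf.read(name) for name in zf.namelist()}

    # pack_sounds("sfx.zip", ["gunshot.wav", "footstep.wav"])
    # sounds = load_sounds("sfx.zip")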
The uncompressed audio for latency-sensitive one-shots usually isn’t taking up the bulk of memory either.
Like exploring the 'widely accepted industry practices' and writing code to duplicate the assets, then writing the code to actually measure what it did what the 'industry practices' advertised and then ripping this out, right?
And please note that you missed the 'if it really bothers you'.
I was trying to point out that the decision to compress or not compress audio likely has nothing to do with the download size.
> Titanfall accesses Microsoft's existing cloud network, with servers spooling up on demand. When there's no demand, those same servers will service Azure's existing customers. Client-side, Titanfall presents a dedicated server experience much like any other but from the developer and publisher perspective, the financials in launching an ambitious online game change radically.
Things changed _massively_ in games between 2014 and 2017 - we went from supporting borderline embedded level of platforms with enormous HW constraints, architecture differences, and running dedicated servers like the 90's, to basically supporting fixed spec PCs, and shipping always online titles running on the cloud.
[0] https://www.digitalfoundry.net/articles/digitalfoundry-2014-...
Bullshit. This hasn't been a problem since 2003.
And nobody forbids you to actually decompress your compressed audio when you are loading the assets from the disk.
Titanfall wasn't on steam when it launched.
> It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.
The person that I replied to (not you) claimed "They said it was for performance but the theory was to make it more inconvenient for pirates to distribute."
Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for loading JSON again, and again, and again during loading and it was just a mess. Same for HD2; game engines have the ability to only pack what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.
This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.
The anger here is real. The anger here is justified. I'm sick of having to download 100gb+ simply because a studio is too lazy and just packed up everything they made into a bundle.
Reminds me of the Crack.com interview with Jonathan Clark:
Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say this is not the best way to model a brick wall.
https://web.archive.org/web/20160125143707/http://www.loonyg...
GTA5 had well over 1000 people on its team.
When making a game, once you have something playable, the next step is to figure out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine. Sometimes this is done by the art director.
When I did this, my small team took a whole sprint to make sure that assets were packed, that tilemaps were made, that audio files were present, and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship, and often you can see it if you investigate the files of any Unity or Unreal engine game. Just throw it all over the fence.
I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.
This doesn't advance accepted industry wisdom because:
1. The trade-off is very particular to the individual game. Their loading was CPU-bound rather than IO-bound so the optimization didn't make much difference for HDDs. This is already industry wisdom. The amount of duplication was also very high in their game.
2. This optimization was already on its way out as SSDs take over and none of the current gen consoles use HDDs.
I'm not mad at Arrowhead or trying to paint them negatively. Every game has many bugs and mishaps like this. I appreciate the write-up.
I've been on PS5 since launch and aside from Baldur's Gate 3, it's been the best game this gen IMO.
The negativity I see towards the game (especially on YouTube) is weird. Some of the critiques seem legit but a lot of it feels like rage bait, which appears to be a lot of YT videos around gaming lately.
Anyway, a big improvement for a great game. Seems like less of an incentive now to uninstall if you only play now and then.
iO
https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...
https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...
Just don’t get caught at the end!
I feel like writes would probably be quite painful, but game assets are essentially write-once read-forever, so not the end of the world?
As an aside, its messed up that people with expensive SSDs are unnecessarily paying this storage tax. Just feels lazy...
That being said, cartridges were fast. The move away from cartridges was a wrong turn
I've kinda given up on physical games at this point. I held on for a long time, but the experience is just so bad now. They use the cheapest, flimsiest, most fragile plastic in the cases. You don't get a nice instruction manual anymore. And honestly, keeping a micro SD card in your system that can hold a handful of games is more convenient than having to haul around a bunch of cartridges that can be lost.
I take solace in knowing that if I do still have a working Switch in 20 years and lose access to games I bought a long time ago, hopefully the hackers/pirates will have a method for me to play them again.
You've been paying attention to the wrong sources for information about NAND flash. A new Switch cartridge will have many years of reliable data retention, even just sitting on a shelf. Data retention only starts to become a concern for SSDs that have used up most of their write endurance; a Switch cartridge is mostly treated as ROM and only written to once.
I've read about people's 3DS cartridges already failing just sitting on a shelf.
https://x.com/marcdwyz/status/1999226723322261520
Cartridges were also crazy expensive. An N64 cartridge cost about $30 to manufacture with a capacity of 8MB, whereas a PS1 CD-ROM was closer to $1 in manufacturing cost, with a capacity of 700MB. That's $3.75/MB versus $0.0014/MB - over 2600x more expensive!
Without optical media most games from the late 90s & 2000s would've been impossible to make - especially once it got to the DVD era.
[0] https://news.ycombinator.com/item?id=10066338
This reminds me of the old days when I check who's using my PC memory every now and then.
But it's stored because it's possible, easy, and cheap. Unlike older games, where developers would hide unused blocks of empty data for some last-minute emergency cramming if they needed it.
they're a fantastically popular franchise with a ton of money... and did it without the optimizations.
if they never did these optimizations they'd still have a hugely popular, industry leading game
minor tweaks to weapon damage will do more to harm their bottom line compared to any backend optimization