frontpage.

France's homegrown open source online office suite

https://github.com/suitenumerique
1•nar001•50s ago•0 comments

SpaceX Delays Mars Plans to Focus on Moon

https://www.wsj.com/science/space-astronomy/spacex-delays-mars-plans-to-focus-on-moon-66d5c542
1•BostonFern•1m ago•0 comments

Jeremy Wade's Mighty Rivers

https://www.youtube.com/playlist?list=PLyOro6vMGsP_xkW6FXxsaeHUkD5e-9AUa
1•saikatsg•1m ago•0 comments

Show HN: MCP App to play backgammon with your LLM

https://github.com/sam-mfb/backgammon-mcp
1•sam256•3m ago•0 comments

AI Command and Staff–Operational Evidence and Insights from Wargaming

https://www.militarystrategymagazine.com/article/ai-command-and-staff-operational-evidence-and-in...
1•tomwphillips•3m ago•0 comments

Show HN: CCBot – Control Claude Code from Telegram via tmux

https://github.com/six-ddc/ccbot
1•sixddc•4m ago•1 comment

Ask HN: Is the CoCo 3 the best 8 bit computer ever made?

1•amichail•7m ago•0 comments

Show HN: Convert your articles into videos in one click

https://vidinie.com/
1•kositheastro•9m ago•0 comments

Red Queen's Race

https://en.wikipedia.org/wiki/Red_Queen%27s_race
2•rzk•9m ago•0 comments

The Anthropic Hive Mind

https://steve-yegge.medium.com/the-anthropic-hive-mind-d01f768f3d7b
2•gozzoo•12m ago•0 comments

A Horrible Conclusion

https://addisoncrump.info/research/a-horrible-conclusion/
1•todsacerdoti•12m ago•0 comments

I spent $10k to automate my research at OpenAI with Codex

https://twitter.com/KarelDoostrlnck/status/2019477361557926281
2•tosh•13m ago•0 comments

From Zero to Hero: A Spring Boot Deep Dive

https://jcob-sikorski.github.io/me/
1•jjcob_sikorski•14m ago•0 comments

Show HN: Solving NP-Complete Structures via Information Noise Subtraction (P=NP)

https://zenodo.org/records/18395618
1•alemonti06•19m ago•1 comment

Cook New Emojis

https://emoji.supply/kitchen/
1•vasanthv•21m ago•0 comments

Show HN: LoKey Typer – A calm typing practice app with ambient soundscapes

https://mcp-tool-shop-org.github.io/LoKey-Typer/
1•mikeyfrilot•24m ago•0 comments

Long-Sought Proof Tames Some of Math's Unruliest Equations

https://www.quantamagazine.org/long-sought-proof-tames-some-of-maths-unruliest-equations-20260206/
1•asplake•25m ago•0 comments

Hacking the last Z80 computer – FOSDEM 2026 [video]

https://fosdem.org/2026/schedule/event/FEHLHY-hacking_the_last_z80_computer_ever_made/
2•michalpleban•26m ago•0 comments

Browser-use for Node.js v0.2.0: TS AI browser automation parity with PY v0.5.11

https://github.com/webllm/browser-use
1•unadlib•27m ago•0 comments

Michael Pollan Says Humanity Is About to Undergo a Revolutionary Change

https://www.nytimes.com/2026/02/07/magazine/michael-pollan-interview.html
2•mitchbob•27m ago•1 comment

Software Engineering Is Back

https://blog.alaindichiappari.dev/p/software-engineering-is-back
2•alainrk•28m ago•1 comment

Storyship: Turn Screen Recordings into Professional Demos

https://storyship.app/
1•JohnsonZou6523•28m ago•0 comments

Reputation Scores for GitHub Accounts

https://shkspr.mobi/blog/2026/02/reputation-scores-for-github-accounts/
2•edent•32m ago•0 comments

A BSOD for All Seasons – Send Bad News via a Kernel Panic

https://bsod-fas.pages.dev/
1•keepamovin•35m ago•0 comments

Show HN: I got tired of copy-pasting between Claude windows, so I built Orcha

https://orcha.nl
1•buildingwdavid•35m ago•0 comments

Omarchy First Impressions

https://brianlovin.com/writing/omarchy-first-impressions-CEEstJk
2•tosh•40m ago•1 comment

Reinforcement Learning from Human Feedback

https://arxiv.org/abs/2504.12501
7•onurkanbkrc•41m ago•0 comments

Show HN: Versor – The "Unbending" Paradigm for Geometric Deep Learning

https://github.com/Concode0/Versor
1•concode0•42m ago•1 comment

Show HN: HypothesisHub – An open API where AI agents collaborate on medical res

https://medresearch-ai.org/hypotheses-hub/
1•panossk•45m ago•0 comments

Big Tech vs. OpenClaw

https://www.jakequist.com/thoughts/big-tech-vs-openclaw/
1•headalgorithm•48m ago•0 comments

Helldivers 2 on-disk size 85% reduction

https://store.steampowered.com/news/app/553850/view/491583942944621371
276•SergeAx•2mo ago

Comments

habbekrats•2mo ago
the state of games and development today seems wild... imagine, 131GB out of 154GB of data was not needed....
wvbdmp•2mo ago
The whole world took a wrong turn when we moved away from physical media.
breve•2mo ago
Hard drives and optical discs are the reason they duplicated the data. They duplicated the data to reduce load times.
habbekrats•2mo ago
do they even sell discs of these games?...
jsheard•1mo ago
They do, but it's irrelevant to performance nowadays since you're required to install all of the disc data to the SSD before you can play. The PS3/360 generation was the last time you could play games directly from a disc (and even then some games had an install process).
Cthulhu_•1mo ago
I believe even then it was already "most", at least for the PS3; that was the era where always-online devices became the norm, where game developers were more eager to release patches after release, etc.
tetris11•2mo ago
In terms of ownership, yes absolutely. In terms of read/write speeds to physical media, the switch to an SSD has been an unsung gamechanger.

That being said, cartridges were fast. The move away from cartridges was a wrong turn

BizarroLand•2mo ago
I hate it when you buy a physical game, insert the disc, and immediately have to download the game in order to play it, because the disc only contains a launcher and a key. Insanity of the worst kind.
crest•1mo ago
Or the launch day patch is >80% the size of the game, but I don't want to go back to game design limited by optical media access speeds.
hbn•1mo ago
Nintendo is pretty good about putting a solid 1.0 version of their games on the cartridges on release. But on the other hand, the Switch cartridges use NAND memory, which means if you aren't popping them into a system to refresh the charge every once in a while, your physical cartridge might not last as long as the servers stay online for downloading a digital purchase.

I've kinda given up on physical games at this point. I held on for a long time, but the experience is just so bad now. They use the cheapest, flimsiest, most fragile plastic in the cases. You don't get a nice instruction manual anymore. And honestly, keeping a micro SD card in your system that can hold a handful of games is more convenient than having to haul around a bunch of cartridges that can be lost.

I take solace in knowing that if I do still have a working Switch in 20 years and lose access to games I bought a long time ago, hopefully the hackers/pirates will have a method for me to play them again.

wtallis•1mo ago
> the Switch cartridges use NAND memory which means if you aren't popping them into a system to refresh the charge every once in a while, your physical cartridge might not last as long

You've been paying attention to the wrong sources for information about NAND flash. A new Switch cartridge will have many years of reliable data retention, even just sitting on a shelf. Data retention only starts to become a concern for SSDs that have used up most of their write endurance; a Switch cartridge is mostly treated as ROM and only written to once.

hbn•1mo ago
What's "many years"?

I've read about people's 3DS cartridges already failing just sitting on a shelf.

hbn•1mo ago
Speak of the devil, I just got this tweet in my feed today

https://x.com/marcdwyz/status/1999226723322261520

Dylan16807•1mo ago
Are you sure those flashes are capable of refreshing?
maccard•1mo ago
The read speed off of an 8x DVD is ~10MB/s. The cheapest 500GB SSD on Amazon has a read speed of 500MB/s. An NVMe drive is 2500MB/s. We can read an entire DVD's capacity (4.7GB) from an SSD in under 10 seconds, compared to 8 minutes.
crote•1mo ago
> That being said, cartridges were fast. The move away from cartridges was a wrong turn

Cartridges were also crazy expensive. A N64 cartridge cost about $30 to manufacture with a capacity of 8MB, whereas a PS1 CD-ROM was closer to a $1 manufacturing cost, with a capacity of 700MB. That's $3.75/MB versus $0.0014/MB - over 2600x more expensive!

Without optical media most games from the late 90s & 2000s would've been impossible to make - especially once it got to the DVD era.

Cthulhu_•1mo ago
Maybe, but I'd argue the on-board storage chips literally an inch away from the CPU / GPU of the PS5 are faster these days. But in between cartridge consoles and fast hard drive consoles there was a disk-based gap where seek times were an issue.
jayd16•1mo ago
The (de)-optimization exists, essentially, because of physical media.
maccard•1mo ago
This isn't unique to games, and it's not just "today". Go back a decade [0] and you'll find people making similar observations about one of the largest tech companies on the planet.

[0] https://news.ycombinator.com/item?id=10066338

lynnharry•1mo ago
> FB App is 114MB in size, but loading this page in Chrome will use a good 450MB, idk how they managed that.

This reminds me of the old days when I'd check what was using my PC's memory every now and then.

Cthulhu_•1mo ago
And that's just consumer apps. Having only glimpsed into the world of back-end / cloud shenanigans, there are heaps of data being generated and stored in datacenters. Useful data? Dunno, how useful are all those access logs, ever?

But it's stored because it's possible, easy, and cheap. Unlike older games, where developers would hide unused blocks of empty data for some last-minute emergency cramming if they needed it.

high_na_euv•1mo ago
It was needed. Just the trade-off wasn't worth it.
Zambyte•1mo ago
It was wanted and intentionally selected, but it wasn't needed.
Tepix•1mo ago
I'd argue it was incompetence.
red-iron-pine•1mo ago
it wasn't needed -- need means "must have"

they're a fantastically popular franchise with a ton of money... and did it without the optimizations.

if they never did these optimizations they'd still have a hugely popular, industry leading game

minor tweaks to weapon damage will do more to harm their bottom line compared to any backend optimization

andrewstuart•2mo ago
How is there so much duplication?
jy14898•2mo ago
The post stated that it was believed duplication improved loading times on computers with HDDs rather than SSDs
dontlaugh•2mo ago
Which is true. It’s an old technique going back to CD games consoles, to avoid seeks.
SergeAx•2mo ago
Is it really possible to control file locations on HDD via Windows NTFS API?
dontlaugh•2mo ago
No, not at all. But by putting every asset a level (for example) needs in the same file, you can pretty much guarantee you can read it all sequentially without additional seeks.

That does force you to duplicate some assets a lot. It's also more important the slower your seeks are. This technique is perfect for disc media, since it has a fixed physical size (so wasting space on it is irrelevant) and slow seeks.
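
A minimal Python sketch of the technique described above - packing every asset a level needs into one bundle file so it can be read back with a single sequential read, even though shared assets then get written into several bundles. The file names and manifest layout are invented for illustration:

    import os

    def write_level_bundle(bundle_path, asset_paths):
        """Concatenate all assets for one level into a single file,
        recording (name, offset, size) so they can be read back."""
        manifest = []
        with open(bundle_path, "wb") as bundle:
            for path in asset_paths:
                with open(path, "rb") as f:
                    data = f.read()
                manifest.append((os.path.basename(path), bundle.tell(), len(data)))
                bundle.write(data)
        return manifest

    # Hypothetical layout: "rock.tex" ends up duplicated on disk, but
    # each level loads with one sequential read instead of many seeks.
    # write_level_bundle("level1.bundle", ["rock.tex", "grass.tex", "bug.mesh"])
    # write_level_bundle("level2.bundle", ["rock.tex", "snow.tex", "bot.mesh"])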

viraptor•1mo ago
> by putting every asset a level (for example) needs in the same file, you can pretty much guarantee you can read it all sequentially

I'd love to see it analysed. Specifically, the average number of nonsequential jumps vs the overall size of the level. I'm sure you could avoid jumps within megabytes. But if someone ever got close to filling up the disk in the past, the chances of contiguous gigabytes are much lower. This paper effectively says that if you have long files, there are almost guaranteed to be gaps https://dfrws.org/wp-content/uploads/2021/01/2021_APAC_paper... so at that point, you may be better off preallocating the individual files and eating the cost of switching between them.

dontlaugh•1mo ago
Sure. I’ve seen people that do packaging for games measure various techniques for hard disks typical of the time, maybe a decade ago. It was definitely worth it then to duplicate some assets to avoid seeks.

Nowadays? No. Even those with hard disks will have lots more RAM and thus disk cache. And you are even guaranteed SSDs on consoles. I think in general no one tries this technique anymore.

toast0•1mo ago
From that paper, table 4, large files had an average # of fragments around 100, but a median of 4 fragments. A handful of fragments for a 1 GB level file is probably a lot less seeking than reading 1 GB of data out of a 20 GB aggregated asset database.

But it also depends on how the assets are organized, you can probably group the level specific assets into a sequential section, and maybe shared assets could be somewhat grouped so related assets are sequential.

wcoenen•1mo ago
> But if someone ever got closer to filling up the disk in the past, the chances of contiguous gigabytes are much lower.

By default, Windows automatically defragments filesystems weekly if necessary. It can be configured in the "defragment and optimize drives" dialog.

pixl97•1mo ago
Not 'full' defragmentation. Microsoft labs did a study and found that beyond 64MB slabs of contiguous files you don't gain much, so they don't care about getting gigabytes fully defragmented.

https://web.archive.org/web/20100529025623/http://blogs.tech...

old article on the process

justsomehnguy•1mo ago
> But if someone ever got closer to filling up the disk in the past, the chances of contiguous gigabytes are much lower

Someone installing a 150GB game surely does have 150GB+ of free space, and there would be a lot of contiguous free space.

jayd16•1mo ago
It's an optimistic optimization so it doesn't really matter if the large blobs get broken up. The idea is that it's still better than 100k small files.
toast0•1mo ago
Not really. But when you write a large file at once (like with an installer), you'll tend to get a good amount of sequential allocation (unless your free space is highly fragmented). If you load that large file sequentially, you benefit from drive read ahead and OS read ahead --- when the file is fragmented, the OS will issue speculative reads for the next fragment automatically and hide some of the latency.

If you break it up into smaller files, those are likely to be allocated all over the disk; plus you'll have delays on reading because windows defender makes opening files slow. If you have a single large file that contains all resources, even if that file is mostly sequential, there will be sections that you don't need, and read ahead cache may work against you, as it will tend to read things you don't need.

pjc50•1mo ago
Key word is "believed". It doesn't sound like they actually benchmarked.
wongogue•1mo ago
There is nothing to believe. Random 4K reads on an HDD are slow.
debugnik•1mo ago
I assume asset reads nowadays are much heavier than 4 kB though, especially if assets meant to be loaded together are bundled together in one file. So games now should be spending less time seeking relative to their total read size. Combined with HDD caches and parallel reads, this practice of duplicating over 100 GB across bundles is most likely a cargo cult by now.

Which makes me think: have there been any advances in disk scheduling in the last decade?

khannn•1mo ago
Who cares? I've installed every graphically intensive game on SSDs since the original OCZ Vertex was released.
teamonkey•1mo ago
Their concern was that one person in a squad loading from an HDD could slow down the level loading for the whole squad, even if the others used an SSD, so they used a very normal and time-tested optimisation technique to prevent that.
khannn•1mo ago
Their technique makes it so that the normal person with a ~base SSD of 512 GB can't reasonably install the game. Heck of a job Brownie.
teamonkey•1mo ago
Nonsense. I play it on a 512GB SSD and it’s fine.
khannn•1mo ago
It's hard for me to use a laptop with win11 and one game (BG3) installed on a 512 GB SSD.
breve•2mo ago
They duplicate files to reduce load times. Here's how Arrowhead Game Studios themselves tell it:

https://www.arrowheadgamestudios.com/2025/10/helldivers-2-te...

imtringued•2mo ago
I don't think this is the real explanation. If they gave the filesystem a list of files to fetch in parallel (async file IO), the concept of "seek time" would become almost meaningless. This optimization will make fetching from both HDDs and SSDs faster. They would be going out of their way to make their product worse for no reason.
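
A rough sketch of that batched-read idea, using a Python thread pool as a portable stand-in for true async file IO (the 32-worker queue depth is an arbitrary choice, not a recommendation):

    from concurrent.futures import ThreadPoolExecutor

    def load_assets(paths):
        """Issue many reads at once so the OS and the drive can reorder
        and merge them, instead of blocking on one file at a time."""
        def read(path):
            with open(path, "rb") as f:
                return path, f.read()
        with ThreadPoolExecutor(max_workers=32) as pool:
            return dict(pool.map(read, paths))
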
Xss3•2mo ago
If they fill your hard drive you're less likely to install other games. If you see a huge install size you're less likely to uninstall with plans to reinstall later, because that'd take a long time.
ukd1•1mo ago
Unfortunately this actually is believable. SMH.
toast0•1mo ago
Solid state drives tend to respond well to parallel reads, so it's not so clear. If you're reading one at a time, sequential access is going to be better though.

But for a mechanical drive, you'll get much better throughput on sequential reads than random reads, even with command queuing. I think earlier discussion showed it wasn't very effective in this case, and taking 6x the space for a marginal benefit for the small % of users with mechanical drives isn't worthwhile...

seg_lol•1mo ago
Every storage medium, including ram, benefits from sequential access. But it doesn't have to be super long sequential access, the seek time, or block open time just needs to be short relative to the next block read.
extraduder_ire•1mo ago
"97% of the time: premature optimization is the root of all evil."
pixl97•1mo ago
>If they gave the filesystem a list of files to fetch in parallel (async file IO)

This does not work if you're doing tons of small IO and you want something fast.

Let's say we're on an HDD with 200 IOPS and we need to read 3000 small files randomly scattered across the hard drive.

Well, at minimum this is going to take 15 seconds, plus any additional seek time.

Now, let's say we zip up those files in a solid archive. You'll read it in half a second. The problem comes in when different levels all need a different 3000 files. Then you end up duplicating a bunch of stuff.

Now, where this typically falls apart for modern game assets is they are getting very large which tends to negate seek times by a lot.

imtringued•1mo ago
I haven't found any asynchronous IOPS numbers for HDDs anywhere. The IOPS figures on the internet are just 1000ms divided by seek time, with an 8ms seek time for moving from the outer to the inner track, which is only really relevant for the synchronous file IO case.

For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.

While it may not have been obvious, I have taken archiving or bundling of assets into a bigger file for granted. The obvious benefit is that the filesystem then stores the game's files contiguously. This has nothing to do with file duplication though and is a somewhat irrelevant topic, because it costs nothing and only has benefits.

The asynchronous file IO case for bundled files is even better, since you can just hand over the internal file offsets to the async file IO operations and get all the relevant data in parallel so your only constraint is deciding on an optimal lower bound for the block size, which is high for HDDs and low for SSDs.
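
A sketch of that bundled case under the same assumptions - one big file plus a table of (offset, size) entries, read in parallel with POSIX pread so no shared file cursor is needed; the manifest shape is made up:

    import os
    from concurrent.futures import ThreadPoolExecutor

    def load_from_bundle(bundle_path, entries):
        """entries: {name: (offset, size)} for the assets a level needs."""
        # os.pread is Unix-only; Windows would need overlapped IO instead
        fd = os.open(bundle_path, os.O_RDONLY)
        try:
            with ThreadPoolExecutor(max_workers=16) as pool:
                futures = {name: pool.submit(os.pread, fd, size, offset)
                           for name, (offset, size) in entries.items()}
                return {name: fut.result() for name, fut in futures.items()}
        finally:
            os.close(fd)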

gruez•1mo ago
>I haven't found any asynchronous IOPS numbers on HDDS anywhere. The internet IOPs are just 1000ms/seek time with a 8ms seek time for moving from the outer to the inner track, which is only really relevant for the synchronous file IO case.

>For asynchronous IO you can just do inward/outward passes to amortize the seek time over multiple files.

Here's a random blog post that has benchmarks for a 2015 HDD:

https://davemateer.com/2020/04/19/Disk-performance-CrystalDi...

It shows 1.5MB/s for random 4K performance with high queue depth, which works out to just under 400 IOPS. 1 queue depth (so synchronous) performance is around a third.

pixl97•1mo ago
>I haven't found any asynchronous IOPS numbers on HDDS anywhere.

As the other user stated, just look up CrystalDiskMark results for both HDDs and SSDs and you'll see hard drives do about 1/3 of a MB/s on random file IO, while the same hard drive will do ~400MB/s on a contiguous read. For things like this, reading a zip and decompressing in memory is "typically" (again, you have to test this) orders of magnitude faster.

jayd16•1mo ago
The technique has the most impact on games running off physical disc.

It's a well known technique but happened to not be useful for their use case.

crest•1mo ago
The idea is to duplicate assets so loading a "level" is just sequential reading from the file system. It's required on optical media and can be very useful on spinning disks too. On SSDs it's insane. The logic should've been the other way around: do a speed test on start and offer to "optimise for spinning media" if the performance metrics look like it would help.

If the game was ~20GB instead of ~150GB almost no player with the required CPU+GPU+RAM combination would be forced to put it on a HDD instead of a SSD.
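
A sketch of that heuristic: probe the install drive with a few random reads at first launch and only offer the duplicated layout when the drive behaves like spinning media. The probe file, sample count, and ~2ms threshold are assumptions, not measured constants:

    import os, random, time

    def looks_like_hdd(probe_path, samples=64, block=4096):
        """Average the latency of random 4K reads; spinning media
        pays a mechanical seek per read, SSDs don't."""
        size = os.path.getsize(probe_path)  # assumes a large probe file
        fd = os.open(probe_path, os.O_RDONLY)  # os.pread is Unix-only
        try:
            t0 = time.perf_counter()
            for _ in range(samples):
                os.pread(fd, block, random.randrange(0, size - block))
            avg = (time.perf_counter() - t0) / samples
        finally:
            os.close(fd)
        return avg > 0.002  # assumed cutoff: >~2ms per read suggests an HDD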

immibis•1mo ago
This idea of one continuous block per level dates back to the PS1 days.

Hard drives are much, much faster than optical media - on the order of 80 seeks per second and 300 MB/s sequential versus, like, 4 seeks per second and 60 MB/s sequential (for DVD-ROM).

You still want to load sequential blocks as much as possible, but you can afford to have a few. (Assuming a traditional engine design, no megatextures etc) you probably want to load each texture from a separate file, but you can certainly afford to load a block of grass textures, a block of snow textures, etc. Also throughput is 1000x higher than a PS1 (300 kB/s) so you can presumably afford to skip parts of your sequential runs.

immibis•1mo ago
I meant to write that you probably DON'T want to load each texture from a separate file, but it would be fine to have them in blocks.
Calzifer•2mo ago
I was curious whether they optimized the download. Did it download the 'optimized' ~150 GB, wasting a lot of time there, or did it download the ~20 GB of unique data and duplicate it as part of the installation?

I still don't know, but I found instead an interesting reddit post where users found and analyzed this "waste of space" three months ago.

https://www.reddit.com/r/Helldivers/comments/1mw3qcx/why_the...

PS: just found it. According to this Steam discussion it does not download the duplicate data and back then it only blew up to ~70 GB.

https://steamcommunity.com/app/553850/discussions/0/43725019...

SergeAx•2mo ago
They downloaded 43 GB instead of 152 GB, according to SteamDB: https://steamdb.info/app/553850/depots/ Now it is 20 GB => 21 GB.
maccard•1mo ago
Steam breaks your content into 1MB chunks and compresses/dedupes them [0]

[0] https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
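
Roughly the dedupe half of that scheme, sketched in Python: split content into fixed 1MB chunks, key each chunk by its hash, and store every distinct chunk once (Steam's real pipeline also compresses and delta-patches; that part is omitted):

    import hashlib

    CHUNK = 1024 * 1024  # 1MB, matching the chunk size Steam documents

    def dedupe(file_paths):
        store = {}      # hash -> chunk bytes, each stored once
        manifests = {}  # path -> ordered list of chunk hashes
        for path in file_paths:
            hashes = []
            with open(path, "rb") as f:
                while chunk := f.read(CHUNK):
                    digest = hashlib.sha1(chunk).hexdigest()
                    store.setdefault(digest, chunk)
                    hashes.append(digest)
            manifests[path] = hashes
        return store, manifests

    # Assets duplicated across bundles collapse to the same chunk
    # hashes, which is how a ~150GB install can ship as a ~43GB download.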

tetris11•2mo ago
I wonder if a certain Amelie-clad repacker noticed the same reduction in their release of the same game.
squigz•1mo ago
Fitgirl and Anna (of Anna's Archive) are modern day heroes.
debugnik•1mo ago
Helldivers 2 is an always-online game, you won't find it cracked.
NullPrefix•1mo ago
such games can only get private servers
debugnik•1mo ago
Yes, but those are rarely a thing for most live service games. Unless someone is working on a reimplementation of the entire server side, there's no point in offering or downloading pirate copies.
simplyinfinity•1mo ago
There is - albeit dwindling - a community that reimplements entire backends for MMO games. Look up the ragezone forums. I grew up around Mu Online private servers. And I'm sure in time a private server for HD2 will appear if Arrowhead don't release one themselves :)
rawling•1mo ago
https://news.ycombinator.com/item?id=46134178

282 comments

HelloUsername•1mo ago
https://news.ycombinator.com/item?id=46123009
_aavaa_•1mo ago
My takeaway is that it seems like they did NO benchmarking of their own before choosing to do all that duplication. They only talk about the performance tradeoff now that they are removing it. Wild
maccard•1mo ago
I've been involved in decisions like this that seem stupid and obvious. There's a million different things that could/should be fixed, and unless you're monitoring this proactively you're unlikely to know it should be changed.

I'm not an arrowhead employee, but my guess is at some point in the past, they benchmarked it, got a result, and went with it. And that's about all there is to it.

Xelbair•1mo ago
>These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

>We now know that, contrary to most games, the majority of the loading time in HELLDIVERS 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time. We now know that this is true even for users with mechanical HDDs.

they did absolutely zero benchmarking beforehand, just went with industry hearsay, and decided to double it just in case.

creshal•1mo ago
"Industry hearsay" in this case was probably Sony telling game devs how awesome the PS5's custom SSD was gonna be, and nobody bothered to check their claims.
mary-ext•1mo ago
the industry hearsay is about concern of HDD load times tho
creshal•1mo ago
HDD load times compared to......?
mary-ext•1mo ago
are we not reading the same post and comments?

the "industry hearsay" from two replies above mine is about deliberate data duplication to account for the spinning platters in HDD (which isn't entirely correct, as the team on Helldivers 2 have realized)

maccard•1mo ago
What are you talking about?

This has nothing to do with consoles, and only affects PC builds of the game

creshal•1mo ago
HD2 started as a PlayStation exclusive, and was retargeted mid-development for simultaneous release.

So the PS5's SSD architecture was what developers were familiar with when they tried to figure out what changes would be needed to make the game work on PC.

maccard•1mo ago
I don't really understand your point. You're making a very definitive statement about how the PS5's SSD architecture is responsible for this issue - when the issue is on a totally different platform, where they have _already_ attempted (poorly, granted) to handle the different architectures.
creshal•1mo ago
No. Please try reading more carefully.
Dylan16807•1mo ago
If what they were familiar with was a good SSD, then they didn't need to do anything. I don't see how anything Sony said about their SSD would have affected things.

Maybe you're saying the hearsay was Sony exaggerating how bad hard drives are? But they didn't really do that, and the devs would already have experience with hard drives.

wtallis•1mo ago
What Sony said about their SSD was that it enabled game developers to not duplicate assets like they did for rotating storage. One specific example I recall in Sony's presentation was the assets for a mailbox used in a Spider Man game, with hundreds of copies of that mailbox duplicated on disk because the game divided Manhattan into chunks and tried to have all the assets for each chunk stored more or less contiguously.

If the Helldivers devs were influenced by what Sony said, they must have misinterpreted it and taken away an extremely exaggerated impression of how much on-disk duplication was being used for pre-SSD game development. But Sony did actually say quite a bit of directly relevant stuff on this particular matter when introducing the PS5.

Dylan16807•1mo ago
Weird, since that's a benefit of any kind of SSD at all. The stuff their fancy implementation made possible was per-frame loading, not just convenient asset streaming.

But uh if the devs didn't realize that, I blame them. It's their job to know basics like that.

creshal•1mo ago
Yeah, that's what I was trying to get at. Sony was extremely deceptive in how they marketed the PS5 to devs, and the Helldivers devs don't want to admit how completely they fell for it.
Dylan16807•1mo ago
It's incompetence if they "fell for" such basic examples being presented in the wrong context. 5% of the blame can go to Sony, I guess, if that's what happened.

And on top of any potential confusion between normal SSD and fancy SSD, a mailbox is a super tiny asset and the issue in the spiderman game is very rapidly cycling city blocks in and out of memory. That's so different from helldivers level loading.

wtallis•1mo ago
By far the most important thing about the PS5 SSD was the fact that it wasn't optional, and developers would no longer have to care about being able to run off mechanical drives. That has repercussions throughout the broader gaming industry because the game consoles are the lowest common denominator for game developers to target, and getting both Xbox and PlayStation to use SSDs was critical. From the perspective of PlayStation customers and developers, the introduction of the PS5 was the right time to talk about the benefits of SSDs generally.

Everything else about the PS5 SSD and storage subsystem was mere icing on the cake and/or snake oil.

maccard•1mo ago
Nowhere in that does it say "we did zero benchmarking and just went with hearsay". Basing things on industry data is solid - looking at the Steam hardware surveys is a good way to figure out the variety of hardware used without commissioning your own reports. Tech choices are no different.

Do you benchmark every single decision you make on every system on every project you work on? Do you check that redis operation is actually O(1) or do you rely on hearsay. Do you benchmark every single SQL query, every DTO, the overhead of the DI Framework, connection pooler, json serializer, log formatter? Do you ever rely on your own knowledge without verifying the assumptions? Of course you do - you’re human and we have to make some baseline assumptions, and sometimes they’re wrong.

pixl97•1mo ago
>they did absolutely zero benchmarking beforehand, just went with industry hearsay, a

https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence

It was a real issue in the past with hard drives and small media assets. It's still a real issue even with SSDs. HDD/SSD IOPS are still way slower than contiguous reads when you're dealing with a massive amount of files.

At the end of the day it requires testing, which requires time, at a point when you don't have a lot of it.

imtringued•1mo ago
It's not an issue with asynchronous filesystem IO. Again, async file IO should be the default for game engines. It doesn't take a genius to gather a list of assets to load and then wait for the whole list to finish rather than blocking on every tiny file.
pixl97•1mo ago
There are two different things when talking about application behavior versus disk behavior.

>wait for the whole list to finish rather than blocking on every tiny file.

And this is the point. I can make a test that shows exactly what's going on here. Make a random file generator that generates 100,000 4k files. Now write them to a hard drive with other data and things going on at the same time. Now, in another run of the program, have it generate 100,000 4k files and put them in a zip.

Now, read the set of 100k files from disk and at the same time read the 100k files in a zip....

One finishes in less than a second and one takes anywhere from a few seconds to a few minutes depending on your disk speeds.
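
A sketch of that test, using the numbers from the comment; remember to drop the OS page cache between runs and use a store-only zip, or the comparison is meaningless:

    import os, time, zipfile

    def make_corpus(directory, zip_path, n=100_000, size=4096):
        """Write n small random files both loose and into a store-only zip."""
        os.makedirs(directory, exist_ok=True)
        with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_STORED) as z:
            for i in range(n):
                data = os.urandom(size)
                with open(os.path.join(directory, f"{i}.bin"), "wb") as f:
                    f.write(data)
                z.writestr(f"{i}.bin", data)

    def time_loose(directory):
        """Read every loose file: one open + potential seek per file."""
        t0 = time.perf_counter()
        for name in os.listdir(directory):
            with open(os.path.join(directory, name), "rb") as f:
                f.read()
        return time.perf_counter() - t0

    def time_zip(zip_path):
        """Read the same bytes from one mostly-sequential archive."""
        t0 = time.perf_counter()
        with zipfile.ZipFile(zip_path) as z:
            for name in z.namelist():
                z.read(name)
        return time.perf_counter() - t0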

the_af•1mo ago
This is not a good invocation of Chesterton's Fence.

The Fence is a parable about understanding something that already exists before asking to remove it. If you cannot explain why it exists, you shouldn't ask to remove it.

In this case, it wasn't something that already existed in their game. It was something that they read, then followed (without truly understanding whether it applied to their game), and upon re-testing some time later, realized it wasn't needed and caused detrimental side-effects. So it's not Chesterton's Fence.

You could argue they followed a videogame industry practice to make a new product, which is reasonable. They just didn't question or test their assumptions that they were within the parameters of said industry practice.

I don't think it's a terrible sin, mind you. We all take shortcuts sometimes.

FieryMechanic•1mo ago
They made a decision based on existing data. This isn't as unreasonable as you are pretending, especially as PC hardware can be quite diverse.

You will be surprised what some people are playing games on, e.g. I know people that still use Windows 7 on an AMD Bulldozer rig. Atypical for sure, but not unheard of.

red-iron-pine•1mo ago
i believe it. hell, i've been in F500 companies and virtually all of them had some legacy XP / Server 2000 / ancient Solaris box in there.

old stuff is common, and doubly so for a lot of the world, which ain't rich and ain't rockin new hardware

FieryMechanic•1mo ago
My PC now is 6 years old and I have no intention of upgrading it soon. My laptop is like 8 years old and it is fine for what I use it for. My monitors are like 10-12 years old (they are early 4k monitors) and they are still good enough. I am primarily using Linux now and the machine will probably last me to 2030 if not longer.

Pretending that this is an outrageous decision ignores that the data and the commonly assumed wisdom said there were still a lot of people using HDDs.

They've since rectified this particular issue and there seems to be more criticism of the company after fixing an issue.

alias_neo•1mo ago
They admitted to testing nothing, they just [googled it].

To be fair, the massive install size was probably the least of the problems with the game; its performance has been atrocious, and when they released for Xbox, the update that came with it broke the game entirely for me and it was unplayable for a few weeks until they released another update.

In their defense, they seem to have been listening to players and have been slowly but steadily improving things.

Playing Helldivers 2 is a social thing for me where I get together online with some close friends and family a few times a month and we play some helldivers and have a chat, aside from that period where I couldn't play because it was broken, it's been a pretty good experience playing it on Linux; even better since I switched from nvidia to AMD just over a week ago.

I'm glad they reduced the install size and saved me ~130GB, and I only had to download about another 20GB to do it.

seg_lol•1mo ago
Performance profiling should be built into the engine and turned on at all times. Then this telemetry could be streamed into a system that tracks it across all builds, down to a specific scene. It should be possible to click a link on the telemetry server and start the game at that exact point.
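
A toy sketch of what such an always-on scope might look like; the event shape and the idea of a background uploader draining the queue are invented for illustration:

    import time
    from contextlib import contextmanager

    events = []  # stand-in for a queue drained by a telemetry uploader

    @contextmanager
    def profile_scope(name, **tags):
        """Time a named section and queue the result with build/scene tags."""
        t0 = time.perf_counter()
        try:
            yield
        finally:
            events.append({"scope": name,
                           "ms": (time.perf_counter() - t0) * 1e3,
                           "tags": tags})

    # with profile_scope("level_load", build="1.2.3", scene="L7"):
    #     load_level()
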
maccard•1mo ago
How would that help them diagnose a code path that wasn't ever being run (loading non duplicated assets on HDDs)?
seg_lol•1mo ago
> diagnose a code path that wasn't ever being run
esrauch•1mo ago
It's very easy to accidentally get misleading benchmarking results in 100 different ways, I wouldn't assume they did no benchmarking when they did the duplication.
Hikikomori•1mo ago
They used industry data to make the decision first, to avoid potential multi-minute load times for 10% or so of their players; it's hard to test all kinds of PC configurations. Now they have telemetry showing that it doesn't matter, because another parallel task takes about as much time anyway.
whywhywhywhy•1mo ago
Maybe it's changed a lot statistically in the last few years, but for a long time price-conscious PC gamers had the mantra of a small SSD for the OS and a large HDD for games, so I could see that being assumed to be much more normal during development.
Dylan16807•1mo ago
It's a shameful tragedy of the commons if you bloat your game 6x because you think your customers don't have enough SSD space for their active games.
justsomehnguy•1mo ago
So they prematurely optimized for the wrong case.

> multi minute load times

23GB / 100MB/s / 60 ≈ 3.9 minutes

So in the worst case, when everything is loaded at once (how, on a system with < 32GB of RAM?), it takes about 4 minutes.

Considering GTA whatever version could sit for 15 minutes at the loading screen because nobody bothered to check why - the industry could really say not to bother.

_aavaa_•1mo ago
But they did NOT know it would lead to multi-minute load time. They did not measure a baseline.

Instead they blindly did extra work and 6x'ed the storage requirement.

djmips•1mo ago
A tale as old as time. Making decisions without actually profiling before, during and after implementing.
Xelbair•1mo ago
worse, in their post they basically said:

>we looked at industry standard values and decided to double them just in case.

functionmouse•1mo ago
Some kind of evil, dark counterpart to Moore's law in the making
red-iron-pine•1mo ago
this is one of the best selling games in history, and is eminently popular across the globe.

it had no serious or glaring impact to their bottom line.

thus it was the right call, and if they didn't bother to fix it they'd still be rolling in $$$$

ycombinatrix•1mo ago
All companies should also defraud & rug pull their customers.

It will make them a lot of money and is thus the right call. Who cares about customers am I right? They'd still be rolling in $$$$.

wongarsu•1mo ago
It's pretty standard to do that duplication for games on CD/DVD because seek times are so long. It probably just got carried over as the "obviously correct" way of doing things, since HDDs are like DVDs if you squint a bit
jayd16•1mo ago
The game does ship on disc for console, no?
Teknoman117•1mo ago
The current generation of consoles can't play games directly off the disc. They have to be installed to local storage first.
Arrath•1mo ago
I had assumed the practice started to die off when installing games became dominant over streaming from the disc even on consoles. Seems I was wrong!
Hendrikto•1mo ago
> our worst case projections did not come to pass. These loading time projections were based on industry data - comparing the loading times between SSD and HDD users where data duplication was and was not used. In the worst cases, a 5x difference was reported between instances that used duplication and those that did not. We were being very conservative and doubled that projection again to account for unknown unknowns.

They basically just made the numbers up. Wild.

rjzzleep•1mo ago
On the flip side, I don't remember who did it, but basically extracting textures on disk fixed all the performance issues UE5 has on some benchmarks (sorry for being vague, but I can't find the source material right now). But their assumption is in fact a sound one.
Normal_gaussian•1mo ago
Yes. It's quite common for games to have mods that repack textures or significantly tweak the UE5 config at the moment - and it's very common to see users using them when it doesn't actually affect their use cases.

As an aside, I do enjoy the modding community's naming over multiple iterations of mods - "better loading" -> "better better loading" -> "best loading" -> "simplified loading" -> "x's simplified loading" -> "y's simplified loading" -> "z's better simplified loading". Where 'better' is often some undisclosed metric based on some untested assumptions.

fullstop•1mo ago
It's like the story of a young couple cooking their first Christmas ham.

The wife cuts the end off of the ham before putting it in the oven. The husband, unwise in the ways of cooking, asks her why she does this.

"I don't know", says the wife, "I did it because my mom did it."

So they call the mom. It turns out that her mother did it, so she did too.

The three of them call the grandma and ask "Why did you cut the end off of the ham before cooking it?"

The grandma laughs and says "I cut it off because my pan was too small!"

chrisweekly•1mo ago
Haha, cargo cult strikes again!
01HNNWZ0MV43FF•1mo ago
For today's 10,000: https://www.righto.com/2025/01/its-time-to-abandon-cargo-cul...

> The pop-culture cargo cult description, however, takes features of some cargo cults (the occasional runway) and combines this with movie scenes to yield an inaccurate and fictionalized description. It may be hard to believe that the description of cargo cults that you see on the internet is mostly wrong, but in the remainder of this article, I will explain this in detail.

chrisweekly•1mo ago
Thanks. TIL.

FWIW, I meant it strictly in the generic vernacular sense in which I've encountered it: doing something because it has the outward form of something useful or meaningful, without understanding whether or how it works.

Given the problematic history you shared, it seems a new term is needed for this... maybe "Chesterton's Folly"? It's related to Chesterton's Fence (the principle that it's unwise to remove a fence if you don't know why it was erected). If you leave in place all "fences" you don't understand, and never take the time to determine their purpose, fences which serve no good purpose will accumulate.

bombcar•1mo ago
It's the corollary to Chesterton's Fence - don't remove it until you know why it's there, but also investigate why it's there.
SirAiedail•1mo ago
Non-made-up numbers from Vermintide 2 (same engine): on PS4, when an optimized build took around 1.5 minutes to boot to the main menu, the unoptimized version would take 12-15 minutes [1]. A different benchmark than SSD vs HDD, but it shows that the optimization was certainly needed at the time. Though the PS4 was partially to blame as well, with its meagre 5400 RPM spinny drive.

For their newer instalment, Fatshark went with a large rework of the engine's bundle system, and players on HDDs are, as expected, complaining about long loading times. That game is still large at ~80GB, but not from duplication.

[1]: https://www.reddit.com/r/Vermintide/comments/hxkh0x/comment/...

Pannoniae•1mo ago
The good old "studios don't play their own games" strikes again :P

Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware. That will quickly teach them the value of fixing stuff like this and optimising the game in general.

Forgeties79•1mo ago
They could have been lying I guess, but I listened to a great podcast about the development of Helldivers 2 (I think it was Game Maker's Notebook) and one thing that was constantly brought up was that as they iterated they forced a huge chunk of the team to sit down and play it. That's how things like diving from a little bit too high ended up with you faceplanting and rag-dolling, tripping when jetpacking over a boulder that you get a little too close to, etc. They found that making it comically realistic in some areas led to more unexpected/emergent gameplay that was way more entertaining. Turrets and such not caring if you're in the line of fire was brought up, I believe.

That’s how we wound up with this game where your friends are as much of a liability as your enemies.

whizzter•1mo ago
People literally play the games they work on all the time, it's more or less what most do.

Pay $2000 for indie games so studios could grow up without being beholden to shareholders, and we could perhaps get that "perfect" QA, etc.

It's a fucking market economy and people aren't making Pong-level games that can be simply tuned; you really get what you pay for.

maccard•1mo ago
I've worked in games for close to 15 years, and at every studio I've worked at we've played the game very regularly. On my current team every person plays the game at least once a week, and more often as we get closer to builds.

In my last project, the gameplay team played every single day.

> Games would be much better if all people making them were forced to spend a few days each month playing the game on middle-of-the-road hardware

How would playing on middle of the road hardware have caught this? The fix to this was to benchmark the load time on the absolute bottom end of hardware, with and without the duplicated logic. Which you'd only do once you have a suspicion that it's going to be faster if you change it...

whizzter•1mo ago
It's a valid issue; those of us who worked back in the day on GD/DVD etc. games really ran into bad loading walls if we didn't duplicate data for straight streaming.

Data sizes have continued to grow and HDD seek times haven't gotten better due to physics (even if streaming throughput probably has kept up), so the assumption isn't too bad considering history.

It's good that they actually revisited it _when they had time_, because launching a game, especially a multiplayer one, will run into a lot of breaking bugs, and this (while a big one, pun intended) is still by most classifications a lower-priority issue.

dwroberts•1mo ago
It seems plausible to me that this strategy was a holdover from the first game, which shipped for PS4 and XBO

I don’t know about the Xbox, but on PS4 the hard drive was definitely not fast at all

jayd16•1mo ago
You can't bench your finished game before it exists and you don't really want to rock the boat late in dev, either.

It was a fundamentally sound default that they revisited. Then they blogged about the relatively surprising difference it happened to make in their particular game. As it turns out the loading is CPU bound anyway, so while the setting is doing its job, in the context of the final game it happens to not be the bottleneck.

There's also the movement away from HDD and disc drives in the player base to make that the case as well.

everdrive•1mo ago
I love Helldivers 2, but from what I can tell it's a bunch of enthusiasts using a relatively broken engine to try to do cool stuff. It almost reminds me of the first Pokémon game. I'll bet there's all sorts of stuff they get wrong from a strictly technical standpoint. I love the game so much I see this more as a charming quirk than as something which really deserves criticism. The team never really expected their game to be as popular as it's become, and I think we're still inheriting flaws from the surprise interest in the game. (Some of this plays out in the tug of war between the dev team's hopes for a realistic grunt fantasy vs. the player base's horde power fantasy.)
rincebrain•1mo ago
A lot of things suddenly made sense when I learned their prior work was Magicka.
brainzap•1mo ago
oh no
jfindper•1mo ago
I never played Magicka, but the reviews seem fine (76% GameRankings, 74/100 Metacritic, 8/10 EuroGamer, etc.)

Was it a bad game? Or janky? What parts of Helldivers are "making sense" now?

darthcircuit•1mo ago
Not op, but magicka is a pretty fun game.

You cast spells in a similar way to calling in stratagems in HD2.

The spell system was super neat. There are several different elements (fire, air, water, earth, electricity, ice, and maybe something else. It's been a while since I played). Each element can be used on its own or is combinable. Different combinations cast different spells. Fire+water makes steam, for instance. Ice + air is a focused blizzard, etc.

there’s hundreds to learn and that’s your main weapon in the game. There’s even a spell you can cast that will randomly kick someone you’re playing with out of the game.

It’s great fun with friends, but can be annoying to play sometimes. If you try it, go with kb/m. It supports controller, but is way more difficult to build the spells.

finalarbiter•1mo ago
> maybe something else

Water, Life, Arcane, Shield, Lightning, Cold, Fire, and Earth. [0] It's worth noting that, though you can combine most of the elements to form new spells (and with compounding effects, for example wetting or steaming an enemy enhances lightning damage), you cannot typically combine opposites like lightning/ground, which will instead cancel out. Killed myself many times trying to cast lightning spells while sopping wet.

In my experience, though, nobody used the element names—my friends and I just referred to them by their keybinds. QFASA, anyone?

[0] https://magicka.fandom.com/wiki/Elements

jamesgeck0•1mo ago
This is the most Helldivers 2 part for me. Spells being intentionally tricky to execute, combined with accidental element interactions and "friendly fire."
darthcircuit•1mo ago
Thank you! I haven’t played in probably more than ten years. Makes me want to fire it up for old times sake!
rincebrain•1mo ago
Oh, it's not bad or janky (...okay, no, it was in some ways janky, but on purpose), I loved that game.

But the sense of humor/tone between the two games is very visibly the same thing.

moritonal•1mo ago
Oh my, I loved that game! It's wild everyone's throwing shade at Helldivers whilst ignoring that it was a massive success because of how fun it is. I've said it before: devs are really bad at understanding the art of making fun experiences.
SpaceManNabs•1mo ago
Is that a negative? All of the "negative" things listed make me think that they are really cool and trying to learn stuff and challenge things.
rincebrain•1mo ago
It's not a negative, at all.

I just shared it because it's sort of like learning Slack started life as the internal communication tooling for the online game Glitch, or that a lot of the "weird Twitter" folks started life as FYAD posters - once you know that, you can draw the lines between the two points.

Zarathruster•1mo ago
Yeah the "Crash to Desktop" comedy spell wasn't added to the game for no good reason.

I do credit their sense of humor about it though.

delichon•1mo ago
Thank you for your service in keeping the galaxy safe for managed democracy.
chamomeal•1mo ago
The game is often broken but they’ve nailed the physics-ey feel so hard that it’s a defining feature of the game.

When an orbital precision strike reflects off the hull of a factory strider and kills your friend, or eagle one splatters a gunship, or you get ragdolled for like 150m down a huge hill and then a devastator kills you with an impassionate stomp.

Those moments elevate the game and make it so memorable and replayable. It feels like something whacky and new is around every corner. Playing on PS5 I’ve been blessed with hardly any game-breaking bugs or performance issues, but my PC friends have definitely been frustrated at times

whalesalad•1mo ago
It's such a janky game. Definitely feels like it was built using the wrong tool for the job. Movement will get stuck on the most basic things. Climbing and moving over obstacles is always a yucky feeling.
speeder•1mo ago
All other games from the same studio have the same features.

In fact, the whole point of their games is that they are coop games where it is easy to accidentally kill your allies in hilarious manners. It is the reason, for example, why you use complex key sequences to cast stratagems; it is intentional so that you can make a mistake and cast the wrong thing.

heftig•1mo ago
The only wrong thing I've been throwing is the SOS Beacon instead of a Reinforce, which is just annoying, and not just once. It makes the game public if it was friends-only and gives it priority in the quick play queue. So that can't be it.

The dialing adds friction to tense situations, which is okay as a mechanic.

throwaway902984•1mo ago
Just accidentally smashed some teammates with an eagle napalm instead of eagle smoke before I saw this.
everdrive•1mo ago
Almost certainly this occurred on a Rapid Acquisition Mission on K.
aftbit•1mo ago
It's actually a really nice spell casting system. It lets you have a ton of different spells with only 4 buttons. It rewards memorizing the most useful (like reinforce). It gives a way for things like the squid disruptor fields or whatever they're called to mess with your muscle memory while still allowing spells. It would be way less interesting if it was just using spell slots like so many other games.
rimunroe•1mo ago
I think it has the best explosions in any game I've played too. They're so dang punchy. Combined with their atmospheric effects (fog and dust and whatnot) frantic firefights with bots look fantastic.
philistine•1mo ago
You hit the nail on the head with the first Pokémon, but Helldivers 2 is an order of magnitude smaller in the amateur-to-success ratio.

Game Freak could not finish the project, so they had to be bailed out by Nintendo with an easy-to-program game so the company could get some much-needed cash (the Yoshi puzzle game on NES). Then years later, with no end to the game in sight, Game Freak had to stoop to contracting Creatures Inc. to finish the game. Since they had no cash, Creatures Inc. was paid with a portion of the Pokémon franchise.

Pokémon was a shit show of epic proportions. If it had been an SNES game it would have been canceled and Game Freak would have closed. The low development cost of Game Boy and the long life of the console made Pokémon possible.

heftig•1mo ago
The game logic is also weird. It seems like they started with an attempt at a realistic combat simulator which then had lots of unrealistic mechanics added on top in an attempt to wrangle it into an enjoyable game.

As an example for overly realistic physics, projectile damage is affected by projectile velocity, which is affected by weapon velocity. IIRC, at some point whether you were able to destroy some target in two shots of a Quasar Cannon or three shots depended on if you were walking backwards while you were firing, or not.

embedding-shape•1mo ago
> depended on if you were walking backwards while you were firing

That sounds like a bug, not an intentional game design choice about the game logic, and definitely unrelated to realism vs not realism. Having either of those as goals would lead to "yeah, bullet velocity goes up when you go backwards" being an intentional mechanic.

heftig•1mo ago
To be clear, walking backwards (away from the target) reduced your bullet velocity relative to the target, reducing the damage you were doing and leading to you needing more shots.
embedding-shape•1mo ago
And to be extra clear, either way, neither option makes me believe it was an intentional design choice.
thunderfork•1mo ago
Systems-driven gameplay is an intentional design choice all unto itself
embedding-shape•1mo ago
Sure, but that doesn't mean every consequence of that choice is suddenly intentional as well.
Cthulhu_•1mo ago
It may not be intentional, but it sounds like it's a fun, emergent gameplay mechanic. How much fun have people had with physics and silliness with Valve's Source engine, which was one of the earlier full physics games? Or going back further, "surf" maps in e.g. Unreal Tournament or CS that abused the movement physics to create a movement puzzle (which, arguably, led to some of the movement mechanics in Titanfall).
embedding-shape•1mo ago
> but it sounds like it's a fun, emergent gameplay mechanic

That you do less damage if you do a certain movement sounds like fun, emergent gameplay? That's not how I understand either of those terms, but of course, every player likes different things.

Surf maps in CS are actually a good example of an engine bug leading to game designers intentionally using it to design new experiences, with the keyword being "intentional", since those map makers actually use that bug intentionally. For me that feels very different from engine bugs that don't add any mechanic and instead just make the normal game harder.

FieryMechanic•1mo ago
A lot of people in the comments here don't seem to understand that it is a relatively small game company with an outdated engine. I am a lot more forgiving of smaller organisations when they make mistakes.

The game has semi-regular patches where they seem to fix some things and break others.

The game has a lot of hidden mechanics that aren't obvious from the tutorial, e.g. many weapons have different fire modes and fire rates, and stealth is an option in the game. The game has a decent community and people are friendly for the most part; it also has the "feature" of being able to be played for about 20-40 minutes, and you can just put it down again for a bit and come back.

heftig•1mo ago
The bad tutorial at least has some narrative justification. It's just a filter for people who are already useful as shock troops with minimal training.
FieryMechanic•1mo ago
I also think that the tutorial would be tedious if it went through too much of the mechanics. They show you the basics, the rest you pick up through trial and error.
red-iron-pine•1mo ago
aye. give me the 3 minute tutorial, not the 37 minute tutorial.

i want to play the game, like now, and i'll read the forums after i figure out that i'm missing something important

banannaise•1mo ago
Not only does the bad tutorial have an in-universe justification; the ways in which it is bad are actually significant to the worldbuilding in multiple ways.

The missing information also encourages positive interactions among the community - newer players are expected to be missing lots of key information, so teaching them is a natural and encouraged element of gameplay.

I stopped playing the game a while ago, but the tutorial always struck me as really clever.

Cthulhu_•1mo ago
The tutorial is just fine - here's a gun, here's how you shoot it, here's how you call reinforcements, now go kill some bugs!
123malware321•1mo ago
Considering it still costs $40 as a 2 year old game, I think they are way beyond the excuse of a small team on a low budget trying to make cool stuff. They have received shit tons of money and are way too late in trying to optimise the game. When it came out it ran so pisspoor I shelved it for a long time. Trying it recently, it's only marginally better. It's really poorly optimised, and blaming old tech is nonsense.

People make much more smooth and complex experiences in old engines.

You need to know your engine as a dev; don't cross its limits at the cost of user experience and then blame your tools...

The whole story about more data making load times better is utter rubbish. It's a sign of pisspoor resource management and usage. For the game they have, they should have realized a 130GB install is unacceptable. It's not like they have very elaborate environments; there are a lot of similar textures and structures everywhere. It's not some huge unique world like The Witcher or similar games...

There is an astronomical amount of information available for free on how to optimise game engines, loads of books, articles, courses.

How much money do you think they have made so far?

"Arrowhead Game Studios' revenue saw a massive surge due to Helldivers 2, reporting around $100 million in turnover and $76 million in profit for the year leading up to mid-2025, significantly increasing its valuation and attracting a 15.75% investment from Tencent"

75 million in profit but can't figure out how to optimise a game engine. get out.

shadowgovt•1mo ago
It costs $40 for a 2-year-old game because the market is bearing $40 for a 2-year-old game.

If anything, it's a testament to how good a job they've done making the game.

aftbit•1mo ago
The most recent Battlefield released at $80. Arc Raiders released at $40 with a $20 deluxe edition upgrade. I think $40 for a game like Helldivers 2 is totally fair. It's a fun game, worth at least 4 to 8 hours of playtime.
debugnik•1mo ago
> worth at least 4 to 8 hours of playtime.

Is that supposed to be praise?

entropie•1mo ago
It's also wrong. With 10 hours in Helldivers 2 you haven't seen much of the game at all.

I played it a bit after release and have 230 hours. I liked the game and it was worth my money.

aftbit•1mo ago
Yeah, I meant "at least" 4-8 hours. Even if you get bored and give up after that, you've gotten your money's worth, in my opinion.

I have almost 270 hours in Helldivers 2 myself. Like any multiplayer game, it can expand to fill whatever amount of time you want to dedicate to it, and there's always something new to learn or try.

FieryMechanic•1mo ago
I would say until you are about level 60 there are a bunch of mechanics that you won't understand.

> Like any multiplayer game, it can expand to fill whatever amount of time you want to dedicate to it, and there's always something new to learn or try.

Generally at this point I do themed runs, like full gas, full stun or full fire builds.

everdrive•1mo ago
It's a comment about cost per hour of entertainment. E.g.: if, in the general sense, you're spending $5-$10 per hour of entertainment, you're doing at least OK. I understand that a lot of books and video games can far exceed this, but it's just a general metric and a bit of a low bar to clear. (I have a LOT more hours in the game, so from my perspective my $40 has paid off quite well.)
aftbit•1mo ago
Ah sorry, I thought "at least" would carry this statement. I've played Helldivers for more than 250 hours personally.

For some reason, though, I tend to compare everything to movie theater tickets. In my head (though it's not true anymore), a movie ticket costs $8 and gives me 1 hour of entertainment. Thus anything that gives me more than 1 hour per $8 is a good deal.

$40 / 4 => $10/hr

$40 / 8 => $5/hr

Thus, I think Helldivers is a good deal for entertainment even if you only play it for under 10 hours.

debugnik•1mo ago
Thanks, I get what you meant now. I've never liked this comparison because I don't find movie tickets a particularly good deal, but that might just be my upbringing.
Cthulhu_•1mo ago
I'm not sure what that's supposed to mean - it's an online co-op game, not a story-driven one like the 4-6 hour FPS games that were the norm at one point.

It's the kind of game some people spend thousands of hours in; well worth the $40 to them.

the_af•1mo ago
What does the age of the game in years have to do with anything?

A fun game is a fun game.

FieryMechanic•1mo ago
Compared to the bigger gaming studios they are small. In fact they are not that much larger than the company I work for (not a game studio).

The fact that it is un-optimised can be forgiven because the game has plenty of other positives, so people like myself are willing to overlook it.

I've got a few hundred hours in the game (I play for maybe an hour in the evening) and for £35 it was well worth the money.

embedding-shape•1mo ago
This would make sense if it were a studio without experience and without any external help, but their publisher is Sony Interactive Entertainment, which also provides development help when needed, especially optimizations and especially for PS hardware. SIE seems to have been deeply involved with Helldivers 2, doubling the budget and doubling the total development time. Obviously it was a good choice by SIE, it paid off, and of course there are always hundreds of more important tasks to do before launching a game, but your comment reads like these sort of problems were to be expected because the team started out small and inexperienced or something.
everdrive•1mo ago
>but your comment reads like these sort of problems were to be expected because the team started out small and inexperienced or something.

More or less nothing is optimized these days, and game prices and budgets have gone through the roof. Compared to the other games available these days (combined with how fun the game is) I definitely give HD2 a big pass on a lot of stuff. I'm honestly skeptical of Sony's involvement being a benefit, but that's mostly due to my experience regarding their attempts to stuff a PSN ID requirement into HD2 as well as their general handling of their IPs. (Horizon Zero Dawn is not only terrible, but they seem to try to force interest with a new remake on a monthly basis.)

embedding-shape•1mo ago
> More or less nothing is optimized these days

Not true, lots of games are optimized, but it's one of those tasks that almost no one notices when you do it great, but everyone notices when it's missing, so it's really hard to tell by just consuming ("playing") games.

> I'm honestly skeptical of Sony's involvement being a benefit

I'm not. SIE has amazing engineers, probably the best in the industry, and if you have access to those kinds of resources, you use them. Meanwhile, I agree that executives at Sony sometimes have no clue, but that doesn't mean SIE helping you with development suddenly has a negative impact on you.

everdrive•1mo ago
>Not true, lots of games are optimized,

I don't mean this as a counter-argument -- I'm genuinely interested. What are some good examples of very recent optimized games?

embedding-shape•1mo ago
BF6 comes to mind among newly released games. Arc Raiders too; it seems to have avoided the usual heap of criticism about performance, meaning it is probably optimized well enough that people don't notice issues. Dyson Sphere Program (still in early access) is a bit older, and indie, but very well optimized.
everdrive•1mo ago
Thanks for the list -- now that you mention it, I recall being quite surprised to learn that Arc Raiders was not only an UE5 game but would also run nicely on my PC. (I haven't played it, but a friend asked me to consider it) Now that you mention it as well, I think I recall the BF6 folks talking specifically about not cramming too many graphical techniques into their games so that people could actually play the game.

Thanks for the list!

embedding-shape•1mo ago
> I recall being quite surprised to learn that Arc Raiders was not only an UE5 game but would also run nicely on my PC

Yeah, Unreal Engine (UE5 specifically) is another example of something that is unoptimized by default and very easy to notice; once you put the work in, it becomes invisible. It's not that people suddenly cheer, you just don't hear complaints about it.

It's also one of those platforms where there is a ton of help available from Epic if you really want it, so you can tune the defaults BEFORE you launch your game, but hardly anyone seemingly does that, and then both developers and users blame the engine, instead of blaming the people releasing the game. It's a weird affair all around :)

shadowgovt•1mo ago
Sony also published No Man's Sky.

I'm not sure having the support of Sony is that gold-standard imprint that people think it is.

embedding-shape•1mo ago
No Man's Sky didn't have technical issues at launch though; it ran fine for what it was. The problem with NMS was that people were told it would be a completely different experience compared to what it ended up being (at launch).
dnrvs•1mo ago
too many armchair game devs who think they know better in this thread
lordnikon001•1mo ago
I think what irks people is that the number one rule of optimization is to always measure.

You never assume something is an optimization or is needed, and you never do hypothetical optimizations.

I can see why it would happen in this case though, gamedev is chaotic and you're often really pressed for time

forrestthewoods•1mo ago
WebDevs who have build systems that take ten minutes and download tens of megabytes of JS and have hundreds of milliseconds of lag are sooooooooooooo not allowed to complain about game devs ever.
Dylan16807•1mo ago
Oh, at first I thought you were talking about websites doing that and I was going to say "sure, those people can't complain, but the rest of us can".

Then I realized you said build systems and eh, whatever. It's not good for build systems to be bloated, but it matters a lot less than the end product being bloated.

And you seem to be complaining about the people that are dealing with these build systems themselves, not inflicting them on other people? Why don't they get to complain?

forrestthewoods•1mo ago
Download bloat is net less impactful than build time bloat imho. Game download and install size bloat is bad, but it's mostly a one-time cost. Build time bloat doesn't directly impact users, but iteration time is GodKing, so bad build times indirectly hurt consumers.

But that's all beside the point. What I was really doing was criticizing the <waves hands wildly> HN commenters. HN posters are mostly webdevs, because most modern programmers are webdevs. And while I won't say the file bloat here wasn't silly, I won't stand for game dev slander from devs who commit faaaaaaaaaaaaaar greater sins.

Dylan16807•1mo ago
Web devs are not a hivemind. That kind of criticism doesn't work well at all when pointed at the entirety of the site.

> Download bloat is net less impactful than build time bloat imho.

Download bloat is a bad problem for people on slow connections, and there's a lot of people on slow connections. I really dislike when people don't take that into account.

And even worse if they're paying by the gigabyte in a country with bad wireless prices, that's so much money flushed down the drain.

forrestthewoods•1mo ago
Believe you me I wish every website worked on 2G. HN is great at least.

For consoles, total disk space is an even bigger constraint than download size. But file size is a factor. Call of Duty is known to be egregious. It's a much more complex problem than most people realize - though probably not as trivial a fix as Helldivers'!

In any case HN has a broadly dismissive attitude towards gamedevs. It is irksome. And deeply misplaced. YMMV.

bigstrat2003•1mo ago
> Oh, at first I thought you were talking about websites doing that

I'm pretty sure that is in fact what he meant, and that "have build systems" is a typo of "have built systems".

ycombinatrix•1mo ago
"Don't 6x your game's install size for no measurable benefit to users"

Wow! It looks like I do indeed know better.

renewiltord•1mo ago
Pretty cool. I think it’s completely normal to be under a crunch and just go with some standard practices under normal conditions. Cool that they went back and sorted it out afterwards!

I’ve got to say. I do find it somewhat unusual that despite the fact that every HN engineer has John Carmack level focus on craftsmanship, about 1/100k here produce that kind of outcome.

I don’t get it. All of you guys are good at pointing out how to do good engineering. Why don’t you make good things?

mort96•1mo ago
The negativity towards this is wild. A company followed relatively widely accepted industry practice (lots and lots of other games also have huge sizes on disk for the exact same reason), then eventually they decided to do their own independent testing to check whether said common practice actually makes things better or not in their case, found that it didn't, so they reversed it. In addition, they wrote up some nice technical articles on the topic, helping to change the old accepted industry wisdom.

This seems great to me. Am I crazy? This feels like it should be Hacker News's bread and butter, articles about "we moved away from Kubernetes/microservices/node.js/serverless/React because we did our own investigation and found that the upsides aren't worth the downsides" tend to do really well here. How is this received so differently?

scsh•1mo ago
It's because shitting on game devs is the trendy thing these days, even among more technically inclined crowds unfortunately. It seems like there's a general unwillingness to accept that game development is hard and you can't just wave the magic "optimize" wand at everything when your large project is also a world of edge cases. But it seems like it should be that simple according to all the armchair game devs on the internet.
taeric•1mo ago
There has long been a trend of both "software engineers" and "computer scientists" being rather uninterested in learning the strategies that game developers use.

Really, the different factions in software development are a fascinating topic to explore. Add embedded to the discussion, and you could probably start fights in ways that flat out don't make sense.

buildbot•1mo ago
The level of work that goes into even "small" games is pretty incredible. When I was a grad student, another student was doing their (thesis-based, research-focused) masters while working at EA on a Street Fighter(?) game.

The game programming was actually just as research focused and involved as the actual research. They were trying to figure out how to get the lowest latency and consistency for impact sounds.

embedding-shape•1mo ago
Meh, the same is true for almost every discussion on the internet: everyone is an armchair expert in whatever subject you come across, and when you ask them about their experience it boils down to "I read lots of Wikipedia articles".

I mean, I agree with you that it is trendy, and seemingly easy, to shit on other people's work, and at this point it seems to be a challenge people take upon themselves to criticise something in the most flowery and graphic way possible, hoping to score those sweet internet points.

Maybe 6-7 years ago I stopped reading reviews and opinions about newly launched games completely; the internet audience (and reviewers) are just so far off base compared to my own perspective and experience that it has become less than useless, just noise at this point.

AngryData•1mo ago
I wish many people's "expertise" at least amounted to reading Wikipedia. It seems for many even that is too much, and they either make crap up on the spot or latch onto the first thing they find that confirms their biases, regardless of how true it is.
jeffwask•1mo ago
For me it's not so much about shitting on game devs as it is about shitting on the ogres that run game companies. Any of us who have done development should understand we have little control over scope and often want to do more than the business allows us to.
scsh•1mo ago
That is completely OK in my opinion. It's just that most discourse I come across treats the developers as complete amateurs who don't know what they're doing. As a professional dev myself, I just can't get behind bashing the people doing the actual work when I know we're all dealing with the same business realities, regardless of industry.
red-iron-pine•1mo ago
the engineer's disease: "i'm smarter than you and I need to prove it, and we're so smart we wouldn't have shipped this code in the first place" bla bla bla

also keep in mind that modern gaming generates more revenue than the movie industry, so it's in the interests of several different parties to denigrate or undermine any competing achievement -- "Bots Rule Every Thing Around Me"

MattGaiser•1mo ago
Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

Many would consider this a bare minimum rather than something worthy of praise.

mschuster91•1mo ago
> Probably because many are purists. It is like how anything about improving Electron devolves into "you shouldn't use Electron."

The Electron debate isn't about purism over details; the Electron debate is about the foundation being a pile of steaming dung.

Electron is fine for prototyping, don't get me wrong. It's an easy and fast way to ship an application, cross-platform, with minimal effort and use (almost) all features a native app can, without things like CORS, permission popups, browser extensions or god knows what else getting in your way.

But it should always remain a prototype and eventually be shifted to a native application. Unlike Internet Explorer in its heyday, which you could trivially embed via ActiveX without resource gobbling, if you now have ten apps each consuming 1GB of RAM just for the Electron base to run, the user runs out of memory, because it's like PHP - nothing is shared.

jauntywundrkind•1mo ago
Or these devs & users can migrate to a PWA, which will have vastly less overhead because the runtime is shared, and each of those 10 apps you mention would be (or could be, if they have OK data architecture) tiny.
mschuster91•1mo ago
> Or these devs & users can migrate to a PWA

PWAs have the problem that for every interaction with the "real world" they need browser approval. While that is for a good reason, it also messes with the user's expectations, and some things, such as unrestricted access to the file system, aren't available to web apps at all.

saratogacx•1mo ago
Removing layers is hard though; better to have Electron host a WASM application, which then becomes a new "native" to argue about semantically.
zamadatix•1mo ago
Each person seems to have their own bugbear about Electron, but I really doubt improving Electron to have shared instances a la WebView2 would make much of a dent in the hate for it here.
Night_Thastus•1mo ago
It would be one thing if it was a 20% increase in space usage, or if the whole game was smaller to start with, or if they had actually checked to see how much it assisted HDD users.

But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

It's kind of exemplary of HD2's technical state in general - which is a mix of poor performance and bugs. There was a period where almost every other mission became impossible to complete because it was bugged.

The negativity is frustration boiling over from years of a bad technical state for the game.

I do appreciate them making the right choice now though, of course.

colechristensen•1mo ago
>But over 6x the size with so little benefit for such a small segment of the players is very frustrating. Why wasn't this caught earlier? Why didn't anyone test? Why didn't anyone weigh the pros and cons?

Have you never worked in an organization that made software?

Damn near everything could be 10x as fast and use 1/10th the resources if someone bothered to take the time to find the optimizations. RARE is it that something is even in the same order of magnitude as its optimum implementation.

ozgrakkurt•1mo ago
This is not a reason for accepting it imo
mywittyname•1mo ago
Optimization takes up time, and often it takes up the time of an expert.

Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.

What's worse, merely trying to optimize software is not the same as successfully optimizing it. Time and money spent on optimization might yield no results, because there might not be any more efficiency to be gained, the person doing the work lacks the technical skill, the gains are part of a tradeoff that cannot be justified, or the person doing the work can't make the change (e.g., a 3rd-party library is the problem).

The lack of technical skill is a big one, IMO. I'm personally terrible at optimizing code, but I'm pretty good at building functional software in a short amount of time. We have a person on our team who is really good at it and sometimes he'll come in after me to optimize work that I've done. But he'll spend several multiples of the time I took making it work and hammering out edge cases. Sometimes the savings is worth it.

kappaking•1mo ago
> Given that, people need to accept higher costs, longer development times, or reduced scope if they want better optimized games.

God why can’t it just be longer development time. I’m sick of the premature fetuses of games.

Cyphusx•1mo ago
The trade off they're talking about is to arrive at the same end product.

The reason games are typically released as "fetuses" is because it reduces the financial risk. Much like any product, you want to get it to market as soon as is sensible in order to see if it's worth continuing to spend time and money on it.

mort96•1mo ago
And this really shouldn't surprise professionals in an industry where everything's always about development velocity and releasing Minimum Viable Products as quickly into the market as possible.
maccard•1mo ago
> God why can’t it just be longer development time.

Where do you stop? What do the 5 tech designers do while the 2 engine programmers optimise every last byte of network traffic?

> I’m sick of the premature fetuses of games.

Come on, keep this sort of crap off here. Games being janky isn't new - look at old console games; they're basically duct-taped together. Go back to Half-Life 1 in 1998 - the Xen world is complete and utter trash. Go back farther and you have stuff that's literally unplayable [0], things that were so bad they literally destroyed an entire industry [1], or that rendered the game uncompletable [2].

[0] https://en.wikipedia.org/wiki/Dr._Jekyll_and_Mr._Hyde_(video... [1] https://www.theguardian.com/film/2015/jan/30/a-golden-shinin... [2] https://www.reddit.com/r/gamecollecting/comments/hv63ad/comm...

colechristensen•1mo ago
Super Mario 64, widely recognized as one of the most iconic and influential games ever, was released with a build that didn't have compiler optimizations turned on. This was proven by decompiling it and, with exactly the right compiler and tools, recompiling it with the non-optimized arguments. Recompiling with optimizations turned on resulted in no problems and significant performance boosts.

One of the highest rated games ever released without devs turning on the "make it faster" button which would have required approximately zero effort and had zero downsides.

This kind of stuff happens because the end result A vs. B doesn't make that much of a difference.

And it's very hard to have a culture of quality that doesn't get overrun by zealots who will bankrupt you while they squeeze the last 0.001% of performance out of your product before releasing. It is very hard to have a culture of quality that does the important things and skips the unimportant ones.

The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.

A fine-tuned ability to evaluate quality, combined with pragmatic choices about what to spend time on and when, is rare.

maccard•1mo ago
> The people who obsess with quality go bankrupt and the people who obsess with releasing make money. So that's what we get.

I think this is a little harsh; I'd rephrase the second half to "the people who obsess with releasing make games".

unusualmonkey•1mo ago
Just wait until after launch. You get a refined experience and often much lower prices.
zamadatix•1mo ago
I think what makes this a bit different from the usual "time/value tradeoff" discussion is that bloating the size by 6x-7x was the result of unnecessary work done in the name of optimization, not a lack of cycles to spend on optimization.
mort96•1mo ago
Eh probably not, it's probably handled by some automated system when making release builds of the game. Sure, implementing that initially was probably some work (or maybe it was just checking a checkbox in some tool), but there's probably not much manual work involved anymore to keep it going.

Reverting it now though, when the game is out there on a million systems, requires significant investigation to ensure they're not making things significantly worse for anyone, plus a lot of testing to make sure it doesn't outright break stuff.

zamadatix•1mo ago
Reverting it now was certainly a pile of work, but that's neither here nor there for the portion of the story bothering people. It's like they threw rocks through the windows years ago to make them slightly clearer to see through, and now put a ton of work into undoing that because they discovered it made no sense in reality.

It's great they did all the work to fix it after the fact, but that doesn't justify throwing rocks through the window in the first place (which is different from not doing optimizations).

thaumasiotes•1mo ago
But this isn't an optimization. The 150+GB size is the "optimization", one that never actually helped with anything. The whole news here is "Helldivers 2 stopped intentionally screwing its customers".

I don't see why it's a surprise that people react "negatively", in the sense of being mad that (a) Helldivers 2 was intentionally screwing the customers before, and (b) everyone else is still doing it.

bigstrat2003•1mo ago
> The whole news here is "Helldivers 2 stopped intentionally screwing its customers".

That is an extremely disingenuous way to frame the issue.

thaumasiotes•1mo ago
How so?
teamonkey•1mo ago
It was a choice, not an oversight. They actively optimised for HDD users, because they believed that failing to do so could impact load times for both SSD and HDD users. There was no speed penalty in doing so for SSD users, just a disk usage penalty.

Helldivers II was also much smaller at launch than it is now. It was almost certainly a good choice at launch.

mort96•1mo ago
You make a million decisions in the beginning of every project. I'm certain they made the choice to do this "optimization" at an early point (or even incidentally copied the choice over from an earlier project) at a stage where the disk footprint was small (a game being 7GB when it could've been 1GB doesn't exactly set off alarm bells).

Then they just didn't reconsider the choice until, well, now.

teamonkey•1mo ago
Even at the end of development it's a sensible choice. It's the default strategy for catering to machines with slow disk access. Some players experiencing slow load times at launch would be catastrophic. In the absence of solid user data, it's a fine assumption to make.
XorNot•1mo ago
The thing is, the first impression matters. This was John Carmack's idea for how to sell interlacing to smartphone display makers for VR: the upsell was that there's one very important moment when a consumer sees a new phone - they pick it up, open something and flick it, and that scroll effect had better be a silky smooth 60 FPS or more, or there's trouble. (His argument was that making that better would be a side effect of what he really wanted.)
brokenmachine•1mo ago
Call me a dinosaur, but I don't consider a 154GB download before I can start playing a good first impression.

In fact, I would seriously reconsider even buying a game that big if I knew beforehand. When a 500GB SSD is $120 Aussie bucks, that's $37 of storage.

eurekin•1mo ago
The negativity wasn't created in a vacuum. Arrowhead has a long track record of technical mishaps and a proven history of erasing all evidence of those issues without ever acknowledging them. The Reddit, Discord and YouTube comment sections are heavily moderated. I suspect there might be a 3rd party involved that doesn't forward any technical issues if the complaint contains any sign of frustration. Even the relationship with their so-called "Propaganda Commanders" (the official moniker for their YouTube partner channels) has been significantly strained in two cases, over trivialities.

It took Sony's intervention to pull the game back into a playable state once, resulting in the so-called 60-day patch.

Somehow random modders were able to fix some of the most egregiously ignored issues (like an enemy type making no sound) quickly and effectively. Arrowhead ignored it, then denied it, then used the "gamers bad" tactic and banned people pointing it out. After a long time they finally fixed it, and tried to bury it in the patch notes too.

They have also been caught straight-up lying about changes; the most recent was "Apparently we didn't touch the Coyote", where they simply buffed enemies' resistance to fire, effectively nerfing the gun.

sigmoid10•1mo ago
Sony nearly killed all the goodwill the game had accrued when they tried to use the massive player base as an opportunity to force people into their worthless ecosystem. I don't think Sony even has the capability to make good technical decisions here; they are just the publisher. It was always Arrowhead trying to keep up with a massive success that they clearly weren't prepared for at all. In the beginning they simply listened to some very vocal players' complaints, which turned out not to be what the majority actually wanted. Player-driven development is hardly ever good for a game.
eurekin•1mo ago
So, players wanting:

- Their PC not to reboot and BSOD (which was a thing a few months ago)

- Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

- Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)

- Continue to run, even when anybody else from the team was stimming (yes, any person in the team stimming caused others to slow down)

- Actually be able to hear one of the biggest enemies in the game

- To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)

- Be able to use chat, when in the vehicle (this would result in using your primary weapon)

- Be able to finish drill-type missions (these still bug out a lot)

- Not be attacked by enemies that phase through buildings

- Not be attacked by bullets passing through terrain, despite the player bullets being stopped there

are just vocal players' complaints? A lot of those bugs went totally unaddressed for months. Some keep coming back in regressions. Some are still ongoing. This is only a short list of things I came across while casually playing. It's a rare sight to finish a full OP without an issue (even mission hardlocks still happen).

About Sony - I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.

FieryMechanic•1mo ago
As someone with 700 hours in the game, I've played the game both on Windows and Linux.

A lot of issues are due to the fact that the game seems to corrupt itself. If I have issues (usually performance-related), I do a Steam integrity check and have zero issues afterwards. BTW, I've had to do this on several games now, so this isn't unique to Helldivers. My hardware is good, BTW; I check with various utils and the drives are "ok" as far as I can tell.

> - Their PC not to reboot and BSOD (which was a thing a few months ago)

This was hyped up by a few big YouTubers. The BSODs happened because their PCs were broken. One literally had a burn mark on their processor (a known issue with some board/processor combos), and the BSODs went away when they replaced the processor. This tells me that there was something wrong with their PC and any game would have caused a BSOD.

So I am extremely sceptical of any claims of BSODs because of a game. What is almost always the case is that the OS or the hardware is at fault, and playing a game triggers the issue.

If you are experiencing BSODs, I would make sure your hardware and OS are actually good, because they probably aren't. BTW, I haven't had a BSOD in Windows for about a decade, because I don't buy crap hardware.

> - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.

> Be able to use weapon customisation (the game crashed, when you navigated to the page with custom paints)

This happened for about a week for some people, and I personally didn't experience it.

> To not issue stim/reload/weapon change multiple times, for them just to work (it's still normal to press stim 6 times in some cases, before it activates, without any real reason)

I've not experienced this, nor heard anyone complain about it, and I am in like 4 different Helldivers-focused Discord servers.

> Not be attacked by enemies that phase through buildings

This can be annoying, but it happens like once in a while. It isn't the end of the world.

gfaster•1mo ago
> So I am extremely sceptical of any claims of BSODs because of a game.

Generally speaking, I am too. That is unless there is kernel-level anticheat. In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle

eurekin•1mo ago
I had them and I keep observing this strange tendency to wipe that particular issue out of existence
FieryMechanic•1mo ago
> In that case I believe it's fair to disregard all other epistemological processes and blame BSODs on the game out of principle

I am sorry, but that is asinine and unscientific. You should blame BSODs on whatever is causing them. I don't like kernel anti-cheat, but I will blame the actual cause of the issues, not assign blame to things I don't approve of.

I am a long-time Linux user, and many of the people complaining about BSODs on Windows had broken their OS in one way or another. Some were running weird stuff like 3rd-party shell extensions that modify core DLLs, or had installed every piece of shovelware/shareware crap out there. It isn't Microsoft's fault if you run an unsupported configuration of the OS.

Similarly, the YouTubers who were most vocal about Helldivers problems did basically no proper investigation other than saying "look, it crashed", when it was quite clearly their broken hardware that was the issue. As previously stated, their CPU had a burn mark on one of the pins; some AM5 boards had faults that caused this, IIRC. So everything indicated hardware failure as the cause of the BSOD. They still blamed the game, probably because it got them more watch time.

During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them, playing on the same difficulty, and sometimes recording via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard, which doesn't have these problems.

gfaster•1mo ago
> that is asinine and unscientific

Well, yes. I did say something to that effect. Blaming BSODs on invasive anti-cheat out of principle is a political position, not a scientific one.

> During the same time period when people were complaining about BSODs, I didn't experience one. I was running the same build of the game as them, playing on the same difficulty, and sometimes recording via OBS (just like they were). What I didn't have was an AM5 motherboard; I have an older AM4 motherboard, which doesn't have these problems.

I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.

Regardless, I don't doubt that this issue with Helldivers 2 was caused by (or at least surfaced by) certain hardware, but that does not change that, given such an issue, I would presume the culprit is kernel anticheat until presented with strong evidence to the contrary.

FieryMechanic•1mo ago
> Well, yes. I did say something to that effect. Blaming BSODs on invasive anti-cheat out of principle is a political position, not a scientific one.

When there are actual valid concerns about the anti-cheat, they will be ignored because of the people who assigned blame to it when it was unwarranted. This is why making statements based on your ideology can be problematic.

> I understand what you're saying here, but anyone who does a substantial amount of systems programming could tell you that hardware-dependent behavior is evidence for a hardware problem, but does not necessarily rule out a software bug that only manifests on certain hardware. For example, newer hardware could expose a data race because one path is much faster. Alternatively, a subroutine implemented with new instructions could be incorrect.

People were claiming it was causing hardware damage, which is extremely unlikely since Intel, AMD and most hardware manufacturers have mechanisms that prevent this. This isn't some sort of opaque race condition.

> I would presume the culprit is kernel anticheat until presented with strong evidence to the contrary

You should know that making assumptions without evidence will often lead you astray.

I don't like kernel anti-cheat and would prefer it didn't exist, but making stupid statements based on ideology instead of evidence just makes you look silly.

eurekin•1mo ago
> - Their PC not to reboot and BSOD (which was a thing a few months ago)

I was just about to replace my GPU (a 4090 at that!); I was getting them 3 times a session. I sank a lot of hours into debugging it (replaced cables, switched PSUs between desktops) and just gave up. A few weeks later, lo and behold, a patch came out and it all disappeared.

A lot of people just repeat hearsay about the game

eurekin•1mo ago
> > - Be able to actually finish a mission (game still crashes a lot just after extraction, it's still rare for the full team to survive 3 missions in a row)

> False. A few months ago I played it for an entire day and the game was fine. Last week I played it a good portion of Saturday night. I'm in several large HellDivers focused Discord servers and I've not heard a lot of people complaining about it. Maybe 6 months ago or a year ago this was the case, but not now.

I specifically mean that exact moment, right after the Pelican starts to fly. I keep seeing "<player> left" or "disconnected". Some come back, and I have a habit of asking "Crash?"; they respond with "yeah".

FieryMechanic•1mo ago
If that is happening, they need to do a Steam Integrity check. I understand the game is buggy, but it isn't that buggy.
XorNot•1mo ago
It's basically an Internet fable at this point that there's "a game that physically damages your hardware".

The answer to every such claim is just: no. But it's clickbait gold for the brain-damaged outrage YouTuber brigade.

Accidentally using a ton of resources might reveal weaknesses, but it is absolutely not any software vendor's problem that 100% load might reveal that your thermal paste application sucked or that Nvidia is skimping on cable load balancing.

FieryMechanic•1mo ago
This was pretty much my take as well. I have an older CPU, motherboard and GPU combo from before the newer GPU power cables that obviously weren't tested properly, and I have no problems with stability.

These guys are running an intensive game on the highest difficulty while streaming, and they probably have a bunch of browser windows and other software running in the background. Any weakness in the system is going to be revealed.

I had performance issues during that time and had to restart the game every 5 matches. But it takes like a minute to restart the game.

eurekin•1mo ago
Trust me, I'm a software developer with more than two decades of experience. Have been dabbling in hardware since the Amiga 500 era. "I have that specific set of skills" that allows me to narrow down a class of issues pretty well - just a lot of component switching in a binary divide and conquer fashion across hardware.

The issue is 1) actually exaggerated in the community, but not without substance, and 2) getting disregarded exactly because of the exaggerations. It was a very real thing.

I also happen to have a multi gpu workstation that works flawlessly too

sigmoid10•1mo ago
>I was specifically referring to Shams Jorjani's (CEO of Arrowhead) explanation to Hermen Hulst (the head of PlayStation Studios) of why the review score collapsed to 19%, among other issues.

I don't know what you mean. The game literally got 84,000 negative reviews within 24 hours after Sony tried to force PSN on everyone. No bug or missing feature ever came anywhere close to this kind of negative sentiment toward the game.

zamadatix•1mo ago
Arrowhead probably deserves more love for breaking the norm, but I think it's overshadowed by people finding out for the first time that the reason HDDs are so common in gaming setups is that companies have been blindly shaving a few seconds off HDD load times at the cost of 7x the disk space.

If it had been better known that this was the cause of game bloat, this probably would have been better received. Still, Arrowhead deserves credit both for testing and breaking the norm, and for making it a popular topic.

abtinf•1mo ago
Part of what makes this outrageous is that the install size itself is probably a significant part of the reason to install the game on an HDD.

154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

Is there a name for when the solution to a problem (making the install big to help when it's on an HDD) is in fact the cause of the problem (the game being installed on an HDD because it's big) in the first place?

consp•1mo ago
Can any games these days be reliably run on HDDs with at most 200MB/s throughput (at best)? Or does everyone get a coffee and some cookies when a new zone loads? Even with this reduction, that will take a while.

I thought they all required SSDs now for "normal" gameplay.

kbolino•1mo ago
Until you get to super-high-res textures and the like, the throughput isn't nearly as important as the latency.

At 200 MB/s the way hard drives usually measure it, you're able to read up to 390,625 512-byte blocks in 1 second, or to put it another way, a block that's immediately available under the head can be read in 2.56 microseconds. On the other hand, at 7200 RPM, it takes up to 8.33 milliseconds to wait for the platter to spin around and reach a random block on the same track. Even if these were the only constraints, sequentially arranging data you know you'll need to have available at the same time cuts latency by a factor of about 3000.

It's much harder to find precise information about the speed of the head arm, but it also usually takes several milliseconds to move from the innermost track to the outermost track or vice versa. In the worst case, this would double the random seek time, since the platter has to spin around again because the head wasn't in position yet. Also, since hard drives are so large nowadays, the file system allocators actually tend to avoid fragmentation upfront, leading to generally having few fragments for large files (YMMV).

So, the latency on a hard drive can be tolerable when optimized for.
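
To sanity-check those numbers, here is a rough back-of-the-envelope sketch in Python (it ignores seek time, caching and command queueing, and assumes the same figures as above):

  # HDD latency, roughly: a block already under the head vs. a full rotation
  THROUGHPUT = 200e6   # bytes/s, sequential
  BLOCK = 512          # bytes per block
  RPM = 7200

  block_read_s = BLOCK / THROUGHPUT   # block already under the head
  rotation_s = 60 / RPM               # worst case: wait a full rotation

  print(f"block under head: {block_read_s * 1e6:.2f} us")       # ~2.56 us
  print(f"full rotation:    {rotation_s * 1e3:.2f} ms")         # ~8.33 ms
  print(f"ratio:            ~{rotation_s / block_read_s:.0f}x") # ~3255x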

wtallis•1mo ago
> On the other hand, at 7200 RPM, it takes up to 138 microseconds to wait for the platter to spin around and reach a random block on the same track.

You did the math for 7200 rotations per second, not 7200 rotations per minute = 120 rotations per second.

In gaming terms, you get at most one or two disk reads per frame, which effectively means everything has to be carefully prefetched well in advance of being needed. Whereas on a decade-old SATA SSD you get at least dozens of random reads per frame.

kbolino•1mo ago
Fixed!
jayd16•1mo ago
"Self fulfilling prophecy" perhaps?
KronisLV•1mo ago
> 154GB vs 23GB can trivially make the difference of whether the game can be installed on a nice NVMe drive.

I think War Thunder did it the best:

  * Minimal client 23 GB
  * Full client 64 GB
  * Ultra HQ ground models 113 GB
  * Ultra HQ aircraft 92 GB
  * Full Ultra HQ 131 GB
For example, I will never need anything more than the full client, whereas if I want to play on a laptop, I won't really need more than the minimal client (limited textures and no interiors for planes).

The fact that this isn't commonplace in every engine and game out there is crazy. There's no reason the same approach couldn't also work for DLC and such, and no reason it couldn't be made easy in every game engine (e.g. LOD level 0 goes into an HQ content bundle, the lower levels go into the main package). Same for custom packages for HDDs and such.
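
As a toy sketch of how a bundle manifest could model this (the bundle names and incremental sizes below are my own rough decomposition, not Gaijin's actual packaging; the real tiers likely share data between them):

  # Tiered install bundles: expand dependencies, total the install size
  BUNDLES = {
      "minimal":    {"size_gb": 23, "requires": []},
      "full":       {"size_gb": 41, "requires": ["minimal"]},
      "uhq_ground": {"size_gb": 49, "requires": ["full"]},
      "uhq_air":    {"size_gb": 28, "requires": ["full"]},
  }

  def install_plan(selected: list[str]) -> tuple[list[str], int]:
      """Expand bundle dependencies and total the download size."""
      ordered: list[str] = []
      def visit(name: str) -> None:
          for dep in BUNDLES[name]["requires"]:
              visit(dep)
          if name not in ordered:
              ordered.append(name)
      for name in selected:
          visit(name)
      return ordered, sum(BUNDLES[b]["size_gb"] for b in ordered)

  print(install_plan(["minimal"]))     # (['minimal'], 23)
  print(install_plan(["uhq_ground"]))  # (['minimal', 'full', 'uhq_ground'], 113)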

nopurpose•1mo ago
My immediate question: if all of that was on-disk data duplication, why did it affect the download size? Can't a small download be expanded into the optimal layout on the client side?
ahartmetz•1mo ago
Sure it can - it would need either special pre- and post-processing, or lrzip ("long range zip") to do it automatically. lrzip should be better known; it often finds significant redundancy in huge archives like VM images.
braiamp•1mo ago
It didn't. Players downloaded 43 GB instead of the 152 GB on disk, according to SteamDB: https://steamdb.info/app/553850/depots/ Now it is 20 GB => 21 GB. Steam is pretty good at deduplicating data in transit from its servers. They are not idiots who will let developers/publishers eat their downstream connection with duplicated data.

https://partner.steamgames.com/doc/sdk/uploading#AppStructur...
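
The underlying idea is plain content-addressed chunking. A minimal sketch (chunk size and data invented here; real systems, including Steam's depot chunking, are smarter about chunk boundaries):

  # Chunk-level dedup: duplicated asset data needn't inflate transfers
  import hashlib, os

  CHUNK = 1 << 20  # 1 MiB fixed-size chunks for the demo

  store: dict[str, bytes] = {}  # hash -> chunk, stored/sent only once

  def upload(data: bytes) -> list[str]:
      """Return a manifest of chunk hashes; only unseen chunks enter the store."""
      manifest = []
      for i in range(0, len(data), CHUNK):
          chunk = data[i:i + CHUNK]
          digest = hashlib.sha256(chunk).hexdigest()
          store.setdefault(digest, chunk)
          manifest.append(digest)
      return manifest

  asset = os.urandom(3 * CHUNK)  # a 3 MiB "file", chunk-aligned for the demo
  manifest = upload(asset * 7)   # duplicated 7x on disk, like the old layout
  print(len(manifest), "chunks referenced;", len(store), "unique chunks stored")
  # -> 21 chunks referenced; 3 unique chunks stored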

myself248•1mo ago
Furthermore, this raises the possibility of a "de-debloater" that HDD users could run, which would duplicate the data into its loading-optimized form, if they decided they wanted to spend the space on it. (And a "de-de-debloater" to recover the space when they're not actively playing the game...)

The whole industry could benefit from this.

nomel•1mo ago
> to recover the space when they're not actively playing the game

This would defeat the purpose. The goal of the duplication is to place related data physically close together on the disk. Hard links, removing then replacing, etc. wouldn't preserve the physical layout of the data, meaning the terribly slow read head has to sweep around more.

I think the sane approach would be to have an HDD/SSD switch for the file lookups, with all the references pointing to the same file on SSD.
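
Something like this hypothetical manifest shim (all paths invented for illustration):

  # HDD installs keep per-level duplicate copies; on SSD every alias
  # resolves to one canonical file.
  DUPLICATES = {
      # alias (loading-optimized per-level copy) -> canonical asset
      "levels/01/rock_diffuse.dds": "shared/rock_diffuse.dds",
      "levels/02/rock_diffuse.dds": "shared/rock_diffuse.dds",
  }

  def resolve(path: str, on_ssd: bool) -> str:
      """On SSD, collapse duplicate references to the single canonical file;
      on HDD, keep the physically co-located per-level copy."""
      return DUPLICATES.get(path, path) if on_ssd else path

  print(resolve("levels/02/rock_diffuse.dds", on_ssd=True))   # shared/rock_diffuse.dds
  print(resolve("levels/02/rock_diffuse.dds", on_ssd=False))  # levels/02/rock_diffuse.dds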

myself248•1mo ago
So you'd have to defrag after re-bloating, to make all the files contiguous again. That tool already exists, and the re-bloater could just call it.
nomel•1mo ago
Sure, but defragging is a very slow process, especially if you're re-bloating (since it requires shifting things around to make space), and definitely not something that could happen in the background while the player is playing. Re-bloating definitely wouldn't make for a quick "OK, I'm ready to play!".
myself248•1mo ago
I imagine it'd be equivalent to a download task, just one that doesn't consume bandwidth.
ender341341•1mo ago
Depending on how the data duplication is actually done (with texture atlasing, for example, the actual bits can be very different after image compression), it can be much harder to do rote bit-level deduplication. They could potentially ship the code to generate all of those locally, but then they have to deal with a lot of extra rights/contracts to do so (proprietary codecs/tooling is super, super common in gamedev).

It's also largely because devs/publishers honestly just don't think about it; they've been doing it as long as optical media has been prevalent (early/mid 90s). In the last few years devs have actually been taking a look and realizing it doesn't make as much sense as it used to - especially if, as in this case, the majority of load time is spent on runtime generation anyway, or if the minimum spec is a 2080: what's the point of optimizing for one low-end component if most people running the game are on high-end systems?

Hitman did a similar massive file shrink recently (well, 4 years ago) and mentioned many of the same things.

somat•1mo ago
At one point - I think it was Titanfall 2 - the PC port of a game deliberately converted its audio to uncompressed WAV files in order to inflate the install size. They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.

When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of your precious high-speed SSD on a feature that added nothing to the game.

People probably feel the same about this - why were they so disrespectful of our space and bandwidth in the first place? But I agree it is very nice that they wrote up the details in this instance.

ryandrake•1mo ago
> When the details of exactly why the game was so large came out, many people felt this was a sort of customer betrayal: the publisher was burning a large part of your precious high-speed SSD on a feature that added nothing to the game.

Software developers of all kinds (not just game publishers) have a long and rich history of treating their users' compute resources as expendable. "Oh, users can just get more memory, it's cheap!" "Oh, xxxGB is such a small hard drive these days, users can get a bigger one!" "Oh, most users have Pentiums by now, we can drop 486 support!" Over and over we've seen companies choose to throw their users under the bus so that they can cheap out on optimizing their product.

mghackerlady•1mo ago
Maybe that'll start to change now that RAM is the new gold, and who knows what the AI bubble will eat next.
recursive•1mo ago
I remember seeing warez game releases in the late 90s with custom packaging that compressed the sound effects stored uncompressed in the original installer, then expanded them again on install.

It seems no one takes pride in their piracy anymore.

maccard•1mo ago
> They said it was for performance but the theory was to make it more inconvenient for pirates to distribute.

This doesn't even pass the sniff test. The files would just be compressed for distribution and decompressed on download. Pirated games are well known for having "custom" installers.

ycombinatrix•1mo ago
>The files would just be compressed for distribution and decompressed on download

All Steam downloads are automatically compressed. It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

duskwuff•1mo ago
> The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

Even when Titanfall 2 was released in 2016, I don't think that was meaningfully the case. Audio compression formats have been tuned heavily for efficient playback.

nearbuy•1mo ago
Uncompressed audio is typically used for sound effects, while music is compressed. Latency is the primary benefit. Uncompressed audio will play immediately while an mp3 will have a few frames delay. Sounds like gunshots or footsteps are typically short files anyway, so the increased memory usage isn't that painful.

Games also can stack many sounds, so even if the decoding cost is negligible when playing a single sound, it'll be greater if you have 32 sounds playing at once.

duskwuff•1mo ago
> Uncompressed audio will play immediately while an mp3 will have a few frames delay.

I'm not sure what you mean by this. Encoding latency is only relevant when you're dealing with live audio streams - there's no delay inherent to playing back a recorded sound.

> Sounds like gunshots or footsteps are typically short files anyway, so the increased memory usage isn't that painful.

Not all sound effects are short (consider e.g. loops for ambient noise!), and the aggregate file size for uncompressed audio can be substantial across an entire game.

nearbuy•1mo ago
> there's no delay inherent to playing back a recorded sound.

There absolutely is. You can decompress compressed audio files when loading so they play immediately, but if you want to keep your mp3 compressed, you get a delay. Games keep the sound effects in memory uncompressed.

> Not all sound effects are short

Long ambient background noises often aren't latency sensitive and can be streamed. For most games textures are the biggest usage of space and audio isn't that significant, but every game is different. I'm just telling you why we use uncompressed audio. If there is a particular game you know of that's wasting a lot of space on large audio files, you should notify the devs.

There is a reason both Unity and Unreal use uncompressed audio or ADPCM for sound effects.

justsomehnguy•1mo ago
> but if you want to keep your mp3 compressed, you get a delay

If that really bothers you then write your own on-disk compression format.

> why we use uncompressed audio

> ADPCM

... which is a compressed and lossy format.

nearbuy•1mo ago
> If that really bothers you then write your own on-disk compression format.

Why? What are you trying to solve here? You're going to have a hard time making a new format that serves you better than any of the existing formats.

The most common solution for instant playback is just to store the sound uncompressed in memory. It's not a problem that needs solving for most games.

ADPCM and PCM are both pretty common. ADPCM for audio is kinda like DXT compression for textures: a very simple compression that produces files many times larger than mp3 and doesn't have great sound quality, but has the advantage that playback and seeking cost virtually nothing over regular PCM. ADPCM file sizes are closer to PCM than to mp3. I should have been clearer in my first comment that the delay only applies to mp3/Vorbis, not PCM/ADPCM.

There isn't a clean distinction between compressed and uncompressed and lossy/lossless in an absolute sense. Compression is implicitly (or explicitly) against some arbitrary choice of baseline. We normally call 16-bit PCM uncompressed and lossless but if your baseline is 32-bit floats, then it's lossy and compressed from that baseline.

justsomehnguy•1mo ago
> Why? What are you trying to solve here? You're going to have a hard time making a new format that serves you better than any of the existing formats.

Storage space. But this is the way for the same guys who duplicate 20GB seven times "to be served better by the industry standard".

More sane people would just pack that ADPCM/PCM in a .pk3^W sorry, in a .zip file (or any other packaging format with an LZ/7z/whatever-compatible compression method) with the fastest profile, and would have the best of both worlds: sane storage requirements on disk, uncompressed audio in memory. As a bonus it would load faster from an HDD, because a data chunk 10 times smaller than the uncompressed one loads, surprise, 10 times faster.
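
A minimal sketch of that with Python's stdlib (file names illustrative): deflate the bank on disk, inflate it fully into RAM at load, so triggering a sound stays zero-latency:

  import zipfile

  def pack_bank(path: str, sounds: dict[str, bytes]) -> None:
      # Fastest deflate profile: cheap to decode, still shrinks PCM well.
      with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED, compresslevel=1) as z:
          for name, pcm in sounds.items():
              z.writestr(name, pcm)

  def load_bank(path: str) -> dict[str, bytes]:
      # One sequential read + inflate at load; sounds then live in memory
      # uncompressed, ready for immediate playback.
      with zipfile.ZipFile(path) as z:
          return {name: z.read(name) for name in z.namelist()}

  pack_bank("sfx.bank", {"gunshot.pcm": b"\x00\x01" * 22050})
  bank = load_bank("sfx.bank")
  print(len(bank["gunshot.pcm"]), "bytes ready in memory")  # 44100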

danbolt•1mo ago
Within the scope of a game’s production, the programmer time spent dogfooding a new audio format could instead go towards something that improves the value of the end product.

The uncompressed audio for latency-sensitive one-shots usually isn’t taking up the bulk of memory either.

justsomehnguy•1mo ago
> programmer time spent dogfooding the new audio format can be used towards something else that improves the value of the end product

Like exploring the 'widely accepted industry practices' and writing code to duplicate the assets, then writing the code to actually measure whether it did what the 'industry practices' advertised, and then ripping it out, right?

And please note that you missed the 'if it really bothers you'.

ycombinatrix•1mo ago
I think GP was confused - Titanfall 1 from 2014 is the one with the massive volume of uncompressed audio. Though I think your point still stands.

I was trying to point out that the decision to compress or not compress audio likely has nothing to do with the download size.

maccard•1mo ago
It's easy to apply today's standards. Titanfall was released 11 years ago, and ran on an Xbox 360, and a Core 2 Duo. MP3 was a patent-encumbered format. There's a fun DF article [0] where they say:

> Titanfall accesses Microsoft's existing cloud network, with servers spooling up on demand. When there's no demand, those same servers will service Azure's existing customers. Client-side, Titanfall presents a dedicated server experience much like any other but from the developer and publisher perspective, the financials in launching an ambitious online game change radically.

Things changed _massively_ in games between 2014 and 2017 - we went from supporting borderline-embedded platforms with enormous HW constraints and architecture differences, running dedicated servers like it's the '90s, to basically supporting fixed-spec PCs and shipping always-online titles running in the cloud.

[0] https://www.digitalfoundry.net/articles/digitalfoundry-2014-...

justsomehnguy•1mo ago
> The point is that playback of uncompressed audio

Bullshit. This hasn't been a problem since 2003.

And nobody forbids you from decompressing your compressed audio when you load the assets from disk.

maccard•1mo ago
> All Steam downloads are automatically compressed.

Titanfall wasn't on steam when it launched.

> It's also irrelevant. The point is that playback of uncompressed audio is indeed cheaper than playback of compressed audio.

The person that I replied to (not you) claimed "They said it was for performance but the theory was to make it more inconvenient for pirates to distribute."

snet0•1mo ago
This is conspiratorial nonsense.
ycombinatrix•1mo ago
Wasn't that Titanfall 1? I remember Titanfall 2 having a much smaller installation size.
reactordev•1mo ago
The negativity comes from the zero effort they put into this prior to launch, forcing people to download gigs of unnecessary data.

Game studios no longer care how big their games are if Steam will still take them. This is a huge problem. GTA5 was notorious for parsing the same JSON again, and again, and again during loading, and it was just a mess. Same for HD2: game engines have the ability to pack only what is used, but it's still up to the developers to make sure their assets are reusable so as to cut down on size.

This is why Star Citizen has been in development for 15 years. They couldn't optimize early and were building models and assets as if for film: not low-poly game assets but super-high-poly film assets.

The anger here is real. The anger here is justified. I'm sick of having to download 100 GB+ simply because a studio was too lazy and just packed everything they made into a bundle.

bluedino•1mo ago
> They couldn't optimize early and were building models and assets like it's for film. Not low poly game assets but super high poly film assets.

Reminds me of the Crack.com interview with Jonathan Clark:

Adding to the difficulty of the task, our artist had no experience in the field. I remember in a particular level we wanted to have a dungeon. A certain artist began by creating a single brick, then duplicating it several thousand times and building a wall out of the bricks. He kept complaining that his machine was too slow when he tried to render it. Needless to say, this is not the best way to model a brick wall.

https://web.archive.org/web/20160125143707/http://www.loonyg...

reactordev•1mo ago
This is very, very common, as there's only a handful of schools that teach this. Displacement mapping with a single poly is the answer. Game-dev-focused schools teach this, but at any other visual media school it's "build a brick, array the brick 10,000 times".
fyrabanks•1mo ago
There were 20 people working on this game when they started development. Total. I think they expanded to a little over 100. This isn't some huge game studio that has time to do optimization.

GTA5 had well over 1000 people on its team.

reactordev•1mo ago
Size of team has no bearing on this argument. Saying they were small so they get a pass on obscene download sizes is like saying “Napster was created by one man, surely he shouldn’t be accountable”, but he was.

When making a game, once you have something playable, the next step is to figure out how to package it. This is included in that effort: determining which assets to compress, package, and ship. Sometimes this is done by the engine. Sometimes by the art director.

WheatMillington•1mo ago
Amount of resources absolutely has a bearing on how resources can be allocated.
reactordev•1mo ago
This isn’t a resourcing issue. It’s a lack-of-knowledge, skipped-a-step issue.

When I did this, my small team took a whole sprint to make sure that assets were packed, that tilemaps were made, that audio files were present, and we did an audit to make sure nothing extra was packaged on disk. Today, because of digital stores and just releasing zip files, no one cares what they ship, and often you can see it if you investigate the files of any Unity or Unreal Engine game. Just throw it all over the fence.

onli•1mo ago
Not sure GTA 5 is the right example to list here. Remember https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times.... At least for a while they didn't optimize at all.
Krasnol•1mo ago
I feel like negativity has become Hacker News's bread and butter.
vict7•1mo ago
Many players perceive Arrowhead as a pretty incompetent and untrustworthy developer. Helldivers has suffered numerous issues with both performance and balancing. The bugs constantly introduced into the game (not the fun kind you get to shoot with a gun) have eroded a lot of trust and good will towards the company and point towards a largely non-existent QA process.

I won’t state my own personal views here, but for those that share the above perspective, there is little benefit of the doubt they’ll extend towards Arrowhead.

nearbuy•1mo ago
This is a mischaracterization of the optimization. This isn't a standard optimization that games apply everywhere. It's an optimization for spinning disks that some games apply sometimes. They're expected to measure if the benefits are worth the cost. (To be clear, bundling assets is standard. Duplicating at this level is not.)

This doesn't advance accepted industry wisdom because:

1. The trade-off is very particular to the individual game. Their loading was CPU-bound rather than IO-bound, so the optimization didn't make much difference for HDDs. This is already industry wisdom. The amount of duplication was also very high in their game.

2. This optimization was already on its way out as SSDs take over; none of the current-gen consoles use HDDs.

I'm not mad at Arrowhead or trying to paint them negatively. Every game has many bugs and mishaps like this. I appreciate the write-up.
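
For scale, a back-of-envelope on why the duplication trick exists at all (all numbers assumed, not measured from HD2):

    # ~10 ms average seek on a 7200 rpm HDD, ~150 MB/s sequential.
    seek_s, mb_per_s = 0.010, 150

    def load_time(total_mb, seeks):
        return seeks * seek_s + total_mb / mb_per_s

    # 1 GB of level data as 2,000 scattered reads vs one duplicated,
    # contiguous run:
    print(load_time(1024, 2000))  # ~26.8 s, dominated by seeking
    print(load_time(1024, 10))    # ~6.9 s, dominated by transfer

If loading is CPU-bound instead, as theirs apparently was, the seeks aren't the bottleneck and the duplication buys almost nothing.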

ok_coo•1mo ago
I'm glad they've been able to do this, looks like a huge improvement for HD2 on PC.

I've been on PS5 since launch and aside from Baldur's Gate 3, it's been the best game this gen IMO.

The negativity I see towards the game (especially on YouTube) is weird. Some of the critiques seem legit, but a lot of it feels like rage bait, which describes a lot of YT videos around gaming lately.

Anyway, a big improvement for a great game. Seems like less of an incentive now to uninstall if you only play now and then.

doener•1mo ago
[dupe] https://news.ycombinator.com/item?id=46134178
sergiotapia•1mo ago
If this article was exciting for you, I also highly recommend this one. A random dude fixed a bug in GTA 5 that was the root cause of it loading insanely slowly since the game came out!

https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times...

oceansky•1mo ago
The write up of how Windows 11 24H2 broke GTA San Andreas was excellent.

https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...

roflchoppa•1mo ago
Do HDDs make up that much of the gaming market segment?
aidenn0•1mo ago
1/9th of Helldivers 2 players, per TFA.
forrestthewoods•1mo ago
Moral of the Story: don’t roll out a fix like this all at once. Do it over 6 months over several patches. Keep finding “new improvements”.

Just don’t get caught at the end!

tlonny•1mo ago
If this is somewhat common for games, could one create a virtual FS with FUSE that dedupes via content-defined chunking and install games there?

I feel like writes would probably be quite painful, but game assets are essentially write-once, read-forever, so not the end of the world?
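
The chunking half is surprisingly little code. A toy sketch (a real FUSE layer, e.g. via fusepy, would sit on top, and real CDC uses gear/Rabin hash tables rather than this simplistic rolling hash):

    import hashlib, os

    def cdc_chunks(data, mask=(1 << 13) - 1, min_size=64):
        # Cut wherever the rolling hash's low 13 bits are zero:
        # ~8 KiB average chunks whose boundaries follow the content,
        # not file offsets.
        h, start = 0, 0
        for i, b in enumerate(data):
            h = ((h << 1) + b) & 0xFFFFFFFF
            if i - start >= min_size and (h & mask) == 0:
                yield data[start:i + 1]
                h, start = 0, i + 1
        if start < len(data):
            yield data[start:]

    store = {}  # content hash -> chunk; back this with files on disk
    asset = os.urandom(1 << 20)  # a 1 MiB "duplicated asset"
    for blob in (asset + b"A" * 1000, b"B" * 1000 + asset):
        for chunk in cdc_chunks(blob):
            store[hashlib.sha256(chunk).hexdigest()] = chunk
    print(sum(map(len, store.values())), "stored vs", 2 * (1 << 20) + 2000)

Because boundaries are content-defined, a copy embedded at a different offset still mostly resolves to the same chunks.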

As an aside, it's messed up that people with expensive SSDs are unnecessarily paying this storage tax. Just feels lazy...