frontpage.

Show HN: AI Hairstyle Changer – Try Different Hairstyles (1 free try, no login)

https://aihairstylechanger.space
1•QuLi-ops•26s ago•0 comments

Glass-Ceramic Substrates for Electronics Packaging

https://advanced.onlinelibrary.wiley.com/doi/10.1002/aelm.202500331
1•akshatjiwan•53s ago•0 comments

What I think of the TAISE certification as a proven AI Governance expert

https://beabytes.com/taise-certification/
1•beabytes•2m ago•0 comments

Superfill.ai – Open-source AI extension for intelligent form autofill

2•_mikr13•3m ago•0 comments

Unlocking oxygen's hidden role in turning propylene into useful chemicals

https://phys.org/news/2025-11-oxygen-hidden-role-propylene-chemicals.html
2•PaulHoule•4m ago•0 comments

Show HN: ReddBoss – Turn Reddit into your lead generation machine with AI

1•MoNagm•4m ago•0 comments

Ask HN: How can a web Senior SWE move into a good game-dev or game-related job?

1•llll_lllllll_l•4m ago•1 comments

The Solitaire Encryption Algorithm (1999)

https://www.schneier.com/academic/solitaire/
1•mooreds•5m ago•0 comments

Aisuru botnet behind new record-breaking 29.7 Tbps DDoS attack

https://www.bleepingcomputer.com/news/security/aisuru-botnet-behind-new-record-breaking-297-tbps-...
1•fleahunter•5m ago•0 comments

Desugaring the Relationship Between Concrete and Abstract Syntax

https://thunderseethe.dev/posts/desugar-base/
1•todsacerdoti•5m ago•0 comments

Use a money transfer app to get a high FX rate

https://idealremit.com
1•boubrik•6m ago•0 comments

Show HN: A prediction market where you can bet against my goals

https://market.ericli.tech
1•ericlmtn•6m ago•1 comments

Show HN: VoxCSS – A DOM based voxel engine

https://github.com/LayoutitStudio/voxcss
1•rofko•7m ago•0 comments

Show HN: AI Model Arena – Compare Z-Image, Nano Banana Pro, and Flux.2 Pro

https://z-image.app/arena
1•yeekal•8m ago•0 comments

Best of Metadata in 2025

http://muratbuffalo.blogspot.com/2025/12/best-of-metadata-in-2025.html
4•mark4•9m ago•0 comments

A Long Game

https://benjamindreyer.substack.com/p/a-long-game
1•mooreds•10m ago•0 comments

Change Commit Timestamps in Git

https://cassidoo.co/post/change-git-timestamp/
1•mooreds•12m ago•0 comments

Machine Code Explained [video]

https://www.youtube.com/watch?v=8VsiYWW9r48&list=PLzH6n4zXuckpwdGMHgRH5N9xNHzVGCxwf&index=1
1•tosh•12m ago•0 comments

I built a NeurIPS '25 visualizer

1•huydangx•12m ago•0 comments

Top DevOps Companies Hiring in 2025

https://devopsprojectshq.com/role/top-devops-companies-2025/
1•thomster•13m ago•0 comments

Artificial Intelligence for Quantum Computing

https://www.nature.com/articles/s41467-025-65836-3
1•jonbaer•14m ago•0 comments

Congressional lawmakers 47% pts better at picking stocks

https://www.nber.org/papers/w34524
17•mhb•17m ago•3 comments

Technocrats Are Getting Stupider

https://unherd.com/2025/12/why-the-great-reset-failed/
3•voxleone•19m ago•2 comments

GitHub to Codeberg Migration Script

https://github.com/LionyxML/migrate-github-to-codeberg
3•klaussilveira•22m ago•0 comments

Mistral launches Mistral 3, a family of open models

https://venturebeat.com/ai/mistral-launches-mistral-3-a-family-of-open-models-designed-to-run-on
2•mark_l_watson•22m ago•0 comments

Google Adds LLMs.txt to Search Developer Docs

https://www.seroundtable.com/google-adds-llms-txt-to-search-developer-docs-40533.html
1•speckx•22m ago•0 comments

Show HN: AI Reasoning Workflows – The 6 Skills That Improve Model Output

1•ai_updates•22m ago•1 comments

Show HN: C++ for Autonomous Driving – From Learning to Landing AV Jobs

https://github.com/0voice/Awesome-CPP-Autonomous-Driving
1•ysy63874•24m ago•0 comments

Why Ceph and Rook Are the Gold Standard for Bare-Metal Kubernetes

https://oneuptime.com/blog/post/2025-12-03-ceph-rook-standard-bare-metal-storage-pools/view
1•ndhandala•29m ago•0 comments

How LLM Inference Works

https://arpitbhayani.me/blogs/how-llm-inference-works/
2•manishpushkar•29m ago•0 comments

Helldivers 2 devs slash install size from 154GB to 23GB

https://www.tomshardware.com/video-games/pc-gaming/helldivers-2-install-size-slashed-from-154gb-to-just-23gb-85-percent-reduction-accomplished-by-de-duplicating-game-data-an-optimization-for-older-mechanical-hard-drives
33•doener•46m ago

Comments

easyThrowaway•42m ago
Were the duplicated files even used on PC? Like, do you even have such low-level access to the file system that you can deduce which duplicated instance has a faster access time on a mechanical hard drive?
arghwhat•26m ago
Not sure if this is what they did, but you can just put all the things you need together sequentially into a single file and rely on the filesystem to allocate contiguous blocks where possible (using the appropriate size hints to help). It's trivial to unpack at loading time without any performance impact.

A filesystem is by itself just one big "file" acting like a file archive.
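The packing idea described above can be sketched with a toy archive format: a small index of (name, offset, size) entries followed by all payloads back to back, so one sequential read covers everything. The layout and names here are illustrative, not what any shipping engine actually uses.

```python
import struct

def pack(assets):
    """Pack named assets into one contiguous blob: an index of
    (name, offset, size) entries, then the payloads back to back."""
    payload = b"".join(assets.values())
    entries, pos = [], 0
    for name, data in assets.items():
        entries.append((name, pos, len(data)))
        pos += len(data)
    index = struct.pack("<I", len(entries))
    for name, off, size in entries:
        nb = name.encode("utf-8")
        index += struct.pack("<I", len(nb)) + nb + struct.pack("<II", off, size)
    return index + payload

def unpack(blob):
    """Rebuild the name -> bytes mapping from a packed blob."""
    (count,) = struct.unpack_from("<I", blob, 0)
    pos, entries = 4, []
    for _ in range(count):
        (nlen,) = struct.unpack_from("<I", blob, pos); pos += 4
        name = blob[pos:pos + nlen].decode("utf-8"); pos += nlen
        off, size = struct.unpack_from("<II", blob, pos); pos += 8
        entries.append((name, off, size))
    base = pos  # payload starts immediately after the index
    return {n: blob[base + o: base + o + s] for n, o, s in entries}
```

A real engine would add alignment, compression, and size hints for the filesystem allocator; the point here is only that unpacking is a cheap table walk over data that was read sequentially.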

tehbeard•6m ago
It's not about which duplicated instance....

Think of it as I have two packs for levels.

Creek.level and roboplanet.level

Both use the cyborg enemies. By duplicating the cyborg enemy model and texture data across both files, only the level file needs to be opened to get all the necessary data for a match.

Because modern OSes allow you to preallocate contiguous segments and have automatic defragmentation, you can have them read this level file at max speed, rather than having to stop and seek to go find the cyborg.model file because it was referenced by the spawn pool. Engine limitations may prevent other optimisations you might think up as a thought exercise after reading this.

It's similar to how Crash Bandicoot packed its level data to handle the slow speed of the PS1 disc drive.

As to why they had an HDD optimisation in 2024... shrugs
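The space-for-seeks trade-off in that comment can be made concrete with a toy sketch. The level and asset names come from the comment above; the sizes are made up, and nothing here reflects the actual Helldivers 2 pack format.

```python
# Hypothetical shared assets and the levels that reference them.
shared = {"cyborg.model": b"M" * 100, "cyborg.tex": b"T" * 300}
levels = {
    "creek.level": ["cyborg.model", "cyborg.tex"],
    "roboplanet.level": ["cyborg.model", "cyborg.tex"],
}

def build_duplicating_packs(levels, shared):
    """Each level pack embeds its own copy of every asset it
    references, so loading a match is one sequential read of one file."""
    return {name: b"".join(shared[ref] for ref in refs)
            for name, refs in levels.items()}

packs = build_duplicating_packs(levels, shared)
duplicated = sum(len(p) for p in packs.values())      # every copy counted
deduplicated = sum(len(b) for b in shared.values())   # each asset stored once
```

With two levels sharing the same two assets, the duplicating layout already stores twice the unique data; scale that to hundreds of missions sharing enemy models and the 154GB-to-23GB reduction stops looking surprising.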

rwmj•25m ago
23GB is supposed to be "slim"?!
bilekas•19m ago
In this day and age it's a gift to only be ~23GB. I'm reminded of the old days when you literally didn't have the space, so you had to get creative; now any kind of space optimization isn't even considered.
onli•17m ago
Yes. High-resolution textures take up a lot of space. Have a look at HD texture mods for Skyrim, for example. 23GB is more in line with a game from a few years ago, so this really is slim for a modern game with modern graphics.
mfro•15m ago
Have you played a big budget video game released in the last 10 years? It’s pretty standard to reach upwards of 60GB.
phoronixrly•11m ago
I do love rich soundtracks with high quality compression, and textures that look crisp on 4k. And also games with 100+ hours of single-player campaign.
throw0101c•8m ago
Back in the day:

> 3-D Hardware Accelerator (with 16MB VRAM with full OpenGL® support; Pentium® II 400 Mhz processor or Athlon® processor; English version of Windows® 2000/XP Operating System; 128 MB RAM; 16-bit high color video mode; 800 MB of uncompressed hard disk space for game files (Minimum Install), plus 300 MB for the Windows swap file […]

* https://store.steampowered.com/app/9010/Return_to_Castle_Wol...

* https://en.wikipedia.org/wiki/Return_to_Castle_Wolfenstein

Even older games would be even smaller:

* https://www.oldgames.sk/en/game/ultima-vi-the-false-prophet/...

* https://en.wikipedia.org/wiki/Ultima_VI:_The_False_Prophet

alias_neo•7m ago
I mean yes, it's a very nice looking game with fairly sizeable worlds and lots of different enemies, biomes, etc.

It's currently over 100GB because of duplicated assets, so this is a game-changer (pun intended).

rincebrain•18m ago
I've been really curious precisely what changed, and what sort of optimization might have been involved here.

Because offhand, I know you could do things like cute optimizations of redundant data to minimize seek time on optical media, but with HDDs, you get no promises about layout to optimize around...

The only thing I can think of is that it was literally something as inane as checking the "store deduplicated by hash" option in the build, on a tree with copies of assets scattered everywhere, and nobody had ever checked whether the fear around the option was borne out in practice.

(I know they said in the original blog post that it was based around fears of client performance impact, but the whole reason I'm staring at that is that if it's just a deduplication table at storage time, the client shouldn't...care? It's not writing to the game data archives, it's just looking stuff up either way...)
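The "deduplication table at storage time" idea in that comment can be sketched as a content-addressed store: payloads keyed by hash, logical paths mapped to hashes, so identical files are stored once and reads are an unchanged table lookup. The class and method names here are invented for illustration, assuming nothing about the actual build option being discussed.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical payloads are kept once,
    however many logical paths reference them."""

    def __init__(self):
        self.blobs = {}   # sha256 hex digest -> bytes, stored once
        self.index = {}   # logical path -> digest

    def add(self, path, data):
        digest = hashlib.sha256(data).hexdigest()
        self.blobs[digest] = data   # no-op if this content already exists
        self.index[path] = digest

    def read(self, path):
        # The client resolves path -> digest -> bytes either way, so a
        # read-only consumer shouldn't care whether dedup happened.
        return self.blobs[self.index[path]]

    def stored_bytes(self):
        return sum(len(b) for b in self.blobs.values())
```

This is consistent with the parent's intuition: dedup changes what the installer writes to disk, not how lookups behave at load time.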

alias_neo•9m ago
I'm not entirely clear what you're trying to say, but my understanding is that they simply put lots of copies of files in lots of places, like games have done for a long time, in the hope it would lower seek times on HDDs for those players who use them.

They realised, after a lot of players asking, that it wasn't necessary, and probably had less of an impact than they thought.

They removed the duplicates, and drastically cut the install size. I updated last night, and the update alone was larger than the entire game after this deduplication run, so I'll be opting in to the Beta ASAP.

It's been almost a decade since I ran spinning rust in a desktop, and while I admire their efforts to support shitty hardware, who's playing this on a machine good enough to play but can't afford £60 for a basic SSD for their game storage?

djmips•17m ago
I did similar work on a game a long time ago, and it took over a month to slim it down to 1/4 of the size, though in that case 'at runtime'; the producer wasn't impressed: it looked exactly the same. I wonder if they had any pushback.
geerlingguy•7m ago
Possibly a similar process to when you go into an AWS account, and find dozens of orphaned VMs, a few thousand orphaned disk volumes, etc., saving like $10k/month just deleting unused resources.
alias_neo•4m ago
We've all been there Jeff.

In this case, I don't think it was forgetfulness; unlike us, they have an excuse and they were trying to optimise for disk seek times.

Anyway, I've got a half-dozen cloud accounts I need to go check for unused resources waves.

snet0•6m ago
> With their latest data measurements specific to the game, the developers have confirmed the small number of players (11% last week) using mechanical hard drives will witness mission load times increase by only a few seconds in worst cases. Additionally, the post reads, “the majority of the loading time in Helldivers 2 is due to level-generation rather than asset loading. This level generation happens in parallel with loading assets from the disk and so is the main determining factor of the loading time.”

It seems bizarre to me that they'd have accepted such a high cost (150GB+ installation size!) without entirely verifying that it was necessary!

I expect it's a story that'll never get told in enough detail to satisfy curiosity, but it certainly seems strange from the outside for this optimisation to be both possible and acceptable.