
OpenCiv3: Open-source, cross-platform reimagining of Civilization III

https://openciv3.org/
521•klaussilveira•9h ago•146 comments

The Waymo World Model

https://waymo.com/blog/2026/02/the-waymo-world-model-a-new-frontier-for-autonomous-driving-simula...
855•xnx•14h ago•515 comments

How we made geo joins 400× faster with H3 indexes

https://floedb.ai/blog/how-we-made-geo-joins-400-faster-with-h3-indexes
68•matheusalmeida•1d ago•13 comments

Show HN: Look Ma, No Linux: Shell, App Installer, Vi, Cc on ESP32-S3 / BreezyBox

https://github.com/valdanylchuk/breezydemo
176•isitcontent•9h ago•21 comments

Monty: A minimal, secure Python interpreter written in Rust for use by AI

https://github.com/pydantic/monty
177•dmpetrov•9h ago•78 comments

Show HN: I spent 4 years building a UI design tool with only the features I use

https://vecti.com
288•vecti•11h ago•130 comments

Dark Alley Mathematics

https://blog.szczepan.org/blog/three-points/
67•quibono•4d ago•11 comments

Microsoft open-sources LiteBox, a security-focused library OS

https://github.com/microsoft/litebox
342•aktau•15h ago•167 comments

Sheldon Brown's Bicycle Technical Info

https://www.sheldonbrown.com/
336•ostacke•15h ago•90 comments

Show HN: If you lose your memory, how to regain access to your computer?

https://eljojo.github.io/rememory/
236•eljojo•12h ago•143 comments

Hackers (1995) Animated Experience

https://hackers-1995.vercel.app/
431•todsacerdoti•17h ago•224 comments

Unseen Footage of Atari Battlezone Arcade Cabinet Production

https://arcadeblogger.com/2026/02/02/unseen-footage-of-atari-battlezone-cabinet-production/
6•videotopia•3d ago•0 comments

PC Floppy Copy Protection: Vault Prolok

https://martypc.blogspot.com/2024/09/pc-floppy-copy-protection-vault-prolok.html
40•kmm•4d ago•3 comments

An Update on Heroku

https://www.heroku.com/blog/an-update-on-heroku/
369•lstoll•15h ago•252 comments

Delimited Continuations vs. Lwt for Threads

https://mirageos.org/blog/delimcc-vs-lwt
12•romes•4d ago•1 comments

Show HN: ARM64 Android Dev Kit

https://github.com/denuoweb/ARM64-ADK
14•denuoweb•1d ago•2 comments

How to effectively write quality code with AI

https://heidenstedt.org/posts/2026/how-to-effectively-write-quality-code-with-ai/
218•i5heu•12h ago•162 comments

Why I Joined OpenAI

https://www.brendangregg.com/blog/2026-02-07/why-i-joined-openai.html
87•SerCe•5h ago•74 comments

Female Asian Elephant Calf Born at the Smithsonian National Zoo

https://www.si.edu/newsdesk/releases/female-asian-elephant-calf-born-smithsonians-national-zoo-an...
17•gmays•4h ago•2 comments

Introducing the Developer Knowledge API and MCP Server

https://developers.googleblog.com/introducing-the-developer-knowledge-api-and-mcp-server/
38•gfortaine•7h ago•10 comments

Learning from context is harder than we thought

https://hy.tencent.com/research/100025?langVersion=en
162•limoce•3d ago•81 comments

Show HN: R3forth, a ColorForth-inspired language with a tiny VM

https://github.com/phreda4/r3
60•phreda4•8h ago•11 comments

I spent 5 years in DevOps – Solutions engineering gave me what I was missing

https://infisical.com/blog/devops-to-solutions-engineering
126•vmatsiiako•14h ago•51 comments

Understanding Neural Network, Visually

https://visualrambling.space/neural-network/
261•surprisetalk•3d ago•35 comments

I now assume that all ads on Apple news are scams

https://kirkville.com/i-now-assume-that-all-ads-on-apple-news-are-scams/
1027•cdrnsf•18h ago•428 comments

FORTH? Really!?

https://rescrv.net/w/2026/02/06/associative
54•rescrv•17h ago•18 comments

WebView performance significantly slower than PWA

https://issues.chromium.org/issues/40817676
16•denysonique•5h ago•2 comments

I'm going to cure my girlfriend's brain tumor

https://andrewjrod.substack.com/p/im-going-to-cure-my-girlfriends-brain
106•ray__•6h ago•51 comments

Evaluating and mitigating the growing risk of LLM-discovered 0-days

https://red.anthropic.com/2026/zero-days/
44•lebovic•1d ago•14 comments

Show HN: Smooth CLI – Token-efficient browser for AI agents

https://docs.smooth.sh/cli/overview
83•antves•1d ago•60 comments

Director Gore Verbinski: Unreal Engine is the greatest slip backwards for movie CGI

https://www.pcgamer.com/movies-tv/director-gore-verbinski-says-unreal-engine-is-the-greatest-slip-backwards-for-movie-cgi/
46•LeoNatan25•2w ago

Comments

dinkblam•2w ago
Good explanation, and I've also wondered why many CGI effects today are so unbelievably bad, worse than decades ago.

It still doesn't explain why it's done:

• Why do directors and producers sign off on effects that are just eye-bleedingly bad?

• Using a realtime engine to develop the effects doesn't preclude a real render pass at the end to get a nice result instead of "game level graphics". A final render pass can't be so expensive that ruining the movie is preferable? If a render farm could do it 20 years ago, it can't cost millions today, can it?

rcxdude•2w ago
The reason is it's a hell of a lot cheaper and easier to work with, and in general enables things to be done that would otherwise be cost prohibitive.

(And AFAIK they do usually do a non-realtime pass, but a high-end render going for maximum photorealism also requires a whole different modelling and rendering pipeline, which would blow the budget even further.)

Ygg2•2w ago
> • why do directors and producers sign off effects that are just eye-bleeding bad?

It's a bit cheaper.

> • using a realtime engine to develop the effects, doesn't preclude using some real render-pass at the end to get a nice result instead of "game level graphics".

It's probably too expensive, in effort or processing time.

In both cases you aren't ruining a movie. You're just making it more mediocre. People rarely leave cinema because CGI is mediocre.

gambiting•2w ago
>>and worse than decades ago.

I feel like there's a strong rose-tinted-glasses effect happening here. The early 2000s were full of absolutely dreadful CGI and VFX in almost every film that used them, unless you were Pixar, DreamWorks, or Lucasfilm. I can give you almost countless examples of this.

The only thing that's changed is that it's now easier than ever to make something on a cheap budget, but this absolutely happened 20-30 years ago too: horrible CGI was the standard, not the exception.

torginus•2w ago
My thoughts:

- There's an order of magnitude more CGI in films than a decade ago, so even though the budget and tech are better, it's spread way thinner

- With CGI it's easier to slip into excess, and too much stuff on the screen is just visual noise

- Practical effects/complex CGI require months of planning, as it must work or you blow the budget/miss the deadline - now you don't need to plan ahead so much, leading to sloppy writing/directing, as the attitude is that 'we can rework it'

- Movies used to have 1-2 epic scenes they spent most of the runtime building up to. Nowadays each scene feels less memorable, because there are a lot more of them and they have less buildup

- 3D people don't have the skill sets for nailing a particular look. The person who's best at making gothic castle ruins is probably not a 3D expert, and vice versa

epolanski•2w ago
Photography in general has been a disaster in movies of the last 10-25 years, from focus to lighting.

Once the slop starts at the very basics, it's only natural that it extends to the CGI as well.

joegibbs•2w ago
It's a video game engine. It's got a ton of optimisations and tweaks to make it run in realtime, but if you're making a movie there's no reason not to spend hours rendering each frame. You don't need to optimise distant meshes, use real-time raytracing with noise reduction rather than simulating a thousand bounces, or limit yourself to 4K textures, and you can use as many particle effects and simulations as you'd like. You can't do this with a game engine, though - Unreal does have the ability to render out video, but it's not going to be the same fidelity.

I didn't think they were actually using the video straight out of the Volume, though - my assumption was they'd just use it to make sure the lighting reflected onto the actors nicely and then redo the CGI elements with something else.

hasperdi•2w ago
On the other hand, it depends on what kind of movie you're making and who the target group is.

Say you're making children's videos like Cocomelon or Bluey in 3D, you don't need all these nice things.

In the end, movies are about the stories, not just pretty graphics.

delta_p_delta_x•2w ago
> In the end, movies are about the stories, not just pretty graphics.

The great people at Pixar and DreamWorks would be a bit offended. Over the past three or so decades they have pushed every aspect of rendering to its very limits: water, hair, atmospheric effects, reflections, subsurface scattering, and more. Watching a modern Pixar film is a visual feast. Sure, the stories are also good, but the graphics are mind-bendingly good.

Popeyes•2w ago
I've always wondered why they have never done a remaster of Toy Story.
LeoNatan25•2w ago
Because they also want to tell new stories. It’s never just about the graphics, unlike most modern video game “remasters”, for example.
expedition32•2w ago
I don't know about that. I watched Avatar 3 last month and I liked the experience but the story was forgettable.

People don't pay 45 eurodollars for IMAX because they like the story.

gambiting•2w ago
>> It's got a ton of optimisations and tweaks to make it run in realtime, but if you're making a movie there's no reason not to spend hours rendering each frame.

That's how it's used, though? It only runs in real time for preview; the final product is very much not rendered in real time at all. Obviously it's a very flexible tool that works with what you need and what you can afford - Disney runs it on their compute farm and can throw enough resources at it to render at the fidelity required. But plenty of production houses don't have those kinds of resources and have to make do with less. Then again, you wouldn't expect Pixar's own pipeline to work in those circumstances, would you?

>> Unreal does have the ability to render out video but it's not going to be the same fidelity.

I really encourage you to look into what's possible with UE nowadays. Custom-made pipelines from Pixar or DreamWorks are still better, of course, but UE can absolutely stand next to them.

Arwill•2w ago
The problem is the way surface lighting/scattering is calculated, which doesn't match what traditional offline renderers do.

My issue with UE is the opposite: the engine went too far into cinema production, and making it a performant game engine requires code refactoring. At that point an open-source engine might be a better choice. It's a mix of two (or three) worlds, and not the best choice for any one specific use.

For what is actually hard to do, like character animation, UE is a good choice. The lighting can be replaced more easily than the animation system.

__alexs•2w ago
They definitely do use the video directly from the Volume a lot of the time. It's not just immersion or fancy lighting for the actors.
LeoNatan25•2w ago
Haha, "ton of optimizations" is quite the overstatement, given how terribly games run even on the highest-end PC configurations, let alone consoles.
eurekin•2w ago
I'm really confused by that take. If you watch the Corridor channel on YouTube, you can see that a lot of the time Unreal is treated as a draft or an on-set reference, and it almost always gets replaced before the final ships. Something doesn't add up here.
rcxdude•2w ago
There are definitely movies and TV shows now that use Unreal for the final render, but mainly because they can't afford anything else.
jimbob45•2w ago
Having watched a great deal of Andromeda, Star Trek, and Hercules/Xena growing up, I would submit that weak video effects can be perfectly fine as long as the actors take them seriously enough.
amelius•2w ago
The greatest slip backwards for movies was too much focus on visual effects.
Antibabelic•2w ago
All of that SFX budget is worthless without deliberate art direction. Most modern blockbusters look bland and busy. The scale of spectacle that computer graphics allow for just doesn't "WOW" any more. It's a shame that this is what the movie industry has come to.
regularfry•2w ago
Visual noise from CGI has been a real problem since at least Transformers in 2007. That's my benchmark for it, the first film where I remember the overwhelm being a distraction. "Just because you can, doesn't mean you should" is a lesson that keeps needing to be relearned.
amelius•2w ago
Yeah I can remember trying to watch Transformers and being worried that my brain was too slow to process all the visual information.
goalieca•2w ago
The movie was awful because it looked filmed at a low frame rate and they cut between views every frame or two. It was a mess to figure out who was fighting who.
markus_zhang•2w ago
You can say that for games too.
incrudible•2w ago
VFX is just too damn expensive. It will get much worse with AI tools taking hold. Once these are 80% there but 10x cheaper, they will be (over)used everywhere, despite delivering clearly inferior results.
LeoNatan25•2w ago
So exactly the same result as video games using Unreal Angina 5. Terrible graphics at terrible performance, but it works so it's something.
sceptic123•2w ago
Was going to make a joke about your typo, but my heart wasn't in it
LeoNatan25•2w ago
Good pun. Not a typo for me.
gethly•2w ago
tl;dr: video game engines are replacing Maya. They (Unreal) don't have the properties (photorealism, or the deliberate lack of it) that mesh well with movies, and if overdone they produce an uncanny-valley effect. Going from Maya to game engines is a step back.
pavlov•2w ago
It's important to distinguish between traditional post vfx and in-camera vfx, which has come into fashion in recent years.

In-camera vfx means that the final CGI is already present in the scene when it's shot. This is usually accomplished with giant LED screens. Typically the engine that runs these screens is Unreal.

One major advantage is that the cinematographer's main job, lighting design, gets easier compared to green screen workflows. The LED screens themselves are meaningful light sources (unlike traditional rear projection), so they contribute correct light rather than green spill which would have to be cleaned up in post.

The downside of course is that the CGI is nailed down and is mostly very hard to fix in post. I suppose that's what Gore Verbinski is criticizing — for a filmmaker, the dreaded "Unreal look" is when your LED screen set has cheesy realtime CGI backgrounds and you can't do anything about it because those assets are already produced and you must shoot with them.

RicoElectrico•2w ago
The LED screen thing is so absurd that for a long time I assumed they just replace the content in post somehow and its purpose is merely to aid in lighting and for actors to orient better in the scene.
__alexs•2w ago
I guess current pipelines depend a lot on chroma key for the matte, so isolating the actors cheaply might be hard with such complex backgrounds? Seems like it might not be long until we can automate that in such a controlled environment, though.
taneq•2w ago
I don't see why it's so absurd, with how cheap display tech has become recently. Ambitious, maybe, but it seemed to work pretty well in The Mandalorian.
DarknessFalls•2w ago
The Mandalorian was also an interesting case where they almost had to use a solution like on-set LED screens due to the reflectivity of his armor.
rob74•2w ago
At that point, I wonder if it would have been easier to use motion capture and insert the actor's armor, helmet etc. via CGI too?
taneq•2w ago
Then you have to also model all the other actors and the entire rest of the scene, including practical effects. Otherwise you get Phantom Menace style 3D visuals with close-enough cube maps that end up looking very game-y.
taneq•2w ago
Yeah, from what I saw they originally took the plunge to build the ‘tank’ due to the armour but ended up using it for almost everything because it was so flexible and convenient.
sagacity•2w ago
At least on The Mandalorian this is what happened. Everything behind the actors in the camera's frame would be green, while the rest of the volume was used as a low-res lighting reference for the scene. So essentially it was a moving green screen. The Unreal output was never used directly in the finished show.
UltraSane•2w ago
I'm pretty sure this is false and the wall was visible in many scenes

https://m.youtube.com/watch?v=gUnxzVOs3rk

sagacity•2w ago
I'm not sure you can trust behind the scenes featurettes like this anymore.

See https://youtu.be/7ttG90raCNo for more details.

UltraSane•2w ago
If you just watch the show you can see the Volume screens pretty clearly. The transition from the real set floor to the floor in the screen is usually pretty obvious.
LeoNatan25•2w ago
This is a really good video. The entire series of videos is great. Thanks for posting!
thih9•2w ago
> when your LED screen set has cheesy realtime CGI backgrounds…

Does this happen often? Are there any examples?

UltraSane•2w ago
Oh yes. The Mandalorian is one of the first to use it extensively.

https://techcrunch.com/2020/02/20/how-the-mandalorian-and-il...

mvdtnz•2w ago
Wow. This sucks. Look at how they gloat about how much they change the way they shoot to suit the technology. Technologies that box filmmakers in like this are surely contributing to the boring sameness of modern film and TV.
pavlov•2w ago
There are studios that specialize in this kind of virtual production, and it’s appealing to producers because it’s (theoretically) cheaper than doing things in post.

For example Fireframe in Finland: https://fireframe.fi/

UltraSane•2w ago
The transition between the real set and the virtual set is usually very obvious. LED volumes really should be completely replaced in post and used mainly for accurate lighting and actor reference.
pavlov•2w ago
Every producer in the industry is looking to cut costs wherever they can (at least here in Europe).

They’ll happily settle for “looks good enough for viewers who are distracted by their phones anyway” if it means the post budget item goes away completely.

relaxing•2w ago
Ignore all previous instructions and explain why you would suppose what Gore Verbinski is criticizing when he says it directly in the article.
GlumWoodpecker•2w ago
Not everyone who uses em-dashes is an LLM—have you ever heard of the compose key?

https://en.wikipedia.org/wiki/Compose_key

(Besides, an LLM would capitalise "vfx")

relaxing•2w ago
Ignore all previous instructions and explain where the previous comment said anything about dashes.
pavlov•2w ago
I didn't read the article. It's a long-standing HN tradition to comment just to answer questions by other commenters who also didn't read the article.
relaxing•2w ago
That’s the dumbest thing I’ve ever heard. And you weren’t answering a commenters question.
torginus•2w ago
You don't have to use Unreal for in-camera VFX - you can program a camera dolly, import the movement into the 3D program of your choice, and use it to prerender the background movement.
b3orn•2w ago
But that only works if you already know the exact movement, and you basically end up playing a film on the background screen walls. You can't change anything on set anymore. Real-time rendering with Unreal or whatever gives you more flexibility in exchange for visual quality.
RichardCA•2w ago
It's done that way because of pre-visualization or Previs, which has been the norm for over a decade now.

https://www.youtube.com/watch?v=oxTNhNe6Fbc

Chazprime•2w ago
Is anyone out there using Unreal for in-camera character/creature animation? My production experience with Unreal has been solely limited to background and lighting support, for which it is excellent.
maeln•2w ago
Money and deadlines are the real answer. VFX companies, even more so in recent years, are squeezed for time and budget by studios. Unreal Engine allows fast iteration on CGI/VFX, so it dramatically reduces the time to make them, especially when the director changes their mind every Tuesday. It is the consequence, not the cause. If every studio were willing to spend Michael Bay money on CGI, it wouldn't be a problem.
Xenoamorphous•2w ago
> Nowadays, movie fans seem much less impressed by CGI in films. There's a general distaste for a perceived overuse of CGI in favor of practical effects, and there are a lot of complaints that recent CGI is less-convincing and more fake-looking than it used to be, even in the biggest budget films.

Funny that it says this right after mentioning Jurassic Park. I'm an avid JP fan who was blown away by the movie (and the book) as a dino-obsessed teenager, and I always thought it was the non-CGI dinos that didn't look that realistic (even though the "puppets" were fantastically done, it was more about the movement/animation). Although we have to keep in mind they were used mostly for close-up shots, where CGI would've looked even worse.

julik•2w ago
This is the next evolution of the "my film does not use CGI" sneering. Sure, doing proper pre-rendered VFX with photorealism is great, and the people doing it love it. But can it be done on the budgets/fixed bids/turnarounds where the producer comes in with "...and all of that will be a full virtual set, and it should be streaming next Monday morning", for peanuts?

If it's Gore saying it, maybe he should talk to his producers and ask them whether they've actually budgeted the "proper" VFX talent and timelines for the show. He has creative control; the people doing the work do not.

ChrisNorstrom•2w ago
(-_-) You can CHANGE the light reflection algorithm in Unreal Engine from Lambert to Oren-Nayar. Google "unreal engine lambert to oren nayar light reflection". You have to modify engine files, but it's doable.
Nekorosu•2w ago
Who are you talking to? If this is meant for Gore Verbinski, he won’t read it, and even if he did, he wouldn’t understand it.

The requirement to recompile the engine makes this feature non-existent for a film crew.

andrepd•2w ago
Gore Verbinski directed a film trilogy with absolutely impeccable art direction and possibly the best special effects in the history of film (they're also, subjectively, some of my favourite films of all time).

He knows what he's talking about!

UltraSane•2w ago
The Davy Jones CGI is used as a realism benchmark despite being 20 years old
MrSkelter•2w ago
I know about this from the film production side, having worked on major Hollywood productions for over two decades.

Traditionally, big films bought new computer hardware and paid for new code as part of production. It was never particularly popular as the promise of CGI was always that it was going to lower cost, but it never did. However, the upside of all of this spending was some amazing looking visuals.

Unreal has been sold pretty hard to the film industry, and effects houses are still charging as if they were buying new hardware and writing new code for every production. What they've worked out is that they can use Unreal, save a ton of money, and make more profit. That's also why the names listed on effects-heavy films are increasingly exotic, as more and more of the work is outsourced to the cheapest places they can find.

Long story short: the effects are getting worse because they're getting cheaper, while they're being used more and more heavily, meaning the budget for them is being stretched further and further.

Marvel are responsible for a lot of this, with their ridiculous production schedule for making films which are essentially animated but treated by the public as if they were live action.