frontpage.

Utah's hottest new power source is 15k feet below the ground

https://www.gatesnotes.com/utahs-hottest-new-power-source-is-below-the-ground
126•mooreds•3h ago•74 comments

How the "Kim" dump exposed North Korea's credential theft playbook

https://dti.domaintools.com/inside-the-kimsuky-leak-how-the-kim-dump-exposed-north-koreas-credent...
154•notmine1337•4h ago•20 comments

A Navajo weaving of an integrated circuit: the 555 timer

https://www.righto.com/2025/09/marilou-schultz-navajo-555-weaving.html
60•defrost•3h ago•9 comments

Shipping textures as PNGs is suboptimal

https://gamesbymason.com/blog/2025/stop-shipping-pngs/
42•ibobev•3h ago•15 comments

I'm Making a Beautiful, Aesthetic and Open-Source Platform for Learning Japanese

https://kanadojo.com
37•tentoumushi•2h ago•11 comments

C++26: Erroneous Behaviour

https://www.sandordargo.com/blog/2025/02/05/cpp26-erroneous-behaviour
12•todsacerdoti•1h ago•8 comments

Troubleshooting ZFS – Common Issues and How to Fix Them

https://klarasystems.com/articles/troubleshooting-zfs-common-issues-how-to-fix-them/
14•zdw•3d ago•0 comments

A history of metaphorical brain talk in psychiatry

https://www.nature.com/articles/s41380-025-03053-6
10•fremden•1h ago•2 comments

Over 80% of Sunscreen Performed Below Their Labelled Efficacy (2020)

https://www.consumer.org.hk/en/press-release/528-sunscreen-test
90•mgh2•4h ago•80 comments

Qwen3 30B A3B Hits 13 token/s on 4xRaspberry Pi 5

https://github.com/b4rtaz/distributed-llama/discussions/255
278•b4rtazz•13h ago•115 comments

We hacked Burger King: How auth bypass led to drive-thru audio surveillance

https://bobdahacker.com/blog/rbi-hacked-drive-thrus/
272•BobDaHacker•11h ago•148 comments

The maths you need to start understanding LLMs

https://www.gilesthomas.com/2025/09/maths-for-llms
455•gpjt•4d ago•99 comments

Oldest recorded transaction

https://avi.im/blag/2025/oldest-txn/
135•avinassh•9h ago•60 comments

What to Do with an Old iPad

http://odb.ar/blog/2025/09/05/hosting-my-blog-on-an-iPad-2.html
40•owenmakes•1d ago•28 comments

Anonymous recursive functions in Racket

https://github.com/shriram/anonymous-recursive-function
46•azhenley•2d ago•12 comments

Stop writing CLI validation. Parse it right the first time

https://hackers.pub/@hongminhee/2025/stop-writing-cli-validation-parse-it-right-the-first-time
56•dahlia•5h ago•20 comments

Using Claude Code SDK to reduce E2E test time

https://jampauchoa.substack.com/p/best-of-both-worlds-using-claude
96•jampa•6h ago•66 comments

Matmul on Blackwell: Part 2 – Using Hardware Features to Optimize Matmul

https://www.modular.com/blog/matrix-multiplication-on-nvidias-blackwell-part-2-using-hardware-fea...
7•robertvc•1d ago•0 comments

GigaByte CXL memory expansion card with up to 512GB DRAM

https://www.gigabyte.com/PC-Accessory/AI-TOP-CXL-R5X4
41•tanelpoder•5h ago•38 comments

Microsoft Azure: "Multiple international subsea cables were cut in the Red Sea"

https://azure.status.microsoft/en-gb/status
100•djfobbz•3h ago•13 comments

Why language models hallucinate

https://openai.com/index/why-language-models-hallucinate/
135•simianwords•16h ago•147 comments

Processing Piano Tutorial Videos in the Browser

https://www.heyraviteja.com/post/portfolio/piano-reader/
25•catchmeifyoucan•2d ago•6 comments

Gloria funicular derailment initial findings report (EN) [pdf]

https://www.gpiaaf.gov.pt/upload/processos/d054239.pdf
9•vascocosta•2h ago•6 comments

AI surveillance should be banned while there is still time

https://gabrielweinberg.com/p/ai-surveillance-should-be-banned
462•mustaphah•10h ago•169 comments

Baby's first type checker

https://austinhenley.com/blog/babytypechecker.html
58•alexmolas•3d ago•15 comments

Qantas is cutting executive bonuses after data breach

https://www.flightglobal.com/airlines/qantas-slashes-executive-pay-by-15-after-data-breach/164398...
39•campuscodi•3h ago•9 comments

William James at CERN (1995)

http://bactra.org/wm-james-at-cern/
13•benbreen•1d ago•0 comments

Rug pulls, forks, and open-source feudalism

https://lwn.net/SubscriberLink/1036465/e80ebbc4cee39bfb/
242•pabs3•18h ago•118 comments

Rust tool for generating random fractals

https://github.com/benjaminrall/chaos-game
4•gidellav•2h ago•0 comments

Europe enters the exascale supercomputing league with Jupiter

https://ec.europa.eu/commission/presscorner/detail/en/ip_25_2029
51•Sami_Lehtinen•4h ago•34 comments

Speeding up Unreal Editor launch by not spawning unused tooltips

https://larstofus.com/2025/09/02/speeding-up-the-unreal-editor-launch-by-not-spawning-38000-tooltips/
196•samspenc•3d ago

Comments

adithyassekhar•21h ago
This reminded me: when I profiled my React app, I saw tooltips taking up a large chunk of the time. I should go and check that.

Similarly, adding a modal like this

{isOpen && <Modal isOpen={isOpen} onClose={onClose} />}

instead of

<Modal isOpen={isOpen} onClose={onClose} />

seems to make the app smoother the more modals we have. Rendering the UI only when you need it (not downloading the code; that's still part of the bundle) seems to be low-hanging fruit for optimizing performance.

pathartl•20h ago
In the Blazor space we use factories/managers to spawn new instances of a modal/tooltip instead of having something idle waiting for activation.

The tradeoff is for more complicated components, first renders can be slower.

trylist•20h ago
I remember solving this problem before. These are both global components, so you create a single global instance and control them with a global context or function.

You basically have a global part of the component and a local part. The global part is what actually gets rendered when necessary and manages current state, the local part defines what content will be rendered inside the global part for a particular trigger and interacts with the global part when a trigger condition happens (eg hover timeout for a tooltip).
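
As a rough sketch of that split in React (TooltipHost, TooltipTrigger, and the context shape are made-up names, not from any particular library):

    import React, { createContext, useContext, useState } from "react";

    // Global part: a single tooltip instance for the whole app.
    type TipState = { content: React.ReactNode; x: number; y: number } | null;
    const TipContext = createContext<(tip: TipState) => void>(() => {});

    export function TooltipHost({ children }: { children: React.ReactNode }) {
      const [tip, setTip] = useState<TipState>(null);
      return (
        <TipContext.Provider value={setTip}>
          {children}
          {/* Rendered only while some trigger is active */}
          {tip && (
            <div style={{ position: "fixed", left: tip.x, top: tip.y }}>
              {tip.content}
            </div>
          )}
        </TipContext.Provider>
      );
    }

    // Local part: each trigger only describes its content and notifies
    // the host on hover; no tooltip widget exists until then.
    export function TooltipTrigger(props: { tip: React.ReactNode; children: React.ReactNode }) {
      const setTip = useContext(TipContext);
      return (
        <span
          onMouseEnter={(e) => setTip({ content: props.tip, x: e.clientX, y: e.clientY + 16 })}
          onMouseLeave={() => setTip(null)}
        >
          {props.children}
        </span>
      );
    }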

high_priest•12h ago
React devs re-discovering DOM manipulation... SMH.

This is, in general, the problem already solved by native interaction with the DOM. It keeps the element around so it doesn't have to be re-created every time; it gets hidden with "display: none" or something, and when it needs to display something, just the content gets swapped and the element gets 'unhidden'.

Good luck.

jitl•9h ago
The post you’re replying to is saying they went FROM always having the component mounted (at least in the component tree, if not visible in the DOM, e.g. hidden with display: none), TO only mounting the component when it needs to be open. They moved from the way you’re talking about to creating the component/DOM nodes only when needed.

Excessive nodes - hidden or not - cost memory. On midrange Android it’s scarce and even if you’re not pushing against overall device memory limit, the system is more likely to kill your tab in the background if you’ve got a lot going on.

adithyassekhar•7h ago
Especially when you know the user won't be opening half of those. I didn't use a global one because the modals themselves have some complex logic inside.
llbbdd•6h ago
"Devs use React because they how to use the web platform, here's how to do it right" and then posts a vanilla solution that doesn't solve understand or solve the problem. Tale as old as time. Bonus points if covering the edge cases in the vanilla solution and making it work for a second component would involve a tiny homegrown reimplentation of most of React anyway.
aiiizzz•17h ago
That breaks the out transition.
amatecha•17h ago
So, win-win? I want a modal to get out of the way as fast as possible, any fade/transition animations are keeping me from what I want to look at. :)
chamomeal•5h ago
The designers don’t want to break the out transition, and the PM wants whatever the designer wants
akie•16h ago
Unless you set `isOpen` only when the transition has ended
debugnik•16h ago
Isn't isOpen = false what triggers the transition in the first place here?
abdusco•15h ago
Even when using view transitions?

https://developer.mozilla.org/en-US/docs/Web/CSS/@starting-s...

csande17•14h ago
The "Transitioning elements on DOM addition and removal" example in that article uses a setTimeout() to wait an extra 1000 milliseconds before removing the element from the DOM. If you immediately remove the element from the DOM (like would usually happen if you do {isOpen && <Modal />} in React), it'll vanish immediately and won't have time to play the transition.
btown•12h ago
You can create a ref that stores whether isOpen has ever been true, and condition on that, letting you lazily initialize the Modal and its contents while preserving out transitions. I’m honestly surprised this isn’t recommended a lot more often!
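
A minimal sketch of that idea (LazyModal is a made-up name; Modal stands in for whatever modal component the app already has, assumed to animate itself based on its isOpen prop):

    import React, { useRef } from "react";
    import { Modal } from "./Modal"; // assumption: the app's existing modal component

    // Mount the modal on first open and keep it mounted afterwards,
    // so the out transition can still play when isOpen flips to false.
    function LazyModal({ isOpen, onClose }: { isOpen: boolean; onClose: () => void }) {
      const hasOpened = useRef(false);
      if (isOpen) hasOpened.current = true; // sticks once set
      if (!hasOpened.current) return null;  // nothing rendered until first open
      return <Modal isOpen={isOpen} onClose={onClose} />;
    }
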
zarzavat•11h ago
Good. Transitions are meant to serve a purpose, showing what came from where. A modal doesn't need a transition, it should just disappear instantly. Like closing a window. The user is not helped by animating that something disappears when they close it, they already knew that.
adithyassekhar•7h ago
Good point, luckily there are no transitions for modals in this project.
jitl•7h ago
You should rethink your modal system if removing the controller component from the render tree doesn’t transition out
chamomeal•5h ago
I mean in react if it’s gone from the render tree, it’s gone. I know there are libs like transition-group but I honestly don’t understand how they work.

And the “react way” is to have the UI reflect state. If the state says the modal is not being rendered, it should not be rendered

jitl•3h ago
Yes, React’s main idea is f(state) -> UI; but what’s returned from render is a declarative specification of what the UI should be. It’s up to React (and library authors) to make sure the UI ends up as we specify without our app logic needing to be concerned with how that happens. I view managing transition out animations for a component removed from the render tree the same way: I’m happy if the incidental complexity is encapsulated in a library (either a third party one or something I write myself), rather than spread across the whole app.

There are many high quality third party tools to help with this, such as Motion’s <AnimatePresence> (https://motion.dev/docs/react-animate-presence). I haven’t used the library you mentioned, but it seems somewhat unmaintained and isn’t compatible with react-dom@19.

First party support is coming to React with the new <ViewTransition> component (https://react.dev/reference/react/ViewTransition).

If you insist that only the React maintainers are allowed to diverge DOM state from the render tree or write code you don’t understand, you can adopt it today from react{,-dom}@experimental. It’s been available there since April (https://react.dev/blog/2025/04/23/react-labs-view-transition...).

socalgal2•1h ago
You can add some TransitionManager that takes a bool prop for whether or not to render its children and, when the prop goes from true to false, keeps rendering them for some amount of time.
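
A minimal sketch of that, assuming the children animate themselves off the show prop (TransitionManager and exitMs are made-up names):

    import React, { useEffect, useState } from "react";

    // Keep rendering children for exitMs after `show` flips to false,
    // giving the out transition time to finish before unmounting.
    function TransitionManager({ show, exitMs = 300, children }: {
      show: boolean;
      exitMs?: number;
      children: React.ReactNode;
    }) {
      const [mounted, setMounted] = useState(show);
      useEffect(() => {
        if (show) { setMounted(true); return; }
        const t = setTimeout(() => setMounted(false), exitMs);
        return () => clearTimeout(t);
      }, [show, exitMs]);
      return mounted ? <>{children}</> : null;
    }
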
Cthulhu_•15h ago
Alternatively, how many modals can be open at any given time? And is it a floating element? May be an option to make it a global single instance thing then, set the content when needed. Allows for in/out transitions, too, as another commenter pointed out. See also "Portals" in React.
adithyassekhar•7h ago
There is only one. Won't contexts cause rerenders through the tree? We already use portals. It's just that each modal has enough complex logic inside that they're their own components.
hinkley•20h ago
Any time a library in your code goes from being used by a couple people to used by everyone, you have to periodically audit it from then on.

A set of libraries in our code had hit 20% of response time through years of accretion. A couple of months of work cut that in half, with no architectural or cache changes. It was just about the largest, and definitely the most cost-effective, initiative we completed on that team.

Looking at flame charts is only step one. You also need to look at invocation counts, for things that seem to be getting called far more often than they should be. Profiling tools frequently (dare I say consistently) misattribute costs of functions due to pressures on the CPU subsystems. And most of the times I’ve found optimizations that were substantially larger improvements than expected, it’s been from cumulative call count, not run time.

smittywerben•12h ago
Dare me to say costless leaky abstraction. Then I'll point to the thread next door using Chrome profilers to diagnose Chrome internals using Scratch. Then I'll finish saying that at least Unreal has that authentic '90s feel to it.
kg•18h ago
This is one scenario where IMGUI approaches have a small win, even if it's by accident - since GUI elements are constructed on demand in immediate mode, invisible/unused elements won't have tooltip setup run, and the tooltip setup code will probably only run for the control that's showing a tooltip.

(Depending on your IMGUI API you might be setting tooltip text in advance as a constant on every visible control, but that's probably a lot fewer than 38000 controls, I'd hope.)
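
A sketch of that immediate-mode shape in TypeScript (the API names here are invented, loosely modeled on Dear ImGui, not Unreal's actual widgets):

    // Nothing tooltip-related exists unless a control is hovered this frame.
    interface ImGuiLike {
      button(label: string): boolean; // draws a button, returns true if clicked
      isItemHovered(): boolean;       // was the most recent item hovered?
      tooltip(text: string): void;    // draws a tooltip at the cursor, this frame only
    }
    declare function startBuild(): void; // hypothetical app function

    function drawFrame(ui: ImGuiLike) {
      if (ui.button("Build")) startBuild();
      // Tooltip work runs for at most one control per frame: the hovered one.
      if (ui.isItemHovered()) {
        ui.tooltip("Compile all dirty modules");
      }
    }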

It's interesting that every control previously had its own dedicated tooltip component, instead of having all controls share a single system wide tooltip. I'm curious why they designed it that way.

krzat•16h ago
How does this compare to React-like approach (React, Flutter, SwiftUI)?

It seems like those libraries do what IMGUIs do, but in a more structured way.

socalgal2•1h ago
With IMGUIs you do whatever you want with your state. You read your state and call UI functions.

React requires knowing about your state, because it wants to monitor all of it for changes so it can skip work when nothing changed. This ends up infecting every part of your code. It's the number-one frustration I have using React. I haven't used Flutter or SwiftUI, so I don't know if they are analogous.

bob1029•14h ago
Unity uses an IMGUI approach and it makes all the difference in the universe. Overriding an OnDrawGizmos method to quickly get at an editor viz of a new component is super efficient. There are some sharp edges like forgetting to set/reset colors, etc, but I much prefer these little annoyances for the convenience I get in return.

AFAIK, UE relies on a retained mode GUI, but I never got far enough into that version of Narnia to experience it first hand.

lentil_soup•14h ago
No idea why you're getting downvoted, but that was my thought as well.

With immediate mode you don't have to construct any widgets or objects; you just render them via code every frame, which gives you more freedom in how you tackle each UI element. You're not forced into one widget system across the entire application. For example, if you detect your tooltip code is slow, you could memcpy all the strings into a block of memory and have tooltips use an index into that memory, or have them load on demand from disk, or the cloud, or space, or whatever. The point being, you can optimise the UI piecemeal.

Immediate mode has its own challenges, but I do find it interesting to at least see how the different approaches would tackle the problem.

ehsankia•18h ago
Kinda annoying that the article doesn't really answer the core question, which is how much startup time was saved. It does give a 0.05 ms-per-tooltip figure, so multiplied by 38,000 that's ~2 s saved, which is not too bad.
charlie-83•18h ago
"Together, these two problems can result in the editor spending an extremely long time just creating unused tooltips. In a debug build of the engine, creating all of these tooltips resulted in 2-5 seconds of startup time. In comparison development builds were faster, taking just under a second."
0xml•18h ago
Don't have access to read the code, but I think ideally there should be only one instance created at startup, right?
RossBencina•17h ago
At most one instance at start up. Asynchronous creation or lazy creation on first use are two other potential options. Speaking generally, not Unreal-specific.
WhereIsTheTruth•16h ago
I once made the mistake of buying some sound effects from Fab. I had to download the entire Unreal Engine and start it, create a project, and then import the assets.

It took the whole afternoon.

It's no wonder UE5 games have a reputation for being poorly optimized; you need an insane machine just to run the editor.

State-of-the-art graphics pipeline, but webdev levels of bloat when it comes to software. I'd even argue Electron is a smoother experience than the Unreal Engine Editor.

Insanity

daemin•15h ago
Yet it is the engine dominating the industry and beloved by artists of all kinds.

To get UE games that run well you either need your own engine team to optimise it or you drop all fancy new features.

Ekaros•15h ago
I was around back in the days when LCDs replaced CRTs and learned the importance of native resolution. I feel like recent games have been saved too much by frame generation and all sorts of weird resolution hacks, mostly from Nvidia and AMD.

I'm kinda sad we've reached a point where native resolution is not the standard for high-mid-tier/low-high-tier GPUs. Surely games should run natively at non-4K resolution on my 700€+ GPU...

daemin•15h ago
Games haven't been running at full native resolution for quite some time, maybe even the last decade; they tend to render to a smaller buffer and then upscale to the desired resolution in order to achieve better frame rates. This doesn't even include frame generation, which trades supposedly higher frame rates for worse response times, so games can feel worse to play.

By games I mean modern AAA first- or third-person games. 2D and others will often run at full resolution all the time.

cheschire•13h ago
You mean back in the day when 30 fps at 1024x768 was the norm?

New monitors default to 60hz but folks looking to game are convinced by ads that the only reason they lost that last round was not because of the SBMM algorithm, but because the other player undoubtedly had a 240hz 4K monitor rendering the player coming around the corner a tick faster.

Competitive gaming and Twitch are what pushed the current priorities, and the hardware makers were only too happy to oblige.

rkomorn•13h ago
I don't play any online competitive games or FPSes, but I can definitely tell that 144 FPS on a synced monitor is nicer than 60 FPS, especially when I play anything that uses mouse look.

For me, it's not quite as big of a jump as, say, when we went from SD to HD TV, but it's still a big enough leap that I don't consider it gimmicky.

Gaming in 4K, on the other hand, I don't really care for. QHD is plenty, but I do find 4K makes for slightly nicer desktop use.

Edit: I'll add that I almost always limit FPS anyway because my GPU turns into a jet engine under high load and I hate fan noise, but that's a different problem.

daemin•12h ago
There is something to having a monitor display at higher than 60 fps, especially if the game runs and can process inputs at that higher rate as well. This decreases the response time a player can attain, as there is literally less time between seeing something on screen and the game reacting to the player's input.

For a bit of background: modern games tend to do game processing and rendering in parallel, which means the frame being processed by the rendering system is the previous frame, and once rendering has been submitted to the "graphics card" it can take one or more further frames before it's actually visible on the monitor. So you end up with a lag of 3+ frames, rather than the single frame you had in old DOS games and such. Having a faster monitor, and being able to render frames at that faster rate, will give you some benefit.

This is also why frame generation can actually hurt the gaming experience: instead of waiting 3+ frames to see your input reflected in what is on the screen, you end up with something like 7+ frames, because the fake in-between frames don't actually deal with any input.
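
(For scale: one frame at 60 Hz is ~16.7 ms, so a 3-frame pipeline is roughly 50 ms of latency, while the same three frames at 144 Hz take only ~21 ms; a 7-frame pipeline at 60 Hz would be ~117 ms.)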

Strom•10h ago
30 fps was not the norm, at least not with competitive games. Like Counter-Strike in 2000 on a CRT. Yes 1024x768 was common, but at 100 fps. Alternatively you would go to 800x600 to reach 120 fps.

It’s only when LCDs appeared that 60 Hz started being a thing on PCs and 60 fps followed as a consequence, because the display can’t show more anyway.

It’s true that competitive gaming has pushed the priority of performance, but this happened in the 90s already with Quake II. There’s nothing fake about it either. At the time a lot of playing happened at LANs not online. The person with the better PC got better results. Repeatedly reproduced by rotating people around on the available PCs.

boomlinde•9h ago
> You mean back in the day when 30 fps at 1024x768 was the norm?

I recall playing games at 100 FPS on my 100 Hz CRT. People seriously interested in multiplayer shooters at the time turned vsync off and aimed for even higher frame rates. It was with this in mind I was quick to upgrade to a 144 Hz display when they got cheap enough: I was taking back territory from when the relatively awful (but much more convenient) flat screens took over.

> because the other player undoubtedly had a 240hz 4K monitor rendering the player coming around the corner a tick faster.

I play 99% single player games and in most of those, response time differences at that scale seem inconsequential. The important difference to me is in motion clarity. It's much easier to track moving objects and anticipate where they will be when you get more frames of animation along their path. This makes fast-paced games much more fun to play, especially first person games where you're always rapidly shifting your view around.

ThatPlayer•12h ago
Native resolution was never good enough, though. That's why antialiasing is a thing: to fake a higher-than-native resolution.

And now antialiasing is so good you can start from lower resolutions and still fake even higher quality.

boomlinde•9h ago
I don't agree with the framing of it as "faking" a higher than native resolution. The native resolution is what it is. The problem lies in how the view is sampled as it is rendered to the screen. What you ideally do when you have higher frequency content than the screen can represent is to oversample, filter and downsample the view, as in SSAA, or you approximate the effect or use it selectively when there is high frequency content, using some more clever methods.

It's really the same problem as in synthesizing audio. 44.1 kHz is adequate for most audio purposes, but if you are generating sounds with content past the Nyquist frequency, it's going to alias and fold back in undesirable ways, causing distortion in the audible content. So you multisample, filter to remove the high-frequency content, and downsample in order to antialias (which would be roughly equivalent to SSAA), or you build the audio from band-limited impulses or steps.

Cthulhu_•15h ago
It's just like your computer and IDE, you start it up and never shut it down again.

Wouldn't it take the whole afternoon because it's downloading and installing assets, creating caches, indexing, etc.?

Like with IDEs, it really doesn't matter much once they're up and running, and the performance of the product ultimately has little to do with the tools used to make it. When games have a reputation for being poorly optimized, that's rarely down to the engine; maybe the complete package, where it's too easy to just plop down assets from the internet without tweaking for performance or having a performance budget per scene.

Stevvo•4h ago
No, it is not. You don't have to restart your computer/IDE every time you compile a project. With UE5, you nearly always do need to restart the editor.
drbig•14h ago
> It's no wonder UE5 games have the reputation of being poorly optimized

Care to exemplify?

I find UE games to be not only the most optimized, but also capable of running everywhere. Take X-COM, which I can play on my 14 year old linux laptop with i915 excuse-for-a-gfx-card, whereas Unity stuff doesn't work here, and on my Windows gaming rig always makes everything red-hot without even approaching the quality and fidelity of UE games.

To me UE is like SolidWorks, whereas Unity is like FreeCAD... Which I guess is actually very close to what the differences are :-)

Or is this "reputation of being poorly optimized" only specific to UE version 5 (as compared to older versions of UE, perhaps)?

Cloudef•14h ago
The reputation is specific to UE5; UE3 used to have such a reputation as well. UE5 introduced new systems that are not compatible with the traditional ones, and these systems, especially if used poorly, tank performance. It's not uncommon for UE5 games to run poorly even on the most expensive Nvidia GPU, with AI upscaling as a requirement.
mort96•14h ago
The reputation of being poorly optimized only applies to version 5, UE was rather respected before the wave of terribly performing UE 5 AAA games came out and tanked UE's reputation.

It also has a terrible reputation because a bunch of the visual effects have a hard dependency on temporal anti-aliasing, which is a form of AA which typically results in a blurry-looking picture with ghosting as soon as anything is moving.

daemin•12h ago
Funnily enough a lot of those "poor performing" UE games were actually UE4 still, not UE5.
VoidWarranty•7h ago
Let's be real: UE5 is a marketing term for a .x version of UE4 that broke enough of the rendering pipeline that they needed an excuse to force devs to deal with the changes.
drbig•12h ago
Thanks for the replies! Will note the UE5 specificity.
redox99•12h ago
It took that long because it had to compile shaders for the first time. After that it would open in seconds.
Sharlin•15h ago
Hmm, so what exactly is stored in that gigabyte of tooltips? Even 100,000 tooltips per language should take maybe a few tens of megabytes of space. How many localizations does the editor have?
lukan•15h ago
It's not the text data. It's that every tooltip gets made into a UI element.

"Firstly, despite its name, the function doesn’t just set the text of a tooltip; it spawns a full tooltip widget, including sub-widgets to display and layout the text, as well as some helper objects. This is not ideal from a performance point of view. The other problem? Unreal does this for every tooltip in the entire editor, and there are a lot of tooltips in Unreal. In fact, up to version 5.6, the text for all the tooltips alone took up around 1 GB of storage space."

But I assume the 1 GB of storage for all tooltips includes boilerplate. I doubt it is 1 GB of raw text.

Sharlin•14h ago
Yes, I meant the size on disk. I presume the serialization format isn't the most efficient possible. But I can't think of any particular boilerplate that you'd want to store in a file that's just supposed to store localization strings.
cardanome•4h ago
That seems like a very wasteful way to implement tooltips.

The user can only ever see a single tooltip. (Or maybe more if you have tooltips for tooltips, but I don't think Unreal has that; the point is, a limited number.)

So initialize a single tooltip object. When the user mouses over an element with a tooltip, set the appropriate text, move the tooltip widget to the right position, and show it. When the user moves away, hide it.

Simple, and it takes nearly no memory. Seems like some people still suffer from 90s OOP brain rot.
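
A sketch of that single shared tooltip in plain DOM terms (the data-tooltip attribute is a made-up convention; Unreal's Slate UI works differently, this just illustrates the shape):

    // One tooltip element for the whole page; triggers only carry their text.
    const tip = document.createElement("div");
    tip.style.position = "fixed";
    tip.style.display = "none";
    document.body.appendChild(tip);

    document.addEventListener("mouseover", (e) => {
      const el = (e.target as HTMLElement).closest<HTMLElement>("[data-tooltip]");
      if (!el) { tip.style.display = "none"; return; } // moved away: hide
      tip.textContent = el.dataset.tooltip ?? "";      // swap in the right text
      const r = el.getBoundingClientRect();            // position below the trigger
      tip.style.left = `${r.left}px`;
      tip.style.top = `${r.bottom + 4}px`;
      tip.style.display = "block";
    });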

bob1029•15h ago
From a purely technical perspective, UE is an absolute monster. It's not even remotely in the same league as Unity, Godot, etc. when it comes to iteration difficulty and tooling.

I struggle with UE over others for any project that doesn't demand an HDRP equivalent and nanometric mesh resolution. Unity isn't exactly a walk in the park either but the iteration speed tends to be much higher if you aren't a AAA wizard with an entire army at your disposal. I've never once had a UE project on my machine that made me feel I was on a happy path.

Godot and Unity are like cheating by comparison. Near-instant play mode and a trivial debugging experience make a huge difference for solo and small teams. Any experienced .NET developer can become productive on a Unity project in <1 day with reasonable mentorship. The best strategy I had for UE was to just use Blueprints, but those are really bad for source control and code review.

cheschire•13h ago
And blueprints take forever to wire up in my experience compared to just writing the C++ directly.
diggan•12h ago
Never worked in a larger game-dev team before, but I always saw the benefit of Blueprints as mainly for people who don't know how to code. Set up the right abstractions and you can let the level designers add interactivity, for example, rather than Blueprints existing mainly to speed up the work of C++ devs.
markus_zhang•9h ago
I think UE requires the dev team to have a clear cut between the designers and programmers. Programmers code BP "components" and give them to designers to wire them up. The heavy lifting and complicated logic should live in C++ IMO. Otherwise it's going to be hell.
smittywerben•13h ago
NEW QUEST: "These New Gaming Requirements Are Unreal"

OBJECTIVE: Any project that demands HDRP and Nanometric Mesh

BONUS: Find the happy path

blashyrk•13h ago
I felt the exact same way until I tried Hazelight's AngelScript UE fork. It is amazing, it brings the developer experience and iteration speed to Unity levels. They even made a VSCode plugin for it. Cannot recommend enough
soniczentropy•2h ago
I'll heartily second this. After years of Unity, I just couldn't stand the developer experience any more. Waiting for iterative compiles that took an ice age each time I changed a line of code killed me. Angelscript UE is as close to engine perfection as I can imagine
markus_zhang•11h ago
I think UE is so oriented toward high-end graphics that there is no reason for most developers to use it. I don't understand why many indie developers choose it.
forgotoldacc•11h ago
I'm working with a friend on a project and desperately trying to sway him away from Unreal. His reason for wanting to use it is because he can build the engine from source and modify it any way he wants (and he intends to attempt just that). He's also very much into pushing the engine's lighting to its limits.

We're a team with < 10 employees. He's paying very handsomely, so even if his Unreal foray is an absolute disaster, I'll have the savings to find something else.

markus_zhang•9h ago
Oh that's definitely something to play with if the $$ is good :D I wouldn't mind.
bob1029•9h ago
> He's also very much into pushing the engine's lighting to its limits.

With a bit of experience you can achieve global illumination results that are competitive with Pixar films by using static scene elements, URP, area lighting, baked GI and 5~10 minutes on a 5700XT. The resulting output will run at hundreds of FPS on most platform targets. If this means pegging vsync, it may also be representative of a power savings on those platforms.

Lights in video games certainly use real electricity, but the power spent on baked lights is amortized across every unique target that runs the game. The biggest advantage of baking is that you can use an unlimited # of lights. Emulation of a physical scene is possible. There are also types of lights that aren't even accessible at real-time (area/volumetric). These produce the most compelling visual results, avoiding problems that others create such as hotspots in reflection probes and hard shadowing.

Lightmap baking is quickly becoming a lost art because realtime lighting is so simple by comparison (at first!). It also handles a lot of edge cases automagically, the most important being things like dynamic scene elements and foliage. Approximately half of the editor overlays in Unity are dedicated to visualizing the various aspects of baked lighting. It is one of the more difficult things to polish, but if you have the discipline to do so it will make your game highly competitive in the AAA arena.

The crazy thing to me about baked GI is that it used to be incredibly crippling on iteration time. Working at a studio back in 2014 I recall wild schemes to bake lights in AWS so we could iterate faster. Each scene would take hours to fully bake. Today, you can iterate global GI in a fixed viewport multiple times per second with a progressive GPU light mapper. Each scene can be fully baked in <10 minutes. There has never been a better time to build games using technology like this. If I took a game studio from a decade ago and gave them the technology we have today, they would wipe the floor with every other studio on earth right now.

This tech doesn't have to be all-or-nothing either. Most well engineered AAA games utilize a mixture of baked & real time. The key is to make as many lights baked as possible, to the extent that you are kind of a constraining asshole about it, even though you can support 8+ dynamic lights per scene object. I look at real time lighting as a bandaid, not a solution.

If you want to attack this from a business perspective - Bleeding edge lighting tech is a nightmare if you want to ship to a large # of customers on a wide range of platforms.

pdntspa•6h ago
The ecosystem, including tens (if not hundreds) of thousands of dollars of free assets, is an extremely compelling use case for me.
TGower•10h ago
Blueprints can be a great learning tool: if you double-click a node, it will open a VS window with the actual C++ function.
stemlord•9h ago
That's extremely unfortunate, given that Unity is becoming financially hostile (gouging enterprise customers) and Godot is apparently not quite mature enough to compete. Game engines are quickly becoming a problem.
idle_zealot•9h ago
> Game engines are quickly becoming a problem

Game engines have always been a problem. They're very tricky to make and cover everyone's use cases, and I don't think they've ever been in as good a state as right now.

MountainTheme12•7h ago
My issue with Unreal is that Epic puts little effort into improving the developer experience, focusing instead on churning out tech demos, flashy visuals, and half-baked features that only become usable after several major releases (if ever). The artists at my company love it; the developers, not so much.
booi•7h ago
The really sad part is, Epic knows they don't need to sell it to you. They need to sell it to the C-suite.
Agentlien•5h ago
I have worked full time for five years with a combination of Unreal and Unity. I've also worked for five years with Frostbite and another five in various custom engines.

I absolutely love both Unreal and Unity. Unreal is amazing from a technical perspective, and having worked with a talented team in Unreal, the stuff we were able to make was mind-blowing given the resources we had.

Unity is way easier to work with if you aren't focused on high fidelity graphics. In fact, I've never tried any engine that felt as easy to work with as Unity. I would absolutely not call it a monster. Even with a fairly green team you can hit the ground running and get really productive with Unity.

moron4hire•4h ago
The problem with Unity for green developers is that a lot of the defaults are bad and there isn't a lot of guidance to fixing it unless you just Git Gud at Unity, which is not something I've observed a lot of green developers do. There's a reason why "Unity Asset Store Dump" is a meme and it's largely because of content farm companies churning projects off the backs of perpetually underpaid junior developers (with no seniors around to mentor them). I saw too many of my friends in the local VR scene 10 years ago see absolutely zero economic progress and very little technical progress because of it.

So yeah, Unity is an easy on-ramp. But unfortunately, I think it puts people in a bad market that doesn't serve them well.

Agentlien•4h ago
I do agree that some of the defaults are quite bad. A great example of this is how even Unity themselves recommend an object workflow very different from what their engine naturally seems to suggest: You really shouldn't use tons of objects with update methods called directly by the engine. You need managers with update functions which iterate over all their subjects. Doing it the intuitive way easily becomes unsustainable and is horrendous for performance.
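
A language-agnostic sketch of that manager pattern (written in TypeScript for illustration; in Unity this would be a C# MonoBehaviour whose Update() drives the rest):

    interface Updatable { tick(dt: number): void; }

    // One engine-driven update call iterates all subjects, instead of the
    // engine invoking a per-object update method on thousands of objects.
    class EnemyManager {
      private readonly enemies: Updatable[] = [];
      register(e: Updatable) { this.enemies.push(e); }
      update(dt: number) {           // called once per frame by the engine
        for (const e of this.enemies) e.tick(dt);
      }
    }
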
thaumasiotes•15h ago
This was originally submitted with the title "Speeding up Unreal Editor launch by not spawning 38000 tooltips", a much closer match to the actual title of the post, "Speeding up the Unreal Editor launch by ... not spawning 38000 tooltips".

Why has it been changed? The number of tooltips improves the title and is accurate to the post.

Yiannis128•7h ago
They really need to rewrite the whole editor, or at least strip down unused systems... The whole thing being that big is inexcusable...
Stevvo•6h ago
I recently decided the fastest way to work with Unreal is to throw it out and go with something else. It's like 10 fucking minutes to compile an empty project.

I like Godot primarily because of GDScript; you are not compiling anything so iteration time is greatly reduced. Unigine is also worth a mention. It has all the modern features you could want like double precision, global illumination etc but without the bloat and complexity. It's easy to use as much or as little of the engine as you need; in every project you write the main() function. Similar license options to Unity/Unreal.