
Learn computer graphics from scratch and for free

https://www.scratchapixel.com
310•theusus•1mo ago

Comments

yunnpp•1mo ago
The website has come a long way, a good reminder for Santa to drop a donation.

Computer graphics needs more open education for sure. Traditional techniques are sealed in old books you have to go out of your way to find; Sergei Savchenko's "3D Graphics Programming: Games and Beyond" is a good one. New techniques are often behind proprietary gates, with shallow papers and slides that only hint at how things may work. Graphics APIs, especially modern ones, make things more confusing than they need to be, too. I think writing software rasterizers and ray tracers is a good starting point; forget GPUs exist.

Also, slight tangent, but there doesn't seem to be any contact method here other than Discord, which I find to be an immediate turn-off. Last time I checked, it required a phone number.

The donations page could use a link directly from the homepage too.

WillAdams•1mo ago
I can still remember a fellow student wanting to know how to write a 3D computer game, the professor being stumped, and my chiming in w/

>Get Foley & Van Dam from the library

noting it should be available to check out, since I'd just checked it back in.

Several new editions since:

https://www.goodreads.com/book/show/5257044-computer-graphic...

yunnpp•1mo ago
Yeah, that's "the mouse book" in my mind. The tiger book is also a very good compilation of topics, though it leaves things as "exercise for the reader" more often than I would like.

https://www.goodreads.com/book/show/1933732.Fundamentals_of_...

WillAdams•1mo ago
I really miss the days when folks referred to books by cover descriptor: I still have my copies of the Red/Green/Purple/Black books for PostScript...
john01dav•1mo ago
> New techniques are often behind proprietary gates, with shallow papers and slides that only give a hint of how things may work.

I've been able to implement techniques based on such things without too much trouble. Also, Unreal is source available, although I haven't used its source to learn, and haven't checked the license for risks with doing so.

tombert•1mo ago
Graphics have been a blind spot for me for pretty much my entire career. I more or less failed upward into where I am now (which ended up being a lot of data and distributed stuff). I do enjoy doing what I do and I think I'm reasonably good at it so it's hardly a "bad" thing, but I (like I think a lot of people here) got into programming because I wanted to make games.

Outside of playing with OpenGL as a teenager to make a planet orbit around a sun, a bad space invaders clone in Flash where you shoot a bird pooping on you, a really crappy Breakout clone with Racket, and the occasional experiments with Vulkan and Metal, I never really have fulfilled the dream of being the next John Carmack or Tim Sweeney.

Every time I try and learn Vulkan I end up getting confused and annoyed about how much code I need to write and give up. I suspect it's because I don't really understand the fundamentals well enough, and as a result jumping into Vulkan I end up metaphorically "drinking from a firehose". I certainly hope this doesn't happen, but if I manage to become unemployed again maybe that could be a good excuse to finally buckle down and try and learn this.

socalgal2•1mo ago
Try WebGL or, better, WebGPU. It's so much easier, and all the concepts you learn apply to other APIs.

https://webgpufundamentals.org

or

https://webgl2fundamentals.org

I'd choose WebGPU over WebGL2, as it more closely resembles current modern graphics APIs like Metal, DirectX 12, and Vulkan.

tombert•1mo ago
Yeah you're not the first one to mention that to me. I'll probably try WebGPU or wgpu next time I decide to learn graphics. I'd probably have more fun with it than Vulkan.
gattr•1mo ago
I concur; just last month I started with `wgpu` (the Rust bindings for WebGPU) after exclusively using OpenGL (since 2000, I think? via Delphi 2). Feels a bit verbose at first (with all the pipelines/bindings setup), but once you have your first working example, it's smooth sailing from there. I kind of liked (discontinued) `glium`, but this is better.
weslleyskah•1mo ago
I feel the same. I was trying to make some "art" with shaders.

I was inspired by Zbrush and Maya, but I don't think I can learn what is necessary to build even a small clone of these gigantic pieces of software, unless I work with this on a day to day basis.

The performance of ZBrush is so insane... it is mesmerizing. I don't think I can go deep into this while getting through university.

bicolao•1mo ago
> Every time I try and learn Vulkan I end up getting confused and annoyed about how much code I need to write and give up.

Vulkan isn't meant for beginners. It's a lot more verbose even if you know the fundamentals. Modern OpenGL would be good enough. If you have to use Vulkan, maybe use one of the libraries built on top of it (I use SDL3, for example). You still have the freedom to do whatever you want with shaders while leaving most of the resource management to those libraries.

shmerl•1mo ago
Vulkan isn't a graphics API, it's a low level GPU API. Graphics just happens to be one of the functions that GPUs can handle. That can help understand why Vulkan is the way it is.
delta_p_delta_x•1mo ago
I have a hot take. Modern computer graphics is very complicated, and it's best to build up fundamentals rather than diving off the deep end into Vulkan, which is really geared at engine professionals who want to shave every last microsecond off their frame-times. Vulkan and D3D12 are great: they provide very fine-grained host-device synchronisation mechanisms that seasoned engine programmers can use to the fullest. At the same time, a newbie can easily get bogged down by the sheer verbosity, and don't even get me started on how annoying the initial setup boilerplate is, which can be extremely daunting for someone just starting out.

GPUs expose a completely different programming memory model, and the issue I would say is conflating computer graphics with GPU programming. The two are obviously related, don't get me wrong, but they can and do diverge quite significantly at times. This is more true recently with the push towards GPGPU, where GPUs now combine several different coprocessors beyond just the shader cores, and can be programmed with something like a dozen different APIs.

I would instead suggest:

  1) Implement a CPU rasteriser, with just two stages: a primitive assembler, and a rasteriser.
  2) Implement a CPU ray tracer. 
Web links for tutorials respectively:

  https://haqr.eu/tinyrenderer/
  https://raytracing.github.io/books/RayTracingInOneWeekend.html
These can be extended in many, many ways that will keep you sufficiently occupied trying to maximise performance and features. In fact, even achieving basic correctness will require quite a degree of complexity: the primitive assembler will of course need frustum and back-face culling (and these will mean re-triangulating some primitives). The rasteriser will need z-buffering. The ray tracer will need lighting, shadow, and camera intersection algorithms for different primitives, accounting for floating-point divergence; spheres, planes, and triangles can all be individually optimised.
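
The back-face cull mentioned above reduces to a signed-area test in screen space. A minimal Python sketch, assuming the common convention that front faces wind counter-clockwise after projection:

```python
def signed_area(a, b, c):
    """Twice the signed area of the screen-space triangle (a, b, c).
    Positive for counter-clockwise winding, negative for clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def is_back_facing(a, b, c):
    """Cull triangles whose projected winding is clockwise (or degenerate)."""
    return signed_area(a, b, c) <= 0

# A counter-clockwise triangle faces the camera...
front = [(0, 0), (4, 0), (0, 4)]
# ...and the same vertices in reverse order face away.
back = front[::-1]
```

The same test doubles as the setup for edge-function rasterisation, so it is not wasted work.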

Try adding various anti-aliasing algorithms to the rasteriser. Add shading: begin with flat, then extend to per-vertex, then per-fragment. Try adding a tessellator where the level of detail is controlled by camera distance. Add early discard instead of the usual z-buffering.
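
The flat-shading starting point above is just a clamped Lambert dot product per face. A minimal sketch, assuming unit-length vectors:

```python
def lambert(normal, light_dir):
    """Flat-shading intensity: the dot product of the unit face normal
    with the unit direction towards the light, clamped at zero so that
    faces pointing away from the light go dark rather than negative."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, d)

# A face looking straight at the light is fully lit...
lit = lambert((0.0, 0.0, 1.0), (0.0, 0.0, 1.0))
# ...and a face looking away receives nothing.
dark = lambert((0.0, 0.0, 1.0), (0.0, 0.0, -1.0))
```

Per-vertex and per-fragment shading reuse exactly this formula; only where it is evaluated changes.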

To the basic Whitted CPU ray tracer, add BRDFs; add microfacet theory, add subsurface scattering, caustics, photon mapping/light transport, and work towards a general global illumination implementation. Add denoising algorithms. And of course, implement and use acceleration data structures for faster intersection lookups; there are many.
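
The intersection routines the ray tracer is built on are individually tiny. For instance, a minimal ray-sphere intersection sketch in Python (a normalised ray direction is assumed, so the quadratic's leading coefficient is 1):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance t along a normalised ray, or None.
    Solves |o + t*d - c|^2 = r^2, a quadratic in t."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c            # a == 1 for a normalised direction
    if disc < 0.0:
        return None                   # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer root first
    if t > 1e-6:
        return t
    t = (-b + math.sqrt(disc)) / 2.0  # origin may be inside the sphere
    return t if t > 1e-6 else None

# A ray from the origin down +z hits a unit sphere centred at (0, 0, 5) at t = 4.
hit = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
```

The epsilon guards against self-intersection, which is exactly the floating-point divergence issue mentioned above.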

Working on all of these will frankly give you a more detailed and intimate understanding of how GPUs work and why they have been developed a certain way than programming with something like Vulkan will, spending time filling in struct after struct.

After this, feel free to explore either of the two more 'basic' graphics APIs: OpenGL 4.6 or D3D11. shadertoy.com and shaderacademy.com are great resources for understanding fragment shaders. There are again several widespread shading languages, though most of the industry uses HLSL. GLSL can be simpler, but HLSL is definitely more flexible.

At this point, explore more complicated scenarios: deferred rendering, pre- and post-processing for things like ambient occlusion, mirrors, temporal anti-aliasing, render-to-texture for lighting and shadows, etc. This is video-game focused; you could go another direction by exploring 2D UIs, text rendering, compositing, and more.

As for why I recommend starting with CPUs, only to end up back with GPUs again: one may ask, 'hey, who uses CPUs any more for graphics?' Let me answer: WARP [1] and LLVMpipe [2] are both production-quality software rasterisers, frequently loaded during remote desktop sessions. In fact, 'rasteriser' is an understatement: they expose full-fledged software implementations of D3D10/11 and OpenGL/Vulkan devices respectively. And naturally, most film renderers still run on the CPU, due to their improved floating-point precision; films can't really get away with the ephemeral smudging of video games. Also, CPU cores are quite cheap nowadays, so it's not unusual to see a render farm of a million-plus cores chewing away at a complex Pixar or DreamWorks frame.

[1]: https://learn.microsoft.com/en-gb/windows/win32/direct3darti...

[2]: https://docs.mesa3d.org/drivers/llvmpipe.html

bsder•1mo ago
I would simplify further:

1) Implement 2D shapes and sprites with blits

With modern compute shaders, this has 95% of "How to use a GPU" while omitting 99% of the "Complicated 3D Graphics" that confuses everybody.
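
As a warm-up before the compute-shader version, the CPU form of such a blit is only a few lines. A sketch in Python, using a flat row-major pixel list and treating 0 as the transparent colour key (both are assumptions for illustration):

```python
def blit(fb, fb_w, sprite, sp_w, dst_x, dst_y, key=0):
    """Copy sprite pixels into framebuffer fb (a flat row-major list),
    skipping pixels equal to the transparent colour key and any pixels
    that would land outside the framebuffer."""
    fb_h = len(fb) // fb_w
    sp_h = len(sprite) // sp_w
    for sy in range(sp_h):
        for sx in range(sp_w):
            px = sprite[sy * sp_w + sx]
            x, y = dst_x + sx, dst_y + sy
            if px != key and 0 <= x < fb_w and 0 <= y < fb_h:
                fb[y * fb_w + x] = px

fb = [0] * 16                        # a 4x4 framebuffer
blit(fb, 4, [7, 0, 0, 7], 2, 1, 1)   # a 2x2 sprite with two transparent pixels
```

The compute-shader version is the same inner body with the two loops replaced by the dispatch grid, which is much of the point being made here.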

caspar•1mo ago
I can vouch for this "study plan".

4 years ago I tackled exactly those courses (raytracer[0] first, then CPU rasterizer[1]) to learn the basics. And then, yes, I picked up a lib that's a thin wrapper around OpenGL (macroquad) and learned the basics of shaders.

So far this has been enough to build my prototype of a multiplayer Noita-like, with radiance-cascades-powered lighting. Still haven't learned Vulkan or WebGPU properly, though am now considering porting my game to the latter to get some modern niceties.

[0]: https://github.com/caspark/the-all-seeing-crab [1]: https://github.com/caspark/little-crab-tv

suioir•1mo ago
I really enjoy the website content and appreciate the hard work to create it. Also, thank you to the author for taking action on the HN feedback last year about the AI thumbnails that used to be all over this site. [0]

[0] https://news.ycombinator.com/item?id=40622209

RodgerTheGreat•1mo ago
There's still an obnoxious slop image front-and-center, full of nonsense typos. Not a good look for any educational resource.
theusus•1mo ago
This material is the best you can find out there. You can always report typos.
suprjami•1mo ago
One of my goals this year is to write a basic software 3D renderer from first principles. No game engine, no GPU. I'm looking forward to it.
listic•1mo ago
build-your-own-x*, a popular compilation of "well-written, step-by-step guides for re-creating our favorite technologies from scratch", contains some.

* https://github.com/codecrafters-io/build-your-own-x

delta_p_delta_x•1mo ago
Here you go: https://haqr.eu/tinyrenderer/
ggambetta•1mo ago
OP's link is a good one, but if you want a different perspective (heh), there's https://gabrielgambetta.com/computer-graphics-from-scratch/i..., also from scratch, also for free. The name clash is unfortunate; I don't really know who started using it first :(
rabf•1mo ago
https://www.youtube.com/watch?v=qjWkNZ0SXfo

One Formula That Demystifies 3D Graphics

atan2•1mo ago
This is such a nice video, but honestly, understanding the perspective divide intuitively is probably less than 1% of what 3D rendering is all about.
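
For what it's worth, the formula in question, the perspective divide, really is tiny: by similar triangles, a point's projected coordinates are its eye-space x and y scaled by the focal length and divided by depth. A minimal sketch:

```python
def project(point, focal=1.0):
    """Project an eye-space point (x, y, z) onto the image plane at z = focal.
    Similar triangles give x' = focal * x / z and y' = focal * y / z."""
    x, y, z = point
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal * x / z, focal * y / z)

# Doubling the depth halves the projected size:
near = project((2.0, 1.0, 4.0))   # (0.5, 0.25)
far = project((2.0, 1.0, 8.0))    # (0.25, 0.125)
```

Everything else (clipping, shading, visibility) is where the remaining 99% lives.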
pixelpoet•1mo ago
Good show, this is how I recommend doing it and have been teaching it for years.

It's quite unfortunate that basically everyone thinks 3D graphics necessarily implies rasterisation and using someone else's API, and I feel extremely lucky to have taught myself in a time when you could trivially display images by direct memory access (mode 13h), and to have focused on ray tracing instead of rasterisation.

RamtinJ95•1mo ago
If you are looking for a resource for this, I did this exact thing last year through Pikuma. That man is the best software teacher online I have come across. Highly recommend his 3D software renderer course.
atan2•1mo ago
I agree 100%, Gustavo is such a great teacher. I tried Scratchapixel before and it's nice, but it was with Pikuma that everything finally clicked.
gustavopezzi•1mo ago
Gustavo here. Thank you for the mention!
RamtinJ95•1mo ago
Wow did not expect to get a thank you from the legend himself! Thank you for your great courses and the care and effort you put into them sir! :D
neuroelectron•1mo ago
Just in case Nvidia stops having a monopoly on graphics APIs, and Google on the web, and AMD as the alternative that sucks and isn't maintained.
random9749832•1mo ago
You can now paste a link to a website into an LLM and turn it into an interactive resource. I did this today, but with a 1000-page PDF, to help me learn more about game engines. It's the best way to do it if you don't want it to become another forgotten PDF/bookmark.
flumpcakes•1mo ago
Which LLMs have a context window big enough to fit a 1000 page PDF?
random9749832•1mo ago
https://gemini.google/overview/long-context/
flumpcakes•1mo ago
Can you give an example of how this has helped you with graphics/gamedev? What kind of questions are you asking gemini when giving it a 1000 page book?
robaye•1mo ago
I maintain (not much anymore) a list of free resources for graphics programming that some of you might find helpful. https://gist.github.com/notnotrobby/ceef71527b4f15869133ba7b...
reactordev•1mo ago
This is gold people.

My username on here is after my (now older) game engine Reactor 3D.

I taught myself this stuff back when Quake 3 took over my high school. Doom got me into computers but Quake 3 got me into 3D. I didn’t quite understand the math in the books I bought but copied the code anyway.

Fast forward into my career and it's been a pleasant blend of web and graphics, now that WebGL/WebGPU is widely available. I taught PhDs how to vertex pack and align and how to send structs to the GPU at my day job. I regret not continuing my studies and getting a PhD, but I ended up writing Reactor 3D part time for XNA on Xbox 360 and then rewriting it half a decade later to be pure OpenGL. I still struggle with the advanced concepts, but luckily there are others out there.

Fun fact: I worked with the guy who wrote XNA Silverlight, which would eventually be used as the basis for MonoGame, so I'm like MonoGame's great-grand-uncle half removed or something.

Now that we have different ways of doing things, it demands a different kind of engine. So the Vulkan/DX12/Metal way is the new jam.

slu•1mo ago
This looks great, I will have to check it out. I've always been interested in computer graphics; I even wrote a ray tracer while studying for my master's. It was based on the then brand-new book "An Introduction to Ray Tracing" by Andrew S. Glassner et al. The book is now free to download: https://www.realtimerendering.com/blog/an-introduction-to-ra...
pipes•1mo ago
Is this in any way related to this:

https://www.gabrielgambetta.com/computer-graphics-from-scrat...