
AMD and Sony's PS6 chipset aims to rethink the current graphics pipeline

https://arstechnica.com/gaming/2025/10/amd-and-sony-tease-new-chip-architecture-ahead-of-playstation-6/
92•zdw•4h ago

Comments

three_burgers•2h ago
It feels like each time SCE makes a new console, it comes with some novelty that's supposed to change the field forever, but after two years it always ends up as just another console.
noir_lord•2h ago
It does, but I don't think that's necessarily a bad thing; at least they're willing to take some calculated risks with the architecture, given that consoles have essentially collapsed into being PCs internally.
three_burgers•2h ago
I don't think it's a bad thing either. Consoles are a curious breed in today's consumer electronics landscape; it's great that someone's still devoted to doing interesting experiments with them.
jpalawaga•2h ago
You end up with a weird phenomenon.

Games written for the PlayStation exclusively get to take advantage of everything, but there is nothing to compare the release to.

Alternatively, if a game is released cross-platform, there's little incentive to tune the performance past the benchmarks of comparable platforms. Why make the PlayStation version look better than the Xbox one if it involves rewriting engine-layer stuff to take advantage of the hardware, for one platform only?

Basically all of the most interesting utilization of the hardware comes at the very end of the console's lifecycle. It's been like that for decades.

ViscountPenguin•2h ago
I suspect it won't be as much of an issue next gen, with Microsoft basically dropping out of the console market.
awill•2h ago
3rd party games will still want to launch on the Nintendo Switch 2, so it's still the same problem.
dontlaugh•47m ago
The Switch (even the 2) is nowhere near the same class of performance as the PlayStation or Xbox; games on them aren't comparable.
beagle3•2h ago
It’s also that way on the C64 - while it came out in 1981, people figures out how to get 8 bit sound and high resolution color graphics with multiple sprites only after 2000…
three_burgers•2h ago
I think apart from cross-platform woes (if you can call it that), it's also that the technology landscape would shift two or a few years after the console's release:

For the PS2, game consoles didn't become the centre of home computing; for the PS3, programming against the GPU, not some exotic processor, became the standard way of doing real-time graphics, plus home entertainment moved on to take other forms (like watching YouTube on an iPad instead of having a media centre set up around the TV); for the PS4, people didn't care whether the console did social networking; the PS5 has been practical, it's just that its technology/approach ended up adopted by everyone, so it lost its novelty later on.

pjmlp•2h ago
That is very country specific; in many countries home computers have dominated ever since the 8-bit days, whereas in others consoles have dominated since the Nintendo/SEGA days.
anthk•1h ago
Also, tons of blue-collar people bought Chinese NES clones even in the mid-90s (at least in Spain), while some people with white-collar jobs bought their kids a PlayStation. And of course the Brick Game Tetris console was everywhere. By the late 90s, yes, most people could afford a PlayStation, but as for myself, I got a computer in the very early 00s and could emulate the PSX and most N64 games just fine (my computer wasn't a high-end one, but the emulators were good enough to play the games at 640x480 with a bilinear filter).
ffsm8•21m ago
You've got a very "interesting" history there; it's not particularly grounded in reality, however.

The PS3's edge was generally seen as the DVD player.

That's why Sony went with Blu-ray in the PS4, hoping to capitalize on the next medium, too. While that bet didn't pay out, Xbox kinda self-destructed, consequently making them the dominant player anyway.

Finally:

> PS5 has been practical, it's just the technology/approach ended up adopted by everyone, so it lost its novelty later on.

The PS5 did not have any novel approach that was subsequently adopted by others. The only thing "novel" in the current generation is frame generation, and that had already been pushed for years by the time Sony jumped on that bandwagon.

MindSpunk•13m ago
You've got your history wrong too.

The PS2 was the DVD console. The PS3 was the Blu-ray console.

The PS4 and PS5 are also Blu-ray consoles; however, Blu-rays are too slow now, so they're just a medium for movies or for installing the game from.

ericye16•1h ago
Maybe I ate too much marketing, but it does feel like the PS5 shipping with an SSD raised the bar for how fast games are expected to load, even across platforms.
numpad0•45m ago
That was kind of true until the Xbox 360 and, later, Unity; those ended the era of consoles as machines made of quirks, and of game design as primarily a software architecture problem. The definitive barrier to entry for indie gamedevs before Unity was the ability to write a toy OS, a rich 3D engine, and a GUI toolkit by themselves. Only a little storytelling skill was needed.

Consoles also partly had to be quirky dragsters because of Moore's Law - they had to be ahead of PCs by years, because they had to be at least comparable to PC games at the end of their lifecycle, not utterly obsolete.

But we've all moved on. IMO that is a good thing.

Negitivefrags•2h ago
I really hope that this doesn't come to pass. It's all in on the two worst trends in graphics right now. Hardware Raytracing and AI based upscaling.
RedShift1•2h ago
What's wrong with hardware raytracing?
Negitivefrags•2h ago
There are a lot of theoretical arguments I could give you about how almost all cases where hardware BVH can be used, there are better and smarter algorithms to be using instead. Being proud of your hardware BVH implementation is kind of like being proud of your ultra-optimised hardware bubblesort implementation.

But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

A common argument is that we don't have fast enough hardware yet, or that developers haven't been able to use raytracing to its fullest yet, but it's been a pretty long damn time since this hardware became mainstream.

I think the most damning evidence of this is the just-released Battlefield 6. This is a franchise that previously had raytracing as a top-level feature. This new release doesn't support it and doesn't intend to.

And in a world where basically every AAA release is panned for performance problems, BF6 has articles like this: https://www.pcgamer.com/hardware/battlefield-6-this-is-what-...

noir_lord•2h ago
> But how about a practical argument instead. Enabling raytracing in games tends to suck. The graphical improvements on offer are simply not worth the performance cost.

Pretty much this - even in games that have good ray tracing, I can't tell when it's off or on (except for the FPS hit). I cared so little that I bought a card not known to be good at it (7900 XTX), because the two games I play the most don't support it anyway.

They oversold the technology/benefits and I wasn't buying it.

ahoka•13m ago
There always were, and always will be, people who swear they can't see the difference with anything above 25 Hz, 30 Hz, 60 Hz, 120 Hz, HD, Full HD, 2K, 4K. Now it's ray tracing, right.
asah•1h ago
naive q: could games detect when the user is "looking around" at breathtaking scenery and raytrace just those moments? offer a button to "take picture" and let the user specify how long to raytrace? then for heavy action and motion, ditch the raytracing? even better, as the user passes through "scenic" areas, automatically take pictures in the background. Heck, this could be an upsell, kind of like the real-life pictures you get on the roller coaster... #donthate

(sorry if obvious / already done)
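
A minimal sketch of the idea - essentially a quality toggle keyed off camera motion. All names and thresholds here are hypothetical; a real engine would tune them and wire the result into its render-path selection:

```cpp
// Hypothetical sketch: enable ray tracing only once the camera has been
// near-still for a while (the player is "looking around"), and fall back
// to raster during fast motion. Names and thresholds are illustrative.
#include <cmath>

struct CameraState {
    float yaw, pitch;          // view angles, radians
    float x, y, z;             // world position
};

class AdaptiveRayTracing {
    float calmSeconds_ = 0.0f; // how long the camera has been near-still
    bool  rtOn_ = false;

public:
    // Call once per frame; returns whether to render with ray tracing.
    bool update(const CameraState& prev, const CameraState& cur, float dt) {
        float angular = std::abs(cur.yaw - prev.yaw) +
                        std::abs(cur.pitch - prev.pitch);
        float linear  = std::hypot(cur.x - prev.x,
                                   cur.y - prev.y,
                                   cur.z - prev.z);
        // Made-up thresholds: ~1 deg/s of rotation, 0.5 units/s of movement.
        bool calm = angular < 0.017f * dt && linear < 0.5f * dt;
        calmSeconds_ = calm ? calmSeconds_ + dt : 0.0f;
        // Hysteresis: require ~0.75 s of calm before paying the RT cost,
        // so the pipeline doesn't thrash during brief pauses in combat.
        rtOn_ = calmSeconds_ > 0.75f;
        return rtOn_;
    }
};
```

The "take picture" variant would be even simpler: freeze the camera and let a progressive path tracer accumulate samples for as long as the user asked.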

danparsonson•57m ago
Not exactly the same, but adaptive rendering based on viewer attention reminded me of this: https://en.wikipedia.org/wiki/Foveated_rendering
Our_Benefactors•2h ago
Not OP, but a lot of the current kvetching about hardware-based ray tracing is that it's basically an Nvidia-exclusive party trick, similar to DLSS and PhysX. AMD has this inferiority complex where Nvidia must not be allowed to innovate with a hardware+software solution; it must be pure hardware so AMD can compete on their terms.
diffeomorphism•2h ago
Much higher resource demands, which then require tricks like upscaling to compensate. Also, you get uneven competition between GPU vendors, because in practice it is not hardware ray tracing but Nvidia ray tracing.

On a more subjective note, you get less interesting art styles because studios somehow have to cram raytracing in as a value proposition.

bob1029•1h ago
It will never be fast enough to work in real time without compromising some aspect of the player's experience.

Ray tracing is solving the light transport problem in the hardest way possible. Each additional bounce adds exponentially more computational complexity. The control flows are also very branchy when you start getting into the wild indirect lighting scenarios. GPUs prefer straight SIMD flows, not wild, hierarchical rabbit hole exploration. Disney still uses CPU based render farms. There's no way you are reasonably emulating that experience in <16ms.
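
To put rough numbers on that growth: with a branching factor of b secondary rays per hit and a path depth of d, you're looking at roughly spp · b^d rays per pixel. A back-of-the-envelope sketch (illustrative figures, not measurements from any real renderer):

```cpp
// Back-of-the-envelope: rays per pixel ≈ spp * branch^depth, so each extra
// bounce multiplies the work. All figures below are illustrative.
#include <cstdint>
#include <cstdio>

int main() {
    const std::uint64_t spp    = 4;                 // samples per pixel
    const std::uint64_t branch = 8;                 // secondary rays per hit
    const std::uint64_t pixels = 3840ull * 2160ull; // one 4K frame

    for (int depth = 1; depth <= 4; ++depth) {
        std::uint64_t raysPerPixel = spp;
        for (int i = 0; i < depth; ++i) raysPerPixel *= branch;
        double perFrame  = static_cast<double>(raysPerPixel * pixels);
        double perSecond = perFrame * 60.0;         // 60 fps budget
        std::printf("depth %d: %6llu rays/px, %7.1f G/frame, %9.1f G/s\n",
                    depth, static_cast<unsigned long long>(raysPerPixel),
                    perFrame / 1e9, perSecond / 1e9);
    }
    // Depth 3 already implies ~17 G rays per frame (~1 T rays/s at 60 fps),
    // and each ray is a branchy, divergent BVH walk, not a straight SIMD lane.
    return 0;
}
```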

The closest thing we have to functional ray tracing for gaming is light mapping. This is effectively just ray tracing done ahead of time, but the advantage is you can bake for hours to get insanely accurate light maps and then push 200+ fps on moderate hardware. It's almost like you are cheating the universe when this is done well.
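
A sketch of that bargain - names and structure are illustrative, not any engine's actual API: all of the ray-tracing cost is paid once, offline, and the runtime cost collapses to a texture fetch.

```cpp
// Sketch of the lightmap trade: pay for hours of offline path tracing once,
// then runtime "global illumination" is a single texture lookup per texel.
#include <cstddef>
#include <vector>

struct Vec3 { float r, g, b; };

// Stand-in for an offline path tracer; a real baker fires thousands of rays
// per texel and can run for hours, since nobody is waiting on it.
Vec3 traceIrradiance(std::size_t texel, int samples) {
    (void)texel; (void)samples;
    return {1.0f, 1.0f, 1.0f};  // stub result
}

// Offline: bake once, ship the result with the game.
std::vector<Vec3> bakeLightmap(std::size_t w, std::size_t h) {
    std::vector<Vec3> lm(w * h);
    for (std::size_t t = 0; t < lm.size(); ++t)
        lm[t] = traceIrradiance(t, 100000);  // spend as long as you like
    return lm;
}

// Runtime: the entire per-fragment lighting cost is one array lookup
// (a real engine samples this bilinearly on the GPU).
Vec3 sampleLightmap(const std::vector<Vec3>& lm, std::size_t w,
                    std::size_t u, std::size_t v) {
    return lm[v * w + u];
}
```

The limitation, of course, is that baked lighting only works for static geometry and static lights, which is exactly the dynamic-content objection raised below.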

The human brain has a built-in TAA solution that excels as frame latencies drop into single-digit milliseconds.

zubspace•1h ago
The problem is the demand for dynamic content in AAA games: large exterior and interior worlds with dynamic lights, day and night cycles, glass and translucent objects, mirrors, water, fog, and smoke. Everything should be interactable and destructible. And everything should be easy for artists to set up.

I would say the closest we can get is workarounds like radiance cascades. But everything other than raytracing is just an ugly workaround that falls apart in dynamic scenarios. And don't forget that baking times, and storing those results (leading to massive game sizes), are a huge negative.

Funnily enough, raytracing is also just an approximation of the real world, but at least artists and devs can expect it to work everywhere without hacks (in theory).

realusername•2h ago
So far, AI upscaling/interpolation has mostly been used to ship horribly optimized games at a somewhat acceptable framerate.
washadjeffmad•1h ago
The gimmicks aren't the product, and the customers of frontier technologies aren't the consumers. The gamers and redditors and smartphone fanatics, the fleets of people who dutifully buy, are the QA teams.

In accelerated compute, the largest areas of interest for advancement are 1) simulation and modeling and 2) learning and inference.

That's why this doesn't make sense to a lot of people. Sony and AMD aren't trying to extend current trends, they're leveraging their portfolios to make the advancements that will shape future markets 20-40 years out. It's really quite bold.

distances•1h ago
I also find them completely useless for any games I want to play. I wish AMD would release a card that just drops both of these, but that's probably not realistic.
stanac•1h ago
They will never drop ray tracing; some new games require it. The only place I think it's not needed is in specialized office prebuilt desktops or mini PCs.
Sol-•1h ago
The amount of drama about AI-based upscaling seems disproportionate. I know framing it in terms of AI and hallucinated pixels makes it sound unnatural, but graphics rendering already works with so many hacks and approximations.

Even without modern deep-learning based "AI", it's not like the pixels you see with traditional rendering pipelines were all artisanal and curated.

anal_reactor•34m ago
Regarding AI-based upscaling, autists go reeeee meanwhile gamers go wrrrrrrrrr.

AI upscaling is an amazing technology that can provide excellent graphics with minimal overhead. I always play with DLSS on because it's simply the superior experience. Of course there's a group of devs that will excuse poor optimization because gamers can just run at a lower res and upscale, but that complaint shows up literally every time there's a new technique for doing something.

Our_Benefactors•2h ago
Cell processor 2: electric boogaloo

Seems they didn’t learn from the PS3, and that exotic architectures don't drive sales. Gamers don’t give a shit and devs won’t choose it unless they have a lucrative first party contract.

bigyabai•2h ago
Custom graphics architectures aren't always a disaster - the Switch 2 is putting up impressive results with its in-house DLSS acceleration.

Now, shackling yourself to AMD and expecting a miracle... that I cannot say is a good idea. Maybe Cerny has seen something we haven't, who knows.

farseer•2h ago
The entire Switch 1 game library is free to play on emulators. They probably added a custom accelerator to hinder reverse engineering - a consequence of using weaker-spec parts than their competitors.
bigyabai•2h ago
The Switch 1 also had CUDA cores and other basic hardware accelerators. To my knowledge (and I could be wrong), none of the APIs that Nintendo exposed even gave access to those fancy features. It should just be calls to NVN, which can be translated into Vulkan the same way DXVK translates DirectX calls.
whatever1•2h ago
Hopefully their game lineup is not as underwhelming as the PS5's.
WhereIsTheTruth•1h ago
underwhelming? what do you mean?

every year, PlayStation ranks very high when it comes to GOTY nominations

just last year, PlayStation had the most nominations for GOTY: https://x.com/thegameawards/status/1858558789320142971

not only that, but the PS5 has more first-party games than Microsoft's Xbox Series X|S

1053 vs 812 (a number inflated by the recent Activision acquisition)

https://en.wikipedia.org/wiki/List_of_PlayStation_5_games

https://en.wikipedia.org/wiki/List_of_Xbox_Series_X_and_Seri...

It's important to check the facts before spreading random FUD

The PS5 has had the strongest lineup of games this generation, which is why they've sold this many consoles

Still today, consumers are attracted to the PS5's lineup, and this is corroborated by facts and data: https://www.vgchartz.com/

In August, for example, the sales ratio between PS5 and Xbox was 8:1 - almost as good as the new Nintendo Switch 2's, and the console is almost 5 years old!

You say "underwhelming", people are saying otherwise

whatever1•1h ago
Yeah, I don’t recall a single original game from the PS5 exclusive lineup (that wasn’t available for PS4). We did get some remakes and sequels, but the PS5 lineup pales in comparison to the PS4 one.

Also, to my knowledge, the PS5 still lags behind the PS4 in terms of sales, despite the significant boost that COVID-19 provided.

guidedlight•54m ago
The PS4 lineup pales in comparison to the PS3 lineup, which pales in comparison to the PS2 lineup, which pales in comparison to the PS1 lineup.

Each generation has around half the number of games as the previous. This does get a bit murky with the advent of shovelware in online stores, but my point remains.

I think all this proves is that games are now ridiculously expensive to create while meeting the expected quality standards. Maybe AI will improve this in the future. Take-Two has confirmed that GTA6's budget has exceeded US$1 billion, which is mind-blowing.

ManlyBread•15m ago
There's simply no point in buying that console when it has, what, like 7 exclusive titles that aren't shovelware? 7 titles after 5 years? And that number keeps going down because games are constantly being ported to other systems.
lofaszvanitt•2h ago
No one is gonna give you groundbreaking tech exclusively for your electronic gadget... as IBM showed when they created the Cell for Sony and then gave almost the same tech to Microsoft :D
magicalhippo•2h ago
I was going to say "again?", but then I recalled DirectX 12 was released 10 years ago and now I feel old...

The main goal of Direct3D 12, and subsequently Vulkan, was to allow better use of the underlying graphics hardware as it moved further and further from its fixed-pipeline roots.

So maybe the time is ripe for a rethink, again.

The frame generation features in particular - upscaling and frame interpolation - have promise, but I think they need to be integrated in a different way to really be of benefit.

Hikikomori•24m ago
Don't forget Mantle.
pjmlp•14m ago
The rethink is already taking place via mesh shaders and neural shaders.

You aren't seeing them adopted much yet because the hardware still isn't deployed at a scale where games can count on them being available, which in turn means the developer experience of adopting them doesn't improve either.

phendrenad2•1h ago
Seems like the philosophy here is: if you're going to do AI-based rendering, you might as well try it across different parts of the graphics pipeline and see if you can fine-tune it at the silicon level. Probably a micro-optimization, but if it makes the PS6 look a tiny bit better than the Xbox, people will pay for that.
amlib•1h ago
Could the PS6 be the last console generation with a meaningful improvement in compute and graphics? Miniaturization keeps giving ever more diminishing returns with each shrink, and prices of electronics are going up (even sans tariffs), led by the rising cost of making chips. Alternative techniques have slowly been introduced to offset the compute deficit: first post-processing AA in the seventh generation, then "temporal everything" hacks (including TAA) in the previous generation, and finally minor use of AI upscaling in the current generation and (projected) major use of AI upscaling and frame generation in the next.

However, I'm pessimistic about how this can keep evolving. RT already takes a non-trivial amount of the transistor budget, and now those high-end AI solutions require another considerable chunk of it. If we are already reaching the limits of what non-generative AI upscaling and frame generation can do, I can't see where a PS7 can go other than using generative AI to interpret a very crude low-detail frame and generate a highly detailed photorealistic scene from it - but that will, I think, require many times more transistor budget than will likely ever be economically achievable for a whole PS7 system.

Will that be the end of consoles? Will everything move to the cloud, with a power-guzzling 4 kW machine taking care of rendering your PS7 game?

I really can only hope there is a breakthrough in miniaturization and we can go back to a pace of improvement that can actually give us a new generation of consoles (and computers) that makes the transition from the SNES to the N64 feel quaint.

bob1029•1h ago
Gaming using weird tech is not a hardware manufacturer or availability issue. It is a game studio leadership problem.

Even in the latest versions of Unreal and Unity you will find the classic tools. They just won't be advertised, and the engine vendor might even frown upon them during a tech demo to make their fancy new temporal slop solution seem superior.

The trick is to not get taken for a ride by the tools vendors. Real-time lights, "free" anti-aliasing, and sub-pixel triangles are the forbidden fruits of game dev. It's really easy to get caught up in the devil's bargain of trading unlimited art detail for unknowns on the end customer's machine.

pixelpoet•1h ago
Teenage me from the 90s telling everyone that ray tracing will eventually take over all rendering and getting laughed at would be happy :)
prox•26m ago
Hi teenage you! You did well :)

The idea of the radiance cores is pretty neato

poisonborz•1h ago
The industry, and the gaming community at large, is long past being interested in graphics advancement. AAA games are too complicated and expensive; the whole notion of ever more complex and grandiose experiences doesn't scale. Gamers are fractured into thousands of small niches, even along the timeline, with the 80s, 90s, and PS1 eras each having a small circle of businesses serving them.

The time of console giants, their fiefdoms, and the big game studios is coming to an end.

b_e_n_t_o_n•1h ago
idk, Battlefield 6 came out today to very positive reviews and it's absolutely gorgeous.
ksec•45m ago
It looks like Frostbite 4.0 is so much better than Unreal 5.x. I can't wait to see a comparison.
jimaek•8m ago
It's fine, but definitely a downgrade compared to previous titles like Battlefield 1. At moments it looks pretty bad.

I'm curious why graphics are stagnating and even getting worse in many cases.

rafaelmn•1h ago
I disagree - current-gen consoles aren't enough to deliver smooth, immersive graphics. I played BG3 on PlayStation first and then on PC, and there's just no comparing the graphics. Cyberpunk, same deal. I'll pay to upgrade to a consistent 120 fps at 4K and better graphics, and I'll buy the games.

And there are AAA that make and will make good money with graphics being front and center.

Ntrails•15m ago
> aren't enough to deliver smooth immersive graphics

I'm just not sold.

Do I really think that BG3 being slightly prettier than, say, Dragon Age, Skyrim, etc. made it a more enticing game? Not to me, certainly. Was Cyberpunk prettier than The Witcher 3? Did it need to be for me to play it?

My query isn't whether you can get people to upgrade to play new stuff (always true), but whether they'd still upgrade if they could play on the old console with worse graphics.

I also don't think anyone is going to suddenly start playing video games because the graphics improve further.

pjmlp•10m ago
Being an old dog who still cares about gaming, I would assert that many games, coded in Unreal and Unity - a kind of Electron for games - are simply not taking advantage of current-gen hardware.

There is a reason there are so many complaints on social media about it being obvious to gamers which engine a game was written in.

It used to be that game development quality was taken more seriously, when games were sold on storage media and there was a deadline to burn those discs/cartridges.

Now they just ship whatever is done by the deadline, and fixes come later via an update, if at all.

amazari•1h ago
So this is AMD catching up with Nvidia in the RT and AI upscaling/frame gen fields. Nothing wrong with it, and I am quite happy as an AMD GPU owner and Linux user.

But the way it is framed as a revolutionary step and as a Sony collab is a tad misleading. AMD is competent enough to do it by itself, and this will definitely show up on PC and in the competing Xbox.

esperent•56m ago
I think we don't have enough details to make statements like this yet. Sony has shown they're willing to make esoteric gaming hardware in the past (the Cell architecture), and maybe they'll do something unique again this time. Or maybe it'll just use a moderately customized design. Or maybe it's going to use exactly what AMD had planned for the next few years anyway (as you say). Time will tell.

I'm rooting for something unique because I haven't owned a console for 20 years and I like interesting hardware. But hopefully they've learned a lesson about developer ergonomics this time around.

jiehong•1h ago
I see this as a test ground for the next thing on PC.

Why not also include a mini AMD EPYC CPU with 32 cores? That way games would start to get much better at using multicore.