Mainstream users and business organizations don’t really understand that concept and would prefer to see how the operating system enables their use cases and workflows.
RAM shortages will be quite temporary. Making predictions based on individual component shortages has never been a winning strategy in the history of the industry. Next you’ll tell me that graphics cards will be impossible to get because of blockchain.
The same as any other corporate PR department: "At least now when people run it with N GB of RAM, we can just point to the system requirements and say 'This is what we support' rather than end up in a back-and-forth"
If you expect them to have any sort of long-term outlook on "Lets be careful with how developers view our organization", I think you're about a decade too late for Canonical.
At home I have a desktop running Arch plus Gnome with 32GB RAM and I am at 7GB on a normal day and below 16GB at all times unless I run an LLM.
Unrelated to this, despite Ubuntu’s popularity, I think it’s one of the worst distro choices out there, especially for including old kernels for essentially no discernible reason.
I wouldn’t go so far as defending Microslop but I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
They seem to defend Apple’s 8GB machines by saying that Apple systems perform better than Windows with the same amount of RAM. This claim is entirely unsubstantiated.
Windows has a lot of problems but performance and memory efficiency is not one of them. We should recall that Microsoft actually reduced RAM usage and minimum requirements between windows 7 and 8 as they wanted to get into the tablet game, and Windows has remained efficient with memory since then as Microsoft wants Windows to come with cheap Chromebook-like hardware and other similar low-end systems.
I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.
Yesterday I accidentally filled up my RAM on Fedora, and even earlyoom took several minutes to trigger; in the meantime the system was essentially non-responsive.
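For what it's worth, earlyoom's trigger thresholds are configurable so it can act sooner; on Debian/Fedora-style packages the flags usually live in /etc/default/earlyoom (path and variable name may differ on your distro):

```shell
# /etc/default/earlyoom -- act earlier than the defaults
# -m 5: start killing when available RAM drops below 5%
# -s 10: ...and free swap drops below 10%
# -r 60: log a memory report every 60 seconds
EARLYOOM_ARGS="-m 5 -s 10 -r 60"
```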
How do you think data is created? It's lots of anecdotes, normalised.
On my desktop I use linux-cachyos-bore-lto, which seems to give me a slight boost in compilation times compared to the regular kernel, but I've had at least one crash that I've been unable to attribute to any other specific cause, so it could be the kernel, I suppose. I wouldn't use it on a server, though.
If you’re running applications as in a server that’s an entirely different discussion. I have been assuming we are talking about desktop users who are not serving anything.
E.g., if I go out and buy a 2026 Panther Lake laptop with a new WiFi 7 chip or what have you, I’m going to want a distro with the latest kernels so that I don’t have hardware issues. If I install the default Ubuntu download it’s going to almost certainly have problems.
> I do get tired of the Apple fanboys accusing Windows of being bloated and running poorly.
> Windows has a lot of problems but performance and memory efficiency is not one of them.
I can't even describe how much your experience differs from mine. I would never have imagined someone uttering such a sentence about Windows in this day and age.
For everyone else reading this, a couple of pieces of advice I've gotten that made me suffer less with Windows: replace Windows search with Everything (by Voidtools) and replace Explorer with Filepilot (filepilot.tech).
On an older machine, I switched to Tiny10.
So around ~3 years ago I bought a lightweight low-end laptop (Intel Core i3, 14-inch display, 8GB of RAM) for everyday stuff, so I could easily bring it everywhere I'd need it. It came with Windows 11 pre-installed. For context: ~10 years ago I had a Windows 7 system and it was pretty neat. And I remembered that when people were switching from Windows 7 to Windows 8 or 10, they blamed the new OS version just like Windows 11 gets blamed now; yet everyone got used to it, it received fixes, improvements, etc. So I thought, "well, maybe Windows 11 is not so bad, I should try it out at least for the sake of curiosity."
And the clean installation of Windows 11 that came with it took ~20 seconds to fully boot to the login window. I know my laptop is not the best of the best, but still... After startup, with no apps open, there was ~4 GB of RAM in use just out of nowhere, so effectively I was limited to ~4GB of RAM for anything I actually wanted to run. Bluetooth drivers were terrible (at the time): sometimes I could connect to my headphones and sometimes I couldn't, while they worked perfectly with all my other devices. Then there was the hellish "Antimalware Service Executable" process. And I know how it sounds; I have nothing against anti-virus software, but when it randomly shows up several times per day, eats all of your processing power (~80% CPU usage, and note that I have 8 cores at ~3 GHz here), and heats up your laptop to the point that the fan starts screaming... that was not great, to put it mildly. Battery life was also a disappointment: sometimes it couldn't last even 3 hours, while the heaviest thing I was doing was compiling some software.
I kept trying, re-configuring, applying patches... and finally I got fed up with all of this bloat, broken updates, and other garbage. So I just backed up all of my important files and data to an external drive and installed Linux Mint (because in this particular case I just needed a working laptop). And wow, it just worked! Now at startup I get ~1 GB of RAM usage at most (this actually depends on the DE I use, so the numbers can differ from time to time), battery life improved, no more weird Bluetooth issues, no more random bloatware... it just works, and that's it.
I know that distros like Mint are focused on stability and efficiency, so maybe the comparison is a bit unfair. But hell, even though I don't have anything against Windows 7 or Windows 8, the recent Windows 11 is a real combination of bloatware and spyware. So performance and memory efficiency are, actually, a problem here. Or at least they were a problem last time I tried it.
Now, again, I may be wrong somewhere, maybe I missed something out. If I did - please point it out.
My Framework laptop running CachyOS with KDE Plasma with nothing open except System Monitor reserves 4GB with 500MB in swap (I enabled swap for sleep to hibernate, normally there’s no swap).
Reserving RAM doesn’t mean there’s a performance problem.
Most of the things you’re talking about in your comment have nothing to do with RAM usage and memory efficiency. You’re complaining about some annoying preinstalled OEM software [1], bad drivers, fan noise, battery life, and windows updates. That stuff isn’t great but a lot of it doesn’t have anything to do with Windows RAM efficiency itself.
If you download the Windows ISO from Microsoft and clean install you’ll have a pretty nice experience. I think Microsoft needs to crack down on OEM software additions.
As far as slow boot times and slow initial setup go, I'll remind you that Macs also have that issue during first boot and spend a lot of time doing initial indexing.
Linux mint is a great distro and I also prefer Linux to both Mac and Windows as well. Mostly my commentary is on the subject of people claiming Microsoft Windows is bad with RAM when we now see some Linux distros asking for more RAM than Windows. I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
[1] I have to assume you’re talking about some third party antimalware program because the Microsoft one absolutely does not behave how you describe.
It does in my own experience (so it may not be a problem for you, I agree, but it is a problem for me). Because when the OS allocates ~50% of RAM for itself and won't let it go, other software simply can't use it. Therefore, you're limited. Your potential performance is capped at a certain level just because your OS decided to claim half or more of your system RAM. Why? Well, just because it wants to.
> have nothing to do with RAM usage or performance
Well, to be honest, most of them don't. But would you please explain, then, why it takes around 20 seconds just to boot up, while the aforementioned Linux Mint (and I'll clarify that it's currently 22.3 for me, the latest version; it was 22.1 at the time as far as I remember) takes only ~3-4 seconds to reach the login screen and then another second (at most) to load everything after I log in? Could you also, please, explain how even GNOME's Nautilus file manager uses less RAM and far less CPU than Microsoft's Explorer (and I won't even mention Thunar, that would be unfair)? What about the Start menu in Windows, which spiked the CPU just by opening and closing? There are a lot of performance issues, with both RAM and CPU usage.
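As an aside, on the Linux side it's at least easy to see where boot time actually goes; these are standard systemd tools (guarded so the snippet degrades gracefully on systems where systemd isn't running):

```shell
# Break down boot time on a systemd-based distro.
if command -v systemd-analyze >/dev/null 2>&1; then
    systemd-analyze time 2>/dev/null || echo "systemd-analyze needs systemd as PID 1"
    systemd-analyze blame 2>/dev/null | head -n 5   # slowest units first
fi
boot_check=ok
```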
I'm not saying that these problems are unique to Windows, no; but saying that Windows doesn't have any performance issues is not really true.
> I think it’s quite clear that RAM isn’t the problem with Windows, it’s a lot of other things and the surrounding ecosystem.
I agree with you here. That's true. A large part of the problem comes not from the operating system itself, but from application software. I once thought that, well, if the RAM shortages last longer than just one or two years, that will be bad, but also, maybe, just maybe, some software developers will start to think at least a bit more about optimization...
Editing without noting that you have edited your reply is not very good, you know. But okay.
Actually, I'm talking about the Windows-shipped Microsoft Defender process (at least it seems to come from Microsoft Defender). I did not see anything third-party installed on my laptop at the time, and it really did behave just as I described. I should also remind you that it is a low-end laptop with an Intel Core i3-N305; it's not the most powerful CPU in the world: just 8 cores, 8 threads, and a 3.80 GHz max boost frequency.
If you think I'm lying, just search for "Antimalware Service Executable high CPU usage" in any search engine. You will find plenty of complaints and even some guides on how to deal with it.
2: Win11 is not usable with 4GB
3: Trisquel 12 Ecne exists. You might need Xanmod as a proprietary kernel because of hardware, but try to blacklist mei and mei_me first in some .conf file at /lib/modprobe.d. Value your privacy.
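For reference, a blacklist file is just a couple of lines; /etc/modprobe.d is the conventional place for local overrides (the /lib/modprobe.d path mentioned above is normally reserved for distro-shipped files):

```
# /etc/modprobe.d/blacklist-mei.conf
# Prevent the Intel Management Engine Interface drivers from loading.
blacklist mei
blacklist mei_me
```

You may also need to regenerate the initramfs afterwards (update-initramfs -u on Debian-family systems) if the modules get loaded from there.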
Trisquel MATE with zram-config and some small tweaks can work with 4GB of RAM, even with a browser with dozens of tabs, at least with uBlock Origin.
but I still would recommend 6 GiB.
no matter of the OS
the problem here is more the programs you run on top of the OS (browser, electron apps, etc.)
realistically speaking you should budget at least 1GiB for your OS even if it's minimalist, and to avoid issues make it 2GiB for the OS plus some emergency buffer, caches, load spikes, etc.
and 2GiB for your browser :(
and 500MiB for misc apps (mail, music, etc.)
wait, we are already at 4.5 GiB and I still need OpenOffice ....
even if XFCE would save 500 MiB, it IMHO wouldn't matter (for the recommendation)
and sure, you can make it work: only have one tab open at a time, close the browser whenever you don't need it, don't use Spotify or YT, etc.
but that isn't what people expect, so give them a recommendation that will work with what they expect; if someone tries to run it with less RAM it may work, but if it doesn't, at least it isn't your fault
Snap still kinda egh though ;-D
They compared Ubuntu's minimum recommended RAM to Windows' absolute minimum RAM requirement.
But Windows has monetary incentives (related to vendors) to say it supports 4GiB of RAM even if Windows runs very poorly on it; on the other hand, Ubuntu is incentivized to provide a more realistic minimum for comfortable usage.
I mean taking a step back all common modern browsers under common usage can easily use multiple GiB of memory and that is outside of the control of the OS vendor. (1)
As a consequence, IMHO recommending anything below 6 GiB is just irresponsible (if a modern browser is used), _no matter what OS you use_.
---
(1): If there is no memory pressure (i.e. caches don't get evicted that fast, larger video buffers are used, no fast tab archiving, etc.) then having YT playing will likely consume around ~600-800 MiB. (Be aware that this is not just JS memory usage but the whole usage across JS, images, video, the HTML+CSS engine, etc.) For comparison, webmail like Proton or Gmail is often roughly around 300MiB, Spotify interestingly "just" around 200MiB, and HN around 55MiB.
I suppose the biggest driver of the change in RAM usage is Electron and the bloated world of text editors and other simple apps written in Electron.
For people wanting the old-fashioned fast and simple GUI experience, I recommend LXQt.
So basically, it's no use if you've never tasted 120+Hz displays. And don't, because once you do, you won't go back.
But for gaming, it really is hard to go back to 60.
On a CRT monitor the difference between running at 60 Hz and even a just slightly better 72 Hz was night and day: unbearable flickering vs. a much better experience. I remember having some little utility for Windows that allowed the refresh rate to be 75 (not 72 but 75). Under Linux I was writing modelines myself (those were the days!) to get the refresh rate and screen size (in pixels) I liked: I was running "weird" resolutions like 832x604 @ 75 Hz instead of 800x600 @ 60 Hz, just to gain a little more screen real estate and a better refresh rate.
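For the curious, a modeline is just the pixel clock plus the horizontal and vertical timing numbers written out by hand. The example below uses the standard 832x624 @ 75 Hz timing (the well-known Apple mode, not the exact hand-tuned 832x604 mode described above) in xorg.conf syntax:

```
Section "Monitor"
    Identifier "CRT0"
    # "name"  clock(MHz) hdisp hsyncstart hsyncend htotal  vdisp vsyncstart vsyncend vtotal
    Modeline "832x624" 57.284  832 864 928 1152  624 625 628 667
EndSection
```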
Now that monitors are flat panels: I sure as heck have no idea whether 60 fps vs 120 fps or whatever changes anything for "desktop" usage. I don't think the CRT problem of the image fading too quickly at 60 Hz is still present. But I'm not sure about it.
I guess one of the few smaller things would be Wayland, but it has so few features that you have to wonder why it is even used.
If there's no opt-out, that's a different story.
If META's business model is not lucrative, that's not my problem.
Given it's a field where you can put absolutely anything in (and probably randomize, if you want), how is this different than the situation today, where random sites ask you for your birthday (also unverified)? Moreover Meta already has your birthday. It's already mandated for account creation, so claims of "so they can earn money with users' preferences" don't make any sense.
Those are all development tools. Has the runtime overhead grown proportionally, and what accounts for the extra weight?
Gnome 3 seems similar to Unity nowadays, and it is pretty good.
I find it much easier to use than Windows or Mac, which is credit to the engineers who work on it.
Most of the bloat these days is from containers, and Canonical's approach to Ubuntu since ~2014 has been very heavy on using upstream containers so they don't have to actually support their software ecosystem themselves. This has led to severe bloat, bad graphical theming, and broken file system access.
Maybe in some ways, yes. But there are distros out there that can easily run in as little as 1GB of RAM. And I've heard of people running them with far less.
I also remember hearing Ubuntu changed its default to Wayland; if true, I have to wonder whether that's part of the problem, because GNOME/KDE on Wayland will use far more memory than FVWM/Fluxbox on X11.
FWIW, you can do a lot just from the console without a GUI w/Linux and any BSD, in that case the RAM usage will be tiny compared to Windows and Apple.
It always makes me chuckle when I hear this. A default server (i.e. no GUI at all) installation of a RHEL derivative just outright dies silently with 1GB of RAM if there is no swap. Sure, with swap enabled it no longer dies, but to call the performance anywhere near acceptable is to lie to yourself.
Windows, in its default configuration, used so much memory that there was not much left for apps.
Ubuntu used 500MB less than Windows in system monitor. I think it was still 1GB or more. It also appeared to run more slowly than it used to on older hardware.
Lubuntu used hundreds of MB less than Ubuntu. It could still run the same apps but had fewer features in the UI (e.g. search). It ran lightning fast with more simultaneous apps.
(Note: That laptop's Wifi card wouldn't work with any Linux using any technique I tried. Sadly, I had to ditch it.)
I also had Lubuntu on a 10+ year old Thinkpad with an i7 (2nd gen). It's been my daily machine for a long time. The newer, USB installers wouldn't work with it. While I can't recall the specifics, I finally found a way to load an Ubuntu-like interface or Ubuntu itself through the Lubuntu tech. It's now much slower but still lighter than default Ubuntu or Windows.
(Note: Lubuntu was much lighter and faster on a refurbished Dell laptop I tested it on, too.)
God blessed me recently through a person who outright gave me an Acer Nitro with an RTX and Windows. My next step is to figure out the safest way to dual-boot Windows 11 and Linux for machine learning without destroying the existing filesystem or over-shrinking it.
Those numbers mean nothing when compared across OSes. Depending on how they count shared memory and how aggressively they cache, they can look very different.
The realistic benchmark would be to open two large applications (e.g. Chrome + Firefox with YouTube and Facebook, to jack up the memory usage), switch between them, and see how the system responds when switching between tasks.
https://community.acer.com/en/kb/articles/16556-how-to-upgra...
Looks like you've got space for two drives.
I'd say Windows 11's real minimal is 8 GB in 2026, with the recommended being 16 GB.
PS - And even at 8 GB, it hits 100% usage and pages under moderate load or e.g. Windows Update running in the background.
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows
Basically the change reflects the fact that, at this level of analysis (how much RAM do I need in my consumer PC), the OS is irrelevant these days. If you use a web browser then that will dominate your resource requirements and there's nothing Linux can do about that.
It doesn't matter how efficient your kernel or DE is if users expect to be able to load bloated websites in Chrome.
Can't move to Linux because it's Intel Atom and Intel P-state driver for that is borked, never fixed.
You can install Debian and it gives you all that you are familiar with from Ubuntu.
1.5TB in /var/log
All from the Firefox snap package complaining every millisecond about some trivial Snap permission.
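If anyone else hits this, a quick way to find the offenders (assumes GNU coreutils; for systemd journals specifically, capping the size beats deleting files by hand):

```shell
# Show the ten largest entries under a log directory, biggest first.
# LOGDIR is a variable so the same line works on any directory;
# /var/log itself usually needs sudo.
LOGDIR="${LOGDIR:-/var/log}"
du -ah "$LOGDIR" 2>/dev/null | sort -rh | head -n 10
# For systemd journals, cap the size instead:
#   journalctl --vacuum-size=200M
```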
I'm glad I chose an OS without goddamn Snap. It's been unadulterated pain every time I've ever interacted with it.
> The change isn't about the core operating system becoming resource-hungry. Instead, it reflects the way people use computers today—multiple browser tabs, web apps, and multitasking workflows, all of which demand additional memory.
So it is more about the 3rd party software instead of OS or desktop environment. Actually, nowadays it's recommended to have 8+ GB of RAM, regardless of OS.
I just checked the memory usage on Ubuntu 24.04 LTS after closing all the browser tabs. It's about 2GB of 16GB total RAM. 26.04 LTS might have higher RAM usage but it seems unlikely that it will get anywhere close to 6GB.
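One caveat when quoting numbers like this: "used" memory figures differ a lot between tools, because Linux counts reclaimable page cache as used. MemAvailable from /proc/meminfo is the better estimate of what apps can actually get. A small Python sketch (the sample values here are made up for illustration):

```python
import re

def parse_meminfo(text: str) -> dict:
    """Parse /proc/meminfo-style 'Key:  value kB' lines into {name: kB}."""
    fields = {}
    for line in text.splitlines():
        m = re.match(r"(\w+):\s+(\d+)\s*kB", line)
        if m:
            fields[m.group(1)] = int(m.group(2))
    return fields

def available_gib(meminfo: dict) -> float:
    # MemAvailable, not MemFree: page cache counts as "used" but is
    # reclaimable, so MemFree drastically understates real headroom.
    return meminfo["MemAvailable"] / (1024 ** 2)

# On a real system: text = open("/proc/meminfo").read()
sample = ("MemTotal:       16303684 kB\n"
          "MemFree:         1202132 kB\n"
          "MemAvailable:   11896120 kB\n")
info = parse_meminfo(sample)
print(f"{available_gib(info):.1f} GiB actually available")
```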
There's rather a lot of information in a single uncompressed 1080p image. I can't help but wonder what it all gets used for.
But I remember that in 2016 Fedora GNOME consumed about 1.6GB of RAM on my PC with 2GB of RAM. Considering that a decade later the standard Ubuntu GNOME consumes only 400MB more, and also that my new laptop has 16GB of RAM (the system might use more RAM when more RAM is installed), I think the increase is not that bad for a decade. I thought it would be much worse.
The culprit is browsers, mostly.
What has changed? Why do I need 10x the RAM to open a handful of terminals and a text editor?
Partly because increased RAM usage can sometimes improve execution speed / smoothness or security (caching, browser tab isolation).
Partly because developers have less pressure to optimize software performance, so they optimize other things, such as development time.
Here is an article about bloat: https://waspdev.com/articles/2025-11-04/some-software-bloat-...
I just tested this with 25.10 desktop, default gnome. With 24.04 LTS it doesn't even start up with 2GiB.
Apparently it's still in discussion but it's April now so seems unlikely.
Kind of weird how controversial it is considering DOS had QEMM386 way back in 1987.
https://www.microsoft.com/en-us/windows/windows-11-specifica...
4GB of RAM? What? I guess if your minimum is "able to start Windows and eventually reach the desktop", sure? I wouldn't even use Windows 11 with 8GB even though it would theoretically be okay.
Not okay as soon as you throw on the first security tool, lol.
I work in an enterprise environment with Win 11 where 16 GB is maxed out instantly as soon as you open the first browser tab, thanks to the background security scans and patch updates. This is even with compressed memory paging turned on.
If RAM is a problem, there are always alternatives. The impediment is always having to rethink your workflow or adopt someone else's opinions.
I knew they were fucking with my virtual memory cause theirs sucks, the partition schemes on this Mac mini were ridiculous and the helpers weren’t stealing my information.
My desktop runs Arch with Sway (so quite close), three monitors, and uses ~400MB of RAM after boot. Most of that is framebuffers. All the rest is eaten by Firefox, rust-analyzer, and qemu.
> Linux's advantage is slowly shrinking
Ubuntu is not Linux. Also, I would love to see Windows running on 4GB. This is garbage writing. Linux's advantages are numerous and growing. Ubuntu ≠ Linux. As for RAM requirements, Win 11's 4GB minimum isn't viable for daily use and won't represent any practical machine configuration that has the requisite TPM 2.0 module. On the other side, the Linux ecosystem offers a wide variety of minimal distributions that can run on ancient hardware.
Maybe I’m just grouchy today but I would flag this content if sloppy MS PR was a valid reason.
Apps are still a huge gap on Linux, but as an OS, I choose it every time over Windows and MacOS.
"With Desktop" has 1GB minimum and 2GB recommended - along with Pentium 4, 1GHz cpu.
It says that Ubuntu increased the requirements not because of the OS itself but to ensure a better user experience when people have many browser tabs open. Then it compares this to Windows, which has lower nominal requirements but higher requirements in practice for a passable user experience.
I couldn't understand why everything was that slow compared to Debian and didn't want to bother looking into it so...
After a few weeks, I got rid of Ubuntu and installed Debian for her. A simple IceWM setup (I use the tiling Awesome WM, but that's too radical for my wife) and she loves it.
She basically manages her two SMEs entirely from a browser: Chromium or Firefox (but a fork of Firefox would do too).
It has worked so well for years now that for her latest hire she asked me to set her up with the same config. So she's now got one employee on a Debian machine with IceWM. Other machines are still on Windows, but the plan is to keep only one Windows machine (just in case) and move the others to Debian too.
Unattended upgrades, a trivial firewall ("everything OUT allowed, IN only if related/established"), and that's it.
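That policy fits in a few lines of nftables, for anyone wanting to replicate it (file location and service name vary by distro; this is a sketch, not a hardened ruleset):

```
# /etc/nftables.conf -- allow all outbound, inbound only related/established
table inet filter {
    chain input {
        type filter hook input priority 0; policy drop;
        iif "lo" accept
        ct state established,related accept
        icmp type echo-request accept    # optional: answer IPv4 ping
    }
    chain output {
        type filter hook output priority 0; policy accept;
    }
}
```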
I don't remember all of my frustrations, but I remember having a lot of trouble with snap. Specifically, it really annoyed me that the default install of firefox was the snap version instead of native. I want that to be an opt-in kind of thing. I found that flatpak just worked better anyway.
I almost tried making the switch to arch, but I've been pretty happy running debian sid (unstable) since. The debian installer is just more friendly to me for getting encrypted drives and partitions set up how I want.
It's not for everyone, but I like the structured rolling updates of sid and having access to the debian ecosystem too much to switch to something else at this point.
I use sway with a radeon card for my primary and have a secondary nvidia card for games and AI stuff.
It has its warts, but I love my debian+sway setup
What I mean is, yes, WE know Win11 barely works with 4GB and WE know that 6gb is quite generous for a Linux machine, but they don't.
The general public isn't as informed as we think they are (which is proven by 75 million people last election).
I think we have quite different definitions of "minimum requirement", then.
intothemild•3h ago
First, it sounds like this 6GB requirement is more of a suggestion/recommendation than a requirement. I'm also curious whether it actually actively uses all 6GB. From my own usage of Linux over the years, the OS itself isn't using that much RAM; the applications are, which almost always means the browser.
Secondly, I haven't used Ubuntu desktop in years, so I have no real idea if this is something specific to them. I do use Fedora, and I'd imagine the memory footprint can't be too different. While I could easily get away with <8GB of RAM, you really kind of don't want to if you're going to do anything heavier than web browsing or editing documents. Dev work? Or CAD, design, etc. But this isn't unique to Linux.
heelix•3h ago
When they turned CentOS into Stream, I cut my workstation over to Ubuntu. It has been a reasonable replacement. The only real issues were when dual-booting Win10 horked my grub, and snap occasionally being unable to sort itself out. When they release 26.04 as an LTS, I'm planning to update. You are spot on: the desktop itself is reasonably lean. 100+ tabs in Firefox... less so. Mind you, the amount of RAM in the workstations I'm using could buy a used car these days.
sunshine-o•2h ago
I believe Fedora and Ubuntu use about the same set of technologies (systemd, Wayland, GNOME, etc.), so it should be about the same.
Apart from working out of the box, I don't really know what those distros have that I don't. I do have to admit that managing network interfaces is really easy in GNOME.
With the skyrocketing price of RAM this might finally be the year of the Linux desktop. But it is not gonna be Gnome I guess.