So, good or bad idea, Wayland is slowly shifting to being the default by virtue of being the most actively maintained, up-to-date stack.
I was open minded toward Wayland when the project was started... in 2008. We are 18 years down the road now. It has failed to deliver a usable piece of software for the desktop. That's long enough for me to consider it a failed project. If it still exists, it's probably for the wrong reasons (or at the least, reasons unrelated to any version of desktop Linux I want to run, like perhaps it has use in the embedded space).
Like, ok, it's 2030 and X11 is dead, no one works on it anymore and 90% of Linux users use Wayland, what did they gain? I know they did employ Poettering but not anymore, and AFAIK they contribute a non-trivial amount of code upstream to Linux, Gnome? KDE? If more users are on wayland they can pressure Gnome to ... what?
I sort of get an argument around systemd and this, in that they can push, I guess, their target feature sets into systemd and force the rest of the ecosystem to follow them, but, well, I guess I don't get that argument either, because they can already put any special sauce they want in Redhat's shipped systemd implementation, and if it's good it will be picked up, if it's bad it won't be?
I guess, if Redhat maintains systemd & wayland, then they could choke out community contributions by ignoring them or whatever, but wouldn't we just see forks? Arch would just ship with cooler-systemd or whatever?
The same goal any group of savvy corporate employees has when their marquee project has proved to be far more difficult, taken way longer, and required far more resources than anticipated to get within artillery distance of its originally-stated goal?
I've personally seen this sort of thing play out several times during my tenure in the corporate environment.
I guess I just don't get how the third E in EEE plays out in an open source environment.
- Maintaining X requires a lot of time, expertise and cost; it's a hard codebase to work with, so deprecating X saves them money
- Wayland is simpler and achieves greater security by eliminating features of the desktop that most users value, but perhaps Redhat's clients in security-conscious fields like healthcare, finance and government are willing to live without
So I suspect it comes down to saving money and delivering something they have more control of which is more tailored to their most lucrative enterprise scenarios; whereas X is an old mess of cranky unix guys and their belligerent libre cruft.
There are some parallels to systemd I guess, in that its design rejected the Unix philosophy, and this was a source of concern for a lot of people. Moreover at the time systemd was under development, my impression of Poettering was that he was as incompetent as he was misguided and belligerent - he was also advocating for abandoning POSIX compatibility, and PulseAudio was the glitchiest shit on my desktop back then. But in the end systemd simply appeared on my computer one day and nothing got worse, and that is the ultimate standard. If they forced wayland on me tomorrow something on my machine would break (this is the main point of the OP), and they've had almost 20 years to fix that but it may arguably never get fixed due to Wayland's design. So Wayland can go the way of the dodo as far as I'm concerned.
I did welcome happily the revert of all his code.
What used to be maintained in one codebase by Xorg devs is now duplicated in at least three major compositors, each with their own portal implementation and who knows what else. And those are primarily maintained by desktop environment devs who also have the whole rest of the DE to worry about.
There's just no way to make that make sense.
It's absolutely an essential characteristic for long term survival, for long term excellence. To not be married to one specific implementation forever.
Especially in open source! What is the organizational model for this Authoritarian path, how are you going to - as Wayland successfully has - get every display server person onboard? Who would have the say on what goes into The Wayland Server? What would the rules be?
Wayland is the only thing that makes any sense at all. A group of peers, fellow implementers, each striving for better, who come together to define protocols. This is what made the internet amazing and the web the most successful media platform; it's what creates the possibility for ongoing excellence. Not being bound to fixed decisions is an option most smart companies lust after, but somehow when Wayland vs X comes up, everyone super wants there to be one and only one path, set forth three decades ago, that no one can ever really overhaul or redo.
It's so unclear to me how people can be so negative, so short, and so mean about Wayland. There's no viable alternative organizational model for Authoritarian display servers. And if somehow you did get people signed up for this fantasy, there's such a load of pretense that it would have cured all ills? I don't get it.
>So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
Kudos to Michael for even attempting it. Personally, nowadays, unless my working stack stops, well, working, or there are significant benefits to be found, I don't really feel like putting in the effort to try the shiny new things.
And for taking the time to thoroughly document real issues.
You don't always have to replace something that works with something that doesn't but is "modern."
My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
I switched in 2018 and was surprised I couldn’t use fractional scaling on one monitor like I’d been doing for years on windows.
Save money on the monitor, save money on the gpu (because it's pushing fewer pixels, you don't need as much oomph), save frustration with software.
Most high-DPI displays are simply the same thing with exactly twice the density.
We settled on putting exactly twice as many pixels in the same panels because it facilitates integer scaling
To maintain a clean 200% scale you need a 27" 5K panel instead, which do exist but are vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
Also, 200% on an FHD 14" laptop means 960x540 px equivalent. That's too big to the point of rendering the laptop unusable. Also, X11 doesn't support switching DPI on the fly AFAIK, and I don't want to restart my session whenever I plug or unplug the external monitor, which happens multiple times a day when I'm at the office.
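For anyone who wants to check those numbers, here is a rough sketch in Python of the arithmetic behind them (the panel sizes are the ones mentioned above; nothing vendor-specific is assumed):

    import math

    def scaled(width_px, height_px, diagonal_in, scale):
        # Return physical PPI and the logical resolution at a given scale factor.
        ppi = math.hypot(width_px, height_px) / diagonal_in
        return round(ppi), (int(width_px / scale), int(height_px / scale))

    print(scaled(3840, 2160, 27, 2))  # 27" 4K at 200%: ~163 PPI, 1920x1080 logical (UI too big)
    print(scaled(5120, 2880, 27, 2))  # 27" 5K at 200%: ~218 PPI, 2560x1440 logical (the clean target)
    print(scaled(1920, 1080, 14, 2))  # 14" FHD at 200%: ~157 PPI, 960x540 logical (unusably little space)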
This really isn't that far off. If we imagined the screens overlaid semi-transparently, a 16-pixel letter would be over a 14-pixel one.
If one imagines an ideal font size for a given user's preferred physical letterform height, one could imagine an idealized size of 12 on one screen and 14 on the other, and setting it to 13 would be extremely close to ideal.
>So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
This is because it's scrolling a fixed number of lines, which occupy more space at 300% zoom. Notably, this applies pretty much only to people running high-DPI screens at 100%, because if one zoomed to 300% otherwise, the letter T would be the size of the last joint on your thumb and legally blind folks could read it. It doesn't apply to setting the scale factor to 200%, nor to Firefox's internal scale factor, which is independent from the desktop, supports fractional scaling in 0.05 steps, and can be configured in about:config:
layout.css.devPixelsPerPx
I don't really care about this but here's an example:
I have 2 27" screens, usually connected to a windows box, but while working they're connected to a MBP.
Before the MBP they were connected to several ThinkPads where I don't remember what screen size or scaling, I don't even remember if I used X11 or Wayland. But the next ThinkPad that will be connected will probably be HiDPI and with Wayland. What will happen without buying a monitor? No one knows.
Btw, everybody I know, myself included, changes the font size and leaves the DPI scaling at 100%, or maybe 200%, on X11.
Doesn't work if your screens are too different (e.g. 4k laptop screen and 32" desktop monitor).
The post is from the dev of i3wm, an X11 window manager, complaining among other things about how well his 8K monitor works under X11 and how poorly it works under Wayland.
You can also consult the Arch wiki article on HiDPI, which is broadly applicable beyond Arch.
Ten years ago there were cursor clipping issues, cursor coordinates issues and crashes and I've been home-baking patches for that.
Also it was impossible for one X session to span across two GPUs. Dunno if that was improved.
Now it's a bit better, but for sure your amdgpu will entertain you with nice little crashes when you run something heavy on a scaled display.
I'm not even talking about VRR, HDR and all that stuff.
The totality of my education on the topic was reading the arch wiki on hidpi once.
AFAIK one cannot span one X session across multiple GPUs, although AMD had something it once referred to as "Eyefinity" for achieving this.
It is rarely needed; discrete GPUs often support 3 or even 4 outputs.
One may wonder if you tried this a very long time ago, back when AMD sucked and Nvidia worked well (2005-2015).
I don't know what to do. The outpouring of negative energy is so severe. But I think it's so incredibly unrepresentative, so misleading. The silent majority problem is so real. Come to the better place.
OTOH though, there are a lot of reasons for projects like GNOME and KDE to want to switch to Wayland, and especially why they want to drop X11 support versus maintaining it indefinitely forever, so it is beneficial if we at least can get a hold on what issues are still holding things up, which is why efforts like the ones outlined in this blog post are so important: it's hard to fix bugs that are never reported, and I especially doubt NVIDIA has been particularly going out of their way to find such bugs, so I can only imagine the reports are pretty crucial for them.
So basically, this year the "only downsides" users need to at least move into "no downsides". The impetus for Wayland itself hinges mainly on features that simply can be done better in a compositor-centric world, but the impetus for the great switchover is trying to reduce the maintenance burden of having to maintain both X11 and Wayland support forever everywhere. (Support for X11 apps via XWayland, though, should basically exist forever, of course.)
I don't get why X11 shouldn't work forever. It works today. As you said, there's no obvious reason for an end user to switch to Wayland if there aren't any particular problems with their current setup. "Because it's modern" and "Because it's outdated" just aren't compelling reasons for anyone besides software developers. And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
Supporting legacy stuff is universally difficult, and makes it significantly harder to implement new things.
X11 as a display server will continue to work ~forever as long as someone maintains a display server that targets Linux.
KDE and GNOME will not support X11 forever because it's too much work. Wayland promises to improve on many important desktop use cases where X.org continues to struggle and where the design of X11 has proven generally difficult to improve. The desktop systems targeting Linux want these improvements.
> "Because it's modern" and "Because it's outdated" just aren't compelling reasons for anyone besides software developers.
I can do you one better: that's also not really compelling to software developers either most of the time. I beg you to prove that the KDE developers pushed Wayland hard because they badly wanted to have to greatly refactor the aging and technical debt heavy KWin codebase, just for the hell of it. Absolutely not.
The Wayland switchover that is currently ongoing is entirely focused on end users, but it's focused on things they were never able to do well in X11, and it shows. This is the very reason why Wayland compositors did new things better before they handled old use cases at parity. The focus was on shortcomings of X11 based desktops.
> And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
Yeah. Except Apple is one of the five largest companies in the United States and GNOME and KDE are software lemonade stands. I bet if they could they would love to handle this switchover in a way that puts no stress on anyone, but as it is today it's literally not feasible to even find the problems that need to be solved without real users actually jumping on the system.
This isn't a thing where people are forcing you to switch to something you don't want under threat of violence. This is a thing where the desktop developers desperately want to move forward on issues, they collectively picked a way forward, and there is simply no bandwidth (or really, outside of people complaining online, actual interest) for indefinitely maintaining their now-legacy X11-based desktop sessions.
It actually would have been totally possible, with sufficient engineering, to go and improve things to make it maintainable longer term and to try to backport some more improvements from the Wayland world into X11; it in fact seems like some interested people are experimenting with these ideas now. On the other hand though, at this point it's mostly wishful thinking, and the only surefire thing is that Wayland is shipping across all form factors. This is no longer speculative, at this point.
If you really want to run X.org specifically, that will probably continue to work for a decently long time, but you can't force the entire ecosystem to all also choose to continue to support X.org any more than anyone can force you to switch to Wayland.
Attracting new contributors is an existential problem in OSS.
But, I am stuck on Xorg only because of one app that I have to use to work.
> My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
This is already happening. To my knowledge, Arch Linux and Ubuntu have already switched to Gnome 49, which does not support X without recompilation. So most likely, any distro using Gnome 49 upwards will not provide Xorg by default. KDE is also going to do it soon.
Xorg is going away pretty soon
I believe it's a step in the right direction; the only issue is some annoying app holding us back.
It doesn't really matter if you like or dislike wayland, the major DE have decided they don't like X11 and they are making the switch to wayland. X11 code is actively being removed from these desktop environments.
If you want to use X11, you can either stay on an old unmaintained DE or switch to a smaller one that supports X11. But you should realize that with wayland being the thing major DEs are targeting, your experience with X11 will likely degrade with time.
I see it as a win for both developers and users in the long run.
Does XWayland help?
I can’t find a low-effort, high-portability, low-memory way to do it with Wayland.
Local and also in CI pipelines.
Right now with X11, IIRC, if one application has access to your display they can read what is going on in other applications running on the same display.
If browser tabs were able to do that, all hell would break loose. So why do we accept it from applications?
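To make that concrete, here is a rough sketch (assuming the third-party python-xlib package) of how any process with access to the X socket can walk the window tree and read other clients' window titles, with no permission prompt; similar calls can grab keyboard input or pixel contents:

    # Run inside an X11 session; every client connected to the same display is visible.
    from Xlib import display

    d = display.Display()              # connects via $DISPLAY
    root = d.screen().root

    def walk(window, depth=0):
        for child in window.query_tree().children:
            name = child.get_wm_name()  # another client's window title
            if name:
                print("  " * depth, name)
            walk(child, depth + 1)

    walk(root)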
Anyway, despite this, I still use X11 instead of Wayland because of all the shortcomings.
Because I don't run random untrusted apps all the time. Whereas I do visit random untrusted websites all the time.
I had an old Chromebook which had Lubuntu on it - screen tearing was driving me crazy so I switched to Wayland and it is buttery smooth. No mean feat given the decrepit hardware.
I'm sure someone will be along to tell me that I'm wrong - but I've yet to experience any downsides, other than people telling me I'm wrong.
That's fine as long as it goes both ways. If Wayland works for you, great. Equally, for some of us it doesn't work.
I'd be investigating that issue instead; it should have errors in systemd/journalctl or whatever you use for managing daemons. I'm using ydotool on Arch, pretty much all defaults, together with a homegrown voice dictation thing, and it works 100% of the time.
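For reference, the glue between a dictation engine and ydotool can be as small as this sketch (assuming the ydotoold daemon is running, e.g. as a systemd user service; reading from stdin stands in for whatever speech-to-text engine you use):

    import subprocess
    import sys

    def type_text(text: str) -> None:
        # `ydotool type` injects key events through /dev/uinput, so it works on
        # Wayland as well as X11.
        subprocess.run(["ydotool", "type", text], check=True)

    # Stand-in for the recognizer: type whatever arrives on stdin, line by line.
    for line in sys.stdin:
        type_text(line.rstrip("\n") + " ")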
Maybe in another decade or so.
After an Nvidia graphics driver release everything cleared up to be very usable (though occasionally stuff still crashed, like once or twice a week). I heavily dislike Nvidia and went with AMD just around a month ago, zero issues.
I'm curious to hear about what hardware you have.
Wayland fixes that, so that part is a huge improvement to me. Unfortunately this also limited my choice of Distros as not all of them use Wayland. I landed on Ubuntu again, despite some issues I have with it. The most annoying initially was that the Snap version of Firefox didn't use hardware acceleration, which is just barely usable.
I don't entirely love MacOS (mostly because I can't run it on my desktop, lol). But it does fractional scaling so well, I always choose the "looks like 1440p" scaling on 4K resolution, and literally every app looks perfect and consistent and I don't notice any performance impact.
On windows the same thing, except some things are blurry.
On Linux yeah I just have to bear huge UI (x2 scaling) or tiny UI (X1) or live with a noticeable performance delay that's just too painful to work with.
Don’t know what the deal is with Linux desktop experience. I have encountered various forms of perfection and had them taken away.
Once on my XPS M1330 I clicked to lift a window and then three finger swiped to switch workspace and the workspace switched and I dropped the window. It was beautiful. I didn’t even notice until after I’d done it what an intuitive thing it felt like.
Then a few years later I tried with that fond memory and it didn’t work. Where did the magic go?
Probably some accidental confluence of features broken in some change.
But I'm also running all AMD hardware, that may be a factor. Life is too short for nvidia bullshit on Linux.
I switched to get support for different scaling on different outputs and I have gone back.
So much NVidia hate, but in 23 years the only problems I've had with NVidia on Linux were when they dropped support for old GPUs. Even on proprietary hardware like iMacs and MacBooks.
But to each their own.
The better path on Linux was always AMD, and still is, to this day, since it simply works without me needing to care about driver versions, or open vs closed source, at all.
Source: Was burned many times by ATI's promises to deliver functioning software over the years. Been using Nvidia on Linux and FreeBSD for as long as I can recall now.
Currently dual 3090s in this box and nvidia is still as simple as just installing the distro package.
There was a period in the mid 2010s where trying to get better battery life on laptops by optionally using the discrete gpu vs the integrated was a real pain (bumblebee/optirun were not so solid), but generally speaking for desktop machines that need GPU support... Nvidia was the route.
Don't love their company politics so much, although I think they're finally getting on board now that so many companies want to run GPU accelerated workloads on linux hosts for LLMs.
But ATI sucked. They seem to have finally gotten there, but they were absolutely not the best choice for a long time.
Hell - I still have a machine in my basement running a GTX970 from 2015, and it also works fine on modern linux. It currently does the gpu accel for whisper speech to text for HA.
When AMD bought ATI they started work on the open source drivers and improved the situation, but they had already lost me as a GPU customer by that point.
Maybe now in the 2020s AMD has caught up, and I'll keep them in mind next time I buy a GPU, but I've been happy with NVidia for a long time.
It would be nice if the NVidia driver were in the kernel and open source, but the Debian package has just worked for a very long time now.
Viewed from the Other Side, I'm far more inclined to think that NVidia actually knows what they are doing and the authors of Wayland do not.
This stuff has been flawless on AMD systems for a couple of years now, with the exception of the occasional archaic app that only runs on X11 (thus shoved in a container).
Hopefully AnyDesk and Remmina will address this issue before KDE ends its mainline X11 support next year.
It’d be very handy if we had a performant remote desktop option for Linux. I could resume desktop sessions on my workstation from my laptop and I could pair program with remote colleagues more effectively.
In the past I’d boot into Windows and then boot my Linux system as a raw disk VM just so I could use Windows’s Remote Desktop. Combined with VMware Workstation’s support for multiple monitors, I had a surprisingly smooth remote session. But, it was a lot of ceremony.
One of the obstacles I faced was a wrong resolution. On Xorg I could just add a new mode and get up and running quickly. On Wayland, I have to either do some EDID changes or go through even worse.
This is a common mischaracterization of what happened. This API, GBM, was a proprietary API that was a part of Mesa. Nvidia couldn't add GBM to their own driver as it is a Mesa concept. So instead Nvidia tried to make a vendor-neutral solution that any graphics driver could use, which is where you see EGLStreams come into the picture. Such an EGL API was also useful for other non-Wayland embedded use cases. As for Nvidia's proprietary driver's GBM support, Nvidia themselves had to add support to the Mesa project for dynamically loading new backends that weren't precompiled into Mesa. Then they were able to make their own backend.
For some reason when this comes up people always phrase it in terms of Nvidia not supporting something instead of the freedesktop people not offering a way for the Nvidia driver to work, which is a prerequisite of Nvidia following such guidance.
Even today if you use the API your program has to link to Mesa's libgbm.so as opposed to linking to a library provided by the graphics driver like libEGL.so.
How can you call all of that a mischaracterization? In my humble opinion, and I am not anything more than a bystander in this with only superficial knowledge of the domain, it's you that is trying to mischaracterize the situation.
Yes, it does, and it is different from the well-defined meaning used when talking about the software itself. OpenGL is an open API, but the source code for an implementation isn't necessarily open.
>Nvidia was not willing to implement an API for their drivers
They couldn't because this API is a part of Mesa itself. As I mentioned programs link to a Mesa library directly.
>since Mesa is not an actual GPU vendor
They are a driver vendor.
>the other vendors (Intel and AMD at this point), which have already implemented GBM
Support was added to Mesa itself and not to the drivers by those companies. The proprietary, now deprecated, AMD kernel module still doesn't support GBM.
>should switch too in the name of this
I think it is beneficial for standards to be implemented by multiple vendors, so I think they should implement it at least.
>How can you call all of that a mischaracterization?
What people think as Nvidia needing to implement an API is actually an ask for Nvidia to make a Mesa API work.
From my perception, essentially the ask was that Nvidia needed to open source their kernel driver like AMD did, and then eventually an nvidia gbm backend would be built into Mesa for it. For obvious reasons this was never going to happen. The fact that no agreeable solution was figured out in about a decade, and that Nvidia then had to code up that solution for the Mesa project, is a failure on Mesa's end. A lot of user pain happened due to them not being willing to work with proprietary software and focusing solely on supporting open source drivers.
Well, I guess this is the crux of the problem, and for open-source enthusiasts like me this is not obvious at all. What we can surmise is that Nvidia refused to collaborate, therefore they were the party to blame for the status of their video cards not being supported as well as others' vendors on linux.
I saw more effort on Nvidia's side trying to collaborate than on the Wayland side. I think it's unfair to not call out the people who had a hardline stance of only caring about open source drivers and didn't want to do the work to onboard Nvidia.
Mesa did discuss EGL but felt it wasn’t the right choice. https://mesa-dev.freedesktop.narkive.com/qq4iQ7RR/egl-stream...
In much the same way that NVIDIA may have felt that EGL was the better choice.
However, none of your description of the way things are explains why NVIDIA couldn’t have made their own libgbm that matched the symbols of Mesa's and worked on standardizing the API de facto.
Arguably my hardware is a lot simpler and I don't use Nvidia. But I just want to point out that, for all the flak wayland receives, it can work quite well.
I'm having more issues with games/websites/programs that didn't take high display refresh rate into account, than Wayland, at this point.
You will also note many items in the post above are papercuts that might go unnoticed, like input feeling a little worse or font issues.
Actually, GPU acceleration was why I initially switched. For whatever reason, this GPU (Radeon VII) crashes regularly under X11 nearly every time I open a new window, but is perfectly stable under wayland. Really frustrating! So, I had some encouragement, and I was waiting for plasma-wayland to stabilize enough to try it properly. I still have the X11 environment installed as a fallback, just in case, but I haven't needed to actually use it for months.
Minor pain points so far mostly include mouse acceleration curves being different and screen capture being slightly more annoying. Most programs do this OS-level popup and then so many follow that up with their own rectangle select tool after I already did that. I had some issues with sdl2-compat as well, but I'm not sure that was strictly wayland's fault, and it cleared up on its own after a round of updates. (I develop an SDL2 game that needs pretty low latency audio sync to run smoothly)
I use it extensively, it's easy to use, UI is compact but clear, works perfectly all the time. I honestly don't care that it is unmaintained at this point.
FWIW, I have a KDE Wayland box and OBS works for screen recording. Slightly more complex than simplescreenrecorder, but not bad.
At some point I'll get irritated enough to seek out more alternatives and give them a whirl. Such is fate :)
1) Hugely enjoyable content - as usual - by Michael Stapelberg: relevant, detailed, organized, well written.
2) I am also an X11 + i3 user (and huge thanks to Michael for writing i3, I'm soooo fast with it), I also keep trying wayland on a regular basis because I don't want to get stuck using deprecated software.
I am very, very happy to read this article, if only because it proves I'm not the only one and probably not crazy.
Same experience he has: every time I try wayland ... an unending succession of weird glitches and things that plain old don't work.
Verdict: UNUSABLE.
I am going to re-iterate something I've said on HN many times: the fact that X11 has designs flaws is a well understood and acknowledged fact.
So is the fact that a new solution is needed.
BUT, because Wayland is calling themselves the new shite supposed to be that solution DOES NOT AUTOMATICALLY MEAN they actually managed to solve the problem.
As a matter of fact, in my book, after so many years, they completely and utterly failed, and they should rethink the whole thing from scratch.
And certainly not claim they're the replacement until they have reached feature and ease of use parity.
Which they haven't as Michael's article clearly points out.
I know that it wasn't originally conceived to do what it does today, but I've never had any problem using it, and when I tried Wayland I didn't notice any difference whatsoever.
Is it just that it's a pain to write apps for it..?
https://www.youtube.com/watch?v=GWQh_DmDLKQ
https://people.freedesktop.org/~daniels/lca2013-wayland-x11....
It makes sandboxing security impossible. The moment a process has access to the Xorg socket, it has access to everything. It is weird that this is often missing from the discussion though.
Any extra effort on X11 might help to buy more time, but will in the end be for nothing. And in this time of supply-chain attacks, vs-code plugins, npm packages, agents and what-not, X11 is just too dangerous.
Other way around: Maintaining Xorg itself is awful.
For instance, a compositor may not support a clipboard, and the "data" related interfaces must be queried for availability (those interfaces are stable in core), and the client must disable such functionality if they're not there (for instance, the wterm terminal is faulty because it forces a compositor to have such interfaces... but the havoc terminal is doing it right). I don't know yet if libSDL3 wayland support "behaves" properly. The wterm fix is boring but should be easy.
As for wayland usage, it is probably almost everywhere (and Xwayland is there for some level of legacy compatibility).
(I am currently writing my own compositor for AMD GPUs... in risc-v assembly running on x86_64 via an interpreter)
Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026. If you are a Linux on desktop advocate, read the comments and see why so many are still hesitating.
>Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
Quite ironically, there are people refusing to leave Windows 7, which has been EOS since 2020, because they find the modern Windows UI unbearable. Windows 11 is considered so bad that people are actually switching OSes because of it. I have seen similar comments about OSX/macOS.
The big difference between those and Linux is that Linux users have a choice to reject forced "upgrades" and build very personalized environments. If I had to live with Wayland I could do it, really, even if there are issues, but since my current environment is fine I don't really need/care to. And with such a personalized environment, a change like this is a chore. If I was using a comprehensive desktop environment like GNOME (as many people do), maybe I wouldn't even understand something changed underneath.
For me, Wayland seems to work OK right now, but only since the very latest Ubuntu release. I'm hoping at this point we can stop switching to exciting new audio / graphics / init systems for a while, but I might be naive.
Edit: I guess replacing coreutils is Ubuntu's latest effort to keep things spicy, but I haven't seen any issues with that yet.
Edit2: I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime". You often had to edit config files to get stuff working, and there were frustrating deficits in the application space, but the "desktop" felt fine, with X11, Alsa, SysV etc. Two decades on we're on the cusp of having a reliable graphics stack.
But then I try and focus on what each author thinks is important to them and it’s often wildly different than what’s important to me.
But a lot of internet discussion turns into very ego-centric debate including on here, where a lot of folks who are very gung-ho on the adoption of something (let’s say Linux, but could be anything) don’t adequately try and understand that people have different needs and push the idea of adoption very hard in the hopes that once you’re over the hump you might not care about what you lost.
I feel the same and find it a bit strange. I am happy with hyprland on wayland since a few months back but somehow it reminds me of running enlightenment or afterstep in the 90s. My younger self would have expected at least a decade of "this is how the UI works in Linux and it's great" by now.
Docker and node both got started after wayland and they are mature enterprise staples. What makes wayland such a tricky problem?
Everything actually feels significantly more solid/stable/reliable than modern Windows does. I can install updates at my own pace and without worrying that they'll add an advert for Candy Crush to my start menu.
I also run Bazzite-deck on an old AMD APU minipc as a light gaming HTPC. Again, it's a much better experience than my past attempts to run Windows on an HTPC.
As with everything, the people having issues will naturally be heard louder than the people who just use it daily without issues.
But with Linux being mostly hobbyist-friendly a number of folks have custom setups and do not want to be forced into the standardized mold for the sake of making it super smooth to transition from Windows.
I have such a setup (using FVWM with customized key bindings and a virtual layout that I like, which cannot work under Wayland), so can I donate some money to Microsoft to keep Windows users less grumpy and not bring yet another eternal September to Linux? I like my xorg, thank you very much :).
LOL
I installed a new Windows 11 yesterday on a fairly powerful machine; everything lags so much on a brand new install it's unreal. Explorer takes ~2-3 seconds to be usable. Any app that opens in the blink of an eye under Linux on the same machine takes seconds to start. The Start menu lags. It's just surrealistic. People who say these things work just have never used something that is actually fast.
Linux is faster in some places, maybe. But still with many issues like some applications not being drawn properly or just some applications not available (nice GUI for monitor control over ddc)
Good news: My laptop (Lenovo P53) can now suspend / resume successfully. With Ubuntu 25.04 / Wayland it wouldn't resume successfully, which was a deal breaker.
Annoying thing: I had a script that I used to organize workspaces using wmctrl, which doesn't work anymore so I had to write a gnome-shell extension. Which (as somebody who's never written a gnome-shell extension before) was quite annoying as I had to keep logging out and in to test it. I got it working eventually but am still grumpy about it.
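(For the curious, a script like that boils down to something like the following sketch; the title-to-workspace mapping here is made up. It drives the window manager through X11/EWMH calls, which is exactly why it no longer works in a GNOME Wayland session:)

    import subprocess

    LAYOUT = {"Firefox": 1, "Slack": 2, "Alacritty": 0}  # hypothetical mapping

    def list_windows():
        # `wmctrl -l` prints: <window id> <desktop> <client host> <title>
        out = subprocess.run(["wmctrl", "-l"], capture_output=True, text=True, check=True)
        for line in out.stdout.splitlines():
            win_id, _desktop, _host, *title = line.split(None, 3)
            yield win_id, title[0] if title else ""

    for win_id, title in list_windows():
        for pattern, workspace in LAYOUT.items():
            if pattern in title:
                # Move the window (addressed by id) to its designated workspace.
                subprocess.run(["wmctrl", "-i", "-r", win_id, "-t", str(workspace)], check=True)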
Overall: From my point of view as a user, the switch to Wayland has wasted a lot of my time and I see no visible benefits. But, it seems to basically work now and it seems like it's probably the way things are headed.
Edit: Actually I've seen some gnome crashes that I think happen when I have mpv running, but I can't say for sure if that's down to Wayland.
"Wayland is the successor to the X server "
Wayland is primarily a protocol, but most definitely not a "successor" to the xorg-server. This is why it does not have - and will never have - the same feature set. So trying to sell it as "the new shiny thing" after almost 20 (!!!!!) years is simply wrong. One should instead point out that wayland is a separate way to handle a display server / graphics. There are different trade-offs.
> but for the last 18 years (!), Wayland was never usable on my computers
I can relate to this a bit, but last year or perhaps even the year before, I used wayland via plasma on manjaro. It had various issues, but it kind of worked, even on nvidia (using the proprietary component; for some reason the open-source variant nouveau works less-well on my current system). So I think wayland was already usable even before 2025, even on problematic computer systems.
> I don’t want to be stuck on deprecated software
I don't want to be stuck on software that insinuates it is the future when it really is not.
> With nVidia graphics cards, which are the only cards that support my 8K monitor, Wayland would either not work at all or exhibit heavy graphics glitches and crashes.
I have a similar problem. Not with regards to a 8K monitor, but my ultra-widescreen monitor also has tons of issues when it comes to nvidia. I am also getting kind of tired of nvidia refusing to fix issues. They are cheap, granted, but I'd love viable alternatives. It seems we have a virtual monopoly situation here. That's not good.
> So the pressure to switch to Wayland is mounting!
What pressure? I don't feel any pressure. Distributions that would only support wayland I would not use anyway; I am not depending on that, though, as I compile everything from source using a set of ruby scripts. And that actually works, too. (Bootstrapping via existing distributions is easier and faster though. As stated, trade-offs everywhere.)
> The reason behind this behavior is that wlroots does not support the TILE property (issue #1580 from 2019).
This has also been my impression. The wayland specific things such as wlroots, but also other things, just flat out suck. There are so many things that suck with this regard - and on top of that, barely any real choice on wayland. Wayland seems to have dumbed down the whole ecosystem. After 20 years, having such a situation is shameful. That's the future? I am terrified of that future.
> During 2025, I switched all my computers to NixOS. Its declarative approach is really nice for doing such tests, because you can reliably restore your system to an earlier version.
I don't use NixOS myself, but being able to have determined system states that work and are guaranteed to work, kind of extends the reproducible builds situation. It's quite cool. I think all systems should incorporate that approach. Imagine you'd no longer need StackOverflow because people in the NixOS sphere solved all those problems already and you could just jump from guaranteed snapshot to another one that is guaranteed to also work. That's kind of a cool idea.
The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea to be ruined via horrible jokes of an underperforming programming language ...
> So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
I had a similar impression. I guess things will improve, but right now I feel as if I lose too much for "this is now the only future". And I don't trust the wayland-promo devs anymore either - too much promo, too few results. After 20 years guys ...
There's Nickel, if it's only about the language, and Guix (Guile Scheme) which goes beyond just the language.
I don’t get the hate for Nix, honestly. (I don’t get the complaints that it’s difficult, either, but I’m guessing you’re not making one here. I do get the complaint that the standard library is a joke, but you’re not making that one either that I can see.) The derivation and flake stuff excepted, Nix is essentially the minimal way to add lazy functions to JSON, plus a couple of syntax tweaks. The only performance-related thing you could vary here is the laziness, and it’s essential to the design of Nixpkgs and especially NixOS (the only config generator I know that doesn’t suck).
I’ll grant that the application of Nix to Nixpkgs is not in any reasonable sense fast, but it looks like a large part of that is fairly inherent to the problem: you’ve got a humongous blob of code that you’re going to (lazily and in part) evaluate once. That’s not really something typical dynamic-language optimization techniques excel at, whatever the language.
There’s still probably at least an order of magnitude to be had compared to mainline Nix the implementation, like in every codebase that hasn’t undergone a concerted effort to not lose performance for stupid reasons, but there isn’t much I can find to blame Nix the language for.
I'm very happy with Wayland, but what a strange comparison to make if you're not. IPv6 is objectively an enormous improvement over IPv4, and the only gripe with it is that it's still not ubiquitous.
However, my comparison is end-user focused (ie. the Linux desktop experience). I should have been more clear about the scope perhaps.
Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature, and again does not serve the consumer, only the producer.
I'd argue the opposite: IPv6 has lowered complexity for the end user: SLAAC, endless addresses, no need for CIDR – these are all simplifications for the end user.
> Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature,
Some would argue it's a feature. But let's say it's not useful. It's still surely not a bug. An address being publicly routeable doesn't mean you have to route traffic to it. Just don't, if you don't want to.
> and again does not serve the consumer, only the producer.
I'd argue that it simplifies some things for the consumer (see above), and also lets the consumer be a producer more easily. I'd argue that that's a good thing, more in the spirit of the internet. But even if the end user doesn't care, it's not a detriment.
Just as I am oblivious to whether this is posted over ipv4 or 6.
That they all have to implement the protocol makes it seem like 20 years of wayland might actually have hurt Linux more than it fixed - without it, something else would have happened. Think of how many man hours have been wasted doing the same thing for KDE, gnome, sway, hyprland, etc.
(also I agree about the publicly available thing, it's a bug for me as well. Companies will harvest everything they can and you better believe defaults matter - aka publicly available, for the producer, but they will say your security, of course)
I guess HDR support, 10/12bit colors, displays with different dpi/refresh rate etc is just not really an obvious benefit to you?
I mean, it works a lot better than it did before; still, I wouldn't recommend it for someone who isn't ready to tinker in order to make stuff work.
The reason I mention this is that, while most normal desktop/coding stuff works okay with wayland, as soon as I try any gaming it's just a sh*tshow: from stuff that doesn't even start (but works when I run on X) to heavily increased performance demands from games that run a lot smoother on X.
While I have no personal relation to either of them, and I couldn't technically care less which one to use - if you are into gaming, at least in my experience, X is right now still the more stable solution.
This post is a lot more relatable.
As an aside, regarding remote Emacs - I can attest that Waypipe does indeed work fantastically for this. Better than X11 ever worked over the network for me.
I, too, suffer from the "pgtk is slow" issue (only a 4K monitor, though; it's mitigable and manageable for me).
I screen share and video call with Slack and Google Meet.
I use alacritty/zsh/tmux as my terminal. I use chromium as my browser, vscode and sublime text as code editors.
Slack, Spotify, my studio mic, my Scarlett 2i2, 10gbe networking, thunderbolt, Logitech unifying receiver…. Literally everything “just works” and has been a joy to use.
Only issues I’ve ever faced have been forcing an app to run native Wayland not xwayland (varies from app to app but usually a cli flag needed) and Bluetooth pairing with my Sony noise canceling which is unrelated to Wayland. Periodically I get into a dance where it won’t pair, but most of the time it pairs fine.
Sounds like someone made a listener that listens on key events, but didn't bother to check the state of the event, meaning it hits releases as well. Should be easy to verify by keeping them pressed long enough to trigger the key repeat events.
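A rough sketch of the distinction (using the third-party python-evdev package; the device path is an assumption and reading it needs access to /dev/input):

    from evdev import InputDevice, ecodes

    dev = InputDevice("/dev/input/event0")   # assumed keyboard device node
    for event in dev.read_loop():
        if event.type != ecodes.EV_KEY:
            continue
        if event.value == 1:                 # key press: usually the only state you want
            print("press", event.code)
        elif event.value == 0:               # key release: a sloppy listener fires here too
            print("release", event.code)
        elif event.value == 2:               # autorepeat: holding the key exposes the bug
            print("repeat", event.code)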
> I also noticed that font rendering is different between X11 and Wayland! The difference is visible in Chrome browser tab titles and the URL bar, for example:
Tab title display is not owned by wayland unless you are running with the client side decoration extension, which Gnome is not. So the application or the GUI framework (GTK in this case) are really the only two places to look.
And is it possible to get fullscreen but within a container (e.g. get rid of the browser GUI to see more in a small container)?
I feel like the biggest issue for Wayland is the long tail of people using alternative WMs. A lot of those projects don't have manpower to do what amounts to a complete rewrite.
I honestly don't have a preference between Wayland and X, but I feel very strongly about keeping my current WM. XWayland supposedly works, but I'm not in any hurry to add an extra piece of software and extra layer of configuration for something I already have working exactly the way I want. If Wayland offered some amazing advantages over X, it might be different, but I haven't seen anything to win me over.
Looking at your github, it seems you use StumpWM. It seems they are also working on a wayland version under the name Mahogany. Development seems pretty active: https://github.com/stumpwm/mahogany
> I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
FWIW I think "wayback" is the project for this. It seems to be trying to use XWayland to run full X11 desktop environments on top of Wayland: https://gitlab.freedesktop.org/wayback/wayback
Is this specific to the WM he used or does HW acceleration straight up not work in browsers under Wayland? That to me seems like a complete deal breaker.
The Chrome crashes when resizing a window don't make any sense, apart from being a WM fault. The Xwayland scaling, again, has native scaling support on Gnome. Same for the monitor resolution problem (which he acknowledged). Same for font rendering. Idk.
> By the way, when I mentioned that GNOME successfully configures the native resolution, that doesn’t mean the monitor is usable with GNOME! While GNOME supports tiled displays, the updates of individual tiles are not synchronized, so you see heavy tearing in the middle of the screen, much worse than anything I have ever observed under X11. GNOME/mutter merge request !4822 should hopefully address this.
https://forums.tomshardware.com/threads/nvidias-name-change....
When I got to know their products, they were nVidia.
"The name of this corporation is NVIDIA Corporation." - 1995 amendment.
I have a 7,000 word blog post and demo videos coming out this Tuesday with the details but I think I uncovered a driver bug having switched to native Linux a week ago with a low GPU memory card (750 Ti).
Basically on Wayland, apps that request GPU memory will typically crash if there's no more GPU memory to allocate, whereas on X11 it will transparently offload those requests to system memory, so you can open up as much as you want (within reason) and the system is completely usable.
In practice this means opening up a few hardware accelerated apps in Wayland like Firefox and most terminals will likely crash your compositor or at the very least crash those apps. It can crash or make your compositor unstable because if it in itself gets an error allocating GPU memory to spawn the window it can do whatever weird things it was programmed to do in that scenario.
I reported it here: https://github.com/NVIDIA/egl-wayland/issues/185
Some end users on the NVIDIA developer forums looked into it and determined it's likely a problem for everyone, it's just less noticeable if you have more GPU memory and it's especially less noticeable if you reboot daily since that clears all GPU memory leaks which is also apparent in a lot of Wayland compositors.
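If anyone wants to watch this happen on their own machine, a minimal sketch (assuming the nvidia-ml-py / pynvml package and an NVIDIA driver) is to poll free VRAM while opening GPU-accelerated apps and see how close you get to the allocation failures described above:

    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU, e.g. the 750 Ti
    try:
        while True:
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"VRAM used: {mem.used / 2**20:6.0f} MiB of {mem.total / 2**20:.0f} MiB")
            time.sleep(2)
    finally:
        pynvml.nvmlShutdown()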
gsliepen•1d ago
PunchyHamster•1d ago
dotancohen•23h ago
hsbauauvhabzb•23h ago
the_why_of_y•23h ago
For those who want to complain how lack of choice between multiple implementations is an obvious problem and deviates from UNIX tradition, please wait until the next systemd thread.
hsbauauvhabzb•11h ago
charcircuit•23h ago
That's an easy way to excuse bad design. Look at the designs of other operating systems designed by professionals and you won't see window managers having to handle raw inputs or being in the same process as the compositor.
the_why_of_y•23h ago
https://en.wikipedia.org/wiki/Desktop_Window_Manager
The Desktop Window Manager is a compositing window manager, meaning that each program has a buffer that it writes data to; DWM then composites each program's buffer into a final image.
https://web.archive.org/web/20040925095929/http://developer....
The Quartz Compositor layer of Mac OS X comprises the window server and the (private) system programming interfaces (SPI) implemented by the window server. In this layer are the facilities responsible for rudimentary screen displays, window compositing and management, event routing, and cursor management.
The window server is a single system-wide process that coordinates low-level windowing behavior and enforces a fundamental uniformity in what appears on the screen. It is a lightweight server in that it does not do any rendering itself, but instead communicates with the client graphics libraries layered on top of it. It is “agnostic” in terms of a drawing model.
The window server has few dependencies on other system services and libraries. It relies on the kernel environment’s I/O Kit (specifically, device drivers built with the I/O Kit) in order to communicate with the frame buffer, the input infrastructure, and input and output devices.
charcircuit•22h ago
Window management on MacOS is done by Dock which talks to Quartz Compositor where the underlying windows live.
abhinavk•22h ago
charcircuit•21h ago
simondotau•21h ago
charcircuit•20h ago
dagmx•18h ago
2. You’re provably wrong even if someone followed your description because you can kill the dock or explorer process and still be able to switch between windows and move them around. Killing explorer is a little more heavy handed than killing the dock but it doesn’t take down the window manager.
simondotau•6h ago
secure•23h ago
mananaysiempre•21h ago
immibis•20h ago
Similarly, tearing gets pixels to the screen faster.
silon42•19h ago
mananaysiempre•19h ago
My point is that, now that bare fillrate and framebuffer memory haven’t been a limiting factor for 15 to 20 years, it is a reasonable choice to build a desktop graphics system with the invariant of every frame being perfect—not even because of the user experience, but because that allows the developer to unequivocally classify every imperfect frame as a bug. Invariants are nice like that. And once that decision has been made, you cannot have asynchronous out-of-process window management. (I’m not convinced that out-of-process but synchronous is useful.) A reasonable choice is not necessarily the right choice, but neither is it moronic, and I’ve yet to see a discussion of that choice that doesn’t start with calling (formerly-X11) Wayland designers morons for not doing the thing that X11 did (if in not so many words).
To be clear, I’m still low-key pissed that a crash in my desktop shell, which was deliberately designed as a dynamic-language extensibility free-for-all in the vein of Emacs or TeX, crashes my entire graphical session, also as a result of deliberate design. The combination of those two reasonable decisions is, in fact, moronic. But it didn’t need to be done that way even on Wayland.
speed_spread•18h ago
hulitu•8m ago
Citation needed. Windows has its own share of graphics bugs, even on "corporate" hardware (HP).
Maskawanian•18h ago
jcelerier•18h ago
ptx•17h ago
> There are many race conditions that must be dealt with in input and window management because of the asynchronous nature of event handling. [...] However, not all race conditions have acceptable solutions within the current X design. For a general solution it must be possible for the manager to synchronize operations explicitly with event processing in the server. For example, a manager might specify that, at the press of a button, event processing in the server should cease until an explicit acknowledgment is received from the manager.
[1] https://web.mit.edu/6.033/2006/wwwdocs/papers/protected/xwin...
mananaysiempre•15h ago
[1] https://dx.doi.org/10.1002/spe.4380201409, https://people.freedesktop.org/~ajax/WhyX.pdf or http://os.4uj.org/WhyX.pdf
[2] https://news.ycombinator.com/item?id=15120308
[3] https://x.org/wiki/XorgDeveloperDocumentation/
someguyiguess•16h ago
exe34•22h ago
naikrovek•18h ago
I don’t know how GPU acceleration would have fit in, but I bet it would have been trivial provided the drivers were sufficient.
All of Rio in Plan9 is 6K lines of code and it’s a more powerful display protocol and window manager (all of the fundamentals are there but none of the niceties) than anything else I’ve ever seen.
The defacto way to remote into a Plan9 system from any OS even today is to use a client side program which implements it all the same way Plan9 does.
tuna74•7h ago
TacticalCoder•22h ago
To me the biggest issue of Wayland is that it aimed, on purpose, to imitate Windows or OS X or any GUI that is not built on the idea of a client/server protocol.
From TFA:
> I’ll also need a solution for running Emacs remotely.
If only there was something conceived from the start as a client/server display protocol...
gspr•21h ago
jauntywundrkind•4h ago
The big thing to me is, Wayland servers have way way less responsibility than X. X had a huge Herculean task, of doing everything the video card needed. It was a big honking display server because it took up a huge chunk of the stack to run a desktop.
Wayland servers all use kernel mode setting, kernel buffers, and so much more. So much of the job is already done. There is a huge shared code base that Wayland has that X never had: good kernels with actual drivers for GPUs.
If we wanted one stable platform that we could not innovate on, that was what it was and that we all had to deal with... we'd all just use Mac. PunchyHamster is saying The Cathedral is the right model and The Bazaar is the bad model, of the famous Cathedral vs Bazaar.
But that model really does not enable fast iteration and broader exploration of problem spaces. The ask doesn't even make sense: there are incredibly good libraries for making Wayland servers (wlroots, smithay, more). And they're not always even huge, but they do all the core protocols. Some people really want professional, industrial, direct software that they never have to think about, that only works one way and will only evolve slowly and deliberately. I'm thankful as fuck Wayland developers aren't catering to these people, and I think that's the wrong abstraction for open source and the wrong excitement for allowing timeless systems to be built, grown and evolved. We should avoid critical core dependencies, so that we can carry things into the future without being tied to particular code-bases. That seems obvious, and proposing otherwise is to consign ourselves to small, limp fates.
mindcrash•23h ago
Because both have their own portal implementation/compositor with their own issues and service spec implementations. KDE has xdg-desktop-portal-kde, and GNOME has xdg-desktop-portal-gnome. On top of that each (still) has their own display server; KDE has KWin, and GNOME has Mutter.
> The reference compositor, Weston, is not really usable as a daily driver.
Weston is probably good for two things: Running things in Kiosk mode and showcasing how to build a compositor.
That's why you should at least use xdg-desktop-portal if you are not running KDE or GNOME. But this on its own is just the vanilla portal service (without implementations of any freedesktop desktop protocols), and as-is it has no knowledge of things like screenshots or screensharing.
If you run any wlroots based compositor except Hyprland you should run xdg-desktop-portal-wlr which does implement the desktop protocols org.freedesktop.impl.portal.Screenshot and org.freedesktop.impl.portal.ScreenCast.
If you use Hyprland you should run its fork, xdg-desktop-portal-hyprland, instead, which additionally has things like file picking built in. Additionally you can/should run xdg-desktop-portal-gtk and/or xdg-desktop-portal-kde to respectively get GTK ("GNOME") and Qt ("KDE") specific implementations of the desktop protocols. And you absolutely should use xdg-desktop-portal-gtk instead of xdg-desktop-portal-gnome, because xdg-desktop-portal-gnome really doesn't like to share with others.
> With Wayland the each desktop is reinventing the wheel
Not really true, as I mentioned earlier there's still a DE specific display server running in the background (like Mutter and KWin-X11 for X11), and graphics in each compositor is driven directly by the graphics driver in the kernel (through KMS/DRM).
In fact, on paper and in theory, the architecture looks really good: https://wayland.freedesktop.org/architecture.html. However, in practice, some pretty big chunks of functionality at the protocol level are missing, but the freedesktop contributors and the GNOME and KDE teams will get there eventually.
WhyNotHugo•23h ago
Outside of the domain of Firefox/Chromium, screencasting is much more seamless. But 90% of screen-sharing happens in browsers.
mindcrash•22h ago
Well, I think you should blame Google and Mozilla for that.
KDE (through Discover, https://apps.kde.org/discover/) and GNOME (through Software, https://apps.gnome.org/Software/) both have innate support for Flatpak.
So, given that the majority of normie Linux users will use Flatpak to install a browser, they will just use and support that in the browser (because the underlying DE will more than likely have Flatpak support integrated too) and go on with their day to day.
Means that people who don't want to deal with Flatpak have to deal with Flatpak (or at least parts of it) too unfortunately.
michaelmrose•19h ago
Also, software stores show both native and flatpak or on Ubuntu snap. One can easily install the system package of chrome if one doesn't want to deal with flatpak.
thayne•12h ago
Not always. In my experience Zoom screencasting is much, much worse than on browsers in Wayland. But that isn't terribly surprising given how generally bad Zoom UX is on Linux.
rjzzleep•22h ago
2026 and you will still run into plenty of issues with random behaviour, especially if you run anything based on wlroots. Wine apps will randomly have pointer location issues if you run multiple displays. Crashes, video sharing issues with random apps, 10 bit issues. Maybe in 2027 we'll finally make it. But I feel like these 20 years of development could have been better spent on something that doesn't end up with 4 or more implementations.
abhinavk•21h ago
rjzzleep•21h ago
myaccountonhn•3h ago
I noticed it's far far more work to build a wm for Wayland than it is for Xorg.
imtringued•22h ago
People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
There is also a misunderstanding of the ideology the Wayland developers subscribe to. They want Wayland to be display only, but that doesn't mean they would oppose an input protocol or a window protocol. They just don't want everything to be under the Wayland umbrella like systemd.
michaelmrose•19h ago
mrighele•19h ago
Now, if only people deciding to replace X11 with Wayland heeded your suggestion...
tasuki•17h ago
ezst•21h ago
thayne•13h ago
There isn't any technical reason we couldn't have a single standardized library, at the abstraction level of wlroots.