A rather big problem is that Wayland is just a protocol, not an implementation. There are many competing implementations, like GNOME's, KDE's and wlroots. The problems you have with one of them might not appear in another. The reference compositor, Weston, is not really usable as a daily driver. So while with Xorg you have a solid base, and desktops are implemented on top of that, with Wayland each desktop is reinventing the wheel, and each of them has to deal with all the quirks of the graphics drivers. I think this is a big problem with the architecture of Wayland. There really should be a standard library that all desktops use. wlroots aims to be one, but I don't see GNOME and KDE moving to it anytime soon.
X.org picked the right level of abstraction (even if the implementation could use a rewrite). No WM should care about handling raw inputs or be forced to act as a proxy between the driver and the app for output (it could be, if it needed/wanted to, but there is no reason to add another layer of abstraction and cycle-wasting for most use cases). And it shows in complexity and even in power use. Wayland basically failed to learn the lessons from X11.
No, the lesson of “separate display server from window manager” was very clear when Wayland was started. People have been discussing this over the years ever since. (See also “client-side decorations” for another part of this issue that was heavily discussed.)
I seem to remember reading in an old paper (1990s?) that the asynchronous nature of the connection between the X server and the window manager results in essentially unfixable protocol-level races.
> There are many race conditions that must be dealt with in input and window management because of the asynchronous nature of event handling. [...] However, not all race conditions have acceptable solutions within the current X design. For a general solution it must be possible for the manager to synchronize operations explicitly with event processing in the server. For example, a manager might specify that, at the press of a button, event processing in the server should cease until an explicit acknowledgment is received from the manager.
I went and checked, and it was an SPE article from 1990[1]: “Why X is not our ideal window system” by Gajewska, Manasse, and McCormack. Section 3 has extensive diagrams of various races, including ones between clients and the window manager. Was even discussed here at one point[2], it turns out. Where I came across it I’m not sure (on X.org[3] possibly?).
Wayland has a philosophy of "every frame is perfect", which means fixing every race condition. However, X11 doesn't have this philosophy. If the window manager is slow and doesn't respond to a notification that a window has been resized, drawing the new window content over the old borders is the correct thing to do. What sense does it make to freeze the whole display just for a window border?
Similarly, tearing gets pixels to the screen faster.
Reacting somehow to user input, even if not perfect, is more important for me...
That's why we have HW cursors, and frame interpolation in games, etc...
Tearing and HiDPI are why I left Linux for Windows between 2012 and 2022. Once Wayland was good enough I returned. Tearing is awful, and should be opt-in (which Wayland provides), not opt-out.
Conversely, I much prefer lowest latency at the cost of tearing; when I'm forced to use Windows I generally disable the compositor there too whenever I can (I certainly don't use one under Linux, and that's one of my reasons for being there). I find macOS unusable; even on brand-new top-end Mac Studios the input lag and slow reaction of the OS to... any user input is frightening when you're used to your computer reacting instantly.
The races I recall being described were substantially worse, but that’s largely beside my point.
My point is that, now that bare fillrate and framebuffer memory haven’t been a limiting factor for 15 to 20 years, it is a reasonable choice to build a desktop graphics system with the invariant of every frame being perfect—not even because of the user experience, but because that allows the developer to unequivocally classify every imperfect frame as a bug. Invariants are nice like that. And once that decision has been made, you cannot have asynchronous out-of-process window management. (I’m not convinced that out-of-process but synchronous is useful.) A reasonable choice is not necessarily the right choice, but neither is it moronic, and I’ve yet to see a discussion of that choice that doesn’t start with calling (formerly-X11) Wayland designers morons for not doing the thing that X11 did (if in not so many words).
To be clear, I’m still low-key pissed that a crash in my desktop shell, which was deliberately designed as a dynamic-language extensibility free-for-all in the vein of Emacs or TeX, crashes my entire graphical session, also as a result of deliberate design. The combination of those two reasonable decisions is, in fact, moronic. But it didn’t need to be done that way even on Wayland.
Perfect frames is what Mac and Windows provide and what Linux should also aim for. Border tearing is a display bug, correctness should come first, Wayland's approach is right. X was designed for CPU and IO constraints that no longer apply. _Graceful_ degradation of slow UI should lower the frame rate, not compromise rendering of individual frames.
That's an easy way to excuse bad design. Look at the designs of other operating systems designed by professionals and you won't see windows managers having to handle raw inputs or being in the same process as the compositor.
The Desktop Window Manager is a compositing window manager, meaning that each program has a buffer that it writes data to; DWM then composites each program's buffer into a final image.
The Quartz Compositor layer of Mac OS X comprises the window server and the (private) system programming interfaces (SPI) implemented by the window server. In this layer are the facilities responsible for rudimentary screen displays, window compositing and management, event routing, and cursor management.
The window server is a single system-wide process that coordinates low-level windowing behavior and enforces a fundamental uniformity in what appears on the screen. It is a lightweight server in that it does not do any rendering itself, but instead communicates with the client graphics libraries layered on top of it. It is “agnostic” in terms of a drawing model.
The window server has few dependencies on other system services and libraries. It relies on the kernel environment’s I/O Kit (specifically, device drivers built with the I/O Kit) in order to communicate with the frame buffer, the input infrastructure, and input and output devices.
1. Nobody else is talking about managing windows as a user. They’re talking about the system that manages windows for drawing and interaction.
2. You’re provably wrong even if someone followed your description because you can kill the dock or explorer process and still be able to switch between windows and move them around. Killing explorer is a little more heavy handed than killing the dock but it doesn’t take down the window manager.
They should have looked at Plan9 and the Rio window manager there.
I don’t know how GPU acceleration would have fit in, but I bet it would have been trivial provided the drivers were sufficient.
All of Rio in Plan9 is 6K lines of code and it’s a more powerful display protocol and window manager (all of the fundamentals are there but none of the niceties) than anything else I’ve ever seen.
The defacto way to remote into a Plan9 system from any OS even today is to use a client side program which implements it all the same way Plan9 does.
Since this is a Wayland thread, obviously the problem is a lack of a common implementation, which deviates from UNIX tradition.
For those who want to complain how lack of choice between multiple implementations is an obvious problem and deviates from UNIX tradition, please wait until the next systemd thread.
The blanket statement "right level of abstraction" betrays a pretty narrow minded view. Right abstraction for what?
The big thing to me is, Wayland servers have way way less responsibility than X. X had a huge Herculean task, of doing everything the video card needed. It was a big honking display server because it took up a huge chunk of the stack to run a desktop.
Wayland servers all use kernel mode setting (KMS), kernel buffers, and much more, so much of the job is already done. There is a huge shared code base that Wayland has that X never had: good kernels with actual drivers for GPUs.
If we wanted one stable platform that we could not innovate on, that was what it was and we all had to deal with it... We'd all just use Mac. punchyHamster is saying The Cathedral is the right model and The Bazaar is the bad model, of the famous Cathedral vs Bazaar.
But the model really does not enable fast iteration & broader exploration of problem spaces. The ask doesn't even make sense: there are incredibly good libraries for making Wayland servers (wlroots, smithay, more). And they're not always even huge, but they do all the core protocols. Some people really want professional, industrial, direct software that they never have to think about, that only works one way and will only evolve slowly and deliberately. I'm thankful as fuck Wayland developers aren't catering to these people; I think that's the wrong abstraction for open source and the wrong posture for allowing timeless systems to be built, grown, and evolved. We should avoid critical core dependencies, so that we can send systems into the future without being tied to particular code-bases. That seems obvious, and proposing otherwise is to consign ourselves to small, limp fates.
> Wayland basically failed to learn the lessons from X11
To me the biggest issue of Wayland is that it aimed, on purpose, to imitate Windows or OS X or any GUI that is not built on the idea of a client/server protocol.
From TFA:
> I’ll also need a solution for running Emacs remotely.
If only there was something conceived from the start as a client/server display protocol...
I use it as my daily driver. I used Sway for a very long time, tried Hyprland for a bit and am now running niri as my daily driver. Sway and niri are wlroots based, Hyprland at some point rolled its own because they didn't want to wait for wlroots protocol extensions. Sometimes I have to switch to Gnome to do screen sharing.
2026 and you will still run into plenty of issues with random behaviour, especially if you run anything based on wlroots. Wine apps will randomly have pointer location issues if you run multiple displays. Crashes, video sharing issues with random apps, 10 bit issues. Maybe in 2027 we'll finally make it. But I feel like these 20 years of development could have been better spent on something that doesn't end up with 4 or more implementations.
> The problems you have with one of them might not appear in another.
Because both have their own portal implementation/compositor with their own issues and service spec implementations. KDE has xdg-desktop-portal-kde, and GNOME has xdg-desktop-portal-gnome. On top of that each (still) has their own display server; KDE has KWin, and GNOME has Mutter.
> The reference compositor, Weston, is not really usable as a daily driver.
Weston is probably good for two things: Running things in Kiosk mode and showcasing how to build a compositor.
That's why you should at least use xdg-desktop-portal if you are not running KDE or GNOME. But on its own this is just the vanilla portal service (without backend implementations of any of the freedesktop desktop protocols), and as-is it has no knowledge of things like screenshots or screensharing.
If you run any wlroots based compositor except Hyprland you should run xdg-desktop-portal-wlr which does implement the desktop protocols org.freedesktop.impl.portal.Screenshot and org.freedesktop.impl.portal.ScreenCast.
If you use Hyprland you should run its fork xdg-desktop-portal-hyprland instead, which additionally has things like file picking built in. Additionally you can/should run xdg-desktop-portal-gtk and/or xdg-desktop-portal-kde to respectively get GTK ("GNOME") and Qt ("KDE") specific implementations of desktop protocols. And you absolutely should use xdg-desktop-portal-gtk instead of xdg-desktop-portal-gnome, because xdg-desktop-portal-gnome really doesn't like to share with others.
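If you do end up mixing backends like this, recent xdg-desktop-portal versions let you pin which backend serves which interface via a portals.conf file. A minimal sketch for a wlroots compositor; the backend names (gtk, wlr) are assumptions to verify against the *.portal files installed on your system:

```ini
# ~/.config/xdg-desktop-portal/portals.conf
# Sketch: route most interfaces to the GTK backend, but let the
# wlroots backend handle screenshots and screencasting.
[preferred]
default=gtk
org.freedesktop.impl.portal.Screenshot=wlr
org.freedesktop.impl.portal.ScreenCast=wlr
```

After editing, restarting the xdg-desktop-portal service (or logging out and back in) picks up the new mapping.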
> With Wayland each desktop is reinventing the wheel
Not really true; as I mentioned earlier, there's still a DE-specific display server running in the background (like Mutter and KWin-X11 for X11), and graphics in each compositor are driven directly by the graphics driver in the kernel (through KMS/DRM).
In fact, on paper and in theory, the architecture looks really good: https://wayland.freedesktop.org/architecture.html. In practice, however, some pretty big chunks of functionality are still missing at the protocol level, but the freedesktop contributors and the GNOME and KDE teams will get there eventually.
The fact that we need the entire xdg-desktop-portal stack for screen sharing on browsers is a major annoyance. We now have a standardised extension for screencasting and screencopy (formerly it was not standard, but had been around for years), but browsers only support the Flatpak stack, which has a lot of moving parts and IPC. Doing out-of-band IPC for this is kind of pointless when the client and the server already have a Wayland connection to begin with.
Outside of the domain of Firefox/Chromium, screencasting is much more seamless. But 90% of screen-sharing happens in browsers.
> Outside of the domain of Firefox/Chromium, screencasting is much more seamless
Not always. In my experience Zoom screencasting is much, much worse than on browsers in Wayland. But that isn't terribly surprising given how generally bad Zoom UX is on Linux.
So, given that the majority of normie Linux users will use Flatpak to install a browser, they will just use and support that in the browser (because the underlying DE will more than likely have Flatpak support integrated too) and go on with their day to day.
Means that people who don't want to deal with Flatpak have to deal with Flatpak (or at least parts of it) too unfortunately.
Most distros come with Firefox which most normies will simply use as is.
Also, software stores show both native and Flatpak (or, on Ubuntu, Snap) versions. One can easily install the system package of Chrome if one doesn't want to deal with Flatpak.
The real problem with post X compositors is that the Wayland developers assumed that the compositor developers will develop additional working groups (an input protocol, a window management protocol, etc) on top of the working group that exclusively focuses on display aka Wayland. Wayland was supposed to be one protocol out of many, with the idea being that if Wayland ever turns out to be a problem it is small in scope and can be replaced easily.
People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
There is also a misunderstanding of the ideology the Wayland developers subscribe to. They want Wayland to be display only, but that doesn't mean they would oppose an input protocol or a window protocol. They just don't want everything to be under the Wayland umbrella like systemd.
> People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
Now, if only people deciding to replace X11 with Wayland heeded your suggestion...
Technically, X is also just a protocol. But there was just one main implementation of the server (X.org), and just a couple of implementations of the client library (Xlib and XCB).
There isn't any technical reason we couldn't have a single standardized library, at the abstraction level of wlroots.
I still don't know why I would want to use it. The benefits don't seem to outweigh the costs yet, and xorg is tried and true. So many Linux articles and forum posts about fixing problems with your desktop graphics start with "If you're using Wayland, go back to xorg, it'll probably fix the problem you're seeing."
You don't always have to replace something that works with something that doesn't but is "modern."
My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
There's no obvious reason for an end user to switch to Wayland if there aren't any particular problems with their current setup; the main improvements come down to things X11 never supported particularly well and are unlikely to be used in many existing X11 setups. My big use case that Wayland enabled was being able to dock my laptop and seamlessly switch apps between displays with different scale factors. And as an added bonus, my experience has been that apps, even proprietary ones like Zoom, tend to handle switching scale factors completely seamlessly. It's not of the highest importance, but I do like polish like this. (Admittedly, this article outlines that foot on Sway apparently doesn't handle this as gracefully. The protocols enable seamless switching, but of course, they can't really guarantee apps will always render perfect frames.)
OTOH though, there are a lot of reasons for projects like GNOME and KDE to want to switch to Wayland, and especially why they want to drop X11 support versus maintaining it indefinitely forever, so it is beneficial if we at least can get a hold on what issues are still holding things up, which is why efforts like the ones outlined in this blog post are so important: it's hard to fix bugs that are never reported, and I especially doubt NVIDIA has been particularly going out of their way to find such bugs, so I can only imagine the reports are pretty crucial for them.
So basically, this year the "only downsides" users need to at least move into "no downsides". The impetus for Wayland itself is mainly hinged on features that simply can be done better in a compositor-centric world, but the impetus for the great switchover is trying to reduce the maintenance burden of having to maintain both X11 and Wayland support forever everywhere. (Support for X11 apps via XWayland, though, should basically exist forever, of course.)
> having to maintain both X11 and Wayland support forever everywhere
I don't get why X11 shouldn't work forever. It works today. As you said, there's no obvious reason for an end user to switch to Wayland if there aren't any particular problems with their current setup. "Because it's modern" and "because it's outdated" just aren't compelling reasons for anyone besides software developers. And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
X11 as a protocol will probably continue to work ~forever.
X11 as a display server will continue to work ~forever as long as someone maintains a display server that targets Linux.
KDE and GNOME will not support X11 forever because it's too much work. Wayland promises to improve on many important desktop use cases where X.org continues to struggle and where the design of X11 has proven generally difficult to improve. The desktop systems targeting Linux want these improvements.
> "Because it's modern" and "Because it's outdated" just aren't compelling reasons for anyone besides software developers.
I can do you one better: that's also not really compelling to software developers either most of the time. I beg you to prove that the KDE developers pushed Wayland hard because they badly wanted to have to greatly refactor the aging and technical debt heavy KWin codebase, just for the hell of it. Absolutely not.
The Wayland switchover that is currently ongoing is entirely focused on end users, but it's focused on things they were never able to do well in X11, and it shows. This is the very reason why Wayland compositors did new things better before they handled old use cases at parity. The focus was on shortcomings of X11 based desktops.
> And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
Yeah. Except Apple is one of the five largest companies in the United States and GNOME and KDE are software lemonade stands. I bet if they could they would love to handle this switchover in a way that puts no stress on anyone, but as it is today it's literally not feasible to even find the problems that need to be solved without real users actually jumping on the system.
This isn't a thing where people are forcing you to switch to something you don't want under threat of violence. This is a thing where the desktop developers desperately want to move forward on issues, they collectively picked a way forward, and there is simply no bandwidth (or really, outside of people complaining online, actual interest) for indefinitely maintaining their now-legacy X11-based desktop sessions.
It actually would have been totally possible, with sufficient engineering, to go and improve things to make it maintainable longer term and to try to backport some more improvements from the Wayland world into X11; it in fact seems like some interested people are experimenting with these ideas now. On the other hand though, at this point it's mostly wishful thinking, and the only surefire thing is that Wayland is shipping across all form factors. This is no longer speculative, at this point.
If you really want to run X.org specifically, that will probably continue to work for a decently long time, but you can't force the entire ecosystem to continue supporting X.org any more than anyone can force you to switch to Wayland.
That was the first thing I noticed when I recently went back to messing with Linux distros after 15 years. Booting into Ubuntu and having to use Gnome Tweaks or whatever it’s called for basic customizations was incredibly confusing considering Linux is touted as being the customizable and personal OS. I doubt I’ll ever give Gnome another try after that.
I get the impression GNOME 3 is loosely a clone of OS X; I much prefer a Windows-esque desktop. I've never tried KDE but feel pretty at home with Xfce or Openbox. YMMV, but if you have the time they're worth trying if you're a recent Windows refugee.
GNOME is a much closer match for iPadOS than it is macOS due to how far it goes with minimalism, as well as how it approaches power user functionality (where macOS might move it off to the side or put it behind a toggle, GNOME just won’t implement it at all). Extensions can alleviate that to a limited extent, but there are several aspects that can’t be improved upon without forking.
At the end of the day these developers are almost entirely volunteers. Codebases that are a mess, ie X11, are not enjoyable to work on and therefore convincing people to use their discretionary time on it is more difficult. If there wasn't Wayland the current set of developers on Wayland might not have been doing DE work at all.
Attracting new contributors is an existential problem in OSS.
I prefer Wayland, as I feel Wayland's performance is much smoother than Xorg. Though, I have no use for VRR, and I hate the slight lag that is introduced due to font scaling, so I do not use it either.
But, I am stuck on Xorg only because of one app that I have to use to work.
> My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
This is already happening. To my knowledge, Arch Linux and Ubuntu have already switched to GNOME 49, which does not support X without recompilation. So most likely, any distro using GNOME 49 upwards will not provide Xorg by default. KDE is also going to do it soon.
Xorg is going away pretty soon
I believe it's a step in the right direction; the only issue is some annoying apps holding us back.
This is the real reason to make the wayland switch.
It doesn't really matter if you like or dislike Wayland; the major DEs have decided they don't like X11 and they are making the switch to Wayland. X11 code is actively being removed from these desktop environments.
If you want to use X11, you can either stay on an old unmaintained DE or switch to a smaller one that supports X11. But you should realize that with wayland being the thing major DEs are targeting, your experience with X11 will likely degrade with time.
Yes, and besides, developers who don't have to support two servers can focus on improving the DE where it actually matters. With that, fixing issues and adding features becomes much faster.
I see it as a win for both developers and users in the long run.
I've had to dive into xorg.conf more than once. Switched to Wayland as soon as it became an (experimental) option in Ubuntu and never looked back. Probably helps that I've always had AMD cards when running Linux, but it has been smooth sailing nonetheless. I can vaguely remember something not working under Wayland in the early days.. Maybe something with Wine or Steam? Anyway, that has to be 10 years ago now.
That's always what's missing from these threads. Wayland is boring actually. It just works. KDE, Sway, Niri, whatever, are all good.
I don't know what to do. The outpouring of negative energy is so severe. But I think it's so incredibly unrepresentative, so misleading. The silent majority problem is so real. Come to the better place.
I have essential workflows using x2x, xev, and xdotool. Apparently this kind of stuff is contrary to Wayland's security model, so I'm stuck on Xorg, and I'm ok with that.
Not the GP, but I recall the KeePass password manager using xdotool for its autotype feature. I struggled to get xdotool to work correctly back in 2014 on a Debian 7 personal computer. I'm not familiar with x2x or xev, though.
We are under an article which tells you that you can have problems with Wayland and HiDPI screens. And for example I'm one of those people who uses X11, because Wayland failed on many levels for me: buggy video playback, crashing while changing monitors, or simply while waking up my laptop with an external monitor attached. I didn't give it more than a few days to fix these (cheers to the author for trying this long), so I went back to X11. Which is still buggy, but at a "you can live with it" level.
Btw, everybody who I know, and I too, changes the font size and leaves the DPI scaling at 100%, or maybe 200%, on X11.
I have a setup with a high-DPI monitor mixed with a normal-DPI monitor, and KDE on Wayland just works fine. The only issues I found are LibreOffice doing weird over-scaling and Chrome/Chromium resizing its window into oblivion.
I don't really care about this but here's an example:
I have 2 27" screens, usually connected to a windows box, but while working they're connected to a MBP.
Before the MBP they were connected to several ThinkPads where I don't remember what screen size or scaling, I don't even remember if I used X11 or Wayland. But the next ThinkPad that will be connected will probably be HiDPI and with Wayland. What will happen without buying a monitor? No one knows.
Not to mention that fractional scaling is practically required in order to use the majority of higher DPI monitors on the market today. Manufacturers have settled on 4K at 27" or 32" as the new standard, which lends itself to running at around 150% scale, so to avoid fractional scaling you either need to give up on high DPI or pay at least twice as much for a niche 5K monitor which only does 60hz.
Fractional scaling is a really bad solution. The correct way to fix this is to have DPI-aware applications and toolkits. This does in fact work, and I have run Xfce under Xorg for years now on hi-DPI screens just by setting a custom DPI and using a hi-DPI-aware theme. When the goal is to have perfect output, why do people suddenly want to jump to stretching images?
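For context, the custom-DPI approach described above usually comes down to a single X resource; a minimal sketch, where 144 is an assumed example value (1.5 × the X default of 96 DPI), not a recommendation:

```
! ~/.Xresources — sketch of the custom-DPI approach described above.
! 144 is an assumed example; pick the value that matches your panel.
! Reload with: xrdb -merge ~/.Xresources
Xft.dpi: 144
```

Toolkits that honor Xft.dpi then render fonts larger without any image stretching, which is the whole point of the argument above.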
That doesn't gel with my experience; 1080p was the de facto resolution for 24" monitors, but 27" monitors were nearly always 1440p, and switching from 27" 1440p to 27" 4K requires a fractional 150% scale to maintain the same effective area.
To maintain a clean 200% scale you need a 27" 5K panel instead, which do exist but are vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
4K monitors aren't a significant expense at this point, and text rendering is a lot nicer at 150% scale. The GPU load can be a concern if you're gaming but most newer games have upscalers which decouple the render resolution from the display resolution anyway.
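The scale arithmetic running through this sub-thread is easy to sanity-check: effective (logical) area is just physical resolution divided by scale factor. A quick sketch using the monitor configurations mentioned above (awk handles the fractional 150% case):

```shell
# Effective (logical) resolution = physical resolution / scale factor.
effective() { awk -v w="$1" -v h="$2" -v s="$3" 'BEGIN { printf "%dx%d\n", w/s, h/s }'; }

effective 2560 1440 1.0   # 27" 1440p at 100%  -> 2560x1440
effective 3840 2160 1.5   # 27" 4K at 150%     -> 2560x1440 (same effective area)
effective 5120 2880 2.0   # 27" 5K at 200%     -> 2560x1440 (clean integer scale)
effective 1920 1080 2.0   # 14" FHD at 200%    -> 960x540 (comically large UI)
```

This is why 4K-at-150% and 5K-at-200% are interchangeable from a desktop-area standpoint; the 5K option just avoids fractional scaling at a much higher price.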
I used to be like this. I actually ran a 14" FHD laptop with a 24" 4k monitor, both at 100%. Using i3 and not caring about most interface chrome was great, it was enough for me to zoom the text on the 4k one. But then we got 27" 5k screens at work, and that had me move to wayland since 100% on that was ridiculously small.
Because although I don't care much about the chrome, I sometimes have to use it. For example, the address bar in firefox is ridiculously small. Also, some apps, like firefox (again) have a weird adaptation of the scroll to the zoom. So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
Also, 200% on an FHD 14" laptop means 960x540 px equivalent. That's too big to the point of rendering the laptop unusable. Also, X11 doesn't support switching DPI on the fly AFAIK, and I don't want to restart my session whenever I plug or unplug the external monitor, which happens multiple times a day when I'm at the office.
This really isn't that far off. If we imagined the screens overlaid semi-transparently, a 16-pixel letter would sit over a 14-pixel one.
If one imagines an ideal font size for a given user's preferred physical letterform height, one could imagine an idealized size of 12 on one screen and 14 on the other; setting it to 13 would be extremely close to ideal on both.
>So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
This is because it's scrolling a fixed number of lines, which occupy more space at 300% zoom. Notably, this applies pretty much only to people running high-DPI screens at 100%, because if one zoomed to 300% otherwise, the letter T would be the size of the last joint on your thumb and legally blind folks could read it. It doesn't apply to setting the desktop scale factor to 200%, nor to Firefox's internal scale factor, which is independent from the desktop's, supports fractional scaling in 0.05 steps, and can be configured in about:config.
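The fixed-lines-per-tick effect described above is simple arithmetic; a sketch with illustrative assumed numbers (not Firefox's actual defaults):

```shell
# A wheel tick scrolls a fixed number of text lines; each line is
# zoom times taller on screen, so on-screen scroll distance scales
# linearly with zoom. Values below are illustrative assumptions.
lines_per_tick=3
line_height_px=16
for zoom in 1 2 3; do
  echo "zoom ${zoom}00%: $(( lines_per_tick * line_height_px * zoom )) px per wheel tick"
done
```

At an assumed 3 lines of 16 px, that's 48 px per tick at 100% but 144 px at 300%, which is why scrolling feels so much coarser at high zoom.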
It does work and has worked for over a decade. You can configure scaling under settings in Cinnamon or Plasma, for instance, or via environment variables in a simple environment like i3wm.
The post is from the dev of i3wm, an X11 window manager, complaining among other things about how well his 8K monitor works under X11 and how poorly it works under Wayland.
You can also consult the Arch wiki article on HiDPI, which is broadly applicable beyond Arch.
In that time I've had HiDPI work perfectly, first on Nvidia and more recently on AMD GPUs, on several different distros and desktops, all running on X. They all worked out of the box and were able to scale correctly once configured.
The totality of my education on the topic was reading the arch wiki on hidpi once.
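For a bare window manager like i3, the environment-variable approach mentioned above is typically a few exports in your session startup. A sketch with assumed example values; the variable names are the commonly documented ones, but exact behaviour varies by toolkit version, so treat these as starting points:

```shell
# Per-toolkit HiDPI environment variables for a bare-WM X session.
# Values here are illustrative assumptions, not recommendations.
export GDK_SCALE=2           # GTK 3/4: integer UI scale
export GDK_DPI_SCALE=0.5     # GTK: scale fonts back down if the UI doubles them
export QT_SCALE_FACTOR=1.5   # Qt: accepts fractional factors
```

Putting these in ~/.profile (or wherever your display manager sources environment) applies them to every app the WM launches.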
AFAIK one cannot span one X session across multiple GPUs, although AMD had something it once referred to as "Eyefinity" for achieving this.
It is rarely needed; discrete GPUs often support 3 or even 4 outputs.
One may wonder if you tried this a very long time ago, back when AMD sucked and Nvidia worked well (2005-2015).
Right now with X11, IIRC, if one application has access to your display, it can read what is going on in other applications running on the same display.
If browser tabs were able to do that, all hell would break loose. So why do we accept it from applications?
Anyway, despite this, I still use X11 instead of Wayland because of all of Wayland's shortcomings.
That leads to technical debt in the long term. Yes, it might be working well for now but the more outdated it becomes, the harder it will be to maintain later.
>nVidia refused to support the API that Wayland was using, insisting that their EGLStreams approach was superior
This is a common mischaracterization of what happened. This API, GBM, was a proprietary API that was part of Mesa. Nvidia couldn't add GBM to their own driver, as it is a Mesa concept. So instead Nvidia tried to make a vendor-neutral solution that any graphics driver could use, which is where EGLStreams comes into the picture. Such an EGL API was also useful for other, non-Wayland embedded use cases. As for the proprietary Nvidia driver's GBM support, Nvidia themselves had to add support to the Mesa project for dynamically loading backends that weren't precompiled into Mesa. Only then were they able to make their own backend.
For some reason when this comes up people always phrase it in terms of Nvidia not supporting something instead of the freedesktop people not offering a way for the Nvidia driver to work, which is a prerequisite of Nvidia following such guidance.
I mean proprietary API in the sense that the API is solely owned and developed by Mesa. It is not a standardized API, but a custom one specific to their project.
Even today if you use the API your program has to link to Mesa's libgbm.so as opposed to linking to a library provided by the graphics driver like libEGL.so.
OK, leaving aside the fact that "proprietary" has a very well-defined meaning in this context and using it makes your comment very charged: you're basically telling us that Nvidia was not willing to implement an API for their drivers, but tried to push for one designed by themselves (you call it "vendor neutral", but since Mesa is not an actual GPU vendor, that's most likely another subtle mistake on your part that completely changes the meaning of your words), and that all the other vendors (Intel and AMD at this point), which had already implemented GBM, should switch in the name of this?
How can you call all of that a mischaracterization? In my humble opinion, and I am not anything more than a bystander in this with only superficial knowledge of the domain, it's you that is trying to mischaracterize the situation.
>leaving aside the fact that "proprietary" has a very well defined meaning in this context
Yes, it does, and it is different from the well-defined meaning used when talking about the software itself. OpenGL is an open API, but the source code of an implementation isn't necessarily open.
>Nvidia was not willing to implement an API for their drivers
They couldn't because this API is a part of Mesa itself. As I mentioned programs link to a Mesa library directly.
>since Mesa is not an actual GPU vendor
They are a driver vendor.
>the other vendors (Intel and AMD at this point), which have already implemented GBM
Support was added by those companies to Mesa itself, not to their drivers. The proprietary, now-deprecated AMD kernel module still doesn't support GBM.
>should switch too in the name of this
I think it is beneficial for standards to be implemented by multiple vendors, so I think they should implement it at least.
>How can you call all of that a mischaracterization?
What people think as Nvidia needing to implement an API is actually an ask for Nvidia to make a Mesa API work.
From my perception, the ask was essentially that Nvidia open-source their kernel driver like AMD did, and then eventually an Nvidia GBM backend would be built into Mesa for it. For obvious reasons this was never going to happen. The fact that no agreeable solution was figured out in about a decade, and that Nvidia then had to code up that solution for the Mesa project themselves, is a failure on Mesa's end. A lot of user pain happened due to them not being willing to work together with proprietary software and focusing solely on supporting open source drivers.
> For obvious reasons this was never going to happen.
Well, I guess this is the crux of the problem, and for open-source enthusiasts like me this is not obvious at all. What we can surmise is that Nvidia refused to collaborate, and therefore they were the party to blame for their video cards not being supported as well as other vendors' on Linux.
>What we can surmise is that Nvidia refused to collaborate
I saw more effort on Nvidia's side trying to collaborate than on the Wayland side. I think it's unfair to not call out the people who had a hardline stance of only caring about open source drivers and didn't want to do the work to onboard Nvidia.
In much the same way that NVIDIA may have felt that EGL was the better choice.
However, none of your description of the way things are explains why NVIDIA couldn't have made their own libgbm that matched the symbols of Mesa's, and worked on standardizing the API de facto.
It may not just be NVIDIA. From what I understand, any open source solution is stuck with second-rate graphics support on Linux, simply because the groups behind HDMI and other graphics-related standards have overly restrictive licensing agreements. Valve ran directly into that while working on its newest console: the AMD drivers for its GPU cannot legally provide full support for HDMI 2.1.
I've been using wayland with Gnome for years without a single issue.
Arguably my hardware is a lot simpler and I don't use Nvidia. But I just want to point out that, for all the flak wayland receives, it can work quite well.
Me too. But first with Sway in 2016, then with KDE Plasma 6. Everything works flawlessly; everything runs in native Wayland except Steam games. I have preferred AMD or Intel hardware over NVIDIA since forever.
I've used Wayland on Gnome for maybe 1-2 years at this point, always with Nvidia hardware. It works OK now, but didn't 2 years ago, and before that it used to be very janky; today it is smoother than Xorg. At this point, I don't think there is a single blocker left for me. It took some time to rewrite some programs I have that control their own window position and want to see what other applications are running, as the design of Wayland doesn't really "allow" those sorts of things, but in the end it was easy to work around with a Gnome Shell extension.
I'm having more issues with games/websites/programs that didn't take high display refresh rate into account, than Wayland, at this point.
I remember having a gentleman over, I think to fix something or other, and when he walked into the living room he explained that my CRT monitor was misconfigured and, to his perception, had a visible flicker. We checked and it was indeed misconfigured; I couldn't see it, but it was such an aberration to him that he took time away from his actual job to make the flicker go away.
You will also note many items in the post above are papercuts that might go unnoted like input feeling a little worse or font issues.
I just recently switched to Linux since I had some weird Windows issues I couldn't fix. I've tried to switch a few times before, but the main problem at some point was that I didn't have proper fractional scaling on Linux. And that alone pretty much made Linux unusable for me on my specific hardware.
Wayland fixes that, so that part is a huge improvement for me. Unfortunately this also limited my choice of distros, as not all of them use Wayland. I landed on Ubuntu again, despite some issues I have with it. The most annoying initially was that the Snap version of Firefox didn't use hardware acceleration, which makes it just barely usable.
Yeah, fractional scaling is absolutely the one thing that I miss on Linux. On X11 it's too slow and laggy. On Wayland I have... Wayland issues.
I don't entirely love MacOS (mostly because I can't run it on my desktop, lol). But it does fractional scaling so well, I always choose the "looks like 1440p" scaling on 4K resolution, and literally every app looks perfect and consistent and I don't notice any performance impact.
On windows the same thing, except some things are blurry.
On Linux yeah I just have to bear huge UI (x2 scaling) or tiny UI (X1) or live with a noticeable performance delay that's just too painful to work with.
It seems Wayland has fractional scaling, though only recently. But the bottom line is that high-DPI handling should be done at the GUI toolkit level; compositor scaling is just a dirty fix for legacy GUI apps.
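On GNOME specifically, fractional scaling under Wayland has long been gated behind an experimental mutter setting; a commonly cited toggle is the one below, though availability and the exact key name vary by GNOME release, so treat this as an assumption to verify for your version:

```
# Experimental: enable fractional monitor scaling in GNOME/mutter under Wayland
gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
```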
I am the same, now. But I did have it working previously on Nvidia and it was good enough. I’ve also used the TILE patch at work and that seemed pretty good on the 5k screens they have there.
I switched to get support for different scaling on different outputs and I have gone back.
So much NVidia hate, but in 23 years the only problems I've had with NVidia on Linux were when they dropped support for old GPUs. Even on proprietary hardware like iMacs and MacBooks.
Even when it worked, it was more clunky than AMD at all times. It came in the form of a huge driver bundle you had to install manually (or rely on a distro to do it for you), while AMD GPUs just worked due to `amdgpu` being part of the kernel since forever. Then there is the EGLStreams debacle, where Nvidia lost a lot of goodwill in my opinion, including mine. And finally, Nvidia has managed to open-source their driver on Linux, except that it's less performant than the closed source one, and thus still a second-class citizen. Correct me if I am wrong, please.
The better path on Linux was always AMD, and still is, to this day, since it simply works without me needing to care about driver versions, or open vs closed source, at all.
AMD used to be terrible on Linux, perhaps before your time. Nvidia was always the choice if you needed functional hardware acceleration that worked on par with Windows. The Nvidia driver was (still is?) the same driver across platforms, with a compatibility shim for each OS. This is how Nvidia managed to have best-in-class 3D acceleration across Windows, FreeBSD, and Linux for decades now. OpenGL support on AMD was historically really bad, and AMD support was through a proprietary driver back in the day as well. Part of the reason the AMD/ATI open source driver gained so much traction and support was that the proprietary driver was so bad! Then you get onto other support for things like CUDA for professional work, where Nvidia has always been light years ahead of any other card manufacturer.
Source: was burned many times by ATI's promises to deliver functioning software over the years. Been using Nvidia on Linux and FreeBSD for as long as I can recall now.
Nvidia and Intel on Linux for near on 20 years now, and I also agree: generally the ATI/AMD experience was markedly worse.
Currently dual 3090s in this box and nvidia is still as simple as just installing the distro package.
There was a period in the mid 2010s where trying to get better battery life on laptops by optionally using the discrete gpu vs the integrated was a real pain (bumblebee/optirun were not so solid), but generally speaking for desktop machines that need GPU support... Nvidia was the route.
Don't love their company politics so much, although I think they're finally getting on board now that so many companies want to run GPU accelerated workloads on linux hosts for LLMs.
But ATI sucked. They seem to have finally gotten there, but they were absolutely not the best choice for a long time.
Hell - I still have a machine in my basement running a GTX970 from 2015, and it also works fine on modern linux. It currently does the gpu accel for whisper speech to text for HA.
It's ridiculous to say that AMD (previously ATI) has always been the better choice, and I don't think anybody who used the ATI drivers would agree with you. For years the only reliable way to get GPU acceleration on Linux was basically NVidia.
When AMD bought ATI they started work on the open source drivers and improved the situation, but they had already lost me as a GPU customer by that point.
Maybe now in the 2020s AMD has caught up, and I'll keep them in mind next time I buy a GPU, but I've been happy with NVidia for a long time.
It would be nice if the NVidia driver were in the kernel and open source, but the Debian package has just worked for a very long time now.
Heh, interesting seeing we use pretty much the same things, i3+NixOS+urxvt+zsh+Emacs+rofi+maim+xdotool, only differentiating in browser choice (it's Firefox for me) and (me) not using any term multiplexer.
>So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
Kudos to Michael for even attempting it. Personally, nowadays, unless my working stack stops, well, working, or there are significant benefits to be found, I don't really feel like even putting in the effort to try the shiny new things out.
I'm not switching to Wayland until my window manager supports it. It doesn't look like anybody has time to do the work, so I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
I feel like the biggest issue for Wayland is the long tail of people using alternative WMs. A lot of those projects don't have manpower to do what amounts to a complete rewrite.
I honestly don't have a preference between Wayland and X, but I feel very strongly about keeping my current WM. XWayland supposedly works, but I'm not in any hurry to add an extra piece of software and extra layer of configuration for something I already have working exactly the way I want. If Wayland offered some amazing advantages over X, it might be different, but I haven't seen anything to win me over.
> I'm not switching to Wayland until my window manager supports it.
Looking at your github, it seems you use StumpWM. It seems they are also working on a wayland version under the name Mahogany. Development seems pretty active: https://github.com/stumpwm/mahogany
> I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
FWIW I think "wayback" is the project for this. It seems to be trying to use XWayland to run full X11 desktop environments on top of Wayland: https://gitlab.freedesktop.org/wayback/wayback
I've been running Wayland on a Framework laptop and it just works. It drives my 4K external monitor, quickly switches to single screen, does fractional scaling well, and runs all my apps without complaint.
I had an old Chromebook which had Lubuntu on it - screen tearing was driving me crazy so I switched to Wayland and it is buttery smooth. No mean feat given the decrepit hardware.
I'm sure someone will be along to tell me that I'm wrong - but I've yet to experience any downsides, other than people telling me I'm wrong.
I don't think Wayland is fully ready, at least not with NVIDIA GPUs with limited GPU memory.
I have a 7,000 word blog post and demo videos coming out this Tuesday with the details but I think I uncovered a driver bug having switched to native Linux a week ago with a low GPU memory card (750 Ti).
Basically, on Wayland, apps that request GPU memory will typically crash if there's no more GPU memory to allocate, whereas on X11 those requests are transparently offloaded to system memory, so you can open up as much as you want (within reason) and the system remains completely usable.
In practice this means opening up a few hardware-accelerated apps in Wayland, like Firefox and most terminals, will likely crash your compositor, or at the very least crash those apps. It can crash or destabilize your compositor because, if the compositor itself gets an error allocating GPU memory to spawn a window, it can do whatever weird thing it was programmed to do in that scenario.
Some end users on the NVIDIA developer forums looked into it and determined it's likely a problem for everyone, it's just less noticeable if you have more GPU memory and it's especially less noticeable if you reboot daily since that clears all GPU memory leaks which is also apparent in a lot of Wayland compositors.
For me wayland offers only downsides, without any upsides. I feel the general idea behind it (pushing all complexity and work onto other layers) is broken.
I'll stick to xorg and openbox for many years to come.
Moving complexity from the layer you only have one of, to the layers where there are many, many competing pieces of software, was an absolutely bonkers decision.
It's hard to imagine a statement that could fly more in the face of open source.
It's absolutely an essential characteristic for long term survival, for long term excellence. To not be married to one specific implementation forever.
Especially in open source! What is the organizational model for this authoritarian path? How are you going to - as Wayland successfully has - get every display-server person on board? Who would have the say on what goes into The Wayland Server? What would the rules be?
Wayland is the only thing that makes any sense at all: a group of peers, fellow implementers, each striving for better, who come together to define protocols. This is what made the internet amazing, what made the web the most successful media platform, and what creates the possibility for ongoing excellence. Not being bound to fixed decisions is an option most smart companies lust after, yet somehow when Wayland vs X comes up, everyone desperately wants there to be one and only one path, set forth three decades ago, that no one can ever really overhaul or redo.
It's so unclear to me how people can be so negative, so short, and so mean about Wayland. There's no viable alternative organizational model for an authoritarian display server. And even if you somehow did get people signed up for this fantasy, there's such a load of pretense that it would have cured all ills. I don't get it.
I think a big part is maintenance: Xorg doesn't look likely to be maintained long into the future the way Wayland will be. And a lot of the Xorg maintainers are now working on Wayland.
So, good or bad idea, Wayland is slowly becoming the default by virtue of being the most actively maintained compositor.
Wayland is not a compositor. Being more maintained than Xorg doesn't mean anything, because Wayland doesn't do a tenth of the things Xorg did.
What used to be maintained in one codebase by Xorg devs is now duplicated in at least three major compositors, each with their own portal implementation and who knows what else. And those are primarily maintained by desktop environment devs who also have the whole rest of the DE to worry about.
This guy started that Xlibre fork after throwing a fit because he was told not to break Xorg with his contributions, and he ranted that he just wants to be able to merge whatever he wants. I would not trust the stability of that fork at all.
Looks to me like he's a belligerent personality, but probably not wrong when he says Redhat has an agenda that involves suppressing progress on Xorg and forcing Wayland on users instead.
I was open minded toward Wayland when the project was started... in 2008. We are 18 years down the road now. It has failed to deliver a usable piece of software for the desktop. That's long enough for me to consider it a failed project. If it still exists, it's probably for the wrong reasons (or at the least, reasons unrelated to any version of desktop Linux I want to run, like perhaps it has use in the embedded space).
Taking the proposition as true, what goal does Redhat have in "forcing wayland on users"? I am asking this in good faith, I literally naively do not understand what the "bad" bit is.
Like, OK, it's 2030 and X11 is dead, no one works on it anymore and 90% of Linux users use Wayland: what did they gain? I know they employed Poettering, but not anymore, and AFAIK they contribute a non-trivial amount of code upstream to Linux, Gnome, KDE. If more users are on Wayland they can pressure Gnome to... what?
I sort of get an argument around systemd and this, in that they can push their target feature sets into systemd and force the rest of the ecosystem to follow them. But, well, I guess I don't get that argument either, because they can already put any special sauce they want into Redhat's shipped systemd, and if it's good it will be picked up, and if it's bad it won't be.
I guess, if Redhat maintains systemd & wayland, then they could choke out community contributions by ignoring them or whatever, but wouldn't we just see forks? Arch would just ship with cooler-systemd or whatever?
- Maintaining X requires a lot of time, expertise and cost, it's a hard codebase to work with, deprecating X saves them money
- Wayland is simpler and achieves greater security by eliminating features of the desktop that most users value, but perhaps Redhat's clients in security-conscious fields like healthcare, finance and government are willing to live without
So I suspect it comes down to saving money and delivering something they have more control of which is more tailored to their most lucrative enterprise scenarios; whereas X is an old mess of cranky unix guys and their belligerent libre cruft.
There are some parallels to systemd I guess, in that its design rejected the Unix philosophy, and this was a source of concern for a lot of people. Moreover at the time systemd was under development, my impression of Poettering was that he was as incompetent as he was misguided and belligerent - he was also advocating for abandoning POSIX compatibility, and PulseAudio was the glitchiest shit on my desktop back then. But in the end systemd simply appeared on my computer one day and nothing got worse, and that is the ultimate standard. If they forced wayland on me tomorrow something on my machine would break (this is the main point of the OP), and they've had almost 20 years to fix that but it may arguably never get fixed due to Wayland's design. So Wayland can go the way of the dodo as far as I'm concerned.
Can't edit: I mention Poettering above because I remember similar arguments against his stuff (that I also never really fully understood in terms of "end game"), not because I have personal animosity against him or his projects, or want to hold him up as an "example of what can go wrong".
There is a massive class of things in open source you can look at from the perspective of "Suppose a megacorp or private equity owns this entity and wants to cut costs as much as possible while contributing as little back to the community/ecosystem as possible... what happens next?" And boom you can suddenly see the Matrix. So in the case of Redhat it's likely just IBM being IBM at the financial level and all these little decisions trend a certain way in the long run because of that
> ...what goal does Redhat have in "forcing wayland on users"?
The same goal any group of savvy corporate employees has when their marquee project has proved to be far more difficult, taken way longer, and required far more resources than anticipated to get within artillery distance of its originally-stated goal?
I've personally seen this sort of thing play out several times during my tenure in the corporate environment.
I honestly don't know what that means; I've never worked in a big company/corporation. Try and disown it? How does that fit with Xlibre's anti-corporate-control stance? I guess if they push Wayland and then drop it, we're left with X11 ignored and a non-financially-supported alternative?
I guess I just don't get how the third E in EEE plays out in an open source environment.
Ridiculous. Wayland all in all provides a far better experience than X11, and Wayland projects like Plasma, Hyprland, Sway etc. are very much not failed.
Might be a stupid question, but what's wrong with Xorg?
I know that it wasn't originally conceived to do what it does today, but I've never had any problem using it, and when I tried Wayland I didn't notice any difference whatsoever.
Is it just that it's a pain to write apps for it..?
It makes sand-boxing security impossible. The moment a process has access to the Xorg socket, it has access to everything. It is weird that this is so often missing from the discussion, though.
QubesOS and Xpra+Firejail demonstrate security can be improved on the X11 side too. Solaris had Trusted Extensions. X11Libre has a proposal for using different magic cookies to isolate clients and feed dummy data to the untrusted. Keith Packard also proposed something in 2018.
It is already possible today. There are access control hooks provided via XACE. Nobody uses them because the attack scenario is basically non-existent. If you run untrusted malicious apps having full access to your home directory you have big problems anyways. Not giving them access to e.g. the screen coordinates of their windows won't help you much then.
I often see comments of "everything works perfectly in Wayland", which makes me wonder how many features some people use. I've tried Wayland a few times now and have always noticed small quirks. A few current examples: shading a window leaves an invisible section that can't be clicked where the window was; shading and other window activities are inconsistent across various window types (terminal, file manager, etc.); picture-in-picture mode in browsers doesn't maintain aspect ratio; picture-in-picture doesn't maintain "always on top" or position when enabling it (I've managed to fix the "always on top" part by writing a rule applied to windows with "Picture in picture" as the title, at least).
KDE Plasma switched to Wayland by default sometime last year, and so far the main issue I run into is that a few screen recording tools I like stopped working. (Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point.) Other than some initial instability with accelerated rendering on my GPU, which was quickly addressed, it kinda just works. I mostly don't notice.
Actually, GPU acceleration was why I initially switched. For whatever reason, this GPU (Radeon VII) crashes regularly under X11 nearly every time I open a new window, but is perfectly stable under wayland. Really frustrating! So, I had some encouragement, and I was waiting for plasma-wayland to stabilize enough to try it properly. I still have the X11 environment installed as a fallback, just in case, but I haven't needed to actually use it for months.
Minor pain points so far mostly include mouse acceleration curves being different and screen capture being slightly more annoying. Most programs do this OS-level popup and then so many follow that up with their own rectangle select tool after I already did that. I had some issues with sdl2-compat as well, but I'm not sure that was strictly wayland's fault, and it cleared up on its own after a round of updates. (I develop an SDL2 game that needs pretty low latency audio sync to run smoothly)
> Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point
I use it extensively, it's easy to use, UI is compact but clear, works perfectly all the time. I honestly don't care that it is unmaintained at this point.
> KDE Plasma switched to Wayland by default sometime last year, and so far the main issue I run into is that a few screen recording tools I like stopped working. (Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point.) Other than some initial instability with accelerated rendering on my GPU, which was quickly addressed, it kinda just works. I mostly don't notice.
FWIW, I have a KDE Wayland box and OBS works for screen recording. Slightly more complex than simplescreenrecorder, but not bad.
I've been using Kooha, but it's painful in a few ways, not least of which having aggressive compression that can't be disabled. SSR was nice because of the reduced time between "decide to record" to "draw a rectangle, done." OBS works, but is very clunky and cumbersome to reconfigure.
At some point I'll get irritated enough to seek out more alternatives and give them a whirl. Such is fate :)
2026 is starting with half-baked NVidia drivers and missing functionality on linux? I am so surprised... did you try 17 different previous versions to get it running in true NV-Linux fashion?
This stuff has been flawless on AMD systems for a couple of years now, with the exception of the occasional archaic app that only runs on X11 (and thus gets shoved in a container).
Flawless on AMD? Absolutely not. 2-3 years ago there used to be a amdgpu bug that froze my entire laptop randomly with no recourse beyond the 4 second power button press. After that was fixed, it sometimes got stuck on shutdown. Now it doesn't do that randomly anymore, but yet all it takes to break it, is to turn off the power to my external monitor (or the monitor powering off by itself to save energy) or unplugging it, after which it can no longer be used without rebooting and then sometimes it gets stuck on shutdown.
Clarification: The AMD iGPU driver (or Chrome) on Ubuntu 24.04 has bugs on your hardware. You could try a newer and different distro (just using a live-USB) to see if that has been fixed.
I recently upgraded to Ubuntu 25.10, and decided to give Wayland another go since X.org isn't installed by default anymore.
Good news: My laptop (Lenovo P53) can now suspend / resume successfully. With Ubuntu 25.04 / Wayland it wouldn't resume successfully, which was a deal breaker.
Annoying thing: I had a script that I used to organize workspaces using wmctrl, which doesn't work anymore so I had to write a gnome-shell extension. Which (as somebody who's never written a gnome-shell extension before) was quite annoying as I had to keep logging out and in to test it. I got it working eventually but am still grumpy about it.
Overall: From my point of view as a user, the switch to Wayland has wasted a lot of my time and I see no visible benefits. But, it seems to basically work now and it seems like it's probably the way things are headed.
Edit: Actually I've seen some gnome crashes that I think happen when I have mpv running, but I can't say for sure if that's down to Wayland.
At this point the primary thing that's keeping me from switching to Wayland (KDE) is lack of support for remote desktop software, especially with multiple monitors...
Hopefully AnyDesk and Remmina will address this issue before KDE ends its mainline X11 support next year.
I've had a similar issue recently and I found that rustdesk[0] works pretty well for casual use despite wayland support being labelled experimental. I use it for pair programming with someone on multiple monitors while I'm on a laptop and all the switching and zooming required worked.
It doesn’t have to be like X11. Presumably, it’d be something you could disable if you’d like.
It’d be very handy if we had a performant remote desktop option for Linux. I could resume desktop sessions on my workstation from my laptop and I could pair program with remote colleagues more effectively.
In the past I’d boot into Windows and then boot my Linux system as a raw disk VM just so I could use Windows’s Remote Desktop. Combined with VMware Workstation’s support for multiple monitors, I had a surprisingly smooth remote session. But, it was a lot of ceremony.
Glad to see a good write-up of Wayland issues. My day-to-day doesn't run into the vast majority of these problems so when I see people melt down over a single trivial seeming Wayland choice about window coordinates then I have a really hard time relating.
This post is a lot more relatable.
As an aside, regarding remote Emacs - I can attest that Waypipe does indeed work fantastically for this. Better than X11 ever worked over the network for me.
I, too, suffer from the "pgtk is slow" issue (only a 4K monitor, though, so it's mitigable and manageable for me).
As an Emacs PGTK user, do you have any experience with modifiers beyond the basic 4? I recently tried to use PGTK Emacs and it seems to not support e.g. Hyper, which is a bummer, because I extensively use Hyper in my keybindings.
Note to people on this thread: the impression the discussions give is that Linux isn't ready for prime time desktop use. I thought Wayland was the latest and greatest, but folks here report issues and even refuse to ever use it.
Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026. If you are a Linux on desktop advocate, read the comments and see why so many are still hesitating.
>I thought Wayland was the latest and greatest, but folks here report issues and even refuse to ever use it.
>Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
Quite ironically, there are people refusing to leave Windows 7, which has been EOS since 2020, because they find the modern Windows UI unbearable. Windows 11 is considered so bad that people are actually switching OSes because of it. I've seen similar comments about OSX/macOS.
The big difference between those and Linux is that Linux users have the choice to reject forced "upgrades" and build very personalized environments. If I had to live with Wayland I could do it, really, even if there are issues, but since my current environment is fine I don't really need or care to. And it's exactly with a personalized environment that such a change is a chore. If I were using a comprehensive desktop environment like GNOME (as many people do), maybe I wouldn't even notice that something had changed underneath.
> Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
LOL
I installed a fresh Windows 11 yesterday on a fairly powerful machine, and everything lags so much on a brand-new install it's unreal. Explorer takes ~2-3 seconds to be usable. Any app that opens in the blink of an eye under Linux on the same machine takes seconds to start. The Start menu lags. It's just surreal. People who say these things work have just never used something that is actually fast.
I am not sure how people get all these issues. I installed a fresh Windows recently, and I don't see any noticeable slowdowns.
Linux is faster in some places, maybe. But it still has many issues, like some applications not being drawn properly, or simply not being available (e.g. a nice GUI for monitor control over DDC).
Anecdotally, everything works flawlessly on my work machine: Optiplex Micro, Intel iGPU, Fedora KDE 43, 4K 32" primary monitor at 125% scale, 1440p 27" secondary monitor at 100%. No issues with Wayland or with anything else.
Everything actually feels significantly more solid/stable/reliable than modern Windows does. I can install updates at my own pace and without worrying that they'll add an advert for Candy Crush to my start menu.
I also run Bazzite-deck on an old AMD APU minipc as a light gaming HTPC. Again, it's a much better experience than my past attempts to run Windows on an HTPC.
As with everything, the people having issues will naturally be heard louder than the people who just use it daily without issues.
As a long-time Linux user I've also felt an incongruity between my own experiences with Wayland and the recent rush of "year of the Linux desktop" posts. To be fair, I think the motivation is at least as much about modern Windows' unsuitability for prime time as about Linux's suitability. I haven't used Windows for a long time so I can't say how fair that is, but I definitely see people questioning 2026 Windows' readiness for prime time.
For me, Wayland seems to work OK right now, but only since the very latest Ubuntu release. I'm hoping at this point we can stop switching to exciting new audio / graphics / init systems for a while, but I might be naive.
Edit: I guess replacing coreutils is Ubuntu's latest effort to keep things spicy, but I haven't seen any issues with that yet.
Edit2: I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime". You often had to edit config files to get stuff working, and there were frustrating deficits in the application space, but the "desktop" felt fine, with X11, Alsa, SysV etc. Two decades on we're on the cusp of having a reliable graphics stack.
>I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime".
I feel the same and find it a bit strange. I am happy with hyprland on wayland since a few months back but somehow it reminds me of running enlightenment or afterstep in the 90s. My younger self would have expected at least a decade of "this is how the UI works in Linux and it's great" by now.
Docker and node both got started after wayland and they are mature enterprise staples. What makes wayland such a tricky problem?
But then I try and focus on what each author thinks is important to them and it’s often wildly different than what’s important to me.
But a lot of internet discussion turns into very ego-centric debate, including on here, where a lot of folks who are very gung-ho on the adoption of something (let's say Linux, but it could be anything) don't adequately try to understand that people have different needs, and push the idea of adoption very hard in the hopes that once you're over the hump you might not care about what you lost.
I don't know what "ready for prime time desktop use" means. I suspect it means different things for different people.
But with Linux being mostly hobbyist-friendly a number of folks have custom setups and do not want to be forced into the standardized mold for the sake of making it super smooth to transition from Windows.
I have such a setup (using FVWM with customized key bindings and a virtual layout that I like, which cannot work under Wayland), so maybe I can donate some money to Microsoft to keep Windows users less grumpy, rather than bringing yet another eternal September to Linux. I like my xorg, thank you very much :).
I'm not very familiar with Wayland, and the fact that XWayland exists means that I don't really have much sense for whether a given app is using Wayland or not. I also don't do anything very fancy. I have a single, sub-4k monitor and don't use HDR or other things. Am I using Wayland? Sometimes? Most of the time? I'm really not 100% sure.
> Wayland smells like IPv6 to me. No need to switch, and it hurts when you try.
I'm very happy with Wayland, but what a strange comparison to make if you're not. IPv6 is objectively an enormous improvement over IPv4, and the only gripe with it is that it's still not ubiquitous.
I’ll concede that IPv6 has usefulness on the public Internet, where adoption is actually gaining nicely. No issues there really.
However, my comparison is end-user focused (ie. the Linux desktop experience). I should have been more clear about the scope perhaps.
Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature, and again does not serve the consumer, only the producer.
> Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
I'd argue the opposite: IPv6 has lowered complexity for the end user: SLAAC, endless addresses, no need for CIDR – these are all simplifications for the end user.
> Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature,
Some would argue it's a feature. But let's say it's not useful. It's still surely not a bug. An address being publicly routeable doesn't mean you have to route traffic to it. Just don't, if you don't want to.
> and again does not serve the consumer, only the producer.
I'd argue that it simplifies some things for the consumer (see above), and also lets the consumer be a producer more easily. I'd argue that that's a good thing, more in the spirit of the internet. But even if the end user doesn't care, it's not a detriment.
I agree with the parent comment. I have sway on my laptop, i3 on my desktop, and I don't notice any difference. Well, except sharing, and annoying small sway issues with things that work on i3.
Just as I am oblivious to whether this is posted over ipv4 or 6.
That they all have to implement the protocol themselves makes it seem like 20 years of Wayland might actually have hurt Linux more than it fixed; without it, something else would have happened. Think of how many man-hours have been wasted doing the same thing for KDE, Gnome, sway, hyprland, etc.
(also I agree about the publicly available thing, it's a bug for me as well. Companies will harvest everything they can and you better believe defaults matter - aka publicly available, for the producer, but they will say your security, of course)
I've been trying to switch to Wayland and KDE Plasma for some time, but it's just so glitchy. Graphics bugs such as the task switcher showing black or flickery preview thumbnails, or Firefox bringing down the whole system when opening a single 4K PNG, indicate that it's still, unfortunately, very much an alpha.
Interesting, I had these issues around 2 years ago with my Nvidia GPU, making Wayland unusable (especially the honestly probably epilepsy-inducing flicker).
After an Nvidia graphics driver release everything cleared up to be very usable (though occasionally stuff still crashed, like once or twice a week). I heavily dislike Nvidia and went with AMD just around a month ago, zero issues.
My experience is very similar. Just today, I was trying Wayland again but it didn't work out.
One of the obstacles I faced is a wrong resolution. On Xorg I could just add a new mode and get up and running quickly. On Wayland, I have to either do some EDID changes or go through something even worse.
Can anyone recommend an autoclicker they actively use on Wayland? I've been using ydotool but the daemon service is janky (fails to startup/shutdown frequently, also had issues where half my inputs don't work while it's running)
> I've been using ydotool but the daemon service is janky (fails to startup/shutdown frequently,
I'd be investigating that issue instead; there should be errors in systemd/journalctl or whatever you use for managing daemons. I'm using ydotool on Arch, pretty much all defaults, together with a homegrown voice dictation thing, and it's working 100% of the time.
What a fantastic and timely post! Especially coming from the i3 maintainer! Michael did such a diligent analysis and saved me (and hopefully others) a lot of time. I was considering trying Wayland/sway, and this post answered all my questions and showed me that it is not ready, at least for me, yet.
I'm on macOS, and I use XQuartz [1] occasionally for Linux/Unix GUI apps: if something is 'written for' (?) Wayland, can I send its GUI windows across the network (over SSH)?
My question is how long will it take for core necessities like push to talk in discord running in a background tab in my browser while I game with my 50+ closest friends to work under wayland. I hope I don’t develop a need for accessibility tooling the next couple of decades given the current progress.
I looked into this lately - Discord needs to use the Global Shortcuts Portal to do it properly, but how is unclear. Discord is based on Electron, which is based on Chromium. Chromium has support, and Electron kind of has support since https://github.com/electron/electron/pull/45171, but this seems to be rather unknown and unused. Also, somewhere in this API chain keyup events are lost, meaning that only "normal" shortcuts would work, but not push-to-talk. There are multiple options for Discord to implement this: implement the Global Shortcuts Portal directly, go via Electron's global shortcuts API, hook into Chromium's shortcuts API, maybe others - with the caveat that some of those don't support keyup events. The Vesktop devs are currently stuck in the same dilemma: https://github.com/Vencord/Vesktop/issues/18
> Sometimes, keyboard shortcuts seem to be executed twice!
Sounds like someone made a listener that listens on key events, but didn't bother to check the state of the event, meaning it hits releases as well. Should be easy to verify by keeping them pressed long enough to trigger the key repeat events.
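A minimal sketch of that bug, with entirely made-up event names (not any real toolkit's API): a handler that matches on the key alone fires on both press and release, while one that also checks the event state fires once per keystroke.

```python
# Hypothetical illustration of the "shortcut fires twice" bug described
# above. KeyEvent, PRESSED, and RELEASED are invented names for this
# sketch, not a real toolkit API.

from dataclasses import dataclass

PRESSED, RELEASED = "pressed", "released"

@dataclass
class KeyEvent:
    key: str
    state: str  # PRESSED or RELEASED

def buggy_handler(events, shortcut="F5"):
    # Matches every event for the key, so the release triggers it too.
    return sum(1 for e in events if e.key == shortcut)

def fixed_handler(events, shortcut="F5"):
    # Only reacts to the press: one keystroke, one action.
    return sum(1 for e in events if e.key == shortcut and e.state == PRESSED)

events = [KeyEvent("F5", PRESSED), KeyEvent("F5", RELEASED)]
print(buggy_handler(events))  # → 2 (the shortcut "executed twice")
print(fixed_handler(events))  # → 1
```

As the comment suggests, holding the key long enough to trigger key-repeat events would make the buggy variant fire even more often.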
> I also noticed that font rendering is different between X11 and Wayland! The difference is visible in Chrome browser tab titles and the URL bar, for example:
Tab title display is not owned by Wayland unless you are running with the client-side decoration extension, which Gnome is not. So the application and the GUI framework (GTK in this case) are really the only two places to look.
Try using the Zoom client with screen sharing. It doesn't work, and so on; many applications are limited in functionality. People say it's the year of Linux in 2026 and xorg is dead, but it's not even close to working for basic functionality. You can blame the vendors, but as long as user-facing functionality is broken, it's not a working solution.
Yes, it now works in Zoom, but not in Webex, unfortunately. That's been a big obstacle for me. I'd need to be able to share individual windows with audio.
> But rather quickly, after moving and resizing browser windows, the GPU process dies with messages like the following and, for example, WebGL is no longer hardware accelerated:
Is this specific to the WM he used or does HW acceleration straight up not work in browsers under Wayland? That to me seems like a complete deal breaker.
For me, a no-go for Wayland is the lack of support in LXDE and Xfce, which are very good, lightweight, out-of-the-box user-friendly DEs. There is LXQt, and Xfce has initiated the migration, but until then it isn't worth it. Other hurdles, like multiscreen, multi-seat, and Nvidia, are less important.
Wayland being contemporary with the financial crisis makes sense in my head but
I'll probably spend the rest of today processing that it's 18 years ago.
Does hot plugging work right yet? Was quickly discouraged when KVM caused crashes and the open issue said "you're holding it wrong, buy edid emulators"
It's fun how most of the complaints are like "it works fine on Gnome but I will still blame Wayland because my tiling WM doesn't support it". So maybe try using a proper Wayland implementation
The Chrome crashes when resizing a window don't make any sense, other than as a WM fault. The Xwayland scaling: again, Gnome has native scaling support. Same for the monitor resolution problem (which he acknowledged). Same for font rendering. Idk.
GNOME’s “proper wayland implementation” also does not work with my monitor, as I explained in the article:
> By the way, when I mentioned that GNOME successfully configures the native resolution, that doesn’t mean the monitor is usable with GNOME! While GNOME supports tiled displays, the updates of individual tiles are not synchronized, so you see heavy tearing in the middle of the screen, much worse than anything I have ever observed under X11. GNOME/mutter merge request !4822 should hopefully address this.
This reminds me of when pulseaudio came on the scene. Bizarrely there was a short period when PA was superior to everything else. I could set per source and per sink volumes. It was bonkers. The perfect mixer. Then something else happened.
Don’t know what the deal is with Linux desktop experience. I have encountered various forms of perfection and had them taken away.
Once on my XPS M1330 I clicked to lift a window and then three finger swiped to switch workspace and the workspace switched and I dropped the window. It was beautiful. I didn’t even notice until after I’d done it what an intuitive thing it felt like.
Then a few years later I tried with that fond memory and it didn’t work. Where did the magic go?
Probably some accidental confluence of features broken in some change.
So I've been using Linux desktops for decades now, and about 2 years ago I finally ditched my gaming-only Windows install to go Linux-only for gaming as well.
I mean, it works a lot better than it did before, but I still wouldn't recommend it to someone who isn't ready to tinker to make stuff work.
The reason I mention this is that while most normal desktop/coding stuff works okay with Wayland, as soon as I try any gaming it's just a sh*tshow. From stuff that doesn't even start (but works when I run on X) to heavily increased performance demands from games that run a lot smoother on X.
While I have no personal relation to either of them, and I couldn't technically care less which to use: if you are into gaming, at least in my experience, X is right now still the more stable solution.
1) Hugely enjoyable content - as usual - by Michael Stapelberg: relevant, detailed, organized, well written.
2) I am also an X11 + i3 user (and huge thanks to Michael for writing i3, I'm soooo fast with it), I also keep trying wayland on a regular basis because I don't want to get stuck using deprecated software.
I am very, very happy to read this article, if only because it proves I'm not the only one and probably not crazy.
Same experience he has: every time I try wayland ... an unending succession of weird glitches and things that plain old don't work.
Verdict: UNUSABLE.
I am going to reiterate something I've said on HN many times: the fact that X11 has design flaws is a well understood and acknowledged fact.
So is the fact that a new solution is needed.
BUT, the fact that Wayland calls itself the shiny new thing that's supposed to be that solution DOES NOT AUTOMATICALLY MEAN they actually managed to solve the problem.
As a matter of fact, in my book, after so many years, they completely and utterly failed, and they should rethink the whole thing from scratch.
And certainly not claim they're the replacement until they have reached feature and ease of use parity.
Which they haven't as Michael's article clearly points out.
You are totally free to work on whatever you want to. You don't have to use the software that the Wayland devs (and other developers that like Wayland) produces. You can use and code whatever you want.
The way this article styles the name of the GPU company "nVidia" is really distracting! The company has always referred to itself in all capitals, as in NVIDIA, and only their logos have stylized a lowercase initial n, which leads to perhaps nVIDIA if you want, or nᴠɪᴅɪᴀ for those with skills or, for normal people, just nvidia. But "nVidia" is a mixture of mistakes.
"Jen-Hsun Huang certifies that he is the president and secretary of NVidia Corporation, a California corporation." - ARTICLES OF INCORPORATION OF NVidia Corporation, 1993, filed with the California Secretary of State and available online.
"The name of this corporation is NVIDIA Corporation." - 1995 amendment.
I’ve been using Wayland on Debian 12 since 2023. On an Apple Studio Display (5K) over thunderbolt (the built-in camera, speakers, etc work fine)
I screen share and video call with Slack and Google Meet.
I use alacritty/zsh/tmux as my terminal. I use chromium as my browser, vscode and sublime text as code editors.
Slack, Spotify, my studio mic, my Scarlett 2i2, 10gbe networking, thunderbolt, Logitech unifying receiver…. Literally everything “just works” and has been a joy to use.
The only issues I’ve ever faced have been forcing an app to run native Wayland instead of xwayland (varies from app to app, but usually a CLI flag is needed), and Bluetooth pairing with my Sony noise-canceling headphones, which is unrelated to Wayland. Periodically I get into a dance where they won’t pair, but most of the time they pair fine.
Since there are many Wayland compositors, Wayland clients must be very conservative (don't be fancy) and, most of all, respect the dynamic discovery of interfaces and features and adjust accordingly (from core to stable interfaces).
For instance, a compositor may not support a clipboard, so the "data"-related interfaces must be queried for availability (those interfaces are stable in core) and the client must disable such functionality if they're not there (for instance, the wterm terminal is faulty because it requires the compositor to have such interfaces... but the havoc terminal does it right). I don't know yet whether libSDL3's Wayland support "behaves" properly. The wterm fix is boring but should be easy.
As for Wayland usage, it is probably almost everywhere by now (and Xwayland is there for some level of legacy compatibility).
(I am currently writing my own compositor for AMD GPUs... in RISC-V assembly running on x86_64 via an interpreter)
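The discovery dance described above can be sketched as a toy model. The interface names are real Wayland global names, but everything else here is a simplified illustration, not libwayland's actual API: a conservative client inspects what the compositor advertises via the registry and disables features whose interfaces are absent, instead of assuming they exist (the wterm bug).

```python
# Toy model of Wayland registry discovery: a well-behaved client only
# enables features whose interfaces the compositor actually advertised.
# Interface names are real Wayland globals; the function is illustrative.

def bind_features(advertised_globals):
    """Return the feature set a conservative client should enable."""
    features = {"rendering": True}  # wl_compositor assumed present
    # Clipboard / drag-and-drop require the data-device interfaces.
    features["clipboard"] = "wl_data_device_manager" in advertised_globals
    return features

# A minimal compositor that (legally) omits the clipboard interfaces:
minimal = {"wl_compositor", "wl_shm", "wl_seat", "xdg_wm_base"}
full = minimal | {"wl_data_device_manager"}

print(bind_features(minimal)["clipboard"])  # → False: degrade gracefully
print(bind_features(full)["clipboard"])     # → True
```

A client that instead hard-requires `wl_data_device_manager` at startup would, like wterm, fail outright on the minimal compositor.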
I’ve been using Wayland exclusively for about 2 years.
It’s great.
And when it’s not it gets fixed.
X11 isn’t a project anymore, it’s a nightmare of empty meetings and discussions, no coders.
Wayland is primarily a protocol, but most definitely not a "successor" to
the xorg-server. This is why it does not have - and will never have - the
same feature set. So trying to sell it as "the new shiny thing" after almost
20 (!!!!!) years is simply wrong. One should instead point out that wayland
is a separate way to handle a display server / graphics. There are different
trade-offs.
> but for the last 18 years (!), Wayland was never usable on my computers
I can relate to this a bit, but last year or perhaps even the year before,
I used wayland via plasma on manjaro. It had various issues, but it kind of
worked, even on nvidia (using the proprietary component; for some reason the
open-source variant nouveau works less well on my current system). So I think
wayland was already usable even before 2025, even on problematic computer
systems.
> I don’t want to be stuck on deprecated software
I don't want to be stuck on software that insinuates it is the future when it
really is not.
> With nVidia graphics cards, which are the only cards that support my 8K monitor, Wayland would either not work at all or exhibit heavy graphics glitches and crashes.
I have a similar problem. Not with regards to a 8K monitor, but my ultra-widescreen
monitor also has tons of issues when it comes to nvidia. I am also getting kind of tired of nvidia refusing to fix issues. They are cheap, granted, but I'd love viable alternatives. It seems we have a virtual monopoly situation here. That's not good.
> So the pressure to switch to Wayland is mounting!
What pressure? I don't feel any pressure. Distributions that would only support
wayland I would not use anyway; I am not depending on that, though, as I compile everything from source using a set of ruby scripts. And that actually works, too.
(Bootstrapping via existing distributions is easier and faster though. As stated, trade-offs everywhere.)
> The reason behind this behavior is that wlroots does not support the TILE property (issue #1580 from 2019).
This has also been my impression. The Wayland-specific things such as wlroots, but also other things, just flat out suck. There are so many things that suck in this regard - and on top of that, there is barely any real choice on wayland. Wayland seems to have dumbed down the whole ecosystem. After 20 years, having such a situation is shameful. That's the future? I am terrified of that future.
> During 2025, I switched all my computers to NixOS. Its declarative approach is really nice for doing such tests, because you can reliably restore your system to an earlier version.
I don't use NixOS myself, but being able to have determined system states that work and are guaranteed to work kind of extends the reproducible-builds situation. It's quite cool. I think all systems should incorporate that approach. Imagine you'd no longer need StackOverflow because people in the NixOS sphere solved all those problems already, and you could just jump from one guaranteed snapshot to another one that is guaranteed to also work. That's kind of a cool idea.
The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea to be ruined via horrible jokes of an underperforming programming language ...
> So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
I had a similar impression. I guess things will improve, but right now I feel as if I lose too much for "this is now the only future". And I don't trust the wayland-promo devs anymore either - too much promo, too few results. After 20 years guys ...
> The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea to be ruined via horrible jokes of an underperforming programming language ...
I don’t get the hate for Nix, honestly. (I don’t get the complaints that it’s difficult, either, but I’m guessing you’re not making one here. I do get the complaint that the standard library is a joke, but you’re not making that one either that I can see.) The derivation and flake stuff excepted, Nix is essentially the minimal way to add lazy functions to JSON, plus a couple of syntax tweaks. The only performance-related thing you could vary here is the laziness, and it’s essential to the design of Nixpkgs and especially NixOS (the only config generator I know that doesn’t suck).
I’ll grant that the application of Nix to Nixpkgs is not in any reasonable sense fast, but it looks like a large part of that is fairly inherent to the problem: you’ve got a humongous blob of code that you’re going to (lazily and in part) evaluate once. That’s not really something typical dynamic-language optimization techniques excel at, whatever the language.
There’s still probably at least an order of magnitude to be had compared to mainline Nix the implementation, like in every codebase that hasn’t undergone a concerted effort to not lose performance for stupid reasons, but there isn’t much I can find to blame Nix the language for.
Was expecting some real unproductive and entitled whining based on the title, but was pleasantly surprised - someone actually investigating and debugging their wayland issues rather than putting their head in the sand and screaming “X11 FOREVER!!!”
A rather big problem is that Wayland is just a protocol, not an implementation. There are many competing implementations, like Gnome, KDE and wlroots. The problems you have with one of them might not appear in another. The reference compositor, Weston, is not really usable as a daily driver. So while with Xorg you have a solid base, and desktops are implemented on top of that, with Wayland each desktop is reinventing the wheel, and each of them has to deal with all the quirks of the graphics drivers. I think this is a big problem with the architecture of Wayland. There really should be a standard library that all desktops use. Wlroots aims to be one, but I don't see Gnome and KDE moving to it anytime soon.
X.org picked the right level of abstraction (even if the implementation could use a rewrite). No WM should care about handling raw inputs or be forced to be a proxy between the driver and the app for output (it could be, if it needed or wanted to, but there is no reason to add another layer of abstraction and cycle-wasting for most use cases). And it shows in complexity and even in power use. Wayland basically failed to learn the lessons from X11.
That's easy to say in hindsight. It is only with the specific failures of Wayland that we see which lessons it could have learned from X11.
No, the lesson of “separate display server from window manager” was very clear when Wayland was started. People have been discussing this over the years ever since. (See also “client-side decorations” for another part of this issue that was heavily discussed.)
I seem to remember reading in an old paper (1990s?) that the asynchronous nature of the connection between the X server and the window manager results in essentially unfixable protocol-level races.
Maybe this one [1] from 1987? It says:
> There are many race conditions that must be dealt with in input and window management because of the asynchronous nature of event handling. [...] However, not all race conditions have acceptable solutions within the current X design. For a general solution it must be possible for the manager to synchronize operations explicitly with event processing in the server. For example, a manager might specify that, at the press of a button, event processing in the server should cease until an explicit acknowledgment is received from the manager.
[1] https://web.mit.edu/6.033/2006/wwwdocs/papers/protected/xwin...
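The synchronization mechanism the quoted passage asks for can be modeled in a few lines. This is purely a toy illustration of the idea, not any real X or Wayland code: on a button press the server freezes event delivery until the manager sends an explicit acknowledgment, so no further events are processed against state the manager hasn't finished updating.

```python
# Toy model of the explicit server/manager synchronization the paper
# proposes: a button press freezes event delivery until the manager acks.
# All names and behavior here are invented for illustration.

from collections import deque

class Server:
    def __init__(self):
        self.queue = deque()
        self.frozen = False
        self.delivered = []

    def post(self, event):
        self.queue.append(event)
        self.pump()

    def pump(self):
        # Deliver queued events unless we're waiting on the manager.
        while self.queue and not self.frozen:
            ev = self.queue.popleft()
            self.delivered.append(ev)
            if ev == "button-press":
                self.frozen = True  # cease processing until acked

    def ack(self):
        # The manager has finished reacting; resume delivery.
        self.frozen = False
        self.pump()

s = Server()
s.post("motion")
s.post("button-press")
s.post("key")          # held back: the server is frozen
print(s.delivered)     # → ['motion', 'button-press']
s.ack()
print(s.delivered)     # → ['motion', 'button-press', 'key']
```

Without the freeze, the "key" event could race ahead of whatever window rearrangement the manager performs in response to the press, which is exactly the class of race the paper describes.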
I went and checked, and it was an SPE article from 1990[1]: “Why X is not our ideal window system” by Gajewska, Manasse, and McCormack. Section 3 has extensive diagrams of various races, including ones between clients and the window manager. Was even discussed here at one point[2], it turns out. Where I came across it I’m not sure (on X.org[3] possibly?).
[1] https://dx.doi.org/10.1002/spe.4380201409, https://people.freedesktop.org/~ajax/WhyX.pdf or http://os.4uj.org/WhyX.pdf
[2] https://news.ycombinator.com/item?id=15120308
[3] https://x.org/wiki/XorgDeveloperDocumentation/
Wayland has a philosophy of "every frame is perfect", which means fixing every race condition. However, X11 doesn't have this philosophy. If the window manager is slow and doesn't respond to a notification that a window has been resized, drawing the new window content over the old borders is the correct thing to do. What sense does it make to freeze the whole display just for a window border?
Similarly, tearing gets pixels to the screen faster.
Reacting somehow to user input, even if not perfect, is more important for me... That's why we have HW cursors, and frame interpolation in games, etc...
Tearing and hidpi are why I left Linux for Windows between 2012 and 2022. Once Wayland was good enough, I returned. Tearing is awful and should be opt-in (which Wayland provides), not opt-out.
Conversely, I much prefer the lowest latency at the cost of tearing; when I'm forced to use Windows I generally disable the compositor there too whenever I can (I certainly don't use one under Linux, and that's one of my reasons for being there). I find macOS unusable; even on brand-new top-end Mac Studios, the input lag and the slow reaction of the OS to... any user input is frightening when you're used to your computer reacting instantly.
The races I recall being described were substantially worse, but that’s largely beside my point.
My point is that, now that bare fillrate and framebuffer memory haven’t been a limiting factor for 15 to 20 years, it is a reasonable choice to build a desktop graphics system with the invariant of every frame being perfect—not even because of the user experience, but because that allows the developer to unequivocally classify every imperfect frame as a bug. Invariants are nice like that. And once that decision has been made, you cannot have asynchronous out-of-process window management. (I’m not convinced that out-of-process but synchronous is useful.) A reasonable choice is not necessarily the right choice, but neither is it moronic, and I’ve yet to see a discussion of that choice that doesn’t start with calling (formerly-X11) Wayland designers morons for not doing the thing that X11 did (if in not so many words).
To be clear, I’m still low-key pissed that a crash in my desktop shell, which was deliberately designed as a dynamic-language extensibility free-for-all in the vein of Emacs or TeX, crashes my entire graphical session, also as a result of deliberate design. The combination of those two reasonable decisions is, in fact, moronic. But it didn’t need to be done that way even on Wayland.
Perfect frames are what Mac and Windows provide and what Linux should also aim for. Border tearing is a display bug; correctness should come first, and Wayland's approach is right. X was designed for CPU and IO constraints that no longer apply. _Graceful_ degradation of a slow UI should lower the frame rate, not compromise the rendering of individual frames.
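The "lower the frame rate, never tear" policy boils down to waiting for the next vblank: if a frame misses its deadline, it is shown a refresh (or two) late rather than scanned out half-drawn. A tiny sketch with illustrative numbers (a ~60 Hz refresh of 16.7 ms; not any real compositor's code):

```python
# Sketch of vblank-aligned presentation: a frame is shown at the first
# vblank at or after it finishes rendering. A slow frame lowers the
# frame rate; it never produces a torn image. Numbers are illustrative.

import math

def vblank_for(finish_ms, refresh_ms=16.7):
    """First vblank boundary at or after the frame finishes rendering."""
    return math.ceil(finish_ms / refresh_ms) * refresh_ms

# A fast frame (5 ms) makes the very next vblank; a slow one (40 ms)
# slips to the third vblank instead of tearing mid-scanout.
print(round(vblank_for(5), 1))   # → 16.7
print(round(vblank_for(40), 1))  # → 50.1
```

Tearing, by contrast, corresponds to swapping buffers the moment rendering finishes, mid-scanout: lower latency, as the sibling comment prefers, but the displayed image mixes two frames.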
Your memory is insanely impressive to remember such an obscure article from so long ago.
>That's easy to say in hindsight
That's an easy way to excuse bad design. Look at the designs of other operating systems designed by professionals and you won't see window managers having to handle raw inputs or being in the same process as the compositor.
Examples of other operating systems allegedly not designed by professionals:
https://en.wikipedia.org/wiki/Desktop_Window_Manager
The Desktop Window Manager is a compositing window manager, meaning that each program has a buffer that it writes data to; DWM then composites each program's buffer into a final image.
https://web.archive.org/web/20040925095929/http://developer....
The Quartz Compositor layer of Mac OS X comprises the window server and the (private) system programming interfaces (SPI) implemented by the window server. In this layer are the facilities responsible for rudimentary screen displays, window compositing and management, event routing, and cursor management.
The window server is a single system-wide process that coordinates low-level windowing behavior and enforces a fundamental uniformity in what appears on the screen. It is a lightweight server in that it does not do any rendering itself, but instead communicates with the client graphics libraries layered on top of it. It is “agnostic” in terms of a drawing model.
The window server has few dependencies on other system services and libraries. It relies on the kernel environment’s I/O Kit (specifically, device drivers built with the I/O Kit) in order to communicate with the frame buffer, the input infrastructure, and input and output devices.
Window management on Windows is done by Explorer, which talks to DWM, where the underlying windows live.
Window management on MacOS is done by Dock which talks to Quartz Compositor where the underlying windows live.
You are conflating Window Manager with Task Switcher programs.
No, I'm not. Explorer and Dock are responsible for more than just that.
Sorry but you’re just wrong. Explorer.exe and Dock.app are mere user interfaces and are not involved in the render pipeline of other apps.
I am talking about window management. Window management is about controlling windows; window managers should not care about how windows are rendered.
1. Nobody else is talking about managing windows as a user. They’re talking about the system that manages windows for drawing and interaction.
2. You’re provably wrong even if someone followed your description because you can kill the dock or explorer process and still be able to switch between windows and move them around. Killing explorer is a little more heavy handed than killing the dock but it doesn’t take down the window manager.
Explorer.exe and Dock.app have nothing to do with anything anyone is talking about here.
No, this was a major point of discussion right at the start. They chose to ignore it.
They should have looked at Plan9 and the Rio window manager there.
I don’t know how GPU acceleration would have fit in, but I bet it would have been trivial provided the drivers were sufficient.
All of Rio in Plan9 is 6K lines of code and it’s a more powerful display protocol and window manager (all of the fundamentals are there but none of the niceties) than anything else I’ve ever seen.
The defacto way to remote into a Plan9 system from any OS even today is to use a client side program which implements it all the same way Plan9 does.
The beauty with Free Software and Linux distros is that "they" don't have to do it, anyone who wants to (including you!) can do it.
I think the lack of base abstraction layer was pretty obvious from the start.
Since this is a Wayland thread, obviously the problem is a lack of a common implementation, which deviates from UNIX tradition.
For those who want to complain how lack of choice between multiple implementations is an obvious problem and deviates from UNIX tradition, please wait until the next systemd thread.
Weird strawman, but you do you.
The blanket statement "right level of abstraction" betrays a pretty narrow minded view. Right abstraction for what?
The big thing to me is, Wayland servers have way way less responsibility than X. X had a huge Herculean task, of doing everything the video card needed. It was a big honking display server because it took up a huge chunk of the stack to run a desktop.
Wayland servers all use kernel mode setting (KMS), kernel buffers, and much more. So much of the job is already done. There is a huge shared code base that Wayland has that X never had: good kernels with actual drivers for GPUs.
If we wanted one stable platform that we could not innovate on, that was what it was, and we all had to deal with it... we'd all just use a Mac. punchyHamster is saying the Cathedral is the right model and the Bazaar is the bad one, of the famous Cathedral vs. Bazaar.
But that model really does not enable fast iteration & broader exploration of problem spaces. The ask doesn't even make sense: there are incredibly good libraries for making Wayland servers (wlroots, smithay, more). And they're not even always huge, but they do all the core protocols. Some people really want professional, industrial-strength software that they never have to think about, that only works one way and will only evolve slowly and deliberately. I'm thankful as fuck Wayland developers aren't catering to these people; I think that's the wrong abstraction for open source and the wrong attitude if we want timeless systems to be built, grown, and evolved. We should avoid critical core dependencies, so that what we send into the future isn't tied to particular code-bases. That seems obvious, and proposing otherwise is to consign ourselves to small, limp fates.
> Wayland basically failed to learn the lessons from X11
To me the biggest issue of Wayland is that it aimed, on purpose, to imitate Windows or OS X or any GUI that is not built on the idea of a client/server protocol.
From TFA:
> I’ll also need a solution for running Emacs remotely.
If only there was something conceived from the start as a client/server display protocol...
For remote applications, Waypipe works fine for me at least.
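For reference, the basic Waypipe usage is a one-liner (a sketch; the host name is a placeholder, and waypipe must be installed on both ends):

```
# Proxy a remote Wayland client over ssh; its windows
# appear on the local compositor.
waypipe ssh user@remote-host emacs
```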
I use it as my daily driver. I used Sway for a very long time, tried Hyprland for a bit and am now running niri as my daily driver. Sway and niri are wlroots based, Hyprland at some point rolled its own because they didn't want to wait for wlroots protocol extensions. Sometimes I have to switch to Gnome to do screen sharing.
2026 and you will still run into plenty of issues with random behaviour, especially if you run anything based on wlroots. Wine apps will randomly have pointer location issues if you run multiple displays. Crashes, video sharing issues with random apps, 10 bit issues. Maybe in 2027 we'll finally make it. But I feel like these 20 years of development could have been better spent on something that doesn't end up with 4 or more implementations.
There also isn't nearly as much choice for wms. My favorite WM is cwm, but the closest alternative on Wayland is Hikari which is abandoned.
I noticed it's far far more work to build a wm for Wayland than it is for Xorg.
niri is based on smithay which is also used by COSMIC.
I used rust for my sleep daemon, but personally I think rust is a suboptimal language for efficiently writing wayland code.
> The problems you have with one of them might not appear in another.
Because both have their own portal implementation/compositor with their own issues and service spec implementations. KDE has xdg-desktop-portal-kde, and GNOME has xdg-desktop-portal-gnome. On top of that each (still) has their own display server; KDE has KWin, and GNOME has Mutter.
> The reference compositor, Weston, is not really usable as a daily driver.
Weston is probably good for two things: Running things in Kiosk mode and showcasing how to build a compositor.
That's why you should at least use xdg-desktop-portal if you are not running KDE or GNOME. But this is a vanilla compositor (without implementations of any freedesktop desktop protocols), and as-is has no knowledge of things like screenshots or screensharing.
If you run any wlroots based compositor except Hyprland you should run xdg-desktop-portal-wlr which does implement the desktop protocols org.freedesktop.impl.portal.Screenshot and org.freedesktop.impl.portal.ScreenCast.
If you use Hyprland you should run its fork xdg-desktop-portal-hyprland instead, which additionally has things like file picking built in. Additionally, you can/should run xdg-desktop-portal-gtk and/or xdg-desktop-portal-kde to respectively get GTK ("GNOME") and Qt ("KDE") specific implementations of the desktop protocols. And you absolutely should use xdg-desktop-portal-gtk instead of xdg-desktop-portal-gnome, because xdg-desktop-portal-gnome really doesn't like to share with others.
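For anyone wiring this up by hand: recent xdg-desktop-portal releases let you pin these choices in a `portals.conf` instead of relying on environment detection. A sketch (backend names are the usual suffixes like `wlr` and `gtk`; check which ones your distro actually ships):

```
# ~/.config/xdg-desktop-portal/portals.conf
[preferred]
# Generic interfaces (file picker etc.) go to the GTK backend
default=gtk
# Screenshots and screencasting go to the wlroots backend
org.freedesktop.impl.portal.Screenshot=wlr
org.freedesktop.impl.portal.ScreenCast=wlr
```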
> With Wayland the each desktop is reinventing the wheel
Not really true, as I mentioned earlier there's still a DE specific display server running in the background (like Mutter and KWin-X11 for X11), and graphics in each compositor is driven directly by the graphics driver in the kernel (through KMS/DRM).
In fact, on paper and in theory, the architecture looks really good: https://wayland.freedesktop.org/architecture.html. However, in practice, some pretty big chunks of functionality at the protocol level are missing, but the freedesktop contributors and the GNOME and KDE teams will get there eventually.
The fact that we need the entire xdg-desktop-portal stack for screen sharing on browsers is a major annoyance. We now have a standardised extension for screencasting and screencopy (formerly it was not standard, but had been around for years), but browsers only support the Flatpak stack, which has a lot of moving parts and IPC. Doing out-of-band IPC for this is kind of pointless when the client and the server already have a Wayland connection to begin with.
Outside of the domain of Firefox/Chromium, screencasting is much more seamless. But 90% of the screen-sharing happens in browsers.
> Outside of the domain of Firefox/Chromium, screencasting is much more seamless
Not always. In my experience Zoom screencasting is much, much worse than on browsers in Wayland. But that isn't terribly surprising given how generally bad Zoom UX is on Linux.
> but browsers only support the Flatpak stack
Well, I think you should blame Google and Mozilla for that.
KDE (through Discover, https://apps.kde.org/discover/) and GNOME (through Software, https://apps.gnome.org/Software/) both have innate support for Flatpak.
So, given that the majority of normie Linux users will use Flatpak to install a browser, they will just use and support that in the browser (because the underlying DE will more than likely have Flatpak support integrated too) and go on with their day to day.
That means people who don't want to deal with Flatpak have to deal with Flatpak (or at least parts of it) too, unfortunately.
Most distros come with Firefox which most normies will simply use as is.
Also, software stores show both native and flatpak or on Ubuntu snap. One can easily install the system package of chrome if one doesn't want to deal with flatpak.
The real problem with post-X compositors is that the Wayland developers assumed that the compositor developers would form additional working groups (an input protocol, a window management protocol, etc.) on top of the working group that exclusively focuses on display, aka Wayland. Wayland was supposed to be one protocol out of many, with the idea being that if Wayland ever turns out to be a problem it is small in scope and can be replaced easily.
People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
There is also a misunderstanding of the ideology the Wayland developers subscribe to. They want Wayland to be display only, but that doesn't mean they would oppose an input protocol or a window protocol. They just don't want everything to be under the Wayland umbrella like systemd.
> People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
Now, if only people deciding to replace X11 with Wayland heeded your suggestion...
If they thought this then they misunderstood the people, the problem space, and basically everything of importance.
A very insightful comment. I was a victim to exactly the misunderstanding you explained (as are many other commenters here). Thank you!
Technically, X is also just a protocol. But there was just one main implementation of the server (X.org), and just a couple implementations of the client library (xlib and xcb).
There isn't any technical reason we couldn't have a single standardized library, at the abstraction level of wlroots.
Every major DE had its very own compositing implementation back in X11, so what was "easy" got to be more standardized, and what was hard remained so.
I still don't know why I would want to use it. The benefits don't seem to outweigh the costs yet, and xorg is tried and true. So many Linux articles and forum posts about fixing problems with your desktop graphics start with "If you're using Wayland, go back to xorg, it'll probably fix the problem you're seeing."
You don't always have to replace something that works with something that doesn't but is "modern."
My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
There's no obvious reason for an end user to switch to Wayland if there aren't any particular problems with their current setup; the main improvements come down to things X11 never supported particularly well and are unlikely to be used in many existing X11 setups. My big use case that Wayland enabled was being able to dock my laptop and seamlessly switch apps between displays with different scale factors. And as an added bonus my experience has been that apps, even proprietary ones like Zoom, tend to handle switching scale factors completely seamlessly. It's not of the highest importance, but I do like polish like this. (Admittedly, this article outlines that font scaling on Sway apparently doesn't handle this as gracefully. The protocols enable seamless switching, but of course, they can't really guarantee apps will always render perfect frames.)
OTOH though, there are a lot of reasons for projects like GNOME and KDE to want to switch to Wayland, and especially why they want to drop X11 support versus maintaining it indefinitely forever, so it is beneficial if we at least can get a hold on what issues are still holding things up, which is why efforts like the ones outlined in this blog post are so important: it's hard to fix bugs that are never reported, and I especially doubt NVIDIA has been particularly going out of their way to find such bugs, so I can only imagine the reports are pretty crucial for them.
So basically, this year the "only downsides" users need to at least move into "no downsides". The impetus for Wayland itself is mainly hinged on features that simply can be done better in a compositor-centric world, but the impetus for the great switchover is trying to reduce the maintenance burden of having to maintain both X11 and Wayland support forever everywhere. (Support for X11 apps via XWayland, though, should basically exist forever, of course.)
> having to maintain both X11 and Wayland support forever everywhere
I don't get why X11 shouldn't work forever. It works today. As you said, there's no obvious reason for an end user to switch to Wayland if there isn't any particular problems with their current setup. "Because it's modern" and "Because it's outdated" just aren't compelling reasons for anyone besides software developers. And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
X11 as a protocol will probably continue to work ~forever.
X11 as a display server will continue to work ~forever as long as someone maintains a display server that targets Linux.
KDE and GNOME will not support X11 forever because it's too much work. Wayland promises to improve on many important desktop use cases where X.org continues to struggle and where the design of X11 has proven generally difficult to improve. The desktop systems targeting Linux want these improvements.
> "Because it's modern" and "Because it's outdated" just aren't compelling reasons for anyone besides software developers.
I can do you one better: that's also not really compelling to software developers either most of the time. I beg you to prove that the KDE developers pushed Wayland hard because they badly wanted to have to greatly refactor the aging and technical debt heavy KWin codebase, just for the hell of it. Absolutely not.
The Wayland switchover that is currently ongoing is entirely focused on end users, but it's focused on things they were never able to do well in X11, and it shows. This is the very reason why Wayland compositors did new things better before they handled old use cases at parity. The focus was on shortcomings of X11 based desktops.
> And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
Yeah. Except Apple is one of the five largest companies in the United States and GNOME and KDE are software lemonade stands. I bet if they could they would love to handle this switchover in a way that puts no stress on anyone, but as it is today it's literally not feasible to even find the problems that need to be solved without real users actually jumping on the system.
This isn't a thing where people are forcing you to switch to something you don't want under threat of violence. This is a thing where the desktop developers desperately want to move forward on issues, they collectively picked a way forward, and there is simply no bandwidth (or really, outside of people complaining online, actual interest) for indefinitely maintaining their now-legacy X11-based desktop sessions.
It actually would have been totally possible, with sufficient engineering, to go and improve things to make it maintainable longer term and to try to backport some more improvements from the Wayland world into X11; it in fact seems like some interested people are experimenting with these ideas now. On the other hand though, at this point it's mostly wishful thinking, and the only surefire thing is that Wayland is shipping across all form factors. This is no longer speculative, at this point.
If you really want to run X.org specifically, that will probably continue to work for a decently long time, but you can't force the entire ecosystem to all also choose to continue to support X.org any more than anyone can force you to switch to Wayland.
Sometimes gnome developers out-apple apple in their attitudes, fwiw.
That was the first thing I noticed when I recently went back to messing with Linux distros after 15 years. Booting into Ubuntu and having to use Gnome Tweaks or whatever it’s called for basic customizations was incredibly confusing considering Linux is touted as being the customizable and personal OS. I doubt I’ll ever give Gnome another try after that.
Same, so I switched to KDE and life has been good.
I get the impression GNOME 3 is loosely a clone of OS X; I much prefer a Windows-esque desktop. I’ve never tried KDE but feel pretty at home with Xfce or Openbox. YMMV, but if you have the time they’re worth trying if you’re a recent Windows refugee.
GNOME is a much closer match for iPadOS than it is macOS due to how far it goes with minimalism, as well as how it approaches power user functionality (where macOS might move it off to the side or put it behind a toggle, GNOME just won’t implement it at all). Extensions can alleviate that to a limited extent, but there are several aspects that can’t be improved upon without forking.
Funny that you mention this, because broadly GNOME is seen as Linux' MacOS, and KDE as Linux' Android. At least in terms of user customization.
At the end of the day these developers are almost entirely volunteers. Codebases that are a mess, ie X11, are not enjoyable to work on and therefore convincing people to use their discretionary time on it is more difficult. If there wasn't Wayland the current set of developers on Wayland might not have been doing DE work at all.
Attracting new contributors is an existential problem in OSS.
I mean, because maintaining software is hard and costly, and a lot of this is developed by enthusiasts in their spare time?
Supporting legacy stuff is universally difficult, and makes it significantly harder to implement new things.
I prefer Wayland, as I feel its performance is much smoother than Xorg's. Though I have no use for VRR, and I hate the slight lag that is introduced due to font scaling, so I do not use that either.
But, I am stuck on Xorg only because of one app that I have to use to work.
> My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
This is already happening. To my knowledge, Arch Linux and Ubuntu have already switched to GNOME 49, which does not support X without recompilation. So most likely, any distro using GNOME 49 upwards will not provide Xorg by default. KDE is also going to do it soon.
Xorg is going away pretty soon
I believe it's a step in the right direction; the only issue is some annoying apps holding us back.
This is the real reason to make the wayland switch.
It doesn't really matter if you like or dislike wayland, the major DE have decided they don't like X11 and they are making the switch to wayland. X11 code is actively being removed from these desktop environments.
If you want to use X11, you can either stay on an old unmaintained DE or switch to a smaller one that supports X11. But you should realize that with wayland being the thing major DEs are targeting, your experience with X11 will likely degrade with time.
Yes, and besides, developers who don't have to support two servers can focus on improving the DE where it actually matters. And with that, fixing issues and adding features becomes much faster.
I see it as a win for both developers and users in the long run.
> But, I am stuck on Xorg only because of one app that I have to use to work.
Does XWayland help?
Unfortunately no. The app takes automatic screenshots, and the developers are simply not interested in fixing it.
I've had to dive into xorg.conf more than once. Switched to Wayland as soon as it became an (experimental) option in Ubuntu and never looked back. Probably helps that I've always had AMD cards when running Linux, but it has been smooth sailing nonetheless. I can vaguely remember something not working under Wayland in the early days.. Maybe something with Wine or Steam? Anyway, that has to be 10 years ago now.
That's always what's missing from these threads. Wayland is boring actually. It just works. KDE, Sway, Niri, whatever, are all good.
I don't know what to do. The outpouring of negative energy is so severe, but I think it's so incredibly unrepresentative, so misleading. The silent majority problem is so real. Come to the better place.
I have essential workflows using x2x, xev, and xdotool. Apparently this kind of stuff is contrary to Wayland's security model, so I'm stuck on Xorg, and I'm ok with that.
Out of curiosity, what are these workflows?
I don’t use x2x but use xev and xdotool for automated regression testing of GUI tools.
I can’t find a low-effort, high-portability, low-memory way to do it with Wayland.
Local and also in CI pipelines.
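To make that concrete, a sketch of such a test (xdotool and Xvfb are real tools; the app name and window title are hypothetical stand-ins):

```
# Headless X server, handy in CI pipelines
Xvfb :99 -screen 0 1280x800x24 &
export DISPLAY=:99

./my-gui-tool &                              # hypothetical app under test
xdotool search --sync --name "My Tool" windowactivate
xdotool type "hello"                         # synthesize keyboard input
xdotool key Return
```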
wev+ydotool?
Not the GP, but I recall the KeePass password manager using xdotool for its autotype feature. I struggled to get xdotool to work correctly back in 2014 on a Debian 7 personal computer. Not familiar with 'x2x' or 'xev'
Working fractional scaling
I would venture to say that there is little overlap between X11 users and people with high-DPI screens.
We are under an article which tells you that you can have problems with Wayland and hiDPI screens. And for example I’m one of those people, who uses X11, because Wayland failed on many levels, like buggy video playing, crashing while changing monitors, or simply waking up my laptop with an external monitor, and I didn’t give more than a few days to fix these (cheers to the author to try this long), so I went back to X11. Which is still buggy, but on a “you can live with it level” buggy.
Btw, everybody who I know, and I too, changes the font size and leaves the DPI scaling at 100%, or maybe 200%, on X11.
> Btw, everybody who I know, and I too, changes the font size and leaves the DPI scaling at 100%, or maybe 200%, on X11.
Doesn't work if your screens are too different (e.g. 4k laptop screen and 32" desktop monitor).
It does work for Qt and KDE at least.
I have a setup with a high-DPI monitor mixed with a normal-DPI monitor and KDE over Wayland just works fine. The only issues that I found are with LibreOffice doing weird over-scaling and Chrome/Chromium resizing its window into oblivion.
I’ve been using X11 with high-DPI screens since 2013, but with integer scaling (200% or 300%), never fractional scaling.
Nobody's going to buy monitors where they need fractional scaling or multiple monitors with mixed DPI if they know it's broken.
That's a really odd thing to say.
I don't really care about this but here's an example:
I have 2 27" screens, usually connected to a windows box, but while working they're connected to a MBP.
Before the MBP they were connected to several ThinkPads where I don't remember what screen size or scaling, I don't even remember if I used X11 or Wayland. But the next ThinkPad that will be connected will probably be HiDPI and with Wayland. What will happen without buying a monitor? No one knows.
Everyone’s so excited about the wave of Windows users coming to Linux. Those people already have monitors.
I switched in 2018 and was surprised I couldn’t use fractional scaling on one monitor like I’d been doing for years on windows.
Not to mention that fractional scaling is practically required in order to use the majority of higher DPI monitors on the market today. Manufacturers have settled on 4K at 27" or 32" as the new standard, which lends itself to running at around 150% scale, so to avoid fractional scaling you either need to give up on high DPI or pay at least twice as much for a niche 5K monitor which only does 60hz.
Fractional scaling is a really bad solution. The correct way to fix this is to have DPI-aware applications and toolkits. This does in fact work, and I have run Xfce under Xorg for years now on high-DPI screens just by setting a custom DPI and using a high-DPI aware theme. When the goal is to have perfect output, why do people suddenly want to jump to stretching images?
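For context, the X11-side knob the parent is alluding to is roughly this (144 is just an example value for a ~1.5x-density screen):

```
# ~/.Xresources — tell Xft-based toolkits the real pixel density
Xft.dpi: 144
```

followed by `xrdb -merge ~/.Xresources` to load it (and optionally `xrandr --dpi 144` for the core protocol), plus a theme whose assets are drawn for that density.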
The overwhelming majority of the low-DPI external displays at this point are 24–27" 1080p.
Most high-DPI displays are simply the same thing with exactly twice the density.
We settled on putting exactly twice as many pixels in the same panels because it facilitates integer scaling.
That doesn't gel with my experience, 1080p was the de-facto resolution for 24" monitors but 27" monitors were nearly always 1440p, and switching from 27" 1440p to 27" 4K requires a fractional 150% scale to maintain the same effective area.
To maintain a clean 200% scale you need a 27" 5K panel instead, which do exist but are vastly more expensive than 4K ones and perform worse in aspects other than pixel density, so they're not very popular.
Why not give up on high DPI?
Save money on the monitor, save money on the gpu (because it's pushing fewer pixels, you don't need as much oomph), save frustration with software.
4K monitors aren't a significant expense at this point, and text rendering is a lot nicer at 150% scale. The GPU load can be a concern if you're gaming but most newer games have upscalers which decouple the render resolution from the display resolution anyway.
I used to be like this. I actually ran a 14" FHD laptop with a 24" 4k monitor, both at 100%. Using i3 and not caring about most interface chrome was great, it was enough for me to zoom the text on the 4k one. But then we got 27" 5k screens at work, and that had me move to wayland since 100% on that was ridiculously small.
Why not 200% and increase font size slightly in all 3 cases?
Because although I don't care much about the chrome, I sometimes have to use it. For example, the address bar in firefox is ridiculously small. Also, some apps, like firefox (again) have a weird adaptation of the scroll to the zoom. So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
Also, 200% on an FHD 14" laptop means 960x540 px equivalent. That's too big to the point of rendering the laptop unusable. Also, X11 doesn't support switching DPI on the fly AFAIK, and I don't want to restart my session whenever I plug or unplug the external monitor, which happens multiple times a day when I'm at the office.
14" FHD is 157 PPI; 24" 4K is 184 PPI.
These really aren't that far off. If we imagined the screens overlaid semi-transparently, a 16-pixel letter on one would be over a 14-pixel one on the other.
If one imagines an ideal font size for a given user's preference for the physical height of letterforms, one could imagine an idealized size of 12 on one screen and 14 on the other, and setting it to 13 would be extremely close to ideal on both.
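The PPI arithmetic in this subthread is easy to reproduce (a quick sketch; the sizes and resolutions are the ones mentioned above):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The figures from this thread:
print(round(ppi(1920, 1080, 14)))   # 14" FHD laptop  -> 157
print(round(ppi(3840, 2160, 24)))   # 24" 4K monitor  -> 184
# The 27" jump that forces ~150% fractional scaling:
print(round(ppi(2560, 1440, 27)))   # 27" 1440p       -> 109
print(round(ppi(3840, 2160, 27)))   # 27" 4K          -> 163
```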
>So if you zoom at 300%, it will scroll by a lot at a time, whereas 200% is still usable.
This is because it's scrolling a fixed number of lines, which occupy more space at 300% zoom. Notably, this applies pretty much only to people running high-DPI screens at 100%, because if one zoomed to 300% otherwise, the letter T would be the size of the last joint on your thumb and legally blind folks could read it. It doesn't apply to setting the scale factor to 200%, nor to Firefox's internal scale factor, which is independent from the desktop's, supports fractional scaling in 0.05 steps, and can be configured in about:config:
layout.css.devPixelsPerPx
Why to have a home if you can sleep in a cardboard box?
There is no particular reason for this theory to be true. X supports high DPI screens well and has for ages.
Fractional scaling is very common with high-DPI screens. I don't think I'd be able to have 175% scaling on my 14" 3K screen with X11.
Maybe it supports it sure, the problem is that it doesn't work at all.
It does work and has worked for over a decade. You can configure scaling under settings in Cinnamon or plasma for instance or via environmental variables in a simple environment like i3wm.
The post is from the dev of i3wm, an X11 window manager, complaining among other things about how well his 8K monitor works under X11 and how poorly it works under Wayland.
You can also consult the arch wiki article on high DPI which is broadly applicable beyond arch
Yes, I know all that. Except it doesn't work. At all.
Ten years ago there were cursor clipping issues, cursor coordinates issues and crashes and I've been home-baking patches for that.
Also it was impossible for one X session to span across two GPUs. Dunno if that was improved.
Now it's bit better, but for sure your amdgpu will entertain you with little nice crashes when you run something heavy on a scaled display.
I'm not even talking about VRR, HDR and all that stuff.
In that time I've had HiDPI work perfectly, first on Nvidia and then recently on AMD GPUs, on several different distros and desktops, all running on X. They all worked out of the box and were able to scale correctly once configured.
The totality of my education on the topic was reading the arch wiki on hidpi once.
AFAIK one cannot span one X session across multiple GPUs, although AMD had something it once referred to as "Eyefinity" for achieving this.
It is rarely needed; discrete GPUs often support 3 or even 4 outputs.
One may wonder if you tried this a very long time ago, back when AMD sucked and Nvidia worked well, in 2005–2015.
The main reason I could imagine is security.
Right now with X11, IIRC, if one application has access to your display they can read what is going on in other applications running on the same display.
If browser tabs were able to do that, all hell would break loose. So why do we accept it from applications?
Anyway, despite this, I still use X11 instead of Wayland because of all the shortcomings.
> If browser tabs were able to do that, all hell would break loose. So why do we accept it from applications?
Because I don't run random untrusted apps all the time. Whereas I do visit random untrusted websites all the time.
But Snap and Flatpak have an advanced permission system, designed so you _can_ run random applications that you don't trust.
That leads to technical debt in the long term. Yes, it might be working well for now but the more outdated it becomes, the harder it will be to maintain later.
For me mainly better HDR implementation.
Does X even support HDR yet? When I looked last time, the answer was no.
Different refresh rates on different displays is just a killer feature for me.
it already is default in many places, and is used by a large percentage
>nVidia refused to support the API that Wayland was using, insisting that their EGLStreams approach was superior
This is a common mischaracterization of what happened. This API, GBM, was a proprietary API that was a part of Mesa. Nvidia couldn't add GBM to their own driver as it is a Mesa concept. So instead Nvidia tried to make a vendor-neutral solution that any graphics driver could use, which is where you see EGLStreams come into the picture. Such an EGL API was also useful for other non-Wayland embedded use cases. In regards to Nvidia's proprietary driver's GBM support, Nvidia themselves had to add support to the Mesa project for dynamically loading new backends that weren't precompiled into Mesa. Then they were able to make their own backend.
For some reason when this comes up people always phrase it in terms of Nvidia not supporting something instead of the freedesktop people not offering a way for the Nvidia driver to work, which is a prerequisite of Nvidia following such guidance.
Sorry, but how can an open source project like Mesa be reliant on a proprietary API?
I mean proprietary API in the sense that the API is solely owned and developed by Mesa. It is not a standardized API, but a custom one specific to their project.
Even today if you use the API your program has to link to Mesa's libgbm.so as opposed to linking to a library provided by the graphics driver like libEGL.so.
OK, leaving aside the fact that "proprietary" has a very well defined meaning in this context and using it makes your comment very charged, you're basically telling us that Nvidia was not willing to implement an API for their drivers, but tried to push for one designed by themselves (you're calling it "vendor neutral", but since Mesa is not an actual GPU vendor it's most likely another subtle mistake on your part that completely changes the meaning of your words), and that all the other vendors (Intel and AMD at this point), which have already implemented GBM, should switch too in the name of this?
How can you call all of that a mischaracterization? In my humble opinion, and I am not anything more than a bystander in this with only superficial knowledge of the domain, it's you that is trying to mischaracterize the situation.
>leaving aside the fact that "proprietary" has a very well defined meaning in this context
Yes, it does, and it is different from the well defined meaning when talking about the software itself. OpenGL is an open API, but the source code for an implementation isn't necessarily open.
>Nvidia was not willing to implement an API for their drivers
They couldn't because this API is a part of Mesa itself. As I mentioned programs link to a Mesa library directly.
>since Mesa is not an actual GPU vendor
They are a driver vendor.
>the other vendors (Intel and AMD at this point), which have already implemented GBM
Support was added to Mesa itself and not to the drivers by those companies. The proprietary, now deprecated, AMD kernel module still doesn't support GBM.
>should switch too in the name of this
I think it is beneficial for standards to be implemented by multiple vendors, so I think they should implement it at least.
>How can you call all of that a mischaracterization?
What people think as Nvidia needing to implement an API is actually an ask for Nvidia to make a Mesa API work.
From my perception, essentially the ask was that Nvidia needed to open source their kernel driver like AMD did, and then eventually an Nvidia GBM backend would be built into Mesa for it. For obvious reasons this was never going to happen. The fact that no agreeable solution was figured out in about a decade, and then Nvidia had to code up that solution for the Mesa project, is a failure on Mesa's end. A lot of user pain happened due to them not being willing to work together with proprietary software and focusing solely on supporting open source drivers.
> For obvious reasons this was never going to happen.
Well, I guess this is the crux of the problem, and for open-source enthusiasts like me this is not obvious at all. What we can surmise is that Nvidia refused to collaborate, and therefore they were the party to blame for their video cards not being supported as well as other vendors' on Linux.
>What we can surmise is that Nvidia refused to collaborate
I saw more effort on Nvidia's side trying to collaborate than on the Wayland side. I think it's unfair to not call out the people who had a hardline stance of only caring about open source drivers and didn't want to do the work to onboard Nvidia.
I think you’re significantly retconning what happened.
Mesa did discuss EGL but felt it wasn’t the right choice. https://mesa-dev.freedesktop.narkive.com/qq4iQ7RR/egl-stream...
In much the same way that NVIDIA may have felt that EGL was the better choice.
However, none of your description of the way things are explains why NVIDIA couldn't have made their own libgbm that matched the symbols of Mesa's and worked on standardizing the API de facto.
It may not just be NVIDIA. From what I understand, any open source solution is stuck with second-rate graphics support on Linux, simply because the groups behind HDMI and other graphics-related standards have overly restrictive licensing agreements. Valve ran directly into that while working on its newest console: the AMD drivers for its GPU cannot legally provide full support for HDMI 2.1.
I've been using wayland with Gnome for years without a single issue.
Arguably my hardware is a lot simpler and I don't use Nvidia. But I just want to point out that, for all the flak wayland receives, it can work quite well.
Me too. But first with Sway in 2016, then with KDE Plasma 6. Everything works flawlessly; everything runs in native Wayland except Steam games. I've preferred AMD or Intel hardware over NVIDIA since forever.
I've used Wayland on Gnome for maybe 1-2 years at this point, always with Nvidia hardware. It works OK now, but didn't 2 years ago, and before that it used to be very janky; today it's smoother than Xorg. At this point, I don't think there is a single blocker left for me. It took some time to rewrite some programs of mine that control their own window position and want to see what other applications are running, but that was easy to work around with a Gnome Shell extension in the end, as the design of Wayland doesn't really "allow" those sorts of things.
I'm having more issues with games/websites/programs that didn't take high display refresh rate into account, than Wayland, at this point.
I remember having a gentleman over, I think to fix something or other, and when he walked into the living room he explained that my CRT monitor was misconfigured and to his perception had a visible flicker. We checked it and it was indeed misconfigured. I couldn't see it, but it was such an aberration to him that he took time away from his actual job to make the flicker go away.
You will also note many items in the post above are papercuts that might go unnoticed, like input feeling a little worse or font issues.
I just recently switched to Linux since I had some weird Windows issues I couldn't fix. I've tried to switch a few times before, but the main problem at some point was that I didn't have proper fractional scaling on Linux. And that alone pretty much made Linux unusable for me on my specific hardware.
Wayland fixes that, so that part is a huge improvement to me. Unfortunately this also limited my choice of distros, as not all of them use Wayland. I landed on Ubuntu again, despite some issues I have with it. The most annoying initially was that the Snap version of Firefox didn't use hardware acceleration, which made it just barely usable.
Yeah, fractional scaling is absolutely the one thing that I miss on Linux. On X11 it's too slow and laggy. On Wayland I have... Wayland issues.
I don't entirely love MacOS (mostly because I can't run it on my desktop, lol). But it does fractional scaling so well, I always choose the "looks like 1440p" scaling on 4K resolution, and literally every app looks perfect and consistent and I don't notice any performance impact.
On windows the same thing, except some things are blurry.
On Linux, yeah, I just have to bear huge UI (2x scaling) or tiny UI (1x), or live with a noticeable performance delay that's just too painful to work with.
Try just setting the correct dpi for your monitor and use a hi-dpi theme. No scaling required. Pixel perfect graphics.
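To illustrate what "correct dpi" means here: physical DPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch, using a hypothetical 14" 2880x1800 panel (the one real value you'd need is your own screen's size and resolution):

```shell
# Compute physical DPI for a hypothetical 14" 2880x1800 panel.
# DPI = sqrt(width_px^2 + height_px^2) / diagonal_inches
dpi=$(awk 'BEGIN { printf "%d", sqrt(2880^2 + 1800^2) / 14 }')
echo "$dpi"   # prints 242, about 2.5x the X11 default of 96
```

That value is what you would then put into `Xft.dpi` (or pass to `xrandr --dpi`) instead of applying a compositor-side scale factor.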
It seems Wayland has fractional scaling, but it's recent. The bottom line is that high DPI should be handled at the GUI toolkit level; compositor scaling is just a dirty fix for legacy GUI apps.
I've been using Wayland (wlroots/swaywm) for a few years now and it's been flawless, even with an eGPU.
But I'm also running all AMD hardware, that may be a factor. Life is too short for nvidia bullshit on Linux.
And I stopped using sway on my Intel integrated graphics because it still crashes more than i3+Xorg. Maybe someday.
I am the same, now. But I did have it working previously on Nvidia and it was good enough. I’ve also used the TILE patch at work and that seemed pretty good on the 5k screens they have there.
I switched to get support for different scaling on different outputs and I have gone back.
The right way to handle high DPI is at the GUI toolkit level. Scaling in the compositor is just a dirty fix for legacy apps.
> Life is too short for nvidia bullshit on Linux.
So much NVidia hate, but in 23 years the only problems I've had with NVidia on Linux were when they dropped support for old GPUs. Even on proprietary hardware like iMacs and MacBooks.
But to each their own.
Even when it worked, it was more clunky than AMD at all times. It came in the form of a huge driver bundle you had to install manually (or rely on a distro to do it for you), while AMD GPUs just worked due to `amdgpu` being part of the kernel since forever. Then there's the EGLStreams debacle, where Nvidia lost a lot of goodwill in my opinion, including mine. And finally, Nvidia has managed to open-source their driver on Linux, except that it's less performant than the closed-source one, and thus still a second-class citizen. Correct me if I am wrong, please.
The better path on Linux was always AMD, and still is, to this day, since it simply works without me needing to care about driver versions, or open vs closed source, at all.
AMD used to be terrible on Linux, perhaps before your time. Nvidia was always the choice if you needed functional hardware acceleration that worked on par with Windows. The Nvidia driver was (still is?) the same driver across platforms with a compatibility shim for each OS. This is how Nvidia managed to have best-in-class 3D acceleration across Windows, FreeBSD, and Linux for decades now. OpenGL support on AMD was historically really bad, and AMD support was through a proprietary driver back in the day as well. Part of the reason the AMD/ATI open source driver gained so much traction and support was that the proprietary driver was so bad! Then you get onto other support for things like CUDA for professional work, where Nvidia has always been light years ahead of any other card manufacturer.
Source: was burned many times by ATI's promises to deliver functioning software over the years. Been using Nvidia on Linux and FreeBSD for as long as I can recall now.
Nvidia and intel on linux for near on 20 years now, and also agree - generally the ATI/AMD experience was markedly worse.
Currently dual 3090s in this box and nvidia is still as simple as just installing the distro package.
There was a period in the mid 2010s where trying to get better battery life on laptops by optionally using the discrete gpu vs the integrated was a real pain (bumblebee/optirun were not so solid), but generally speaking for desktop machines that need GPU support... Nvidia was the route.
Don't love their company politics so much, although I think they're finally getting on board now that so many companies want to run GPU accelerated workloads on linux hosts for LLMs.
But ATI sucked. They seem to have finally gotten there, but they were absolutely not the best choice for a long time.
Hell - I still have a machine in my basement running a GTX970 from 2015, and it also works fine on modern linux. It currently does the gpu accel for whisper speech to text for HA.
It's ridiculous to say that AMD (previously ATI) has always been the better choice, and I don't think anybody who used the ATI drivers would agree with you. For years the only reliable way to get GPU acceleration on Linux was basically NVidia.
When AMD bought ATI they started work on the open source drivers and improved the situation, but they had already lost me as a GPU customer by that point.
Maybe now in the 2020s AMD has caught up, and I'll keep them in mind next time I buy a GPU, but I've been happy with NVidia for a long time.
It would be nice if the NVidia driver were in the kernel and open source, but the Debian package has just worked for a very long time now.
... and this is why Linux has only a 3% share of desktop users. Having to limit oneself to second-rate GPU hardware is a pretty big ask.
Viewed from the Other Side, I'm far more inclined to think that NVidia actually knows what they are doing and the authors of Wayland do not.
Heh, interesting seeing we use pretty much the same things, i3+NixOS+urxvt+zsh+Emacs+rofi+maim+xdotool, only differentiating in browser choice (it's Firefox for me) and (me) not using any term multiplexer.
>So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
Kudos to Michael for even attempting it. Personally, nowadays, unless my working stack stops, well, working, or there are significant benefits to be found, I don't really feel like even putting in the effort to try the shiny new things out.
> Kudos to Michael for even attempting it.
And for taking the time to thoroughly document real issues.
I'm not switching to Wayland until my window manager supports it. It doesn't look like anybody has time to do the work, so I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
I feel like the biggest issue for Wayland is the long tail of people using alternative WMs. A lot of those projects don't have manpower to do what amounts to a complete rewrite.
I honestly don't have a preference between Wayland and X, but I feel very strongly about keeping my current WM. XWayland supposedly works, but I'm not in any hurry to add an extra piece of software and extra layer of configuration for something I already have working exactly the way I want. If Wayland offered some amazing advantages over X, it might be different, but I haven't seen anything to win me over.
> I'm not switching to Wayland until my window manager supports it.
Looking at your github, it seems you use StumpWM. It seems they are also working on a wayland version under the name Mahogany. Development seems pretty active: https://github.com/stumpwm/mahogany
> I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
FWIW I think "wayback" is the project for this. It seems to be trying to use XWayland to run full X11 desktop environments on top of Wayland: https://gitlab.freedesktop.org/wayback/wayback
I've been running Wayland on a Framework laptop and it just works. Drives my 4K external monitor, quickly switches to single screen, does fractional scaling well, runs all my apps without complaint.
I had an old Chromebook which had Lubuntu on it - screen tearing was driving me crazy so I switched to Wayland and it is buttery smooth. No mean feat given the decrepit hardware.
I'm sure someone will be along to tell me that I'm wrong - but I've yet to experience any downsides, other than people telling me I'm wrong.
> I'm sure someone will be along to tell me that I'm wrong - but I've yet to experience any downsides, other than people telling me I'm wrong.
That's fine as long as it goes both ways. If Wayland works for you, great. Equally, for some of us it doesn't work.
Do downsides not exist if you are lucky enough to not experience them?
I don't think Wayland is fully ready, at least not with NVIDIA GPUs with limited GPU memory.
I have a 7,000 word blog post and demo videos coming out this Tuesday with the details but I think I uncovered a driver bug having switched to native Linux a week ago with a low GPU memory card (750 Ti).
Basically, on Wayland, apps that request GPU memory will typically crash if there's no more GPU memory to allocate, whereas on X11 it will transparently offload those requests to system memory, so you can open up as much as you want (within reason) and the system is completely usable.
In practice this means opening up a few hardware-accelerated apps in Wayland, like Firefox and most terminals, will likely crash your compositor or at the very least crash those apps. It can crash or destabilize your compositor because if the compositor itself gets an error allocating GPU memory to spawn the window, it can do whatever weird things it was programmed to do in that scenario.
I reported it here: https://github.com/NVIDIA/egl-wayland/issues/185
Some end users on the NVIDIA developer forums looked into it and determined it's likely a problem for everyone, it's just less noticeable if you have more GPU memory and it's especially less noticeable if you reboot daily since that clears all GPU memory leaks which is also apparent in a lot of Wayland compositors.
For me wayland offers only downsides, without any upsides. I feel the general idea behind it (pushing all complexity and work onto other layers) is broken. I'll stick to xorg and openbox for many years to come.
Moving complexity from the layer you only have one of, to the layers where there are many, many competing pieces of software, was an absolutely bonkers decision.
There's just no way to make that make sense.
It's hard to imagine a statement that could fly more in the face of open source.
It's absolutely an essential characteristic for long term survival, for long term excellence. To not be married to one specific implementation forever.
Especially in open source! What is the organizational model for this authoritarian path? How are you going to - as Wayland successfully has - get every display server person onboard? Who would have the say on what goes into The Wayland Server? What would the rules be?
Wayland is the only thing that makes any sense at all. A group of peers, fellow implementers, each striving for better, who come together to define protocols. This is what made the internet amazing, what made the web the most successful media platform, and what creates the possibility for ongoing excellence. Not being bound to fixed decisions is an option most smart companies lust after, but somehow when Wayland vs X comes up, everyone super wants there to be one and only one path, set forth three decades ago, that no one can ever really overhaul or redo.
It's so unclear to me how people can be so negative and so short and so mean on Wayland. There's no viable alternative organization model for Authoritarian display servers. And if somehow you did get people signed up, this fantasy, there's such a load of pretense that it would have cured all ills? I don't get it.
I think the big part is maintenance: Xorg doesn't look likely to be maintained long into the future in the way Wayland will be. And a lot of the Xorg maintainers are now working on Wayland.
So good or bad idea, Wayland is slowly shifting to being the default in virtue of being the most maintained up to date compositor.
Wayland is not a compositor. Being more maintained than Xorg doesn't mean anything because Wayland doesn't do a tenth of the things Xorg did.
What used to be maintained in one codebase by Xorg devs is now duplicated in at least three major compositors, each with their own portal implementation and who knows what else. And those are primarily maintained by desktop environment devs who also have the whole rest of the DE to worry about.
There's actually a very active fork of Xorg called Xlibre, started by a former Xorg contributor, which seeks to "revitalize and modernize" X: https://github.com/X11Libre/xserver?tab=readme-ov-file
This guy started that Xlibre fork after throwing a fit because he was told not to break Xorg with his contributions, and he ranted that he just wants to be able to merge whatever he wants. I would not trust the stability of that fork at all.
Looks to me like he's a belligerent personality, but probably not wrong when he says Redhat has an agenda that involves suppressing progress on Xorg and forcing Wayland on users instead.
I was open minded toward Wayland when the project was started... in 2008. We are 18 years down the road now. It has failed to deliver a usable piece of software for the desktop. That's long enough for me to consider it a failed project. If it still exists, it's probably for the wrong reasons (or at the least, reasons unrelated to any version of desktop Linux I want to run, like perhaps it has use in the embedded space).
Taking the proposition as true, what goal does Redhat have in "forcing wayland on users"? I am asking this in good faith, I literally naively do not understand what the "bad" bit is.
Like, OK, it's 2030 and X11 is dead, no one works on it anymore and 90% of Linux users use Wayland: what did they gain? I know they did employ Poettering, but not anymore, and AFAIK they contribute a non-trivial amount of code upstream to Linux, Gnome, KDE? If more users are on Wayland they can pressure Gnome to ... what?
I sort of get an argument around systemd and this, in that they can push I guess their target feature sets into systemd and force the rest of the ecosystem to follow them, but, well, I guess I don't get that argument either, cause they can already put any special sauce they want in Redhat's shipped systemd implementation, and if it's good it will be picked up, if it's bad it won't be?
I guess, if Redhat maintains systemd & wayland, then they could choke out community contributions by ignoring them or whatever, but wouldn't we just see forks? Arch would just ship with cooler-systemd or whatever?
Redhat gains at least a few things...
- Maintaining X requires a lot of time, expertise and cost, it's a hard codebase to work with, deprecating X saves them money
- Wayland is simpler and achieves greater security by eliminating features of the desktop that most users value, but perhaps Redhat's clients in security-conscious fields like healthcare, finance and government are willing to live without
So I suspect it comes down to saving money and delivering something they have more control of which is more tailored to their most lucrative enterprise scenarios; whereas X is an old mess of cranky unix guys and their belligerent libre cruft.
There are some parallels to systemd I guess, in that its design rejected the Unix philosophy, and this was a source of concern for a lot of people. Moreover at the time systemd was under development, my impression of Poettering was that he was as incompetent as he was misguided and belligerent - he was also advocating for abandoning POSIX compatibility, and PulseAudio was the glitchiest shit on my desktop back then. But in the end systemd simply appeared on my computer one day and nothing got worse, and that is the ultimate standard. If they forced wayland on me tomorrow something on my machine would break (this is the main point of the OP), and they've had almost 20 years to fix that but it may arguably never get fixed due to Wayland's design. So Wayland can go the way of the dodo as far as I'm concerned.
Can't edit: I mention Poettering above because I remember similar arguments against his stuff (that I also never really fully understood in terms of "end game"), not because I have personal animosity against him or his projects or want to hold him up as an "example of what can go wrong".
There is a massive class of things in open source you can look at from the perspective of "Suppose a megacorp or private equity owns this entity and wants to cut costs as much as possible while contributing as little back to the community/ecosystem as possible... what happens next?" And boom you can suddenly see the Matrix. So in the case of Redhat it's likely just IBM being IBM at the financial level and all these little decisions trend a certain way in the long run because of that
> ...what goal does Redhat have in "forcing wayland on users"?
The same goal any group of savvy corporate employees has when their marquee project has proved to be far more difficult, taken way longer, and required far more resources than anticipated to get within artillery distance of its originally-stated goal?
I've personally seen this sort of thing play out several times during my tenure in the corporate environment.
I honestly don't know what that means; I've never worked in a big company/corporation. Try and disown it? How does that fit with Xlibre's anti-corporate-control stance? I guess if they push Wayland then drop it, we're left with X11 ignored and a non-financially-supported alternative?
I guess I just don't get how the third E in EEE plays out in an open source environment.
Ridiculous; Wayland all in all provides a far better experience than X11, and Wayland projects like Plasma, Hyprland, Sway etc. are very much not failed.
I build my own distro and build Xorg myself: I did witness his Xorg breakage all over the place.
I did welcome happily the revert of all his code.
There is also Phoenix: https://git.dec05eba.com/phoenix/about/
Might be a stupid question, but what's wrong with Xorg?
I know that it wasn't originally conceived to do what it does today, but I've never had any problem using it, and when I tried Wayland I didn't notice any difference whatsoever.
Is it just that it's a pain to write apps for it..?
The real story behind Wayland and X, Linux.conf.au 2013
https://www.youtube.com/watch?v=GWQh_DmDLKQ
https://people.freedesktop.org/~daniels/lca2013-wayland-x11....
All the major arguments in that decades old talk are invalidated with the introduction of DRI3.
Oh, nice--thank you!
Good question.
It makes sandboxing security impossible. The moment a process has access to the Xorg socket, it has access to everything. It is weird that this is oftentimes missing from the discussion, though.
Can't this aspect be improved, vs. switching to something else?
QubesOS and Xpra+Firejail demonstrate security can be improved, including on the X11 side. Solaris had Trusted Extensions. X11Libre has a proposal for using different magic cookies to isolate clients and give dummy data to the untrusted. Keith Packard also proposed something in 2018.
It is already possible today. There are access control hooks provided via XACE. Nobody uses them because the attack scenario is basically non-existent. If you run untrusted malicious apps having full access to your home directory you have big problems anyways. Not giving them access to e.g. the screen coordinates of their windows won't help you much then.
which is exactly why you often don't give your sandboxed applications full access to your home directory :)
> Is it just that it's a pain to write apps for it..?
Other way around: Maintaining Xorg itself is awful.
I often see comments that "everything works perfectly in Wayland", which makes me wonder how many features some people use. I've tried Wayland a few times now and have always noticed small quirks. A few current examples: shading a window leaves an invisible section that can't be clicked where the window was; shading and other window operations are inconsistent across window types (terminal, file manager, etc.); picture-in-picture mode in browsers doesn't maintain aspect ratio; and picture-in-picture doesn't keep "always on top" or its position when enabled (I've managed to fix "always on top" by writing a rule that applies to windows with "Picture in picture" as the title, at least).
KDE Plasma switched to Wayland by default sometime last year, and so far the main issue I run into is that a few screen recording tools I like stopped working. (Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point.) Other than some initial instability with accelerated rendering on my GPU, which was quickly addressed, it kinda just works. I mostly don't notice.
Actually, GPU acceleration was why I initially switched. For whatever reason, this GPU (Radeon VII) crashes regularly under X11 nearly every time I open a new window, but is perfectly stable under wayland. Really frustrating! So, I had some encouragement, and I was waiting for plasma-wayland to stabilize enough to try it properly. I still have the X11 environment installed as a fallback, just in case, but I haven't needed to actually use it for months.
Minor pain points so far mostly include mouse acceleration curves being different and screen capture being slightly more annoying. Most programs do this OS-level popup and then so many follow that up with their own rectangle select tool after I already did that. I had some issues with sdl2-compat as well, but I'm not sure that was strictly wayland's fault, and it cleared up on its own after a round of updates. (I develop an SDL2 game that needs pretty low latency audio sync to run smoothly)
> Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point
I use it extensively, it's easy to use, UI is compact but clear, works perfectly all the time. I honestly don't care that it is unmaintained at this point.
> KDE Plasma switched to Wayland by default sometime last year, and so far the main issue I run into is that a few screen recording tools I like stopped working. (Mostly simplescreenrecorder, which seems to be entirely unmaintained at this point.) Other than some initial instability with accelerated rendering on my GPU, which was quickly addressed, it kinda just works. I mostly don't notice.
FWIW, I have a KDE Wayland box and OBS works for screen recording. Slightly more complex than simplescreenrecorder, but not bad.
I've been using Kooha, but it's painful in a few ways, not least of which having aggressive compression that can't be disabled. SSR was nice because of the reduced time between "decide to record" to "draw a rectangle, done." OBS works, but is very clunky and cumbersome to reconfigure.
At some point I'll get irritated enough to seek out more alternatives and give them a whirl. Such is fate :)
I am still sad that shading windows stopped working with Wayland.
2026 is starting with half-baked NVidia drivers and missing functionality on linux? I am so surprised... did you try 17 different previous versions to get it running in true NV-Linux fashion?
This stuff has been flawless on AMD systems for a couple of years now, with the exception of the occasional archaic app that only runs on X11 (thus shoved in a container).
Flawless on AMD? Absolutely not. 2-3 years ago there was an amdgpu bug that froze my entire laptop randomly, with no recourse beyond the 4-second power button press. After that was fixed, it sometimes got stuck on shutdown. It no longer does that randomly, but all it takes to break it is turning off the power to my external monitor (or the monitor powering off by itself to save energy) or unplugging it, after which it can no longer be used without rebooting, and then it sometimes gets stuck on shutdown.
AMD iGPU driver is broken for me right this very moment: https://issues.chromium.org/issues/442860477?pli=1
Clarification: The AMD iGPU driver (or Chrome) on Ubuntu 24.04 has bugs on your hardware. You could try a newer and different distro (just using a live-USB) to see if that has been fixed.
Me too. I have the freezes-on-shutdown bug on my AMD video adapter.
Does AMD support CUDA yet? Because otherwise you can't use it for video editing.
Ask NV to open up CUDA and then maybe AMD can start even thinking about supporting it :)
Or you could just not use Linux. lol
I recently upgraded to Ubuntu 25.10, and decided to give Wayland another go since X.org isn't installed by default anymore.
Good news: My laptop (Lenovo P53) can now suspend / resume successfully. With Ubuntu 25.04 / Wayland it wouldn't resume successfully, which was a deal breaker.
Annoying thing: I had a script that I used to organize workspaces using wmctrl, which doesn't work anymore so I had to write a gnome-shell extension. Which (as somebody who's never written a gnome-shell extension before) was quite annoying as I had to keep logging out and in to test it. I got it working eventually but am still grumpy about it.
Overall: From my point of view as a user, the switch to Wayland has wasted a lot of my time and I see no visible benefits. But, it seems to basically work now and it seems like it's probably the way things are headed.
Edit: Actually I've seen some gnome crashes that I think happen when I have mpv running, but I can't say for sure if that's down to Wayland.
At this point the primary thing that's keeping me from switching to Wayland (KDE) is lack of support for remote desktop software, especially with multiple monitors...
Hopefully AnyDesk and Remmina will address this issue before KDE ends its mainline X11 support next year.
I've had a similar issue recently and I found that rustdesk[0] works pretty well for casual use despite wayland support being labelled experimental. I use it for pair programming with someone on multiple monitors while I'm on a laptop and all the switching and zooming required worked.
[0] https://rustdesk.com/
:) This is a feature of a Wayland compositor: I don't want it to be able to do remote access.
OTOH, the enthusiasm for breaking legitimate features that people were using has not helped Wayland adoption.
It doesn’t have to be like X11. Presumably, it’d be something you could disable if you’d like.
It’d be very handy if we had a performant remote desktop option for Linux. I could resume desktop sessions on my workstation from my laptop and I could pair program with remote colleagues more effectively.
In the past I’d boot into Windows and then boot my Linux system as a raw disk VM just so I could use Windows’s Remote Desktop. Combined with VMware Workstation’s support for multiple monitors, I had a surprisingly smooth remote session. But, it was a lot of ceremony.
Glad to see a good write-up of Wayland issues. My day-to-day doesn't run into the vast majority of these problems so when I see people melt down over a single trivial seeming Wayland choice about window coordinates then I have a really hard time relating.
This post is a lot more relatable.
As an aside, regarding remote Emacs - I can attest that Waypipe does indeed work fantastically for this. Better than X11 ever worked over the network for me.
I, too, suffer from the "pgtk is slow" issue (only a 4K monitor, though, so it's mitigable and manageable for me).
As an Emacs PGTK user, do you have any experience with modifiers beyond the basic 4? I recently tried to use PGTK Emacs and it seems to not support e.g. Hyper, which is a bummer, because I extensively use Hyper in my keybindings.
Note to people on this thread: the impression the discussions give is that Linux isn't ready for prime time desktop use. I thought Wayland was the latest and greatest, but folks here report issues and even refuse to ever use it.
Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026. If you are a Linux on desktop advocate, read the comments and see why so many are still hesitating.
>I thought Wayland was the latest and greatest, but folks here report issues and even refuse to ever use it.
>Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
Quite ironically, there are people refusing to leave Windows 7, which has been EOS since 2020, because they find the modern Windows UI unbearable. Windows 11 is considered so bad that people are actually switching OSes because of it. I have seen similar comments about OSX/macOS.
The big difference between those and Linux is that Linux users have the choice to reject forced "upgrades" and build very personalized environments. If I had to live with Wayland I could do it, really, even if there are issues, but since my current environment is fine I don't really need or care to. And it's having a personalized environment that makes such a change a chore. If I were using a comprehensive desktop environment like GNOME (as many people do), maybe I wouldn't even notice something had changed underneath.
> Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
LOL
I installed a new Windows 11 yesterday on a fairly powerful machine, and everything lags so much on a brand-new install it's unreal. Explorer takes ~2-3 seconds to be usable. Any app that opens in the blink of an eye under Linux on the same machine takes seconds to start. The Start menu lags. It's just surreal. People who say these things work have just never used something that is actually fast.
I am not sure how people get all these issues. I installed a fresh Windows recently, and I don't see any noticeable slowdowns.
Linux is faster in some places, maybe. But it still has many issues, like some applications not being drawn properly, or some applications simply not being available (e.g. a nice GUI for monitor control over DDC).
There is:
https://github.com/rockowitz/ddcui
Anecdotally, everything works flawlessly on my work machine: Optiplex Micro, Intel iGPU, Fedora KDE 43, 4K 32" primary monitor at 125% scale, 1440p 27" secondary monitor at 100%. No issues with Wayland or with anything else.
Everything actually feels significantly more solid/stable/reliable than modern Windows does. I can install updates at my own pace and without worrying that they'll add an advert for Candy Crush to my start menu.
I also run Bazzite-deck on an old AMD APU minipc as a light gaming HTPC. Again, it's a much better experience than my past attempts to run Windows on an HTPC.
As with everything, the people having issues will naturally be heard louder than the people who just use it daily without issues.
As a long-time Linux user I've also felt an incongruity between my own experiences with Wayland and the recent rush of "year of the Linux desktop" posts. To be fair, I think the motivation is at least as much about modern Windows' unsuitability for prime time as about Linux's suitability. I haven't used Windows for a long time so I can't say how fair that is, but I definitely see people questioning 2026 Windows' readiness for prime time.
For me, Wayland seems to work OK right now, but only since the very latest Ubuntu release. I'm hoping at this point we can stop switching to exciting new audio / graphics / init systems for a while, but I might be naive.
Edit: I guess replacing coreutils is Ubuntu's latest effort to keep things spicy, but I haven't seen any issues with that yet.
Edit2: I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime". You often had to edit config files to get stuff working, and there were frustrating deficits in the application space, but the "desktop" felt fine, with X11, Alsa, SysV etc. Two decades on we're on the cusp of having a reliable graphics stack.
>I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime".
I feel the same and find it a bit strange. I am happy with hyprland on wayland since a few months back but somehow it reminds me of running enlightenment or afterstep in the 90s. My younger self would have expected at least a decade of "this is how the UI works in Linux and it's great" by now.
Docker and node both got started after wayland and they are mature enterprise staples. What makes wayland such a tricky problem?
I too share your sense of incongruity.
But then I try and focus on what each author thinks is important to them and it’s often wildly different than what’s important to me.
But a lot of internet discussion turns into very ego-centric debate, including on here, where a lot of folks who are very gung-ho on the adoption of something (let's say Linux, but it could be anything) don't adequately try to understand that people have different needs, and push the idea of adoption very hard in the hopes that once you're over the hump you might not care about what you lost.
I don't know what "ready for prime time desktop use" means. I suspect it means different things for different people.
But with Linux being mostly hobbyist-friendly a number of folks have custom setups and do not want to be forced into the standardized mold for the sake of making it super smooth to transition from Windows.
I have such a setup (using FVWM with customized key bindings and a virtual layout that I like, which cannot work under Wayland), so can I donate some money to Microsoft to keep Windows users less grumpy and stop them from bringing yet another eternal September to Linux? I like my xorg, thank you very much :).
I'm not very familiar with Wayland, and the fact that XWayland exists means that I don't really have much sense for whether a given app is using Wayland or not. I also don't do anything very fancy. I have a single, sub-4k monitor and don't use HDR or other things. Am I using Wayland? Sometimes? Most of the time? I'm really not 100% sure.
Wayland smells like IPv6 to me. No need to switch, and it hurts when you try.
> Wayland smells like IPv6 to me. No need to switch, and it hurts when you try.
I'm very happy with Wayland, but what a strange comparison to make if you're not. IPv6 is objectively an enormous improvement over IPv4, and the only gripe with it is that it's still not ubiquitous.
I’ll concede that IPv6 has usefulness on the public Internet, where adoption is actually gaining nicely. No issues there really.
However, my comparison is end-user focused (ie. the Linux desktop experience). I should have been more clear about the scope perhaps.
Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature, and again does not serve the consumer, only the producer.
> without an obvious benefit for the end-user.
I guess HDR support, 10/12-bit color, displays with different dpi/refresh rates, etc. are just not really an obvious benefit to you?
> Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
I'd argue the opposite: IPv6 has lowered complexity for the end user: SLAAC, endless addresses, no need for CIDR – these are all simplifications for the end user.
> Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature,
Some would argue it's a feature. But let's say it's not useful. It's still surely not a bug. An address being publicly routable doesn't mean you have to route traffic to it. Just don't, if you don't want to.
> and again does not serve the consumer, only the producer.
I'd argue that it simplifies some things for the consumer (see above), and also lets the consumer be a producer more easily. I'd argue that that's a good thing, more in the spirit of the internet. But even if the end user doesn't care, it's not a detriment.
I agree with the parent comment. I have sway on my laptop, i3 on my desktop, and I don't notice any difference. Well, except screen sharing and a few small annoying sway things that work fine on i3.
Just as I am oblivious to whether this is posted over ipv4 or 6.
That they all have to implement the protocol themselves makes it seem like 20 years of Wayland might actually have hurt Linux more than it fixed; without it, something else would have happened. Think of how many man-hours have been wasted doing the same thing for KDE, gnome, sway, hyprland, etc.
(Also, I agree about the publicly-addressable thing; it's a bug for me as well. Companies will harvest everything they can, and you'd better believe defaults matter. Publicly addressable by default serves the producer, though they'll say it's for your security, of course.)
I've been trying to switch to Wayland and KDE plasma for some time but it's just so glitchy. Graphics bugs such as the tasks switcher showing black or flickery preview thumbnails or Firefox bringing down the whole system when opening a single 4k PNG indicate that it's still unfortunately very much an alpha.
Maybe in another decade or so.
Interesting, I had these issues around 2 years ago with my Nvidia GPU, making Wayland unusable (especially the honestly probably epilepsy-inducing flicker).
After an Nvidia graphics driver release everything cleared up to be very usable (though occasionally stuff still crashed, like once or twice a week). I heavily dislike Nvidia and went with AMD just around a month ago, zero issues.
I'm curious to hear about what hardware you have.
Hmm, I use KDE Plasma with Wayland and have had zero issues. What GPU are you using?
My experience is very similar. Just today, I was trying Wayland again but it didn't work out.
One of the obstacles I faced is a wrong resolution. On Xorg I could just add a new mode and get up and running quickly. On Wayland, I have to either do some EDID changes or go through even worse.
Can anyone recommend an autoclicker they actively use on Wayland? I've been using ydotool but the daemon service is janky (fails to startup/shutdown frequently, also had issues where half my inputs don't work while it's running)
> I've been using ydotool but the daemon service is janky (fails to startup/shutdown frequently,
I'd investigate that issue instead; there should be errors in systemd/journalctl or whatever you use for managing daemons. I'm using ydotool on Arch, pretty much all defaults, together with a homegrown voice dictation thing, and it works 100% of the time.
What a fantastic and timely post! Especially coming from the i3 maintainer! Michael did such a diligent analysis and saved me (and hopefully others) a lot of time. I was considering trying Wayland/sway, and this post answered all my questions and showed me that it is not ready yet, at least for me.
If somewm[1] continues to develop, why not? Otherwise I will not be sacrificing my decade-old work style with awesomewm.
[1] https://github.com/trip-zip/somewm
I'm on macOS, and I use XQuartz [1] occasionally for Linux/Unix GUI apps: if something is 'written for' (?) Wayland, can I send its GUI windows across the network (over SSH)?
[1] https://en.wikipedia.org/wiki/XQuartz
My question is how long will it take for core necessities like push to talk in discord running in a background tab in my browser while I game with my 50+ closest friends to work under wayland. I hope I don’t develop a need for accessibility tooling the next couple of decades given the current progress.
I looked into this lately - Discord needs to use the Global Shortcuts Portal to do it properly, but how is unclear. Discord is based on Electron, which is based on Chromium. Chromium has support, and Electron kind of has support since https://github.com/electron/electron/pull/45171, but this seems to be rather unknown and unused. Although somewhere in this API chain keyup events are lost, meaning that only "normal" shortcuts would work, but no push-to-talk. There are multiple options for Discord to implement this: implement the Global Shortcuts Portal directly, go via the Electron global shortcuts API, hook into the Chromium shortcuts API, maybe others - with the caveat that some of those don't support keyup events. The Vesktop devs are currently stuck in the same dilemma: https://github.com/Vencord/Vesktop/issues/18
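To make the keyup problem concrete, here's a tiny sketch (event names and the replay function are made up for illustration; this is not the actual portal API) of why a shortcut API that drops release events can't implement push-to-talk:

```python
def mic_state(events, deliver_releases):
    """Replay a sequence of 'press'/'release' shortcut events and return
    whether the mic ends up open afterwards."""
    mic_open = False
    for kind in events:
        if kind == "press":
            mic_open = True                     # start transmitting
        elif kind == "release" and deliver_releases:
            mic_open = False                    # stop transmitting
    return mic_open

events = ["press", "release"]
# With keyup delivered, the mic closes when the key is released:
print(mic_state(events, deliver_releases=True))   # False
# If the API drops keyup events, the mic is stuck open forever:
print(mic_state(events, deliver_releases=False))  # True
```

So any of the implementation routes that lose keyup can only support toggle-style shortcuts, not hold-to-talk.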
> Sometimes, keyboard shortcuts seem to be executed twice!
Sounds like someone made a listener that listens on key events, but didn't bother to check the state of the event, meaning it hits releases as well. Should be easy to verify by keeping them pressed long enough to trigger the key repeat events.
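The suspected bug can be sketched in a few lines (a hypothetical event stream, not any real toolkit's API): a handler that ignores the event state fires on both press and release, so every shortcut runs twice.

```python
# Hypothetical event stream: (key, state) pairs, state is "press" or "release".
events = [("F5", "press"), ("F5", "release")]

def naive_handler(events):
    # Buggy: counts every F5 event, including releases.
    return sum(1 for key, _ in events if key == "F5")

def correct_handler(events):
    # Correct: only counts actual key presses.
    return sum(1 for key, state in events if key == "F5" and state == "press")

print(naive_handler(events))    # 2 -- the shortcut fires twice
print(correct_handler(events))  # 1
```

Holding the key down long enough to trigger key-repeat events would, as suggested, distinguish this from other causes.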
> I also noticed that font rendering is different between X11 and Wayland! The difference is visible in Chrome browser tab titles and the URL bar, for example:
Tab title display is not owned by Wayland unless you are running with the client-side decoration extension, which Gnome is not. So the application or the GUI framework (GTK in this case) are really the only two places to look.
About the beginning of the article: Wayland is an alternative to X11, not a successor.
Try using the Zoom client with screen sharing. It doesn't work, and many applications are similarly limited in functionality. People say 2026 is the year of Linux and Xorg is dead, but it's not even close to working for basic functionality. You can blame the vendors, but as long as user-facing functionality doesn't work, it's not a working solution.
Screen sharing works in Zoom now, you'll have to find a new cherry-picked example
Yes, it now works in Zoom, but not in Webex, unfortunately. That's been a big obstacle for me. I'd need to be able to share individual windows with audio.
not sure it is fair to call something that is a show-stopper a cherry-pick
> But rather quickly, after moving and resizing browser windows, the GPU process dies with messages like the following and, for example, WebGL is no longer hardware accelerated:
Is this specific to the WM he used or does HW acceleration straight up not work in browsers under Wayland? That to me seems like a complete deal breaker.
Probably not specific to Sway, but specific to the nVidia driver.
For me a no-go for Wayland is the lack of support in LXDE and Xfce, which are very good, lightweight, out-of-the-box user-friendly DEs. There is LXQt, and Xfce has started the migration, but until that lands it isn't worth it. Other hurdles are less important, like multi-monitor, multi-seat, and Nvidia.
It's amazing that Xfce still manages to fool people into believing it's lightweight. It uses the same resources as Gnome.
multi seat works great with wayland, in fact slightly better than on X, where it also worked fine
Does anyone have a workaround to get i3 fullscreen behaviour in sway?
And is it possible to get fullscreen but within a container (e.g. get rid of browser gui to see more in a small container)
Wayland being contemporary with the financial crisis makes sense in my head but I'll probably spend the rest of today processing that it's 18 years ago.
Does hot plugging work right yet? Was quickly discouraged when KVM caused crashes and the open issue said "you're holding it wrong, buy edid emulators"
It's fun how most of the complaints are like "it works fine on Gnome but I will still blame Wayland because my tiling WM doesn't support it". So maybe try using a proper Wayland implementation
The Chrome crashes when resizing a window don't make any sense, apart from being a WM fault. The Xwayland scaling issue: again, Gnome has native scaling support. Same for the monitor resolution problem (which he acknowledged). Same for font rendering. Idk.
GNOME’s “proper wayland implementation” also does not work with my monitor, as I explained in the article:
> By the way, when I mentioned that GNOME successfully configures the native resolution, that doesn’t mean the monitor is usable with GNOME! While GNOME supports tiled displays, the updates of individual tiles are not synchronized, so you see heavy tearing in the middle of the screen, much worse than anything I have ever observed under X11. GNOME/mutter merge request !4822 should hopefully address this.
This reminds me of when pulseaudio came on the scene. Bizarrely there was a short period when PA was superior to everything else. I could set per source and per sink volumes. It was bonkers. The perfect mixer. Then something else happened.
Don’t know what the deal is with Linux desktop experience. I have encountered various forms of perfection and had them taken away.
Once on my XPS M1330 I clicked to lift a window and then three finger swiped to switch workspace and the workspace switched and I dropped the window. It was beautiful. I didn’t even notice until after I’d done it what an intuitive thing it felt like.
Then a few years later I tried with that fond memory and it didn’t work. Where did the magic go?
Probably some accidental confluence of features broken in some change.
I have nothing to add other than I use bazzite for everything now. Windows gone.
So I've been using Linux desktops for decades now, and about 2 years ago I finally ditched my gaming-only Windows install to go Linux-only for gaming as well.
I mean, it works a lot better than it did before, but I still wouldn't recommend it for someone who isn't ready to tinker in order to make stuff work.
The reason I mention this: while most normal desktop/coding stuff works okay with Wayland, as soon as I try any gaming it's just a sh*tshow. From stuff that doesn't even start (but works when I run on X) to heavily increased performance demands from games that run a lot smoother on X.
While I have no personal attachment to either, and I couldn't care less which of them I use: if you are into gaming, at least in my experience, X is right now still the more stable solution.
I must say:
1) Hugely enjoyable content - as usual - by Michael Stapelberg: relevant, detailed, organized, well written.
2) I am also an X11 + i3 user (and huge thanks to Michael for writing i3, I'm soooo fast with it), I also keep trying wayland on a regular basis because I don't want to get stuck using deprecated software.
I am very, very happy to read this article, if only because it proves I'm not the only one and probably not crazy.
Same experience he has: every time I try wayland ... unending succession of weird glitches and things that plain old don't work.
Verdict: UNUSABLE.
I am going to re-iterate something I've said on HN many times: the fact that X11 has design flaws is a well understood and acknowledged fact.
So is the fact that a new solution is needed.
BUT, Wayland calling itself the new shite that's supposed to be that solution DOES NOT AUTOMATICALLY MEAN they actually managed to solve the problem.
As a matter of fact, in my book, after so many years, they completely and utterly failed, and they should rethink the whole thing from scratch.
And certainly not claim they're the replacement until they have reached feature and ease of use parity.
Which they haven't as Michael's article clearly points out.
You are totally free to work on whatever you want to. You don't have to use the software that the Wayland devs (and other developers that like Wayland) produces. You can use and code whatever you want.
The way this article styles the name of the GPU company "nVidia" is really distracting! The company has always referred to itself in all capitals, as in NVIDIA, and only their logos have stylized a lowercase initial n, which leads to perhaps nVIDIA if you want, or nᴠɪᴅɪᴀ for those with skills or, for normal people, just nvidia. But "nVidia" is a mixture of mistakes.
No, the company has not always referred to itself in all capitals.
https://forums.tomshardware.com/threads/nvidias-name-change....
When I got to know their products, they were nVidia.
"Jen-Hsun Huang certifies that he is the president and secretary of NVidia Corporation, a California corporation." - ARTICLES OF INCORPORATION OF NVidia Corporation, 1993, filed with the California Secretary of State and available online.
"The name of this corporation is NVIDIA Corporation." - 1995 amendment.
I’ve been using Wayland on Debian 12 since 2023. On an Apple Studio Display (5K) over thunderbolt (the built-in camera, speakers, etc work fine)
I screen share and video call with Slack and Google Meet.
I use alacritty/zsh/tmux as my terminal. I use chromium as my browser, vscode and sublime text as code editors.
Slack, Spotify, my studio mic, my Scarlett 2i2, 10gbe networking, thunderbolt, Logitech unifying receiver…. Literally everything “just works” and has been a joy to use.
Only issues I’ve ever faced have been forcing an app to run native Wayland not xwayland (varies from app to app but usually a cli flag needed) and Bluetooth pairing with my Sony noise canceling which is unrelated to Wayland. Periodically I get into a dance where it won’t pair, but most of the time it pairs fine.
Since there are many Wayland compositors, Wayland clients must be very conservative (don't be fancy) and, most of all, respect the dynamic discovery of interfaces and features, adjusting accordingly (from core to stable interfaces).
For instance, a compositor may not support a clipboard, so the "data"-related interfaces must be queried for availability (those interfaces are stable in core) and the client must disable such functionality if they are absent. (For instance, the wterm terminal is faulty because it assumes the compositor has those interfaces, but the havoc terminal does it right.) I don't know yet if libSDL3's Wayland support "behaves" properly. The wterm fix is boring but should be easy.
As for Wayland usage, it is probably almost everywhere by now (and Xwayland is there for some level of legacy compatibility).
(I am currently writing my own compositor for AMD GPUs... in RISC-V assembly running on x86_64 via an interpreter.)
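The discovery pattern described above can be sketched as follows. The interface names are real Wayland globals, but the advertised set is a stand-in for wl_registry global events, not a real connection:

```python
def negotiate(advertised):
    """Enable only the client features whose backing Wayland global
    was actually advertised by the compositor's registry."""
    wanted = {
        "clipboard": "wl_data_device_manager",
        "shm_buffers": "wl_shm",
    }
    return {feature: iface in advertised for feature, iface in wanted.items()}

# A bare-bones compositor without data-device support: the client must
# disable clipboard functionality instead of assuming it exists
# (the wterm-style bug):
print(negotiate({"wl_compositor", "wl_shm"}))
# A full-featured compositor advertises everything:
print(negotiate({"wl_compositor", "wl_shm", "wl_data_device_manager"}))
```

A real client would do this in its wl_registry `global` event handler, binding each interface as it is announced and flipping feature flags off for anything the compositor never advertises.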
If I could in 2020.... Maybe?
Do we have any choice?
I’ve been using Wayland exclusively for about 2 years. It’s great. And when it’s not it gets fixed. X11 isn’t a project anymore, it’s a nightmare of empty meetings and discussions, no coders.
The article already begins with a wrong claim:
"Wayland is the successor to the X server "
Wayland is primarily a protocol, but most definitely not a "successor" to the xorg-server. This is why it does not have - and will never have - the same feature set. So trying to sell it as "the new shiny thing" after almost 20 (!!!!!) years is simply wrong. One should instead point out that Wayland is a separate way to handle a display server / graphics. There are different trade-offs.
> but for the last 18 years (!), Wayland was never usable on my computers
I can relate to this a bit, but last year or perhaps even the year before, I used Wayland via Plasma on Manjaro. It had various issues, but it kind of worked, even on nvidia (using the proprietary component; for some reason the open-source variant nouveau works less well on my current system). So I think Wayland was already usable even before 2025, even on problematic computer systems.
> I don’t want to be stuck on deprecated software
I don't want to be stuck on software that insinuates it is the future when it really is not.
> With nVidia graphics cards, which are the only cards that support my 8K monitor, Wayland would either not work at all or exhibit heavy graphics glitches and crashes.
I have a similar problem. Not with regards to a 8K monitor, but my ultra-widescreen monitor also has tons of issues when it comes to nvidia. I am also getting kind of tired of nvidia refusing to fix issues. They are cheap, granted, but I'd love viable alternatives. It seems we have a virtual monopoly situation here. That's not good.
> So the pressure to switch to Wayland is mounting!
What pressure? I don't feel any pressure. Distributions that would only support wayland I would not use anyway; I am not depending on that, though, as I compile everything from source using a set of ruby scripts. And that actually works, too. (Bootstrapping via existing distributions is easier and faster though. As stated, trade-offs everywhere.)
> The reason behind this behavior is that wlroots does not support the TILE property (issue #1580 from 2019).
This has also been my impression. The wayland specific things such as wlroots, but also other things, just flat out suck. There are so many things that suck with this regard - and on top of that, barely any real choice on wayland. Wayland seems to have dumbed down the whole ecosystem. After 20 years, having such a situation is shameful. That's the future? I am terrified of that future.
> During 2025, I switched all my computers to NixOS. Its declarative approach is really nice for doing such tests, because you can reliably restore your system to an earlier version.
I don't use NixOS myself, but being able to have determined system states that work and are guaranteed to work, kind of extends the reproducible builds situation. It's quite cool. I think all systems should incorporate that approach. Imagine you'd no longer need StackOverflow because people in the NixOS sphere solved all those problems already and you could just jump from guaranteed snapshot to another one that is guaranteed to also work. That's kind of a cool idea.
The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea to be ruined via horrible jokes of an underperforming programming language ...
> So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
I had a similar impression. I guess things will improve, but right now I feel as if I lose too much for "this is now the only future". And I don't trust the wayland-promo devs anymore either - too much promo, too few results. After 20 years guys ...
>The thing I dislike about NixOS the most is ... nix.
There's Nickel, if it's only about the language, and Guix (Guile Scheme) which goes beyond just the language.
> The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea to be ruined via horrible jokes of an underperforming programming language ...
I don’t get the hate for Nix, honestly. (I don’t get the complaints that it’s difficult, either, but I’m guessing you’re not making one here. I do get the complaint that the standard library is a joke, but you’re not making that one either that I can see.) The derivation and flake stuff excepted, Nix is essentially the minimal way to add lazy functions to JSON, plus a couple of syntax tweaks. The only performance-related thing you could vary here is the laziness, and it’s essential to the design of Nixpkgs and especially NixOS (the only config generator I know that doesn’t suck).
I’ll grant that the application of Nix to Nixpkgs is not in any reasonable sense fast, but it looks like a large part of that is fairly inherent to the problem: you’ve got a humongous blob of code that you’re going to (lazily and in part) evaluate once. That’s not really something typical dynamic-language optimization techniques excel at, whatever the language.
There’s still probably at least an order of magnitude to be had compared to mainline Nix the implementation, like in every codebase that hasn’t undergone a concerted effort to not lose performance for stupid reasons, but there isn’t much I can find to blame Nix the language for.
Was expecting some real unproductive and entitled whining based on the title, but was pleasantly surprised - someone actually investigating and debugging their wayland issues rather than putting their head in the sand and screaming “X11 FOREVER!!!”