it's year of the linux desktop baby!!

Like any good blog post [[citation needed]], I'll be starting out with an unnecessary amount of context.

I have long been a linux user. However, I'm also (unfortunately) a gamer with standards[1], so I strive to make no compromises about the performance of my system's output compositing first and foremost. And so, I unfortunately used Windows from 2020 onwards, due to the hardware I had - my old r9 290 from my very first computer was not sufficient, my Radeon VII had hardware flaws that led to it continually falling off the bus, and a 3090 fell into my hands through good fortune[2].

[1]

I am extremely sensitive to eyestrain, especially as it pertains to migraines - I frequently suffer them, and eyestrain is a significant factor. Choppy refresh rates, tearing, ghosting, and chromatic aberrations are some of the worst offenders/contributors, but suffice it to say that I like it when my frametimes are low and the picture is clear and crisp.

[2]

keep in mind, this was in the summer of 2020 - a friend of mine was trying to check out with at least one gpu from microcenter, and ended up with a spare. So I want to use this 3090 for at least a decade - it's 5 years old now, and I sincerely doubt more performance will ever be actually needed or appreciably available from a GPU.

Prior to my Nvidia days, I was daily-driving linux. i3 was fantastic and performance was reasonable. Then, some time during my undergrad, circa 2016, I got a freesync monitor out of an e-waste bin on campus. A few things became immediately clear to me:

My migraines also ramped up in frequency around this time[3], which was another contributing factor. So, I primarily used Windows at home for high-performance tear-free gaming that would not make my eyes/head explode.

[image: hecate's head exploding cartoonishly, as a memetic shorthand for having a migraine (credit: cinnamonspots)]
[3]

partially estrogen, partially family history, and later, being subjected to the sun at 100% brightness all the time in california

Fast-forward to 2021ish, and Wayland is looking much better. I was still using Linux every day, albeit at work, and I had switched from i3+compton to Sway, and was very happy with just how much more sensible everything was from the ground up. I kept going over to my linux install at home every 6 months or so to update packages and evaluate whether it was usable yet with my 3090. Those vibe checks progressed roughly as follows over the next few years:

Then, one day in late 2024, I updated all my packages and rebooted into a new kernel and driver, and... everything seemed fine?[5] I still had some issues in Sway, but KDE was working pretty much perfectly - it served as a fantastic comparison point against Windows, and the performance was better. Windows 11 had found its way to my desktop, and I was not happy about that.

So, some time last fall marked approximately when, from my perspective with my wonderfully powerful 3090, the entire kernel, driver, and compositor stack on linux noticeably outperformed Windows on my system.

There were still a few classic holdouts - I was no longer bothered by Spotify's dogshit linux client, but I still used discord in my day-to-day (because I am, again, unfortunately a gamer). I wasn't really keen on using a custom client or the web-browser version of it, because that also removed my ability to use push-to-talk/deafen keybinds/etc. However, discord on Windows was also busy being janky in its own ways (leaking lots of vram, which'll come up again shortly), so I took that L, and was pleasantly surprised by Discord managing to get functioning[6] wayland screensharing into their canary client in late 2024. And that brings us to....

2025: YEAR OF THE LINUX DESKTOP BABY

It is 2025. Windows 11 is widely regarded as bogged down by AI shitware that no one really wants, in addition to the 'usual' amount of ads, and the new ads-disguised-as-news-stories on the lockscreen. I edit my rEFInd config to make it just boot linux by default, instead of booting whatever was booted last for seamless reboots. However, NVIDIA still manages to make the experience kinda annoying - not in any unresolvable way, just classic hostility towards the entire non-enterprise-linux ecosystem:
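(As an aside, that rEFInd change is a one-line edit in refind.conf - a sketch follows, using rEFInd's default_selection directive, where "+" means "most recently booted"; the entry name below is an assumption that depends on how your boot stanza is labeled:)

```
# /boot/efi/EFI/refind/refind.conf

# before: boot whatever was booted last
# default_selection +

# after: always default to the linux entry (substring match on the menu title)
default_selection linux
timeout 5
```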

That all being said - presently, my niri process is sitting at a reasonable 190MiB of ram, and things do largely Just Work. There are still some lingering minor issues - there will always be some bugs, and the lack of (consistent?) vram-to-system-ram paging is a negative mark - but largely, linux is presently the minimal, high-performance alternative to Windows.

I'm not saying you won't have to make some concessions or sacrifices or compromises if you completely ditch Windows and embrace Linux as your desktop operating system - I've had to give up on having a good mouse with an unlockable scroll wheel and 3+ thumb buttons that are trivially configurable on linux, unfortunately - but it is worth it to be able to use an operating system that has transparent and sensible logs and centralized package management, and isn't trying to shove ads in your face.[8]

I am saying that it's possible for The Computer to feel like a pleasant tool that you own, instead of something you're merely licensed to use while subjected to rent-seeking landlords. You will be subject to bugs, and weird errors, and gaps in your knowledge - but there are fixes, there's manual pages, there's documentation and wikis, there's tutorial videos, there's the steam deck. There's all the friend(s)-that-uses(^H)-linux folks that love answering questions, who would probably kill for someone to ask for their help installing linux.

I haven't touched windows a single day this year, and reclaiming the NTFS partitions felt fantastic - it's not every day that you get to free yourself from the shadow of a megacorp, after all.

[4]

I do not recall what the actual flags were, but I do recall it was mildly annoying in a funny way. No shade; I also do not want to receive bug reports for a hostile hardware manufacturer.

[5]

much credit to everyone working on Proton and Wine, as these were huge! A lot of this is complaining about nvidia (and discord, etc) barely having their shit together, but that is because so much else works, and works wonderfully.

[6]

BARELY. It grabs your entire desktop audio (as opposed to the old reality of "no audio") by attaching a capture sink to every audio source in the pipewire graph, except for its own webrtc source (call audio). Except it also still grabs its own application audio, so everyone listening to the stream will hear double join noises, and will hear your client mute/deafen noises, etc., through the stream. You can maybe futz with the pipewire graph manually to un-fuck that, but the couple times I tried, it just broke all capture audio. I really wish there was any competition in this space[9].
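For the morbidly curious, that manual futzing looks roughly like this - a sketch using pw-link from pipewire's own utilities, run against a live session; the port names below are illustrative assumptions and will differ per system:

```shell
# list every active link in the pipewire graph (output port -> input port)
pw-link --links

# if the capture sink shows up latched onto Discord's own playback ports,
# try severing just those links by naming the two ports:
pw-link --disconnect "Discord:output_FL" "discord-capture:input_FL"
pw-link --disconnect "Discord:output_FR" "discord-capture:input_FR"
```

In my experience this is fragile - the client is free to re-create links whenever it renegotiates audio, which is roughly how "it just broke all capture audio" happens.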

[9]

yes, I'm self-hosting the latest teamspeak6 server binary and have tried that out. The UX is still incredibly noncompetitive; the people do yearn for being able to read at least a couple text channels without first connecting to that one server and being present in a voice channel. Discord is functionally teamspeak+matrix, and nothing else has replicated that.

[7]

But because this heavily leverages offloading functionality to firmware blobs on the gpu, it's not available on the older cards, which is why there are still two driver versions. afaik the amd drivers are not dissimilar, except that their old driver was open source and in-tree.

[8]

There's even general HDR support, which was one of the last remaining barriers - at this point, Adobe and fortnite/COD/riot's kernel anti-cheats are the main application-level considerations, I think. Aside from discord being kinda shitty on all platforms in different ways.

[10]

I don't feel like I can do a good job of talking about the intersection of my migraine susceptibility and how otherwise-seemingly-insignificant things hugely impact my day-to-day computer usage. The issue of tearing or flickering displays, whether it originates in the display hardware, the graphics hardware, or the software, is something that can make my brain hurt immensely for 6 to 18 hours and force me to exist in a dark, quiet space for a while. It's not a fun model to have to operate under, and it's why I care deeply about the performance and correctness of the rendering+compositing pipelines on my computer.