This is my old man nerd point every time (and by the way, we all keep having the exact same conversation here, which is infuriating).
It is NOT, in fact, more user friendly than 15 years ago.
Not Linux’s fault, necessarily, but hardware got… weird since the mid-00s, when Linux WAS pretty much a drop-in replacement. What it couldn’t do then was run Windows software very well at all, and that was the blocker. If we’d had Proton and as many web-based apps in 2004 as we do now, I’d have been on Linux full time.
These days it’s a much harder thing to achieve despite a lot more work having gone into it (to your point on moving goalposts).
It definitely is more user friendly. I remember trying Ubuntu 10+ years ago: the default driver was awful, and the Nvidia driver install ran in the terminal and asked questions I had no answer to, so half the time I fucked it up. Then it didn’t support my monitor, so I had to edit the X server conf to get the correct resolution and refresh rate. And when new drivers came out, I had to redo everything, every time.
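For anyone who never had the pleasure, the ritual looked roughly like this. A minimal sketch of the old-style xorg.conf surgery; the mode numbers are purely illustrative (you’d generate real ones for your own monitor with something like cvt 1920 1080 60):

    # Sketch only: the Modeline below is cvt output for 1920x1080 @ 60 Hz,
    # not something to copy blindly for a different monitor.
    Section "Monitor"
        Identifier "Monitor0"
        Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
        Option   "PreferredMode" "1920x1080_60.00"
    EndSection

    Section "Screen"
        Identifier "Screen0"
        Monitor    "Monitor0"
        SubSection "Display"
            Modes "1920x1080_60.00"
        EndSubSection
    EndSection

Get one number wrong and X would refuse to start, dropping you back at a console.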
For a few years now, you just install from a USB stick and everything runs great.
Installing Windows machines 10+ years ago wasn’t much more fun either… (I’m not sure it’s any more fun these days, but I haven’t done it in ages, so I’ve no idea).
Having recently spent the equivalent of five work days trying to get an Nvidia setup working on Linux, I’m going to say the experience isn’t necessarily much better, depending on what you’re trying to do and how.
This is just patently false. Pick any common distro.
Audio and networking were a shitshow back then; nowadays almost everything just works on those two fronts. Also, having to edit your xorg.conf is not what I’d call user friendly…
There was this brief moment, though. Maybe that’s my problem: I remember it as this momentous piece of Linux history when we started getting these cool distros on nice, shiny, professional-looking CDs with proper installers that would set up your DE right the first time, every time, and get everything mostly there… and it turns out that window was like three years and a couple of Ubuntu iterations.
FWIW, networking mostly works, but I had a heck of a time finding a distro that would properly do 5.1 out of my integrated ASUS audio device last time I went distro hopping. I think audio got better, worse and then better again since the good old days.
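These days the usual fix is flipping the card to the right PulseAudio/PipeWire profile. A minimal sketch, assuming a standard analog surround profile exists for the card (the card name below is a made-up example; the first command shows yours, and the exact profile string can differ per codec):

    # List sound cards, then force the 5.1 analog profile.
    # "alsa_card.pci-0000_00_1f.3" is a hypothetical name - substitute
    # whatever pactl reports for your hardware.
    pactl list short cards
    pactl set-card-profile alsa_card.pci-0000_00_1f.3 output:analog-surround-51

Not that needing two commands makes the out-of-the-box behavior acceptable, but at least the knob is in one place now instead of scattered across ALSA config files.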
That’s not even close to a common use case, though. Using it as a yardstick for how user friendly Linux is would be unfair.
It’s not being used as an indicator of user friendliness (that’d be the atrocious time I had setting up my Nvidia GPU and HDR monitors). It’s specifically an anecdote replying to the previous guy’s (accurate) comment regarding how finicky old implementations of audio on Linux used to be.
But also, in case you’re wondering: that setup worked the first time on Windows, with no additional work beyond the drivers ASUS itself installed. Do I like, or even tolerate, ASUS’s weird driver manager? Nope, frickin’ hate it; I’d switch to Linux to avoid it, all other things being equal. But one setup worked the first time, while the other needed five different distros before one randomly got it right for no discernible reason.
Fair enough, sorry for the misunderstanding.
I’ve had the opposite experience with Windows audio, though. It’s always been weird for me, randomly switching outputs for no reason, and I stopped even trying to connect wireless headphones because it would always seem to prioritize them, even when they were turned off. Every five or six months I’d have to dig deep into the audio settings and fiddle with the gain on my mic so I’d stop blowing out my friends’ ears on Discord.
I think we all need to start differentiating the usability quirks and general jank that all OSes have in different areas from the actual blockers.
Yes, the way Windows handles sources and prioritization sucks, different Linux DEs have dumb problems with UI scaling or their own audio quirks, macOS has weird multi-monitor support, and so on. If that was all there was to it, I’d be all for prioritizing the free alternative, no questions asked.
The real issue is the blockers: entire features not working, or working at noticeably sub-par performance; hardware with straight-up nonexistent support that you need to replace to make the jump, or that is so finicky to set up that it may as well not work, as far as the average user is concerned. Those are showstoppers.
The problem is you could have a LOT fewer of the quirks, but a single dealbreaker is enough to stop somebody from making the jump, or to have them report that they tried and failed. I’m as annoyed as anybody with how inconsistently videoconferencing picks up the right audio output; I complain about it every time I have a work call. But I still wouldn’t suggest any of my friends try to set up their high-end Nvidia GPU on Linux as their main gaming daily driver. Those two things are on completely different tiers.
Especially if you had a soft-modem.
And printing. Oh dear, I might have a headache if I think too much about it.
Oh, man, I had entirely blocked the concept of “soft-modems” from my memory. I’m having flashbacks.