As a person with a full-time job, a significant other, and several hobbies, I just don't have time to invest in learning a new operating system. I grew up with Windows (95, 98, XP, 7, 10), so that's what I'm familiar with. I recently switched to Linux (Mint), and it's fine. Just getting started, though, was rather involved, and I would never expect a normie to be able to figure it out. If Microsoft wasn't insisting on making Win11 a dumpster fire, I wouldn't have bothered. Now that things are running smoothly, there are some minor annoyances I'd really like to change, and the prevailing sentiment from the Linux community is "that's just how Linux is" or sometimes "here's a hacky workaround that barely works in only certain controlled cases". It's better than it was 10 years ago, so there is that.
I just don't have time to invest in learning a new operating system.
That's fair. I got turned on to Linux in college so this is how I feel when confronted with Windows or Mac devices. I just get so frustrated every time I try, and it doesn't seem like the end result is worth it if I can just stick with what works and not have to worry about some random update radically and inexorably altering how my computer works.
I'm inclined to give Linux more benefit of the doubt than, say, Windows. That's because of the motives behind it.
Microsoft have a very long history of making design choices in their software that users don't like, quite often because those choices suit their interests more than their customers'. They are a commercial business that exists to benefit itself, after all. Same with Apple. Money spoils everything pure. You mention privacy, but that's just one more example of someone wanting to benefit financially from you - it's just less transparent and more open-ended than handing them some cash.
Linux, because that monetary incentive is far weaker, is usually designed simply "to be better". The developers are often the primary users of the software. Sure - sometimes developers make choices that confuse users, but that overarching business interest just isn't there.
For me the main difference is Linux only does something when I ask it to.
Windows does whatever Microsoft wants it to do.
Both have major usability issues. But Linux gets a higher tolerance level, because of higher trust levels.
As a newbie in this space, I've had run-ins with a few distros over the years and have recently switched (hopefully) permanently.
My first experience was with Mint, 10 years ago. Installing it triggered some GPU driver defect (AMD card) that turned the whole login screen into an epileptic checkerboard pattern with no way of doing anything. It took me a few reinstalls and an ungodly amount of googling to find a solution, which involved opening a terminal during the boot process. You can only imagine how frustrating that is for a newcomer.
Later I had Ubuntu on my laptop, which had a bug where the CPU fan wouldn't spin up, so it would simply overheat and shut down. I had to take it to a technician to find out what was causing the random shutdowns.
A year ago I decided to try Debian on my desktop PC, as many have praised it for its rock-solid stability. It didn't want to work on my PC. No internet connection and some weird bugs. It took me two or three days to get it to work, and I still don't know what exactly fixed it, as I applied every possible solution I came across.
Much later, aka now, I decided to go with Bazzite on my desktop, as many have claimed excellent support. I wanted to install mimalloc because I play Factorio a lot and a few reddit posts claimed a 20% UPS improvement over the stock allocator. After downloading the source code and following the four very easy steps, cmake threw random errors at me claiming some critical files were missing, although they were right there in the usr directory. Turns out Bazzite has some issue; Fedora 40 compiled the code in seconds without any problems.
Conclusion: Linux users who are very tech savvy or work in that space know what to do when things don't work out, while the rest of us keep googling and crying over error messages for things that seem trivial. You never seem to know whether it's you, the system, or your hardware.
It's something we all take for granted. With enough time and experience, you can fire off a one-liner to fix a problem in less than a minute. For most people that could take an hour, and they'd probably give up within 10 minutes.
I just got to work and plugged my Surface Pro into my external monitor. It didn't switch inputs immediately, and I thought "Linux would have done that". But would it?
Nope. My laptop, for example, doesn't automatically use an output when it's plugged in, but that doesn't bother me, because I know other DEs would do that and it's my choice of a minimal window manager that causes it.
And this goes into your next point: because I know this comes from decisions I made, I'm okay with it. I also know I could probably fix it somehow, even if just by running a script in the background that checks whether an output is plugged in and tries to use it.
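Something like this minimal sketch is what I have in mind (assuming X11 with xrandr; the sysfs-to-xrandr name mapping and the eDP-1 primary output are guesses that vary by machine):

```python
#!/usr/bin/env python3
"""Background hot-plug watcher: poll the kernel's DRM connector status
files and try to enable any output that just became "connected"."""
import glob
import subprocess
import time

PRIMARY = "eDP-1"   # assumed laptop panel name; check `xrandr` for yours

def connector_states():
    """Map each /sys/class/drm connector status file to its content."""
    states = {}
    for path in glob.glob("/sys/class/drm/card*-*/status"):
        with open(path) as f:
            states[path] = f.read().strip()
    return states

def xrandr_name(sysfs_path):
    """'.../card0-HDMI-A-1/status' -> 'HDMI-A-1' (xrandr may call it HDMI-1)."""
    return sysfs_path.split("/")[-2].split("-", 1)[1]

previous = connector_states()
while True:
    time.sleep(2)
    current = connector_states()
    for path, state in current.items():
        if state == "connected" and previous.get(path) != "connected":
            name = xrandr_name(path)
            # Try both naming conventions; the wrong one just makes xrandr warn.
            for candidate in (name, name.replace("-A-", "-")):
                subprocess.run(["xrandr", "--output", candidate,
                                "--auto", "--right-of", PRIMARY])
    previous = current
```

Launch it from your window manager's autostart and a new output should come up within a couple of seconds of being plugged in.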
And for me that's the big difference. As a general rule, when things break or don't work, it's not the fault of Linux in general but of a specific piece of the stack, and more often than not it's because that piece was reverse engineered without any help from the manufacturer of the hardware it's meant to be controlling. So I can be very tolerant of those errors, since the bad guy here is the third party refusing to make their things work on Linux. But even things that don't work the way I want, I can make them do so, and that's a huge change in viewpoint.
In other words, on Windows I used to think in terms of things you can do and things you can't. With time I noticed that on Linux this shifted, to the point that the only question I ever ask myself is: "HOW do I do this?". That implies there's nothing impossible on Linux, which is obviously false, but I'd argue the correct way to think about it is "things that are impossible on Linux, for now", and that's a huge difference. Linux is always evolving and getting better, so things you thought were impossible now might be trivial in a few months or years, whenever someone with the knowledge to fix it gets bothered by it.
Just this morning I tried to make Outlook on my work laptop open on startup. I had to find a shortcut to Outlook, buried somewhere in the machine, and add it to the startup folder, buried somewhere else in the machine. The startup apps settings menu was just an eclectic list of programs and of no use at all.
With Mint on my home machine I just go to the startup programs settings menu and can add whatever I want by pointing it at the right program. It just works.
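As far as I can tell, all that menu really does is drop an XDG autostart entry into ~/.config/autostart; roughly this sketch (the program name and Exec path are placeholders):

```python
#!/usr/bin/env python3
"""Write an XDG autostart entry, i.e. what Mint's startup programs
dialog does behind the scenes. Name and Exec below are placeholders."""
from pathlib import Path

autostart_dir = Path.home() / ".config" / "autostart"
autostart_dir.mkdir(parents=True, exist_ok=True)

entry = """\
[Desktop Entry]
Type=Application
Name=My Program
Comment=Started at login
Exec=/usr/bin/my-program --minimized
X-GNOME-Autostart-enabled=true
"""

target = autostart_dir / "my-program.desktop"
target.write_text(entry)
print(f"wrote {target}")
```

Delete the .desktop file and the program stops starting at login; no registry archaeology required.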
WIN+R, then "shell:startup", for future reference by the way.
The other list you saw is programs that have added their own autorun registry keys.
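If you're curious what's hiding in there, here's a read-only sketch that lists the per-user Run key (the machine-wide one lives under HKEY_LOCAL_MACHINE, and there are a few other autostart locations besides):

```python
"""List the per-user "Run" registry values (Windows only, read-only)."""
import winreg

RUN_KEY = r"Software\Microsoft\Windows\CurrentVersion\Run"

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, RUN_KEY) as key:
    index = 0
    while True:
        try:
            name, command, _value_type = winreg.EnumValue(key, index)
        except OSError:      # raised when there are no more values
            break
        print(f"{name} -> {command}")
        index += 1
```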
I'm actually kinda surprised that functionality isn't in the new task manager yet. You can toggle on and off basically all startup items from there, but not add stuff.
XP through 7 had this right, with a folder in the Start menu for startup items: just drag a file or shortcut there and it runs on startup.
I've used DOS and Windows 3.11 all the way to 11. Switched to Linux as my main driver around 2009. Used MacOS at work for over a year now. I occasionally boot into Windows for the rare game that uses some anti-cheat that doesn't play well with Wine.
I'm old enough that I just want things to work. I don't care for any fanboyism. These are my opinions:
- Windows is a mess. It has different UIs from different decades, depending on what you're doing and where you look. The NT kernel is ancient. The registry is a horror show. The only edge it has is third-party software, like proprietary drivers. That's it. And that isn't a merit of Windows, but of market share.
- MacOS is inconsistent at every turn. It's frustrating to use, riddled with UX bugs and a seemingly deliberate lack of functionality. The core tooling, like the file manager, is absolute garbage. The only good thing it has going for it is that the Unix core is solid. In that year, I experienced a soft brick once that was almost a hard brick, and the cause was setting the display refresh rate from 120 to 60 Hz. Something I changed, BTW, because certain animation transitions in MacOS took twice as long at 120 Hz... Yeah, top-notch QA there, Apple.
- Linux has its own flaws, for sure. But as for "just works", it happens so often that it's exactly why Windows and MacOS feel so frustrating. I'd have my grandmother use Linux.
And I'm not just saying this. When I upgraded components on Windows, I spent two hours debugging problems. One of them was that it reverted my GPU driver to a version where every piece of version information was unmistakably older, and which also didn't work.
I've also had the WiFi adapter not work until I downloaded some proprietary software over an ethernet cable.
On Linux? I didn't need to do a single thing in either case. It for sure didn't use to be this way. In 2009 I was hunting WiFi drivers for Fedora over ethernet. But in the last, say, five years, on Arch, it's been amazing. Did I mention that I use Arch?
PS: The last four times I've had problems on Linux have been:
- A Windows update fucks up GRUB.
- Rebooting from Windows doesn't release its hardware claim on the WiFi adapter, so it doesn't work on Linux.
- The system clock is wrong, which was easy to notice because problem 2 meant there was no remote time sync to quietly correct it. This happens because Windows stores the system time as local time, not UTC. If you do software development, you know how dumb the former is (see the small illustration after this list).
- A RAID partition destroyed because a Windows 7 install decided, unprompted, to write a boot partition onto a disk with an "unknown" file system.
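On that local-time point, the core problem is that local wall-clock time isn't even unambiguous, so a clock stored that way can't always be decoded. A quick illustration (US Eastern time on the night the 2024 DST change ends, when 1:30 AM happens twice):

```python
"""Why storing the hardware clock as local time is a footgun: the same
wall-clock reading can map to two different moments in time."""
from datetime import datetime
from zoneinfo import ZoneInfo   # Python 3.9+; uses the system tz database

tz = ZoneInfo("America/New_York")
# On 2024-11-03 US clocks fall back at 02:00, so 01:30 happens twice.
first = datetime(2024, 11, 3, 1, 30, tzinfo=tz)            # the EDT occurrence
second = datetime(2024, 11, 3, 1, 30, fold=1, tzinfo=tz)   # the EST occurrence

print(first.utcoffset())    # -4:00, printed as "-1 day, 20:00:00"
print(second.utcoffset())   # -5:00, printed as "-1 day, 19:00:00"
```

A UTC clock never has that problem; converting to local time is purely a display concern.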
At this point, Linux or even any given distro isn't the problem. The problem is the software library.
I call it GIMP syndrome. There are a lot of capable and powerful apps in the FOSS ecosystem, and most of them have some kind of critical functionality gap or the UX of an Oregon Trail era disease. A lot of them, with the notable exception of GIMP, are actually working on it now.
For me it's "I can make Linux do this" when I see another system perform well, in contrast with "they took my vertical taskbar in Windows 11 and I have to gut the system to get it back".
I do have to remind myself that I'm still used to living in a world where Linux enjoyed immunity to most "consumer" malware just because it wasn't a popular desktop. Ultimately Linux is not more secure than any other system unless someone put in the work to make it that way.
My experience is that generally it doesn't just work straight away unless it's something I've hammered out myself.
I am also using one of the more DIY distros and window managers, though, so I wouldn't expect it to work without some attention from me to get it hammered out first.
That said, once it's hammered out, it continues to work exactly the way I want it to. It doesn't spy on me, and it doesn't shove ads down my throat every 5 minutes.
It would be an interesting experiment to see how non-techy Windows/Mac users would get on if you just put stock Mint/Pantheon on their systems, but I get the feeling it would not be as smooth as if they just had the thing whose flaws everyone already knows.
Who is "we", my friend? This all depends on your research and expectations. IMO Linux works great, but you should consider it before you buy a machine. Make sure your graphics card and other hardware is going to work. When in doubt, buy from a reliable shop that preinstalls Linux for you.
I find that the default settings and programs of Debian (or whatever major distro) do 95% of what I expect and want, and maybe 5% involve some customization. In other words, it's much simpler than getting Windows or Apple and then purchasing or downloading all the extra programs. But this depends on what you wanna do.
We've been having this discussion in the group I game and play TTRPGs with. There are 7 of us total, all on Windows; I and one other switched to Linux, a third is a computational scientist who is forced to work with Red Hat frequently, and a fourth was thinking of switching. After member 2 and I switched, member 4 saw that we had problems (entirely Discord for me; all games have honestly worked so far) and changed his mind about switching, because he doesn't want to deal with stuff not working OOTB.
I can't fault people who want that - hell, I do too. Linux is well worth it to me, but I will begrudgingly admit there are drawbacks to Linux.
I had this exact same thought, but then I booted Windows. I get less frustrated because when I use Linux I feel like I'm working with it, and it's acceptable if there are mistakes. When I use Windows I feel like I'm working against it, and a big part of that is that a lot of the issues aren't there because they're bugs (of which there are probably as many as on Linux) but because of bad, anti-user design.
The Linux kernel is wild and has more features and support than I have seen anywhere else. Everything from namespaces (containers) and virtualization to support for strange serial devices.
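To make the namespaces point concrete, here's a minimal sketch (Linux only, Python 3.12+ for os.unshare, needs root or CAP_SYS_ADMIN) that gives a child process its own hostname without touching the parent's - the same primitive container runtimes are built on:

```python
"""Give a child process its own UTS (hostname) namespace."""
import os
import socket

pid = os.fork()
if pid == 0:
    os.unshare(os.CLONE_NEWUTS)                 # private hostname namespace
    socket.sethostname("inside-the-namespace")
    print("child sees: ", socket.gethostname())
    os._exit(0)

os.waitpid(pid, 0)
print("parent sees:", socket.gethostname())     # the real hostname, unchanged
```

Mount, PID, network, and user namespaces work the same way, and stacking them (plus cgroups) is roughly what a container is.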
Well, I don't use a DE, so your scenario of the new display not switching over right away is basically my life every time autorandr decides not to run on startup.
IMO more people should be critical of the systems and tools that they use instead of shitting on the tools that others choose to use.
We do assume too much of our tools, but many people here are guilty of assuming that other OS's are broken in ways that do not reflect the average customer experience.
Linux isn't the best for every use case, but it's good enough for mine. Plus, the community around Linux is actually nice, outside of the strange elitism here and there.
I think most of us have a good idea of the benefits and drawbacks of Linux/Windows/Apple.
I have a Windows machine for media production, because Linux doesn't support all the software I need for it. I use Linux for absolutely everything else, because it's better at literally everything else. In truth, a MacBook Pro would be even better for media production, but they're too expensive.