Leaving Linux for Mac After 15 Years

by Austin PocusOctober 27th, 2019

Too Long; Didn't Read

The "year of the Linux desktop" never came to fruition. The second most popular Linux distribution, according to Distro Watch, is Manjaro. It was to Arch Linux as Linux Mint is to Ubuntu, an easier-to-use version of the base distro. It's hard to understand what's going on and why, so why does it take so much to work? This is patently absurd. It’s impossible to understand why, unless you’re an audio engineer, you won’t be able to plug into your monitor.


I’m sorry, Linux. It’s not me, it’s you. The “year of the Linux desktop” never came to fruition. It was always next year, the year after, some year. Now, it’s almost a running joke.

The second most popular Linux distribution, according to Distro Watch, is Manjaro. Having used Ubuntu and Linux Mint for most of my professional life, I decided to give Manjaro a shot. After all, it’s based on Arch Linux, and that was probably my favorite distribution (if I had a couple of weeks to kill on configuration).

Manjaro was about as good as I could expect. It was to Arch Linux as Linux Mint is to Ubuntu, an easier-to-use version of the base distro. I’m still using it on my current laptop (until the Mac arrives) and my desktop. So, where’s the beef?

First, let me say: I grew up with Linux. I owe my career to Linux. I owe my very access to computing to Linux, because without its customizable, configurable, do-one-thing-and-do-it-well mentality, I wouldn’t have had a computer growing up.

I started with Linux when I was 13 or so, when my first computer broke down. I didn’t have the money to replace it; this was around 2004, when the average PC cost $1,699, over $2,200 in today’s dollars. My family didn’t have the cash, either. I had to be resourceful -- luckily, my brother was on the same wavelength. He told me, “man, there are computers all over the place when we’re scrapping!”

“Scrapping”, in this context, refers not to the slang term for fighting, but to the collection of scrap metal, to return to a recycling center, or “scrapyard”, for a profit.

In 2004, my family simply was not doing well financially -- we were broke as a joke. We never went hungry, but we lived by the skin of our teeth, paycheck to paycheck. My dad, nearing retirement age, was out of work from his old job as a boring mill operator. So he got in his truck, along with my older brother, and collected scrap metal.

Getting back to the plot: my brother saw the opportunity inherent in these mysterious machines -- perhaps, he wondered, I could repair them? It turns out, in part, that I could.

Before our first computer died, I had dabbled with Linux, this weird OS, this mutant creature of a runtime that could, so I was told, run a computer on nearly zero power. In other words, you didn’t need the latest and greatest machine to do stuff. We all remember those old DOS machines, but I wanted windows on my screen, dammit! A shell-based interface wouldn’t do.

So, I decided Linux was my best bet. For one, it could run on puny, underpowered machines. Even more so, it was harder to exploit. It’s not just that Linux was inherently harder to hack (though it was, by a fair bit); it’s that the vast majority of viruses were written for Windows or Mac. Linux naturally caught fewer viruses and less malware because far fewer people used it.

I quickly discovered that the vast majority of the machines we found were not fundamentally incapable of running, but simply infected.

So I installed Linux. Little did I know, it would become a 15-year rabbit hole. I started out wanting to code. I ended up studying OSes and ops instead.

Why? Because I had to understand every layer, of course! Not only did I have a hunger for the knowledge of the entire stack, back to front, silicon to screen, but I also hungered for more powerful machines!

Linux could be customized like a hotrod, tuned to perfection for a given machine. Gentoo, anyone? Bah, you’re all on Arch now.
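
For the uninitiated, that hotrod tuning looked something like this on Gentoo. A sketch of a make.conf, not a recommendation -- the right flags depend entirely on your CPU and what you want the machine to do:

```bash
# /etc/portage/make.conf (illustrative)
# Compile every package for this exact CPU
CFLAGS="-O2 -march=native -pipe"
CXXFLAGS="${CFLAGS}"
# Build only the features this machine actually needs
USE="X alsa -gnome -kde"
# Parallel build jobs, roughly one per core
MAKEOPTS="-j4"
```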

So where did it all go wrong?

My beef with Manjaro begins with audio configuration.

pavucontrol is powerful, but it’s damn near impossible to understand what’s going on and why, unless you’re an audio engineer. I ended up unable to plug my speakers into my desktop’s audio card directly, instead opting to plug into my monitor. This is patently absurd.

My audio card, a stock built-in Intel card, is perfectly compatible with the latest version of the Linux kernel used in Manjaro, so why does it take so much effort to work? I’m guessing my motherboard and its audio interface aren’t fully compatible with Linux, but there’s nothing I know of that I can do about that.
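
For the curious, this is roughly how I’d poke at the problem from a terminal. A minimal sketch, assuming a stock PulseAudio setup like Manjaro’s default; the sink name at the end is illustrative and will differ on your machine:

```bash
# List the physical playback devices ALSA can see
aplay -l

# List the outputs ("sinks") PulseAudio knows about
pactl list short sinks

# Route audio to the built-in analog output instead of HDMI
# (sink name is illustrative -- copy the real one from the list above)
pactl set-default-sink alsa_output.pci-0000_00_1f.3.analog-stereo
```

Even when this works, you’re juggling two layers (ALSA below, PulseAudio above), which is exactly the kind of opacity I’m complaining about.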

The second worst part of Linux is graphics configuration. When I got the Radeon RX 5700 XT, I couldn’t find a distribution that supported both it and my WiFi hardware: the distros with the latest Radeon support didn’t have WiFi support, and vice versa.

Eventually I solved the WiFi issue with a Panda-brand adapter, but I wasn’t able to resolve the graphics issue, not even in Manjaro, despite the Arch Wiki’s help (easily the best documentation for a Linux system ever made, bar none).

So, I shelled out the extra dough for a GeForce GTX 1080 Ti, a card that I knew was compatible with Linux, and suitable for playing games too. That hurt: buying a brand-new(ish) graphics card, knowing that I couldn’t get the old one to function.

There’s a reason the Radeon card didn’t work. While Radeons are much more suitable for Linux-only systems, the latest and greatest cards, whether the Radeon RX 5700 XT or the GeForce RTX 2080, aren’t going to work on Linux for the simple fact that no one has written drivers for them yet. Radeon cards are easily much better off than Nvidia cards where drivers are concerned, as AMD provides quality open source drivers -- they just haven’t gotten to the latest cards yet. Nvidia holds a certain contempt for Linux, or so it seems, since they only provide binary drivers, never open source.
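
If you want to see which driver (if any) has actually bound to a card, here’s a quick sketch; the output is machine-specific, and amdgpu is the open source kernel driver for modern Radeon cards:

```bash
# Show each PCI display device and the kernel driver bound to it
lspci -k | grep -EA3 'VGA|3D'

# Check the kernel log for amdgpu initialization messages or errors
dmesg | grep -i amdgpu
```

If the "Kernel driver in use" line is missing for your GPU, you’re in the boat I was in: hardware the kernel can see but can’t yet drive.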

Macs, on the other hand, take a middle ground. Linux tries to support as much hardware as possible, through open source, usually unofficial, drivers. Windows has the same goal, to support as much hardware as possible, but through closed-source binary drivers. Macs, however, support a fixed, pre-chosen set of hardware as well as possible.

This ultimately means less configuration is required, since you have fewer options. While devices under Windows and Linux alike almost always need external drivers installed, on a Mac it’s pretty much taken care of out of the box.

Not only this, they provide an unbeatable user experience. The UI/UX of Mac products is unmatched -- I’m not being a fanboy here, I’m stating a fact. The sky is blue, water is wet, and MacOS provides a superior UX. Full stop.

Aside from providing excellent hardware and user support, Macs provide a top-notch development environment, thanks to their Unix-based roots. You can essentially get all the tools you get on Linux, on Mac instead, with nearly zero difference.

I originally chose Linux because, quite simply, I couldn’t afford a Mac. Hell, I couldn’t afford the computer itself! I had junk computers, machines other people threw away. I would create a custom Linux system, using Ubuntu, Arch, Debian, Mint, DSL, and a few dozen other distros to get the optimal environment for a particular machine.

These days, though, I simply don’t have the patience. I have the means, motive, and opportunity to murder this Linux system dead, and replace it with, by all counts, a superior alternative.

Let us count the straws that broke the camel’s back:

1. Audio configuration, or lack thereof.
2. Lack of default WiFi adapter support.
3. Lack of default graphics card support.
4. Lack of UI/UX to tell me what exactly is going on, and how to correct it.

The last one is a doozy. See, Linux assumes you know what you’re doing. Windows does too, up to a point, when it comes to edge cases and power users. When I use Linux, there are a lot of steps between “it doesn’t work” and “it’s working again”. With a Mac, I can usually correct the issue immediately. There are just more “right” choices in the Mac UX.

For example, I wanted to add my home directory to the shortcuts in Finder’s left-hand panel, so I could quickly open it in the Atom text editor. I simply search “austin” or “projects” (both directories on my drive), right click, and boom, it’s there. Doing the same on Linux depends on which desktop environment and file browser you’re using, and whether it supports custom shortcuts (or even drive-wide search). Plugging in speakers usually “just works”. So does WiFi. So do graphics cards, naturally, because aside from Hackintoshes, Macs always know the hardware and protocols they’ll interact with.
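
That drive-wide search is scriptable, too. A small sketch using mdfind, the command-line front end to Spotlight; the directory names here are just the ones from my setup:

```bash
# Ask the Spotlight index for anything named "projects"
mdfind -name projects

# Limit the search to your home directory
mdfind -onlyin ~ -name projects
```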

Linux is like Perl: “there’s more than one way to do it.” Mac is more like Python, where “there should be one obvious way to do it.” Both have their ups and downs, but ultimately, I think I prefer the latter.

The Linux desktop is dead. Long live MacOS!

Postscript:

I have since received my MacBook Pro, and let me tell you, setup was simple. Install Homebrew, install all my usual apps and tools (yarn and nvm for Node especially), and I was able to run a local server for Hacker Noon’s app subdomain in no time. (Did you know you, too, can write for Hacker Noon?!)
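
For reference, the whole setup was roughly the following. A sketch, not a canonical recipe: the install URLs are the official ones for Homebrew and nvm, but the nvm version tag is illustrative and will have moved on by the time you read this:

```bash
# Install Homebrew (official install script from brew.sh)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

# Install yarn via Homebrew
brew install yarn

# Install nvm via its official script (version tag is illustrative)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash

# Open a new shell (or source your profile) so the nvm function loads,
# then install a Node LTS release
nvm install --lts
```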

I’m absolutely loving the MacOS aesthetic and UI/UX, as I always have. It’s not for everyone, but it’s for me, folks. See the world don’t move to the beat of just one drum. What may be right for you, may not be right for some. It takes diff’rent strokes to move the world.

Enough of my rambling. As David is fond of saying: back to the internet!