Why I decided to ditch Apple

April 26th 2017

Because iSmartened and iWant less expensive technology that respects my data, doesn’t develop iSmudges and doesn’t require an iCork to just work.

I’m writing this story on the second iMac I’ve bought so far, and probably the last one I’ll ever own. I’ve been doing web and software development since 1998 but — strange as it may sound — didn’t really meet the “Mac World” until 2010, when my interests and professional career started to shift seriously from developing classic, desktop-oriented business applications in C# / .NET to building web- and mobile-based, end-user-oriented projects. That’s when I discovered the Macs (I’m referring to the computers, not the portable iThingies) and, like everyone coming from the blue-greyish Windows world, I was simply amazed by the build quality of the hardware, the attention to detail and the truly inspiring design. Not to mention the incredible stability of their BSD/UNIX-derived OS.

Now, as opposed to the vast majority of Windows users, I wasn’t a complete stranger to UNIX: during college and my early working years I used various Linux distros and even configured web servers, back when PHP was the new kid on the block. So I came to love and regard OS X as the Linux that “just works”, with a beautifully-designed interface on top, running exclusively on the most expensive and best-looking hardware ever built.

Some things have changed since then. The new macOS is still stable and beautifully-designed, but so is the Redmond OS nowadays. The iMacs and MacBooks are still beautiful and perhaps the most expensive personal computers around, but there’s been a steady decline in the quality of materials and assembly, especially since Apple became the Greatest Tech Company in the World.

This may sound like a rant, but let’s take a look at the back of my iMac.

Back of the iMac

Notice anything strange? Look closer, behind the stand.

Back of the iMac — a closer look

Curious about the odd-looking black thing?

Here’s how it all happened for me, on a late working night. I was quietly writing some back-end JavaScript API code (nothing too stressful for my computer, I trust, though I wasn’t using Apple’s Xcode) when my productive serenity was suddenly disrupted by a shockingly loud snap… and the iMac stopped looking me in the eye and started bashfully looking at my desk, like a fragile snowdrop in the early spring. A loud snap indeed. The snap of your heart when an $800 stack of bills suddenly leaves your wallet. It’s hard to believe that replacing a broken $5 hinge can cost $800 — the sum Apple service representatives were initially asking to fix the broken hinges on the affected units. Sounds incredible, but the figure is not totally unjustified: the new iMacs were so beautifully-designed that they lacked any kind of screws. Their screens were seamlessly glued to the cases in such a way that it was impossible to replace the $5 hinge in the back without completely tearing the entire device apart. And the labour involved in the process was indeed expensive, since it required professionally qualified genius-bar waiters working in glass boxes, not the ordinary handy kids with screwdrivers who staff the lame computer shop on the corner of the street… you know, the one where mere mortals running Windows go to fix their blasphemous boxes.

It’s not just me bitching and preaching here. You don’t have to take my word for it; google it and you’ll find reports from customers complaining about the atrocious hardware design fault and poor choice of materials in the 2013 iMac generation, which led to the stands of thousands of units literally breaking under their own weight.

A few months later, following growing media coverage, Apple silently admitted the design flaw and agreed to fix the broken devices for free. Hooray for the customers living in the US, UK or Western Europe. A big middle finger for the ones living in Eastern Europe or other more exotic and less regulated parts of the world. The likes of Romania, where I happen to live. I had no doubt there was no way to settle this for free with iStyle, the “semi-official” Apple dealer in these parts, even though I had paid them the equivalent of five average national wages for the iMac. To be honest, I didn’t even try, because I couldn’t afford three or four months without it. After all, I bought it because I really needed it to do my work. Or so I thought at the time. Besides, I had already waited almost two months to get it after paying the full price in advance; I was fed up with dealing with iStyle and had heard enough horror stories from other friends who bought from them.

But since I couldn’t work on it as it was, and since — as a Romanian citizen — I already had a dark sense of humour and unorthodox but ingenious coping mechanisms, I found a bright, yet astonishingly simple solution: the iCork.

The iCork.

A plain 20-cent rubber cork to fix an $800 problem. Not bad. Not bad at all.

In retrospect, considering I was repairing a beautifully-designed $3,500 piece of equipment, I should probably have used the cork of a vintage Merlot or Cabernet, but you see, those are made from genuine cork, and cork (the material, the oak-tree bark) grips less than rubber. So the plain, boring and inconspicuous-looking black rubber does a better job. Maybe there’s a lesson in there for Apple.

Admittedly, the whole hinge mess wasn’t as bad as a certain South Korean company recently selling incendiary devices mislabeled as mobile phones. Still, it was bad enough for the industry’s number one, a company that obsessively used the “i” in product names to strategically place its customers’ egos between their wallets and their common sense.

Back to the iCork now. Three years later, it still works surprisingly well. The rubber, I mean. The rest of the iMac? Not entirely. Within a few months it developed an odd sort of blemish at the lower corners of its ginormous, elegant 27-inch visage. Wanna see?

The iSmudges.

Now, I don’t mind them that much, because there’s not much happening in those areas anyway. I write code in Atom and, like most sensible programmers, I prefer the dark theme, so the left iSmudge is almost invisible on the dark background. The right side? Well, that’s usually in the corner of the browser, the iPhone simulator, the Android emulator, or it just covers a small part of Sketch or Gravit Designer’s interface. So, not a huge problem. I was just surprised, that’s all, to see the phenomenon happening on a “fully-laminated” screen, as Apple marketed it with the usual bells and whistles of one of its conferences. And again, it wasn’t just me. Just google “imac screen smudge” and you’ll find a lot of unhappy customers and a long list of excuses invoked by Apple’s genius-bar waiters.

It’s true, I saw the same problem on my previous iMac, but that one wasn’t fully laminated and as beautifully-designed as the new one, so anyone could easily use two suction cups to remove the front glass and clean the dust underneath. Which I did, many times.

There’s a way to do this on the new iMacs: the guys at iFixit wrote extensively about it, and they even sell kits that help you remove the screen and do the dirty work. But it’s a lot more complicated than it used to be.

Now, don’t imagine I keep my iMac in a barn behind a haystack. My home office is a clean environment: I live on the 5th floor of a residential building near the centre of Bucharest, no one is allowed to smoke in the apartment, and I have an obsession bordering on OCD regarding the hygiene of my desk. But dust particles are odd things: they form constantly and are present everywhere, even in hospital operating rooms. Fan blades tend to attract them, break them into even smaller particles and electrically charge them, in which state they have a proclivity to stick to things. Things such as the inner layers of beautiful, fully — yet somehow improperly — laminated iMac screens.

I learned about the mechanics and effects of dust on electronic equipment in my university years. I assume most product designers and engineers know about them too, which is why the vast majority of screens/monitors built by DELL, Lenovo, LG and other manufacturers seldom have “dust problems”. Sometimes problems do happen, and we’ve learned to accept or even expect a dose of imperfection and corner-cutting cost optimisation in the industrial production processes of cheap or shady manufacturers. But Apple was not one of the shady ones, back in the good old days. Well, some things have changed. Their pricing policy, coupled with the recent decrease in build quality, is starting to look less and less like “customer satisfaction” and more and more like a robbery “Designed in California”. No offense intended to the state of California; I’m referring to Apple’s marketing mantra printed on the back of their products. I don’t mind that they are “designed in California” and “manufactured in China”. I care not for such things, and I definitely do not consider “manufactured in China” a bad thing. Just look at the newest Xiaomi smartphones: the same (or better, as some would argue) build quality as the iPhones or Samsung flagship devices, for a third of the price. What I do mind is paying a premium price for a non-premium product.

I’ll admit I was a fool back then. I wanted things that looked slick and “just worked” and didn’t know or care much about the true stories behind catchy marketing mantras, or about how large a part of the product actually “happens” in the customer’s mind. But, as they say, a fool and his money are easily parted. Especially if you manipulate his ego.

Well, I’m less of a fool now. Or maybe just a different kind of fool. I’ve learned a few lessons and developed a different set of concerns. Which brings me to the second part of the story — the one about respecting the customer’s freedom to choose and data privacy.

Apple is definitely not the worst when it comes to such things. Lots of people, including me, cheered Tim Cook’s stand against the FBI’s demands last year. I won’t go into details here, but whether for the sake of principle or for marketing reasons, they do appear to care about protecting users’ right to privacy.

However, when setting up macOS (or iOS on any iDevice, AFAIK), you’ll be insistently asked to back up your digital life on iCloud. Recent history has proved that most users press “yes” without really understanding what that means. A few Hollywood celebrities learned it the hard way, not long ago. Not that there’s anything wrong with Apple’s security. On the contrary, it’s one of the strongest and best guarded in the industry, but more often than not hackers exploit people’s behavioural patterns rather than technology flaws. And it’s wrong to nudge customers into backing up their data online without fully understanding the risks and how to mitigate them.

At the same time, willingly installing software from sources other than the App Store is becoming more and more difficult for the average user, which forces “unidentified” open-source developers building free software to pay the $99 annual App Store tax if they want to make their applications accessible to the masses. It’s not a one-time $25 fee like Google charges for its Play Store. No, you have to pay every year for as long as you want your apps to remain in the store. Like paying rent. And you’re not even allowed to deploy an app to the App Store if you don’t own a Mac.
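To put the “rent” in perspective, here’s a quick back-of-the-envelope sketch. The figures are the ones mentioned above; both companies may change their pricing, so treat the numbers as illustrative assumptions rather than official rates:

```python
# Rough comparison of cumulative store fees, using the figures
# mentioned in the text: $99/year for Apple's developer program
# vs. a one-time $25 for Google Play. Illustrative assumptions only.

APPLE_ANNUAL_FEE = 99     # USD, recurring every year
GOOGLE_ONE_TIME_FEE = 25  # USD, paid once

def cumulative_fees(years):
    """Total paid to each store after `years` of keeping an app listed."""
    return APPLE_ANNUAL_FEE * years, GOOGLE_ONE_TIME_FEE

for years in (1, 5, 10):
    apple, google = cumulative_fees(years)
    print(f"after {years:>2} year(s): Apple ${apple:>4} vs Google ${google}")
```

After a decade of keeping a free app listed, the gap is $990 against $25 — which is the “rent” argument in numbers.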

You’ve probably heard about React Native, the newest and coolest JavaScript way of building mobile applications. One of its major advantages is its platform independence: you can set up a development environment on virtually any platform, whether you’re using Windows, Mac or Linux. And you can deploy to Google Play from any of them. Some very special people worked hard, in the open-source spirit, to make that platform independence possible, even though most of them are almost certainly using MacBooks in their own workflows.

Now imagine Google trying to force you to buy an expensive Chromebook Pixel in order to deploy a Candy Crush clone in the Play store.

Don’t get me wrong, I don’t mind that I have to pay to deploy an app. After all, they built the App Store and they have every right to charge rent for using it. Maybe $99 per year is not that much, even if you’re distributing the app for free. And ultimately I don’t even mind that Apple became a financial behemoth. Some of that fortune is well earned and well deserved.

I do mind, however, when my expensive and beautifully-designed macOS computer breaks because cheap materials were used in the manufacturing process, or refuses to talk to my non-iOS tablet or to read microSD cards formatted by my Android smartphone. Or when my iOS device obstructs access to my own data whenever I try to use non-Apple means. And all that to convince me to buy the Apple means. The tactics are cunning but cheap. Unlike the products. Don’t try to force me; let me choose to use and love your products. Otherwise I might end up hating them.

And what troubles me the most is the emerging monopolistic pattern and its potentially destructive effects on the industry and ultimately our own lives. As technology consumers and “information workers”, we’re willingly placing way too much power in the hands of Apple, Microsoft, Google, and — almost everywhere in the world — in the hands of our own governments. It’s just a matter of time until it gets abused.

It’s concerning to notice that most modern-day JS/web developers are so accustomed to using macOS exclusively that some are already forgetting that the rest of the world mainly uses Android and Windows. I contribute to a number of open-source projects, and I’ve noticed a steady increase in the number of UI bugs and glitches in various web frameworks, bugs introduced simply because developers never thought, or never had the chance, to test their work on other platforms.

I’m not telling you to ditch your Apple devices. If you’re happy with them, keep them. My wife is perfectly happy with her old iPad. Then again, her iPad was significantly cheaper than my iMac. I’m merely telling you the reasons compelling me to ditch mine. And I know ditching Apple sounds a bit extreme. It sounds equally extreme in the US and in Eastern Europe, but for different reasons: in the US, most small/independent web developers could probably not imagine their work and life without Mac devices, while in Eastern Europe the Macs are expensive enough that few developers and companies actually use them. Not because they couldn’t afford to buy them from time to time, but because their customers would generally not pay a bill that includes a workflow based on Apple products.

How can a JS/web developer ditch Apple in today’s world? Well, something interesting happened over the last few months, an event that empowers you to do just that, if you really mean it.

I’ve done some extended research on the subject and I intend to write another story soon. So if you’re interested in the conclusions, stay tuned.

Hacker Noon is how hackers start their afternoons. We’re a part of the @AMI family.

More by Ionut-Cristian Florescu
