Your OS sucks
After celebrating my one-year anniversary working at Facebook a couple months back, I started putting together a summary of some of the things I've learned during my 20+ years neck deep in the software industry. I spent 3 years in MacOS, back when it was called "System 6/7/8", a couple years in Unix land (and in Linux land before normal people had ever heard of it), then 17 years at Microsoft, fully immersed in the MSFT kool-aid, and now a bit over 1 year at Facebook, where MacBook Pros abound and engineers connect to a Linux 'dev server' to do most of the interesting stuff. My entire post-college career has been focused on a particular niche of software development called compilers. So, understanding that I'm a nerdy software developer who will talk about details that you may not understand or care about, with a bit of a foul mouth, read on at your own risk.
Coming from a world where I was highly productive using Windows, with brief stints in Mac & Linux before, and now having spent time becoming moderately productive on a Mac and on Linux, I can say with authority: your OS sucks. I don't care what OS you're talking about: it sucks. There are very few truly differentiating factors, and people who point out "feature <blah> is so much better on <insert your favorite OS here>" are kind of missing the point. Each of the 3 general-purpose computing platforms that matter has certain things that it's good at, and certain things that it's bad at. But the reason they all suck is that they each have big, fairly important things that they're truly terrible at. We've been at this game for 30+ years! The 'desktop' form factor hasn't changed significantly since the old Apple Lisa more than 32 years ago (seriously: the screens are nicer, the pointing devices are more svelte, but they're all still just screens with pointing devices and keyboards attached).
I'll start with the OS set that I have the most experience with: Windows. I've used every version of Windows professionally since Windows NT 3.5, with the only exceptions being Windows ME (and Windows 10, now). But rather than dredge up the past, and carry on about how Windows XP was so wonderful (it really wasn't) and Windows Vista was a crime against nature (again, not really, or probably not for the reasons you think it was), I'll focus on Windows 7 and Windows 8.1 (Win8 sucked on so many levels that I won't go into it, and I haven't done too much with Windows 10 yet, but it appears to suck in similar ways that Windows 7 & 8.1 suck). Windows gets a pretty large swath of edge cases 'right', but general usability is barely passable. How many different hoops do I have to jump through to add a printer? What monstrosity am I forced to use to talk to a Mac or Linux system? How long does it take to wake up from sleep? What atrocious installer technology is going to crap all over my computer in sufficient measure to force me to completely pave the machine and start over? And don't even get me started about the command line. Why are the useful command-line tools so bad, when there are freely available, dramatically better versions that Microsoft should have been distributing with their operating system, or at least with their developer tools, for 20 years? These aren't weird corner cases: how long does it take to get a machine back to a usable state if Windows Update has decided to shit itself? When a hard drive dies, it's about the worst thing ever, because then I have to reinstall the OS, which requires 14 Windows Update/reboot cycles before the machine is safe to use on the internet. I'm not saying that the Mac is more secure (it's clearly not), but requiring that I download 1GB of updates, install them, restart, download another 1GB of updates, restart, etc… is about the stupidest thing ever.
There's a way to back up your system so that you can just get a new hard drive, plug it in, and restore everything, but Microsoft decided that making this capability discoverable was a bad idea, so in Windows 8.1, it's hidden and only discoverable if you have 3 live chickens to sacrifice while staring at the full-screen Start menu and cursing your own existence. Every release, the way you configure your system is put in a shuffler and reorganized, so that you can never really remember how to change that one thing you figured out once. But then, once you scratch the surface of this OS, you may decide you'd like to write some software for it. HAHAHAHA: Sucker! Nothing you've ever written is actually usable, and anything that looks like it might be usable is no longer actively supported by Microsoft. You know that routine to write your in-memory database to a file that someone wrote in C 25 years ago before they decided to pursue their dream of becoming a penguin psychologist? If you'd like to write some new UI on top of it, you get to REWRITE THE WHOLE DAMN THING! Microsoft decided that because file I/O can take a while, you shouldn't ever be allowed to do it synchronously, because god forbid you might actually have a fucking brain in your head and do the synchronous file I/O on a background thread. So you get to start over. Why? Because Windows hates you. This OS sucks.
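For anyone who hasn't lived this: the workaround that async-only file APIs refuse to let you use is trivial. Here's a minimal sketch (Python purely for brevity; the function name, records, and file path are illustrative, not any real Microsoft API) of doing plain blocking file I/O on a background thread so the UI thread never stalls:

```python
import os
import tempfile
import threading

def save_database(path, records):
    # Plain, blocking, synchronous file I/O -- the 25-year-old-C-code
    # style of write that the async-only APIs won't let you call
    # directly from a UI context.
    with open(path, "w") as f:
        for record in records:
            f.write(record + "\n")

# Illustrative output path; a real app would use its own data file.
path = os.path.join(tempfile.gettempdir(), "inmem_db.txt")

# Push the blocking write onto a background thread; the main (UI)
# thread stays responsive with no async rewrite of the old code.
worker = threading.Thread(target=save_database, args=(path, ["row1", "row2"]))
worker.start()
worker.join()  # a real UI loop would keep pumping events instead of joining
```

That's the whole point: a thread plus the existing synchronous routine gives you the same responsiveness the forced-async API claims to be protecting you with, without rewriting anything.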
Moving on to my most risky statement of sucking: Linux. This is the poster boy of nerdy fanboi-hood. It's all open source. You can recompile this thing to do whatever you want, if only you spent the 2743 years learning everything about it. This operating system is great for creating write-once, run-forever software, but a universe where that is possible (or even a good idea) doesn't exist. Linux has security exploits, just like Windows and MacOS, and so you have to constantly keep the hodge-podge of libraries, SDKs, tools, and daemons that are running on your machine up to date. And so most people pick a distribution. Ubuntu, Debian, & Red Hat's Fedora are the 3 that come to mind. Actually, that's a lie: the first one that comes to mind is CentOS, because that's the criminal distro that we use internally at Facebook. But it's some conglomeration of bits that we pay a group of engineers some well-deserved pile of money to tweak, maintain, secure, and all that. But if you want to do anything, you get to search the web for how to do it, and pray that the question has already been answered on StackOverflow, because if it hasn't, you're completely screwed. If you ask the question, you'll get an army of vocal assholes belittling you for not being able to search through the poorly written man pages to figure out that you have to add 3 lines to your .wtf file, or create a .WTF.d directory underneath your .whyIsThisFolderHere configuration folder, and that will work fine, if you're using version 1.4.2.1a of the OpenBITEME.so library, but if you have the 1.4.3.7b or later version, you have to switch the order of the .WTF.d and .whyIsThisFolderHere directories.
Then you realize that you're running 1.4.3.1, which isn't discussed anywhere in the help docs, and when you mention it people either ignore you, or mock you for not just applying the 13 patches from 5 git repos, 7 Subversion servers, and a CVS repository stored on someone's private NFS server in Belarus, and why don't you just use GNOME Desktop, or KDE, or some other hideous UI layer on top of this crap that was clearly designed by a 16-year-old kid who just finished binge-watching every Transformers movie, and doesn't know what normal feels like, because he's been drinking Red Bull every 2 hours for the past 4 months. And if you'd like to actually try to do anything real on your Linux machine, you're really screwed, because your graphics card driver will probably shit itself, because when you typed 'yum update' it updated your version of libnonstdc++.stfu.666.so to a version that's incompatible with the AMVidia graphics card you bought because someone else told you it worked well in Linux, when what they meant was that it worked well when they had the kernel debugger attached to their open-source WiFi router. Why? Because Linux hates you. This OS sucks.
Now on to the most religiously defended shitty OS I've seen: MacOS. This operating system has an ever-shrinking number of zealots who will claim everything about the Macintosh is perfection. I'm here to decree that they're all completely wrong. I'll give them some amount of "ancient history" usability: back in the bad ol' days, Macs were easier to configure and keep working than Windows 3.0-3.11. But today, if you want to do anything that isn't "mainstream", you're 100% completely and totally fucked. If all you want to do is click around in a web browser on your MacBook, you're going to be just fine. But you'd be just as fine doing that on some way cheaper device. Go get a $250 tablet, or a $150 Chromebook. A browser is a browser, and once you're inside a browser, your OS claims are pointless and you're stuck in the misery that is HTML+JS+CSS. Chrome runs pretty similarly on MacOS, Linux, Windows, and ChromeOS. So does Firefox. But if you want to do anything real on your Mac, you will quickly discover how much it sucks. There's virtually no decent software for MacOS, because Apple has, for decades, crapped all over the developers. And because of Apple's vaunted Human Interface Guidelines, you're stuck using the top-of-the-line UI from 1987: menu at the top, folks, and a massive pile of little useless icons on the right. You wanna use an external monitor? Get used to squinting, because MacOS doesn't scale the UI elements for shit, unless you shell out hundreds more dollars for a 4K monitor. Are you writing code? You're stuck with Xcode, where almost nothing is documented, and if you're authoring code for an iPhone, be careful to check whether your phone is plugged into your Mac or not, because that might change what your build actually builds! Wanna try to connect to your Mac from another Mac? VNC is arguably the most disgusting, least usable way to do anything like this.
You better just stick with SSH, because with VNC, if your coworker is an asshole, they can walk up to your Mac and start sending e-mails as you. But then there's my biggest beef with MacOS: the goddamned mouse. There is absolutely no excuse for the mouse being the only way to do so many things. Wanna move windows around? Go get an add-in, and then watch as some of those add-ins start crashing. Wanna look at menu options? You can hit ctrl-F2, and then explore with the arrow keys, or just grab the mouse. Wanna use an app? You get to learn a whole pile of custom shortcuts for each wacky feature. And let's talk about the mouse: a mouse requires a hand. That hand has 4 fingers and a thumb. But Apple's decided that you should generally use both hands to do anything complex, because more than one button on a mouse is confusing. What's far simpler is using one button on a mouse while holding down 2 buttons on your keyboard. And then there's the absolute, complete, and total inability to run this shitty half-stolen OS in a virtual machine. This is kind of comical, because some of the earliest virtual machine software in existence actually ran on a Macintosh, and it was emulating a Windows machine. It was slow and buggy for technical reasons (emulating x86 on a 68K or PowerPC was pretty bad), but now MacOS runs on pretty standard x86 (though absurdly overpriced) hardware, so VMs should be a piece of cake. But Apple will be damned if they give you a way to run multiple, isolated copies of their precious OS on the same piece of hardware, which would let you roll back the system to a known good state. And let's talk about stability. Apple supports, at any given moment, no more than 20 or 30 hardware configurations (that's probably an exaggeration). I've used a total of 4 different devices in the past 14 months: a 13" MacBook Pro, two 15" MacBook Pros, and an ashtray, er, I mean a Mac Pro. And they all crash or freeze. On a pretty regular basis.
In my house, I have 8 PCs. 5 of them are custom chunks of hardware that I've pieced together over the past 10 years, with all sorts of different motherboards, CPUs, RAM sticks, SSDs & hard drives, monitors, and peripherals. And they crash or freeze very rarely. I've seen perhaps 4 blue screens in the past year, but I've seen at least 10 sad Macs. But Apple's perfect, and it's your fault for wanting to use your computer in a fashion that doesn't aggravate your wrists. Why? Because OS X 10."stupid code name" hates you. This OS sucks.
So look, get off your high horse and own that your crappy OS is just the thing that you're comfortable with, and that yes, there are major parts of your crappy OS that suck, because the people who make your OS just don't give a shit about that thing that actually really matters to a significant minority of people. Why? Because your OS sucks. All OSes suck.
Now don't get me started on these dumb "smart phones".
Disclaimer: I'm sorry that I didn't catalog everything about each & every OS that sucks. There are lots of things that suck about these OSes that I haven't even scratched the surface of. Please, feel free to add your complaints to the comments section. I promise not to care. And to be clear, I know engineers who work on a wide variety of software, some of whom work on software I've disparaged above. Know that I'm not really impugning your skills, folks. I'm not claiming that I can do any better. Your ego can remain intact. I know that making software is hard, and shipping software is even harder. That's one of the reasons why all software sucks. I hold no personal grudges against anyone about any of the stuff above. My personal grudges are reserved for far more trivial, silly things.