Don’t do it! Don’t succumb to the fantasy of that warm and fuzzy upgrade.
I’m a software developer; more specifically, an embedded software developer. For those who don’t know what “embedded” means, it’s software that runs hidden inside a machine, often with no obvious interface to the user. The world is filled with machines running embedded software. If it’s done well, the software is invisible.
Reliability is key for embedded software. It doesn’t have a human interfacing with it to debug problems or help recover the system. Often, it doesn’t have an Internet or network connection that allows upgrades. Hence, changes to embedded software should be made carefully, and only when the benefit of the changes exceeds the risk.
The same is true of my development platform. If I have a computer with development tools (compiler, debugger, source control, etc.) that are working, why upgrade? I use my computer for both play and work, so a new feature could be for play, but there has to be some reason to take that risk.
If it’s not broke, don’t fix it.
I’m developing on a Mac Book Pro. Much of my embedded software runs in a Linux environment, and many development tools are targeted at Linux. Windows is also often supported, and to a lesser extent, Mac OS, but because my target environment is Linux, there’s a definite advantage to using the Linux OS. No problem, I’ll run Linux in a virtual machine. VMware Fusion is the common VM used on Mac OS. It works well, although it does have its idiosyncrasies. So I ran Ubuntu 14.04 LTS in Fusion, on Mac OS El Capitan. It worked fine.
My current project is building a Linux file system using Yocto. I’m not a fan of Yocto. It takes 6 hours to build an image, and consumes almost 40 GB of storage, to get a 500 MB image. Build failures are common if remote repositories are unavailable. A lot of data is downloaded from the Internet; don’t even think about doing the build on any sort of metered connection.
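For those who haven’t met Yocto, the whole thing boils down to a couple of commands that kick off hours of fetching and compiling. A minimal sketch (the image name here is a generic example, not my actual project’s):

```
# Minimal sketch of a Yocto build; the image name is a generic example,
# not my actual project's. Expect hours of build time and tens of GB of disk.
git clone git://git.yoctoproject.org/poky.git
cd poky
source oe-init-build-env build   # creates the build dir and sets up the environment
bitbake core-image-minimal       # fetches sources from the Internet, then compiles
```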
This environment uses a bash script to detect when the target board is connected via a USB OTG (On-The-Go) port, then copy the image to that target. This worked fine using Mac OS El Capitan and the then-current VMware Fusion.
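The script isn’t exotic. A stripped-down sketch of the idea, with a hypothetical USB vendor:product ID and flashing tool standing in for the real, board-specific ones:

```
#!/bin/bash
# Sketch of the detect-and-flash idea. The USB ID (1234:5678) and the
# flashing command are hypothetical stand-ins for the board-specific ones.
IMAGE="$1"

echo "Waiting for target board on the USB OTG port..."
until lsusb | grep -q "1234:5678"; do   # poll for the board's USB ID
    sleep 1
done

echo "Target detected; flashing ${IMAGE}..."
flash-tool --image "${IMAGE}"           # placeholder for the actual flashing command
```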
Then I foolishly upgraded to Mac OS Sierra. There was no new feature in Sierra that I wanted. I’d like the ability to lock my laptop without putting the OS to sleep, so it could compile unattended for hours (I’m looking at you, Yocto); strangely, Mac OS lacks that simple, handy feature, and Sierra didn’t add it. It offers Siri, which I don’t want (for resource and privacy reasons), and it doesn’t offer the improved Apple File System, which I would want, as it should better protect against file corruption. So why upgrade? Because I’m surrounded by Apple fans who get excited about anything Cupertino does, and I let it affect me. I installed Sierra.
Oh oh, I broke it.
At first there were no obvious problems. I could still do everything I wanted in Mac OS, and I didn’t notice any problems doing builds in Linux under VMware, until one day I tried to flash the Linux image, the one that took 6 hours to build, to the target. I could not get the script to detect the target when I connected it. I tried running as root, changing when I connected the OTG port, and rebooting at different times, but nothing I did helped.
It used to work, so what changed? This is when you start thinking about what you’ve changed in your system since the last time you succeeded at doing this. Was I just doing something different, and wrong, or did my system change in a way that broke things? I found I could flash the image from a native Linux installation (Ubuntu 14.04 LTS), from Windows, and, with the additional tools installed, from Mac OS Sierra itself. It worked everywhere except under Ubuntu running in VMware Fusion, which had worked previously. This convinced me it was the upgrade to Sierra, and maybe some incompatibility with VMware, that broke some Linux command, preventing the flashing of the image.
A New Approach
That’s really the end of this story. I upgraded, with no real upside, and broke things. Don’t do that, not on a development machine that uses a lot of varied tools, and hence can be quite fragile, and that you need working to do your job. If you’re interested in my solution to the problem, read on, but it doesn’t change the message.
I decided building in a VM wasn’t the ideal solution. A VM inherently doesn’t have access to all the system’s resources. My MacBook Pro has 4 independent cores, each of which has 2 logical cores. Since my heavy-duty work is done in the VM, and Mac OS is primarily a GUI with office applications for communicating with, you know, phone sanitizers, I allocated 6 cores to the VM, and the majority of the RAM. But if I could build the system natively, I’d get to use all the resources that aren’t tied up by other applications.
So I decided to install Ubuntu natively on my MacBook. This isn’t for the faint of heart, but Ubuntu (and Linux in general) continues to get better at detecting and working with hardware, even as computer manufacturers seem to have abandoned supporting Linux in any meaningful way. I went with Ubuntu 16.04 LTS, as it has better hardware support for my MacBook. Note that this introduces a compatibility problem with the Yocto image I was building, so for now I still need a VM to build that image, but we’re working on upgrading Yocto and removing that problem. Soon, I hope to be able to use all my computer’s hardware to build my projects, and not use a VM.
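When that day comes, turning the extra cores loose is just a matter of two standard settings in Yocto’s build/conf/local.conf (the value 8 matches my machine’s logical core count):

```
# In build/conf/local.conf: standard Yocto knobs for build parallelism.
# The value 8 matches the logical core count of my MacBook Pro.
BB_NUMBER_THREADS = "8"   # how many BitBake tasks run in parallel
PARALLEL_MAKE = "-j 8"    # passed to make within each task
```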
I wasn’t ready to abandon a commercial OS completely, so I set up a dual-boot system, allowing me to boot into Mac OS if needed, and as I sorted out the kinks in my Linux environment, I did need it. The open source LibreOffice doesn’t work that well with Microsoft documents (I get a lot of crashes), and join.me, our meeting application, has no web-based tool and no Linux option. But I’ve been able to find replacement Linux applications for some things, and I run a VM for the other applications as needed. So I still have that VM issue, but now it’s running applications that don’t take much processing power, and I can shut it down when needed and dedicate all the hardware to my software builds.
Aftermath
Going native Linux isn’t for everybody, or really, for hardly anybody. It’s still a niche because most of the applications used today are still commercial applications running on commercial OSs. The point here isn’t to go Linux; that was just my solution. The point is to think before upgrading. Even on non-development systems, don’t upgrade unless there’s a good reason. You can decide what constitutes “good”, but give it some thought, and do remember that new software brings with it new bugs and new vulnerabilities. There’s a downside to upgrading. Make sure the upside outweighs it before you click that button.