eGPU with macOS: How useful is one, really?

by Alex Wulff, September 1st, 2018

The biggest gripe I hear about Apple’s laptops concerns their graphics processing ability. Apple’s highest-end laptop, the 2018 MacBook Pro 15”, ships with a Radeon Pro 560X. This is fine for everything up to basic gaming, but it’s well out of its depth in rendering and serious graphics work. A comparably powerful GTX 1050 sells for around $150. Considering the $2700+ price tag of that MacBook, one would imagine Apple could pack in a more powerful graphics processing unit.

A direct comparison to similar Windows machines or to off-the-shelf cards isn’t entirely fair, though. Apple is at heart a systems-integration company, and it did an excellent job wiring the 560X into the Apple-written drivers that run on the MacBook, so graphics resources are managed far more effectively than on other operating systems. Unfortunately, this integration does not extend to third-party programs: very few apps’ rendering engines use Metal, Apple’s preferred graphics framework. So while you get incredible fluidity in basic interactions even while connected to a few 5K displays, trying to render a video or play a demanding game might not go over so well.

Apple is not shortsighted. It recognized this deficiency and, in macOS High Sierra 10.13.4, added support for AMD external graphics processing units (eGPUs). All of this is made possible by Thunderbolt 3 (TB3): its 40 Gbps of bandwidth means the CPU can push as much data to the GPU as it needs without throttling, and the GPU can even send its results back to your laptop’s screen.
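
As a rough illustration of what this looks like from an app’s point of view, the short Python sketch below lists every Metal device macOS exposes and flags which one is external. It assumes the third-party PyObjC bridge to the Metal framework (pyobjc-framework-Metal) is installed; nothing here is required for normal eGPU use.

# List the Metal devices macOS exposes to applications and flag the eGPU.
# Sketch only: assumes macOS 10.13+ and the third-party PyObjC bridge
# (pip install pyobjc-framework-Metal).
import Metal

for device in Metal.MTLCopyAllDevices():
    if device.isRemovable():
        kind = "eGPU"
    elif device.isLowPower():
        kind = "integrated GPU"
    else:
        kind = "discrete GPU"
    print(f"{device.name()} -> {kind}")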

Initial eGPU support on macOS was huge news, but many noticed that Apple said nothing about the more popular NVIDIA GPUs. At the time of writing, NVIDIA cards are not officially supported by Apple, but NVIDIA has released beta web drivers that allow its cards to operate under macOS. Since many programs support CUDA, NVIDIA’s GPU-accelerated compute platform, I wanted to see whether I could get an NVIDIA eGPU to work with my 2016 MacBook Pro 15”.
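
If you go down the NVIDIA road, it helps to be able to tell at a glance whether the web driver actually loaded. The sketch below simply shells out to macOS’s kextstat and filters for NVIDIA kernel extensions; the exact bundle identifiers vary between driver releases, so it matches loosely rather than naming any specific kext.

# Rough check for whether any NVIDIA kernel extensions are loaded.
# Sketch only: filters kextstat's output case-insensitively instead of
# matching specific bundle identifiers, which vary by driver release.
import subprocess

kexts = subprocess.run(["kextstat"], capture_output=True, text=True).stdout
nvidia = [line for line in kexts.splitlines() if "nvidia" in line.lower()]

if nvidia:
    print("NVIDIA kexts loaded:")
    print("\n".join(nvidia))
else:
    print("No NVIDIA kexts loaded; the web driver isn't active.")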

The first decision I had to make when purchasing my eGPU was between an integrated enclosure and a generic enclosure into which you insert your own graphics card. Integrated enclosures contain everything needed to function: the power supply, graphics card, Thunderbolt controller, and display outputs. They are also much smaller, since their designers can arrange the components for maximum space efficiency. Generic enclosures instead offer a large empty slot for a full-sized graphics card, along with supporting hardware such as the power supply. My final decision came down to the Sonnet eGFX Breakaway Box (650W model) paired with a GTX 1080, or the Gigabyte Aorus GTX 1080 Gaming Box. Both deliver 100W of power to my MacBook via USB Power Delivery (PD), enough to charge it quickly and keep it powered during intense rendering and gaming.
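
If you want to verify the charging claim yourself, one quick way is to ask system_profiler what the connected charger is rated for. The sketch below just scans its plain-text power report for the wattage line; it’s a convenience check, not something the enclosures require.

# Print the wattage of whatever is currently charging the laptop,
# e.g. the 100W an eGPU enclosure supplies over Thunderbolt 3.
# Sketch only: scans system_profiler's plain-text power report.
import subprocess

report = subprocess.run(
    ["system_profiler", "SPPowerDataType"],
    capture_output=True, text=True,
).stdout

for line in report.splitlines():
    if "Wattage" in line:
        print(line.strip())  # e.g. "Wattage (W): 100"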

I eventually decided on the Gigabyte Aorus GTX 1080 Gaming Box for a number of reasons. First, it looks sleek and is significantly smaller than most other eGPUs (it’s about the size of a Wii). It also includes four full-size USB-A ports, which are handy since my MacBook only has USB-C ports. Lastly, it’s much cheaper. I ordered it, and with two-day Amazon Prime shipping it soon appeared at my door.

My first order of business was to downgrade from the macOS 10.14 Mojave beta back to High Sierra (10.13), since NVIDIA does not yet have drivers that support 10.14. I then used the macOS-eGPU script to automatically download and install the NVIDIA web drivers. After a day of fiddling, it eventually kind of worked. For my computer to actually recognize the eGPU, I had to go through a convoluted routine: restart the machine, plug in the eGPU, log out, then log back in, and only then would the GTX 1080 finally appear. When driving an external display through the eGPU, I also had to use complicated scripts to disable the discrete AMD Radeon Pro 450 in my MacBook, otherwise both displays’ refresh rates slowed to a crawl. On top of that, I had no way of disconnecting the eGPU without shutting down my computer, which was a huge hassle. All in all, I was very dissatisfied with this hacky setup. I’m a college student, so my laptop frequently leaves my desk; if I had to go through all of this every time I plugged my computer back in, I would lose my mind.

I promptly returned the GTX 1080 model and instead purchased Aorus’ Radeon RX 580 Gaming Box. It has a much less powerful graphics processor, but AMD’s Radeon series is officially supported by Apple, and it is over $250 cheaper. Once it arrived, I connected the DisplayPort cable of my 4K display to one of the three DisplayPort outputs on the back of the device, connected the power cable, and plugged the Thunderbolt 3 cable into my MacBook. Within a second my external display flashed to life, the eGPU icon appeared in my menu bar, and my MacBook started charging at 100W. This is truly plug-and-play at its best and an impressive display of Apple’s systems-integration ability: no drivers to install, no complicated wiring, just one cable into the computer and that’s it. When I want to take my laptop somewhere, all I have to do is press the “disconnect” button, and reconnecting is as simple as plugging the cable back in. I don’t even need a charger, a USB-C to USB-A adapter, or a USB-C to DisplayPort adapter at my desk, because the Gaming Box covers all of those roles.

Now came the fun part: testing. The discrete GPU in my computer, an AMD Radeon Pro 450, is seriously underpowered, so I was excited to try things like real-time rendering and gaming that I couldn’t do before. These boxes are designed to output to a monitor rather than send frames back to the laptop, so I ran everything on my external monitor (driven by the eGPU). If you want to use the eGPU without an external display, it is possible through the set-eGPU script, but this is not officially supported by Apple and is quite a bit less reliable.

The first thing I did was run some benchmarks. Unigine Valley is a nice-looking one: at 1920x1080 with 2x anti-aliasing and Ultra graphics it scored 1944, which is about what one can expect from an RX 580. The same configuration running on my laptop’s internal Radeon Pro 450 scored 590. I used Activity Monitor’s GPU-usage graphs to make sure the RX 580 was actually being used for the first test. To pull these up, open Activity Monitor and press ⌘4. If your computer has a discrete GPU (dGPU), you will see three separate panes with the eGPU connected; other machines will show just the eGPU and the Intel integrated GPU (iGPU). If the eGPU is being utilized correctly, the graph will show a spike in activity on that card.
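
If you would rather script this check than eyeball Activity Monitor, the sketch below parses system_profiler’s plist output and lists every graphics card macOS currently recognizes; the RX 580 should appear alongside the internal GPUs whenever the enclosure is connected. The key names are simply whatever system_profiler emits, so treat this as a sketch rather than a stable API.

# List every graphics card macOS currently recognizes; the eGPU should
# show up next to the integrated and discrete internal GPUs.
# Sketch only: relies on system_profiler's XML (plist) output format.
import plistlib
import subprocess

raw = subprocess.run(
    ["system_profiler", "SPDisplaysDataType", "-xml"],
    capture_output=True,
).stdout

for gpu in plistlib.loads(raw)[0]["_items"]:
    # "sppci_model" usually holds the marketing name, e.g. "Radeon RX 580".
    print(gpu.get("sppci_model", gpu.get("_name", "unknown GPU")))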

Benchmarks are all well and good, but the more interesting tests involve real-life performance. I next wanted to see whether I got any improvement in Blender, an animation and game-design tool, starting with the BMW benchmarking scene. Recent versions of Blender let Cycles render with OpenCL, a GPU-accelerated compute framework supported on AMD cards. I rendered the BMW scene in 7:15 using Cycles GPU rendering with OpenCL; for comparison, it took 10:25 with Cycles’ traditional CPU rendering. AMD has also released a proprietary renderer called ProRender that makes better use of AMD GPUs, but scenes need to be optimized for it, and I didn’t want to go about optimizing the BMW benchmark.
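
For anyone who wants to reproduce the comparison, here is a minimal sketch of how Cycles can be pointed at the eGPU from Blender’s Python console. It assumes the Blender 2.79-era API that was current at the time of writing (the user_preferences path changed in 2.80), and your device list will obviously differ.

# Switch Cycles to GPU rendering over OpenCL (Blender 2.79-era API).
import bpy

prefs = bpy.context.user_preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPENCL"
prefs.get_devices()            # refresh Cycles' device list
for device in prefs.devices:
    device.use = True          # enable the RX 580 (and anything else found)

bpy.context.scene.cycles.device = "GPU"
bpy.ops.render.render(write_still=True)  # render the currently open scene

The same script can also be run headless with Blender’s -b and -P command-line flags if you want to batch CPU and GPU runs back to back.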

My last test was gaming. Fortnite runs notoriously badly on macOS, so I wanted to see whether the eGPU could remedy the situation. Without the eGPU, my computer ran Fortnite at 22 FPS at 1920x1080 on high settings; the eGPU bumped the frame rate up to 45 FPS, a decent improvement.

Conclusion

If you do any kind of work that can benefit from a better graphics processor, then an eGPU is likely worth it for you. NVIDIA eGPUs are a viable option for desktop Macs, since you should only need to go through the setup hassle once in a while. Users who frequently take their laptop with them will likely prefer AMD cards in their eGPUs because of the plug-and-play experience, at least until Apple adds support for NVIDIA hardware. The added convenience of USB-A ports and Power Delivery in enclosures like the Aorus Gaming Box is fantastic: I can come home, plug in one cable, and my workstation is all set up. Just make sure you have some kind of external display; I ran all my tests on a budget-oriented $250 Samsung 4K display, which worked very well.

Apple also officially deprecated OpenGL and OpenCL in macOS 10.14. This is huge news for app developers, since it means that new apps and games should be built with Metal. Metal is deeply integrated with macOS and the eGPU frameworks, so as more apps adopt it, graphics performance should only improve.

Like this article? Check out my Medium page and my website for more of my work.