mrdatamx Posted June 10, 2016 (edited)

NOTE: This post has been updated to reflect the latest state of this implementation.

Hello mates, I am delighted to share my new successful implementation, after fighting my way through previous eGPU attempts on several Linux distributions, from Ubuntu MATE 14 and 15 to Linux Mint and CentOS 5 and 6. I only documented one of them. I had to share this experience, mostly because I am amazed by what the community behind Ubuntu MATE 16.04 has achieved. So bear with me.

System Specs

- Lenovo T430
- Intel Core i5-3320M at 2.6 GHz
- RAM: 8 GB DDR3L-12800, later upgraded to 16 GB
- Intel HD 4000 (integrated)
- eGPU (in order of upgrades):
  - Zotac GeForce GTX 750 1 GB
  - EVGA GeForce GTX 950 SC+ 2 GB
  - KFA2 GeForce GTX 970 OC Silent "Infin8 Black Edition" 4 GB
- EXP GDC v8.3 Beast ExpressCard adapter
- Seasonic 350 W 80+ Bronze PSU
- Display: internal LCD at 1600x900, plus a Dell UltraSharp 2007FP 20.1" LCD monitor

Procedure:

I prepared the hardware as usual: feeding power to the Beast adapter from the PSU and plugging the adapter into the laptop's ExpressCard slot. The installation of Ubuntu MATE 16.04 is only a couple of weeks old and is loaded only with the full stack of Python and web tools I need. The integrated graphics card uses the stock open-source drivers.

For the eGPU, I was ready to perform the usual steps: disable the nouveau drivers, reboot, switch to runlevel 3, install the CUDA drivers, and so on. But, following advice I read on an Ubuntu/NVIDIA forum, and very sceptical, I simply installed the most recent proprietary drivers for my card. Reboot. Boom! Done.

Even functionality previously not available on Linux now works. As you can see from the last screenshot, the driver now reports which processes are executing on the GPU, something previously reserved for high-end GPUs like Teslas. That screenshot also shows the computation being performed on the GPU while the display is rendered on my laptop's LCD, and it shows how the proprietary driver can now report the GPU temperature along with other useful data.

For those of you into CUDA computing, I can report that CUDA toolkit 7.5 is now available in the Ubuntu repository and installs and performs without any issue. I went from zero to training TensorFlow models on the GPU in 30 minutes or so. Amazing! I could expand this post if anyone needs more info, but it was very easy. Cheers!

After upgrading the GPU twice, my system now handles Doom fairly easily. Here are some benchmark results:

RAM     eGPU      PCIe gen   3DMark 11   3DMark 11 Graphics
8 GB    GTX 750   2          P3996        4095
16 GB   GTX 750   2          P3994        4094
16 GB   GTX 950   1          P5214        7076
16 GB   GTX 950   2          P5249        7709
16 GB   GTX 970   1          P7575       11202
16 GB   GTX 970   2          P8176       12946

The difference between Gen 1 and Gen 2 might not seem significant from the table, but playing Doom there is a difference of around 15 fps on average between the two modes, and it is even more noticeable during intense fights.

Edited August 6, 2016 by mrdatamx Updated title to reflect final state.
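For anyone who wants to check the same things I described above, these are roughly the commands involved; the package name is the one from the Ubuntu 16.04 repository, and the TensorFlow one-liner assumes the version I had at the time, so treat this as a sketch rather than an exact transcript:

    # Confirm the eGPU is visible on the PCIe bus
    lspci | grep -i nvidia

    # Per-process GPU usage and temperature, as in my screenshots
    nvidia-smi

    # CUDA toolkit 7.5 straight from the Ubuntu repository
    sudo apt-get install nvidia-cuda-toolkit
    nvcc --version

    # Quick TensorFlow check: creating a session with device placement
    # logging prints the GPU it found
    python -c "import tensorflow as tf; tf.Session(config=tf.ConfigProto(log_device_placement=True))"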
Draekris Posted July 23, 2016

Wow, this is a really nice setup! Do you know if you get the PCIe compression benefits of Optimus that the Windows drivers have?
jowos Posted July 25, 2016

Did you do anything like Tech Inferno Fan's DIY eGPU Setup 1.30?

Sent from my iPhone using Tapatalk
TheReturningVoid Posted July 25, 2016

Nice setup! I'm planning to do a setup on an Arch Linux system myself soon. Do you still need Bumblebee for this to work? I'd assume it's unnecessary, because the new card has the power to handle everything.
mrdatamx Posted August 1, 2016 Author

Thank you for your messages, and sorry pals, I was not paying attention to this forum. This setup has been super stable. In fact, I have since increased the RAM to 16 GB and upgraded the eGPU to an EVGA GTX 950 SC+ without any problem and with the same benefits. This laptop has seen a good amount of gaming (just Doom and The Sims 3) with PCIe 2.0 enabled, and everything is rock solid! Right now I cannot perform any tests, as I have sold the GPUs, but I should get an upgrade in two days.

Now to answer your questions.

On 23/07/2016 at 8:16 PM, Draekris said:
Wow, this is a really nice setup! Do you know if you get the PCIe compression benefits of Optimus that the Windows drivers have?

Actually, I did not test the compression. I was just amazed that I was able to use the eGPU to drive the internal LCD without doing anything special.

On 25/07/2016 at 3:54 AM, jowos said:
Did you do anything like Tech Inferno Fan's DIY eGPU Setup 1.30?

I did not use any extra software. Everything was plug and play.

On 25/07/2016 at 7:36 AM, TheReturningVoid said:
Nice setup! I'm planning to do a setup on an Arch Linux system myself soon. Do you still need Bumblebee for this to work?

Good luck with your setup! By the way, I was an Arch user 4 years ago. Sweet distro. About Bumblebee: no, I did not need it. The NVIDIA drivers handle all the work by themselves. The "NVIDIA X Server Settings" application lets you select the card you want to use; then just a quick log out and log in and you are running on the selected card.

One thing to notice here: if you select "NVIDIA (Performance Mode)", the laptop panel is driven and accelerated by the eGPU; you can plug in an external panel if you want, and both are accelerated. If you select "Intel (Power Saving Mode)", the eGPU gets disabled as if no eGPU were installed; even the drivers appear as if they were not installed. So if you intend to dedicate the eGPU completely to CUDA/OpenCL computing, this is the procedure I follow (see the xorg.conf sketch below):

1. Select "NVIDIA (Performance Mode)".
2. Edit your xorg.conf to switch the active screen from "nvidia" to "intel".
3. Log out and log in again.

This is the only way I have found to free the eGPU from being "distracted" by drawing the desktop. Then if you want to do some gaming, just edit your xorg.conf again.
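For reference, the switch in step 2 amounts to changing which Screen the ServerLayout points at. A rough sketch of the relevant xorg.conf sections, from memory; your BusID values and the Intel driver name may differ, so check yours with lspci:

    Section "ServerLayout"
        Identifier "layout"
        # Point this at "intel" to free the eGPU for pure CUDA/OpenCL work,
        # or at "nvidia" to have the eGPU drive the desktop
        Screen 0 "intel"
    EndSection

    Section "Device"
        Identifier "intel-device"
        Driver     "modesetting"     # the integrated HD 4000
        BusID      "PCI:0:2:0"
    EndSection

    Section "Screen"
        Identifier "intel"
        Device     "intel-device"
    EndSection

    Section "Device"
        Identifier "nvidia-device"
        Driver     "nvidia"
        BusID      "PCI:2:0:0"       # the eGPU; this address will differ
    EndSection

    Section "Screen"
        Identifier "nvidia"
        Device     "nvidia-device"
    EndSection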
jowos Posted August 1, 2016

Nice one! Are you playing those titles on Linux?

Sent from my iPhone using Tapatalk
mrdatamx Posted August 2, 2016 Author

@jowos I play Doom on Windows, my wife plays The Sims 3 on Windows. On Linux, I was playing OpenArena, Quake 3 and Xonotic. But I'm so into Doom right now, I stopped playing anything else.
ish Posted August 2, 2016

Thanks a lot for your post. I am also trying to build a similar configuration using a rMBP (early 2015) + GTX 1070 + AKitio Thunder2 + internal display + Ubuntu 16.04. It would be very helpful if you could detail the complete setup process. Thank you very much for your help.
jowos Posted August 2, 2016

mrdatamx said:
@jowos I play Doom on Windows, my wife plays The Sims 3 on Windows. On Linux, I was playing OpenArena, Quake 3 and Xonotic. But I'm so into Doom right now, I stopped playing anything else.

So while you played Doom, were you using an external monitor plugged into the eGPU? Or the internal monitor? I would be very curious to know your configuration if it's the latter.

Sent from my iPhone using Tapatalk
mrdatamx Posted August 2, 2016 Author

5 hours ago, ish said:
Thanks a lot for your post. I am also trying to build a similar configuration using a rMBP (early 2015) + GTX 1070 + AKitio Thunder2 + internal display + Ubuntu 16.04. It would be very helpful if you could detail the complete setup process.

Actually, the full setup was very simple for this implementation:

1. With every device off, plug the Seasonic 350 W 80+ Bronze PSU into the EXP GDC Beast adapter.
2. Physically install one of the cards (Zotac GeForce GTX 750 1 GB or EVGA GTX 950 SC+) on the Beast adapter.
3. Provide power to the GPU (only the 950 needed this); I use a power cable from the Beast to the card's PCIe power connector.
4. Plug the ExpressCard adapter into the Lenovo T430.
5. Turn on the T430; this sends a power-on signal to the Beast/eGPU, so they turn on too.
6. On the OS selection screen, before booting into Ubuntu 16.04, I manually added two kernel flags: pci=nocrs and pci=realloc. These were later added permanently to grub (see the sketch after this list).
7. Check that the eGPU is recognised: in a terminal, I executed lspci, and my device was listed. After two minutes Ubuntu had enabled the Nouveau drivers. No problem.
8. Open the "Additional Drivers" application in the configuration and select the proprietary, tested drivers (shown in the first screenshot of my original post). Wait for the installation to finish.
9. Reboot. Done.

The Nouveau drivers get disabled by the automated setup, and they have not given me any problems. Note that if I am using an external LCD monitor, I need to boot the system with it unplugged; then, once I have started a session in the OS, I plug in the monitor and enable it.
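To make those two kernel flags permanent, this is roughly what I did; the file and commands are the standard Ubuntu ones, so treat it as a sketch:

    # One-time test: press 'e' at the grub menu and append the flags to the
    # line starting with 'linux'. To make them permanent instead:
    sudo nano /etc/default/grub

    # Edit this line so it includes both flags, e.g.:
    # GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pci=nocrs pci=realloc"

    # Regenerate the grub configuration and reboot
    sudo update-grub
    sudo reboot

    # Afterwards, confirm the eGPU is on the bus
    lspci | grep -i nvidia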
mrdatamx Posted August 2, 2016 Author (edited)

On 02/08/2016 at 10:47 AM, jowos said:
So while you played Doom, were you using an external monitor plugged into the eGPU? Or the internal monitor? I would be very curious to know your configuration if it's the latter.

With this setup on Windows I have played Doom using either the external monitor only or the internal one only. I have found the compression artifacts to be severe when using the internal LCD, so I prefer the external monitor for Doom.

Now on the performance side. With the GTX 950, I used the "optimised" settings recommended by GeForce Experience (mostly Medium quality). I was delighted to game at 900p with 60+ fps in light-action scenes, dropping to the low 50s in only one of the heavy-action parts of the demo at Nightmare difficulty. With the GTX 970 described in my first post, I get to enjoy Doom at 1600x1200 on High quality settings at 90+ fps. GeForce Experience recommended Ultra settings, but using them made my setup unstable and the NVIDIA drivers started crashing, so I am back to High quality.

Edited August 4, 2016 by mrdatamx Added screenshots.