
14" Lenovo T430 + GTX970@4Gbps-EC2 (EXP GDC v8.3) + Dual Boot internal LCD [mrdatamx]



NOTE: This post has been updated to reflect the latest state of this implementation...

 

Hello mates,

I am delighted to share a bit of my new successful implementation...

 

After fighting my way through previous eGPU implementations on several Linux distributions (from Ubuntu MATE 14 and 15 to Linux Mint and CentOS 5 and 6), I only documented one of them.

I had to share this experience, mostly because I am amazed by what the community behind Ubuntu Mate 16.04 has achieved. So bear with me.

 

System Specs

Lenovo T430

Intel Core i5-3320M at 2.6 GHz

8 GB DDR3L PC3L-12800 (initial)

16 GB DDR3L PC3L-12800 (after the RAM upgrade)

Intel HD 4000

eGPU:

Zotac GeForce GTX 750 1GB

EVGA GeForce GTX 950 SC+ 2GB

KFA2 GeForce GTX 970 OC Silent "Infin8 Black Edition" 4GB

EXP GDC v8.3 Beast Express Card

Seasonic 350 W 80+ Bronze PSU

Display:

Internal LCD 1600x900

Dell UltraSharp 2007FP - 20.1" LCD Monitor

 

Procedure:

I prepared the hardware as usual: feeding power to the Beast adapter from the PSU, seating the GPU on the adapter, and plugging the adapter into the laptop's ExpressCard slot.

 

The Ubuntu MATE 16.04 installation used here is only a couple of weeks old and is loaded only with the full stack of Python and web tools I need.

For the integrated graphics card, the stock open-source drivers are used. For the eGPU I was ready to perform the usual steps: disable the nouveau driver, reboot, switch to runlevel 3, install the CUDA drivers, and so on. But...

Following advice read on an Ubuntu/NVIDIA forum, and very sceptical, I installed the most recent proprietary drivers for my card. Reboot. Boom! Done. Even functionality previously unavailable on Linux now works...

Screenshot at 2016-06-10 21:44:03.png

Screenshot at 2016-06-10 21:47:31.png

As you can see from the last screenshot, the driver now reports which processes are running on the GPU, something previously reserved for high-end GPUs like the Tesla line.

That screenshot is also evidence that the computation runs on the GPU while the display is rendered on my laptop's internal LCD.

 

Screenshot at 2016-06-10 21:55:18.png

This screenshot also shows how the proprietary driver can now report the GPU temperature as well as other useful data.

For those of you into CUDA computing: CUDA Toolkit 7.5 is now available in the Ubuntu repositories, and it also installs and performs without any issue. I went from zero to training TensorFlow models on the GPU in 30 minutes or so. Amazing!
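For readers who want a quick sanity check before going further, here is a minimal sketch of my own (not part of the original setup) that asks the dynamic linker whether a CUDA runtime library is visible at all. It uses only the Python standard library and simply reports False on a machine without CUDA:

```python
import ctypes.util

def cuda_runtime_present():
    """Return True if the dynamic linker can locate the CUDA runtime (libcudart)."""
    return ctypes.util.find_library("cudart") is not None

print("CUDA runtime visible:", cuda_runtime_present())
```

On a setup like the one above this should report True once the driver and CUDA Toolkit packages are installed; False merely means the runtime is not on the linker's search path.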

I could expand this post if anyone needs more info, but it was very easy.

Cheers!

 

After upgrading the GPU twice, my system now handles Doom fairly easily. Here are some benchmark results.

 

RAM     eGPU      PCIe gen   3DMark 11 (overall)   3DMark 11 (graphics)
8 GB    GTX 750   2          P3996                  4095
16 GB   GTX 750   2          P3994                  4094
16 GB   GTX 950   1          P5214                  7076
16 GB   GTX 950   2          P5249                  7709
16 GB   GTX 970   1          P7575                 11202
16 GB   GTX 970   2          P8176                 12946


Now, the difference between Gen 1 and Gen 2 might not seem significant from the results in the table. But playing Doom there is a difference of around 15 fps on average between the two modes, and the gap is even more noticeable during intense fights.
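As a back-of-envelope check on why Gen 2 helps, here is a small sketch of the theoretical one-lane (ExpressCard) bandwidth. Both PCIe 1.x and 2.0 use 8b/10b encoding, so 20% of the line rate is overhead; the 4 Gbit/s payload rate of Gen 2 is where the "4Gbps" in the title comes from:

```python
def x1_bandwidth_mb_s(line_rate_gt_s):
    """Theoretical payload bandwidth of a single PCIe lane, in MB/s."""
    # One transfer carries one raw bit per lane; 8b/10b keeps 8 of every 10 bits.
    payload_gbit_s = line_rate_gt_s * 8 / 10
    return payload_gbit_s * 1000 / 8  # Gbit/s -> MB/s

print(x1_bandwidth_mb_s(2.5))  # PCIe 1.x x1: 250.0 MB/s
print(x1_bandwidth_mb_s(5.0))  # PCIe 2.0 x1: 500.0 MB/s
```

So Gen 2 doubles the link's payload bandwidth, which matters most in bandwidth-bound moments like heavy fights, even when average benchmark scores move less.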

 

Edited by mrdatamx
Updated title to reflect final state.

  • Tech Inferno Fan changed the title to 14" Lenovo T430 + GTX750@4Gbps-EC2 (EXP GDC v8.3) + Linux internal LCD [mrdatamx]

Thank you for your messages, and sorry pals, I was not paying attention to this forum. This setup has been super stable. In fact, I have since increased the RAM to 16 GB and upgraded the eGPU to an EVGA GTX 950 SC+ without any problem, keeping the same benefits. This laptop has seen a good amount of gaming (just DOOM and The Sims 3) with PCIe 2.0 enabled, and everything is rock solid!

 

Right now I cannot run any tests, as I have sold the GPUs, but I should receive an upgrade in two days.

 

Now to answer your questions.

 

On 23/07/2016 at 8:16 PM, Draekris said:

Wow, this is a really nice setup!  Do you know if you get the pcie compression benefits of Optimus that the Windows drivers have?

Actually, I did not test the compression. I was just amazed I was able to use the eGPU to drive the internal LCD without doing anything special.

 

On 25/07/2016 at 3:54 AM, jowos said:

Did you do anything like Tech Inferno Fan's diy gpu setup 1.30?

Sent from my iPhone using Tapatalk


I did not use any extra software. Everything was plug and play.

 

On 25/07/2016 at 7:36 AM, TheReturningVoid said:

Nice setup! I'm planning to do a setup on an Arch Linux system myself soon. Do you still need bumblebee for this to work? I'd assume it's unnecessary, because the new card has the power to handle everything :P

 

Good luck with your setup! By the way, I was an Arch user 4 years ago. Sweet distro. About Bumblebee: no, I did not need it. The NVIDIA drivers handle all the work by themselves. The "NVIDIA X Server Settings" application lets you select the card you want to use; then a quick log out and log in, and you are running on the selected card.

One thing to note here:

  • If you select "NVIDIA (Performance Mode)", the laptop panel is driven and accelerated by the eGPU; you can also plug in an external panel, and both are accelerated.
  • If you select "Intel (Power Saving Mode)", the eGPU is disabled as if none were installed; even the drivers appear as if they were not installed.

Screenshot at 2016-08-01 16:36:42.png 

 

So if you intend to dedicate the eGPU entirely to CUDA/OpenCL computing, this is the procedure I follow:

  1. Select "NVIDIA (Performance Mode)"
  2. Edit your "xorg.conf" to switch the "active" screen from "Nvidia" to "intel"
  3. Log out and Log in again.

This is the only way I have found to keep the eGPU from being "distracted" by drawing the desktop. If you then want to do some gaming, just edit your "xorg.conf" again.
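For reference, a sketch of the xorg.conf pieces involved in step 2 above. The identifiers and BusIDs are illustrative only (they differ per system; check yours with lspci); the Device name referenced by the Screen section is what gets switched:

```
Section "Device"
    Identifier "intel"
    Driver     "modesetting"
    BusID      "PCI:0:2:0"      # integrated HD 4000 (illustrative)
EndSection

Section "Device"
    Identifier "nvidia"
    Driver     "nvidia"
    BusID      "PCI:1:0:0"      # eGPU behind the ExpressCard slot (illustrative)
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "intel"          # switch between "intel" and "nvidia" here
EndSection
```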

 


Thanks a lot for your post. I am also trying to build a similar configuration using rMBP (early 2015) + GTX 1070 + AKitio Thunder2 + Internal display + Ubuntu 16.04. It would be very helpful if you could detail out the complete setup process. Thank you very much for your help.


@jowos I play Doom on Windows, my wife plays The Sims 3 on Windows. On Linux, I was playing OpenArena, Quake 3 and Xonotic. But I'm so into Doom right now, I stopped playing anything else.

So while you played Doom, were you using an external monitor plugged on eGPU? Or on the internal monitor? I would be very curious to know your configuration if it's the latter.

Sent from my iPhone using Tapatalk


5 hours ago, ish said:

Thanks a lot for your post. I am also trying to build a similar configuration using rMBP (early 2015) + GTX 1070 + AKitio Thunder2 + Internal display + Ubuntu 16.04. It would be very helpful if you could detail out the complete setup process. Thank you very much for your help.

 
 

Actually, the full setup was very simple for this implementation:

  1. Start with every device powered off.

  2. Plug the "Seasonic 350 W 80+ Bronze" PSU into the "EXP GDC Beast" adapter

  3. Physically install one of the cards (Zotac Geforce GTX 750 1GB or EVGA GTX 950 SC+) on the "Beast adapter"

  4. Provide power to the GPU (only the 950 needed this). I use a power cable from the Beast to the card's PCIe power connector

  5. Plug the ExpressCard adapter into the Lenovo T430.

  6. Turn on the T430; this sends a power-on signal to the Beast/eGPU, so they turn on too.

  7. On the OS selection screen, before booting into Ubuntu 16.04, I manually added two kernel boot flags: pci=nocrs pci=realloc. These were later added permanently to GRUB

  8. Check that the eGPU was recognised: in a terminal I executed lspci, and my device was listed.

  9. After two minutes Ubuntu had enabled the Nouveau drivers. No problem

  10. Open the "Additional Drivers" application in the system settings. I selected the proprietary, tested driver (shown in the first screenshot of my original post). Wait for the installation to finish.

  11. Reboot

  12. Done

The Nouveau driver is disabled by the automated setup and has not given me any problems.
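To make the flags from step 7 permanent, the usual Ubuntu route is to append them to the default kernel command line in /etc/default/grub and regenerate the boot config. A sketch (your existing flags, e.g. quiet splash, may differ; keep them and append):

```
# /etc/default/grub (excerpt)
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash pci=nocrs pci=realloc"
```

Then run sudo update-grub and reboot.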

 

Now, if I am using an external LCD monitor, I need to boot the system with it unplugged. Then, once I have started a session in the OS, I plug in the monitor and enable it.


On 02/08/2016 at 10:47 AM, jowos said:

So while you played Doom, were you using an external monitor plugged on eGPU? Or on the internal monitor? I would be very curious to know your configuration if it's the latter.

Sent from my iPhone using Tapatalk

 
 
 

With this setup on Windows I have played Doom using either the external monitor only or the internal one only. I have found the texture compression to be severe when using the internal LCD, so I prefer the external monitor for Doom.

Now on the performance side.

evga_gtx950-SC+ACX2.0.gif

With the GTX 950, I used the "optimised" settings recommended by the NVIDIA GeForce Experience application (mostly Medium quality settings). I was delighted to game at 900p resolution with 60+ fps in light-action scenes, dropping to the low 50s in ONLY ONE of the heavy-action parts of the demo at Nightmare difficulty.

 

Using the GTX 970 described in my first post, I get to enjoy Doom at 1600x1200 on High quality settings at 90+ fps. GeForce Experience recommended Ultra settings, but using them made my setup unstable (the NVIDIA drivers started crashing), so I am back to High quality.

DOOMx64_2016_08_04_16_29_02_277.jpg

Edited by mrdatamx
Added screenshots.