
eGPU experiences [version 2.0]


Tech Inferno Fan


@cheRRymonk, are you hotplugging the eGPU after the Windows welcome screen, or is it attached from the start? Make sure you get the yellow and green lights when you plug it in, not the red light. Did you run a PCI compaction, or a DSDT override?

Never mind... It was a faulty video card, so I brought it back to the store. I've now bought a GTX 560 Ti and everything is working :) The only question I have left is how to enable x1 2.0 on my Lifebook T900. Is it even possible? I couldn't find any info on its ExpressCard slot speed, and I don't see link-speed settings in the BIOS.

P.S. Just did a quick test playing Alan Wake. I know it's a poor console port, but I was hoping for more than 15-20 fps on low settings =/ I've probably missed something. GPU-Z shows me PCIE x16 @ x1 1.1. How do I know if Optimus is working? I installed the 331.82 WHQL driver.

You may have to enable Gen2 link speed for the port. Since the option isn't in your BIOS, you'll need Setup 1.x. But just to make sure: what's your CPU? Is it an Intel 6-series platform or later?

@coolioboy23 I don't think there are any improvements going from a 34mm to a 54mm ExpressCard slot.


I want to understand the content of the pci.bat files a bit better. Is there a good source where I can read and learn more about them?

http://www.pcisig.com/specifications/pciexpress/technical_library/dev_con_09_02/specifications/pciexpress/PCIExpress_SoftwareandConfigurationModel.pdf

http://egpu.maeth.net/help/compact.txt
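To give a feel for what those documents describe: the pci.bat scripts are largely sequences of setpci register reads and writes against PCI configuration space. As a rough illustration (the helper below is my own, not part of Setup 1.x), here is how the 16-bit Link Status register that a `setpci CAP_EXP+12.W` read returns can be decoded, following the PCIe spec's bit layout:

```python
# Hypothetical helper: decode the 16-bit PCIe Link Status register
# (PCIe capability offset 0x12, what "setpci -s <bus:dev.fn> CAP_EXP+12.W" reads).
# Per the PCIe spec: bits 3:0 = current link speed code, bits 9:4 = link width.
SPEED_CODES = {1: "2.5 GT/s (Gen1)", 2: "5.0 GT/s (Gen2)", 3: "8.0 GT/s (Gen3)"}

def decode_link_status(lnksta):
    speed_code = lnksta & 0xF        # bits 3:0 - negotiated link speed
    width = (lnksta >> 4) & 0x3F     # bits 9:4 - negotiated link width
    return SPEED_CODES.get(speed_code, "reserved"), width

# A value like 0x1011 decodes to a x1 link at Gen1 speed -- the
# "x1 1.1" reading people see in GPU-Z on an ExpressCard eGPU.
speed, width = decode_link_status(0x1011)
print("link: x%d @ %s" % (width, speed))
```

For the 0x1011 example this prints `link: x1 @ 2.5 GT/s (Gen1)`.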


Thanks, especially for the second link.

I've been trying out DIY Setup 1.x tonight and ran into some issues with the PCI compaction. Is it supposed to take a long time to run? I get stuck at "setpci" with just a blinking cursor. I've waited five minutes before shutting off the computer. It doesn't seem to matter which setting I change, it always gets stuck there.

(If it matters, I've got a mid-2011 MacBook Air with 4GB RAM and an i7-2677M, trying to run a GTX 660 through a Sonnet Echo ExpressCard adapter and a PE4L.)


If you are running Setup 1.30 then contact me offline at Tech Inferno [email protected] to discuss.

Quote

You may have to enable Gen2 link speed for the port. Since the option isn't in your BIOS, you'll need Setup 1.x. But just to make sure: what's your CPU? Is it an Intel 6-series platform or later?

It's an Intel i5-520M. Btw, I tested BF3: 30 fps on ultra settings. So... I guess everything works properly. I can switch between High Performance and the integrated graphics in the Nvidia Control Panel; it's currently on Auto-select. Does that mean Optimus is working?


Thank you @Jacobsson, you gave me the best info and cleared all my doubts. You definitely rock, bro. Appreciated. Thank you.

Now the only thing left is to check whether the PE4L v2.1b is easily available in the Indian market, so I could buy it in person, as purchasing it online is too expensive for me.

Hey, and one more thing: will it be okay if I use a 54mm ExpressCard slot instead of a 34mm one in my eGPU setup? Will I get improved performance using a 54mm slot compared to the 34mm slot?

Np!

I will say this now before you jump on the wagon: many times Setup 1.x (pre-boot software for eGPUs) is necessary for the eGPU to be detected correctly.

You get Setup 1.x by donating to @Tech Inferno Fan (it costs $25, if I'm correct?). I'm just mentioning this since it sounds like you might be short on money, and this cost might come up later.

The PE4L 2.1b-EC is 34mm and fits in both 34mm and 54mm slots. There are no performance improvements between the two; only the physical size differs.

Quote

It's an Intel i5-520M. Btw, I tested BF3: 30 fps on ultra settings. So... I guess everything works properly. I can switch between High Performance and the integrated graphics in the Nvidia Control Panel; it's currently on Auto-select. Does that mean Optimus is working?

GPU-Z should show "x16 @ 2.0"; either way you'll need Setup 1.x to enable Gen2 link speed if you can't do it through the BIOS. Also, in the Nvidia Control Panel make sure to set PhysX to your eGPU (I think it's under "Manage 3D settings").


Here is proof that there is no point in buying a GTX 680: it performs similarly to the GTX 670, and there are no statistical or even subjective differences in gameplay, in my opinion.

The GPU is fully loaded by the i7-3630QM, but there is no difference in FPS.

And here you have proof that there is a reason to buy a 35W TDP CPU instead of a 45W TDP one for small laptops.

http://forum.techinferno.com/hp-business-class-notebooks/2537-12-5-hp-elitebook-2570p-owners-lounge-36.html#post77289

FHD, paracel, medium, 64x64. post-10292-14494996766205_thumb.png


HP Elitebook 2570p: Radeon HD6850 eGPU implementation

In order to make a comparison of AMD and Nvidia eGPU setup, I decided to make an HD6850 eGPU implementation. It took me three days of struggle, but finally I've got it working.

The reason for the comparison is that I wanted to check the degree to which both cards are limited by eGPU bandwidth. I've experienced huge FPS drops in a few games on my usual GTX660 eGPU, and it got me thinking whether an AMD GPU would perform better. The great performance achieved with an HD7970 by user sskillz was very promising. Maybe an AMD GPU could somehow overcome the bandwidth limitation problem?

Great thanks to Tech Inferno Fan and sskillz, who helped me successfully install the HD6850 on my notebook.

Of course, using the HD6850 required doing a DSDT override and a PCIe compaction. I used the newest Setup 1.3, because it's much more user-friendly and powerful than the previous version I owned, Setup 1.1.
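For readers wondering what the PCIe compaction step actually does: it repacks the devices' memory windows (BARs) so that the eGPU's large aperture fits into the limited 32-bit address space. A toy sketch of the core packing rule, each power-of-two BAR placed at an address aligned to its own size (the function and values are illustrative, not Setup 1.3 internals):

```python
def compact(bar_sizes, base):
    """Pack power-of-two BARs upward from `base`, each aligned to its own size."""
    addrs, addr = [], base
    for size in bar_sizes:
        addr = (addr + size - 1) & ~(size - 1)  # round up to natural alignment
        addrs.append(addr)
        addr += size
    return addrs

# e.g. a 256 MiB video BAR plus 16 MiB and 1 MiB BARs packed above 0xC0000000
regions = compact([0x10000000, 0x1000000, 0x100000], 0xC0000000)
print([hex(a) for a in regions])
```

Sorting BARs largest-first before packing like this minimizes alignment padding, which is essentially why compaction frees enough room for a desktop GPU's aperture.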

After three days I achieved this:

19850138_hd6850.gif

Looks nice to me :)

First, I made a PCIe bandwidth benchmark using PCIeSpeedTest. The peak values for 20 benchmarks were:

CPU->GPU: 360MB/s

GPU->CPU: 440MB/s

Compared to the GTX660 at x1 2.0, there is no big difference. In fact, depending on how many benchmark runs we do, these numbers come out about the same. So there is a chance that comparing a GTX660 with its AMD equivalent (7870?) would show similar performance.
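Those figures can be sanity-checked against theoretical link bandwidth: a x1 Gen1 link runs at 2.5 GT/s with 8b/10b encoding, giving 250 MB/s usable per direction, and Gen2 doubles that. A quick back-of-the-envelope check (my arithmetic, not a forum tool):

```python
def lane_bandwidth_mb_s(gt_per_s):
    """Usable MB/s per lane per direction, assuming 8b/10b encoding (Gen1/Gen2)."""
    # GT/s -> Mb/s raw, * 8/10 encoding efficiency, / 8 bits per byte
    return gt_per_s * 1000.0 * (8.0 / 10.0) / 8.0

gen1 = lane_bandwidth_mb_s(2.5)   # 250.0 MB/s ceiling for x1 Gen1
gen2 = lane_bandwidth_mb_s(5.0)   # 500.0 MB/s ceiling for x1 Gen2
# The measured 360-440 MB/s exceeds the Gen1 ceiling, so the link must be
# running at Gen2, with protocol overhead accounting for the shortfall.
print(gen1, gen2)
```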

First bench:

HD6850 x1.2, 3DMark 11 - to me the GPU score is rather normal for an HD6850, but on our forum nhl.pl people said it's too low... What do you think?

And the most important test for me: Crysis 3, mission 2, "Welcome to the Jungle" (map "fields"). The reason for its importance is that it's the first and most distinctive example of a GPU limited by eGPU bandwidth. After about 2 minutes of gameplay there is a beautiful "grass moment": the player goes outside and discovers a New York fully covered in grass and trees. The grass moves naturally with the wind blowing, and somehow it doesn't just load the CPU... it loads the PCIe bus really hard. On the GTX660 I've got about 20 FPS in this scene. How about the HD6850?

19847595_crysis-3_all-low_hd6850-x1-2_fullhd_map-fields.png

Unfortunately, performance is even worse than with the GTX660. I could play the first mission on the GTX660 at 30-45 FPS on medium settings; with the HD6850 I could barely pass 25 FPS. In the second mission we see really low FPS, about 3 lower on average than with the GTX660. And here is the fun part: on the GTX660 the CPU was constantly loaded at about 95-99%, while here we see way lower CPU utilisation.

The GPU usage is not present in the HWInfo logs, but checked with MSI Afterburner it's a constant 99% load.

Now, what do you think about this performance? Is something wrong? Do I need to turn something off, like AMD HD Audio...?

Quote

HD6850 x1.2, 3DMark 11 - to me the GPU score is rather normal for an HD6850, but on our forum nhl.pl people said it's too low... What do you think?

That 3DMark 11 result is indeed low. From this point on it's going to be difficult to make any meaningful comparison of a GTX660 versus an HD6850. The HD6850 uses the old VLIW5 architecture, not the GCN that a comparable HD7870 would. In addition, we found that HD5xxx (and likely HD6xxx) cards needed to run x1E to get what appears to be full-duplex bandwidth on older chipsets. Certainly my testing of an HD7870 vs a GTX660, both at x1 2.0, didn't show this problem. http://forum.techinferno.com/diy-e-gpu-projects/2747-%5Bguide%5D-12-dell-e6230-gtx660%40x1-2opt-hd7870%40x1-2-pe4l-ec060a-2-1b.html#post37197

Is there any chance you can borrow a GCN-based AMD card (HD78xx, HD79xx) off someone to do what would be more comparable testing?


Do you mean it's too low for a comparison, or too low for an HD6850?

Isn't it possible to get a kind of x1E 2.0? I mean, could I somehow use the x1E tweak on my setup?

For now it's not possible to get a GCN-based GPU. I'm curious what the difference between the HD6850 and a 7770 would be as eGPUs.

Maybe I could overclock my GPU, but I'm not sure about the results.

Crysis 3 performance is quite disappointing. I hoped to get FPS closer to the desktop HD6850. The FPS is not CPU-limited, is it?


Oh, now I wish I'd tried that grass scene before I sold my GPU :/

But can't you test that scene on a desktop at x1, x2... etc.?

Or use a PCIe x1-to-x1 riser and just test it on your PC.

I tested it on a desktop PC with this GPU, but with an ancient Core 2 Duo E6550 CPU. Performance was about the same in the grass scene, which is why I thought it's rather CPU-related.

But the first mission is not playable, even on low settings. The GTX660 or a desktop HD6850 gave me better FPS there.

On our forum nhl.pl there is a user, waldeksik, who really helps us with testing by emulating an x1 2.0 PCIe bus on his Quadro 4000M. I'll ask him to make a Crysis 3 comparison, if he hasn't done it already (I'm not sure).

Or we can wait till you get a new GPU :) My GTX660 and HD6850 aren't going anywhere at the moment.

To be honest, I hoped that AMD would give me higher FPS and thus be the better choice. Now, looking at your BF4 performance, I'd like to get a powerful AMD GPU; but looking at how much the HD6850 is limited in the grass scene, I'm not sure a Radeon is a better choice than a GeForce. For now, given that the 6850 is much weaker than the 660, AMD and Nvidia look pretty much the same for eGPU use.

I still think I'm missing something. I feel there is an important factor which affects my performance.


Your 6850 GPU score is in the ballpark for your core clock rate; you can see it in the following search results:

https://www.google.com/search?q=6850+3dm11+&ie=utf-8&oe=utf-8&rls=org.mozilla:en-US:official&client=firefox-a&gws_rd=cr&ei=KXS0UuSaOoSUhQfJsIGwAg#q=6850+3dm11+6850%281x%29&rls=org.mozilla:en-US:official&safe=off

I'm currently waiting for decent cooling for the 290, or for an alternative to pop up, and it gives me a chance to study :P.

Right now the 770 is priced at the same price I sold my 7970 for, and a 290 (non-X) is $100 higher but a reference design. A 780 (non-Ti) is $150 higher.

It's anyone's guess how much a non-reference 290 will cost.


OK, I'd hoped that my score was too low :)

Looking at my performance, I doubt that going with an AMD GPU is a good choice. I thought it would put less demand on the PCIe bus, but it doesn't. And using an Nvidia GPU is easier.


I quote @Tech Inferno Fan here: "The HD6850 uses the old VLIW5 architecture, not the GCN that a comparable HD7870 would."

This architecture difference might be a very important factor to consider, meaning AMD could still yield better performance from an eGPU perspective. If you have a friend with an HD7950 (look here) whose card you could re-do this test with, we might get our answer. This is really exciting stuff!


OK, maybe there is some magic that AMD implemented in the GCN architecture which works as well for eGPUs as Nvidia Optimus does, and AMD doesn't know how brilliant they are :)

So I'll try to get my hands on a GCN-based GPU. I'm excited about these benchmarks too.


I edited my last post to say HD7950, since that seems to be equivalent to the GTX660 in Crysis 3 performance.

Also, the HD76XX cards are not to be considered, since these use the old VLIW5 architecture.

Quote

GPU-Z should show "x16 @ 2.0"; either way you'll need Setup 1.x to enable Gen2 link speed if you can't do it through the BIOS. Also, in the Nvidia Control Panel make sure to set PhysX to your eGPU (I think it's under "Manage 3D settings").
I already set PhysX to the eGPU but didn't notice any improvement. My 3DMark06 score is 11500. I found a similar system in the leaderboard; here it is:

14" Lenovo_Y460 i5-520M 2.4 4.0 [email protected] 13500 Mjolner Y HM55 EC Win7/64

He's got 13500 points. I can't figure out where I lost 2000 points... For now I have PCIE x16 @ x1 1.1 in GPU-Z. Can anybody confirm whether it's possible to achieve x1.2Opt? I'd be glad of any performance increase.

You have Optimus pci-e compression working, because if you didn't, your 3DMark06 score would be < 5k. You can't achieve x1.2Opt as your chipset limits you to x1 1.0; the best you can get is x1.Opt. Since you haven't included a link to your 3DMark06 result, I can't see how the individual CPU and GPU scores compare to Mjolner's. My guess is that you have the Windows power profile set to something other than High Performance and are getting a lower 3DMark06 score because the CPU isn't running at full performance.

Quote

OK, maybe there is some magic that AMD implemented in the GCN architecture which works as well for eGPUs as Nvidia Optimus does, and AMD doesn't know how brilliant they are :) So I'll try to get my hands on a GCN-based GPU. I'm excited about these benchmarks too.
I'm excited for you too. My comparison at http://forum.techinferno.com/diy-e-gpu-projects/2747-%5Bguide%5D-12-dell-e6230-gtx660@[email protected] saw AMD cards perform very well. What I didn't test is the FPS of recent games in the areas where Nvidia eGPUs have significant slowdowns. You and Bjorm are providing very valuable results and analysis there. A direct comparison of the similarly performing GTX660 vs HD7870 would give us the final answer on which is the better-performing option when using an external LCD.

AMD also has two other factors working against it for eGPU use. First, its drivers provide no internal-LCD support to match Nvidia Optimus; you'd need LucidLogix Virtu as a lower-performing alternative. Second, its large PCIe aperture requirements forced a more complex DSDT override/substitution eGPU implementation requiring Setup 1.x on your 2570P, whereas an Nvidia GTX660+ was plug-and-play. Hence why we are seeing far more Nvidia eGPUs.

One thing favoring AMD is its high x1 2.0 performance regardless of whether the system has an active iGPU. So for IVB/SB systems with no active iGPU (dGPU only, e.g. HP 8570W), an AMD card is the better-performing option.

SUCCESS! ! ! ! ! !

Almost instant success. I set it up as per my post #2406 on page 241, and at first, after plugging in the eGPU and waking the laptop, nothing. So I went into Device Manager and bingo, there it was, naked as a jaybird: "device (Nvidia GTS450 VGA card) not recognised". Right-clicked and updated the card with the Nvidia drivers I had downloaded in anticipation of this, and bingo, device recognised.

Put the laptop to sleep, plugged the eGPU in, woke it up, and the 27" Dell activated, but with a grey screen and no image, though it did have the mouse pointer. Right-clicked and set up the monitors in "Screen Resolution", turned off the laptop's own monitor in the same window, and the full monty appeared, barfing all over the Dell 27". She all works.

When disconnecting (as the reverse of the above) the first time, I got a BSOD. So, after changing the monitors back, I used SAFELY REMOVE HARDWARE to disconnect the Nvidia eGPU, then put the laptop back to sleep, unplugged the EC2C card, and the laptop woke up with no BSOD. All works.

Thank you Nando and all the other contributors here for the 500 pages and two weeks of heavy reading it took me to get this thing going, but it WORKS! It has been fun building this.

Next is a nice enclosure and tweaking of the GTS450.

Cheers

DIY eGPU project details:
Laptop: Toshiba Satellite L300 PSLB8A, Win 7. Processor: Genuine Intel® CPU 585 @ 2.16GHz, ~2.2GHz. Memory: 2048MB RAM (1916MB available to the OS). Graphics: Mobile Intel® 4 Series Express Chipset Family; display memory 830 MB (128 MB dedicated, 702 MB shared).
eGPU parts (all 2nd hand from eBay): PE4H ver 2.4, EC2C + flat cable, SWEX; ASUS ENGTS450 DIRECTCU/DI/1GD5 GeForce GTS 450 (Fermi) 1GB 128-bit GDDR5 PCI Express 2.0 x16 video card, driver 306.63 WHQL.
Display: Dell 2707 27" VGA/HDMI/DVI monitor.
Power: Dell PC 450W PSU; 2 x 4-pin Molex on separate rails to 1 x 6-pin connector on the VGA card, PE4H powered via a 3rd rail.
Cost: PE4H/EC2C + cable + SWEX: $130. ENGTS450 VGA: $60. Monitor: $100. PSU free, cables free, 1 x Molex: $5. Total: $295. I'm happy.
To the computer "techs" that told me this was impossible, we at TI can say "... eat our shorts!"


Quote

You have Optimus pci-e compression working, because if you didn't, your 3DMark06 score would be < 5k. You can't achieve x1.2Opt as your chipset limits you to x1 1.0; the best you can get is x1.Opt. My guess is that you have the Windows power profile set to something other than High Performance and are getting a lower 3DMark06 score because the CPU isn't running at full performance.

It was set to Balanced; I changed it to High Performance. Then I decided to check the BIOS settings to see if something was limiting my CPU. The only thing I found was that Virtualization was disabled under "CPU features", so I enabled it. This is ridiculous, but my score became even lower: NVIDIA GeForce GTX 560 Ti video card benchmark result - Intel Core i5-520M Processor, FUJITSU FJNB204

P.S. The score page shows "Non-default settings were used", but I didn't change anything; it's not even possible to change settings in 3DMark06.

The resolution is 1280x800.


Attach an external LCD to the GTX560Ti eGPU rather than driving the internal LCD via Optimus, which consumes some of that precious bandwidth. Then you'll get a 3DMark06 score similar to Mjolner's 13.5k.
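The cost of driving the internal LCD is easy to estimate: in Optimus mode, every rendered frame has to travel back across the x1 link to the iGPU. A rough, uncompressed upper-bound calculation (my own estimate; Optimus compresses this traffic, so the real cost is lower but still substantial):

```python
def framebuffer_traffic_mb_s(width, height, bytes_per_pixel=4, fps=60):
    """Uncompressed frame-return traffic when Optimus drives the internal LCD."""
    return width * height * bytes_per_pixel * fps / 1e6

# At the internal panel's 1280x800, 32-bit color, 60 FPS this is ~245.8 MB/s
# headed back across the x1 link -- a big slice of its capacity.
print(framebuffer_traffic_mb_s(1280, 800))
```

An external LCD attached directly to the eGPU avoids that return traffic entirely, which is why scores jump.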
